[gmx-users] MPI error in gromacs 4.6, more Errors
Justin Lemkul
jalemkul at vt.edu
Mon Mar 24 13:14:05 CET 2014
On 3/24/14, 7:57 AM, Ankita Naithani wrote:
> Hi, so I modified my mdp file which now looks like the following:
>
> title = production MD
> ; Run parameters
> integrator = md ; leap-frog algorithm
> ;nsteps = 20000000 ; 0.005 * 20000000 = 100000 ps or 100 ns
> ;nsteps = 200000 ; 0.005 * 200000 = 1 ns
> ;dt = 0.005 ; 5 fs
> nsteps = 33333333 ; 0.003 * 33333333 = 100000 ps or 100 ns
> dt = 0.003 ; 3 fs
> ; Output control
> nstxout = 0 ; don't save full-precision coordinates
> nstvout = 0 ; don't save velocities
> nstxtcout = 1000 ; xtc compressed trajectory output every 3 ps
> nstenergy = 1000 ; save energies every 3 ps
> nstlog = 1000 ; update log file every 3 ps
> ; Bond parameters
> constraint_algorithm = lincs ; holonomic constraints
> constraints = all-bonds ; all bonds (even heavy atom-H bonds) constrained
> lincs_iter = 1 ; accuracy of LINCS
> lincs_order = 4 ; also related to accuracy
> ; Neighborsearching
> ns_type = grid ; search neighboring grid cells
> nstlist = 5 ; 15 fs
> rlist = 1.0 ; short-range neighborlist cutoff (in nm)
> rcoulomb = 1.0 ; short-range electrostatic cutoff (in nm)
> rvdw = 1.0 ; short-range van der Waals cutoff (in nm)
> rlistlong = 1.0 ; long-range neighborlist cutoff (in nm)
> cutoff-scheme = Verlet
> ; Electrostatics
> coulombtype = PME ; Particle Mesh Ewald for long-range electrostatics
> pme_order = 4 ; cubic interpolation
> fourierspacing = 0.16 ; grid spacing for FFT
> nstcomm = 10 ; remove com every 10 steps
> ; Temperature coupling is on
> tcoupl = V-rescale ; modified Berendsen thermostat
> tc-grps = Protein Non-Protein ; two coupling groups - more accurate
> tau_t = 0.1 0.1 ; time constant, in ps
> ref_t = 318 318 ; reference temperature, one for each group, in K
> ; Pressure coupling is on
> pcoupl = berendsen ; Berendsen barostat
> pcoupltype = isotropic ; uniform scaling of box vectors
> tau_p = 1.0 ; time constant, in ps
> ref_p = 1.0 ; reference pressure, in bar
> compressibility = 4.5e-5 ; isothermal compressibility of water, bar^-1
> ; Periodic boundary conditions
> pbc = xyz ; 3-D PBC
> ; Dispersion correction
> DispCorr = EnerPres ; account for cut-off vdW scheme
> ; Velocity generation
> gen_vel = yes ; Velocity generation is on
> gen_temp = 318 ; temperature for velocity generation, in K
> --------------------------------------------------
>
>
> But when I try to generate the tpr file on the cluster itself using
> gromacs 4.6.3, I get the following error:
>
>
> NOTE 1 [file md3.mdp]:
> With Verlet lists the optimal nstlist is >= 10, with GPUs >= 20. Note
> that with the Verlet scheme, nstlist has no effect on the accuracy of
> your simulation.
>
>
> NOTE 2 [file md3.mdp]:
> nstcomm < nstcalcenergy defeats the purpose of nstcalcenergy, setting
> nstcomm to nstcalcenergy
>
> Generated 3403 of the 3403 non-bonded parameter combinations
> Generating 1-4 interactions: fudge = 0.5
> Generated 3403 of the 3403 1-4 parameter combinations
> Segmentation fault
>
> Can anyone please suggest further?
>
Do as the notes suggest; they're not fatal errors, just cautionary. You
should probably educate yourself a bit further on what these algorithms do
by taking a look at http://www.gromacs.org/Documentation/Cut-off_schemes.
The Verlet scheme is not mandatory in general, but it is required by the
type of parallelization you requested.
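Untested, and the values here are only illustrative, but something along
these lines should silence both notes (nstcalcenergy defaults to 100, if I
recall correctly):

nstlist = 20 ; Verlet scheme: >= 10 recommended, >= 20 with GPUs
nstcomm = 100 ; >= nstcalcenergy, so grompp stops resetting it

With the Verlet scheme, nstlist is only a performance knob, so raising it
costs you nothing in accuracy.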
Reviewers may question the changes in version and cutoff methods when critiquing
your work, so be aware of that. Also, the instability you are seeing is
probably a result of the large time step, unless you are using virtual sites.
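If you are not using virtual sites, a conservative setup (again untested;
nsteps simply recomputed for 100 ns) would be:

dt = 0.002 ; 2 fs is the usual safe limit with all-bonds constraints
nsteps = 50000000 ; 0.002 * 50000000 = 100000 ps or 100 ns

Anything much beyond 2 fs really calls for virtual sites (pdb2gmx -vsite h).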
-Justin
--
==================================================
Justin A. Lemkul, Ph.D.
Ruth L. Kirschstein NRSA Postdoctoral Fellow
Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201
jalemkul at outerbanks.umaryland.edu | (410) 706-7441
http://mackerell.umaryland.edu/~jalemkul
==================================================