[gmx-users] MPI error in gromacs 4.6
Mark Abraham
mark.j.abraham at gmail.com
Mon Mar 24 13:00:39 CET 2014
On Mon, Mar 24, 2014 at 12:17 PM, Ankita Naithani
<ankitanaithani at gmail.com> wrote:
> Hi,
>
> I am trying to run a simulation of my protein (a monomer, ~500 residues).
> I had a few questions and errors regarding it.
> I have previously run a simulation of the apo form of the same protein
> using GROMACS 4.5.5, which was available at the cluster facility I was
> using and is also installed on my system. However, when I tried to run the
> holo form, I got this error:
> Fatal error:
> 11 particles communicated to PME node 106 are more than 2/3 times the
> cut-off out of the domain decomposition cell of their charge group in
> dimension y.
> This usually means that your system is not well equilibrated.
> For more information and tips for troubleshooting, please check the GROMACS
> website at http://www.gromacs.org/Documentation/Errors
>
> I figured this could be solved with a smaller timestep: my previous
> timestep was 4 fs and I have now reduced it to 3 fs, which should work
> fine.
>
You should only need to do that kind of thing during equilibration, e.g. see
http://www.gromacs.org/Documentation/How-tos/Steps_to_Perform_a_Simulation
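For example, a short equilibration segment with a conservative timestep,
before you switch to the production settings, might look like this fragment
(the values here are illustrative, not a recommendation for your system):

    ; equil.mdp -- fragment only; adjust lengths and groups to your system
    integrator  = md
    dt          = 0.002          ; conservative 2 fs timestep
    nsteps      = 50000          ; 100 ps of equilibration
    constraints = all-bonds
    tcoupl      = V-rescale
    tc-grps     = Protein Non-Protein
    tau_t       = 0.1 0.1
    ref_t       = 318 318

Once that is stable, generate the production .tpr with your full timestep.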
> However, after producing the tpr file for the production run in GROMACS
> 4.5.5, I realised that the grant for that cluster facility is over, and
> the new clusters on which I am trying to set up the same protein support
> only GROMACS 4.6. When I try to run on these clusters, I get the
> following error:
>
>
> -------------------------------------------------------
> Program mdrun_mpi, VERSION 4.6.3
> Source code file: /home/gromacs-4.6.3/src/kernel/runner.c, line: 824
>
> Fatal error:
> OpenMP threads have been requested with cut-off scheme Group, but these
> are only supported with cut-off scheme Verlet
> For more information and tips for troubleshooting, please check the GROMACS
> website at http://www.gromacs.org/Documentation/Errors
>
> ---------------------------------------------------------------------------------
>
> 1. I wanted help with my mdp options to make them compatible.
>
If you want to run with the group scheme, use cutoff-scheme = group, set up
your job script to use one MPI rank per core, and do not attempt to use
OpenMP.
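A job-script fragment along these lines would do it (the scheduler syntax,
core count, and file names are illustrative):

    ; in the .mdp file
    cutoff-scheme = group

    # in the job script: one MPI rank per core, no OpenMP threads
    export OMP_NUM_THREADS=1
    mpirun -np 64 mdrun_mpi -ntomp 1 -deffnm prod

Alternatively, switch to the Verlet scheme if you want to combine MPI with
OpenMP; see the note after your .mdp below.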
> 2. Since my previous calculations were based on GROMACS 4.5.5, would
> switching to GROMACS 4.6 break the continuity of the run, or would it
> change the way the trajectories should be analysed?
>
Many things are different, but if you're confident no relevant bugs were
fixed in between, you can use 4.6.x to do the same things 4.5.5 could do.
But since such runs would not have
http://www.gromacs.org/Documentation/Terminology/Reproducibility, they
will not have exact continuity either.
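If you do continue the run under 4.6.x, the usual route is via the
checkpoint file, e.g. (file names illustrative, and assuming 4.6 accepts
your 4.5.5 checkpoint):

    # extend the run length stored in the old .tpr by 100000 ps (100 ns)
    tpbconv -s prod.tpr -extend 100000 -o prod_ext.tpr
    # continue from the checkpoint, appending to the existing output files
    mpirun -np 64 mdrun_mpi -s prod_ext.tpr -cpi prod.cpt -append

Just don't expect the continued trajectory to be binary-identical to what
4.5.5 would have produced.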
>
>
> Below, is my mdp file
> title = production MD
> ; Run parameters
> integrator = md ; leap-frog algorithm
> nsteps = 33333333 ; 0.003 * 33333333 ≈ 100000 ps = 100 ns
> dt = 0.003 ; 3 fs
> ; Output control
> nstxout = 0 ; do not write full-precision coordinates
> nstvout = 0 ; do not write velocities
> nstxtcout = 1000 ; xtc compressed trajectory output every 3 ps
> nstenergy = 1000 ; save energies every 3 ps
> nstlog = 1000 ; update log file every 3 ps
> energygrps = Protein ATP
> ; Bond parameters
> constraint_algorithm = lincs ; holonomic constraints
> constraints = all-bonds ; all bonds (even heavy atom-H bonds) constrained
> lincs_iter = 1 ; accuracy of LINCS
> lincs_order = 4 ; also related to accuracy
> ; Neighborsearching
> ns_type = grid ; search neighboring grid cells
> nstlist = 5 ; 15 fs
> rlist = 1.0 ; short-range neighborlist cutoff (in nm)
> rcoulomb = 1.0 ; short-range electrostatic cutoff (in nm)
> rvdw = 1.0 ; short-range van der Waals cutoff (in nm)
> rlistlong = 1.0 ; long-range neighborlist cutoff (in nm)
> ; Electrostatics
> coulombtype = PME ; Particle Mesh Ewald for long-range electrostatics
> pme_order = 4 ; cubic interpolation
> fourierspacing = 0.16 ; grid spacing for FFT
> nstcomm = 10 ; remove com every 10 steps
> ; Temperature coupling is on
> tcoupl = V-rescale ; modified Berendsen thermostat
> tc-grps = Protein Non-Protein ; two coupling groups - more accurate
> tau_t = 0.1 0.1 ; time constant, in ps
> ref_t = 318 318 ; reference temperature, one for each group, in K
> ; Pressure coupling is on
> pcoupl = berendsen ; Berendsen barostat
> pcoupltype = isotropic ; uniform scaling of box vectors
> tau_p = 1.0 ; time constant, in ps
> ref_p = 1.0 ; reference pressure, in bar
> compressibility = 4.5e-5 ; isothermal compressibility of water, bar^-1
> ; Periodic boundary conditions
> pbc = xyz ; 3-D PBC
> ; Dispersion correction
> DispCorr = EnerPres ; account for cut-off vdW scheme
> ; Velocity generation
> gen_vel = yes ; Velocity generation is on
> gen_temp = 318 ; temperature for velocity generation, in K
>
>
>
>
> Kind regards,
> Ankita Naithani
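Regarding question 1 and your .mdp above: the minimal change to allow
OpenMP under 4.6 would be along these lines (a sketch, not a tested input):

    cutoff-scheme = verlet   ; OpenMP in 4.6 requires the Verlet scheme
    ; rlistlong applies only to the group scheme and can be dropped;
    ; mdrun may increase nstlist automatically for efficiency

Note that some group-scheme features (e.g. energygrps for decomposed
non-bonded energies) are not supported with the Verlet scheme in 4.6, so
check what grompp tells you.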