[gmx-users] MPI error in gromacs 4.6, more Errors

Ankita Naithani ankitanaithani at gmail.com
Mon Mar 24 12:57:17 CET 2014


Hi, so I modified my mdp file which now looks like the following:

title        = production MD
; Run parameters
integrator    = md        ; leap-frog algorithm
;nsteps        = 20000000    ; 0.005 * 20000000 = 100000 ps or 100 ns
;nsteps        = 200000    ; 0.005 * 200000 = 1 ns
;dt        = 0.005        ; 5 fs
nsteps        = 33333333    ; 0.003 * 33333333 ≈ 100000 ps or 100 ns
dt        = 0.003        ; 3 fs
; Output control
nstxout        = 0        ; do not write full-precision coordinates (.trr)
nstvout        = 0        ; do not write velocities
nstxtcout    = 1000        ; xtc compressed trajectory output every 3 ps
nstenergy    = 1000            ; save energies every 3 ps
nstlog        = 1000            ; update log file every 3 ps
; Bond parameters
constraint_algorithm = lincs    ; holonomic constraints
constraints    = all-bonds    ; all bonds (even heavy atom-H bonds) constrained
lincs_iter    = 1        ; accuracy of LINCS
lincs_order    = 4        ; also related to accuracy
; Neighborsearching
ns_type        = grid        ; search neighboring grid cells
nstlist        = 5        ; 15 fs
rlist        = 1.0        ; short-range neighborlist cutoff (in nm)
rcoulomb    = 1.0        ; short-range electrostatic cutoff (in nm)
rvdw        = 1.0        ; short-range van der Waals cutoff (in nm)
rlistlong    = 1.0        ; long-range neighborlist cutoff (in nm)
cutoff-scheme   = Verlet
; Electrostatics
coulombtype    = PME        ; Particle Mesh Ewald for long-range electrostatics
pme_order    = 4        ; cubic interpolation
fourierspacing    = 0.16        ; grid spacing for FFT
nstcomm = 10                    ; remove COM motion every 10 steps
; Temperature coupling is on
tcoupl        = V-rescale    ; modified Berendsen thermostat
tc-grps        = Protein Non-Protein    ; two coupling groups - more accurate
tau_t        = 0.1    0.1    ; time constant, in ps
ref_t        = 318     318    ; reference temperature, one for each group, in K
; Pressure coupling is on
pcoupl          = berendsen    ; Berendsen barostat
pcoupltype    = isotropic    ; uniform scaling of box vectors
tau_p        = 1.0        ; time constant, in ps
ref_p        = 1.0        ; reference pressure, in bar
compressibility = 4.5e-5    ; isothermal compressibility of water, bar^-1
; Periodic boundary conditions
pbc        = xyz        ; 3-D PBC
; Dispersion correction
DispCorr    = EnerPres    ; account for cut-off vdW scheme
; Velocity generation
gen_vel        = yes        ; Velocity generation is on
gen_temp    = 318        ; reference temperature, for protein in K
--------------------------------------------------
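
For reference, I generate the tpr roughly as follows (the coordinate and
topology file names here are placeholders for my actual inputs; only
md3.mdp is the real file name):

grompp -f md3.mdp -c equilibrated.gro -p topol.top -o md3.tpr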


But when I try to generate the tpr file this way on the cluster itself,
using GROMACS 4.6.3, grompp prints the following notes and then crashes
with a segmentation fault:


NOTE 1 [file md3.mdp]:
  With Verlet lists the optimal nstlist is >= 10, with GPUs >= 20. Note
  that with the Verlet scheme, nstlist has no effect on the accuracy of
  your simulation.


NOTE 2 [file md3.mdp]:
  nstcomm < nstcalcenergy defeats the purpose of nstcalcenergy, setting
  nstcomm to nstcalcenergy

Generated 3403 of the 3403 non-bonded parameter combinations
Generating 1-4 interactions: fudge = 0.5
Generated 3403 of the 3403 1-4 parameter combinations
Segmentation fault
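
I assume the two notes themselves could be silenced with something along
these lines (an untested sketch; it assumes the default nstcalcenergy of
100, and that with the Verlet scheme nstlist is only a performance
setting, as the note says), but that would not explain the segmentation
fault:

nstlist        = 20        ; Verlet scheme: performance setting only
nstcomm        = 100        ; >= nstcalcenergy, so grompp does not reset it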

Can anyone please suggest what to try next?


Kind regards,

Ankita


On Mon, Mar 24, 2014 at 11:48 AM, Ankita Naithani
<ankitanaithani at gmail.com> wrote:

> Hi Pavan,
> Thank you for your response. I am trying to generate the tpr file with the
> following parameters:
> ; Neighborsearching
>  ns_type        = grid        ; search neighboring grid cells
>  nstlist        = 5        ; 25 fs
>  rlist        = 1.0        ; short-range neighborlist cutoff (in nm)
>  rcoulomb    = 1.0        ; short-range electrostatic cutoff (in nm)
>  rvdw        = 1.0        ; short-range van der Waals cutoff (in nm)
>  rlistlong    = 1.0        ; long-range neighborlist cutoff (in nm)
> cutoff-scheme = Verlet
>
> But, I get a warning of "Unknown left-hand 'cutoff-scheme'" in parameter file.
>
>
> On Mon, Mar 24, 2014 at 11:26 AM, Pavan Kumar <kumar.pavan006 at gmail.com> wrote:
>
>> Hello Ankita
>> You just have to include the following line in your mdp file:
>> cutoff-scheme = Verlet
>> Then run grompp with the modified mdp file to generate the tpr file, and
>> then run mdrun.
>> Hope this doesn't give you the same error
>>
>>
>> On Mon, Mar 24, 2014 at 4:47 PM, Ankita Naithani
>> <ankitanaithani at gmail.com> wrote:
>>
>> > Hi,
>> >
>> > I am trying to run a simulation of my protein (a monomer of ~500
>> > residues). I had a few questions and errors regarding it.
>> > I have previously run a simulation of the apo form of the same protein
>> > using GROMACS 4.5.5, which was available at the cluster facility I was
>> > using and is also installed on my system. However, when I tried to run
>> > the holo form, I got this error:
>> > Fatal error:
>> > 11 particles communicated to PME node 106 are more than 2/3 times the
>> > cut-off out of the domain decomposition cell of their charge group in
>> > dimension y.
>> > This usually means that your system is not well equilibrated.
>> > For more information and tips for troubleshooting, please check the
>> > GROMACS website at http://www.gromacs.org/Documentation/Errors
>> >
>> > I figured this could be solved by using a shorter timestep: my previous
>> > timestep was 4 fs and I have now reduced it to 3 fs, which should work
>> > fine. However, after producing the tpr file for the production run with
>> > my GROMACS 4.5.5, I realised that the grant for that cluster facility is
>> > over, and the new clusters on which I am trying to set up the same
>> > protein support only GROMACS 4.6. When I try to run on these clusters I
>> > get the following error:
>> >
>> >
>> > -------------------------------------------------------
>> > Program mdrun_mpi, VERSION 4.6.3
>> > Source code file: /home/gromacs-4.6.3/src/kernel/runner.c, line: 824
>> >
>> > Fatal error:
>> > OpenMP threads have been requested with cut-off scheme Group, but these
>> > are only supported with cut-off scheme Verlet
>> > For more information and tips for troubleshooting, please check the
>> > GROMACS website at http://www.gromacs.org/Documentation/Errors
>> >
>> >
>> ---------------------------------------------------------------------------------
>> >
>> > 1. I wanted help with my mdp options to make them compatible.
>> > 2. Since my previous calculations were based on GROMACS 4.5.5, would
>> > switching to GROMACS 4.6 break the continuity of the run, or would it
>> > bring about differences in the way the trajectories would be analysed?
>> >
>> >
>> > Below, is my mdp file
>> > title        = production MD
>> > ; Run parameters
>> > integrator    = md        ; leap-frog algorithm
>> > nsteps        = 33333333    ; 0.003 * 33333333 ≈ 100000 ps or 100 ns
>> > dt        = 0.003        ; 3 fs
>> > ; Output control
>> > nstxout        = 0        ; save coordinates every 2 ps
>> > nstvout        = 0        ; save velocities every 2 ps
>> > nstxtcout    = 1000        ; xtc compressed trajectory output every 5 ps
>> > nstenergy    = 1000            ; save energies every 5 ps
>> > nstlog        = 1000            ; update log file every 5 ps
>> > energygrps      = Protein ATP
>> > ; Bond parameters
>> > constraint_algorithm = lincs    ; holonomic constraints
>> > constraints    = all-bonds    ; all bonds (even heavy atom-H bonds) constrained
>> > lincs_iter    = 1        ; accuracy of LINCS
>> > lincs_order    = 4        ; also related to accuracy
>> > ; Neighborsearching
>> > ns_type        = grid        ; search neighboring grid cells
>> > nstlist        = 5        ; 25 fs
>> > rlist        = 1.0        ; short-range neighborlist cutoff (in nm)
>> > rcoulomb    = 1.0        ; short-range electrostatic cutoff (in nm)
>> > rvdw        = 1.0        ; short-range van der Waals cutoff (in nm)
>> > rlistlong    = 1.0        ; long-range neighborlist cutoff (in nm)
>> > ; Electrostatics
>> > coulombtype    = PME        ; Particle Mesh Ewald for long-range electrostatics
>> > pme_order    = 4        ; cubic interpolation
>> > fourierspacing    = 0.16        ; grid spacing for FFT
>> > nstcomm = 10                    ; remove com every 10 steps
>> > ; Temperature coupling is on
>> > tcoupl        = V-rescale    ; modified Berendsen thermostat
>> > tc-grps        = Protein Non-Protein    ; two coupling groups - more accurate
>> > tau_t        = 0.1    0.1    ; time constant, in ps
>> > ref_t        = 318     318    ; reference temperature, one for each group, in K
>> > ; Pressure coupling is off
>> > pcoupl          = berendsen    ; Berendsen thermostat
>> > pcoupltype    = isotropic    ; uniform scaling of box vectors
>> > tau_p        = 1.0        ; time constant, in ps
>> > ref_p        = 1.0        ; reference pressure, in bar
>> > compressibility = 4.5e-5    ; isothermal compressibility of water, bar^-1
>> > ; Periodic boundary conditions
>> > pbc        = xyz        ; 3-D PBC
>> > ; Dispersion correction
>> > DispCorr    = EnerPres    ; account for cut-off vdW scheme
>> > ; Velocity generation
>> > gen_vel        = yes        ; Velocity generation is on
>> > gen_temp    = 318        ; reference temperature, for protein in K
>> >
>> >
>> >
>> >
>> > Kind regards,
>> > Ankita Naithani
>>
>>
>>
>> --
>> Cheers
>> Pavan
>>
>
>
>
> --
> Ankita Naithani
>



-- 
Ankita Naithani

