[gmx-users] Domain Decomposition does not support simple neighbor searching.

Justin Lemkul jalemkul at vt.edu
Thu Jun 9 01:19:21 CEST 2016



On 6/8/16 9:41 AM, Daniele Veclani wrote:
> Dear Gromacs Users
>
> I'm trying to do a simulation in the NVE ensemble in vacuum, but I find
> this error:
>
> "Domain Decomposition does not support simple neighbor searching, use grid
> searching or run with one MPI rank"
>
> If I use ns_type=grid I can generate the .tpr file, but when I run mdrun
> I find:
>
> "NOTE: This file uses the deprecated 'group' cutoff scheme. This will be
> removed in a future release when 'verlet' supports all interaction forms.
>
> and mdrun program crashes.
>
>
> How can I do energy minimization and a simulation in the NVE ensemble in
> vacuum with GROMACS 5.0.4?
>
> This is my .mdp file for energy minimization:
>
> ; Run control
> integrator               = steep
> nsteps                   = 500000
> ; EM criteria and other stuff
> emtol                    = 10
> emstep                   = 0.001
> niter                    = 20
> nbfgscorr                = 10
> ; Output control
> nstlog                   = 1
> nstenergy                = 1
> ; Neighborsearching PARAMETERS
> cutoff-scheme            = group
> vdw-type                 = Cut-off
> nstlist                  = 1        ; update neighbor list every step
> ns_type                  = grid     ; search neighboring grid cells
> pbc                      = No
> rlist                    = 0.0      ; short-range neighborlist cutoff (in nm)
> rlistlong                = 0.0
> ; OPTIONS FOR ELECTROSTATICS AND VDW
> coulombtype              = cut-off  ; plain cut-off electrostatics
> rcoulomb-switch          = 0
> rcoulomb                 = 0.0      ; short-range electrostatic cutoff (in nm)
> rvdw                     = 0.0      ; short-range van der Waals cutoff (in nm)
> rvdw-switch              = 0.0
> epsilon_r                = 1
> ; Long-range dispersion corrections for energy and pressure
> DispCorr                 = No
> ; Spacing for the PME/PPPM FFT grid
> fourierspacing           = 0.12
> ; EWALD/PME/PPPM parameters
> pme_order                = 6
> ewald_rtol               = 1e-06
> epsilon_surface          = 0
> ; Temperature and pressure coupling are off during EM
> tcoupl                   = no
> pcoupl                   = no
> ; No velocities during EM
> gen_vel                  = no
> ; Bond parameters
> continuation             = no            ; first dynamics run
> constraint_algorithm     = lincs     ; holonomic constraints
> constraints              = h-bonds   ; bonds involving hydrogen constrained
> lincs_iter               = 2         ; accuracy of LINCS
> lincs_order              = 4         ; also related to accuracy
>
>
> To use a single MPI rank, is it correct to do it like this:
>
> mpirun -np 1 mdrun -s *.tpr ?
>

mdrun -nt 1 -s (etc) will do it, or, if you want/need parallelization via OpenMP:

mdrun -nt N -ntmpi 1 -s (etc)
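
For example, a minimal sketch with placeholder file names (substitute your
own .mdp/.gro/.top; on 5.x the tools can also be invoked as "gmx grompp"
and "gmx mdrun"):

grompp -f em.mdp -c conf.gro -p topol.top -o em.tpr
mdrun -nt 1 -deffnm em    # one rank, so no domain decomposition is attempted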

> There still exists the possibility of using the Point Decomposition method
> (mdrun -pd) in gromacs 5.x?
>

Nope, that's long gone.
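
For the NVE run itself, a minimal sketch of what would change relative to
the EM .mdp (dt, nsteps, and gen_temp here are assumptions to adjust for
your system; keep the same vacuum/cut-off settings):

; Run control
integrator               = md
dt                       = 0.001     ; 1 fs time step
nsteps                   = 1000000   ; 1 ns
; NVE: no temperature or pressure coupling
tcoupl                   = no
pcoupl                   = no
; generate initial velocities once, then let total energy stay constant
gen_vel                  = yes
gen_temp                 = 300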

-Justin

-- 
==================================================

Justin A. Lemkul, Ph.D.
Ruth L. Kirschstein NRSA Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 629
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalemkul at outerbanks.umaryland.edu | (410) 706-7441
http://mackerell.umaryland.edu/~jalemkul

==================================================

