[gmx-users] Problem with OpenMP+MPI
jalemkul at vt.edu
Wed Feb 27 16:57:13 CET 2013
On 2/27/13 10:54 AM, jesmin jahan wrote:
> Many thanks to you Justin for your help.
> Now my .mdp file looks like:
> constraints = none
> integrator = md
> pbc = no
> verlet-buffer-drift = -1
> dt = 0.001
> nsteps = 0
> ns_type = simple
> comm-mode = angular
> rlist = 0
> rcoulomb = 0
> rvdw = 0
> nstlist = 0
> rgbradii = 0
> nstgbradii = 1
> coulombtype = cutoff
> vdwtype = cutoff
> implicit_solvent = GBSA
> gb_algorithm = HCT ;
> sa_algorithm = None
> gb_dielectric_offset = 0.02
> optimize_fft = yes
> energygrps = protein
> And when I run mdrun, it says
> "OpenMP threads have been requested with cut-off scheme Group, but
> these are only supported with cut-off scheme Verlet"
> Now if I add
> cutoff-scheme = Verlet
> It says
> With Verlet lists only full pbc or pbc=xy with walls is supported
> With Verlet lists nstlist should be larger than 0
> So, what is the solution??
Use the group cutoff scheme rather than Verlet; the Verlet scheme does not support 
pbc = no, so switching to it just puts us right back where we were before. The 
trade-off is that the group scheme has no OpenMP support, so you will have to 
parallelize with MPI ranks only.
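As a minimal sketch of what that means for the .mdp file (the explicit cutoff-scheme line is an assumption based on GROMACS 4.6 defaults, not something stated in the thread):

```
; stay with the group scheme (the 4.6 default) instead of Verlet,
; since Verlet requires full pbc and nstlist > 0
cutoff-scheme = group
; everything else in the file above is unchanged (pbc = no, nstlist = 0, ...)
```

Since the group scheme cannot use OpenMP threads, a run would then use plain MPI ranks, e.g. something like `mpirun -np 2 mdrun_mpi -deffnm topol` (the -deffnm name is hypothetical), keeping in mind the two-processor limit for implicit solvent discussed later in the thread.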
> Also, you told me
> "Note that implicit solvent calculations can be run on no more than 2
> processors. You'll get a fatal error if you try to use more."
> Is there any documentation for this? Can I cite you if I want to include
> this information in an article?
It's a bug, listed on Redmine somewhere. I doubt that requires any sort of
citation, but if anyone asks, there are discussions in the list archive and on
Redmine.
Justin A. Lemkul, Ph.D.
Department of Biochemistry
jalemkul[at]vt.edu | (540) 231-9080