[gmx-users] Possible bug: energy changes with the number of nodes for energy minimization
Stephen Cox
stephen.cox.10 at ucl.ac.uk
Tue May 29 12:22:13 CEST 2012
Hi,
I'm running a number of energy minimizations on a clathrate supercell and I
get significantly different values for the total energy depending on the
number of MPI processes / threads I use. More specifically, some numbers I
get are:
#cores    energy (kJ/mol)
1         -2.41936409202696e+04
2         -2.43726425776809e+04
3         -2.45516442350804e+04
4         -2.47003944216983e+04

#threads  energy (kJ/mol)
1         -2.41936409202696e+04
2         -2.43726425776792e+04
3         -2.45516442350804e+04
4         -2.47306458924815e+04
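For what it's worth, here is a quick Python sketch quantifying the spread of
the numbers above relative to the serial result. Nothing GROMACS-specific is
involved; the values are copied verbatim from the runs, and the kJ/mol unit
and the arithmetic are the only things added:

    # Spread of the minimized energies relative to the serial run (kJ/mol).
    # Values are copied verbatim from the table above.
    serial = -2.41936409202696e+04

    runs = {
        "2 MPI ranks": -2.43726425776809e+04,
        "3 MPI ranks": -2.45516442350804e+04,
        "4 MPI ranks": -2.47003944216983e+04,
        "2 threads":   -2.43726425776792e+04,
        "3 threads":   -2.45516442350804e+04,
        "4 threads":   -2.47306458924815e+04,
    }

    for label, energy in runs.items():
        delta = energy - serial
        rel = abs(delta) / abs(serial)
        # The relative drift reaches ~2e-2 at 4 cores, which is many orders
        # of magnitude larger than double-precision rounding error.
        print(f"{label:12s} delta = {delta:8.1f} kJ/mol  relative = {rel:.1e}")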
I'd expect some numerical noise, but these differences seem too large for
that. Before submitting a bug report, I'd like to check:
a) has anyone seen something similar?
b) should I just trust the serial version?
c) have I simply done something stupid? (grompp.mdp appended below)
Any help greatly appreciated.
Steve
NB: I am using a flexible methane model and SETTLE for TIP4P/2005. I have
reproduced the above trend with the Ubuntu repository versions of
GROMACS/OpenMPI and with my own compiled versions of GROMACS/MPICH2.
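Roughly, the runs are launched as sketched below. The driver script itself is
only illustrative of what I do by hand; the binary names (grompp, mdrun,
mdrun_mpi) and the input file names reflect my own setup and a GROMACS
4.5-style installation:

    # Illustrative scan over the number of MPI ranks / threads.
    import subprocess

    # One .tpr is generated once, so the only thing that varies between runs
    # is the parallelisation.
    subprocess.run(
        ["grompp", "-f", "grompp.mdp", "-c", "conf.gro",
         "-p", "topol.top", "-o", "em.tpr"],
        check=True,
    )

    for n in (1, 2, 3, 4):
        # MPI build (OpenMPI or MPICH2), one rank per core:
        subprocess.run(
            ["mpirun", "-np", str(n), "mdrun_mpi",
             "-s", "em.tpr", "-deffnm", f"em_mpi{n}"],
            check=True,
        )
        # Thread-parallel build running the same minimization:
        subprocess.run(
            ["mdrun", "-nt", str(n),
             "-s", "em.tpr", "-deffnm", f"em_nt{n}"],
            check=True,
        )

grompp.mdp: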
; run control
integrator = steep
dt = 0.001
nsteps = -1
comm_mode = linear
nstcomm = 1
; energy minimization
emtol = 0.01
emstep = 0.01
; output control
nstxout = 1000
nstvout = 1000
nstfout = 1000
nstlog = 1000
nstcalcenergy = 1
nstenergy = 1000
; neighbour searching
nstlist = 1
ns_type = grid
pbc = xyz
periodic_molecules = no
rlist = 0.9
; electrostatics
coulombtype = pme
rcoulomb = 0.9
; vdw
vdwtype = cut-off
rvdw = 0.9
dispcorr = ener
; ewald
fourierspacing = 0.1
pme_order = 4
ewald_geometry = 3d
optimize_fft = yes
; temperature coupling
tcoupl = nose-hoover
nh-chain-length = 10
tau_t = 0.5
ref_t = 300.0
tc_grps = system
; constraints
constraint_algorithm = lincs
shake_tol = 0.0001
lincs_order = 8
lincs_iter = 2