[gmx-users] EM did not converge crashing drude system
Dayhoff, Guy
gdayhoff at health.usf.edu
Wed Mar 15 18:19:51 CET 2017
I have a Drude system that I have minimized and equilibrated through the NVT
and NPT ensembles using position restraints (all with SCF). During the
production runs I'm facing random crashes that appear to be related to
"EM did not converge" messages. This has happened anywhere from tens of
thousands of steps in to millions of steps in. Below is an example of the
messages, followed by the .mdp options for the run.
My Best,
Guy Dayhoff
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
step 393698: EM did not converge in 20 iterations, RMS force 15.668
step 393699: EM did not converge in 20 iterations, RMS force 88.569
step 393700: EM did not converge in 20 iterations, RMS force 341.002
step 393701: EM did not converge in 20 iterations, RMS force 680.888
step 393702: EM did not converge in 20 iterations, RMS force 1147.229
step 393703: EM did not converge in 20 iterations, RMS force 1077.486
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
MDP options:
; RUN CONTROL
integrator = md
nsteps = 50000000 ;50ns
dt = 0.001
; OUTPUT CONTROL
nstxout = 0
nstvout = 10000 ;every 10ps for now
nstfout = 10000
nstlog = 10000
nstenergy = 10000
nstcalcenergy = 1
nstxout-compressed = 10000
compressed-x-grps = System
; NEIGHBOR SEARCHING
cutoff-scheme = verlet
nstlist = 10
ns-type = Grid
pbc = xyz
rlist = 1.2
; BONDS
constraints = h-bonds
continuation = yes
; ELECTROSTATICS AND VdW
coulombtype = PME
rcoulomb = 1.2
vdwtype = cutoff
vdw-modifier = potential-switch
rvdw-switch = 1.0
rvdw = 1.2
DispCorr = EnerPres
; EWALD
pme_order = 4
fourierspacing = 0.16
; TEMPERATURE COUPLING
Tcoupl = Nose-Hoover
tc-grps = Water non-Water
tau_t = 10.0 10.0
ref_t = 300 300
nh-chain-length = 1
; VELOCITY GENERATION
gen-vel = no
; PRESSURE COUPLING
Pcoupl = Parrinello-Rahman
Pcoupltype = semiisotropic
tau_p = 50.0
compressibility = 4.5e-5 4.5e-5
ref_p = 0.0 3.0
; DRUDE POLARIZATION
drude = yes
drude-mode = SCF
drude-hyper = yes
drude-khyp = 16736000.0
drude-r = 0.02
drude-pow = 4
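
For context, the preparation workflow described above (minimization, then restrained NVT and NPT equilibration, then production with the .mdp options listed) would look roughly like the following. This is a hedged sketch using standard `gmx grompp`/`gmx mdrun` invocations; all file names (`em.mdp`, `nvt.mdp`, `npt.mdp`, `md.mdp`, `system.gro`, `topol.top`) are hypothetical placeholders, not taken from the original post.

```shell
# Energy minimization (all file names are placeholders)
gmx grompp -f em.mdp -c system.gro -p topol.top -o em.tpr
gmx mdrun -deffnm em

# NVT equilibration with position restraints (-r supplies restraint reference coordinates)
gmx grompp -f nvt.mdp -c em.gro -r em.gro -p topol.top -o nvt.tpr
gmx mdrun -deffnm nvt

# NPT equilibration, continuing from the NVT checkpoint
gmx grompp -f npt.mdp -c nvt.gro -r nvt.gro -t nvt.cpt -p topol.top -o npt.tpr
gmx mdrun -deffnm npt

# Production run using the .mdp options quoted above
gmx grompp -f md.mdp -c npt.gro -t npt.cpt -p topol.top -o md.tpr
gmx mdrun -deffnm md
```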