[gmx-users] Strong negative energy drift (losing energy) in explicit water AMBER protein simulation

ms devicerandom at gmail.com
Wed Jun 13 16:48:24 CEST 2012

On 13/06/12 16:36, Justin A. Lemkul wrote:
> Here, you're not preserving any of the previous state information.
> You're picking up from 2 ns, but not passing a .cpt file to grompp - the
> previous state is lost. Is that what you want? In conjunction with
> "gen_vel = no" I suspect you could see some instabilities.

This is interesting - I have to ask the guys who devised the group's 
standard procedure :)
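For reference, a continuation that actually preserves the previous state would either restart mdrun from the checkpoint directly, or pass the checkpoint to grompp when regenerating the .tpr. A minimal sketch (the .mdp, topology, and equilibration file names here are placeholders, not from the original run):

```shell
# Option 1: let mdrun continue directly from the checkpoint,
# keeping velocities, thermostat/barostat state, etc.
mpirun -np 8 mdrun_d -deffnm 1AKI_production_GPU -cpi 1AKI_production_GPU.cpt

# Option 2: build a fresh .tpr but carry over the full state
# from the previous run's checkpoint via grompp -t
grompp -f production.mdp -c 1AKI_equilibration.gro \
       -t 1AKI_equilibration.cpt -p topol.top \
       -o 1AKI_production_GPU.tpr
```

With either route, "gen_vel = no" is then consistent, because the velocities come from the saved state rather than being left undefined.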

>> mpirun -np 8 mdrun_d -v -deffn 1AKI_production_GPU -s
>> 1AKI_production_GPU.tpr
>> -g 1AKI_production_GPU.log -c 1AKI_production_GPU.gro -o
>> 1AKI_production_GPU.trr
>> -g 1AKI_production_GPU.log -e 1AKI_production_GPU.edr
> As an aside, proper use of -deffnm (not -deffn) saves you all of this
> typing :)
> mpirun -np 8 mdrun_d -v -deffnm 1AKI_production_GPU
> That's all you need.

FFFFUUUU, that's why -deffn didn't work! Silly me. Thanks!

>> I am using Gromacs 4.5.5 compiled in double precision.
>> I am very rusty with Gromacs, since I last dealt with molecular
>> dynamics more than 1 year ago :) , so probably I am missing something
>> obvious. Any hint on where I should look to solve the problem? (Also,
>> advice on whether the .mdp is indeed correct for CUDA simulations is
>> welcome.)
> I see the same whenever I run on GPU, but my systems are always implicit
> solvent. Do you get reasonable performance with an explicit solvent PME
> system on GPU? I thought that was supposed to be really slow.
> Do you observe similar effects on CPU? My tests have always indicated
> that equivalent systems on CPU are far more stable (energetically and
> structurally) than on GPU. I have never had any real luck on GPU. I get
> great performance, and then crashes ;)

Sorry, perhaps I wasn't clear. This was on normal CPUs! I was trying to 
get the system working on CPU and see how it behaved before diving into 
the misty GPU sea...


Massimo Sandal, Ph.D.
