[gmx-users] micelle disaggregated in serial, but not parallel, runs using sd integrator

Chris Neale chris.neale at utoronto.ca
Wed Feb 4 23:17:01 CET 2009


Thank you Berk,

I will repeat my runs using the checkpoint file and report my findings back to this list. Thank you for this advice.
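
For the record, the restart I plan to test should look roughly like this (a sketch only: the .cpt file name is my assumption about what the previous segment wrote, and 200 ps is just my usual segment length):

# extend the previous run by one 200 ps segment, then continue from its checkpoint
tpbconv -s md5_success/dpc50_md5.tpr -extend 200 -o dpc50_md6.tpr
mdrun -s dpc50_md6.tpr -cpi md5_success/dpc50_md5.cpt -deffnm dpc50_md6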

Chris.
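
P.S. For completeness, the no-checkpoint route that Berk describes below would, as I understand it, keep my existing grompp -t -e procedure (quoted further down) and only change the seed handling in the .mdp, roughly:

ld_seed  = -1    ; pick a fresh random seed each time grompp builds a new segment

That way each 200 ps segment gets different noise even though the segments are generated the same way.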

-- original message --

Hi,

In this manner you use the same random seed, and thus the same noise, for all parts.
In most cases this will not lead to serious artifacts with SD,
but you can never be sure.
When checkpoints are used, you do not repeat random numbers.
This also gives a difference between serial and parallel in 4.0:
with serial you get exactly the same noise per atom, but in parallel you do not,
since atoms migrate from one node to another (with domain decomposition).

If you do not use checkpoints, use ld_seed=-1 and do not use tpbconv.

Berk


> Date: Wed, 4 Feb 2009 15:05:47 -0500
> From: chris.neale at utoronto.ca
> To: gmx-users at gromacs.org
> Subject: [gmx-users] micelle disaggregated in serial, but not parallel, runs using sd integrator
>
> Thank you Berk,
>
> I will look into tau_t=1.0 (or at least not = 0.1). Thank you for the hint.
>
> These simulations run in 200 ps segments and utilize restarts via
> grompp -t -e like this:
>
> EXECUTING:
> /hpf/projects1/pomes/cneale/exe/gromacs-4.0.3/exec/bin/grompp -f
> /scratch/4772976.1.ompi-4-21.q/md6_running/dpc50_md6.mdp -c
> md5_success/dpc50_md5.gro -p dpc50.top -n dpc50.ndx -o
> /scratch/4772976.1.ompi-4-21.q/md6_running/dpc50_md6.tpr -maxwarn 1 -t
> md5_success/dpc50_md5.trr -e md5_success/dpc50_md5.edr
>
> The -maxwarn 1 is to avoid this message:
>
> WARNING 1 [file
> /scratch/4772976.1.ompi-4-21.q/md5_running/dpc50_md5.mdp, line unknown]:
>    Can not couple a molecule with free_energy = no
>
> which I don't think should be a problem.
>

<<TRUNCATED ... previous posts contain the full history of this ticket>>
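
A note for anyone who finds this thread because of the warning quoted above: as far as I can tell, grompp prints it when couple_moltype is set in the .mdp while free_energy = no, i.e. something like the following fragment (the molecule name here is only a placeholder):

couple_moltype  = DPC    ; left over from an earlier decoupling setup (placeholder)
free_energy     = no     ; with couple_moltype still set, grompp warns as quoted above

Since no perturbation is actually performed, suppressing the warning with -maxwarn 1 seems harmless, but I would be glad to be corrected.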



