[gmx-users] Re: ci barely out of bounds

David van der Spoel spoel at xray.bmc.uu.se
Sun May 27 20:00:51 CEST 2007

chris.neale at utoronto.ca wrote:
> This email refers to my original posts on this topic:
> http://www.gromacs.org/pipermail/gmx-users/2006-October/024154.html
> http://www.gromacs.org/pipermail/gmx-users/2006-October/024333.html
> I have previously posted some of this information to somebody else's 
> bugzilla post #109 including a possible workaround
> http://bugzilla.gromacs.org/show_bug.cgi?id=109
> I have always compiled my gromacs using one of the gcc3.3* distros. I 
> didn't do anything fancy, just the usual configure, make, make install. I 
> did have trouble with a gromacs version compiled using a gcc4* distro, 
> and somebody on this user list helped me determine that I should 
> just roll back my gcc version. I run on all sorts of computers, opterons 
> and intel chips, 32 and 64 bit, and this particular ci problem is the 
> same for me on all of them.
> Matteo, I want to be sure that we are on the same page here: your ci is 
> just *barely* out of bounds, right? This is a different problem than when 
> your ci is a huge negative number. In that case you have some other 
> problem and your system is exploding.
> There is one test and one workaround included in my bugzilla post. The 
> test is to recompile gromacs with the -DEBUG_PBC flag and see if the 
> problem still occurs. For me this solved the problem (although gromacs 
> runs much slower, so it is not a great workaround). The solution was to 
> remake my system with a few more or a few fewer waters so that the number 
> of grid cells wasn't changing as the volume of the box fluctuates 
> (slightly) during constant pressure simulations.
> I here include the text that I added to that bugzilla post:
> Did you try with a version of mdrun that was compiled with -DEBUG_PBC?
> I have some runs that reliably (but stochastically) give errors about
> an atom being found in a grid cell just one block outside of the
> expected boundary, only in parallel runs, and often other nodes have
> log files indicating that they have just updated the grid size
> (constant pressure simulation). This error disappears when I run with
> a -DEBUG_PBC version. My assumption here is that there is some
> non-blocking MPI communication that is not getting through in time.
> The -DEBUG_PBC version spends a lot of time checking some things and,
> although it never reports having found a problem, I assume that a
> side-effect of these extra calculations is to slow things down enough
> at the proper stage that the MPI message gets through. I have solved
> my problem by adjusting my simulation cell so that it doesn't fall
> close to the grid boundaries. Perhaps you are experiencing some
> analogous problem?
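For intuition, the out-by-one scenario described above can be sketched as a toy one-dimensional calculation. This is illustrative Python only, not GROMACS's actual neighbour-search code; the cell-sizing rule (floor of box length over cutoff) is a simplification of what the real grid code does.

```python
def cell_index(x, box, rlist):
    """Toy 1-D neighbour-search grid: floor(box/rlist) equal-width
    cells; returns (cell index of coordinate x, number of cells).
    Note ncells jumps by one whenever the fluctuating box length
    crosses a multiple of the cutoff rlist."""
    ncells = int(box / rlist)
    return int(x / (box / ncells)), ncells

# Atom safely inside the box: index is within 0..ncells-1.
print(cell_index(5.2, 6.0, 1.0))   # (5, 6) -> in bounds

# Atom sitting exactly on the box edge, e.g. not yet re-wrapped
# after a barostat rescaling: ci == ncells, i.e. "barely out of
# bounds" by exactly one cell.
print(cell_index(6.0, 6.0, 1.0))   # (6, 6) -> one past the last cell
```

The point of the toy is that under pressure coupling both the cell count and the mapping of coordinates to cells depend on the instantaneous box, so a node working from slightly stale information can compute an index one past the valid range.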
> Quoting Matteo Guglielmi <matteo.guglielmi at epfl.ch>:
>> Hello Chris,
>> I have the same problem with gromacs and have not yet understood
>> what's going wrong.
>> I did not try to run a serial job (as you did), but all 7 of my
>> simulations (6 solvated pores in membranes + 1 protein in water, all
>> of them with position restraints, double precision) keep crashing in
>> the same way.
>> Did you finally understand why they crash (in parallel)?
>> How did you compile gromacs?
>> I used the intel compilers (ifort icc icpc 9.1 series) with the
>> following optimization flags: -O3 -unroll -axT.
>> I've also tried the 8.0 series but had no luck getting rid of the problem.
>> I'm running on woodcrest (xeon cpu 5140 2.33GHz) and xeon cpu
>> 3.06GHz.
>> Thanks for your attention,
>> MG

Do you use pressure coupling? In principle that can cause problems when 
combined with position restraints. Furthermore, once again, please try to 
reproduce the problem with gcc as well. If this is related to bugzilla 
109, as Chris suggests, then please let's continue the discussion there.

David van der Spoel, PhD, Assoc. Prof., Molecular Biophysics group,
Dept. of Cell and Molecular Biology, Uppsala University.
Husargatan 3, Box 596,  	75124 Uppsala, Sweden
phone:	46 18 471 4205		fax: 46 18 511 755
spoel at xray.bmc.uu.se	spoel at gromacs.org   http://folding.bmc.uu.se

More information about the gromacs.org_gmx-users mailing list