[gmx-users] Re: ci barely out of bounds

chris.neale at utoronto.ca chris.neale at utoronto.ca
Sun May 27 18:43:48 CEST 2007

This email refers to my original posts on this topic:

I have previously posted some of this information to somebody else's
bugzilla report (#109), including a possible workaround.

I have always compiled my gromacs using one of the gcc 3.3* releases. I
didn't do anything fancy, just the usual configure, make, make install.
I did have trouble with a gromacs version compiled using a gcc 4*
release, and somebody on this user list helped me determine that I
should just roll back my gcc version. I run on all sorts of computers,
opterons and intel chips, 32 and 64 bit, and this particular ci problem
is the same for me on all of them.

Matteo, I want to be sure that we are on the same page here: your ci is
just *barely* out of bounds, right? This is a different problem from the
case where your ci is a huge negative number; there you have some other
problem and your system is exploding.

There is one test and one workaround in my bugzilla post. The test is to
recompile gromacs with the -DEBUG_PBC flag and see if the problem still
occurs. For me this made the problem go away (although gromacs runs much
slower, so it is not a great workaround). The actual solution was to
rebuild my system with a few more or a few fewer waters, so that the
number of grid cells doesn't change as the volume of the box fluctuates
(slightly) during constant pressure simulations.
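For reference, the preprocessor define behind that flag is DEBUG_PBC, so
it is passed to the compiler as -DDEBUG_PBC. A sketch of one way to do
this with the autotools build of that era (the source directory name and
install prefix below are placeholders, and you should confirm the flag
spelling against your version's source):

```shell
# Sketch only: rebuild mdrun with the DEBUG_PBC checks enabled.
# Directory name and prefix are hypothetical; adjust for your tree.
cd gromacs-source-tree
./configure CPPFLAGS="-DDEBUG_PBC" --prefix="$HOME/gromacs-debugpbc"
make
make install
```

Keep this install separate from your production build, since the extra
checking makes it noticeably slower.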

I here include the text that I added to that bugzilla post:
Did you try with a version of mdrun that was compiled with -DEBUG_PBC ?
I have some runs that reliably (but stochastically) give errors about an
atom being found in a grid cell just one block outside the expected
boundary, but only in parallel runs, and often the other nodes' log
files indicate that they have just updated the grid size (constant
pressure simulation). This error does not occur when I run with a
-DEBUG_PBC version. My assumption is that there is some non-blocking MPI
communication that is not getting through in time. The -DEBUG_PBC
version spends a lot of time checking things, and although it never
reports finding a problem, I assume that a side-effect of these extra
checks is to slow things down enough at the right stage that the MPI
message gets through. I have solved my problem by adjusting my
simulation cell so that it doesn't fall close to the grid boundaries.
Perhaps you are experiencing an analogous problem?
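To see why a box edge sitting near a grid boundary makes this error
stochastic, here is a toy calculation (a sketch only, not GROMACS's
actual gridding code; the 1.0 nm cutoff is an assumed value): each grid
cell must be at least the neighbour-search cutoff wide, so the cell
count along an edge is roughly floor(L/rc), and a tiny pressure-coupling
fluctuation in L can flip that count.

```shell
# Toy illustration (not GROMACS code): cells per box edge ~ floor(L / rc).
# rc = 1.0 nm is an assumed cutoff; the L values straddle a grid boundary.
rc=1.0
for L in 5.999 6.000 6.001; do
  awk -v L="$L" -v rc="$rc" \
    'BEGIN { printf "L = %s nm -> %d grid cells\n", L, int(L / rc) }'
done
```

An edge that fluctuates between 5.999 and 6.001 nm flips between 5 and 6
cells, which is exactly the situation that adding or removing a few
waters is meant to avoid.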

Quoting Matteo Guglielmi <matteo.guglielmi at epfl.ch>:

> Hello Chris,
> I have the same problem with gromacs and have not yet understood
> what is going wrong.
> I did not try to run a serial job (as you did), but all my 7 simulations
> (6 solvated pores in membranes + 1 protein in water... all of them
> with positional restraints - double precision) keep crashing in the
> same way.
> Did you finally understand why they crash (in parallel)?
> How did you compile gromacs?
> I used the intel compilers (ifort icc icpc 9.1 series) with the
> following optimization flags: -O3 -unroll -axT.
> I have also tried the 8.0 series, but could not get rid of the problem.
> I'm running on woodcrest (xeon cpu 5140 2.33GHz) and xeon cpu
> 3.06GHz.
> Thanks for your attention,
> MG.
> --
> +----------------------------------------------------------+
> | Address:    Matteo Guglielmi                             |
> |             EPFL-SB-ISIC-LCBC      Ecole Polytechnique   |
> |             BCH 4121               Federale de Lausanne  |
> |             CH-1015 Lausanne                             |
> +----------------------------------------------------------+
> | Contacts:                                                |
> |                                                          |
> | Phone   --> +41-(0)21-6930323                 _____      |
> | Fax     --> +41-(0)21-6930320                 |   |      |
> | Mobile1 --> +41-(0)76-5245216                -------     |
> | Mobile2 --> +39-340-2751607                    0-0       |
> | Web     --> http://lcbcpc21.epfl.ch             c        |
> | EMail1  --> matteo.guglielmi at epfl.ch   +--o00o-----o00o--+
> | EMail2  --> matteo.guglielmi at gmail.com |ICanSeeClearlyNow|
> | ICQ     --> #166904425                 |...TheBrainIsGone|
> +----------------------------------------+-----------------+
