[gmx-users] Gromacs 4.0.2 hangs?

Berk Hess gmx3 at hotmail.com
Mon Jan 5 15:05:33 CET 2009


Hi,

There could be a bug in Gromacs 4.0.2 which causes problems
when running in parallel.
We are investigating this.

For the moment, a much simpler way to run shorter simulations is
the -maxh option of mdrun, which makes the run stop cleanly after
(roughly) the given number of hours.
You don't need tpbconv either; mdrun can read a checkpoint
file directly with the -cpi option.
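
For example (a minimal sketch, assuming a 24-hour queue limit and the
file naming from your mail):

# part 1: stop cleanly at ~99% of 24 h, writing a checkpoint
mpiexec -np 256 mdrun_mpi -s dppc0_1.tpr -maxh 24 -cpo dppc0_1.cpt
# part 2: continue the same run directly from the checkpoint
mpiexec -np 256 mdrun_mpi -s dppc0_1.tpr -cpi dppc0_1.cpt -maxh 24

No tpbconv step is needed between the parts.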

PS: you might want to use mdrun -deffnm dppc0_1, which derives all
input and output file names from a single base name.
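That would replace the separate -s/-o/-c/-e/-x options above, e.g.:

mpiexec -np 256 mdrun_mpi -nosum -v -deffnm dppc0_1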

Berk

> From: ydubief at uvm.edu
> To: gmx-users at gromacs.org
> Date: Mon, 5 Jan 2009 08:10:49 -0500
> Subject: [gmx-users] Gromacs 4.0.2 hangs?
> 
> Dear all,
> 
> I have some difficulties running mdrun_mpi for a large number of
> iterations, typically > 40000. The code hangs, not necessarily at the
> same iteration, for all configurations I run, and this behavior shows
> up under both Linux and Mac OS. The simulations range from 10^4 to
> 10^6 coarse-grained atoms, and I try to keep the amount of data
> generated per run much smaller than 1 GB. I have worked around this
> issue by running 25000-iteration simulations with tpbconv restarts,
> as follows:
> mpiexec -np 256 mdrun_mpi -nosum -v -s dppc0_1.tpr -o dppc0_1.trr \
>     -c dppc0_1.gro -e dppc0_1.edr -x dppc0_1.xtc
> tpbconv -s dppc0_1.tpr -f dppc0_1.trr -e dppc0_1.edr -o dppc0_2.tpr \
>     -extend 250.0
> With such scripts, I have been able to run 10^5 to 10^6 iterations
> without any problem. I was wondering if anyone has experienced similar
> problems and whether I am missing something.
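> 
> Written out, the work-around is a simple restart chain (a minimal bash
> sketch; the loop count of 40 and the dppc0_<n> naming are illustrative):
> 
> #!/bin/bash
> # Chain short runs: each run writes a .trr/.edr pair, then tpbconv
> # extends the .tpr by 250 ps to seed the next run.
> for i in $(seq 1 40); do
>     j=$((i + 1))
>     mpiexec -np 256 mdrun_mpi -nosum -v -s dppc0_${i}.tpr -o dppc0_${i}.trr \
>         -c dppc0_${i}.gro -e dppc0_${i}.edr -x dppc0_${i}.xtc
>     tpbconv -s dppc0_${i}.tpr -f dppc0_${i}.trr -e dppc0_${i}.edr \
>         -o dppc0_${j}.tpr -extend 250.0
> done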
> 
> I have pretty much ruled out a problem with MPI, since I have
> thoroughly tested these computers with other MPI codes. I am now
> wondering if there might be a problem with the output files.
> 
> I run Gromacs 4.0.2 on a Linux cluster (quad-core processors, Myrinet
> and MPICH, compiled with gcc, single precision) on up to 256
> processors, on a Mac Pro with two quad-core CPUs, and on a dual-core
> MacBook using the Fink package for OpenMPI. All these computers have
> enough available disk space for any simulation I run. A typical
> simulation is coarse-grained MD using MARTINI with .mdp files obtained
> from Marrink's website.
> 
> Best,
> 
> Yves
> 
> 
> 
> --
> Yves Dubief, Ph.D., Assistant Professor
> Graduate program coordinator
> University of Vermont, School of Engineering
> Mechanical Engineering Program
> 201 D Votey Bldg, 33 Colchester Ave, Burlington, VT 05405
> Tel: (1) 802 656 1930 Fax: (1) 802 656 3358
> Also:
> Vermont Advanced Computing Center
> 206 Farrell Hall, 210 Colchester Ave, Burlington, VT 05405
> Tel: (1) 802 656 9830 Fax: (1) 802 656 9892
> email: ydubief at uvm.edu
> web: http://www.uvm.edu/~ydubief/
