[gmx-users] [Fwd: mpich - mdrun problem and md using mpich]

David spoel at xray.bmc.uu.se
Wed Aug 10 20:58:12 CEST 2005


-------- Forwarded Message --------
From: Yuhua Song <yhsong at ccb.wustl.edu>
To: gmx-users at gromacs.org
Cc: spoel at gromacs.org
Subject: mpich - mdrun problem and md using mpich
Date: Wed, 10 Aug 2005 10:40:51 -0500


Dear Colleague:

When I try to run Gromacs 3.2.1 on a multiprocessor supercomputer and launch
the simulation on 6 or more processors, it doesn't work. The odd thing is
that it doesn't exit with an error; it simply waits for data that never
arrive and eventually reports a timeout error.

 /opt/mpich/gnu/bin/mpirun -np 6 -nolocal -machinefile machines  /usr/local/bin/mdrun_mpi -np 6 -v
and the output is something like
 starting mdrun 'Generated by genbox'
100 steps,      0.2 ps.
and, after some time, it writes out

p4_1636:  p4_error: Timeout in establishing connection to remote
process:
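
For reference, the "machines" file given to -machinefile is just a plain-text
host list; with the MPICH p4 device each line is a hostname, optionally
followed by ":n" to place n processes on that host, and lines starting with
"#" are comments. A minimal sketch, assuming hypothetical node names
node01-node03, could look like:

 # machines -- hosts for mpirun; "host:n" runs n processes on that host
 node01:2
 node02:2
 node03:2

With a file like this, -np 6 distributes two processes to each node, and
-nolocal keeps the launching host itself out of the set.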

I searched the mailing list and found the same issue reported on 2004-10-18
by Gianluca Santarossa under the title "md using mpich".

David van der Spoel's reply indicated that "there is a problem in mpich that makes that jobs never end, it has to do with interaction between MPI and grid software.
There is a workaround in the CVS code that will shortly be released in beta."
I am wondering whether the current public version 3.2.1 has solved this problem, whether the error instead comes from the MPI installation, or whether I need to download a special patch for this.
I appreciate any input and suggestions.
Thank you very much,
Yuhua
 
-- 
David.
________________________________________________________________________
David van der Spoel, PhD, Assoc. Prof., Molecular Biophysics group,
Dept. of Cell and Molecular Biology, Uppsala University.
Husargatan 3, Box 596,          75124 Uppsala, Sweden
phone:  46 18 471 4205          fax: 46 18 511 755
spoel at xray.bmc.uu.se    spoel at gromacs.org   http://xray.bmc.uu.se/~spoel
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
