[gmx-users] problem with openmpi and gromacs4.0
hui sun
sunhui1018 at yahoo.com.cn
Thu Nov 6 15:04:25 CET 2008
Dear all,
Recently we installed GROMACS 4.0 with OpenMPI for parallel runs. When we start a simulation with np=1, it finishes normally, but when the same tpr file is run with np=n (n>1), I get the following error message:
MPI_ABORT invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode -1
The command I use is: mpirun -np 32 -hostfile mpd.hosts mdrun_mpi -s large.tpr -o large.trr -c large.gro -g large.log
The content of the mpd.hosts file is as follows:
node1 slots=8
.
.
.
nodex slots=8
It should be noted that my tpr file was produced with GROMACS 3.3.3. Could this be related to the error?
I don't know what to do now or how to overcome this. Could you help me?
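For reference, one thing I could try, as a minimal sketch to rule out a version mismatch, is to regenerate the tpr file with grompp from GROMACS 4.0 and rerun the same parallel job (the .mdp, .gro, .top and output file names below are only placeholders for my actual input files):

grompp -f large.mdp -c start.gro -p topol.top -o large_v40.tpr
mpirun -np 32 -hostfile mpd.hosts mdrun_mpi -s large_v40.tpr -o large.trr -c large.gro -g large.log

I am not sure whether this is the right approach, so any advice would be appreciated.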
Thanks in advance!
Hui Sun