[gmx-users] problem with openmpi and gromacs4.0
larsson at xray.bmc.uu.se
Thu Nov 6 17:56:28 CET 2008
On Nov 6, 2008, at 15:04 , hui sun wrote:
> Dear all,
> Recently, we installed GROMACS 4.0 with OpenMPI parallel support.
> When we started a simulation with np=1, the simulation finished
> normally. But when the same tpr file was run with np=n (n>1), I
> got the following error message:
> MPI_ABORT invoked on rank 0 in communicator MPI_COMM_WORLD with
> errorcode -1
> The command I used is: mpirun -np 32 -hostfile mpd.hosts mdrun_mpi -s
> large.tpr -o large.trr -c large.gro -g large.log
> The mpd.hosts file contains:
> node1 slots=8
> nodex slots=8
> Note that my tpr file was produced with GROMACS 3.3.3.
> Could this be related to the error?
Most probably you need to run the same version of grompp as mdrun.
(Another tip: use the option "-deffnm large" instead of -s/-o/-c/-g
with long file names.)
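A minimal sketch of that workflow, assuming GROMACS 4.0's grompp is on the PATH and the usual preprocessing inputs exist (the .mdp, .gro, and .top file names here are illustrative, not from the original post):

```shell
# Regenerate the run input with the same GROMACS version as mdrun_mpi,
# rather than reusing a tpr produced by grompp 3.3.3.
grompp -f large.mdp -c large.gro -p large.top -o large.tpr

# -deffnm sets one default base name for all I/O files,
# replacing the separate -s/-o/-c/-g flags.
mpirun -np 32 -hostfile mpd.hosts mdrun_mpi -deffnm large
```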
> I don't know what to do now or how to overcome this. Could you
> help me?
> Thanks in advance!
> Hui Sun
> gmx-users mailing list gmx-users at gromacs.org