[gmx-users] Running MPI-enabled GROMACS

Choon Peng Chng cpchng at bii-sg.org
Fri Dec 28 10:08:24 CET 2001


Hi fellow GROMACS users,

  I've configured and installed GROMACS 3.0.5 with MPI enabled. After
many hard knocks, I succeeded in building it against MPICH on a dual
PIII-800MHz Linux cluster instead of the default LAM :)

However, I'm only able to run mdrun with -np 1, and then I get benchmark
timings similar to the serial version:
mdrun -s topol.tpr -o traj.trr -c confout.gro -g md.log -np 1

But once I specify -np 2, I get:

Fatal error: run input file topol.tpr was made for 2 nodes,
             while mdrun expected it to be for 1 nodes.
[0] MPI Abort by user Aborting program !
[0] Aborting program!
p0_8012:  p4_error: : -1
    p4_error: latest msg from perror: No such file or directory
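For completeness, here is roughly how I prepared the run input file. As
I understand it, grompp's -np flag fixes the node count that gets baked
into the .tpr, which mdrun then checks against the number of MPI
processes it was actually started with (the .mdp/.gro/.top file names
below are just from my setup):

```shell
# grompp -np 2 records "2 nodes" inside topol.tpr;
# mdrun must then be started as 2 MPI processes to match.
grompp -np 2 -f grompp.mdp -c conf.gro -p topol.top -o topol.tpr
```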

In fact, Alan Wilter Sousa da Silva (alan at biof.ufrj.br) asked a
similar question back in September (Wed, 5 Sep 2001 09:44:19 -0300):

mdrun_mpi -s water.tpr -o water.trr -c water_out.gro -v -g water.log -np 2
Fatal error: run input file water.tpr was made for 2 nodes,
             while mdrun_mpi expected it to be for 1 nodes.

Dr. David van der Spoel advised him to use
mpirun -c 2 -c2c -O mdrun_mpi [options]
instead.

But Alan still could not get it to run, and the issue seems to have been
left unsettled.

In my case, I cannot run mpirun with mdrun as the executable. Is that
because I'm using MPICH? I tried adding the MPI library directory to
LD_LIBRARY_PATH, but it had no effect.
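For the record, what I tried is roughly the following. The -c2c and -O
flags above are LAM-specific, I believe, so this is my guess at the
MPICH equivalent (mpirun -np to set the process count, which should
match the -np the .tpr was made for):

```shell
# MPICH-style launch: start 2 MPI processes, each running mdrun
mpirun -np 2 mdrun -s topol.tpr -o traj.trr -c confout.gro -g md.log -np 2
```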

How do I get GROMACS to run in parallel using MPI? Has anyone else faced
the same problem? Can anyone help?


Thanks so much.
regards,
choon peng
--
Choon-Peng Chng
Scientific Programmer
BioInformatics Institute (BII)
30 Medical Drive, Level 1 IMCB Building, Singapore 117609
Tel: (65)8746173
