[gmx-users] Perplexed about MPI
dr.horsfield at physics.org
Tue Oct 29 18:00:27 CET 2002
Hi,
I have begun to despair of MPI! I have LAM-MPI 6.5.7 installed. It passes
the test suite perfectly, and the Pallas MPI benchmarks run fine. I have
GROMACS 3.1.4 installed, compiled with MPI support. A particular small job
runs fine on 2 processors (either on the same box, or on two boxes), but
falls over immediately on 3 or 4 processors, with the error:
MPI process rank 0 (n0, p3603) caught a SIGSEGV.
-----------------------------------------------------------------------------
One of the processes started by mpirun has exited with a nonzero exit
code. This typically indicates that the process finished in error.
If your process did not finish in error, be sure to include a "return
0" or "exit(0)" in your C code before exiting the application.
PID 3603 failed on node n0 with exit status 1.
-----------------------------------------------------------------------------
Could the problem be that the number of atoms assigned to each node is not
the same when running on 3 or 4 processors? Any suggestions?
Cheers,
Andrew