[gmx-users] PME, MPI and bproc
Jason DeJoannis
jdejoan at emory.edu
Thu Apr 24 18:30:01 CEST 2003
Dear Gromacs Users,
We get a "SIGSEGV" crash on each node when
using PME in parallel. This occurs on a
Beowulf Linux cluster that uses "bproc" and
"bpsh" to launch remote processes. The crash
is quite general: it happens even with a
simple box of water, at several values of
the mesh spacing and interpolation order.
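For concreteness, the PME settings varied in such a test
would live in the .mdp file; a minimal fragment might look
like this (the values below are illustrative, not the exact
ones from our runs):

```
; Electrostatics via particle-mesh Ewald
coulombtype     = PME
rcoulomb        = 0.9     ; nm, real-space cutoff
fourierspacing  = 0.12    ; nm, mesh spacing (varied in tests)
pme_order       = 4       ; interpolation order (varied in tests)
ewald_rtol      = 1e-5    ; relative strength at the cutoff
```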
We have searched the gromacs archives and
suspect it has something to do with MPICH
itself. As a test of this hypothesis we have
considered using LAMMPI instead. We have
installed LAMMPI, FFTW and GROMACS on a
workstation, and PME works great. However,
the installation of LAMMPI on our bproc
Linux cluster fails. From reading the
lam-mpi.org archives it is apparent that
bproc support is only in an experimental
phase, and moreover does not work with older
versions of bproc.
Our cluster uses:
bproc-3.1.9
mpich-1.2.2.3
fftw-2.1.3-2
gromacs-3.1.3-1
Suggestions? Has anyone succeeded in running
parallel PME simulations on a bproc cluster?
Best regards,
Jason
---
Jason de Joannis, Ph.D.
Chemistry Department, Emory University
1515 Pierce Dr. NE, Atlanta, GA 30322
Phone: (404) 712-2983
Email: jdejoan at emory.edu
http://userwww.service.emory.edu/~jdejoan