[gmx-users] Re: Segmentation fault, mdrun_mpi

Ladasky blind.watchmaker at yahoo.com
Mon Oct 8 10:39:25 CEST 2012


Justin Lemkul wrote:
> My first guess would be a buggy MPI implementation.  I can't comment on
> hardware specs, but usually the random failures seen in mdrun_mpi are a
> result of some generic MPI failure.  What MPI are you using?

I am using the OpenMPI package, version 1.4.3.  It's one of three MPI
implementations included in the standard repositories of Ubuntu Linux
11.10.  I can also obtain MPICH2 and gromacs-mpich without jumping
through too many hoops.  It looks like LAM is also available; however,
if GROMACS needs a special package to interface with LAM, it's not in
the repositories.
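
For what it's worth, here is roughly how I have been checking which MPI
mdrun_mpi is actually using, and how I would swap implementations on
Ubuntu (the apt package names are from memory of the repositories, so
treat them as approximate):

  # Check which MPI launcher and version are on the PATH
  mpirun --version

  # Check which MPI libraries mdrun_mpi was linked against
  ldd $(which mdrun_mpi) | grep -i mpi

  # Swap OpenMPI for MPICH2 (package names approximate)
  sudo apt-get remove openmpi-bin libopenmpi-dev
  sudo apt-get install mpich2 gromacs-mpich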

Alternatively, I could drop the external MPI for now and just use the
new built-in multi-threading that GROMACS runs with by default.  I was
trying to prepare for longer runs on a cluster, however.  If those runs
are going to crash, I had better know about it now.
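
Concretely, the two modes I am comparing look something like this ("md"
is just a placeholder for my actual run files):

  # Built-in thread-MPI: no external MPI library involved
  mdrun -nt 4 -deffnm md

  # External MPI: four processes launched through mpirun
  mpirun -np 4 mdrun_mpi -deffnm md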





