[gmx-users] REMD: mdrun_mpi crash with segmentation fault (but mpi is working)
mark.j.abraham at gmail.com
Fri Feb 6 11:35:37 CET 2015
What was the last thing written to the log files?
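For example, assuming the default log file name (md.log) inside the directory (or directories) handed to -multidir, something like this shows where each replica got to before the crash (the path here is only a guess based on the command quoted below):

  # print the last 30 lines of every replica's log
  tail -n 30 equil*/md.log

A sketch of what a typical per-replica -multidir launch looks like is appended after the quoted message, for reference.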
On Tue, Feb 3, 2015 at 2:01 PM, Felipe Villanelo <el.maestrox at gmail.com> wrote:
> I am trying to learn REMD following the tutorial on the gromacs page, using
> a 4-core computer.
> However when I try to use the command:
> mpirun -np 4 mdrun_mpi -v -multidir equil (as the tutorial says)
> the program crashed with the following error:
> mpirun noticed that process rank 2 with PID 13013 on node debian exited on
> signal 11 (Segmentation fault).
> MPI runs fine with the 4 cores if I run a simple gromacs
> simulation (NPT equilibration) on the same machine.
> So I think it is not a problem of MPI configuration (as I read in another
> thread). This is with gromacs version 5.0.2.
> If I try to run the same with an older version of gromacs (4.5.5) the error
> is different (after adjusting the options in the mdp file to match
> changes in syntax between versions):
> [debian:23526] *** An error occurred in MPI_Comm_size
> [debian:23526] *** on communicator MPI_COMM_WORLD
> [debian:23526] *** MPI_ERR_COMM: invalid communicator
> [debian:23526] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
> But this version also works fine with MPI using the 4 cores on a simple
> simulation.
> Felipe Villanelo Lizana
> Laboratorio de Biología Estructural y Molecular
> Universidad de Chile
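For reference, mdrun -multidir distributes the MPI ranks evenly over the listed directories, one simulation per directory, so an -np 4 replica-exchange run would normally name four directories, each containing its own .tpr file. A minimal sketch, assuming hypothetical directory names equil_0 through equil_3 and a placeholder exchange-attempt interval for -replex (the tutorial's actual names and settings may differ):

  # one directory per replica, each holding its own topol.tpr
  mpirun -np 4 mdrun_mpi -v -multidir equil_0 equil_1 equil_2 equil_3 -replex 500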