[gmx-users] REMD: mdrun_mpi crash with segmentation fault (but mpi is working)
Felipe Villanelo
el.maestrox at gmail.com
Tue Feb 10 13:36:28 CET 2015
Absolutely nothing is written in the log file, just the citations.
Felipe Villanelo Lizana
Biochemist
Laboratorio de Biología Estructural y Molecular
Universidad de Chile
2015-02-03 10:01 GMT-03:00 Felipe Villanelo <el.maestrox at gmail.com>:
> Hi,
>
> I am trying to learn REMD following the tutorial on the GROMACS page
> <http://www.gromacs.org/Documentation/Tutorials/GROMACS_USA_Workshop_and_Conference_2013/An_introduction_to_replica_exchange_simulations%3A_Mark_Abraham,_Session_1B>
> on a 4-core computer.
> However, when I try to use the command:
> mpirun -np 4 mdrun_mpi -v -multidir equil[0123] (as the tutorial says)
> the program crashes with the following error:
> mpirun noticed that process rank 2 with PID 13013 on node debian exited on
> signal 11 (Segmentation fault).
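>
> In case it is useful, the replica setup follows the tutorial layout, roughly
> like this (just a sketch; topol.tpr is the default run-input name mdrun
> looks for in each directory):
>
>     # one directory per replica, each containing its own run input file
>     ls equil0/topol.tpr equil1/topol.tpr equil2/topol.tpr equil3/topol.tpr
>     # the shell expands equil[0123] to the four directories, i.e. the
>     # command above is equivalent to:
>     mpirun -np 4 mdrun_mpi -v -multidir equil0 equil1 equil2 equil3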
>
> MPI runs fine with the 4 cores if I run a simple GROMACS simulation
> (NPT equilibration) on the same machine, so I think it is not a problem
> with the MPI configuration (as I read in another thread).
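>
> (For comparison, that working single-simulation run was launched with a
> command along these lines; the -deffnm name here is just a placeholder:)
>
>     mpirun -np 4 mdrun_mpi -v -deffnm npt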
>
> This is with GROMACS version 5.0.2.
>
> If I try to run the same with an older version of GROMACS (4.5.5), after
> adjusting the options in the .mdp file to match the changes in syntax
> between versions, the error is different:
>
> [debian:23526] *** An error occurred in MPI_comm_size
> [debian:23526] *** on communicator MPI_COMM_WORLD
> [debian:23526] *** MPI_ERR_COMM: invalid communicator
> [debian:23526] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
>
> But this version also works fine with MPI using the 4 cores on a simple
> simulation.
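>
> If it helps to rule out a mismatch between the MPI library mdrun_mpi was
> built against and the mpirun I am calling, a minimal check (assuming Open
> MPI on this Debian machine) would be something like:
>
>     which mpirun                          # launcher actually found in PATH
>     mpirun --version                      # version of that launcher
>     ldd $(which mdrun_mpi) | grep -i mpi  # MPI library mdrun_mpi is linked to
>     mdrun_mpi -version                    # GROMACS version and build info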
>
> Thanks
> Bye
>
> Felipe Villanelo Lizana
> Biochemist
> Laboratorio de Biología Estructural y Molecular
> Universidad de Chile
>