[gmx-users] problems in MPI run for REMD in GROMACS

Terry terrencesun at gmail.com
Tue May 3 10:23:39 CEST 2016


On Tue, May 3, 2016 at 2:50 PM, Pabitra Mohan <uniquepabs at gmail.com> wrote:

> Dear Gromacs Users,
> Can anybody suggest a solution to the problem I am seeing during an REMD
> run in GROMACS?
>
> While executing the command below, these error messages appear:
>
> pabitra at pabitra-Dell-System-XPS-L502X:~/Desktop/ctld_remd/stage2$ mpirun
> -np 4 mdrun_mpi -v -multidir sim0 sim1 sim2 sim3 sim4 -replex 500
> GROMACS:    mdrun_mpi, VERSION 5.0.6
>
> GROMACS is written by:
> Emile Apol         Rossen Apostolov   Herman J.C. Berendsen  Par Bjelkmar
> Aldert van Buuren  Rudi van Drunen    Anton Feenstra     Sebastian Fritsch
> Gerrit Groenhof    Christoph Junghans Peter Kasson       Carsten Kutzner
> Per Larsson        Justin A. Lemkul   Magnus Lundborg    Pieter Meulenhoff
> Erik Marklund      Teemu Murtola      Szilard Pall       Sander Pronk
> Roland Schulz      Alexey Shvetsov    Michael Shirts     Alfons Sijbers
> Peter Tieleman     Christian Wennberg Maarten Wolf
> and the project leaders:
> Mark Abraham, Berk Hess, Erik Lindahl, and David van der Spoel
>
> Copyright (c) 1991-2000, University of Groningen, The Netherlands.
> Copyright (c) 2001-2014, The GROMACS development team at
> Uppsala University, Stockholm University and
> the Royal Institute of Technology, Sweden.
> check out http://www.gromacs.org for more information.
>
> GROMACS is free software; you can redistribute it and/or modify it
> under the terms of the GNU Lesser General Public License
> as published by the Free Software Foundation; either version 2.1
> of the License, or (at your option) any later version.
>
> GROMACS:      mdrun_mpi, VERSION 5.0.6
> Executable:   /usr/bin/mdrun_mpi.openmpi
> Library dir:  /usr/share/gromacs/top
> Command line:
>   mdrun_mpi -v -multidir sim0 sim1 sim2 sim3 sim4 -replex 500
>
>
> -------------------------------------------------------
> Program mdrun_mpi, VERSION 5.0.6
> Source code file:
> /build/gromacs-EDw7D5/gromacs-5.0.6/src/gromacs/gmxlib/main.cpp, line: 383
>
>
Hi,

The real problem is here: you asked mpirun for 4 ranks (-np 4) but gave
-multidir 5 directories, and mdrun requires the number of ranks to be a
multiple of the number of simulations. This is by design, not a bug. Do
what the error message suggests and match the rank count to the replica
count.
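
A minimal fix, assuming you want one MPI rank per replica (any multiple
of 5 would also work, e.g. -np 10 for two ranks per replica):

    mpirun -np 5 mdrun_mpi -v -multidir sim0 sim1 sim2 sim3 sim4 -replex 500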

Cheers
Terry


> Fatal error:
> The number of ranks (4) is not a multiple of the number of simulations (5)
> For more information and tips for troubleshooting, please check the GROMACS
> website at http://www.gromacs.org/Documentation/Errors
> -------------------------------------------------------
>
> Error on rank 2, will try to stop all ranks
> Halting parallel program mdrun_mpi on CPU 2 out of 4
>
> [The same fatal error and halt messages are repeated for ranks 0, 1, and 3.]
>
> gcq#11: "She's Not Bad, She's Just Genetically Mean" (Captain Beefheart)
>
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
> with errorcode -1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
>
> --------------------------------------------------------------------------
> mpirun has exited due to process rank 2 with PID 3232 on
> node pabitra-Dell-System-XPS-L502X exiting improperly. There are two
> reasons this could occur:
>
> 1. this process did not call "init" before exiting, but others in
> the job did. This can cause a job to hang indefinitely while it waits
> for all processes to call "init". By rule, if one process calls "init",
> then ALL processes must call "init" prior to termination.
>
> 2. this process called "init", but exited without calling "finalize".
> By rule, all processes that call "init" MUST call "finalize" prior to
> exiting or it will be considered an "abnormal termination"
>
> This may have caused other processes in the application to be
> terminated by signals sent by mpirun (as reported here).
> --------------------------------------------------------------------------
> [pabitra-Dell-System-XPS-L502X:03229] 3 more processes have sent help
> message help-mpi-api.txt / mpi-abort
> [pabitra-Dell-System-XPS-L502X:03229] Set MCA parameter
> "orte_base_help_aggregate" to 0 to see all help / error messages
>
>
> Thank you very much for helping
>
> --
> Dr. Pabitra Mohan Behera
> National Post-Doctoral Fellow (DST-SERB)
> Computational Biology and Bioinformatics Lab
> Institute of Life Sciences, Bhubaneswar
> An Autonomous Organization of
> Department of Biotechnology, Government of India
> Nalco Square, Bhubaneswar
> Odisha, Pin-751023 (India)
> Mobile No: +91-9776503664, +91-9439770247
> E-mail: pabitra at ils.res.in, uniquepabs at gmail.com