[gmx-users] How to set more mpi for the REMD run?

Paul Bauer paul.bauer.q at gmail.com
Mon Nov 11 15:19:21 CET 2019


Hello,

yes, this is indeed possible (see 
http://manual.gromacs.org/current/user-guide/mdrun-features.html#running-multi-simulations 
for more information on multi simulations with GROMACS).
It is just that the tutorial system is too small to be distributed 
correctly with domain decomposition. If you use a larger system, that 
will work.
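
For a larger system, the 12-slot layout from your script would then look 
like this (a sketch only; the scheduler directive and the gerun wrapper 
are taken from your Stage 1 job, and the rank counts are just an 
illustration):

#$ -pe mpi 12
gerun mdrun_mpi -v -multidir ./equil[0123]

mdrun splits the 12 MPI ranks evenly over the 4 directories, so each 
replica gets 3 ranks and therefore needs a 3-domain decomposition. With 
the small tutorial system you can instead keep one MPI rank per replica 
and use the extra cores as OpenMP threads (mdrun's -ntomp option); how 
to request the matching slot layout depends on your cluster setup.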

Cheers

Paul

On 11/11/2019 14:57, ZHANG Cheng wrote:
> Thank you Paul! I want to use more than one MPI process for each of the REMD replicas; would that be possible?
>
>
>
>
> ------------------ Original ------------------
> From:&nbsp;"ZHANG Cheng"<272699575 at qq.com&gt;;
> Date:&nbsp;Mon, Nov 11, 2019 09:39 PM
> To:&nbsp;"gromacs.org_gmx-users"<gromacs.org_gmx-users at maillist.sys.kth.se&gt;;
>
> Subject: How to set more mpi for the REMD run?
>
>
>
> I am using the same files as in Mark Abraham's REMD tutorial, except with a recent GROMACS version (gromacs/2019.3): http://www.gromacs.org/Documentation/Tutorials/GROMACS_USA_Workshop_and_Conference_2013/An_introduction_to_replica_exchange_simulations%3A_Mark_Abraham%2C_Session_1B
>
>
> For Stage 1 of the tutorial, when "#$ -pe mpi 4" is used, the command runs successfully:
> gerun mdrun_mpi -v -multidir ./equil[0123]
>
>
> However, when "#$ -pe mpi 12" is used with the same command, I get the error message below. How can I properly set more MPI ranks?
>
>
>
>
> Program:     mdrun_mpi, version 2019.3
> Source file: src/gromacs/domdec/domdec.cpp (line 2403)
> MPI rank:    6 (out of 12)
>
>
> Fatal error:
> There is no domain decomposition for 3 ranks that is compatible with the given
> box and a minimum cell size of 0.8875 nm
> Change the number of ranks or mdrun option -rcon or -dds or your LINCS
> settings
> Look in the log file for details on the domain decomposition
>
>
> For more information and tips for troubleshooting, please check the GROMACS
> website at http://www.gromacs.org/Documentation/Errors
>
>
> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 6


-- 
Paul Bauer, PhD
GROMACS Release Manager
KTH Stockholm, SciLifeLab
0046737308594


