[gmx-users] How to set more mpi for the REMD run?

Paul Bauer paul.bauer.q at gmail.com
Mon Nov 11 14:44:30 CET 2019


Hello,

you have set the MPI options correctly (I think), but the small test 
system used for the tutorial cannot be split up over more domains (as 
the error message tells you).
The idea in the tutorial is to run one MPI process per system. Are you 
trying to do the same (and run 12 different simulations), or do you 
want to use several MPI processes for each of the REMD simulations?
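
In case it is the latter, here is a minimal sketch of the options, 
assuming a generic mpirun launcher (your gerun wrapper should behave 
the same way) and an mdrun_mpi build with OpenMP support:

    # One rank per replica (what the tutorial assumes): no domain
    # decomposition is needed within a replica.
    mpirun -np 4 mdrun_mpi -v -multidir ./equil[0123]

    # 12 ranks over 4 replicas: each replica gets 3 ranks, so each
    # small box must be decomposed into 3 domains, which is exactly
    # what fails for the tiny tutorial system.
    mpirun -np 12 mdrun_mpi -v -multidir ./equil[0123]

    # To use 12 cores on this small system anyway, keep one rank per
    # replica and add OpenMP threads instead of extra domains:
    mpirun -np 4 mdrun_mpi -v -multidir ./equil[0123] -ntomp 3

With -multidir, mdrun always splits the ranks evenly over the 
directories, so the total rank count must be a multiple of the number 
of replicas.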

Cheers

Paul

On 11/11/2019 14:39, ZHANG Cheng wrote:
> I am using the same files from Mark Abraham's REMD tutorial, but with a recent GROMACS version (gromacs/2019.3):
> http://www.gromacs.org/Documentation/Tutorials/GROMACS_USA_Workshop_and_Conference_2013/An_introduction_to_replica_exchange_simulations%3A_Mark_Abraham%2C_Session_1B
>
> For Stage 1 of the tutorial, when "#$ -pe mpi 4" is used, it runs successfully:
> gerun mdrun_mpi -v -multidir ./equil[0123]
>
> However, when "#$ -pe mpi 12" is used with the same command, I get the error message below. Can I ask how to properly set more MPI processes?
>
> Program:     mdrun_mpi, version 2019.3
> Source file: src/gromacs/domdec/domdec.cpp (line 2403)
> MPI rank:    6 (out of 12)
>
> Fatal error:
> There is no domain decomposition for 3 ranks that is compatible with the given
> box and a minimum cell size of 0.8875 nm
> Change the number of ranks or mdrun option -rcon or -dds or your LINCS
> settings
> Look in the log file for details on the domain decomposition
>
> For more information and tips for troubleshooting, please check the GROMACS
> website at http://www.gromacs.org/Documentation/Errors
>
> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 6


-- 
Paul Bauer, PhD
GROMACS Release Manager
KTH Stockholm, SciLifeLab
0046737308594


