[gmx-users] mdrun domain decomposition in GROMACS 5 and GROMACS 4

Justin Lemkul jalemkul at vt.edu
Thu Jun 18 00:29:55 CEST 2015



On 6/17/15 6:18 PM, Rebeca García Fandiño wrote:
> Dear Gromacs users,
> I am trying to carry out an mdrun simulation (mdrun -v -deffnm prod) and, depending on the version and the number of processors I use, I am getting these errors:
>
> 1) Using GROMACS 5.0.4 (AMD architecture) and 2 or 4 processors, the calculation is OK.
>
> 2) Using GROMACS 4.6.2 (Sandy Bridge architecture) and 2, 4, or 8 processors, the calculation is also OK.
>
> 3) Using GROMACS 5.0.4 (AMD architecture) and 8 processors, I get this error:
>
> Program mdrun, VERSION 5.0.4
> Source code file: /root/gromacs-5.0.4/src/gromacs/mdlib/domdec.c, line: 6902
>
> Fatal error:
> There is no domain decomposition for 8 ranks that is compatible with the given box and a minimum cell size of 0.959375 nm
> Change the number of ranks or mdrun option -rcon or -dds or your LINCS settings
>
>
> 4) Using GROMACS 4.6.2 (Sandy Bridge architecture) and 16 processors, I get this error, too:
>
> Fatal error:
> There is no domain decomposition for 16 nodes that is compatible with the given box and a minimum cell size of 0.959375 nm
> Change the number of nodes or mdrun option -rcon or -dds or your LINCS settings
>
> What could be the reason for the errors? I assume this is not due to the system itself, since it works well with a lower number of processors.
>

http://www.gromacs.org/Documentation/Errors#There_is_no_domain_decomposition_for_n_nodes_that_is_compatible_with_the_given_box_and_a_minimum_cell_size_of_x_nm

Your system is too small to make use of 8 processors with DD.
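
If you still want to occupy all 8 cores on that machine, one possible workaround (just a sketch, assuming a thread-MPI build of GROMACS 5.0.4; check mdrun -h for the options your binary actually supports) is to keep the number of DD ranks at a value that already works for your box and put the remaining cores into OpenMP threads, e.g.:

    # 4 domain-decomposition ranks x 2 OpenMP threads each = 8 cores
    mdrun -ntmpi 4 -ntomp 2 -v -deffnm prod

That keeps the DD grid at 4 cells, which your box can accommodate, while still using 8 cores.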

-Justin

-- 
==================================================

Justin A. Lemkul, Ph.D.
Ruth L. Kirschstein NRSA Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 629
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalemkul at outerbanks.umaryland.edu | (410) 706-7441
http://mackerell.umaryland.edu/~jalemkul

==================================================

