[gmx-users] domain decomposition and load balancing

Justin A. Lemkul jalemkul at vt.edu
Sat Feb 20 00:11:22 CET 2010



Amit Choubey wrote:
> Hi Mark,
> 
> I am not using PME.
> 
> I was hoping mdrun would do the cell allocation itself.
> 

It will, unless it can't, which is exactly your problem.  Mark's point stands, 
regardless of whether or not you're using PME.  DD imposes certain minimum cell 
size requirements (discussed in the manual and the GROMACS 4 paper), so you have 
two choices:

1. Read about the options mdrun is telling you about.
2. Use fewer nodes so that the DD algorithm can construct reasonably sized domains (see the sketch below).
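
A rough sketch of both options, assuming an MPI launcher and the mdrun_mpi 
binary from this thread (the file names are placeholders):

   # Option 2: fewer MPI ranks, so each DD cell can stay above the minimum size
   mpirun -np 36 mdrun_mpi -deffnm md

   # Alternatively, keep 64 ranks but request a specific DD grid with -dd;
   # this only helps if every cell in that grid still meets the minimum size
   mpirun -np 64 mdrun_mpi -deffnm md -dd 8 8 1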

-Justin

> Thanks,
> Amit
> 
> 
> On Fri, Feb 19, 2010 at 2:27 PM, Mark Abraham <mark.abraham at anu.edu.au> wrote:
> 
>     ----- Original Message -----
>     From: Amit Choubey <kgp.amit at gmail.com>
>     Date: Saturday, February 20, 2010 8:51
>     Subject: [gmx-users] domain decomposition and load balancing
>     To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> 
>      > Hi Everyone,
>      > I am trying to run a simulation with the option "pbc=xy" turned
>     on. I am using 64 processors for the simulation. mdrun_mpi prints the
>     following error message before starting the MD steps:
>      >
>      > There is no domain decomposition for 64 nodes that is compatible
>      > with the given box and a minimum cell size of 0.889862 nm
>      > Change the number of nodes or mdrun option -rdd or -dds
>      > Look in the log file for details on the domain decomposition
>      > This has to do with the load balancing in the domain
>     decomposition version of mdrun. Can anyone suggest how to set the
>     options -rdd or -dds?
> 
>     Those options are not normally the problem - but see the log file
>     for info and mdrun -h for instructions.
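> 
>     If you do want to experiment with them, they are just extra mdrun
>     flags; the values below are purely illustrative (and the file names
>     are placeholders), not recommendations:
> 
>       # -rdd: maximum distance assumed for bonded interactions across cells (nm)
>       mpirun -np 64 mdrun_mpi -deffnm md -rdd 1.0
>       # -dds: minimum allowed scaling of the DD cell size under dynamic load balancing
>       mpirun -np 64 mdrun_mpi -deffnm md -dds 0.8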
> 
>     You should read up in the manual about domain decomposition, and look
>     at choosing npme such that 64-npme is sufficiently composite that you
>     can make a reasonably compact 3D grid, so that the minimum cell size
>     is not a constraint. Cells have to be large enough that all nonbonded
>     interactions can be resolved by consulting at most nearest-neighbour
>     cells (plus some other constraints).
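> 
>     A quick way to sanity-check that arithmetic, assuming GNU coreutils
>     is available on your machine, is the factor utility:
> 
>       $ factor 45
>       45: 3 3 5
>       $ factor 36
>       36: 2 2 3 3
> 
>     36 breaks into small primes, so it can be arranged as a compact 6x6x1
>     or 4x3x3 grid, whereas the best 45 allows is 9x5x1 or 5x3x3.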
> 
>     I'm assuming pbc=xy requires a 2D DD. For example, npme=19 leaves
>     npp=45, which gives 9x5x1, but npme=28 leaves npp=36, which gives
>     6x6x1 and allows the cells to have the smallest diameter possible.
>     Of course, if your simulation box is so small that the 2D DD for
>     pbc=xy will always lead to slabs that are too small in one dimension,
>     then you can't solve this problem with DD.
> 
>     If pbc=xy permits a 3D DD, then the same considerations apply.
>     npme=19 gives 5x3x3, but npme=28 allows 4x3x3.
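> 
>     As a sketch of what that looks like on the command line (file names
>     are placeholders, and -npme is only relevant if PME is actually in
>     use):
> 
>       mpirun -np 64 mdrun_mpi -deffnm md -npme 28            # 36 PP ranks; mdrun picks the grid
>       mpirun -np 64 mdrun_mpi -deffnm md -npme 28 -dd 6 6 1  # or force the 2D 6x6x1 layout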
> 
>      > Also the simulation runs fine on one node (with domain
>     decomposition) and with particle decomposition, but both are
>     extremely slow.
> 
>     Well, that's normal...
> 
>     Mark
> 
> 

-- 
========================================

Justin A. Lemkul
Ph.D. Candidate
ICTAS Doctoral Scholar
MILES-IGERT Trainee
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin

========================================


