[gmx-users] domain decomposition and load balancing

Amit Choubey kgp.amit at gmail.com
Fri Feb 19 23:41:45 CET 2010


Hi Mark,

I am not using PME.

I was hoping mdrun would do the cell allocation itself.

Thanks,
Amit


On Fri, Feb 19, 2010 at 2:27 PM, Mark Abraham <mark.abraham at anu.edu.au> wrote:

> ----- Original Message -----
> From: Amit Choubey <kgp.amit at gmail.com>
> Date: Saturday, February 20, 2010 8:51
> Subject: [gmx-users] domain decomposition and load balancing
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>
> > Hi Everyone,
> >
> > I am trying to run a simulation with the option "pbc=xy" turned on,
> > using 64 processors. mdrun_mpi emits the following error message before
> > starting the MD steps:
> >
> > There is no domain decomposition for 64 nodes that is compatible with
> > the given box and a minimum cell size of 0.889862 nm
> > Change the number of nodes or mdrun option -rdd or -dds
> > Look in the log file for details on the domain decomposition
> >
> > This has to do with the load balancing in the domain decomposition
> > version of mdrun. Can anyone suggest how to set the options -rdd or -dds?
>
> Those options are not normally the problem - but see the log file for info
> and mdrun -h for instructions.
>
> You should read up on domain decomposition in the manual, and look at
> choosing npme such that 64 - npme is composite enough that you can make a
> reasonably compact 3D grid, so that the minimum cell size is not a
> constraint. Cells have to be large enough that all nonbonded interactions
> can be resolved in consultation with at most nearest-neighbour cells (and
> some other constraints); see the sketch below.
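>
> As a minimal sketch of that arithmetic (plain Python, nothing
> GROMACS-specific; the 64-node total and the npme candidates below are
> just this thread's numbers), you can enumerate the factorizations of
> npp = 64 - npme and keep the most cube-like grid:
>
>   # Enumerate 3D grids nx*ny*nz = n and pick the most compact one.
>   # "Compact" = smallest max/min side ratio, a rough proxy for
>   # maximising the minimum cell diameter.
>   def grids(n, flat=False):
>       for nx in range(1, n + 1):
>           if n % nx:
>               continue
>           for ny in range(1, n // nx + 1):
>               if (n // nx) % ny == 0:
>                   nz = n // nx // ny
>                   if not flat or nz == 1:  # flat=True: 2D DD, nz must be 1
>                       yield (nx, ny, nz)
>
>   def most_compact(n, flat=False):
>       return min(grids(n, flat), key=lambda g: max(g) / min(g))
>
>   for npme in (19, 28):
>       npp = 64 - npme
>       print(npme, npp, most_compact(npp, flat=True), most_compact(npp))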
>
> I'm assuming pbc=xy requires a 2D DD. For example, npme=19 gives npp=45,
> which factorizes as 9x5x1, but npme=28 gives npp=36, which factorizes as
> 6x6x1 and allows the cells to have the smallest diameter possible. Of
> course, if your simulation box is so small that the 2D DD for pbc=xy
> always leads to slabs that are too thin in one dimension, then you can't
> solve this problem with DD.
>
> If pbc=xy permits a 3D DD, then the same considerations apply: npme=19
> gives 5x3x3, but npme=28 allows 4x3x3.
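>
> (The sketch above, run without the flat restriction, gives the same
> comparison: npp=45 yields a best grid of 3x3x5, i.e. the 5x3x3 here up
> to axis order, while npp=36 yields 3x3x4.)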
>
> > Also, the simulation runs fine on one node (with domain decomposition)
> > and with particle decomposition, but both are extremely slow.
>
> Well, that's normal...
>
> Mark