[gmx-users] Domain decomposition for parallel simulations
Mark Abraham
mark.j.abraham at gmail.com
Sat Feb 10 02:39:56 CET 2018
Hi,
On Fri, Feb 9, 2018, 17:15 Kevin C Chan <cchan2242-c at my.cityu.edu.hk> wrote:
> Dear Users,
>
> I have encountered the problem of "There is no domain decomposition for n
> nodes that is compatible with the given box and a minimum cell size of x
> nm". By reading through the GROMACS website and some threads, I understand
> that the problem might be caused by too many ranks breaking the system
> into cells that are too small. However, I have no idea how to estimate
> suitable parallelization parameters. I hope someone can share their
> experience.
>
> Here is the information from the log file:
> *Initializing Domain Decomposition on 4000 ranks*
> *Dynamic load balancing: on*
> *Will sort the charge groups at every domain (re)decomposition*
> *Initial maximum inter charge-group distances:*
> * two-body bonded interactions: 0.665 nm, Dis. Rest., atoms 23558 23590*
> * multi-body bonded interactions: 0.425 nm, Proper Dih., atoms 12991
> 12999*
> *Minimum cell size due to bonded interactions: 0.468 nm*
> *Maximum distance for 5 constraints, at 120 deg. angles, all-trans: 0.819
> nm*
> *Estimated maximum distance required for P-LINCS: 0.819 nm*
>
Here we see mdrun report how large it needs to make the domains so that
they can do their job - in this case the P-LINCS estimate (0.819 nm) is
more demanding than the 0.468 nm bonded limit, so it sets the minimum cell
size.
> *This distance will limit the DD cell size, you can override this with
> -rcon*
> *Guess for relative PME load: 0.11*
> *Will use 3500 particle-particle and 500 PME only ranks*
> *This is a guess, check the performance at the end of the log file*
> *Using 500 separate PME ranks, as guessed by mdrun*
>
Mdrun guessed poorly, as we will see.
> *Scaling the initial minimum size with 1/0.8 (option -dds) = 1.25*
> *Optimizing the DD grid for 3500 cells with a minimum initial size of 1.024
> nm*
>
That's 0.819 nm × 1.25 = 1.024 nm, a safety margin so that the domains can
still satisfy those distances as particles move around.
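For reference, here is that arithmetic as a minimal sketch (assuming, as I
believe is the case, that mdrun takes the larger of the two limits before
applying the -dds scaling):

    # Sketch of the minimum-cell-size arithmetic, values from the log above
    bonded_min = 0.468  # nm, minimum cell size due to bonded interactions
    plincs_min = 0.819  # nm, estimated maximum distance needed for P-LINCS
    dds = 0.8           # default value of mdrun -dds
    print(max(bonded_min, plincs_min) / dds)  # ~1.024 nm, as in the log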
> *The maximum allowed number of cells is: X 17 Y 17 Z 17*
>
Thus the DD grid for the 3500 PP ranks can have no dimension greater than
17 cells - that is, the box edge divided by the 1.024 nm minimum cell size
allows at most 17 cells in each direction.
> And I got this afterwards:
> *Fatal error:*
> *There is no domain decomposition for 3500 ranks that is compatible with
> the given box and a minimum cell size of 1.02425 nm*
>
> Here are some questions:
> 1. The maximum allowed number of cells is 17 × 17 × 17 = 4913, which is
> larger than the requested 3500 particle-particle ranks, so is the minimum
> cell size not the cause of the problem?
>
It is. The prime factors of 3500 (2 × 2 × 5 × 5 × 5 × 7) are not very
forgiving. The factorization that comes closest to a grid with no dimension
greater than 17 is 25 × 14 × 10, and that still fails because of the 25. So
mdrun painted itself into a corner when it chose 3500 PP ranks. The choice
of decomposition is not trivial (see one of my published works, hint hint),
and it is certainly possible that using less hardware gives better
performance, because the PP and PME ranks can then be given mutually
agreeable decompositions, which leads to better message-passing
performance. 4000 is very awkward given the constraint of 17. Maybe
16 × 16 × 15 = 3840 total ranks would be good. You can check the
possibilities with a quick script like the sketch below.
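A minimal sketch (Python; the rank counts and the 17-cell limit come from
your log) that enumerates the DD grids mdrun could legally pick:

    # Enumerate factorizations of a PP rank count into a 3D grid with
    # every dimension at most max_cells (17 in this log).
    def dd_grids(nranks, max_cells):
        grids = []
        for nx in range(1, max_cells + 1):
            if nranks % nx:
                continue
            for ny in range(1, max_cells + 1):
                if (nranks // nx) % ny:
                    continue
                nz = nranks // (nx * ny)
                if nz <= max_cells:
                    grids.append((nx, ny, nz))
        return grids

    print(dd_grids(3500, 17))  # [] - no legal grid, hence the fatal error
    print(dd_grids(3840, 17))  # many options, e.g. (16, 16, 15)

mdrun's real search also weighs the box shape and the PME decomposition,
but the divisibility constraint is the one that bites here.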
> 2. Where does this 1.024 nm come from? We can see the inter charge-group
> distances listed are 0.665 and 0.425 nm.
> 3. The distance restraint between atoms 23558 and 23590 was set
> explicitly (added manually) in the topology file using
> [intermolecular_interactions], and should be around 0.32 nm. How can I
> tell whether my manual setting is working, given that the log shows a
> different value?
>
For 2, see above - it is the 0.819 nm P-LINCS estimate scaled by 1.25. For
3: well, one of you is right, but I can't tell which :-) Try measuring the
distance in a different way, e.g. with gmx distance, and compare.
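If you prefer a script, something like this sketch would do (MDAnalysis
and the file names are my assumptions, not anything your run produced):

    # Sketch: measure the 23558-23590 distance along a trajectory with
    # MDAnalysis (file names assumed; adjust to your own).
    import MDAnalysis as mda
    from MDAnalysis.lib.distances import calc_bonds

    u = mda.Universe("topol.tpr", "traj.xtc")
    a, b = u.atoms[23557], u.atoms[23589]  # 0-based for atoms 23558, 23590
    for ts in u.trajectory:
        # calc_bonds applies the minimum-image convention when given a box
        d = calc_bonds(a.position[None, :], b.position[None, :],
                       box=ts.dimensions)[0]
        print(ts.frame, d / 10.0)  # MDAnalysis uses Angstrom; /10 gives nm

Comparing that against both your intended 0.32 nm and the 0.665 nm in the
log should tell you which is right.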
Mark
> Thanks in advance,
> Kevin
> OSU