[gmx-users] Speeding things up

Berk Hess gmx3 at hotmail.com
Fri Aug 14 10:50:29 CEST 2009


Hi,

For systems with a lot of vacuum, the automatic domain decomposition setup does
not do a good job. It currently decomposes based on the box dimensions,
not on the actual atom distribution in the box.
I was thinking of improving this a bit for 4.1.

I would guess -dd 4 2 1 will give the best performance.
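
For example, the full command line could look something like this (the MPI
launcher, the mdrun binary name and the -deffnm file name are only
placeholders here; adjust them to your own setup):

    # request a 4 x 2 x 1 domain decomposition grid on the 8 cores
    mpirun -np 8 mdrun -dd 4 2 1 -deffnm md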

Berk

> Date: Fri, 14 Aug 2009 10:36:50 +0200
> From: szefczyk at mml.ch.pwr.wroc.pl
> To: gmx-users at gromacs.org
> Subject: [gmx-users] Speeding things up
> 
> Dear Gromacs Users,
> 
> I was wondering if I would be able to tweak my simulation to run
> a bit faster. According to chapter 3.17 of the manual, I have set up
> the cut-offs and Fourier grid spacing so that the PME load is around 25%.
> However, Gromacs still complains in the log file about the performance
> loss due to load imbalance:
> 
>     D O M A I N   D E C O M P O S I T I O N   S T A T I S T I C S
> 
>  av. #atoms communicated per step for force:  2 x 49748.7
>  av. #atoms communicated per step for LINCS:  2 x 4755.5
> 
>  Average load imbalance: 29.1 %
>  Part of the total run time spent waiting due to load imbalance: 17.1 %
>  Steps where the load balancing was limited by -rdd, -rcon and/or -dds: X 9 %
> 
> NOTE: 17.1 % performance was lost due to load imbalance
>       in the domain decomposition.
> 
> 
>      R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G
> 
>  Computing:         Nodes     Number     G-Cycles    Seconds     %
> -----------------------------------------------------------------------
>  Domain decomp.         8       5001      418.999      167.6     0.4
>  Comm. coord.           8      50001      312.233      124.9     0.3
>  Neighbor search        8       5001    11297.211     4518.7    12.0
>  Force                  8      50001    55518.686    22206.4    58.7
>  Wait + Comm. F         8      50001      761.632      304.6     0.8
>  PME mesh               8      50001    24141.436     9656.1    25.5
>  Write traj.            8        105        3.990        1.6     0.0
>  Update                 8      50001      407.860      163.1     0.4
>  Constraints            8      50001     1229.791      491.9     1.3
>  Comm. energies         8      50001      230.156       92.1     0.2
>  Rest                   8                 182.584       73.0     0.2
> -----------------------------------------------------------------------
>  Total                  8               94504.578    37800.0   100.0
> 
>         Parallel run - timing based on wallclock.
> 
>                NODE (s)   Real (s)      (%)
>        Time:   4725.000   4725.000    100.0
>                        1h18:45
> 
> Is there something I could do about it? I should probably mention
> that my system is composed of a slab of molecules and a protein on top
> of it, in a cubic box, so there is a lot of vacuum around. I understand
> that Gromacs should take care to optimize the domain decomposition
> (I use the default parameters for DD). The job is running on 8 cores
> of a single machine.
> 
> I would appreciate suggestions,
> Borys Szefczyk
> 
> -- 
>                  REQUIMTE,  &  Molecular Modelling & Quantum Chemistry Group,
>   Department of Chemistry,  &  Institute of Physical & Theoretical Chemistry,
>        Faculty of Science,  &  Wroclaw University of Technology
>        University of Porto  &  http://ichfit.ch.pwr.wroc.pl/people/szefczyk
