[gmx-developers] Alternate Parallelization scheme
spoel at xray.bmc.uu.se
Sat Jun 18 17:56:44 CEST 2005
On Sat, 2005-06-18 at 10:23 -0500, Nathan Moore wrote:
> I'm porting GROMACS to IBM's Blue Gene system. At present, the DPPC
> benchmark runs fastest at about 64 processors. A deeper look into the
> execution shows that of the ~1100 seconds of walltime the run takes, ~500
> seconds are used for MPI communication. I'd like to see if this can be
The 500 s you see are not communication but load imbalance. Some
processors finish their work while others are still computing, and the
time the fast ones spend waiting shows up as MPI time. The real
communication time is negligible. What this does tell you is that with
perfect load balancing you could cut roughly 500 s from the execution
time.
> Accordingly, I'd like to implement an MPI collective routine rather than
> the ring structure currently implemented. The most important parts to
> parallelize seem to be the move_x and move_f functions (perhaps also
> (1) in what function are the x and f arrays declared? I assume both are
> cartesian triplets - is this defined in a struct or are the arrays flat?
Yes, both can be considered 1D arrays of length 3*Natoms: each atom
contributes an x, y, z triplet stored contiguously.
> (2) Which particles does each node control? I assume that the grompp
> options -sort and -shuffle destroy the easy mapping, rank=0 gets the first
> 10 atoms, rank=1 gets the second 10 atoms etc.
This doesn't matter once you're in mdrun; all of that is handled by the
preprocessor. In fact, mdrun prints in md0.log how the division over
processors is done.
> Nathan Moore
David van der Spoel, PhD, Assoc. Prof., Molecular Biophysics group,
Dept. of Cell and Molecular Biology, Uppsala University.
Husargatan 3, Box 596, 75124 Uppsala, Sweden
phone: 46 18 471 4205 fax: 46 18 511 755
spoel at xray.bmc.uu.se spoel at gromacs.org http://xray.bmc.uu.se/~spoel