[gmx-users] Gromacs 4 Scaling Benchmarks...
ckutzne at gwdg.de
Mon Nov 10 13:44:26 CET 2008
Most likely the Ethernet is the problem here. I compiled some numbers for the
DPPC benchmark in the paper "Speeding up parallel GROMACS on high-latency
networks". Those numbers are for version 3.3, but PME will behave similarly.
If you did not already use separate PME nodes, this is worth a try, since on
Ethernet the performance will drastically depend on the number of nodes
involved in the FFT. I also have a tool which finds the optimal PME settings
for a given number of nodes by varying the number of PME nodes and the
Fourier grid settings. I can send it to you if you want.
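That tuning procedure amounts to a small grid search over (PME-node count, Fourier spacing) pairs, each benchmarked briefly with mdrun. A minimal sketch of how such a search space could be enumerated, assuming the common rule of thumb of dedicating roughly a quarter to half of the nodes to PME (the heuristic window and the spacing values here are illustrative assumptions, not the actual tool):

```python
# Hypothetical sketch of the kind of parameter scan a PME tuning tool
# performs: enumerate plausible PME-node counts and Fourier grid
# spacings for a given total node count. Not the actual tool.

def candidate_settings(total_nodes, spacings=(0.12, 0.14, 0.16)):
    """Return (npme, fourier_spacing) combinations to benchmark.

    Assumes the rule of thumb of devoting roughly 1/4 to 1/2 of
    the nodes to PME, so we scan that window.
    """
    lo = max(1, total_nodes // 4)
    hi = total_nodes // 2
    combos = []
    for npme in range(lo, hi + 1):
        for spacing in spacings:
            combos.append((npme, spacing))
    return combos

# Each combination would then be run briefly (e.g. via mdrun's -npme
# option) and the fastest one kept.
print(candidate_settings(8))
```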
On Nov 9, 2008, at 10:30 PM, Yawar JQ wrote:
> I was wondering if anyone could comment on these benchmark results
> for the d.dppc benchmark?
> Nodes   Cutoff (ns/day)   PME (ns/day)
>   4          1.331            0.797
>   8          2.564            1.497
>  16          4.5              1.92
>  32          8.308            0.575
>  64         13.5              0.275
> 128         20.093              -
> 192         21.6                -
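Worked out in code, the numbers above translate into parallel efficiencies (speedup relative to the 4-node run, divided by the ideal speedup). A quick sketch using only the figures quoted in the table:

```python
# Parallel efficiency for the d.dppc numbers quoted above,
# taking the 4-node run as the baseline.

cutoff = {4: 1.331, 8: 2.564, 16: 4.5, 32: 8.308, 64: 13.5,
          128: 20.093, 192: 21.6}
pme    = {4: 0.797, 8: 1.497, 16: 1.92, 32: 0.575, 64: 0.275}

def efficiency(perf, base_nodes=4):
    """Measured speedup over the baseline, divided by ideal speedup."""
    base = perf[base_nodes]
    return {n: (p / base) / (n / base_nodes) for n, p in perf.items()}

for n, e in sorted(efficiency(cutoff).items()):
    print(f"cutoff {n:4d} nodes: efficiency {e:.2f}")
for n, e in sorted(efficiency(pme).items()):
    print(f"PME    {n:4d} nodes: efficiency {e:.2f}")
```

The cutoff runs stay near ideal up to 32 nodes, while the PME runs fall off a cliff beyond 16, which is exactly the FFT/latency effect described above.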
> It seems to scale relatively well up to 32-64 nodes without PME.
> This seems slightly better than the benchmark results for Gromacs 3
> on www.gromacs.org.
> The magnitude of the performance hit and the lack of scaling with PME
> are worrying me. Can someone comment?
> For the PME runs, I set rlist, rvdw, and rcoulomb = 1.2 and left the
> rest at the defaults. I can try some other settings, e.g. a larger
> spacing for the grid, but I'm not sure how much that would help. Is
> there a more standardized system I should use for testing PME scaling?
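In .mdp terms, those settings correspond roughly to the fragment below (the fourierspacing and pme_order lines are shown at their usual defaults as an assumption; they were not stated in the original message):

```
; Nonbonded / PME settings described above
coulombtype      = PME
rlist            = 1.2
rcoulomb         = 1.2
rvdw             = 1.2
; Assumed defaults -- increasing fourierspacing coarsens the FFT grid
; and shifts work away from the latency-sensitive PME part
fourierspacing   = 0.12
pme_order        = 4
```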
> This is with GNU compilers and parallelization with OpenMPI 1.2. I'm
> not sure which FFTW version we're using. The compute nodes are Dell
> M600 blades with 16 GB of RAM and dual quad-core 3 GHz Intel Xeon
> processors. I believe the interconnect is all Ethernet.
> gmx-users mailing list gmx-users at gromacs.org