[gmx-users] Should I use separate PME nodes
Carsten Kutzner
ckutzne at gwdg.de
Fri Jun 25 15:42:59 CEST 2010
Hi Gaurav,
Separate PME nodes usually pay off on a larger number of nodes
(>16); in rare cases you will see a performance benefit on a small number
of nodes as well. Just try it! Or use g_tune_pme ... ;)
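For a quick manual test, something along these lines should work
(topol.tpr and the mpirun launcher are just placeholders for your own
run input and MPI environment, and the binary may be called mdrun_mpi
depending on how it was built):

  # current setup: no separate PME nodes
  mpirun -np 4 mdrun -s topol.tpr -npme 0

  # dedicate one of the four nodes to PME
  mpirun -np 4 mdrun -s topol.tpr -npme 1

  # or let g_tune_pme benchmark the -npme settings for you
  g_tune_pme -np 4 -s topol.tpr

Keep in mind that on four nodes a dedicated PME node takes a quarter of
your hardware away from the direct-space work, so compare the ns/day of
both runs before settling on one.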
Carsten
On Jun 25, 2010, at 3:32 PM, Gaurav Goel wrote:
> I ran my simulation in parallel on 4 nodes (with zero separate PME
> nodes). Below is the accounting information printed in md.log.
>
> I see that the PME mesh calculations took 60% of the CPU time. Do you
> have any recommendations on using one or more separate PME nodes to
> speed this up?
>
>
> Computing:                    M-Number        M-Flops  % Flops
> -----------------------------------------------------------------------
> Coul(T) + VdW(T)        1761401.496982  119775301.795     20.2
> Outer nonbonded loop     106414.135764    1064141.358      0.2
> Calc Weights              32400.006480    1166400.233      0.2
> Spread Q Bspline        2332800.466560    4665600.933      0.8
> Gather F Bspline        2332800.466560   27993605.599      4.7
> 3D-FFT                 47185929.437184  377487435.497     63.6
> Solve PME                675840.135168   43253768.651      7.3
> NS-Pairs                 823453.927656   17292532.481      2.9
> Reset In Box                2160.002160       6480.006      0.0
> CG-CoM                      2160.004320       6480.013      0.0
> Virial                     11700.002340     210600.042      0.0
> Ext.ens. Update            10800.002160     583200.117      0.1
> Stop-CM                    10800.002160     108000.022      0.0
> Calc-Ekin                  10800.004320     291600.117      0.0
> -----------------------------------------------------------------------
> Total                                   593905146.863    100.0
> -----------------------------------------------------------------------
>
> R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G
>
> Computing:        Nodes   Number    G-Cycles   Seconds      %
> -----------------------------------------------------------------------
> Domain decomp.        4  1000001    3859.416    1488.1    0.6
> Comm. coord.          4  5000001    1874.635     722.8    0.3
> Neighbor search       4  1000001   78640.722   30322.2   11.2
> Force                 4  5000001  180659.902   69658.5   25.8
> Wait + Comm. F        4  5000001    2578.994     994.4    0.4
> PME mesh              4  5000001  422268.834  162817.7   60.4
> Write traj.           4    10001      17.526       6.8    0.0
> Update                4  5000001    2981.794    1149.7    0.4
> Comm. energies        4  5000001    2633.176    1015.3    0.4
> Rest                  4             3580.341    1380.5    0.5
> -----------------------------------------------------------------------
> Total                 4           699095.342  269556.0  100.0
> -----------------------------------------------------------------------
>
> Thanks,
> Gaurav
--
Dr. Carsten Kutzner
Max Planck Institute for Biophysical Chemistry
Theoretical and Computational Biophysics
Am Fassberg 11, 37077 Goettingen, Germany
Tel. +49-551-2012313, Fax: +49-551-2012302
http://www.mpibpc.mpg.de/home/grubmueller/ihp/ckutzne