[gmx-users] Slow Runs
Denny Frost
dsfrost at cableone.net
Fri Jan 28 20:57:16 CET 2011
Here's what I've got:
M E G A - F L O P S A C C O U N T I N G
RF=Reaction-Field FE=Free Energy SCFE=Soft-Core/Free Energy
T=Tabulated W3=SPC/TIP3p W4=TIP4p (single or pairs)
NF=No Forces
 Computing:                        M-Number         M-Flops  % Flops
-----------------------------------------------------------------------------
 Coul(T) + VdW(T)            1219164.751609    82903203.109     80.6
 Outer nonbonded loop          25980.879385      259808.794      0.3
 Calc Weights                  37138.271040     1336977.757      1.3
 Spread Q Bspline             792283.115520     1584566.231      1.5
 Gather F Bspline             792283.115520     4753698.693      4.6
 3D-FFT                       119163.856212      953310.850      0.9
 Solve PME                      2527.465668      161757.803      0.2
 NS-Pairs                      47774.705001     1003268.805      1.0
 Reset In Box                    371.386080        1114.158      0.0
 Shift-X                       24758.847360      148553.084      0.1
 CG-CoM                         1237.953600        3713.861      0.0
 Angles                        18569.135520     3119614.767      3.0
 Propers                       14855.308416     3401865.627      3.3
 Impropers                      3094.855920      643730.031      0.6
 Virial                         1242.417375       22363.513      0.0
 Stop-CM                        1237.953600       12379.536      0.0
 P-Coupling                    12379.423680       74276.542      0.1
 Calc-Ekin                     12379.436160      334244.776      0.3
 Lincs                         11760.476208      705628.572      0.7
 Lincs-Mat                    245113.083072      980452.332      1.0
 Constraint-V                  23520.928704      188167.430      0.2
 Constraint-Vir                11760.452496      282250.860      0.3
-----------------------------------------------------------------------------
 Total                                        102874947.133    100.0
-----------------------------------------------------------------------------
R E A L C Y C L E A N D T I M E A C C O U N T I N G
 Computing:          Nodes    Number     G-Cycles    Seconds      %
-----------------------------------------------------------------------
 Neighbor search         1     99195      8779.027     3300.3    3.8
 Force                   1    991941    188562.885    70886.8   81.7
 PME mesh                1    991941     18012.830     6771.6    7.8
 Write traj.             1        41        16.835        6.3    0.0
 Update                  1    991941      2272.379      854.3    1.0
 Constraints             1    991941     11121.146     4180.8    4.8
 Rest                    1                 2162.628      813.0    0.9
-----------------------------------------------------------------------
 Total                   1               230927.730    86813.1  100.0
-----------------------------------------------------------------------
-----------------------------------------------------------------------
 PME spread/gather       1   1983882     17065.384     6415.4    7.4
 PME 3D-FFT              1   1983882       503.340      189.2    0.2
 PME solve               1    991941       427.136      160.6    0.2
-----------------------------------------------------------------------
Does that mean it's only using 1 node? That would explain the speed issues.
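
A quick back-of-the-envelope conversion of those totals into a speed figure, as a minimal Python sketch (it assumes the 991941 steps shown for "Force" cover the whole run and takes dt = 0.002 ps from the .mdp quoted further down):

# Rough ns/day estimate from the cycle accounting above.
steps = 991941          # MD steps, from the "Force" row
dt_ps = 0.002           # timestep in ps, from the .mdp
wall_s = 86813.1        # "Total" wall-clock seconds

ns_simulated = steps * dt_ps / 1000.0          # ps -> ns, ~1.98 ns
ns_per_day = ns_simulated * 86400.0 / wall_s   # ~1.97 ns/day
print("%.2f ns in %.1f h -> %.2f ns/day" % (ns_simulated, wall_s / 3600.0, ns_per_day))
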
On Fri, Jan 28, 2011 at 12:46 PM, Justin A. Lemkul <jalemkul at vt.edu> wrote:
>
>
> Denny Frost wrote:
>
>> gromacs 4.5.1
>>
>>
> Ah, what I posted was from 4.0.7. I wonder why that sort of output was
> eliminated in 4.5; it's quite useful. Sorry for leading you astray on that.
> No matter, the end of the .log file will still contain statistics about
> what's eating up all your simulation time.
>
> -Justin
>
> On Fri, Jan 28, 2011 at 12:40 PM, Erik Marklund <erikm at xray.bmc.uu.se> wrote:
>>
>> PME is still an Ewald sum.
>>
>> Erik
>>
>> Denny Frost skrev 2011-01-28 20.38:
>>
>>> I don't have any domain decomposition information like that in my
>>> log file. That's worrisome. The only other information I could
>>> find about PME and Ewald is this set of lines:
>>>
>>> Table routines are used for coulomb: TRUE
>>> Table routines are used for vdw: FALSE
>>> Will do PME sum in reciprocal space.
>>>
>>> ++++ PLEASE READ AND CITE THE FOLLOWING REFERENCE ++++
>>> U. Essman, L. Perela, M. L. Berkowitz, T. Darden, H. Lee and L. G.
>>> Pedersen A smooth particle mesh Ewald method
>>> J. Chem. Phys. 103 (1995) pp. 8577-8592
>>> -------- -------- --- Thank You --- -------- --------
>>>
>>> Will do ordinary reciprocal space Ewald sum.
>>> Using a Gaussian width (1/beta) of 0.384195 nm for Ewald
>>> Cut-off's: NS: 1.2 Coulomb: 1.2 LJ: 1.2
>>> System total charge: 0.000
>>> Generated table with 4400 data points for Ewald.
>>> Tabscale = 2000 points/nm
>>> Generated table with 4400 data points for LJ6.
>>> Tabscale = 2000 points/nm
>>> Generated table with 4400 data points for LJ12.
>>> Tabscale = 2000 points/nm
>>> Configuring nonbonded kernels...
>>> Configuring standard C nonbonded kernels...
>>> Testing x86_64 SSE2 support... present.
>>>
>>>
>>> Why does it say it will do PME on one line, then ordinary Ewald later?
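
As a side note on the "Gaussian width (1/beta)" line: the value can be reproduced with a minimal Python sketch, assuming beta is chosen so that erfc(beta * rcoulomb) equals ewald_rtol (1e-5 here, with rcoulomb = 1.2 nm):

# Reproduce the "Gaussian width (1/beta) of 0.384195 nm" log line,
# assuming beta satisfies erfc(beta * rcoulomb) = ewald_rtol.
from scipy.special import erfcinv

rcoulomb = 1.2       # nm, from the .mdp
ewald_rtol = 1e-5    # from the .mdp

beta = erfcinv(ewald_rtol) / rcoulomb      # ~2.60 nm^-1
print("1/beta = %.6f nm" % (1.0 / beta))   # ~0.3842 nm, matching the log
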
>>>
>>> On Fri, Jan 28, 2011 at 12:26 PM, Justin A. Lemkul <jalemkul at vt.edu> wrote:
>>>
>>>
>>>
>>> Denny Frost wrote:
>>>
>>> I just realized that that was a very old mdp file. Here
>>> is an mdp file from my most recent run as well as what I
>>> think are the domain decomposition statistics.
>>>
>>> mdp file:
>>> title = BMIM+PF6
>>> cpp = /lib/cpp
>>> constraints = hbonds
>>> integrator = md
>>> dt = 0.002 ; ps !
>>> nsteps = 4000000 ; total 8ns.
>>> nstcomm = 1
>>> nstxout = 50000
>>> nstvout = 50000
>>> nstfout = 0
>>> nstlog = 5000
>>> nstenergy = 5000
>>> nstxtcout = 25000
>>> nstlist = 10
>>> ns_type = grid
>>> pbc = xyz
>>> coulombtype = PME
>>> vdwtype = Cut-off
>>> rlist = 1.2
>>> rcoulomb = 1.2
>>> rvdw = 1.2
>>> fourierspacing = 0.12
>>> pme_order = 4
>>> ewald_rtol = 1e-5
>>> ; Berendsen temperature coupling is on in two groups
>>> Tcoupl = berendsen
>>> tc_grps = BMI PF6
>>> tau_t = 0.2 0.2
>>> ref_t = 300 300
>>> nsttcouple = 1
>>> ; Energy monitoring
>>> energygrps = BMI PF6
>>> ; Isotropic pressure coupling is now on
>>> Pcoupl = berendsen
>>> pcoupltype = isotropic
>>> ;pc-grps = BMI PFF
>>> tau_p = 1.0
>>> ref_p = 1.0
>>> compressibility = 4.5e-5
>>>
>>> ; Generate velocities is on at 300 K.
>>> gen_vel = yes
>>> gen_temp = 300.0
>>> gen_seed = 100000
>>>
>>> domain decomposition
>>> There are: 12800 Atoms
>>> Max number of connections per atom is 63
>>> Total number of connections is 286400
>>> Max number of graph edges per atom is 6
>>> Total number of graph edges is 24800
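
As a rough check on the PME settings in that .mdp: with fourierspacing = 0.12 nm, the minimum grid size in each dimension is just the box length divided by the spacing, rounded up (GROMACS then rounds further up to an FFT-friendly size). A small Python sketch; the box lengths are placeholders, since the box is not given in this thread:

# Minimum PME grid implied by fourierspacing = 0.12 nm.
import math

fourierspacing = 0.12             # nm, from the .mdp
box = (5.0, 5.0, 5.0)             # nm, hypothetical box lengths

grid = [math.ceil(length / fourierspacing) for length in box]
print("minimum PME grid:", grid)  # e.g. [42, 42, 42] for a 5 nm cube
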
>>>
>>>
>>> More useful information is contained at the very top of the
>>> .log file, after the citations. An example from one of my own
>>> runs is:
>>>
>>> Linking all bonded interactions to atoms
>>> There are 2772 inter charge-group exclusions,
>>> will use an extra communication step for exclusion forces for PME
>>>
>>> The initial number of communication pulses is: X 2 Y 1
>>> The initial domain decomposition cell size is: X 1.05 nm Y 1.58 nm
>>>
>>> The maximum allowed distance for charge groups involved in
>>> interactions is:
>>> non-bonded interactions 1.400 nm
>>> (the following are initial values, they could change due to
>>> box deformation)
>>> two-body bonded interactions (-rdd) 1.400 nm
>>> multi-body bonded interactions (-rdd) 1.054 nm
>>> atoms separated by up to 5 constraints (-rcon) 1.054 nm
>>>
>>> When dynamic load balancing gets turned on, these settings
>>> will change to:
>>> The maximum number of communication pulses is: X 2 Y 2
>>> The minimum size for domain decomposition cells is 0.833 nm
>>> The requested allowed shrink of DD cells (option -dds) is: 0.80
>>> The allowed shrink of domain decomposition cells is: X 0.79 Y 0.53
>>> The maximum allowed distance for charge groups involved in
>>> interactions is:
>>> non-bonded interactions 1.400 nm
>>> two-body bonded interactions (-rdd) 1.400 nm
>>> multi-body bonded interactions (-rdd) 0.833 nm
>>> atoms separated by up to 5 constraints (-rcon) 0.833 nm
>>>
>>>
>>> Making 2D domain decomposition grid 9 x 6 x 1, home cell index
>>> 0 0 0
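
The pulse counts in that example follow from the cell sizes: a cell of width w needs roughly ceil(r / w) communication pulses to cover an interaction distance r. A minimal Python sketch, with box lengths inferred from the example (9 x 1.05 nm and 6 x 1.58 nm), so they are illustrative only:

# Relate the communication pulse counts to the DD cell sizes.
import math

r_int = 1.4                                  # nm, max interaction distance
cells = {"X": (9, 9.45), "Y": (6, 9.48)}     # (number of cells, approx box length in nm)

for axis, (n, box_len) in cells.items():
    width = box_len / n
    pulses = math.ceil(r_int / width)
    print("%s: cell %.2f nm -> %d pulse(s)" % (axis, width, pulses))
# Gives X: 2 pulses, Y: 1 pulse, matching "X 2 Y 1" above.
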
>>>
>>>
>>> Also, the output under "DOMAIN DECOMPOSITION STATISTICS" (at
>>> the bottom of the file) would be useful. Also look for any
>>> notes about performance lost due to imbalance, waiting for
>>> PME, etc. These provide very detailed clues about how your
>>> system was treated.
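
If digging through a long md.log by hand gets tedious, a small Python sketch like this can pull out the likely performance notes; the keyword list is only a guess, since the exact wording differs between GROMACS versions:

# Print performance-related notes (load imbalance, PME wait, etc.) from an md.log.
import re
import sys

keywords = re.compile(r"imbalance|load|NOTE|Wait|vol min/aver", re.IGNORECASE)

with open(sys.argv[1]) as log:    # usage: python scan_log.py md.log (script name is illustrative)
    for line in log:
        if keywords.search(line):
            print(line.rstrip())
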
>>>
>>> -Justin
>>>
>>>
>>> --
>>> ========================================
>>>
>>> Justin A. Lemkul
>>> Ph.D. Candidate
>>> ICTAS Doctoral Scholar
>>> MILES-IGERT Trainee
>>> Department of Biochemistry
>>> Virginia Tech
>>> Blacksburg, VA
>>> jalemkul[at]vt.edu | (540) 231-9080
>>>
>>> http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>>>
>>> ========================================
>>>
>>>
>>
>> --
>> -----------------------------------------------
>> Erik Marklund, PhD student
>> Dept. of Cell and Molecular Biology, Uppsala University.
>> Husargatan 3, Box 596, 75124 Uppsala, Sweden
>> phone: +46 18 471 4537 fax: +46 18 511 755
>> erikm at xray.bmc.uu.se
>> http://folding.bmc.uu.se/
>>
>>
>>
>>
> --
> ========================================
>
> Justin A. Lemkul
> Ph.D. Candidate
> ICTAS Doctoral Scholar
> MILES-IGERT Trainee
> Department of Biochemistry
> Virginia Tech
> Blacksburg, VA
> jalemkul[at]vt.edu | (540) 231-9080
> http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>
> ========================================
>