[gmx-users] difference of calculation time between the numbers of fep-lambdas values

Justin Lemkul jalemkul at vt.edu
Tue Apr 28 02:49:56 CEST 2015



On 4/27/15 8:34 PM, nao.morishita at takeda.com wrote:
> Hi Justin and Hannes,
>
> Thank you for the reply.
>
> I apologize for the insufficient information.  Actually, I calculated
> them with couple-moltype = solute and init-lambda-state = 1.  All other
> parameters, the number of cores (8), and the hardware were the same;
> only fep-lambdas differed.  When I look at "REAL CYCLE AND TIME
> ACCOUNTING" in the log files, there is a large difference in
> "PME redist. X/F".
>
> What makes it so much longer?
>
Please upload full .log files to a file-sharing service.

The first run is systematically slower than the second.  Nothing else was 
running on this hardware?  The machine was the same, or identical hardware? 
Same GROMACS binaries?
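
For what it's worth, putting the headline numbers from the two summaries
quoted below side by side (plain arithmetic on the posted figures, nothing
new):

    Total wall time:   41683.159 s vs. 17940.633 s  ->  ~2.32x slower
    PME redist. X/F:   26513.015 s  -   7982.056 s   =  18530.959 s extra
    Total difference:  41683.159 s  -  17940.633 s   =  23742.526 s extra

So "PME redist. X/F" alone accounts for roughly 78% of the extra wall time,
even though its call count (7500003) is identical in both runs; most of the
rest is the Force term (8671.720 s vs. 4219.963 s).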

-Justin

> I would appreciate any comments or concerns.
>
> Input1 (fep-lambdas = 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0)
> ****************************************************************************************
>
>  Computing:         Nodes   Th.     Count  Wall t (s)     G-Cycles       %
> -----------------------------------------------------------------------------
>  Domain decomp.         8    1     250000     100.985     2100.559     0.2
>  DD comm. load          8    1        501       0.004        0.082     0.0
>  Neighbor search        8    1     250001     351.227     7305.783     0.8
>  Comm. coord.           8    1    2500001      74.248     1544.405     0.2
>  Force                  8    1    2500001    8671.720   180378.288    20.8
>  Wait + Comm. F         8    1    2500001      49.926     1038.492     0.1
>  PME mesh               8    1    2500001   31496.259   655145.855    75.6
>  Write traj.            8    1        547       0.275        5.710     0.0
>  Update                 8    1    2500001     284.487     5917.540     0.7
>  Constraints            8    1    5000002     396.070     8238.552     1.0
>  Comm. energies         8    1     250001       7.876      163.837     0.0
>  Rest                   8                     250.084     5201.928     0.6
> -----------------------------------------------------------------------------
>  Total                  8                   41683.159   867041.031   100.0
> -----------------------------------------------------------------------------
> -----------------------------------------------------------------------------
>  PME redist. X/F        8    1    7500003   26513.015   551490.632    63.6
>  PME spread/gather      8    1   10000004    3879.936    80705.579     9.3
>  PME 3D-FFT             8    1   10000004     702.656    14615.765     1.7
>  PME 3D-FFT Comm.       8    1   20000008     264.536     5502.543     0.6
>  PME solve              8    1    5000002     127.236     2646.610     0.3
> -----------------------------------------------------------------------------
>
>                 Core t (s)   Wall t (s)        (%)
>         Time:   332678.400    41683.159      798.1
>                           11h34:43
>                   (ns/day)    (hour/ns)
> Performance:       10.364        2.316
>
> Finished mdrun on node 0 Thu Apr  2 06:31:45 2015
> *********************************************************************************
>
> Input2 (fep-lambdas = 0.0 0.1 0.2)
> *********************************************************************************
>
>  Computing:         Nodes   Th.     Count  Wall t (s)     G-Cycles       %
> -----------------------------------------------------------------------------
>  Domain decomp.         8    1     250000      85.779     1784.259     0.5
>  DD comm. load          8    1        501       0.003        0.072     0.0
>  Neighbor search        8    1     250001     312.515     6500.517     1.7
>  Comm. coord.           8    1    2500001      69.584     1447.387     0.4
>  Force                  8    1    2500001    4219.963    87778.089    23.5
>  Wait + Comm. F         8    1    2500001      46.401      965.178     0.3
>  PME mesh               8    1    2500001   12342.326   256728.717    68.8
>  Write traj.            8    1        520       0.150        3.117     0.0
>  Update                 8    1    2500001     262.237     5454.706     1.5
>  Constraints            8    1    5000002     368.111     7656.961     2.1
>  Comm. energies         8    1     250001       6.524      135.712     0.0
>  Rest                   8                     227.039     4722.575     1.3
> -----------------------------------------------------------------------------
>  Total                  8                   17940.633   373177.290   100.0
> -----------------------------------------------------------------------------
> -----------------------------------------------------------------------------
>  PME redist. X/F        8    1    7500003    7982.056   166032.158    44.5
>  PME spread/gather      8    1   10000004    3327.975    69224.129    18.5
>  PME 3D-FFT             8    1   10000004     645.036    13417.186     3.6
>  PME 3D-FFT Comm.       8    1   20000008     259.876     5405.587     1.4
>  PME solve              8    1    5000002     118.906     2473.318     0.7
> -----------------------------------------------------------------------------
>
>                 Core t (s)   Wall t (s)        (%)
>         Time:   143196.260    17940.633      798.2
>                           4h59:00
>                   (ns/day)    (hour/ns)
> Performance:       24.079        0.997
>
> Finished mdrun on node 0 Fri Apr  3 19:11:15 2015
> *********************************************************************************
>
> Best regards,
> Nao
>
>
> -----Original Message-----
> From: gromacs.org_gmx-users-bounces at maillist.sys.kth.se On Behalf Of Hannes Loeffler
> Sent: Monday, April 27, 2015 10:48 PM
> To: gromacs.org_gmx-users at maillist.sys.kth.se
> Cc: gmx-users at gromacs.org
> Subject: Re: [gmx-users] difference of calculation time between the numbers of fep-lambdas values
>
> On Mon, 27 Apr 2015 09:05:05 -0400
> Justin Lemkul <jalemkul at vt.edu> wrote:
>
>> On 4/27/15 9:02 AM, Hannes Loeffler wrote:
>>> On Mon, 27 Apr 2015 07:43:51 -0400
>>> Justin Lemkul <jalemkul at vt.edu> wrote:
>>>
>>>> Note that without couple-moltype, you're going to be decoupling
>>>> the whole system, which is (1) not what you want for calculating
>>>> solvation free energy and (2) extremely slow.  There is an
>>>> inherent slowdown when running the free energy code, but it
>>>> should not be so large.
>>>
>>> Really? I just looked into the code yesterday and I think that if
>>> you do not set couple-moltype, none of the coupling code is ever
>>> triggered.  I have only played a bit with relative free energies
>>> (Gromacs 4.6), where I think you don't need any of the couple-
>>> parameters at all.
>>
>> Maybe I got that backwards.  For relative free energies, this is
>> true (we're doing these now); topological differences are all that
>> are needed.  For an absolute solvation free energy, one needs
>> couple-moltype unless the B-state (dummy) is explicitly defined in
>> the topology, AFAIK.  If there is no couple-moltype and no B-state
>> defined in the topology, I'm not sure what the code would even be
>> doing, actually.
>
> I think it runs a normal MD simulation for each lambda.  A quick test
> with an A-state-only topology and no couple- parameters gives only
> gradients with zero value.  In fact, the output file says
>
>     There are 0 atoms and 0 charges for free energy perturbation
>
> Cheers,
> Hannes.
>
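
To make the B-state point above concrete: without couple-moltype, an absolute
(de)coupling perturbation has to be spelled out per atom in the topology via
the typeB/chargeB/massB columns of [ atoms ], e.g. (illustrative atom names,
types, and charges; DUM_C/DUM_H are hypothetical dummy types, not from
anyone's actual system):

    [ atoms ]
    ; nr  type      resnr  res  atom  cgnr  charge    mass   typeB  chargeB   massB
       1  opls_135      1  LIG    C1     1  -0.180  12.011   DUM_C    0.000  12.011
       2  opls_140      1  LIG    H1     1   0.060   1.008   DUM_H    0.000   1.008

If those B-state columns are absent and no couple- options are set, the A and
B states are identical, which matches the "0 atoms and 0 charges for free
energy perturbation" message and the all-zero gradients above.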
>

-- 
==================================================

Justin A. Lemkul, Ph.D.
Ruth L. Kirschstein NRSA Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 629
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalemkul at outerbanks.umaryland.edu | (410) 706-7441
http://mackerell.umaryland.edu/~jalemkul

==================================================

