[gmx-users] How do I get to know the meaning of the first column in log file?
Guillaume Chevrot
guillaume.chevrot at gmail.com
Mon Jul 28 09:51:25 CEST 2014
Hi,
if you need 0.345 hours to simulate 1 ns, that means you can simulate
1/0.345 ~ 2.9 ns/hour, so in one day you will simulate (1/0.345) * 24 ~ 69.5 ns.
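
A minimal sketch of that conversion in Python, using only the two figures
printed on the Performance line:

    hours_per_ns = 0.345              # the "(hour/ns)" figure from the log
    ns_per_day = 24.0 / hours_per_ns  # ~69.6 ns/day; the log prints 69.479,
                                      # the small difference is rounding of 0.345
    print(ns_per_day)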
Guillaume
On 07/28/2014 09:39 AM, Theodore Si wrote:
> I thought that 69.479 ns/day means I can simulate 69.479 ns per day.
> But if, as you said, I need 0.345 hours to get a simulated nanosecond,
> then I can only get 0.345 * 24 = 8.28 simulated nanoseconds per day?
> Then what does 69.479 mean? And the Core t (s)?
>
>
> On 2014/7/28 15:29, Mark Abraham wrote:
>> Your run took nearly a minute, and did so at a rate that would take
>> 0.345 hours to do a simulated nanosecond.
>>
>> Mark
>>
>>
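
To tie those two statements together: 49.744 s of wall time at a rate of
0.345 hour/ns is only about 0.04 ns of simulated time, which is why the run
finished in under a minute. A quick sketch using only the logged values:

    wall_s = 49.744                   # Wall t (s) from the log below
    hours_per_ns = 0.345              # hour/ns from the Performance line
    ns_simulated = (wall_s / 3600.0) / hours_per_ns
    print(ns_simulated)               # ~0.04 ns simulated in this run
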
>> On Mon, Jul 28, 2014 at 9:05 AM, Theodore Si <sjyzhxw at gmail.com> wrote:
>>
>>>                Core t (s)   Wall t (s)          (%)
>>>        Time:     2345.800       49.744       4715.7
>>>                  (ns/day)    (hour/ns)
>>> Performance:       69.479        0.345
>>>
>>>
>>> What does 0.345 hour/ns stand for? And the Wall t of 49.744 s?
>>>
>>>
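
On the Core t question: Wall t is the elapsed real time, Core t is that time
accumulated over all the CPU cores the run used, and (%) is simply
Core t / Wall t expressed as a percentage. A small check with the numbers
above (the 24 and 2 are the Nodes and Th. columns from the accounting table
further down):

    core_t = 2345.800                 # Core t (s): CPU time summed over all cores
    wall_t = 49.744                   # Wall t (s): elapsed real time
    print(core_t / wall_t * 100)      # -> 4715.7, the (%) column
    print(core_t / wall_t)            # -> ~47.2, close to 24 * 2 = 48 cores
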
>>> On 2014/7/28 14:53, Mark Abraham wrote:
>>>
>>>> I plan to put some more of this kind of documentation in the upcoming
>>>> User Guide, but it isn't done yet.
>>>>
>>>> The columns are the number of times that section was entered, the total
>>>> wall time spent in that section, the total number of processor gigacycles
>>>> spent in that section, and that as a percentage of the total. Some columns
>>>> are useful for only some kinds of comparisons.
>>>>
>>>> Mark
>>>>
>>>>
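
A rough sanity check on the G-Cycles column: total gigacycles divided by wall
time and by the number of cores should land near the CPU clock rate. This is
only a back-of-the-envelope estimate that assumes Nodes x Th. from the table
is the number of cores in use:

    total_gcycles = 6192.746          # G-Cycles total from the table below
    wall_t = 49.744                   # Wall t (s) total
    cores = 24 * 2                    # assumption: Nodes x Th. = cores used
    print(total_gcycles / wall_t / cores)   # ~2.6, consistent with a ~2.6 GHz clock
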
>>>> On Mon, Jul 28, 2014 at 5:35 AM, Theodore Si <sjyzhxw at gmail.com>
>>>> wrote:
>>>>
>>>>> Thanks a lot!
>>>>> But I am still confused about other things.
>>>>> For instance, what do Count, Wall t (s) and G-Cycles mean? It seems that
>>>>> the last column is the percentage of G-Cycles.
>>>>> I really hope there is a place where I can find all the relevant
>>>>> information about the log file.
>>>>>
>>>>> On 2014/7/28 11:12, Mark Abraham wrote:
>>>>>
>>>>>> On Jul 28, 2014 4:53 AM, "Theodore Si" <sjyzhxw at gmail.com> wrote:
>>>>>>
>>>>>>> For example, in the following table, what does Wait + Comm. F mean? Is
>>>>>>> there a webpage that explains the tables in the log file?
>>>>>>
>>>>>> Unfortunately not (yet), but they correspond in a more-or-less clear way
>>>>>> to the segments in manual figure 3.16. In this case, to the three boxes
>>>>>> below "Evaluate potential/forces." Significant time spent here would
>>>>>> indicate poor balance of compute load.
>>>>>>
>>>>>> Mark
>>>>>>
>>>>>>
>>>>>>> R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G
>>>>>>>
>>>>>>>  Computing:          Nodes  Th.   Count   Wall t (s)    G-Cycles      %
>>>>>>> -------------------------------------------------------------------------
>>>>>>>  Domain decomp.         24   2      801        2.827     351.962    5.7
>>>>>>>  DD comm. load          24   2      800        0.077       9.604    0.2
>>>>>>>  DD comm. bounds        24   2      800        0.354      44.014    0.7
>>>>>>>  Neighbor search        24   2      801        1.077     134.117    2.2
>>>>>>>  Launch GPU ops.        24   2    40002        1.518     189.021    3.1
>>>>>>>  Comm. coord.           24   2    19200        2.009     250.121    4.0
>>>>>>>  Force                  24   2    20001        8.478    1055.405   17.0
>>>>>>>  Wait + Comm. F         24   2    20001        1.967     244.901    4.0
>>>>>>>  PME mesh               24   2    20001       24.064    2995.784   48.4
>>>>>>>  Wait GPU nonlocal      24   2    20001        0.170      21.212    0.3
>>>>>>>  Wait GPU local         24   2    20001        0.072       8.935    0.1
>>>>>>>  NB X/F buffer ops.     24   2    78402        0.627      78.050    1.3
>>>>>>>  Write traj.            24   2        2        0.037       4.569    0.1
>>>>>>>  Update                 24   2    20001        0.497      61.874    1.0
>>>>>>>  Constraints            24   2    20001        4.198     522.645    8.4
>>>>>>>  Comm. energies         24   2      801        0.293      36.500    0.6
>>>>>>>  Rest                   24                     1.478     184.033    3.0
>>>>>>> -------------------------------------------------------------------------
>>>>>>>  Total                  24                    49.744    6192.746  100.0
>>>>>>> -------------------------------------------------------------------------
>>>>>>>  PME redist. X/F        24   2    40002        6.199     771.670   12.5
>>>>>>>  PME spread/gather      24   2    40002        7.194     895.557   14.5
>>>>>>>  PME 3D-FFT             24   2    40002        2.727     339.480    5.5
>>>>>>>  PME 3D-FFT Comm.       24   2    80004        7.460     928.742   15.0
>>>>>>>  PME solve              24   2    20001        0.434      53.968    0.9
>>>>>>> -------------------------------------------------------------------------
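
And on the guess earlier in the thread that the last column is the percentage
of G-Cycles: the (%) column is each row's share of the total, and it comes out
the same whether you compute it from Wall t or from G-Cycles. A quick check
with two rows copied from the table above:

    total_wall, total_gcycles = 49.744, 6192.746
    # Force row: 8.478 s, 1055.405 G-Cycles
    print(8.478 / total_wall * 100, 1055.405 / total_gcycles * 100)    # both ~17.0
    # PME mesh row: 24.064 s, 2995.784 G-Cycles
    print(24.064 / total_wall * 100, 2995.784 / total_gcycles * 100)   # both ~48.4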
>
--
----------------------------------------------------------------------------------
Guillaume Chevrot, Ph.D.
Postdoctoral Fellow
MEMPHYS - Center for Biomembrane Physics
Department of Physics, Chemistry and Pharmacy
University of Southern Denmark
Campusvej 55, 5230 Odense M, Denmark
-------
Tel. +33 6.25.42.76.56
guillaume.chevrot at gmail.com
http://gchevrot.github.io/home
twitter: @gchevrot
ORCID: http://orcid.org/0000-0001-7912-2235
-----------------------------------------------------------------------------------