[gmx-users] Problem with the mdrun_openmpi on cluster
James Starlight
jmsstarlight at gmail.com
Tue Mar 15 16:47:24 CET 2016
Assuming that the command below produces what I am looking for:
-bash-4.1$ tail -n 3 eq_npt.log
               (Mnbf/s)   (GFlops)   (ns/day)  (hour/ns)
Performance:   1978.319    102.192     21.487      1.117
Finished mdrun on node 0 Tue Mar 15 16:23:03 2016
what combination of shell commands would be useful to extract the number
21.487 from that output and put it into a specified log? Then I'd like to
extract several such values in a loop from 10 independent runs and
calculate the average.
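
Something like the following minimal sketch is the kind of thing I mean,
assuming the ten runs live in directories run1 ... run10 (hypothetical
names, adjust to the real layout) and each wrote an eq_npt.log; the ns/day
value is the 4th field of the "Performance:" line:

for i in $(seq 1 10); do
    # the 4th field of the "Performance:" line is ns/day (e.g. 21.487)
    grep "Performance:" run$i/eq_npt.log | awk '{print $4}'
done > nsday.dat
# average the collected values over the 10 runs
awk '{sum += $1} END {print "average ns/day:", sum/NR}' nsday.dat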
2016-03-15 14:45 GMT+01:00 James Starlight <jmsstarlight at gmail.com>:
> Right, thanks so much!
>
> 2016-03-15 13:58 GMT+01:00 Mark Abraham <mark.j.abraham at gmail.com>:
>> Hi,
>>
>>
>> On Tue, Mar 15, 2016 at 11:57 AM James Starlight <jmsstarlight at gmail.com>
>> wrote:
>>
>>> I just performed some benchmarks with a fully atomistic system (a short MD
>>> of a water-soluble protein), still using mpiexec -np 46 mdrun_openmpi from
>>> GMX 4.5, and there were no such errors with DD, so it seems that the
>>> problem is indeed in the MARTINI representation.
>>>
>>
>> Not really. The bonded interactions have a longer physical range in a CG
>> model, and that limits the current implementation of domain decomposition.
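>>
>> A common knob for this (only a sketch with placeholder values, not a
>> recommendation tuned for your system) is to set the bonded cut-off used
>> by DD explicitly with mdrun -rdd, and/or to run with fewer ranks so the
>> DD cells stay large enough, e.g.
>>
>> mpiexec -np 16 mdrun_openmpi -rdd 1.4
>>
>> where 16 ranks and 1.4 nm are illustrative only; -rdd must exceed the
>> longest bonded/vsite construction distance in your CG topology.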
>>
>>
>>> BTW, how could I quickly check some information about the performance of
>>> the simulation? What logs should I look at, if somebody has already done this?
>>>
>>
>> Depends on your simulation and hardware, so nobody has anything that is
>> obviously comparable.
>>
>>
>>> I will be very thankful for a useful combination of shell commands
>>> that will extract performance information from the simulation log.
>>>
>>
>> Start with tail -n 50 md.log ;-)
>>
>> Mark
>>
>>> Thanks in advance!!
>>>
>>> J.
>>>
>>> 2016-03-14 18:27 GMT+01:00 Justin Lemkul <jalemkul at vt.edu>:
>>> >
>>> >
>>> > On 3/14/16 1:26 PM, James Starlight wrote:
>>> >>
>>> >> For that system I have not defined virtual sites.
>>> >
>>> >
>>> > That disagrees with the error message, which explicitly complains about
>>> > vsites.
>>> >
>>> >> BTW, the same simulation runs OK on a local desktop using 2 cores of a
>>> >> Core 2 Duo =)
>>> >>
>>> >
>>> > Because you're not invoking DD there.
>>> >
>>> >> so one of the solutions is probably to try the more recent GMX 5.0
>>> >> to see what happens
>>> >>
>>> >
>>> > Good idea.
>>> >
>>> > -Justin
>>> >
>>> >
>>> >> 2016-03-14 18:22 GMT+01:00 Justin Lemkul <jalemkul at vt.edu>:
>>> >>>
>>> >>>
>>> >>>
>>> >>> On 3/14/16 1:19 PM, James Starlight wrote:
>>> >>>>
>>> >>>>
>>> >>>> I tried to increase the size of the system by providing a much bigger
>>> >>>> bilayer
>>> >>>>
>>> >>>> for this task I obtained another error, also related to DD:
>>> >>>>
>>> >>>> Program g_mdrun_openmpi, VERSION 4.5.7
>>> >>>> Source code file:
>>> >>>> /builddir/build/BUILD/gromacs-4.5.7/src/mdlib/domdec_con.c, line: 693
>>> >>>>
>>> >>>> Fatal error:
>>> >>>> DD cell 0 2 1 could only obtain 0 of the 1 atoms that are connected
>>> >>>> via vsites from the neighboring cells. This probably means your vsite
>>> >>>> lengths are too long compared to the domain decomposition cell size.
>>> >>>> Decrease the number of domain decomposition grid cells.
>>> >>>> For more information and tips for troubleshooting, please check the
>>> >>>> GROMACS
>>> >>>> website at http://www.gromacs.org/Documentation/Errors
>>> >>>> -------------------------------------------------------
>>> >>>>
>>> >>>> "It's So Fast It's Slow" (F. Black)
>>> >>>>
>>> >>>> Error on node 9, will try to stop all the nodes
>>> >>>> Halting parallel program g_mdrun_openmpi on CPU 9 out of 64
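>>> >>>>
>>> >>>> (As a rough illustration of the error's suggestion to use fewer DD grid
>>> >>>> cells, and with placeholder numbers rather than values tuned for this
>>> >>>> system, one could lower the rank count or fix the grid explicitly, e.g.
>>> >>>>
>>> >>>> mpiexec -np 8 mdrun_openmpi -dd 2 2 2
>>> >>>>
>>> >>>> which requests a 2x2x2 decomposition, i.e. only 8 DD cells.)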
>>> >>>>
>>> >>>>
>>> >>>> BTW, I checked the bottom of the system's .gro file and found the
>>> >>>> following box sizes, which seem too small for my system consisting of
>>> >>>> several hundred lipids, aren't they?
>>> >>>>
>>> >>>> 15.00000 15.00000 15.00000 0.00000 0.00000 0.00000 0.00000 0.00000 0.00000
>>> >>>>
>>> >>>
>>> >>> No, that seems fine. But if your box is set up wrong, that's your fault
>>> >>> from the command below :)
>>> >>>
>>> >>>>
>>> >>>> in my case that .gro file was produced automatically using the MARTINI
>>> >>>> method
>>> >>>>
>>> >>>> ./insane.py -f test.pdb -o system.gro -p system.top -pbc cubic -box
>>> >>>> 15,15,15 -l DPPC:4 -l DOPC:3 -l CHOL:3 -salt 0.15 -center -sol W
>>> >>>>
>>> >>>>
>>> >>>> Will be very thankful for any help!!
>>> >>>>
>>> >>>
>>> >>> So you've got a system that is a CG model, with virtual sites? That's
>>> >>> going to create all kinds of havoc. Please do try Googling your error,
>>> >>> because this difficulty has come up before, specifically in the case of
>>> >>> CG systems, which have longer-than-normal bonded interactions and
>>> >>> require some mdrun tuning.
>>> >>>
>>> >>> -Justin
>>> >>>
>>> >>> --
>>> >>> ==================================================
>>> >>>
>>> >>> Justin A. Lemkul, Ph.D.
>>> >>> Ruth L. Kirschstein NRSA Postdoctoral Fellow
>>> >>>
>>> >>> Department of Pharmaceutical Sciences
>>> >>> School of Pharmacy
>>> >>> Health Sciences Facility II, Room 629
>>> >>> University of Maryland, Baltimore
>>> >>> 20 Penn St.
>>> >>> Baltimore, MD 21201
>>> >>>
>>> >>> jalemkul at outerbanks.umaryland.edu | (410) 706-7441
>>> >>> http://mackerell.umaryland.edu/~jalemkul
>>> >>>
>>> >>> ==================================================
>>> >>>
>>> >
>>> >