[gmx-users] Gromacs 5.0 compilation slower than 4.6.5. What went wrong?
Abhi Acharya
abhi117acharya at gmail.com
Fri Sep 5 16:06:40 CEST 2014
Hello,
Is your system solvated with water molecules?
The reason I ask is that the 4.6.5 run used the group cut-off scheme, whereas
the 5.0 run used the Verlet scheme. For systems with water molecules, the
group scheme gives better performance than the Verlet scheme.
For more details, see:
http://www.gromacs.org/Documentation/Cut-off_schemes
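
For reference, the scheme can also be set explicitly in the .mdp file, so you
can test both with the same binary; a minimal sketch (option names as
documented on the page above):

; pick the pair-list scheme explicitly instead of relying on the version default
cutoff-scheme = group    ; 4.6-era default; often faster for water-rich systems
; cutoff-scheme = Verlet ; 5.0 default; required for GPU acceleration

GROMACS 5.0 still supports the group scheme, so this lets you rule the scheme
in or out as the cause of the slowdown.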
Regards,
Abhishek Acharya
On Fri, Sep 5, 2014 at 7:28 PM, Carsten Kutzner <ckutzne at gwdg.de> wrote:
> Hi,
>
> you might want to use g_tune_pme to find out the optimal number
> of PME nodes for 4.6 and 5.0.
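>
> A minimal sketch of such a run (topol.tpr and the step count are just
> placeholders; check g_tune_pme -h for the exact flags of your version):
>
> # benchmark a range of PME rank counts, then launch the fastest setting
> g_tune_pme -np 48 -s topol.tpr -steps 1000 -launch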
>
> Carsten
>
>
>
> On 05 Sep 2014, at 15:39, David McGiven <davidmcgivenn at gmail.com> wrote:
>
> > What is even stranger is that I tried with 16 PME ranks (mdrun -ntmpi
> > 48 -v -c TEST_md.gro -npme 16), got a 15.8% performance loss, and the
> > ns/day figure is very similar: 33 ns/day
> >
> > D.
> >
> > 2014-09-05 14:54 GMT+02:00 David McGiven <davidmcgivenn at gmail.com>:
> >
> >> Hi Abhi,
> >>
> >> Yes, I noticed that imbalance, but I thought GROMACS knew better than the
> >> user how to split PP/PME!
> >>
> >> How is it possible that 4.6.5 guesses better than 5.0?
> >>
> >> Anyway, I tried:
> >> mdrun -nt 48 -v -c test.out
> >>
> >> It exits with the error "You need to explicitly specify the number of MPI
> >> threads (-ntmpi) when using separate PME ranks"
> >>
> >> Then:
> >> mdrun -ntmpi 48 -v -c TEST_md.gro -npme 12
> >>
> >> Then again 35 ns/day, with the warning:
> >> NOTE: 8.5 % performance was lost because the PME ranks
> >> had less work to do than the PP ranks.
> >> You might want to decrease the number of PME ranks
> >> or decrease the cut-off and the grid spacing.
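> >>
> >> If I read that note right, the second suggestion means scaling rcoulomb
> >> and fourierspacing by the same factor, which should keep the PME accuracy
> >> roughly constant while moving work from the PP to the PME ranks.
> >> Something like this (numbers made up, not from my .mdp)?
> >>
> >> ; original settings
> >> rcoulomb       = 1.0   ; real-space cut-off (nm)
> >> fourierspacing = 0.12  ; PME grid spacing (nm)
> >> ; both scaled by 0.9: less PP work, and a finer grid means more PME work
> >> rcoulomb       = 0.9
> >> fourierspacing = 0.108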
> >>
> >>
> >> I don't know much about GROMACS, so I am puzzled.
> >>
> >>
> >>
> >>
> >> 2014-09-05 14:32 GMT+02:00 Abhi Acharya <abhi117acharya at gmail.com>:
> >>
> >>> Hello,
> >>>
> >>> From the log files it is clear that, out of 48 cores, the 5.0 run had 8
> >>> cores allocated to PME while the 4.6.5 run had 12. This seems to have
> >>> caused a greater load imbalance in the 5.0 run.
> >>>
> >>> If you look at the last table in both .log files, you will notice that
> >>> the PME spread/gather wall-time values for 5.0 are more than double
> >>> those for 4.6.5.
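> >>>
> >>> For a quick side-by-side you can grep that row out of the two logs (the
> >>> file names here are placeholders for yours):
> >>>
> >>> grep "PME spread/gather" md_465.log md_50.log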
> >>>
> >>> Try running the simulation with the -npme flag explicitly set to 12.
> >>>
> >>> Regards,
> >>> Abhishek Acharya
> >>>
> >>>
> >>> On Fri, Sep 5, 2014 at 4:43 PM, David McGiven <davidmcgivenn at gmail.com>
> >>> wrote:
> >>>
> >>>> Thanks Szilárd, here they go:
> >>>>
> >>>> 4.6.5 : http://pastebin.com/nqBn3FKs
> >>>> 5.0 : http://pastebin.com/kR4ntHtK
> >>>>
> >>>> 2014-09-05 12:47 GMT+02:00 Szilárd Páll <pall.szilard at gmail.com>:
> >>>>
> >>>>> mdrun writes a log file, named md.log by default, which contains,
> >>>>> among other things, the results of hardware detection and performance
> >>>>> measurements. The list does not accept attachments, so please upload
> >>>>> it somewhere (Dropbox, Pastebin, etc.) and post a link.
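> >>>>>
> >>>>> As a first check, the final timing summary can also be grepped out of
> >>>>> each log, e.g.:
> >>>>>
> >>>>> # ns/day and hour/ns are printed near the end of md.log
> >>>>> grep -B1 "Performance:" md.log
> >>>>>
> >>>>> but the full files are needed for a real diagnosis.
> >>>>>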
> >>>>>
> >>>>> Cheers,
> >>>>> --
> >>>>> Szilárd
> >>>>>
> >>>>>
> >>>>> On Fri, Sep 5, 2014 at 12:37 PM, David McGiven <davidmcgivenn at gmail.com>
> >>>>> wrote:
> >>>>>> The command line in both cases is:
> >>>>>> 1st: grompp -f grompp.mdp -c conf.gro -n index.ndx
> >>>>>> 2nd: mdrun -nt 48 -v -c test.out
> >>>>>>
> >>>>>> By "log file" do you mean the standard output/error? Attached to the
> >>>>>> email?
> >>>>>>
> >>>>>> Thanks
> >>>>>>
> >>>>>> 2014-09-05 12:30 GMT+02:00 Szilárd Páll <pall.szilard at gmail.com>:
> >>>>>>
> >>>>>>> Please post the command lines you used to invoke mdrun as well as the
> >>>>>>> log files of the runs you are comparing.
> >>>>>>>
> >>>>>>> Cheers,
> >>>>>>> --
> >>>>>>> Szilárd
> >>>>>>>
> >>>>>>>
> >>>>>>> On Fri, Sep 5, 2014 at 12:10 PM, David McGiven <davidmcgivenn at gmail.com>
> >>>>>>> wrote:
> >>>>>>>> Dear Gromacs users,
> >>>>>>>>
> >>>>>>>> I just compiled GROMACS 5.0 with the same compiler (gcc 4.7.2), the
> >>>>>>>> same OS (RHEL 6), the same configuration options, and basically
> >>>>>>>> everything else identical to my previous GROMACS 4.6.5 compilation,
> >>>>>>>> and when running one of our typical simulations I get worse
> >>>>>>>> performance.
> >>>>>>>>
> >>>>>>>> 4.6.5 does 45 ns/day
> >>>>>>>> 5.0 does 35 ns/day
> >>>>>>>>
> >>>>>>>> Do you have any idea what could be happening?
> >>>>>>>>
> >>>>>>>> Thanks.
> >>>>>>>>
> >>>>>>>> Best Regards,
> >>>>>>>> D.
>
> --
> Dr. Carsten Kutzner
> Max Planck Institute for Biophysical Chemistry
> Theoretical and Computational Biophysics
> Am Fassberg 11, 37077 Goettingen, Germany
> Tel. +49-551-2012313, Fax: +49-551-2012302
> http://www.mpibpc.mpg.de/grubmueller/kutzner
> http://www.mpibpc.mpg.de/grubmueller/sppexa