[gmx-users] Running multiple Gromacs simulations on the same node

Mark Abraham mark.j.abraham at gmail.com
Sat Sep 9 00:36:20 CEST 2017


Hi,

On Sat, Sep 9, 2017 at 12:23 AM MING HA <mingtha at scarletmail.rutgers.edu>
wrote:

> Hi all,
>
>
> Thanks for getting back to me so quickly. I found that when I did not
> specify any pinning, the running time of each simulation on the same node
> scaled proportionally to the number of concurrent GROMACS simulations.
> After some testing, I found that when I do not pin the core on which a
> simulation runs, each GROMACS simulation gets mapped to the same
> processor core.
>

That's up to your OS. If you check your log files, you will see notes from
mdrun about the pinning context it detected. Obviously, if it detects that
something else is already pinning threads, it respects that.
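
For reference, the relevant mdrun option is -pin; roughly (the -deffnm name
below is just a placeholder):

  # default: mdrun decides, and backs off if something else (e.g. the MPI
  # library or the job scheduler) has already set thread affinities
  gmx mdrun -deffnm sim -pin auto

  # force mdrun to pin its own threads
  gmx mdrun -deffnm sim -pin on

  # do no pinning at all and leave thread placement to the OS scheduler
  gmx mdrun -deffnm sim -pin off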



> I have two questions:
> 1) Is the core on which the simulation runs pre-defined (e.g. each GROMACS
> simulation starts on Core 0)?
>

That depends on exactly how you were running GROMACS, which we don't know.
Other infrastructure such as the OS, HPC job schedulers, and MPI libraries
tends to get involved here, too...

> 2) Is there a way to tell GROMACS to allow the OS of the machine to
> automatically schedule different processes to run on different cores?
>

mdrun can be instructed to control everything; see the examples in
http://manual.gromacs.org/documentation/2016.3/user-guide/mdrun-performance.html#examples-for-mdrun-on-one-node.
For this use case you need either the mdrun options or some external tool to
handle all these details explicitly, because mdrun has no way of knowing
that other mdrun processes are running, or how it should share the node
with them.
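
For example, with the default thread-MPI build, a sketch along these lines
(the .tpr and -deffnm names are placeholders for your own inputs) starts two
independent single-core runs and pins each one to its own core:

  # run 1 on logical core 0, run 2 on logical core 1; on a node with
  # hyperthreading you may need larger offsets so the two runs end up
  # on different physical cores
  gmx mdrun -s topol1.tpr -deffnm sim1 -ntmpi 1 -ntomp 1 \
            -pin on -pinoffset 0 -pinstride 1 &
  gmx mdrun -s topol2.tpr -deffnm sim2 -ntmpi 1 -ntomp 1 \
            -pin on -pinoffset 1 -pinstride 1 &
  wait

The -pinoffset values are just there to make sure the two runs do not land
on the same core.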

Mark


> Thanks,
> Ming
>
> On Fri, Sep 8, 2017 at 9:07 AM, Szilárd Páll <pall.szilard at gmail.com>
> wrote:
>
> > If you run MPI-enabled GROMACS and start N simulations with M(=1) ranks
> > each, you will have N*M processes. That's how MPI works. However, you
> > do not necessarily need to use MPI; the default build uses thread-MPI,
> > for instance.
> >
> > --
> > Szilárd
> >
> > On Fri, Sep 8, 2017 at 6:00 AM, MING HA <mingtha at scarletmail.rutgers.edu>
> > wrote:
> > > Hi all,
> > >
> > >
> > > It may seem a bit weird, but I'm trying to run multiple GROMACS
> > > simulations simultaneously on the same node, and I specified each
> > > GROMACS simulation to use only 1 MPI process and 1 OpenMP thread. I'm
> > > doing this because I am trying to check how accurately my model can
> > > predict an application running on a single thread and process.
> > >
> > > My question is: if, for example, I am running multiple GROMACS
> > > simulations simultaneously, each with 1 MPI process and 1 OpenMP
> > > thread, on the same node, does each simulation use separate MPI
> > > processes and OpenMP threads, or are they shared?
> > >
> > >
> > > Sincerely,
> > > Ming

