[gmx-developers] Running Gromacs on GPUs on multiple machines

Mark Abraham mark.j.abraham at gmail.com
Thu May 29 18:56:18 CEST 2014


On May 29, 2014 4:53 PM, "Vedran Miletić" <rivanvx at gmail.com> wrote:
>
> 2014-05-29 16:36 GMT+02:00 Anders Gabrielsson <andgab at kth.se>:
> > You'll probably have to supply -npernode/-ppn too so that your mpirun
> > doesn't put all MPI ranks on the same host.
> >
> > /Anders
> >
> > On 29 May 2014, at 13:17, Vedran Miletić <rivanvx at gmail.com> wrote:
> >
> > Fatal error:
> > Incorrect launch configuration: mismatching number of PP MPI processes
> > and GPUs per node.
> > mdrun_mpi was started with 5 PP MPI processes per node, but you
> > provided 1 GPU.
>
> Hi Anders,
>
> thanks for your response. Weirdly enough, mpirun actually doesn't run
> all processes on one node; it distributes them as equally as possible,
> going through the hostfile in round-robin fashion. (I verified this
> by running hostname.)

What were the hostnames? There is some logic that tries to reconstruct the
nodes from hostnames as supplied by MPI, but GROMACS assumes the form

yourmachinename043

with the digits varying across nodes. So far, this has worked, since
sysadmins are orderly types... IIRC we use a hash in GROMACS 5, or at least
we talked about it!
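A minimal sketch of that kind of heuristic, assuming a trailing-digit parse (the function name and details here are illustrative, not GROMACS's actual code) — it shows why hostnames without distinct digit suffixes make separate hosts look like one node:

```python
# Hypothetical sketch of a hostname-digit heuristic for grouping MPI
# ranks into physical nodes. Illustrates the failure mode only; the
# real GROMACS logic differs in detail.

def node_id_from_hostname(hostname):
    """Return the trailing digit run of a hostname as an integer node id.

    Hostnames with no trailing digits all map to 0, so ranks on
    different such hosts are wrongly treated as sharing one node.
    """
    digits = ""
    for ch in reversed(hostname):
        if ch.isdigit():
            digits = ch + digits
        else:
            break
    return int(digits) if digits else 0

# "yourmachinename043"-style names are distinguished correctly:
assert node_id_from_hostname("node042") != node_id_from_hostname("node043")

# ...but hosts named e.g. "alpha" and "beta" collapse onto one
# apparent node, inflating the per-node PP rank count:
assert node_id_from_hostname("alpha") == node_id_from_hostname("beta")
```

If all five ranks collapse onto one apparent node this way, mdrun would count 5 PP ranks "per node" against the single GPU id, which matches the reported error.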

Mark

> However, it seems that for some reason Gromacs assumes mpirun runs all
> 5 processes on a single node. Regardless, I tried
>
> mpirun -np 5 -npernode 1 -hostfile ... mdrun_mpi -v -deffnm ... -gpu_id 0
>
> but it produces the same error. Anything else I could try?
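For context, the check behind that error compares the number of PP ranks mdrun believes are on the node against the number of GPU ids supplied; if I remember the option right, repeating an id in -gpu_id (e.g. "00") lets several ranks share one GPU. A sketch of that comparison (illustrative, not GROMACS source):

```python
# Illustrative sketch (not GROMACS source) of the consistency check
# behind "mismatching number of PP MPI processes and GPUs per node".

def check_gpu_assignment(pp_ranks_per_node, gpu_id_string):
    """Each PP rank on a node needs one entry in the -gpu_id string;
    repeating an id shares one GPU among several ranks."""
    gpu_ids = [int(c) for c in gpu_id_string]
    if pp_ranks_per_node != len(gpu_ids):
        raise ValueError(
            "mdrun_mpi was started with %d PP MPI processes per node, "
            "but you provided %d GPU(s)."
            % (pp_ranks_per_node, len(gpu_ids)))
    return gpu_ids

# 5 apparent ranks per node with -gpu_id 0 fails, as in the report:
#   check_gpu_assignment(5, "0")  -> ValueError
# 5 ranks sharing GPU 0 via -gpu_id 00000 passes:
assert check_gpu_assignment(5, "00000") == [0, 0, 0, 0, 0]
```

This is only a workaround for the symptom, of course; if node detection is miscounting ranks per node, that is the underlying problem.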
>
> Regards,
> Vedran
> --
> Gromacs Developers mailing list
>
> * Please search the archive at
> http://www.gromacs.org/Support/Mailing_Lists/GMX-developers_List before
> posting!
>
> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>
> * For (un)subscribe requests visit
> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-developers
> or send a mail to gmx-developers-request at gromacs.org.

