[gmx-users] GROMACS-4.6.3 CUDA version on multiple nodes each having 2 GPUs

Prajapati, Jigneshkumar Dahyabhai j.prajapati at jacobs-university.de
Mon Nov 18 12:51:55 CET 2013


Hi Szilárd,

My apologies for the misunderstanding. The old thread I mentioned is not mine; I just found it on the website.

Unlike in that thread, our nodes have homogeneous hardware. I tried all the options, and everything works perfectly on a single node. The problems started when I switched to two nodes: I could run the job on many occasions, but I couldn't use the GPUs on the second node. Every time, GROMACS uses the GPUs on the first node only and fails to use those on the second node.

The command I used was,

mpirun -np 4  ${GROMACS}/mdrun  -v -deffnm $configfile

and please find the log file attached (em.log). You can see that GROMACS detects only the two GPUs on the first node and then crashes (which is expected, since the number of MPI ranks does not match the number of GPUs per node). The main thing to note is that only the two GPUs on the first node were detected.
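
(To check where mpirun actually places the ranks, I suppose a quick test such as

mpirun -np 4 hostname

would help: if all four lines print the first node's name, then the ranks were never distributed across the two nodes in the first place. I am assuming here that mpirun honours the scheduler's node allocation.)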


But when I tried with,

mpirun -np 4  ${GROMACS}/mdrun -gpu_id 0011 -v -deffnm $configfile

This time the job completed successfully but used the GPUs on the first node only. Please find the log file attached (em2.log).
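
(If I read the mdrun documentation correctly, the -gpu_id string is interpreted per node: 0011 maps the ranks running on a node to GPUs 0, 0, 1, 1 in that order. So if all four ranks end up on the first node, -gpu_id 0011 simply runs them all there, which would explain what I see; with two ranks per node the string would just be 01.)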
 
Likewise, I tried with,

mpirun -np 2  ${GROMACS}/mdrun -v -deffnm $configfile
mpirun -np 2  ${GROMACS}/mdrun -gpu_id 01 -v -deffnm $configfile

The job ran successfully but did not use the GPUs on the second node.

Each node has 2 GPU cards. With that configuration, please let me know how to launch the job on two nodes.
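
For reference, what I would expect to work is something along these lines (assuming Open MPI, whose -hostfile and -npernode options control rank placement; "hosts" is a hypothetical file listing the two node names):

mpirun -np 4 -npernode 2 -hostfile hosts ${GROMACS}/mdrun -gpu_id 01 -v -deffnm $configfile

With two ranks per node, -gpu_id 01 should map each node's ranks to that node's GPUs 0 and 1. Is that the right approach?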

Thank you.

-Jignesh


________________________________________
From: gromacs.org_gmx-users-bounces at maillist.sys.kth.se on behalf of Szilárd Páll [pall.szilard at gmail.com]
Sent: Thursday, November 14, 2013 2:42 PM
To: Discussion list for GROMACS users
Subject: Re: [gmx-users] GROMACS-4.6.3 CUDA version on multiple nodes each having 2 GPUs

Hi Jignesh,

I don't get what the issue is; you need to be more specific than
"fails" and "none of them worked." Please provide the exact command
line, the stderr output, and log files, as otherwise we can't tell
exactly what error you are getting.

Previously you seemed to hint that you had inhomogeneous hardware
(i.e. nodes with different CPU/GPU setups), but now you're saying that
all nodes are the same, in which case it should all work just fine
with the default settings!

Cheers,
--
Szilárd


On Wed, Nov 13, 2013 at 7:55 PM, Prajapati, Jigneshkumar Dahyabhai
<j.prajapati at jacobs-university.de> wrote:
> Hello,
>
> I am trying to run an MPI-, OpenMP- and CUDA-enabled GROMACS 4.6.3 on nodes with 12 cores (2 CPUs) and 2 GPUs (Tesla M2090) each. The problem is that when I launch a job, GROMACS uses only the GPUs on the first node it comes across and fails to use the GPUs on the other nodes.
>
> The command I used for two GPU-enabled nodes was,
>
> mpirun -np 2  mdrun -v -deffnm $configfile
>
> I tried many other options, but none of them worked. One thing to remember here is that on all the nodes the GPUs got IDs 0 and 1, so the -gpu_id option also didn't work.
>
> This old thread gave me some ideas, but I didn't understand it completely.
> http://lists.gromacs.org/pipermail/gmx-users/2013-March/079802.html
>
> Please suggest possible solutions for this issue.
>
> Thank you
> --Jignesh