[gmx-users] Performance on multiple GPUs per node
Szilárd Páll
pall.szilard at gmail.com
Fri Dec 11 12:54:11 CET 2015
Hi,
Without details of your benchmarks it's hard to comment on why you do not
see a performance improvement with multiple GPUs per node. Sharing some log
files would be helpful.
Are you comparing performance with N cores and a varying number of GPUs? The
balance of hardware resources is a key factor in scaling, and my guess is
that your runs are essentially CPU-bound, hence adding more GPUs does not
help.
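A concrete way to check (just a sketch; the thread-MPI build, the 20-core
count and topol.tpr are placeholder assumptions) is to keep the core count
fixed and vary only the GPUs, e.g.:

# same 20 cores in both runs; only the number of GPUs used changes
gmx mdrun -s topol.tpr -ntmpi 2 -ntomp 10 -gpu_id 00 -pin on -resethway -noconfout -nsteps 20000 -g 1gpu.log   # both ranks share device 0
gmx mdrun -s topol.tpr -ntmpi 2 -ntomp 10 -gpu_id 01 -pin on -resethway -noconfout -nsteps 20000 -g 2gpu.log   # one rank per device

If the two runs report nearly the same ns/day, the bottleneck is on the CPU
side and the second GPU is mostly idle.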
Have a look at these papers:
https://doi.org/10.1002/jcc.24030
https://doi.org/10.1007/978-3-319-15976-8_1
especially the former covers the topic quite well, and both show scaling of a
<100k-atom protein system to 32-64 nodes (dual socket/dual GPU).
Cheers,
--
Szilárd
On Fri, Dec 11, 2015 at 11:54 AM, Jens Krüger <
krueger at informatik.uni-tuebingen.de> wrote:
> Dear all,
>
> we are currently planning a new cluster at our university's compute
> centre. The big question on our side is how many and which GPUs we should
> put into the nodes.
>
> We have access to a test system with four Tesla K80s per node. Using a
> single GPU we can reach something like 23 ns/day for the ADH system (PME,
> cubic), which is pretty much in line with, e.g.,
> http://exxactcorp.com/index.php/solution/solu_list/84
>
> When trying to use 2 or more GPUs on one node, the performance plunges to
> below 10 ns/day no matter how we split the MPI/OMP threads. Does any of
> you have access to a comparable hardware setup? We would be interested in
> benchmark data answering the question: does GROMACS 5.1 scale on more than
> one GPU per node?
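>
> The splits we tried were along the lines of the sketch below (assuming here
> a thread-MPI build and 32 cores per node as an example; the actual
> rank/thread counts varied; each K80 board shows up as two CUDA devices):
>
> # 8 PP ranks, one per GPU device, 4 OpenMP threads each
> gmx mdrun -s topol.tpr -ntmpi 8 -ntomp 4 -gpu_id 01234567 -pin on -resethway -noconfout
> # 4 PP ranks, 8 threads each, using only four of the eight devices
> gmx mdrun -s topol.tpr -ntmpi 4 -ntomp 8 -gpu_id 0246 -pin on -resethway -noconfout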
>
> Thanks and best wishes,
>
> Jens
>