[gmx-users] possible configuration for gromacs gpu compute node

Szilárd Páll pall.szilard at gmail.com
Tue Apr 8 17:45:07 CEST 2014


Harry,

Is the workload you are describing a typical one? Are the machines going
to be used exclusively in single-rank/no-domain-decomposition mode? If
that's the case, instead of paying $1.5K for the E5-2643v2, for
(almost) the same price as a single one of these Xeons you can get
*three* i7-4930Ks, which for GROMACS are pretty much equivalent. Highly
L3-sensitive codes may benefit from the 2x larger cache (25 MB on the
Xeon vs. 12 MB on the i7), but that should not make much of a difference
with GROMACS unless the system is unusually large for a single node.

The price of a dual-socket chassis+motherboard will also be
substantially higher than that of a good X79 workstation motherboard.
Even if for some reason you need an "enterprise-grade" (for some
arbitrary definition of "enterprise"), possibly rackmountable chassis,
a single-socket setup, e.g. from Supermicro or similar, will be much
more cost-effective than the rather overpriced Intel dual-socket setups.

Regarding the CPU-GPU balance, if you have a typical workload, I
strongly suggest that you try before buying. Some companies have
test-drive programs (see NVIDIA's and its partners' sites).

Regarding node sharing, your job scheduler should be able to take care
of handing the right set of cores and the right GPU to the processes
sharing a node (via CPU affinities, which mdrun respects, plus GPU
"affinity", e.g. CUDA_VISIBLE_DEVICES).
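
For illustration, launched by hand outside the scheduler, the two jobs
on a 12-core, 2-GPU node could look roughly like this (a minimal sketch
assuming GROMACS 4.6 mdrun options, hyper-threading disabled so logical
core numbers match physical cores, and hypothetical jobA/jobB tpr names):

    # Job A: first GPU, cores 0-5
    CUDA_VISIBLE_DEVICES=0 mdrun -deffnm jobA -ntmpi 1 -ntomp 6 -pin on -pinoffset 0 &
    # Job B: second GPU, cores 6-11
    CUDA_VISIBLE_DEVICES=1 mdrun -deffnm jobB -ntmpi 1 -ntomp 6 -pin on -pinoffset 6 &

Under SGE the same idea applies: have each job request 6 slots plus one
GPU, and have the job script (or a prolog) set CUDA_VISIBLE_DEVICES and
the pin offset according to which half of the node it was given.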

Cheers,
--
Szilárd

On Tue, Apr 8, 2014 at 11:57 AM, Harry Mark Greenblatt
<harry.greenblatt at weizmann.ac.il> wrote:
> BS"D
>
> Dear All
>
> I was thinking of configuring some GPU compute nodes with the following:
>
> 2 x E5-2643v2 (6 cores each, 12 cores total)
> 2 x GTX 770.
>
> The idea is to run two Gromacs jobs on each node, each using 6 cores and 1 GPU card.
>
> 1).  Does this sound reasonable?
> 2).  If so, I am not clear on how to submit jobs so that if one job is already running, the second will take the unoccupied GPU card (especially since this will be done under ROCKS and SGE; of course, that question may be best addressed to the ROCKS BB).
>
> Thanks
>
> Harry
>
>
>
> -------------------------------------------------------------------------
>
> Harry M. Greenblatt
>
> Associate Staff Scientist
>
> Dept of Structural Biology           Harry.Greenblatt at weizmann.ac.il
>
> Weizmann Institute of Science        Phone:  972-8-934-3625
>
> 234 Herzl St.                        Facsimile:   972-8-934-4159
>
> Rehovot, 76100
>
> Israel
>
>
>
>
>

