[gmx-users] Workstation choice

Benson Muite benson.muite at ut.ee
Sun Sep 9 07:59:53 CEST 2018


This is old, but seems to indicate Beowulf clusters work quite well:

https://docs.uabgrid.uab.edu/wiki/Gromacs_Benchmark

Szilárd helped create a benchmark data set, available at:
http://www.gromacs.org/About_Gromacs/Benchmarks
http://www.gromacs.org/@api/deki/files/240/=gromacs-5.0-benchmarks.pdf
ftp://ftp.gromacs.org/pub/benchmarks/gmxbench-3.0.tar.gz
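
If you get temporary access to a candidate machine, a quick comparison is to 
run a representative .tpr for a fixed number of steps and compare the ns/day 
reported at the end of the log. A minimal sketch (bench.tpr is a hypothetical 
input standing in for one of your own systems):

    gmx mdrun -s bench.tpr -deffnm bench -nsteps 10000 -resethway -noconfout
    grep Performance bench.log

-resethway restarts the timers halfway through so startup and initial load 
balancing are excluded, and -noconfout skips writing the final coordinates.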

Does your use case involve a large number of ensemble simulations which 
can be done in single precision without error correction? If so, might 
you be better off building a small Beowulf cluster with lower-spec 
processors that have integrated GPUs? For example, a Ryzen 3 with 
integrated graphics is about $100; motherboard, RAM and power supply 
would probably get you to about $300 per node. An Intel Core i3 bundle 
would be about $350. Setup could be done using the OpenHPC stack:
http://www.openhpc.community/
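
An ensemble on such a cluster can be driven from a single mdrun invocation 
with -multidir. A rough sketch, assuming an MPI-enabled build (gmx_mpi) and 
eight hypothetical run directories run1 ... run8, each containing a topol.tpr:

    mpirun -np 8 gmx_mpi mdrun -multidir run1 run2 run3 run4 run5 run6 run7 run8

Each directory gets an equal share of the ranks (here one each), so the 
ensemble members run independently within a single MPI job.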

This would get you a personal 5-7 node in-house cluster. However, the 
ability to do maintenance and the availability of local repair support 
may also be important when considering lifetime cost, not just the 
initial purchase price. GROMACS's current and future support for OpenCL 
is likely also important here, since integrated GPUs would use OpenCL 
rather than CUDA.
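
For reference, an OpenCL-enabled GROMACS 2018 build can be configured roughly 
as below (a sketch only; whether a given integrated GPU is actually supported 
should be checked against the release notes for the version you install):

    cmake .. -DGMX_GPU=ON -DGMX_USE_OPENCL=ON \
             -DGMX_BUILD_OWN_FFTW=ON \
             -DCMAKE_INSTALL_PREFIX=$HOME/gromacs-2018
    make -j 8 && make install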

At least one computer store in my region has allowed benchmarking.
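
Regarding the two-GPU question below: GROMACS 2018 can offload PME as well as 
the non-bonded work to a GPU, which changes how much a second GPU helps a 
single run compared with 2016/5.1. As a rough, untuned sketch for one 
simulation on an 8-core/16-thread CPU with two GPUs (the rank counts and the 
-gputasks mapping are guesses that need benchmarking on the actual machine):

    gmx mdrun -deffnm md -ntmpi 4 -ntomp 4 -nb gpu -pme gpu -npme 1 -gputasks 0011

For throughput, two independent runs pinned to separate halves of the CPU, 
one per GPU, are often more efficient than splitting one run across both:

    gmx mdrun -deffnm sim1 -ntmpi 1 -ntomp 8 -nb gpu -pme gpu -gpu_id 0 \
              -pin on -pinoffset 0 -pinstride 1 &
    gmx mdrun -deffnm sim2 -ntmpi 1 -ntomp 8 -nb gpu -pme gpu -gpu_id 1 \
              -pin on -pinoffset 8 -pinstride 1 &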

On 09/07/2018 09:40 PM, Olga Selyutina wrote:
> Hi,
> Many thanks for the valuable information.
> If it isn't too much trouble, could you say how the performance gain from
> using a second GPU on a single simulation has changed in GROMACS 2018 vs.
> older versions (2016, 5.1), where it was 20-30%?
>
>
> 2018-09-07 23:25 GMT+07:00 Szilárd Páll <pall.szilard at gmail.com>:
>
>> Are you intending to use it mostly/only for running simulations or also as
>> a desktop computer?
>>
> Yes, it will be mostly used for simulations.
>
>> I'm not on the top of pricing details so you should probably look at some
>> configs and get back with concrete CPU + GPU (+price) combinations and we
>> might be able to guesstimate what's best.
>>
>>
> These CPU and GPU options are reasonably priced in our region:
> *GPU*
> GTX 1070, ~1700 MHz, 1920 CUDA cores - $514
> GTX 1080, ~1700 MHz, 2560 CUDA cores - $615
> GTX 1070 Ti, ~1700 MHz, 2432 CUDA cores - $615
> GTX 1080 Ti, ~1600 MHz, 3584 CUDA cores - $930
>
> *CPU*
> Ryzen 7 2700X - $357
> 4200 MHz, 8/16 cores/threads, L1/L2/L3 cache 768 KB/4 MB/16 MB, 105 W, max temp. 85°C
>
> Threadripper 1950X - $930
> 4000 MHz, 16/32 cores/threads, L1/L2/L3 cache 1.5/8/32 MB, 180 W, max temp. 68°C
>
> i7 8086K - $515
> 4800 MHz, 6/12 cores/threads, L2/L3 cache 1.5/12 MB, 95 W, max temp. 100°C
>
> i7 8700K - $442
> 4600 MHz, 6/12 cores/threads, L2/L3 cache 1.5/12 MB, 95 W, max temp. 100°C
>
> The most suitable CPU+GPU combinations are as follows:
> 1) Ryzen 7 2700X + two GTX 1080 - $1587
> 1.1) Ryzen 7 2700X + one GTX 1080 + one GTX 1080*Ti* - $1900 (maybe?)
> 2) Threadripper 1950X + one GTX 1080Ti - $1860
> 3) i7 8700K + two GTX 1080 - $1672
> 4) Ryzen 7 2700X + three GTX 1070 - $1900
> My suggestions:
> Variant 1 seems to be the most suitable.
> Variant 2 seems to be suitable only if a single simulation at a time will be
> run on the workstation.
> It's a bit confusing that in synthetic tests/games the i7 8700 performs
> better than the Ryzen 7 2700.
> Thanks a lot again for your advice, it has already clarified a lot!
>
>
>


