[gmx-users] AMD vs Intel, nvidia 20xx
melichercik at leaf.nh.cas.cz
Wed Sep 19 12:52:57 CEST 2018
I've put several notes inline in the text below...
I can't help you with exact hardware optimisation because I don't have experience with more than 1 GPU per CPU.
On Wed, Sep 19, 2018 at 10:50:42AM +0200, Tamas Hegedus wrote:
> I am planning to buy 1 or 2 GPU workstations/servers for stand alone
> gromacs and gromacs+plumed (with plumed I can efficiently use only 1 GPU).
> I would like to get some suggestions for the best performance/price
> ratio. More specifically, some info on the correlation between PCIe lanes
> and performance.
> Based on google, gmx mail list, and my experience I think that
> * an i9 with 44 PCIe lanes
> * 4 gtx 1080Ti
> are OK and fits the budget.
I'm not sure you can fit 4 (at least dual-slot) GPUs on a single ATX board - there isn't enough space between the PCIe slots. Maybe on some server board, but I'm not sure you can put 4 GeForce cards there instead of Tesla/Quadro ones.
> However, there are several upcoming issues:
> 1. I expect the 4 GPUs to run at 70% (1 PME and 3 PP) with 16 cores. I do
> not know whether the 44 PCIe lanes are limiting or not.
> 2. Are 4x8 lanes OK? Should I look for a motherboard with a PLX chip
> that provides 64 virtual PCIe lanes? Do you have experience with this type
> of mobo? I have read that on average it will be better than 4x8, but
> others say that it makes the system slower.
> 3. An option might be to go with AMD CPUs with more PCIe lanes, but I do
> not trust AMD when high CPU performance is needed. But this might come
> from the past and I should forget this. Have you used recent AMD
> processors with gromacs with GPU acceleration?
AMD CPUs are OK. I have several Ryzens with Radeon Nano (Fiji GPU) cards and they are working nicely. Colleagues from another lab have a 16-core Threadripper and are happy with it (using VASP and CPU-only calculations). The question is how NUMA memory access will influence performance on the new Threadrippers with 24 and 32 cores; I haven't seen any results on this. BTW a Xeon Gold 6xxx would be nice, because it has two AVX-512 FMA units (see some tests at www.servethehome.com), but its price...
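If you do end up on a multi-die CPU like Threadripper, GROMACS's own pinning options can at least keep each run's threads (and thus its memory traffic) on one die. A minimal sketch, assuming GROMACS 2018 on a 16-core/32-thread part with two dies, running two independent jobs (the `job1`/`job2` names are placeholders):

```shell
# Sketch: pin each job to its own half of a 16-core/32-thread Threadripper
# so memory allocations stay local to one die.  -pinoffset counts logical
# CPUs; -pinstride 2 skips the SMT siblings.
gmx mdrun -deffnm job1 -ntmpi 1 -ntomp 8 -pin on -pinoffset 0  -pinstride 2 &
gmx mdrun -deffnm job2 -ntmpi 1 -ntomp 8 -pin on -pinoffset 16 -pinstride 2 &
wait
```

This is a command-line sketch only; the right offsets depend on how your kernel numbers the hardware threads, which `lscpu -e` will show.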
> 4. I have the feeling using gmx 2018 that I would not gain performance
> with (physical) 32 cores and 4 GPUs, since 16 already is sufficient and
> the 70% GPU usage is not because of waiting for the CPU. I do not know
> how to check this. What is your experience?
> 5. I am afraid to buy more GPU power than necessary.
> It might be better to buy 2 faster and 2 slower GPUs, or only 3 newer GPUs
> to optimize the system.
I have the feeling the best is to have identical GPUs (all working on the same job). If you would use the machine for several jobs in parallel, maybe mixed GPUs are fine. But then the question is whether to split the whole computer into several somewhat smaller ones (e.g. 8 cores + 1-2 GPUs each) and get more of them.
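For the 1 PME + 3 PP layout asked about in point 1, GROMACS 2018 lets you place the GPU tasks explicitly, and nvidia-smi can answer point 4 (are the GPUs idling while they wait for the CPU?). A hedged sketch, assuming 4 GPUs with ids 0-3 and 16 cores; the `md` file name is a placeholder and the rank/thread split needs tuning for your system:

```shell
# Sketch: 4 thread-MPI ranks (3 PP + 1 dedicated PME), one GPU task per
# rank.  -gputasks lists one GPU id per task, in rank order, so GPUs 0-2
# get the nonbonded work and GPU 3 gets PME.
gmx mdrun -deffnm md \
    -ntmpi 4 -ntomp 4 -npme 1 \
    -nb gpu -pme gpu \
    -gputasks 0123 -pin on

# While it runs, watch per-GPU utilisation (the "sm" column) to see
# whether the GPUs sit idle waiting for the CPU:
nvidia-smi dmon -s u
```

The timing breakdown at the end of md.log also shows where the ranks spend their time waiting, which answers point 4 more directly than a single utilisation percentage.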
> However, I most likely can not get a 1070Ti. In addition, it is better
> for the system not to run at 100%.
> BUT: there will be new RTX 20xx out and likely I can get lower version
> number for less power consumption and the same computing capacity (e.g.
> 2070 instead of 1080).
> BUT: it seems to me that the RTX 2070 has far fewer cores (2304) than the
> 1080Ti, and the 2080Ti has more cores than the 1080Ti.
> 4x 1080: 4*3584 cores, 64 PCIe lanes total
> 3x 2080: 3*4352 cores, 48 PCIe lanes total
> However, in the latter case more PP work may run on each GPU and again
> the CPU/GPU communication may become the limiting factor.
I think you cannot count only CUDA cores; their frequency is also important. However, I don't know which is better - more cores at a lower frequency or the other way round - and it probably depends on system size anyway. But comparing the values from Wikipedia, I don't think there will be a huge jump in computing power (CUDA cores times frequency). And the power consumption is more or less the same (within ca. 5-20 W), and that changes with cooling performance etc. anyway.
It would be different if Gromacs could use the Tensor cores or even the ray-tracing cores. I don't know anything about this, so could someone shed some light on it? Thanks.
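The cores-times-frequency point can be made concrete: peak FP32 throughput is roughly cores * 2 FLOPs/cycle (one FMA) * boost clock. Using the published reference boost clocks (an assumption - real clocks vary with cooling and board vendor):

```shell
#!/bin/sh
# Rough peak FP32 GFLOPS = cores * 2 FLOPs/cycle (FMA) * boost MHz / 1000.
# Boost clocks are the NVIDIA reference values; actual clocks vary.
for card in "GTX1080Ti 3584 1582" "RTX2070 2304 1620" "RTX2080Ti 4352 1545"; do
    set -- $card
    echo "$1: $(( $2 * 2 * $3 / 1000 )) GFLOPS"
done
```

So per card it is roughly 13.4 TFLOPS (2080 Ti) vs 11.3 TFLOPS (1080 Ti) and 7.5 TFLOPS (2070) - a step, not a jump, which matches the conclusion above.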
> Thanks for all the opinion and suggestion,
> Tamas Hegedus, PhD
> Senior Research Fellow
> Department of Biophysics and Radiation Biology
> Semmelweis University | phone: (36) 1-459 1500/60233
> Tuzolto utca 37-47 | mailto:tamas at hegelab.org
> Budapest, 1094, Hungary | http://www.hegelab.org
> Gromacs Users mailing list
> * Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!
> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
> * For (un)subscribe requests visit
> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a mail to gmx-users-request at gromacs.org.