[gmx-developers] Branches w/working OpenCL support

Mirco Wahab mirco.wahab at chemie.tu-freiberg.de
Sun Jun 7 12:23:43 CEST 2015


Hi Szilárd,

On 31.05.2015 22:15, Szilárd Páll wrote:
> [...]
> https://gerrit.gromacs.org/#/c/4314/
> Use the version in review on gerrit I linked above, that's the most up
> to date code.
>[...]

I completed some tests on an AMD device (Pitcairn GPU,
Radeon R9 270X), using adh_cubic_vsites, rnase_cubic, and
villin_vsites from the "gromacs acceleration page".

So far, the simulations appear to run stably and deliver correct
results as long as *no pressure coupling* is used. As I already
noted for the NVIDIA OpenCL test, this affects the AMD OpenCL
implementation too: pressure coupling simply doesn't work yet.

Compilation: an additional #include <algorithm> is needed in
src/gromacs/gmxana/gmx_wham.cpp to provide std::min/std::max
for VC. Another problem is the use of binary literals in
src\gromacs\listed-forces\bonded.cpp, which are a C++14
feature. Maybe these could be written as hexadecimal literals
to avoid problems with non-C++14 compilers?
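
For illustration, a minimal sketch of both fixes (identifiers
and values are hypothetical, not the actual code from
gmx_wham.cpp or bonded.cpp):

  #include <algorithm>  // provides std::min/std::max, required by MSVC

  int clamp_example(int v)
  {
      // C++14 binary literal, rejected by pre-C++14 compilers:
      //   const int mask = 0b00011111;
      // equivalent hexadecimal literal, accepted by C++98 and later:
      const int mask = 0x1f;
      return std::min(std::max(v, 0), mask);
  }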

Pressure coupling: depending on the simulation setup with
pressure coupling, gmx will either crash immediately (when
using PME) or continuously enlarge the volume (e.g. when
using RF + Berendsen).
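
For reference, a minimal sketch of the kind of coupling
settings that trigger the volume growth (hypothetical values;
the exact inputs are in the zip linked below):

  ; .mdp fragment -- RF electrostatics + Berendsen pressure coupling
  coulombtype      = Reaction-Field
  pcoupl           = berendsen
  pcoupltype       = isotropic
  tau-p            = 1.0        ; ps
  ref-p            = 1.0        ; bar
  compressibility  = 4.5e-5     ; 1/bar (water)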

I put the files on a host; the archive includes log files and
input for a very simple SPC water system w/pressure coupling
(which doesn't work w/OpenCL but runs perfectly with -nb cpu):
http://spule.compch.tu-freiberg.de/~wahab/gromacs/gmx_opencl_2015-06_szilard.zip

short overview:

  adh_cubic_vsites
   cpu/pme:       5.738    gpu/pme:      17.421
   cpu/rf:        7.913    gpu/rf:       26.344

  rnase_cubic
   cpu/pme:      15.001    gpu/pme:      39.363
   cpu/rf:       24.729    gpu/rf:       59.179

  villin_vsites
   cpu/pme:     124.86     gpu/pme:     284.213
   cpu/rf:      151.05     gpu/rf:      310.794

  (water_small_npt/nvt)
   cpu/pme/npt:  20.00     gpu/pme/npt:    n/a
   cpu/pme/nvt:  21.65     gpu/pme/nvt:   52.05


What is the current status of the planned inclusion of
OpenCL into a GROMACS release?

Regards,

M.


