[gmx-developers] How to parallel Gromacs with multi-thread?

Berk Hess hess at kth.se
Tue Nov 24 09:25:44 CET 2015


Hi,

Why not simply use MPI parallelization?
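For example, a pure-MPI run over the machine's cores would look something like the line below (the rank count, the run name topol, and the gmx_mpi binary name are placeholders; the latter depends on how the MPI build was configured):

    mpirun -np 16 gmx_mpi mdrun -deffnm topol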

But what (exotic) architecture does not support OpenMP and SIMD? If it 
doesn't have SIMD, I would think such a machine is not worth using for 
production runs. You get great performance from a cheap Intel CPU + 
NVidia GPU machine.

Cheers,

Berk

On 2015-11-24 06:05, Vinson Leung wrote:
> Hi everyone, I am a new learner of Gromacs and I want to run Gromacs 
> on a multi-core CPU machine for my research. Because the machine we 
> use only supports MPI (no OpenMP, no SIMD), I profiled the MPI-only 
> version of Gromacs-5.0.4 and found that the hotspot was 
> nbnxn_kernel_ref() in src/gromacs/mdlib/nbnxn_kernel_ref.c, which 
> accounted for 80% of the total running time. Naturally I want to 
> accelerate nbnxn_kernel_ref() by parallelizing it with multiple 
> threads. After some simple analysis I found that the structure of 
> nbnxn_kernel_ref() is roughly like below:
> ========================================================
> for (nb = 0; nb < nnbl; nb++)
> {
>     ......
>     for (n = 0; n < nbl->nci; n++)   // defined in nbnxn_kernel_ref_outer.h
>     {
>         ....
>     }
>     ......
> }
> ========================================================
> So here is my question. When I compile with OpenMP=OFF, the value of 
> nnbl is 1 at runtime. Can I parallelize the inner loop by simply 
> splitting its iterations (nbl->nci) evenly across the cores (see the 
> sketch below)? Alternatively, I found that when I compile with 
> OpenMP=ON the value of nnbl is greater than 1, so the outer loop can 
> be parallelized across threads. But my machine does not support 
> OpenMP. Is there any way to modify the code so that, compiled with 
> OpenMP=OFF, nnbl is still greater than 1?
> Thanks.
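For reference, below is a minimal, self-contained sketch of the inner-loop split described in the question, using raw POSIX threads since OpenMP is not available on the machine in question. This is not GROMACS code: NTHREADS, NCI, and work() are placeholders standing in for the thread count, nbl->nci, and the per-i-cluster work done in nbnxn_kernel_ref_outer.h. Note that in the real kernel each thread would also need private force and energy output buffers to avoid write conflicts, which this sketch ignores.

========================================================
#include <pthread.h>
#include <stdio.h>

#define NTHREADS 4     /* placeholder thread count */
#define NCI      1000  /* placeholder for nbl->nci */

static double result[NTHREADS];   /* per-thread partial results */

typedef struct {
    int thread_id;
    int ci_start;   /* first i-cluster index for this thread */
    int ci_end;     /* one past the last i-cluster index */
} thread_arg_t;

/* Placeholder for the per-i-cluster work of the reference kernel */
static double work(int ci)
{
    return (double)ci * 0.001;
}

static void *run_chunk(void *p)
{
    thread_arg_t *arg = p;
    double        sum = 0.0;
    int           ci;

    for (ci = arg->ci_start; ci < arg->ci_end; ci++)
    {
        sum += work(ci);
    }
    result[arg->thread_id] = sum;

    return NULL;
}

int main(void)
{
    pthread_t    threads[NTHREADS];
    thread_arg_t args[NTHREADS];
    double       total = 0.0;
    int          t;

    /* Evenly divide the NCI iterations over the threads */
    for (t = 0; t < NTHREADS; t++)
    {
        args[t].thread_id = t;
        args[t].ci_start  = (NCI * t)       / NTHREADS;
        args[t].ci_end    = (NCI * (t + 1)) / NTHREADS;
        pthread_create(&threads[t], NULL, run_chunk, &args[t]);
    }

    /* Wait for all threads and reduce the partial results */
    for (t = 0; t < NTHREADS; t++)
    {
        pthread_join(threads[t], NULL);
        total += result[t];
    }
    printf("total = %g\n", total);

    return 0;
}
========================================================

Compile with, e.g., cc -O2 sketch.c -lpthread.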
