[gmx-users] Running Gromacs on GPUs

Mark Abraham mark.j.abraham at gmail.com
Thu May 11 13:07:53 CEST 2017


Hi,

Each of the four simulations on your node has its own particle-particle
(PP) rank. We knew that having multiple PP ranks per node didn't work
with OpenCL in 5.1.x, so we disabled that case. It was fixed for the
2016 release, so upgrading should resolve the error.
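
A minimal sketch of that upgrade path (assuming the 2016 build accepts
the same cmake options and that mdrun's multi-simulation flags are
unchanged from your 5.1.4 invocation):

  # Rebuild against the GROMACS 2016 sources with the same configuration
  cmake .. -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON \
      -DGMX_MPI=on -DGMX_GPU=on -DGMX_USE_OPENCL=on
  make && make install

  # The unchanged command then starts four walkers, each with its own
  # PP rank driving one of the four GPUs
  mpirun -np 4 gmx_mpi mdrun -s topol -multi 4 -gpu_id 0123 -x traj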

Mark

On Thu, 11 May 2017 13:00 Giannis Gl <igaldadas at gmail.com> wrote:

> Dear Gromacs users,
>
> I am trying to run a multiple walkers metadynamics simulation on Gromacs
> 5.1.4 using a machine that has 12 CPUs and 4 GPUs.
> I have compiled Gromacs with the following options:
>
> cmake .. -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON
> -DGMX_MPI=on -DGMX_GPU=on -DGMX_USE_OPENCL=on
>
> and the command that I use to run the simulation is as follows:
>
> mpirun -np 4 gmx_mpi mdrun -s topol -multi 4 -gpu_id 0123 -x traj
>
> However, I get the following error with or without the plumed flag:
>
> Running on 1 node with total 6 cores, 12 logical cores, 4 compatible GPUs
> -------------------------------------------------------
> Program gmx mdrun, VERSION 5.1.4
> Source code file:
> /usr/local/gromacs-5.1.4/src/gromacs/gmxlib/gmx_detect_hardware.cpp, line:
> 1203
>
> Fatal error:
> The OpenCL implementation only supports using exactly one PP rank per node
> For more information and tips for troubleshooting, please check the GROMACS
> website at http://www.gromacs.org/Documentation/Errors
> -------------------------------------------------------
>
> Halting parallel program gmx mdrun on rank 1 out of 4
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
>
> I tried the following command but still I get the same error,
>
> mpirun -np 4 gmx_mpi mdrun -s topol -multi 4 -gpu_id 0123 -ntomp 1
>
> Did anyone have the same issue at some point, and if so, how did you
> solve it?
>
> Best wishes,
> Yiannis

