[gmx-users] Gromacs 4.6.7 with MPI and OpenMP

Mark Abraham mark.j.abraham at gmail.com
Fri May 8 13:51:06 CEST 2015


Hi,

On Thu, May 7, 2015 at 6:16 PM Malcolm Tobias <mtobias at wustl.edu> wrote:

>
> All,
>
> I'm attempting to build GROMACS on a new cluster, following the same
> recipes I've used in the past, but I'm encountering strange behavior:
> it claims to be using both MPI and OpenMP, but I can see from 'top' and
> the reported core/walltime that it's really only generating the MPI
> processes and no threads.
>

I wouldn't take the output from top completely at face value. Do you get
the same performance from -ntomp 1 as -ntomp 4?
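One quick way to check is to run the same .tpr both ways and compare the Performance lines at the end of the log files. A sketch only; the binary name, input file, and output prefixes below are placeholders for whatever your installation uses:

```shell
# Run the same system with 1 and 4 OpenMP threads per MPI rank
# (mdrun_mpi, topol.tpr and the -deffnm names are assumptions)
mpirun -np 2 mdrun_mpi -s topol.tpr -ntomp 1 -deffnm ntomp1
mpirun -np 2 mdrun_mpi -s topol.tpr -ntomp 4 -deffnm ntomp4

# Compare the ns/day figures reported at the end of each log
grep -A1 "Performance" ntomp1.log ntomp4.log
```

If the two runs perform identically, the extra OpenMP threads are probably not actually running on separate cores.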


> We're running a heterogeneous environment, so I tend to build with
> MPI/OpenMP/CUDA and the Intel compilers, but I'm seeing this same sort of
> behavior with the GNU compilers.  Here's how I'm configuring things:
>
> [root at login01 build2]# cmake -DGMX_FFT_LIBRARY=mkl -DGMX_MPI=ON
> -DGMX_GPU=ON -DCUDA_TOOLKIT_ROOT_DIR=/opt/cuda -DGMX_OPENMP=ON
> -DCMAKE_INSTALL_PREFIX=/act/gromacs-4.6.7_take2 .. | tee cmake.out
>

You only need root access for "make install", and only then if the install
prefix isn't writable by your user. Routinely building and running as root
means you've probably hosed your system at some point...


> or with GNU:
>
> [root at login01 build2]#  cmake -DGMX_BUILD_OWN_FFTW=ON -DGMX_GPU=ON
> -DCUDA_TOOLKIT_ROOT_DIR=/opt/cuda -DGMX_MPI=ON -DGMX_OPENMP=ON
> -DCMAKE_INSTALL_PREFIX=/act/gromacs-4.6.7_take2 .. | tee cmake.out
>
> I'm explicitly setting GMX_GPU=ON, but I don't recall having to do so in
> the past.  I see in the output where it's testing for OpenMP:
>
> [root at login01 build]# grep OpenMP  cmake.out
> -- Try OpenMP C flag = [-openmp]
> -- Performing Test OpenMP_FLAG_DETECTED
> -- Performing Test OpenMP_FLAG_DETECTED - Success
> -- Try OpenMP CXX flag = [-openmp]
> -- Performing Test OpenMP_FLAG_DETECTED
> -- Performing Test OpenMP_FLAG_DETECTED - Success
> -- Found OpenMP: -openmp
>
> When I go to run, I use 'mpirun -np 2' to trigger it to use 2 GPUs, then
> use the '-ntomp 4' flag to request 4 OpenMP threads per MPI process.  The
> funny part is that GROMACS reports that it's doing all this:
>
> Using 2 MPI processes
> Using 4 OpenMP threads per MPI process
>
> although I do see this warning:
>
> Number of CPUs detected (16) does not match the number reported by OpenMP
> (1).
>

Yeah, that happens. There's no well-defined standard for how processor
counts and affinity masks get reported, so once the OS, the MPI library,
and the OpenMP runtime all interact, things can get messy. But if people go
around using root routinely... ;-)
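One common cause of that warning (an illustration, not necessarily what's happening on your cluster): the MPI launcher pins each rank to a single core, so the OpenMP runtime sees an affinity mask of one CPU even though the machine has 16. On Linux you can watch the two numbers diverge:

```python
import os

# Total CPUs the OS reports for the machine
print("os.cpu_count():", os.cpu_count())

# CPUs this particular process is actually allowed to run on;
# this is what an OpenMP runtime will typically respect
print("affinity mask size:", len(os.sched_getaffinity(0)))
```

If the second number is 1 when run under mpirun but not from a plain shell, your launcher's binding settings (e.g. Open MPI's --bind-to option) are the place to look.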

> I'm not sure how to proceed with debugging this, so any suggestions would
> be helpful.
>
> Thanks in advance,
> Malcolm
>
> --
> Malcolm Tobias
> 314.362.1594
>
>
> --
> Gromacs Users mailing list
>
> * Please search the archive at
> http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before
> posting!
>
> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>
> * For (un)subscribe requests visit
> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or
> send a mail to gmx-users-request at gromacs.org.
>


More information about the gromacs.org_gmx-users mailing list