[gmx-users] Testing mpi version of Gromacs 5.1

Joachim Hein joachim.hein at math.lu.se
Thu Oct 8 11:56:08 CEST 2015


Hi everyone

I hope this is the right place to discuss this.

I am having issues checking the correctness of my MPI version of Gromacs 5.1.  I have built and tested earlier versions without issues (e.g. 5.0.4 is the most recent I did).  I am also building float and double versions of the serial tools, and they pass make check without issue.

I tried two things to test my MPI build (a detailed account of how I built it follows below):

 make -j 8 check

on a back-end node of our cluster (a slurm sbatch job asking for 8 cores).  Tests 19 to 25 pass, while the first 18 fail in MPI_Init.  Is that expected?
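
For reference, the batch script looks roughly like this (a minimal sketch; the module names are placeholders for our local setup):

#!/bin/bash
#SBATCH -n 8
#SBATCH -t 00:30:00

# load the toolchain the build was made with (site-specific module names)
module load gcc/5.2.0 openmpi/1.10.0

# run the test target from the MPI build directory
cd bdir_float_mpi_boost
make -j 8 check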

The other test I tried was as follows:

- change into the regression test directory
- source the GMXRC file of an install containing both a float and a float MPI version
- execute: ./gmxtest.pl all -np 4  (in a batch script asking for 4 cores)
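
In a batch script, those steps amount to roughly the following (a sketch; the install path and the regression test path are placeholders):

#!/bin/bash
#SBATCH -n 4

# placeholder paths for the regression tests and the install
cd /path/to/regressiontests
source /path/to/gromacs-5.1/bin/GMXRC
./gmxtest.pl all -np 4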

I get the following:

 gmx-completion.bash: line 258: `_gmx_convert-tpr_compl': not a valid identifier
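
One observation: the failing name contains a hyphen, which bash accepts in function names but a POSIX-mode shell does not, so a batch script interpreted by sh rather than bash would choke on gmx-completion.bash with exactly this complaint.  A quick way to see the difference (a sketch):

# bash accepts hyphenated function names; POSIX mode rejects them
# with the same "not a valid identifier" message
echo 'a-b() { :; }' | bash          # accepted
echo 'a-b() { :; }' | bash --posix  # `a-b': not a valid identifier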



So here are the details of the build:
-----------------------------------------------------------

gcc/5.2.0
openmpi 1.10.0 built for the above gcc 5.2.0
fftw 3.3.4 built for the above gcc with SSE and AVX
boost/1.59.0

CMAKEFLAGS="-DCMAKE_C_COMPILER=gcc -DCMAKE_CXX_COMPILER=g++ -DGMX_FFT_LIBRARY=fftw3 -DBUILD_SHARED_LIBS=ON -DCMAKE_INSTALL_PREFIX=$PREFIX -DGMX_DEFAULT_SUFFIX=ON -DREGRESSION
TEST_PATH=$RTESTPATH -DGMX_SIMD=AVX_128_FMA -DCMAKE_PREFIX_PATH=$FFTW3_HOME"

I build the serial and parallel versions as follows:

# Building single precision without MPI
BUILDDIR="bdir_float_boost"
mkdir -p $BUILDDIR
cd $BUILDDIR

cmake ../ $CMAKEFLAGS

make -j 16

make -j 4 check

make install

cd ..

# Building single precision with MPI
BUILDDIR="bdir_float_mpi_boost"
mkdir -p $BUILDDIR
cd $BUILDDIR

cmake ../ $CMAKEFLAGS -DGMX_MPI=ON -DBUILD_SHARED_LIBS=off

make -j 16

make install
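
For a quick sanity check of the MPI binary itself, something like this should work (a sketch; topol.tpr stands in for any small test system):

# run a short MPI job with the installed binary (topol.tpr is a placeholder)
mpirun -np 4 $PREFIX/bin/gmx_mpi mdrun -s topol.tpr -nsteps 100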


Both installs go to the same directory.  This procedure is essentially the same as the one I used for Gromacs 5.0.4.

I also tested with gcc 4.9.0 and openmpi 1.8.3 (the combination I used for Gromacs 5.0.4), and the results are similar, though the error message from MPI_Init is slightly different.

Any comments and help are greatly appreciated.

Thanks and best wishes
  Joachim

