[gmx-users] Re: gmx 4.6 mpi installation through openmpi?

escajarro juan-manuel.castillo at mv.uni-kl.de
Mon Jun 3 16:00:17 CEST 2013


Hello all,

I have a very similar problem, but compiling both Gromacs 4.6 and 4.6.1 with
cmake and MPI support (the non-MPI installation builds without problems). I have
cmake version 2.8.7, openmpi version 1.4.1, and gcc version 4.6.3 on an Ubuntu
Linux system. I use the cmake options:

cmake -DGMX_BUILD_OWN_FFTW=ON -DGMX_MPI=ON -DGMX_BINARY_SUFFIX=_mpi
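(In case it matters: I let cmake pick the compilers automatically. One can also point it at the Open MPI wrappers explicitly; a sketch, assuming mpicc/mpicxx are the standard wrapper names on PATH:)

```shell
# Sketch: tell CMake to use the Open MPI compiler wrappers directly,
# instead of letting it guess the C/C++ compilers.
cmake .. -DCMAKE_C_COMPILER=mpicc -DCMAKE_CXX_COMPILER=mpicxx \
      -DGMX_BUILD_OWN_FFTW=ON -DGMX_MPI=ON -DGMX_BINARY_SUFFIX=_mpi
```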

A summary of the output given by this command is:

CUDA_TOOLKIT_ROOT_DIR not found or specified
-- Could NOT find CUDA (missing:  CUDA_TOOLKIT_ROOT_DIR CUDA_NVCC_EXECUTABLE
CUDA_INCLUDE_DIRS CUDA_CUDART_LIBRARY) (Required is at least version "3.2")
-- The GROMACS-managed build of FFTW 3 will configure with the following
optimizations: --enable-sse2
-- Configuring done
-- Generating done
-- Build files have been written to: /home/smith/gromacs-4.6

I believe that not having CUDA installed is not a problem, since I do not plan
to run on GPUs, and nowhere does the installation manual say that it is
mandatory. After running make, the build stops with the following
message:

Linking C shared library libgmx_mpi.so
[ 53%] Built target gmx
Scanning dependencies of target gromacs_include_links
[ 53%] Generating gromacs
[ 53%] Built target gromacs_include_links
Scanning dependencies of target template
[ 53%] Building C object share/template/CMakeFiles/template.dir/template.c.o
Linking C executable template
../../src/gmxlib/libgmx_mpi.so.6: undefined reference to `__mth_i_dfloatuk'
../../src/gmxlib/libgmx_mpi.so.6: undefined reference to
`__builtin_va_gparg1'
../../src/gmxlib/libgmx_mpi.so.6: undefined reference to `__pgdbg_stub'
collect2: ld returned 1 exit status
make[2]: *** [share/template/template] Error 1
make[1]: *** [share/template/CMakeFiles/template.dir/all] Error 2
make: *** [all] Error 2

It looks like I am missing a library somehow, but I do not know which one. I
tried installing fftw before installing gromacs, and tried with both static
and shared libraries, with the same result.
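(One detail I noticed: the undefined symbols __mth_i_dfloatuk and __pgdbg_stub look like PGI compiler runtime symbols, which would suggest my openmpi was built with a different compiler than the gcc I am using. A sketch of the check I tried, simulating the output of `mpicc --show`; the pgcc line below is illustrative, not copied from my machine:)

```shell
# Open MPI's wrapper can report the backend compiler and flags it invokes:
#   mpicc --show
# Below simulates checking that output; the command line is illustrative.
wrapper_cmd="pgcc -I/usr/lib/openmpi/include -pthread -lmpi"

case "$wrapper_cmd" in
  pgcc*|pgf*)  echo "mpicc wraps a PGI compiler" ;;  # would explain __pgdbg_stub
  gcc*)        echo "mpicc wraps GCC" ;;
  *)           echo "unknown backend: $wrapper_cmd" ;;
esac
```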

Can anybody help?

Thanks



--
View this message in context: http://gromacs.5086.x6.nabble.com/gmx-4-6-mpi-installation-through-openmpi-tp5006975p5008730.html
Sent from the GROMACS Users Forum mailing list archive at Nabble.com.


