[gmx-users] gromacs 4.6.2 MPI distribution location problems

sirishkaushik kaushik.lakkaraju at gmail.com
Mon Jun 10 23:12:21 CEST 2013


Hi All,

I installed gromacs 4.6.2 using the following cmake options:

 cmake \
     -DCMAKE_INSTALL_PREFIX=/home/kaushik/gromacs_executable/gromacs-new-mpi \
     -DGMX_MPI=on
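
I did not set the compilers explicitly, so CMake's FindMPI picked up whatever it found on its own. As a sanity check (and I may be misreading how FindMPI works), I believe the Open MPI compiler wrapper can be asked what it would link against; the assumption below is that the wrappers I want live under /usr/local/bin:

 which mpicc mpirun       # which wrappers/launcher are first on PATH
 mpicc --showme:link      # Open MPI wrapper option; prints the -L/-l flags it would pass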

After a successful installation, when I run a test with mpirun mdrun, the program aborts with the following error:

--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems.  This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  orte_ess_set_name failed
  --> Returned value Data unpack would read past end of buffer (-26) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
[n132:11207] [[10503,1],0] ORTE_ERROR_LOG: Data unpack would read past end of buffer in file ../../../orte/util/nidmap.c at line 371
[n132:11207] [[10503,1],0] ORTE_ERROR_LOG: Data unpack would read past end of buffer in file ../../../../../orte/mca/ess/base/ess_base_nidmap.c at line 62
[n132:11207] [[10503,1],0] ORTE_ERROR_LOG: Data unpack would read past end of buffer in file ../../../../../../orte/mca/ess/env/ess_env_module.c at line 173
[n132:11207] [[10503,1],0] ORTE_ERROR_LOG: Data unpack would read past end of buffer in file ../../../orte/runtime/orte_init.c at line 132
*** The MPI_Init() function was called before MPI_INIT was invoked.
*** This is disallowed by the MPI standard.
*** Your MPI job will now abort.


MPI-related forums suggest that this error means GROMACS was compiled against an older MPI library than the Open MPI distribution that is used at run time when I launch mpirun mdrun.
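
If that is the case, the launcher and the library that mdrun resolves should come from two different Open MPI installs. This is the comparison I have in mind (just a sketch using Open MPI's own tools):

 which mpirun             # which launcher is actually first on PATH
 mpirun --version         # the Open MPI version of that launcher
 ompi_info | grep Prefix  # install prefix of the Open MPI that ompi_info belongs to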

When I run ldd on mdrun, I see:

        linux-vdso.so.1 =>  (0x00007fff571ff000)
        libgmxpreprocess.so.8 => /state/partition1/home/kaushik/gromacs_executable/gromacs-new-mpi/bin/./../lib/libgmxpreprocess.so.8 (0x00007fab42706000)
        libmd.so.8 => /state/partition1/home/kaushik/gromacs_executable/gromacs-new-mpi/bin/./../lib/libmd.so.8 (0x00007fab421f4000)
        libgmx.so.8 => /state/partition1/home/kaushik/gromacs_executable/gromacs-new-mpi/bin/./../lib/libgmx.so.8 (0x00007fab41a0c000)
        libfftw3f.so.3 => /opt/fftw/3.3.2/single_prec/lib/libfftw3f.so.3 (0x00007fab4170e000)
        libmpi_cxx.so.0 => /usr/lib/libmpi_cxx.so.0 (0x00007fab414e7000)
        libmpi.so.0 => /usr/lib/libmpi.so.0 (0x00007fab41237000)
        libdl.so.2 => /lib/libdl.so.2 (0x00007fab41033000)
        libm.so.6 => /lib/libm.so.6 (0x00007fab40db0000)
        libnuma.so.1 => /usr/lib/libnuma.so.1 (0x00007fab40ba8000)
        librt.so.1 => /lib/librt.so.1 (0x00007fab409a0000)
        libnsl.so.1 => /lib/libnsl.so.1 (0x00007fab40787000)
        libutil.so.1 => /lib/libutil.so.1 (0x00007fab40584000)
        libpthread.so.0 => /lib/libpthread.so.0 (0x00007fab40368000)
        libgomp.so.1 => /usr/local/lib64/libgomp.so.1 (0x00007fab40159000)
        libc.so.6 => /lib/libc.so.6 (0x00007fab3fdf7000)
        libstdc++.so.6 => /usr/local/lib64/libstdc++.so.6 (0x00007fab3faef000)
        libgcc_s.so.1 => /usr/local/lib64/libgcc_s.so.1 (0x00007fab3f8da000)
        libopen-rte.so.0 => /usr/lib/libopen-rte.so.0 (0x00007fab3f68e000)
        /lib64/ld-linux-x86-64.so.2 (0x00007fab429f8000)
        libopen-pal.so.0 => /usr/lib/libopen-pal.so.0 (0x00007fab3f438000)

Our current Open MPI libraries are in /usr/local/lib, not in /usr/lib.
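
So I suspect the runtime linker is simply finding /usr/lib first when it resolves libmpi.so.0. This is how I would try to confirm that (my own guess at the right checks, not something from the GROMACS instructions):

 readelf -d `which mdrun` | grep -E 'RPATH|RUNPATH'  # any library path baked into the binary?
 echo $LD_LIBRARY_PATH                               # is /usr/local/lib listed before /usr/lib?
 ldconfig -p | grep libmpi.so.0                      # what the loader cache offers for libmpi.so.0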

I checked CMakeCache.txt for the paths recorded in the MPI-related cache variables:

//Extra MPI libraries to link against
MPI_EXTRA_LIBRARY:STRING=/usr/local/lib/libmpi.so;/usr/lib/libdl.so;/usr/lib/libm.so;/usr/lib/libnuma.so;/usr/lib/librt.so;/usr/lib/libnsl.so;/usr/lib/libutil.so;/usr/lib/libm.so;/usr/lib/libdl.so

//MPI include path
MPI_INCLUDE_PATH:STRING=/usr/local/include

//MPI library to link against
MPI_LIBRARY:FILEPATH=/usr/local/lib/libmpi_cxx.so

//MPI linking flags
MPI_LINK_FLAGS:STRING=-Wl,--export-dynamic


When all of these correctly point to /usr/local/lib, why does GROMACS end up with the old /usr/lib/libmpi.so?

Are there any other environment variables or locations that I am failing to set correctly during compilation?
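
For example, would it be enough to put /usr/local/lib in front at run time, or do I need a clean rebuild that points CMake at the /usr/local wrappers? Both of the following are guesses on my part (the wrapper paths and the run name "test" are placeholders for my setup):

 # option 1: only fix the runtime library search order
 export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
 mpirun -np 4 mdrun -deffnm test

 # option 2: clean reconfigure against the intended Open MPI wrappers
 rm -rf CMakeCache.txt CMakeFiles/
 CC=/usr/local/bin/mpicc CXX=/usr/local/bin/mpicxx cmake \
     -DCMAKE_INSTALL_PREFIX=/home/kaushik/gromacs_executable/gromacs-new-mpi \
     -DGMX_MPI=on
 make && make install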

Thanks,

Kaushik






