[gmx-users] An MPI library not recognized by the configure script
David van der Spoel
spoel at xray.bmc.uu.se
Fri Sep 5 21:36:09 CEST 2008
Nicolas Sapay wrote:
> David van der Spoel wrote:
>> Nicolas Sapay wrote:
>>> Hi,
>>>
>>> I'm trying to compile the Gromacs-3.3.3 package with MPI and have the
>>> following issue when I run the configuration script:
>>>
>>> ./configure --prefix /home/users/nsapay/local/bin/gromacs-3.3.3
>>> --program-suffix _mpi --enable-mpi --disable-nice MPICC=mpicc CC=gcc
>>> --without-x
>>> CPPFLAGS=-I/home/users/nsapay/local/bin/fftw-3.1.2/include/
>>> LDFLAGS=-L/home/users/nsapay/local/bin/fftw-3.1.2/lib/
>>> checking size of int... configure: error: cannot compute sizeof (int)
>>>
>>> The config.log file tells me this is because an MPI library is not found:
>>>
>>> configure:7621: mpicc -o conftest -O3 -fomit-frame-pointer
>>> -finline-functions -Wall -Wno-unused -funroll-all-loops
>>> -I/home/users/nsapay/local/bin/fftw-3.1.2/include/
>>> -L/opt/hpmpi/lib/linux_amd64/
>>> -L/home/users/nsapay/local/bin/fftw-3.1.2/lib/ conftest.c >&5
>>> configure:7624: $? = 0
>>> configure:7630: ./conftest
>>> ./conftest: error while loading shared libraries: libmpio.so.1:
>>> cannot open shared object file: No such file or directory
>>> configure:7633: $? = 127
>>> configure: program exited with status 127
>>>
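Note that the compile/link step itself succeeded (status 0 at configure:7624);
it is running ./conftest that fails. That makes this a run-time loader problem
rather than an LDFLAGS problem: -L only tells the linker where to look at link
time, it does not tell the dynamic loader where to find libmpio.so.1 when the
binary is executed. A quick sanity check (assuming HP-MPI really does live
under /opt/hpmpi/lib/linux_amd64, as the -L flag above suggests):

ls /opt/hpmpi/lib/linux_amd64/libmpio.so.1
ldconfig -p | grep libmpio

If the file exists but ldconfig does not list it, the loader simply does not
know about that directory; see the suggestion further down.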
>>> I tried to reconfigure the installation, this time including the directory
>>> where the MPI library is stored:
>>>
>>> ./configure --prefix /home/users/nsapay/local/bin/gromacs-3.3.3
>>> --program-suffix _mpi --enable-mpi --disable-nice MPICC=mpicc
>>> CC=gcc --without-x
>>> CPPFLAGS=-I/home/users/nsapay/local/bin/fftw-3.1.2/include/
>>> LDFLAGS="-L/opt/hpmpi/lib/linux_amd64/
>>> -L/home/users/nsapay/local/bin/fftw-3.1.2/lib/"
>>
>> You might want to try
>> LD=mpicc
> Nope, same error...
You may need to add the FFTW library path to LDFLAGS as well.
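More importantly, since the conftest binary links fine but cannot be loaded at
run time, the usual fix is to put the MPI (and FFTW) library directories on
LD_LIBRARY_PATH before running configure, so that the test programs configure
builds can actually start. A minimal sketch for a bash-like shell, using the
paths from your own log (adjust if HP-MPI is installed elsewhere):

# make the run-time loader aware of the HP-MPI and FFTW shared libraries
export LD_LIBRARY_PATH=/opt/hpmpi/lib/linux_amd64:/home/users/nsapay/local/bin/fftw-3.1.2/lib:$LD_LIBRARY_PATH

./configure --prefix /home/users/nsapay/local/bin/gromacs-3.3.3 \
  --program-suffix _mpi --enable-mpi --disable-nice --without-x \
  MPICC=mpicc CC=gcc \
  CPPFLAGS=-I/home/users/nsapay/local/bin/fftw-3.1.2/include \
  LDFLAGS="-L/opt/hpmpi/lib/linux_amd64 -L/home/users/nsapay/local/bin/fftw-3.1.2/lib"

An alternative is to hard-code the search path into the binaries with something
like LDFLAGS="-Wl,-rpath,/opt/hpmpi/lib/linux_amd64 ...", but exporting
LD_LIBRARY_PATH is the least invasive thing to try first, and you will need it
at run time for mdrun_mpi anyway.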
>>
>>>
>>> But it doesn't work; I get exactly the same error message. Why is the
>>> libmpio.so.1 library not found by the configure script?
>>> Note that I have compiled Gromacs without MPI with no particular
>>> issue. The system is an HP cluster with AMD processors, Red Hat 4.5,
>>> and gcc/mpicc 3.4.6.
>>>
>>> Thanks a lot for your advice.
>>>
>>> -Nicolas
>>>
>>>
>>
>>
>
>
--
David van der Spoel, Ph.D., Professor of Biology
Molec. Biophys. group, Dept. of Cell & Molec. Biol., Uppsala University.
Box 596, 75124 Uppsala, Sweden. Phone: +46184714205. Fax: +4618511755.
spoel at xray.bmc.uu.se spoel at gromacs.org http://folding.bmc.uu.se