[gmx-users] parallel run error
David <spoel at xray.bmc.uu.se>
Tue Nov 23 18:23:31 CET 2004
On Tue, 2004-11-23 at 11:46 -0500, Lianqing Zheng wrote:
> Dear GMX-pals,
>
> Here is what I did:
>
> 1. Compile and install fftw
> a) MPICC=/opt/lam-7.0/bin/mpicc
> b) ./configure --prefix=/home/lzheng/fftw-2.1.3 --enable-float
> --enable-type-prefix --enable-mpi
> c) make
> d) make install
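One remark on 1a: an assignment on a line of its own is not seen by ./configure
unless it is exported (or put on the same line as the configure call). If fftw's
configure did not pick up your MPI compiler, something along these lines should
do it (untested, with your paths):

  export MPICC=/opt/lam-7.0/bin/mpicc
  ./configure --prefix=/home/lzheng/fftw-2.1.3 --enable-float \
              --enable-type-prefix --enable-mpi
  make && make install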
>
> 2. Compile and install gromacs 3.2.1
> a) export CPPFLAGS=-I/home/lzheng/fftw-2.1.3/include
> b) export LDFLAGS=-L/home/lzheng/fftw-2.1.3/lib
> c) ./configure --prefix=/home/lzheng/gromacs-3.2.1 --enable-mpi
> --without-motif-libraries --without-motif-includes --without-x
> d) make
> e) make install
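A quick sanity check after 2e is to let the dynamic linker list what the freshly
installed binary needs; ldd is standard on Linux, so running it on one of the
compute nodes shows immediately which libraries are missing there:

  ldd /home/lzheng/gromacs-3.2.1/i686-pc-linux-gnu/bin/mdrun

Anything reported as "not found" (libxml2.so.2 in your case, see below) has to be
present on every node that will run mdrun.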
>
> 3. Run water simulation in tutorial
> a) lamboot -v bhost.def
> b) grompp -f grompp.mdp -p water.top -c spc216.gro -o water.tpr -np 4
> c) /opt/lam-7.0/bin/mpirun n21-24
> /home/lzheng/gromacs-3.2.1/i686-pc-linux-gnu/bin/mdrun -s water.tpr -o
> water.trr -c water_out.gro -v -g water.log -np 4
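For the archive: bhost.def in 3a is just the LAM boot schema, i.e. a plain list of
the machines lamboot should start its daemons on, one per line (placeholder
hostnames below; if I remember the syntax right you can append "cpu=N" for SMP
nodes):

  n21
  n22
  n23
  n24

The -np 4 given to grompp and to mdrun also has to be the same, which it is here.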
>
> OK, then I got the following error message:
>
> ****begin****
> /home/lzheng/gromacs-3.2.1/i686-pc-linux-gnu/bin/mdrun: error while
> loading shared libraries: libxml2.so.2: cannot open shared object file: No
> such file or directory
Your cluster does not have libxml2 installed. Either ask your sysadmin to
install libxml2 on the compute nodes, or reconfigure GROMACS with
--without-libxml2 and rebuild.
Note that xml support may become obligatory in the next (4.0) release.
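In the latter case the rebuild would look roughly like this, reusing your
original flags (untested):

  export CPPFLAGS=-I/home/lzheng/fftw-2.1.3/include
  export LDFLAGS=-L/home/lzheng/fftw-2.1.3/lib
  ./configure --prefix=/home/lzheng/gromacs-3.2.1 --enable-mpi \
              --without-motif-libraries --without-motif-includes \
              --without-x --without-libxml2
  make && make install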
> [the same libxml2 error is printed three more times, once for each of the
> remaining mdrun processes]
> -----------------------------------------------------------------------------
> It seems that [at least] one of the processes that was started with
> mpirun did not invoke MPI_INIT before quitting (it is possible that
> more than one process did not invoke MPI_INIT -- mpirun was only
> notified of the first one, which was on node n0).
>
> mpirun can *only* be used with MPI programs (i.e., programs that
> invoke MPI_INIT and MPI_FINALIZE). You can use the "lamexec" program
> to run non-MPI programs over the lambooted nodes.
> -----------------------------------------------------------------------------
> ****end****
>
> I also tried mpirun -np 4 and got the same error.
>
> Thanks for your kind help!
>
> Lianqing
>
--
David.
________________________________________________________________________
David van der Spoel, PhD, Assoc. Prof., Molecular Biophysics group,
Dept. of Cell and Molecular Biology, Uppsala University.
Husargatan 3, Box 596, 75124 Uppsala, Sweden
phone: 46 18 471 4205 fax: 46 18 511 755
spoel at xray.bmc.uu.se spoel at gromacs.org http://xray.bmc.uu.se/~spoel
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++