[gmx-users] error: Cannot compile and link MPI code with mpicc
Sarath Kumar
bskumar.tech at gmail.com
Mon Feb 1 10:19:09 CET 2010
> Date: Mon, 1 Feb 2010 12:31:32 +0530
> From: Chandan Choudhury <iitdckc at gmail.com>
> Subject: Re: [gmx-users] error: Cannot compile and link MPI code with mpicc
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>
> Thanks.
> I just added export CPPFLAGS=-I/usr/local/openmpi/include in the bashrc
> file, and could compile the mpi version of gromacs.
> The next thing is I got the error on executing mdrun_mpi -h
> Following is the output. Kindly help.
>
> corsica:/usr/local/gromacs/bin # mdrun_mpi -h
> [corsica:17130] [NO-NAME] ORTE_ERROR_LOG: Not found in file
> runtime/orte_init_stage1.c at line 182
> --------------------------------------------------------------------------
> It looks like orte_init failed for some reason; your parallel process is
> likely to abort. There are many reasons that a parallel process can
> fail during orte_init; some of which are due to configuration or
> environment problems. This failure appears to be an internal failure;
> here's some additional information (which may only be relevant to an
> Open MPI developer):
>
> orte_rml_base_select failed
> --> Returned value -13 instead of ORTE_SUCCESS
>
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> It looks like MPI_INIT failed for some reason; your parallel process is
> likely to abort. There are many reasons that a parallel process can
> fail during MPI_INIT; some of which are due to configuration or environment
> problems. This failure appears to be an internal failure; here's some
> additional information (which may only be relevant to an Open MPI
> developer):
>
> ompi_mpi_init: orte_init_stage1 failed
> --> Returned "Not found" (-13) instead of "Success" (0)
> --------------------------------------------------------------------------
> *** An error occurred in MPI_Init
> *** before MPI was initialized
> *** MPI_ERRORS_ARE_FATAL (goodbye)
> [corsica:17130] Abort before MPI_INIT completed successfully; not able to
> guarantee that all other processes were killed!
>
> --
> Chandan kumar Choudhury
> NCL, Pune
> INDIA
>
>
You will get errors like this when you run mdrun with MPI, because MPI
initialization failed. I also had a problem with that CPPFLAGS setting:
once I exported it in my bashrc, I was unable to revert my environment
to its original state (the sketch below shows a way to avoid that).
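A safer approach, I think, is to pass the include path only to the one
configure run that needs it, instead of exporting it globally in
~/.bashrc. A minimal sketch, assuming a typical Open MPI install under
/usr/local/openmpi:

# Passed on the configure command line, these settings affect only
# this build and never touch your shell environment
./configure CPPFLAGS=-I/usr/local/openmpi/include \
            LDFLAGS=-L/usr/local/openmpi/lib

Reverting is then just a matter of re-running configure without those
variables.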
So the better option is to update your gcc and C++ compilers. If you
are in doubt, remove GROMACS and all MPI installations, then do

yum install -y '*openmpi*'

(the quotes keep the shell from expanding the glob itself). Yum will
automatically install the missing libraries as dependencies.
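After the install, check that the fresh Open MPI is the one your shell
actually picks up. These are standard Open MPI and gcc commands, but
the paths they should report are assumptions on my part:

# Confirm which compiler wrapper and launcher are on the PATH
which mpicc mpirun

# mpicc wraps the underlying gcc; this prints the gcc version
# Open MPI was built against
mpicc --version

# Print the Open MPI version and build configuration
ompi_info | head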
Then download the latest FFTW and build it with MPI support:

./configure --enable-threads --enable-mpi
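For completeness, the whole FFTW build might look like the sketch
below. The version and prefix are assumptions (2.1.5 is the FFTW
release whose configure accepts --enable-mpi); add --enable-float if
you build single-precision GROMACS:

# Fetch, unpack, and build an MPI-enabled FFTW
wget http://www.fftw.org/fftw-2.1.5.tar.gz
tar xzf fftw-2.1.5.tar.gz
cd fftw-2.1.5
./configure --enable-threads --enable-mpi --prefix=/usr/local/fftw
make
make install   # may need root for a /usr/local prefix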
then GROMACS:

./configure --enable-threads --enable-mpi

This should work, because for GROMACS to work with MPI, you must
compile FFTW with MPI support, as above.
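In case it helps, this is roughly how I would drive the GROMACS side
(a sketch for the autoconf-based GROMACS 4.0.x build; the FFTW prefix
and the _mpi program suffix are my assumptions):

# Point configure at the MPI-enabled FFTW from the previous step
./configure --enable-mpi --program-suffix=_mpi \
            CPPFLAGS=-I/usr/local/fftw/include \
            LDFLAGS=-L/usr/local/fftw/lib
make mdrun            # only mdrun runs in parallel, so only it needs MPI
make install-mdrun    # installs mdrun_mpi alongside the serial tools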
--
B. Sarath Kumar, B.tech.