[gmx-users] error: Cannot compile and link MPI code with mpicc
Chandan Choudhury
iitdckc at gmail.com
Mon Feb 1 14:56:04 CET 2010
Thank you all.
Mark, you were right. Somehow MPI was screwed up. Now I am able to run the
job in parallel.
Actually, there were two different versions of Open MPI installed, and I suppose they
were conflicting. Uninstalling both of them and installing the most recent Open MPI
solved the problem.
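For anyone hitting the same thing, checks along these lines can reveal a
duplicate installation (package names and paths are only examples and will
differ per system):

which -a mpicc mpirun        # more than one hit usually means multiple MPI installs
mpicc --showme               # Open MPI's wrapper prints the real compile line and prefix
rpm -qa | grep -i openmpi    # on an RPM-based system; use your package manager's equivalent

Once only a single Open MPI is left, GROMACS can be recompiled against it.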
Chandan
--
Chandan kumar Choudhury
NCL, Pune
INDIA
On Mon, Feb 1, 2010 at 5:02 PM, Mark Abraham <Mark.Abraham at anu.edu.au> wrote:
> On 01/02/10 20:19, Sarath Kumar wrote:
>
>>
>>
>> Date: Mon, 1 Feb 2010 12:31:32 +0530
>> From: Chandan Choudhury <iitdckc at gmail.com>
>> Subject: Re: [gmx-users] error: Cannot compile and link MPI code with mpicc
>> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>>
>> Thanks.
>> I just added export CPPFLAGS=-I/usr/local/openmpi/include to my .bashrc
>> file, and could compile the MPI version of GROMACS.
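>>
>> In case it helps, a build along these lines is what I mean (the LDFLAGS line
>> and the _mpi program suffix here are illustrative guesses for a typical
>> install, not the exact commands I ran):
>>
>> export CPPFLAGS=-I/usr/local/openmpi/include
>> export LDFLAGS=-L/usr/local/openmpi/lib    # assumed library location
>> ./configure --enable-mpi --program-suffix=_mpi
>> make && make install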
>> However, I now get the following error when executing mdrun_mpi -h.
>> The full output is below. Kindly help.
>>
>> corsica:/usr/local/gromacs/bin # mdrun_mpi -h
>> [corsica:17130] [NO-NAME] ORTE_ERROR_LOG: Not found in file
>> runtime/orte_init_stage1.c at line 182
>>
>> --------------------------------------------------------------------------
>> It looks like orte_init failed for some reason; your parallel process
>> is
>> likely to abort. There are many reasons that a parallel process can
>> fail during orte_init; some of which are due to configuration or
>> environment problems. This failure appears to be an internal failure;
>> here's some additional information (which may only be relevant to an
>> Open MPI developer):
>>
>> orte_rml_base_select failed
>> --> Returned value -13 instead of ORTE_SUCCESS
>>
>>
>> --------------------------------------------------------------------------
>>
>> --------------------------------------------------------------------------
>> It looks like MPI_INIT failed for some reason; your parallel process is
>> likely to abort. There are many reasons that a parallel process can
>> fail during MPI_INIT; some of which are due to configuration or
>> environment
>> problems. This failure appears to be an internal failure; here's some
>> additional information (which may only be relevant to an Open MPI
>> developer):
>>
>> ompi_mpi_init: orte_init_stage1 failed
>> --> Returned "Not found" (-13) instead of "Success" (0)
>>
>> --------------------------------------------------------------------------
>> *** An error occurred in MPI_Init
>> *** before MPI was initialized
>> *** MPI_ERRORS_ARE_FATAL (goodbye)
>> [corsica:17130] Abort before MPI_INIT completed successfully; not
>> able to
>> guarantee that all other processes were killed!
>>
>> --
>> Chandan kumar Choudhury
>> NCL, Pune
>> INDIA
>>
>>
>>
>>
>> You will keep getting errors like this when you run mdrun with MPI, because
>> the MPI installation itself is failing. I also had a problem when I used
>> that CPPFLAGS setting; after using it I was unable to revert to the
>> original state.
>>
>> So the better option is to update the gcc and C++ compilers.
>> If you are in doubt, remove GROMACS and all the MPI installations.
>> Then do: yum install -y *openmpi*
>> This will automatically pull in the missing dependent libraries.
>>
>> Then download the latest FFTW and install it with
>> ./configure --enable-threads --enable-mpi
>> and configure GROMACS the same way:
>> ./configure --enable-threads --enable-mpi
>>
>> This should work, because for GROMACS to run with MPI you should compile
>> FFTW with MPI as above; a rough end-to-end sketch follows.
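>>
>> Roughly like this (version numbers, package names, and the _mpi suffix are
>> placeholders or additions of mine, not exact commands):
>>
>> yum install -y openmpi openmpi-devel
>>
>> # FFTW first, with threads and MPI enabled
>> tar xzf fftw-x.y.z.tar.gz && cd fftw-x.y.z
>> ./configure --enable-threads --enable-mpi
>> make && make install
>>
>> # then GROMACS, configured the same way so it picks up the MPI-enabled FFTW
>> cd ../gromacs-x.y.z
>> ./configure --enable-threads --enable-mpi --program-suffix=_mpi
>> make && make install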
>>
>
> None of this will help if the MPI environment is not configured, e.g. with a
> hostfile set up. Chandan needs to get a test MPI program to run before there
> is evidence that any of this discussion belongs on the GROMACS mailing list.
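>
> For example, a minimal test along these lines (just a sketch; the launcher
> options and any hostfile depend on the local setup) exercises mpicc and
> mpirun without involving GROMACS at all:
>
> cat > hello_mpi.c <<'EOF'
> #include <mpi.h>
> #include <stdio.h>
>
> int main(int argc, char **argv)
> {
>     int rank, size;
>     MPI_Init(&argc, &argv);               /* the same call mdrun_mpi makes first */
>     MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* this process's index */
>     MPI_Comm_size(MPI_COMM_WORLD, &size); /* total number of processes */
>     printf("Hello from rank %d of %d\n", rank, size);
>     MPI_Finalize();
>     return 0;
> }
> EOF
> mpicc hello_mpi.c -o hello_mpi
> mpirun -np 2 ./hello_mpi
>
> If this fails with the same orte_init error, the problem is in the Open MPI
> installation or environment, not in GROMACS.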
>
> Mark
>
> --
> gmx-users mailing list gmx-users at gromacs.org
> http://lists.gromacs.org/mailman/listinfo/gmx-users
> Please search the archive at http://www.gromacs.org/search before posting!
> Please don't post (un)subscribe requests to the list. Use the www interface
> or send it to gmx-users-request at gromacs.org.
> Can't post? Read http://www.gromacs.org/mailing_lists/users.php
>