[gmx-users] "Lam" is not required for running parallel job
sunita at chem.iitb.ac.in
Wed Feb 13 13:17:33 CET 2008
Thank you Mark.
Sorry, I forgot to mention that I have already installed an MPI library. I
used the following .rpm files to install MPI.
The details of the installation:
Step 1: FFTW (fftw-3.0.1.tar.gz)
Step 2. MPI (all the above openmpi rpms)
Step 3. GROMACS (gromacs-3.3.1.tar.gz)
The output of "which mpirun" is
[sanjay at proline ~]$ which mpirun
[sanjay at proline ~]$
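Since "which mpirun" printed nothing above, the launcher may simply not be
on PATH for that shell. A minimal sketch (POSIX sh; the launcher names
checked are assumptions, not an exhaustive list) to look for a common MPI
launcher:

```shell
#!/bin/sh
# Sketch: look for a common MPI launcher on PATH.
# Prints the first one found, or a note if none is visible.
find_mpi_launcher() {
    for name in mpirun mpiexec lamexec; do
        if command -v "$name" >/dev/null 2>&1; then
            command -v "$name"
            return 0
        fi
    done
    echo "no MPI launcher on PATH"
    return 1
}

find_mpi_launcher
```

On some rpm-based systems the Open MPI packages install mpirun under a
non-default prefix (for example under /usr/lib64/openmpi/bin), in which case
that directory has to be added to PATH before "which mpirun" will find it.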
I was thinking MPI and LAM had to be installed separately.
That's what confused me. But now it is clear.
Thank you again.
> ----- Original Message -----
> From: sunita at chem.iitb.ac.in
> Date: Wednesday, February 13, 2008 10:20 pm
> Subject: Re: [gmx-users] "Lam" is not required for running parallel job
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>> > ----- Original Message -----
>> > From: sunita at chem.iitb.ac.in
>> > Date: Wednesday, February 13, 2008 5:40 pm
>> > Subject: [gmx-users] "Lam" is not required for running
>> parallel job
>> > To: gmx-users at gromacs.org
>> >> Dear Users,
>> >> LAM doesn't seem to be essential for running a parallel job.
>> > An MPI library is essential.
>> >> Because on one of our Xeon machines, with a quad-core 1.6 GHz
>> >> processor and
>> >> 6 nodes, we installed the parallel version of GROMACS (version
>> >> 3.3.1) and
>> >> it's running without any problem. LAM is not installed on this
>> machine, yet it still runs the parallel job.
>> > How are you running grompp and mdrun, and what makes you think the
>> > calculation is running in parallel?
>> I used the following commands to run the parallel job, and tested it by
>> generating .tpr files for 2 nodes, 4 nodes and 6 nodes. With 6
>> nodes it
>> took the least time to finish the MD job.
>> grompp -np 6 -f ......
>> mpirun -np 6 mdrun_mpi -v -s .....
>> It seems correct to me.
>> What do you say?
> You have an MPI library installed, else mpirun would not exist. Said
> library doesn't have to be called LAM, however. It also existed when
> someone compiled mdrun_mpi, unless they actually did it single-processor
> and your above snippets don't reflect what you're actually doing.
> It's a good idea when seeking feedback to give full information, e.g.
> copying and pasting your actual commands. If the contents of your head
> were always right, you would probably already have solved your own
> problem. Accordingly, we'd much prefer to know what you actually did, rather
> than something that's been filtered through your head. :-)
> gmx-users mailing list gmx-users at gromacs.org
> Please search the archive at http://www.gromacs.org/search before posting!
> Please don't post (un)subscribe requests to the list. Use the
> www interface or send it to gmx-users-request at gromacs.org.
> Can't post? Read http://www.gromacs.org/mailing_lists/users.php