[gmx-users] "Lam" is not required for running parallel job
mark.abraham at anu.edu.au
Wed Feb 13 13:03:09 CET 2008
----- Original Message -----
From: sunita at chem.iitb.ac.in
Date: Wednesday, February 13, 2008 10:20 pm
Subject: Re: [gmx-users] "Lam" is not required for running parallel job
To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> >
> >
> > ----- Original Message -----
> > From: sunita at chem.iitb.ac.in
> > Date: Wednesday, February 13, 2008 5:40 pm
> > Subject: [gmx-users] "Lam" is not required for running parallel job
> > To: gmx-users at gromacs.org
> >
> >> Dear Users,
> >>
> >> LAM doesn't seem to be essential for running a parallel job.
> >
> > An MPI library is essential.
> >
> >> Because on one of our Xeon machines, with a quad-core 1.6 GHz
> >> processor and 6 nodes, we installed the parallel version of
> >> GROMACS (version 3.3.1) and it's running without any problem. LAM
> >> is not installed on this machine, yet it still runs the parallel job.
> >
> > How are you running grompp and mdrun, and what makes you think the
> > calculation is running in parallel?
> >
> I used the following commands to run the parallel job, and tested it
> by generating .tpr files for 2, 4 and 6 nodes. With 6 nodes it took
> the least time to finish the MD job.
>
> grompp -np 6 -f ......
> mpirun -np 6 mdrun_mpi -v -s .....
>
> It seems correct to me.
> What do you say?
You have an MPI library installed, or else mpirun would not exist. That library doesn't have to be LAM, however. It must also have been present when someone compiled mdrun_mpi, unless that build was actually single-processor and the snippets above don't reflect what you're actually doing.
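If you want to check which MPI implementation you actually have, something like the following should work (assuming a Linux machine with mdrun_mpi on your PATH; the library names vary between implementations):

which mpirun
ldd `which mdrun_mpi` | grep -i mpi   # LAM shows up as liblam/libmpi, Open MPI as libmpi, MPICH as libmpich

If ldd shows no MPI library (and the binary wasn't statically linked), then mdrun_mpi was built single-processor, and launching it with mpirun -np 6 just starts six independent copies of the same job.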
It's a good idea when seeking feedback to give full information, e.g. by copying and pasting your actual commands. If the contents of your head were always right, you would probably already have solved your own problem. Accordingly, we'd much prefer to know what you actually did, rather than something that's been filtered through your head. :-)
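For reference, a complete pair of commands would look something like this (the file names here are placeholders, not your actual files):

grompp -np 6 -f md.mdp -c conf.gro -p topol.top -o topol.tpr   # preprocess for 6 nodes
mpirun -np 6 mdrun_mpi -v -s topol.tpr -o traj.trr -c confout.gro -g md.log   # run on 6 MPI processes

In the 3.x series the -np given to grompp has to match the -np given to mpirun, so posting both lines verbatim, plus any mpirun output, is what lets people spot a mismatch.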
Mark