[gmx-users] only one cpu "works" in my linux cluster
liu xin
zgxjlx at gmail.com
Tue Oct 9 16:14:43 CEST 2007
Thanks for your quick comment, Alan :)
On 10/9/07, Alan Dodd <anoddlad at yahoo.com> wrote:
>
> Yeah.
> Check that the machinefile actually tells mpich what you think it does. It
> looks pretty certain to be an issue with mpich rather than gromacs.
> Unfortunately, I don't use mpich any more because I find LAM better.
> Standard procedure: turn up the verbosity on everything, check the outputs,
> and re-read the relevant manuals.
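> For example, something like the following (from memory and untested, so
> check the exact options against your MPICH2 install) should show whether
> the ranks really land on different hosts:
>
>   # does the node file really list all six nodes?
>   cat $PBS_NODEFILE
>   # if every rank prints the same hostname, the machinefile (or the
>   # mpd ring) is not doing what you think it is
>   mpiexec -machinefile $PBS_NODEFILE -np 12 hostname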
>
> ----- Original Message ----
> From: liu xin <zgxjlx at gmail.com>
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> Sent: Tuesday, October 9, 2007 1:38:50 PM
> Subject: Re: [gmx-users] only one cpu "works" in my linux cluster
>
> Thanks, Alan.
> In fact, I've already added the -machinefile option; here's the script:
>
> cat $PBS_NODEFILE >/home/liuxin/mpd.hosts
> /home/liuxin/mpich2/bin/mpdboot -n 6 -f mpd.hosts
> /home/liuxin/mpich2/bin/mpiexec -machinefile $PBS_NODEFILE -np 12 \
>     /home/liuxin/programs/gromacs33mpi/bin/mdrun -v -s 12np.tpr -o -c -e -g -np 12
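> (For what it's worth, I could also add a quick sanity check to the script
> before the mpiexec line; this is just a rough sketch from the MPICH2 docs,
> so please correct me if the mpd commands are wrong:)
>
>   # confirm the node list was actually copied
>   cat /home/liuxin/mpd.hosts
>   # confirm the mpd ring really spans all 6 nodes before launching mdrun
>   /home/liuxin/mpich2/bin/mpdtrace -l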
>
> I am a little busy today, sorry for replying so late...
>
> On 10/8/07, Alan Dodd <anoddlad at yahoo.com> wrote:
> >
> > Check the MPICH manuals for how to specify the nodes to run on. From
> > memory, the option -machinefile lets you do this.
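> > Something along these lines (untested and from memory, so check the
> > manual for the exact flag name and adjust the paths to your own setup):
> >
> >   mpirun -machinefile $PBS_NODEFILE -np 12 mdrun -s 12np.tpr -np 12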
> >
> > ----- Original Message ----
> > From: liu xin <zgxjlx at gmail.com>
> > To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> > Sent: Monday, October 8, 2007 1:51:12 PM
> > Subject: [gmx-users] only one cpu "works" in my linux cluster
> >
> > Dear GMXers
> >
> > This is how I've done it so far:
> > grompp -f -c -p -o 12np.tpr -np 12
> > qsub -l node=6 12np.sh
> > (12np.sh contains: /home/me/mpich2/bin/mpirun -np 12 mdrun -s 12np.tpr -np 12)
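> > (Spelled out with the default file names, which is what the bare -f -c -p
> > above relies on, the grompp step is roughly the sketch below; I'm assuming
> > the defaults grompp.mdp, conf.gro and topol.top here.)
> >
> >   grompp -f grompp.mdp -c conf.gro -p topol.top -o 12np.tpr -np 12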
> > Then it seems my mdrun runs fine, but when I ssh to each node to check
> > the CPU usage with "top", I find that only one CPU is doing any work, with
> > all 12 processes on it! The other nodes are completely idle.
> >
> > The cluster has 56 Intel Xeon duo 3.0 GHz CPUs; below is my system info:
> > Linux login 2.4.21-32.EL #1 SMP Fri Apr 15 21:02:58 EDT 2005 x86_64
> > x86_64 x86_64 GNU/Linux
> > Our administrator has installed mpich 1.2.7 in the default directory
> > (/usr/local), but I had some problems running MPI mdrun with mpich 1.2.7,
> > so I installed mpich2 in my personal directory.
> > I've searched the list but can't find a solution. Does it have something
> > to do with the kernel, or is there some conflict between mpich1 and
> > mpich2, or something else?
> >
> > This is the first time I have set up a Linux cluster on my own, so ANY
> > suggestions are appreciated!
> >
> > Xin Liu
> >
> >