[gmx-users] cluster problem

gerph at correo.unam.mx
Tue Oct 19 23:13:00 CEST 2004


Dear David,

We followed all the steps listed in documentation:

http://www.gromacs.org/documentation/reference_3.2/online/speptide.html

or the corresponding page for version 3.1.1 (since some options changed between versions).
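
Roughly, the relevant part of the sequence we ran looks like this (a sketch from
memory; the input file names are approximate and follow the tutorial, and, if we
remember the 3.x flags right, grompp takes -np to match the number of mdrun
processes; each mdrun is of course launched through mpiexec as shown below):

grompp -np 6 -f em -c speptide_b4em -p speptide -o em          # EM input; -np matches the mdrun process count
mdrun -s em -o em -c after_em -e em -g emlog                   # this run finishes fine
grompp -np 6 -f pr -c after_em -r after_em -p speptide -o pr   # position-restrained MD input
mdrun -s pr -o pr -c after_pr -e pr -g prlog                   # this is the run that hangs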

The grompp steps and the first mdrun executed well, but the second
mdrun (the position-restrained run) failed:

/local/mpich/bin/mpiexec -np 6 /local/gromacs2/i686-pc-linux-gnu/bin/mdrun \
    -e $PBS_O_WORKDIR/pr.edr -s $PBS_O_WORKDIR/pr -o $PBS_O_WORKDIR/pr \
    -c $PBS_O_WORKDIR/after_pr -g $PBS_O_WORKDIR/prlog

We also set this variable:

export P4_GLOBMEMSIZE=5242880
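
For completeness, the whole job script looks roughly like this (a sketch; the
#PBS resource request and walltime are written from memory, not copied verbatim):

#!/bin/sh
#PBS -l nodes=6:ppn=1        # one MPI process per node (ppn=2 when we test both CPUs)
#PBS -l walltime=04:00:00
cd $PBS_O_WORKDIR
export P4_GLOBMEMSIZE=5242880
# ...followed by the mpiexec/mdrun command quoted above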

Everything is OK with <= 5 processes (one process per node; it also
fails when we try to use both processors of a node), but when we try
>= 6 processes it hangs.
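
To confirm what PBS actually hands to mpiexec, we can also dump the node file
from inside the job (just a diagnostic sketch; $PBS_NODEFILE is the standard
OpenPBS variable):

# list the processors PBS allocated to this job
cat $PBS_NODEFILE
# the line count should match the -np value passed to mpiexec
wc -l < $PBS_NODEFILE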

Any ideas are welcome,

Gerardo

> On Tue, 2004-10-19 at 18:46, gerph at correo.unam.mx wrote:
> > Hi everybody,
> > 
> > we have a Beowulf cluster with SMP nodes running GROMACS (almost,
> > but not entirely, well), but we can only run the s-peptide demo with
> > at most 5 tasks, no more than that.
> > If we try to use both processors of a node it fails, even if we
> > create only 2 tasks and run them on the same node. Apparently, one
> > of the tasks (the master, task 0) hangs and the process never ends.
> > I know this because I have to kill the job by removing it from the
> > PBS scheduler.
> > 
> > We can run that demo with one process (P) per node, as long as
> > P <= 5.
> > 
> > We're using mpiexec and PBS.
> 
> Try running the water box from the tutorial first. Do you mean it
> doesn't work, crashes, or does not become faster?
> > 
> > We tested it with GROMACS 3.1.1 & 3.2.1.
> > 
> > 
> > Next, we summarize our cluster's configuration:
> > 
> > 
> > %mpiexec -v
> > 
> >   Version 0.71, configure options: --with-pbs=/usr/local/pbs
> > --with-default-comm=mpich-p4 --with-mpicc=/local/mpich/bin/mpicc
> > 
> > 
> > mpich-1.2.4
> > 
> > 
> > OpenPBS_2_3_16
> > 
> > 
> > Configuration of all nodes:      2.4.18papikernel #2 SMP, RH 7.2
> > 
> > 
> > 
> > GROMACS version:                 3.1  &  3.2.1
> > 
> > 
> > 
> > Hope you can help us.
> > 
> > 
> -- 
> David.
> ________________________________________________________________________
> David van der Spoel, PhD, Assoc. Prof., Molecular Biophysics group,
> Dept. of Cell and Molecular Biology, Uppsala University.
> Husargatan 3, Box 596,  	75124 Uppsala, Sweden
> phone:	46 18 471 4205		fax: 46 18 511 755
> spoel at xray.bmc.uu.se	spoel at gromacs.org   http://xray.bmc.uu.se/~spoel
> ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
> 

