[gmx-users] cluster problem
David
spoel at xray.bmc.uu.se
Tue Oct 19 19:00:48 CEST 2004
On Tue, 2004-10-19 at 18:46, gerph at correo.unam.mx wrote:
> Hi everybody,
>
> we have a Beowulf cluster with SMP nodes running GROMACS (almost
> well, but not perfectly), and we can only run the s-peptide demo with
> at most 5 tasks.
> If we try to use both processors of a node it fails, even if we only
> create 2 tasks and run them on the same node. Apparently one of the
> tasks (the master, task 0) hangs and the job never ends. I know that
> because I killed the job by removing it from the PBS scheduler.
>
> We can run the demo with one process per node, for up to 5 processes
> (P <= 5).
>
> We're using mpiexec and PBS.
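>
> For reference, a sketch of the sort of job script we submit (the job
> name, paths, and input file are only illustrative):
>
>   #!/bin/sh
>   #PBS -N speptide
>   #PBS -l nodes=5:ppn=1
>   cd $PBS_O_WORKDIR
>   # one task per node: this works for up to 5 nodes
>   mpiexec mdrun -np 5 -s topol.tpr -v
>
> Requesting nodes=1:ppn=2 (two tasks on one SMP node) is what hangs.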
Try running the water box from the tutorial first. Do you mean that it
doesn't work at all, that it crashes, or that it does not get faster?
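
It may also help to rule out the MPI layer itself before blaming
GROMACS. A minimal check, assuming you still have the mpich-1.2.4
source tree around (this is not part of your report), would be:

  cd mpich-1.2.4/examples/basic
  make cpi
  mpiexec -n 2 ./cpi     # in a PBS job with nodes=1:ppn=2

If cpi also hangs with two processes on one node, the problem is in
the mpich-p4/mpiexec setup rather than in GROMACS; one known pitfall
is that the p4 device on SMP nodes wants MPICH configured with
-comm=shared.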
>
> We tested with GROMACS 3.1.1 and 3.2.1.
>
> Below we summarize our cluster's configuration:
>
> %mpiexec -v
> Version 0.71, configure options: --with-pbs=/usr/local/pbs
> --with-default-comm=mpich-p4 --with-mpicc=/local/mpich/bin/mpicc
>
> MPI: mpich-1.2.4
> Scheduler: OpenPBS_2_3_16
> Nodes: kernel 2.4.18 ("papikernel", #2 SMP), Red Hat 7.2
> GROMACS versions: 3.1 & 3.2.1
>
> Hope you can help us.
--
David.
________________________________________________________________________
David van der Spoel, PhD, Assoc. Prof., Molecular Biophysics group,
Dept. of Cell and Molecular Biology, Uppsala University.
Husargatan 3, Box 596, 75124 Uppsala, Sweden
phone: 46 18 471 4205 fax: 46 18 511 755
spoel at xray.bmc.uu.se spoel at gromacs.org http://xray.bmc.uu.se/~spoel
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++