RE: [gmx-users] expected it to be for 1 nodes

Jens Krüger mercutio at uni-paderborn.de
Thu Sep 29 14:31:16 CEST 2005


Hello Jeroen,
 
try something like:
 
>lamboot machinefile.input          (start MPI on all nodes specified in the input file)
 
>lamnodes                           (check that every node really is there)
 
>mpirun n0-1 mdrun_mpi -v -np 2 ... (start the job on cpu0 and cpu1)
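 
If you have not written a LAM boot schema before: for a single dual-CPU box,
machinefile.input can be a single line (host name and cpu count below are just
an example, adapt them to your machine):
 
machinefile.input:
localhost cpu=2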
 
In my experience, both MPI and mdrun want to be told explicitly that they
should run the job on two CPUs.
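 
Put together for your case, that would be something like this (file names
taken from your mail; if I remember correctly, -np as an argument to LAM's
mpirun should work as an alternative to the n0-1 node list):
 
>lamboot machinefile.input
>lamnodes
>mpirun -np 2 mdrun_mpi -np 2 -s 2cpu.tpr -v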
 
Best regards,
 
 
Jens

-----Original Message-----
From: gmx-users-bounces at gromacs.org
[mailto:gmx-users-bounces at gromacs.org] On Behalf Of Jeroen Mesters
Sent: Thursday, 29 September 2005 14:06
To: gmx-users at gromacs.org
Subject: [gmx-users] expected it to be for 1 nodes


Machine: dual-processor Dell Precision 650 with SuSE Linux 9.2
Software: 1) mpich installed, machines file constructed, mpd started; 2)
compiled GROMACS twice, once with and once without --enable-mpi (using
mpich). So I ended up with two programs, mdrun_mpi and mdrun. The two
binaries differ in size, which I would expect... Indeed, one program was
compiled with cc and the other with mpicc (no errors during compilation).
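 
For the record, the two builds were configured along these lines
(reconstructed from memory, so the exact flags may differ slightly; the
--program-suffix flag is how I got the _mpi name):
 
>./configure && make && make install
>make distclean
>./configure --enable-mpi --program-suffix=_mpi && make && make install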

Step 1: grompp -np 2 -f full.mdp -c after_pr_CA.gro -p ph5.top -n index.ndx -o 2cpu.tpr
Step 2: mdrun_mpi -np 2 -s 2cpu.tpr
 
This gives: Fatal error: run input file 2cpu.tpr was made for 2 nodes,
while mdrun_mpi expected it to be for 1 nodes.

Question 1. Are my commands correct, i.e. is this the proper way to start?
Question 2. Can this work at all, or do I really need 2 computers (2 nodes)?


Thanks for any hints, Jeroen.
--
Jeroen Raymundus Mesters, Ph.D.
Institut fuer Biochemie, Universitaet zu Luebeck
Ratzeburger Allee 160, D-23538 Luebeck
Tel: +49-451-5004070, Fax: +49-451-5004068
E-mail: mesters at biochem.uni-luebeck.de
Http://www.biochem.uni-luebeck.de
--
If you can look into the seeds of time and say
which grain will grow and which will not - speak then to me  (Macbeth)
--
