[gmx-users] Parallel processing

Mark Abraham Mark.Abraham at anu.edu.au
Mon May 5 14:56:50 CEST 2008


mariognu-outside at yahoo.com.br wrote:
> Hi all.
> 
> First, please excuse my English; it is not very good, but
> I'm learning.
> 
> I have a problem with my GROMACS installation. I have a
> Dell Precision 470 (2 Xeon processors with 2 GB RAM) and I
> can't get mdrun_mpi to run on 2 processors. I will describe
> what is happening.
> 
> I installed gromacs-3.3.3-1.x86_64.rpm and
> gromacs-mpi-3.3.3-1.x86_64.rpm. My Linux is Fedora 8
> x64.
> 
> If I run 
> 
> mdrun -v -deffnm run
> 
> or
> 
> mpiexec -np 2 mdrun_mpi -np 2 -v -deffnm run
> 
> or
> 
> mdrun_mpi -np 2 -v -deffnm run
> 
> The time to finish the dynamics is the same in all three
> cases. I expected the last two commands to be faster. But
> if I look at the system monitor and watch the processes, I
> see that in the second case both processors are at 100%.
> In the third case the process alternates between the two
> processors: one is at 100% while the other is near 10%,
> and then they swap.
> 
> If I run 
> 
> grompp -f run.mdp -p topol.top -c run -o run -np 2
> mpiexec -np 2 mdrun_mpi -np 2 -v -deffnm run

Only with this grompp line do you have a chance to produce a .tpr file 
for a run in parallel. You also need to have configured your MPI 
environment to run two processes with one on each processor, and be 
running an MPI mdrun.
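
As a quick sanity check of the MPI side (this is generic, nothing
GROMACS-specific, and assumes your mpiexec uses its default host
setup), something like

   mpiexec -np 2 hostname

should print your machine's name twice. If it prints it only once,
mpiexec is not actually launching two processes, and mdrun_mpi will
only ever see one node regardless of what the .tpr was made for.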

> I get this error
> 
> -------------------------------------------------------
> Program mdrun_mpi, VERSION 3.3.3
> Source code file: init.c, line: 69
> 
> Fatal error:
> run input file run.tpr was made for 2 nodes,
> while mdrun_mpi expected it to be for 1 nodes.
> -------------------------------------------------------

I expect you're actually using a 1-processor run.tpr. Clean up your 
directory and try again.
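
Something along these lines should do it, assuming -c run picks up
the coordinate file you intend and run.tpr is the only run input in
that directory (the \#...\# files are GROMACS backups):

   rm -f run.tpr \#*\#
   grompp -f run.mdp -p topol.top -c run -o run -np 2
   mpiexec -np 2 mdrun_mpi -np 2 -v -deffnm run

The -np 2 on grompp is what makes the .tpr suitable for two nodes,
and the freshly written run.tpr is then the one mdrun_mpi reads.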

> And I see that GROMACS runs on only one node, even though
> I told the command to run on two nodes
> 
> [mariojose at xxx smase]$ mpiexec -np 2 mdrun_mpi -np 2 -v -deffnm run
> NNODES=1, MYRANK=0, HOSTNAME=xxx.unesp.br

Check everything carefully!
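
If MPI really were starting two processes, you would expect to see
that startup line twice, once per rank, something like

   NNODES=2, MYRANK=0, HOSTNAME=xxx.unesp.br
   NNODES=2, MYRANK=1, HOSTNAME=xxx.unesp.br

NNODES=1 says mdrun_mpi came up as a single-process job, so I'd look
at how mpiexec launches jobs; the hostname test above is a quick
place to start, and it's worth making sure (I'm guessing here) that
the mpiexec you call belongs to the same MPI implementation that
mdrun_mpi was compiled against.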

Mark


