[gmx-users] [Fwd: low cpu usage]

Mark Abraham Mark.Abraham at anu.edu.au
Sat Jun 21 15:32:14 CEST 2008


Less than 24 hours ago 
http://www.gromacs.org/pipermail/gmx-users/2008-June/034719.html, I said

"Please put GROMACS-related correspondence on the list. That way 
multiple people can have a chance to reply, and the answer is archived 
for all to search."

The details you provide are useful, but not complete. There is a lot 
more to running a parallel calculation than the number and type of your 
CPUs. The kind of network is much more critical. See the many posts on 
this list by Carsten Kutzner on this topic.


-------- Original Message --------
Subject: low cpu usage
Date: Sat, 21 Jun 2008 05:16:37 -0700 (PDT)
From: ha salem <greencomp86 at yahoo.com>
To: mark.abraham at anu.edu.au

  I run GROMACS on 2 machines (each machine has 1 Core 2 Quad)

/usr/local/share/gromacs_333/bin/grompp -f md.mdp -po mdout.mdp -c
md.gro -r md_out.gro -n md.ndx -p md.top -o topol.tpr -np 2

  mpirun -np 2 /usr/local/share/gromacs_333/bin/mdrun  -np 2-f topol.tpr
-o md.trr -c md_out.gro -e md.edr -g md.log &
I also tested with -np 8, but my CPU usage is low and the speed is less
than a single run!!!
Thank you in advance



On June 5 
http://www.gromacs.org/pipermail/gmx-users/2008-June/034430.html, I 
suggested you check out 
http://wiki.gromacs.org/index.php/grompp#Parallel_calculations, and on 
June 6 I felt the need to give you the same advice and point out that 
the command line you've given in your email would not have been 
effective, had you actually used it.

The same goes for the above command line... "-np 2-f topol.tpr" was not 
the command line you used if mdrun actually ran. We're not interested in 
reconstructions... we're interested in what you actually did. :-)

To be certain of correct operation, all three -np options (to grompp, 
mpirun and mdrun) must be given the number of MPI processes you wish to 
run. In your case, it sounds like that should be 4*2=8 in all of them. 
Your text does not make it clear what you've actually done.
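As a sketch only (assuming GROMACS 3.3.3 with an MPI-enabled mdrun, and reusing the file names from your own command lines), a consistent pair of invocations for 8 processes could look like this. Note that mdrun reads the run input file with -s, not -f:

```shell
# Preprocess for 8 MPI processes; grompp's -np must match the run.
grompp -f md.mdp -c md.gro -n md.ndx -p md.top -o topol.tpr -np 8

# Launch 8 MPI processes; mdrun's -np must agree with mpirun's -np.
mpirun -np 8 mdrun -np 8 -s topol.tpr -o md.trr -c md_out.gro -e md.edr -g md.log
```

Whether this actually scales well on your hardware still depends on the interconnect between the two machines, as noted above.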

Mark


