[gmx-users] mpirun problem

Carsten Kutzner ckutzne at gwdg.de
Sun Jun 15 21:30:51 CEST 2008

On 15.06.2008, at 20:19, ha salem wrote:

> Dear users,
> I have encountered a problem with mpirun. I have 2 PCs (each with one
> Intel quad-core CPU). When I run mdrun on 1 machine with the "-np 4"
> option, the calculation runs on 4 cores and goes faster; the system
> monitor shows all 4 cores of the CPU working at about 90% usage, and
> everything is OK.
> But now I have connected the 2 computers over a LAN, executed
> "lamboot -v lamhosts", and then run "mpirun -np 8". I see that all
> 8 cores of the 2 machines are working at only 10-20% CPU usage, and
> the speed is lower than with the 4 cores of 1 CPU!
There could be many reasons for this. What kind of interconnect do
you use? If it is Gigabit Ethernet, you will need at least 80000
particles for two 4-core machines to be faster than one. With only
Fast Ethernet, do not expect any scaling at all on today's fast
processors.

Try grompp -shuffle -sort; this will help improve the scaling a bit.
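As a rough sketch, using the paths and file names from your own
commands (adjust them to your installation; note that I also pass
-o prmd.tpr to grompp here, on the assumption that mdrun is later
started with -s prmd.tpr):

```shell
GMXBIN=/usr/local/share/gromacs_331/bin

# Preprocess for 8 processes. -shuffle reorders the molecules and
# -sort sorts them so that each node gets a more localized set of
# atoms, which reduces inter-node communication and can improve
# parallel scaling over slow interconnects.
$GMXBIN/grompp -np 8 -shuffle -sort \
    -f prmd.mdp -c finalprsp.gro -r finalprsp.gro -p n.top \
    -o prmd.tpr

# Then launch across both machines as before:
mpirun -np 8 $GMXBIN/mdrun -np 8 -s prmd.tpr \
    -o prmd.trr -c finalprmd.gro -g prmd.log -e prmd.edr -n n.ndx
```

Keep in mind that shuffling changes the atom order in the output
trajectory, so any index-based analysis must account for that.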


> Can you help me? My molecule is part of HSA and is a macromolecule.
> These are my commands:
> /usr/local/share/gromacs_331/bin/grompp -np 8 -f prmd.mdp -c finalprsp.gro -r finalprsp.gro -p n.top
> mpirun -np 8 /usr/local/share/gromacs_331/bin/mdrun -np 8 -s prmd.tpr -o prmd.trr -c finalprmd.gro -g prmd.log -e prmd.edr -n n.ndx

