[gmx-users] mpirun problem
Carsten Kutzner
ckutzne at gwdg.de
Mon Jun 16 13:09:55 CEST 2008
ha salem wrote:
> Dear Carsten,
> I have 110,000 particles. On one machine the CPU cores run at 90% usage,
> but when I run the same calculation on 2 machines, the CPU usage of the
> cores drops to 20%. My LAN is 100 Mbps.
> Do you mean that with a Gigabit LAN the CPU usage will increase to 90%?
From the benchmarks I have seen, I can say that you cannot expect any
speedup if your computers are only connected with 100 Mbps. You will
need at least 1000 Mbps, or better, Infiniband/Myrinet.
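As a quick sanity check, you can measure the actual throughput between
the two machines, for example with iperf (the hostname node1 here is a
placeholder for your first machine):

   # on the first machine, start the server side
   iperf -s
   # on the second machine, measure throughput to the first
   iperf -c node1

If this reports only around 100 Mbps, the network is the bottleneck.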
Carsten
> thank you
>
> --- On Sun, 6/15/08, Carsten Kutzner <ckutzne at gwdg.de> wrote:
>
> From: Carsten Kutzner <ckutzne at gwdg.de>
> Subject: Re: [gmx-users] mpirun problem
> To: greencomp86 at yahoo.com, "Discussion list for GROMACS users"
> <gmx-users at gromacs.org>
> Date: Sunday, June 15, 2008, 7:30 PM
>
> On 15.06.2008, at 20:19, ha salem wrote:
>
>> Dear users,
>> I have encountered a problem with mpirun. I have 2 PCs (each PC has
>> one Intel quad-core CPU). When I run mdrun on 1 machine with the
>> "-np 4" option, the calculation runs on 4 cores and goes faster; the
>> system monitor shows all 4 cores of this CPU working at 90% usage,
>> and everything is OK.
>> But now I have connected the 2 computers over the LAN and executed
>> lamboot -v lamhosts,
>> then run mpirun -np 8, but I see all 8 cores of the 2 machines working
>> at only 10-20% CPU usage, and the speed is lower than with the 4 cores
>> of 1 CPU!
>>
> There could be many reasons for this. What kind of interconnect do you
> use? If it is Gigabit Ethernet, you will need at least 80000 particles
> to be faster on two 4-CPU machines compared to one. With only Fast
> Ethernet, do not expect any scaling at all on today's fast processors.
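>
> For completeness, a minimal LAM boot schema telling lamboot about both
> quad-core machines could look like this (the hostnames node1 and node2
> are placeholders for your own):
>
>    # lamhosts
>    node1 cpu=4
>    node2 cpu=4
>
> After lamboot -v lamhosts, mpirun -np 8 will place four processes on
> each machine.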
>
> Try grompp -shuffle -sort; this will help increase the scaling a bit.
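>
> A sketch of the preprocessing step with these flags, using the file
> names from your commands below (passing the index file to grompp via
> -n is shown here as an assumption about your setup):
>
>    grompp -np 8 -shuffle -sort -f prmd.mdp -c finalprsp.gro \
>           -r finalprsp.gro -p n.top -n n.ndx -o prmd.tpr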
>
> Regards,
> Carsten
>
>>
>> Can you help me? My molecule is part of HSA and is a macromolecule.
>> These are my commands:
>> /usr/local/share/gromacs_331/bin/grompp -np 8 -f prmd.mdp -c
>> finalprsp.gro -r finalprsp.gro -p n.top
>> mpirun -np 8 /usr/local/share/gromacs_331/bin/mdrun -np 8 -s
>> prmd.tpr -o prmd.trr -c finalprmd.gro -g prmd.log -e prmd.edr -n n.ndx