[gmx-users] low cpu usage

Yang Ye leafyoung at yahoo.com
Wed Jun 25 15:08:55 CEST 2008


It could be system-specific. Could you try out the dppc system in tutor/gmxbench,
or download gmxbench from the GROMACS website (Benchmark section)?
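
Running it should be roughly like this (untested here; the d.dppc directory name
is from memory, and you may need the full paths to your grompp/mdrun binaries):

  cd d.dppc
  grompp -np 4
  mpirun -np 4 mdrun -np 4 -s topol.tpr -v

Then compare the ns/day reported at the end of md.log between your MPI setups.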

Regards,
Yang Ye

Dr. Bernd Rupp wrote:
> With lam-mpi: same problem as with mpich2.
>
> regards,
> Bernd
>
> On Wednesday, 25 June 2008, Yang Ye wrote:
>   
>> I don't think Python is to blame.
>> How about lam-mpi?
>>
>> Regards,
>> Yang Ye
>>
>> Dr. Bernd Rupp wrote:
>>     
>>> Dear all,
>>>
>>>
>>> CPU: Intel(R) Core(TM)2 Extreme CPU Q6850  @ 3.00GHz
>>> System: fedora 8
>>> Kernel: 2.6.25.6-27.fc8 #1 SMP
>>> gromacs 3.3.3, compiled correctly
>>> MPI: mpich or mpich2
>>>
>>> We had the same problem with mpich2:
>>> single-processor run: CPU load 100%
>>> two-processor run: CPU load around 70%
>>> quad-processor run: CPU load around 40%
>>>
>>> With mpich we have no such problem:
>>> quad-processor run: CPU load around 95%
>>>
>>> We think the Python implementation is the reason for the bad scaling
>>> with mpich2, because mpiexec and mpdboot in mpich2 are Python scripts.
>>>
>>> Maybe we are wrong, but mpich doesn't use Python and runs well!?
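>>>
>>> (A quick way to check this, assuming the mpich2 launchers are on the PATH:
>>> "head -1 `which mpiexec`" and "head -1 `which mpdboot`" print a python
>>> shebang line if they really are scripts.)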
>>>
>>> see you
>>>
>>> Bernd
>>>
>>> On Saturday, 21 June 2008, ha salem wrote:
>>>       
>>>>>> Dear users
>>>>>> my gromacs is 3.3.3, my cpus are intel core2quad 2.4 GHz, and my mpi is LAM 7.0.6
>>>>>> I can get full cpu usage on the 4 cores of one node, but when I run on 2 nodes
>>>>>> the cpu usage of the cores is low.
>>>>>> I have installed gromacs with these instructions:
>>>>>> Compile LAM 7
>>>>>> ./configure --prefix=/usr/local/share/lam7 --enable-static
>>>>>> make |tee make.log
>>>>>> make install
>>>>>> make clean
>>>>>>
>>>>>> Compile fftw
>>>>>>
>>>>>> export MPI_HOME=/usr/local/share/lam7
>>>>>> export LAMHOME=/usr/local/share/lam7
>>>>>> export PATH=/usr/local/share/lam7/bin:$PATH
>>>>>> ./configure --prefix=/usr/local/share/fftw3 --enable-mpi
>>>>>> make |tee make.log
>>>>>> make install
>>>>>> make distclean
>>>>>>
>>>>>> Compile Gromacs
>>>>>>
>>>>>> export MPI_HOME=/usr/local/share/lam7
>>>>>> export LAMHOME=/usr/local/share/lam7
>>>>>> export PATH=/usr/local/share/lam7/bin:$PATH
>>>>>>
>>>>>> ./configure --prefix=/usr/local/share/gromacs_333 \
>>>>>>   --exec-prefix=/usr/local/share/gromacs_333 --program-prefix="" \
>>>>>>   --program-suffix="" --enable-static --enable-mpi --disable-float
>>>>>> make |tee make.log
>>>>>> make install
>>>>>> make distclean
>>>>>>
>>>>>> lamboot -v lamhosts
>>>>>>
>>>>>>
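>>>>>> [lamhosts is not shown here; for two quad-core machines a typical LAM host
>>>>>> file would look something like this, with node1/node2 as placeholder names:
>>>>>>
>>>>>>   node1 cpu=4
>>>>>>   node2 cpu=4
>>>>>>
>>>>>> so lamboot knows it may schedule 4 processes on each node.]
>>>>>>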
>>>>>> Run Gromacs on 2 machines (each machine has 1 core2quad)
>>>>>> /usr/local/share/gromacs_333/bin/grompp -f md.mdp -po mdout.mdp -c md.gro \
>>>>>>   -r md_out.gro -n md.ndx -p md.top -o topol.tpr -np 2
>>>>>>
>>>>>> mpirun -np 2 /usr/local/share/gromacs_333/bin/mdrun -np 2 -f topol.tpr \
>>>>>>   -o md.trr -c md_out.gro -e md.edr -g md.log &
>>>>>> I also tested with -np 8, but the cpu usage is low and the speed is less
>>>>>> than a single-processor run!!!
>>>>>> thank you in advance




More information about the gromacs.org_gmx-users mailing list