[gmx-users] Scaling on dual athlon

David spoel at xray.bmc.uu.se
Sun Oct 20 07:06:55 CEST 2002


On Sat, 2002-10-19 at 13:41, Stefano Piana wrote:
> Hello, I am a brand new user :-)
> I am very happy with Gromacs, but I still have problems with the parallel
> implementation.
> I compiled with the standard LAM/MPI options and ran some tests to see the
> scaling between one and two processors on a dual Athlon. I report the
> results to provide some statistics and to ask for suggestions on how to
> improve the parallel performance.
> I tried different systems in water (8071 and 39941 atoms) with PME. These are 
> the timings in seconds:
>                nsteps   1 cpu   2 cpus   1/2 cpu ratio
> Sh3 protein      1000      87       64            1.36
> (8071 atoms)     5000     462      332            1.39
> Caspase-3         100      63       43            1.46
> (39941 atoms)    1000     628      427            1.47
> 
> Is this the maximum performance I should expect, or can I improve it a bit?
> Somewhere I read that the parallel implementation of PME is not very
> efficient; however, PME takes only 3-4% of my CPU time. 10 to 20% is taken
> by the FFT, which should be parallelized (?). Most of the rest (40-60%) is
> spent on the water-water interaction.
> Bye,
First off, you may improve the scaling somewhat by recompiling LAM with
different configure options:
  --with-tcp-short=BYTES  use BYTES as the size of the longest TCP short
                          message
  --with-shm-short=BYTES  use BYTES as the size of the longest shared
                          memory short message
Set both of these values to 500000 or so.
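
For example, the rebuild might look like this (the install prefix is an
assumption on my part; adjust it to your own setup):

  # reconfigure LAM with larger short-message thresholds and rebuild
  ./configure --prefix=/usr/local/lam \
              --with-tcp-short=500000 \
              --with-shm-short=500000
  make && make install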


Then you may improve scaling somewhat by increasing the PME order and
using a coarser grid, as discussed before on the list.
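
In the .mdp file that trade-off might look something like this (a minimal
sketch only, assuming the GROMACS defaults of pme_order = 4 and
fourierspacing = 0.12 nm; check the Ewald accuracy for your own system):

  ; raise the PME interpolation order and coarsen the FFT grid
  coulombtype     =  PME
  pme_order       =  6      ; default is 4
  fourierspacing  =  0.15   ; nm; coarser than the 0.12 default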
-- 
Regards, David.
________________________________________________________________________
Dr. David van der Spoel, 	Biomedical center, Dept. of Biochemistry
Husargatan 3, Box 576,  	75123 Uppsala, Sweden
phone:	46 18 471 4205		fax: 46 18 511 755
spoel at xray.bmc.uu.se	spoel at gromacs.org   http://zorn.bmc.uu.se/~spoel
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++


