[gmx-users] Haphazard Results with LAM-MPI

David spoel at xray.bmc.uu.se
Sun Apr 20 11:20:01 CEST 2003


On Sun, 2003-04-20 at 07:08, Venkat Ramanan K wrote:
> Hi Anton,
> 
> We have tried the benchmark files from the GROMACS website and we 
> could easily see the scale-up of the cluster, but the times are 
> still not as good as the published benchmarks. We run GROMACS 
> 3.1 on Red Hat Linux 8, kernel 2.4.20. We also have MOSIX 
> installed; this kernel was compiled by us to implement MOSIX 
> on the cluster. The following are the specifications of 
> the system we use:
> 
> The system hardware is that of a default IBM NetVista:
> 
> 1. Intel Pentium III (1 GHz)
> 2. 100 Mbit Ethernet card (Intel)
> 3. 256 MB RAM on the server node and 128 MB RAM on all other 
> nodes.
> 
> Linux : 	Red Hat Linux 8
> Kernel : 	2.4.20 (downloaded and compiled from kernel.org)
> LAM : 	lam-usysv-6.5.9 with the "rsh" RPI.
> 
> We use the following commands:
> 
> 1. grompp -v ......... -np 16
> 2. mpirun N -lamd -nger -nsigs mdrun -v -s ........
> 
> I always get an error if I use the -c2c option with mpirun. I am 
> not able to figure out the problem. I will send you the .config 
> file in my next mail when I get back to my department.

Please recompile LAM with the recommended option for the TCP short-message
block size (tcpshortblocksize, 512 kB). Without the -c2c option it will
never be fast, I'm afraid.
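For reference, a rebuild along these lines might look as follows. The exact configure flag names (--with-rpi, --with-tcp-short) and the 512 kB value are assumptions based on the LAM 6.x build documentation; check ./configure --help on your release before using them, and note that topol.tpr below is only a placeholder for your actual run input file:

```shell
# Rebuild LAM/MPI with a larger TCP short-message size so that the
# fast c2c (client-to-client) transport performs well over Ethernet.
# NOTE: --with-rpi and --with-tcp-short are assumed from LAM 6.x
# docs; verify against ./configure --help for your version.
tar xzf lam-6.5.9.tar.gz
cd lam-6.5.9
./configure --prefix=/usr/local/lam \
            --with-rpi=tcp \
            --with-tcp-short=524288    # 512 kB short-message limit
make
make install

# Then run mdrun in c2c mode instead of routing through the lamd
# daemons (topol.tpr is a placeholder for your run input file):
mpirun N -c2c -nger -nsigs mdrun -v -s topol.tpr
```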

> 
> Venky
> 
> 
> _______________________________________________
> gmx-users mailing list
> gmx-users at gromacs.org
> http://www.gromacs.org/mailman/listinfo/gmx-users
> Please don't post (un)subscribe requests to the list. Use the 
> www interface or send it to gmx-users-request at gromacs.org.
-- 
Groeten, David.
________________________________________________________________________
Dr. David van der Spoel, 	Dept. of Cell and Molecular Biology
Husargatan 3, Box 596,  	75124 Uppsala, Sweden
phone:	46 18 471 4205		fax: 46 18 511 755
spoel at xray.bmc.uu.se	spoel at gromacs.org   http://xray.bmc.uu.se/~spoel
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++