[gmx-users] Attempting to scale gromacs mdrun_mpi

Mark Abraham mark.abraham at anu.edu.au
Sun Aug 29 13:46:25 CEST 2010



----- Original Message -----
From: NG HUI WEN <HuiWen.Ng at nottingham.edu.my>
Date: Friday, August 27, 2010 15:55
Subject: [gmx-users] Attempting to scale gromacs mdrun_mpi
To: gmx-users at gromacs.org


> Thanks a lot Roland!
>
> I'm using a Beowulf cluster, which I believe uses an Ethernet
> interconnect (1 node = 1 CPU in my case). I found an article by
> Kutzner et al. (2006) that discusses the difficulty of speeding up
> parallel runs on such clusters. Does this mean that if I continue to
> use the Ethernet-connected cluster and don't move to the 4.5 beta
> version, I'm stuck with a low number of processors because of the
> poor scaling? Thanks again for your advice!

Yes. Ethernet, even gigabit, is not up to the job of handling the communication GROMACS requires across more than a handful of processors. You get what you pay for with networking hardware, I'm afraid :-)
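For a quick empirical check of where a given machine stops scaling, one approach is to run the identical .tpr at increasing process counts and compare the performance summary (ns/day) at the end of each .log file. A minimal sketch, assuming an MPI launcher named mpirun and an input file topol.tpr (both names illustrative):

  # Run the same system at 1, 2, 4 and 8 processes; compare the
  # ns/day figures mdrun_mpi prints at the end of each log file.
  for n in 1 2 4 8; do
      mpirun -np $n mdrun_mpi -s topol.tpr -deffnm bench_np${n}
  done

On an Ethernet interconnect you would typically see the ns/day curve flatten, or even turn down, after only a few processes, which is the effect described above.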

Mark




