[gmx-users] Re: Infiniband Setup Benchmark Results

Ilya Chorny ichorny at gmail.com
Fri Aug 21 17:04:52 CEST 2009

I just set up an InfiniBand network on my Dell dual quad-core
cluster and wanted to share the benchmark results.
I am using a Cisco SFS 7000P switch and Mellanox MHEA28-XTC HCA InfiniBand
cards. Setup was trivial thanks to a very competent sysadmin! I used
the OpenMPI that came with the Mellanox software, and it worked out of the box.

All of the nodes in the cluster are dual quad-core; 128 cores total.


425,362-atom protein/water system
4 fs time step
PME, etc.

  8 cores    1 ns/day
 16 cores    2 ns/day
 32 cores    4 ns/day
 64 cores    8 ns/day
128 cores   14 ns/day

I get 100% scaling through 64 cores and 87% at 128. Not bad!
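For anyone who wants to check the numbers, here is a small sketch of how those scaling efficiencies fall out of the table above, taking the 8-core run as the baseline (the figures are the ones posted; the script itself is just illustrative):

```python
# Parallel efficiency relative to the 8-core baseline.
# Numbers taken from the benchmark table in this post.
baseline_cores = 8
baseline_perf = 1.0  # ns/day at 8 cores

results = {8: 1.0, 16: 2.0, 32: 4.0, 64: 8.0, 128: 14.0}  # cores -> ns/day

for cores, perf in results.items():
    speedup = perf / baseline_perf          # measured speedup vs. 8 cores
    ideal = cores / baseline_cores          # perfect linear speedup
    efficiency = speedup / ideal            # fraction of ideal scaling
    print(f"{cores:4d} cores: {perf:5.1f} ns/day, efficiency {efficiency:.0%}")
```

At 128 cores the ideal speedup over the 8-core run would be 16x (16 ns/day), so 14 ns/day works out to 14/16 = 87.5%, which is where the "87%" figure comes from.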



Ilya Chorny Ph.D.
