[gmx-users] infiniband question
David van der Spoel
spoel at xray.bmc.uu.se
Thu Sep 6 08:29:41 CEST 2007
Andrei Neamtu wrote:
> Hello gmx,
>
> I have a question regarding infiniband interconnect: Is
> there any difference (in terms of performance) between integrated
> on-board infiniband (ex. Mellanox) and PCI-Express infiniband adaptors
> (due to PCI-e limitations)? Which one is recommended? We are in the
> process of buying a cluster on which GROMACS will be the main
> computational engine.
> Any help will be greatly appreciated!
> Thank you,
> Andrei
If you are buying a serious cluster, then ask the vendor to let you run
benchmarks first. The PCI bus can be a bottleneck, although the
throughput is still quite OK. It also depends on how many cores are
sharing the connection.
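
For what it's worth, a minimal sketch of how such a scaling test could be
scripted is below. The mdrun_mpi binary name, the mpirun launch line, the
core counts and the topol.tpr input are only assumptions here; adapt them
to whatever MPI stack and benchmark system your vendor gives you access to.

#!/usr/bin/env python
# Rough scaling test: run the same .tpr on an increasing number of cores
# and collect the performance summary from each log. The file names, the
# mdrun_mpi binary and the MPI launch line are assumptions - adapt to
# your cluster and queueing system.
import subprocess

TPR = "topol.tpr"              # assumed pre-built benchmark system
CORE_COUNTS = [2, 4, 8, 16, 32]

for n in CORE_COUNTS:
    prefix = "bench_%d" % n
    # Hypothetical launch line; use whatever MPI launcher the cluster provides.
    cmd = ["mpirun", "-np", str(n), "mdrun_mpi", "-s", TPR, "-deffnm", prefix]
    subprocess.call(cmd)

    # Echo the performance summary from the log; the exact layout differs
    # between GROMACS versions, so no attempt is made to parse a column.
    for line in open(prefix + ".log"):
        if "ns/day" in line or line.startswith("Performance"):
            print("%2d cores | %s" % (n, line.rstrip()))

Comparing the numbers per core count, on the same input, will show you
quickly whether the interconnect (or the bus feeding it) is holding you back.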
--
David van der Spoel, Ph.D.
Molec. Biophys. group, Dept. of Cell & Molec. Biol., Uppsala University.
Box 596, 75124 Uppsala, Sweden. Phone: +46184714205. Fax: +4618511755.
spoel at xray.bmc.uu.se spoel at gromacs.org http://folding.bmc.uu.se