[gmx-users] Gromacs 4 Scaling Benchmarks...
vivek sharma
viveksharma.iitb at gmail.com
Tue Nov 11 14:30:30 CET 2008
Hi All,
One thing I forgot to mention: I am getting around 6 ns/day here, for a
protein of around 2600 atoms.
With Thanks,
Vivek
2008/11/11 vivek sharma <viveksharma.iitb at gmail.com>
> Hi Martin,
> I am using InfiniBand here, with a speed of more than 10 Gbit/s. Can you
> suggest some options to scale better in this case?
>
> With Thanks,
> Vivek
>
> 2008/11/11 Martin Höfling <martin.hoefling at gmx.de>
>
>> On Tuesday 11 November 2008 12:06:06, vivek sharma wrote:
>>
>>
>> > I have also tried scaling GROMACS on a number of nodes, but was not
>> > able to scale it beyond 20 processors, i.e. 20 nodes with 1 processor
>> > per node.
>>
>> As mentioned before, performance strongly depends on the type of
>> interconnect you're using between your processes: shared memory, Ethernet,
>> InfiniBand, NUMAlink, whatever...
>>
>> I assume you're using Ethernet (100/1000 Mbit?); you can tune this to
>> some extent as described in:
>>
>> Kutzner, C.; van der Spoel, D.; Fechner, M.; Lindahl, E.; Schmitt, U. W.;
>> de Groot, B. L. & Grubmüller, H. Speeding up parallel GROMACS on
>> high-latency networks. Journal of Computational Chemistry, 2007.
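>>
>> Independent of those network-level tweaks, GROMACS 4's mdrun also exposes
>> a few general scaling knobs worth experimenting with. A minimal sketch,
>> assuming an MPI-enabled binary named mdrun_mpi on 16 cores (the rank
>> counts are placeholders, not a recommendation):
>>
>>   mpirun -np 16 mdrun_mpi -s topol.tpr -npme 4 -dlb yes
>>
>> Here -npme dedicates separate ranks to PME, which is often the bottleneck
>> when scaling out, and -dlb yes forces dynamic load balancing of the
>> domain decomposition instead of leaving it on auto.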
>>
>> ...but be aware that the principal limitations of Ethernet remain. To get
>> around this, you might consider investing in the interconnect. If you can
>> get by with <16 cores, shared-memory nodes will give you the "biggest bang
>> for the buck".
>>
>> Best
>> Martin