[gmx-users] Performance problems with more than one node

Tiago Marques a28427 at ua.pt
Mon Sep 29 11:40:41 CEST 2008


Hi!

I see that you can still scale much further than I can. Could you, or
someone else, send me one of those test systems? I would like to compare
against a similar machine on which GROMACS scales well beyond one node.

Best regards,

                          Tiago Marques

On Thu, Sep 25, 2008 at 2:52 PM, vivek sharma <viveksharma.iitb at gmail.com> wrote:

> Hi friends,
> I am facing a similar problem when trying to scale GROMACS to a larger
> number of processors. I ran a job with GROMACS on EKA: the simulation
> time keeps dropping up to 20 processors, but the same simulation takes
> longer on 40 processors, and with 60 processors it crashes with a
> segmentation fault.
> I have also tried other options, such as constraint_algorithm,
> coulombtype, and the shuffle option.
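>
> For reference, a minimal sketch of how the shuffle option and an MPI
> run are typically invoked with the 3.3-series tools (file names and the
> process count are illustrative, and mdrun must be the MPI-enabled
> binary):
>
>   # reorder atoms across 20 nodes at preprocessing time
>   grompp -np 20 -shuffle -sort -f run.mdp -c conf.gro -p topol.top -o topol.tpr
>   # run the simulation on 20 MPI processes
>   mpirun -np 20 mdrun -np 20 -s topol.tpr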
>
> 2008/9/25 Tiago Marques <a28427 at ua.pt>
>
>> We currently have no funds available to migrate to InfiniBand, but we
>> will in the future.
>>
>> I thought about interface bonding, but I don't think that is really
>> the problem here; there must be something I'm missing, since most
>> applications scale well to 32 cores on GbE, yet I can't scale any
>> application beyond 8 cores.
>>
>> Best regards,
>>
>>                            Tiago Marques
>>
>> On Tue, Sep 23, 2008 at 6:30 PM, Diego Enry <diego.enry at gmail.com> wrote:
>>
>>> Tiago, you can try merging two network interfaces with "channel
>>> bonding"; it is native in all recent (2.6.x) Linux kernels. You only
>>> need two network adapters (most dual-socket boards come with them)
>>> and two network switches (or two VLANs on the same switch).
>>>
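>>> For reference, a minimal sketch of round-robin bonding on a 2.6
>>> kernel, assuming the two interfaces are named eth0 and eth1 and using
>>> an illustrative address (where this configuration should live
>>> permanently differs between distributions):
>>>
>>>   # load the bonding driver in round-robin mode with link monitoring
>>>   modprobe bonding mode=balance-rr miimon=100
>>>   # bring up the bonded interface with an address
>>>   ifconfig bond0 192.168.0.10 netmask 255.255.255.0 up
>>>   # enslave both physical NICs to bond0
>>>   ifenslave bond0 eth0 eth1
>>>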
>>> To tell you the truth, you will not see much improvement even with
>>> the latest GROMACS version (the 4.0 beta). However, other software
>>> your group may use, such as NAMD or GAMESS, will benefit a lot from
>>> this approach (it almost doubles network bandwidth).
>>>
>>> The best solution for GROMACS is to migrate to InfiniBand. Go for
>>> it; it is not super expensive anymore.
>>>
>>>
>>> On Tue, Sep 23, 2008 at 1:48 PM, Jochen Hub <jhub at gwdg.de> wrote:
>>> > Tiago Marques wrote:
>>> >> I don't know how large the system is. I'm the cluster's system
>>> >> administrator and don't understand much of what's going on. The
>>> >> test was given to me by someone who works with it. I can ask him,
>>> >> or look at it myself, if you can point me to how to do it.
>>> >
>>> > Hi,
>>> >
>>> > you can count the number of atoms in the structure; anchoring the
>>> > pattern keeps stray mentions of ATOM elsewhere in the file (e.g.,
>>> > in REMARK records) out of the count:
>>> >
>>> > grep -c '^ATOM' protein.pdb
>>> >
>>> > Jochen
>>> >
>>> >>
>>> >> Thanks, I will look at some of his posts.
>>> >>
>>> >> Best regards,
>>> >>
>>> >>                         Tiago Marques
>>> >>
>>> >>
>>> >> On Tue, Sep 23, 2008 at 4:03 PM, Jochen Hub <jhub at gwdg.de> wrote:
>>> >> Tiago Marques wrote:
>>> >>> Hi!
>>> >>>
>>> >>> I've been using GROMACS on dual-socket quad-core Xeons with 8 GiB
>>> >>> of RAM, connected with Gigabit Ethernet, and I always seem to
>>> >>> have problems scaling beyond one node.
>>> >>>
>>> >>> When I run a test on 16 cores it does run, but the result is
>>> >>> often slower than when running on only 8 cores of the same
>>> >>> machine. The best result I've managed on 16 cores is merely not
>>> >>> being slower than on 8.
>>> >>>
>>> >>> What am I missing here, or are the tests inappropriate to run
>>> >>> over more than one machine?
>>> >>
>>> >> How large is your system? Which GROMACS version are you using?
>>> >>
>>> >> Also, have a look at the messages by Carsten Kutzner on this
>>> >> list; he has written a lot about GROMACS scaling.
>>> >>
>>> >> Jochen
>>> >>
>>> >>> Best regards,
>>> >>>
>>> >>> Tiago Marques
>>> >>>
>>> >
>>> > --
>>> > ************************************************
>>> > Dr. Jochen Hub
>>> > Max Planck Institute for Biophysical Chemistry
>>> > Computational biomolecular dynamics group
>>> > Am Fassberg 11
>>> > D-37077 Goettingen, Germany
>>> > Email: jhub[at]gwdg.de
>>> > Tel.: +49 (0)551 201-2312
>>> > ************************************************
>>>
>>> --
>>> Diego Enry B. Gomes
>>> Laboratório de Modelagem e Dinamica Molecular
>>> Universidade Federal do Rio de Janeiro - Brasil.