Fwd: [gmx-users] Performance problems with more than one node

Carsten Kutzner ckutzne at gwdg.de
Fri Sep 26 15:59:29 CEST 2008


vivek sharma wrote:
> Hi Carsten and Justin,
> I am interrupting here, as I tried the options you suggested.
> I tried cut-off instead of PME as the coulombtype option, and it runs 
> well on 24 processors. I then tried 60 processors; the results I am 
> getting are as follows.
> 
> Result 1: A 50 ps run on 24 processors took 12:29 with PME, compared 
> to 7:54 with cut-off.
This looks reasonable to me.
> 
> Result 2: A 500 ps run on 60 processors gives the same segmentation 
> fault again with PME; with cut-off it gives LINCS warnings and exits 
> after writing the intermediate step.pdb files.
Don't run gromacs 3.3 on 60 processors with such a small system. You should
have at least 1000 particles per CPU with gmx 3. Try gromacs 4, where
your protein can also be split among the processors.
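As a quick sanity check of that guideline against the ~45,000-atom system
mentioned below (integer division is fine at this precision):

```shell
# rough atoms-per-core check for a ~45,000-atom system
atoms=45000
for cores in 24 60; do
    echo "$cores cores: $((atoms / cores)) atoms/core"
done
# 24 cores stays above the ~1000 atoms/core guideline (1875);
# 60 cores falls well below it (750)
```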

Carsten

>  
> Can you suggest some more options that I can try for the scaling experiment?
> I also tried the shuffle and sort options, but they didn't work for me, 
> as my system is simply one protein molecule in a water box (around 
> 45,000 atoms). The Gromacs version I am using is 3.3.3, and all nodes 
> contain quad-core 3.0 GHz Intel Xeon processors connected via 
> InfiniBand.
> 
> With Thanks,
> Vivek
> 
> 2008/9/26 Carsten Kutzner <ckutzne at gwdg.de>
> 
>     Hi Tiago,
> 
>     if you switch off PME and your system suddenly scales, then the
>     problems likely result from bad MPI_Alltoall performance. Maybe
>     this is worth a check. If this is the case, there's a lot more
>     information about this in the paper "Speeding up parallel GROMACS
>     on high-latency networks" from 2007, to which you will also find a
>     link on the gromacs webpage.
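One way to check raw MPI_Alltoall performance directly is a benchmark suite
such as the Intel MPI Benchmarks, assuming one is installed on your cluster
(the binary path and launcher below are assumptions):

```shell
# sketch: time MPI_Alltoall across 16 ranks with the Intel MPI Benchmarks
mpirun -np 16 ./IMB-MPI1 -npmin 16 Alltoall
```

If the reported Alltoall times grow sharply with rank count on your Ethernet
nodes, that points at the network rather than at GROMACS itself.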
> 
>     What you can also do to track down the problem is to compile gromacs
>     with MPE logging, for which you have to enable the #define USE_MPE
>     macro at the beginning of mpelogging.h (you will have to use gmx
>     version 4, though). You will get a logfile which you can then view
>     with Jumpshot. The MPE tools come with the MPICH MPI distribution.
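Sketched out, those steps might look like this (the header location, build
target, and logfile name are assumptions; adapt to your source tree):

```shell
# enable MPE logging in the GROMACS 4 sources:
#   near the top of mpelogging.h, turn on the macro:
#   #define USE_MPE
# then rebuild the MPI-enabled binary and run in parallel as usual
make mdrun
# finally, inspect the resulting logfile with Jumpshot from the MPE tools
jumpshot mdrun.clog2   # logfile name is an assumption
```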
> 
>     Carsten
> 
> 
>     Tiago Marques wrote:
> 
>         We currently have no funds available to migrate to infiniband
>         but we will in the future.
> 
>         I thought about doing interface bonding, but I don't think that
>         is really the problem here; there must be something I'm
>         missing, since most applications scale well to 32 cores on GbE.
>         I can't scale any application to more than 8, though.
> 
>         Best regards,
>                                   Tiago Marques
> 
> 
>         On Tue, Sep 23, 2008 at 6:30 PM, Diego Enry
>         <diego.enry at gmail.com> wrote:
> 
>            Tiago, you can try merging two network interfaces with "channel
>            bonding"; it's native in all new (2.6.x) Linux kernels. You
>            only need two network adapters (most dual-socket boards come
>            with them) and two network switches (or two VLANs on the same
>            switch).
> 
>            To tell you the truth, you will not see much improvement even
>            with the latest gromacs version (4beta). However, other
>            software that may be used by your group, like NAMD or GAMESS,
>            will benefit a lot from this approach (it almost doubles
>            network bandwidth).
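For reference, a minimal bonding sketch with modern iproute2 (on 2.6-era
kernels the ifenslave tool played this role); interface names, bonding mode,
and the address are assumptions, and the commands need root:

```shell
# create a round-robin bond from two GbE interfaces (names assumed)
ip link add bond0 type bond mode balance-rr
ip link set eth0 down && ip link set eth0 master bond0
ip link set eth1 down && ip link set eth1 master bond0
ip link set bond0 up
ip addr add 192.168.1.10/24 dev bond0   # example address
```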
> 
>            The best solution for gromacs is to migrate to infiniband. Go
>         for it,
>            it is not super expensive anymore.
> 
> 
>            On Tue, Sep 23, 2008 at 1:48 PM, Jochen Hub
>            <jhub at gwdg.de> wrote:
>             > Tiago Marques wrote:
>             >> I don't know how large the system is. I'm the cluster's
>             >> system administrator and don't understand much of what's
>             >> going on. The test was given to me by a person who works
>             >> with it. I can ask him or look at it, if you can show me
>             >> how to do it.
>             >
>             > Hi,
>             >
>             > you can count the number of atoms in the structure:
>             >
>             > grep -c ATOM protein.pdb
>             >
>             > Jochen
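A small refinement, if needed: anchoring the pattern avoids also counting
other lines that merely contain "ATOM" (the file below is a hypothetical,
abbreviated PDB fragment for illustration):

```shell
# write a tiny, hypothetical PDB fragment
cat > protein.pdb <<'EOF'
REMARK  ATOM records follow
ATOM      1  N   MET A   1
ATOM      2  CA  MET A   1
HETATM    3  O   HOH B   1
EOF
grep -c '^ATOM' protein.pdb   # anchored: counts the two ATOM records only
```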
>             >
>             >>
>             >> Thanks, I will look at some of his posts.
>             >>
>             >> Best regards,
>             >>
>             >>                         Tiago Marques
>             >>
>             >>
>             >> On Tue, Sep 23, 2008 at 4:03 PM, Jochen Hub
>             >> <jhub at gwdg.de> wrote:
>             >> Tiago Marques wrote:
>             >>> Hi!
>             >>>
>             >>> I've been using Gromacs on dual-socket quad-core Xeons
>             >>> with 8 GiB of RAM, connected with Gigabit Ethernet, and I
>             >>> always seem to have problems scaling to more than one node.
>             >>>
>             >>> When I run a test on 16 cores, it does run, but the result
>             >>> is often slower than when running on only 8 cores on the
>             >>> same machine. The best result I've managed is 16 cores not
>             >>> being slower than 8.
>             >>>
>             >>> What am I missing here, or are the tests inappropriate to
>             >>> run over more than one machine?
>             >>
>             >> How large is your system? Which gromacs version are you
>             >> using?
>             >>
>             >> And have a look at the messages by Carsten Kutzner on this
>             >> list; he has written a lot on gromacs scaling.
>             >>
>             >> Jochen
>             >>
>             >>> Best regards,
>             >>>
>             >>> Tiago Marques
>             >>>
>             >>>
>             >>>
>             >>>
>          
>          ------------------------------------------------------------------------
>             >>>
>             >>> _______________________________________________
>             >>> gmx-users mailing list    gmx-users at gromacs.org
>             >>> http://www.gromacs.org/mailman/listinfo/gmx-users
>             >>> Please search the archive at http://www.gromacs.org/search before posting!
>             >>> Please don't post (un)subscribe requests to the list. Use the
>             >>> www interface or send it to gmx-users-request at gromacs.org.
>             >>> Can't post? Read http://www.gromacs.org/mailing_lists/users.php
>             >>
>             >>
>             >> --
>             >> ************************************************
>             >> Dr. Jochen Hub
>             >> Max Planck Institute for Biophysical Chemistry
>             >> Computational biomolecular dynamics group
>             >> Am Fassberg 11
>             >> D-37077 Goettingen, Germany
>             >> Email: jhub[at]gwdg.de
>             >> Tel.: +49 (0)551 201-2312
>             >> ************************************************
> 
>            --
>            Diego Enry B. Gomes
>            Laboratório de Modelagem e Dinamica Molecular
>            Universidade Federal do Rio de Janeiro - Brasil.
> 
> 
> 
> 
> 
> 
>     -- 
>     Dr. Carsten Kutzner
> 
>     Max Planck Institute for Biophysical Chemistry
>     Theoretical and Computational Biophysics Department
>     Am Fassberg 11
>     37077 Goettingen, Germany
>     Tel. +49-551-2012313, Fax: +49-551-2012302
>     www.mpibpc.mpg.de/home/grubmueller/
>     www.mpibpc.mpg.de/home/grubmueller/ihp/ckutzne/
> 
> 
> 
> 

-- 
Dr. Carsten Kutzner
Max Planck Institute for Biophysical Chemistry
Theoretical and Computational Biophysics Department
Am Fassberg 11
37077 Goettingen, Germany
Tel. +49-551-2012313, Fax: +49-551-2012302
www.mpibpc.mpg.de/home/grubmueller/
www.mpibpc.mpg.de/home/grubmueller/ihp/ckutzne/


