[gmx-developers] Possible bug in gmx

Alexey Shvetsov alexxy at omrb.pnpi.spb.ru
Sun Sep 28 11:53:54 CEST 2014


Hi,

The DD grid is

Domain decomposition grid 4 x 1 x 1, separate PME ranks 0
PME domain decomposition: 4 x 1 x 1

for the 4-node setup

and

Domain decomposition grid 4 x 2 x 1, separate PME ranks 0
PME domain decomposition: 4 x 2 x 1

for the 8-node setup
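
If it helps with reproducing, the same grids can also be requested
explicitly with mdrun's -dd option instead of relying on the automatic
choice. A rough sketch (the tpr filename here is just a placeholder):

    # 4 ranks, force the 4 x 1 x 1 decomposition
    mpirun -np 4 mdrun_mpi -s topol.tpr -dd 4 1 1

    # 8 ranks, force the 4 x 2 x 1 decomposition
    mpirun -np 8 mdrun_mpi -s topol.tpr -dd 4 2 1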

It's reproducible with the 5.0 release and the latest git master. I'll
try to check whether it's reproducible with 1 node. I can also provide a
tpr file for this system.
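
For the single-domain check I would run something like this (again only
a sketch, with a placeholder tpr name); with a single rank there is only
one domain, so this particular DD error should not be able to trigger:

    # one rank => a single domain, no domain decomposition
    mpirun -np 1 mdrun_mpi -s topol.tpr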

Mark Abraham wrote on 28-09-2014 13:28:
> Hi,
> 
> It's hard to say from that information alone. There were some issues
> fixed in the lead-up to GROMACS 5 with DD not always working with 2
> domains in a direction, but that's a pure guess. I'd assume you can
> reproduce this with the release-5-0 branch. Do you observe it with a
> single domain? If not, then it's surely a bug (and should be submitted
> to Redmine).
> 
> Mark
> 
> On Sun, Sep 28, 2014 at 11:18 AM, Alexey Shvetsov
> <alexxy at omrb.pnpi.spb.ru> wrote:
> 
>> Hi all!
>> 
>> I'm doing some tests with a small peptide and constantly getting this
>> error. I get it with a few systems.
>> 
>> System sizes are around 10k or 20k atoms.
>> I run it on 4 or 8 old nodes, each with two Xeon 54xx-series CPUs.
>> 
>> starting mdrun '2ZCH_3 in water'
>> 50000000 steps, 100000.0 ps (continuing from step 1881000, 3762.0 ps).
>> 
>> Step 13514000:
>> The charge group starting at atom 6608 moved more than the distance
>> allowed by the domain decomposition (1.112924) in direction X
>> distance out of cell -1.193103
>> Old coordinates: 5.467 0.298 3.636
>> New coordinates: 5.467 0.298 3.636
>> Old cell boundaries in direction X: 4.037 5.382
>> New cell boundaries in direction X: 4.089 5.452
>> 
>> --------------------------------------------------------------------------
>> MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
>> with errorcode 1.
>> 
>> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>> You may or may not see output from other processes, depending on
>> exactly when Open MPI kills them.
>> 
>> --------------------------------------------------------------------------
>> 
>> -------------------------------------------------------
>> Program mdrun_mpi, VERSION 5.1-dev-20140922-20c00a9-dirty-unknown
>> Source code file:
>> /var/tmp/alexxy/portage/sci-chemistry/gromacs-9999/work/gromacs-9999/src/gromacs/mdlib/domdec.cpp,
>> line: 4388
>> 
>> Fatal error:
>> A charge group moved too far between two domain decomposition steps
>> This usually means that your system is not well equilibrated
>> For more information and tips for troubleshooting, please check the
>> GROMACS website at http://www.gromacs.org/Documentation/Errors
>> -------------------------------------------------------
>> 
>> --
>> Best Regards,
>> Alexey 'Alexxy' Shvetsov, PhD
>> Department of Molecular and Radiation Biophysics
>> FSBI Petersburg Nuclear Physics Institute, NRC Kurchatov Institute,
>> Leningrad region, Gatchina, Russia
>> mailto:alexxyum at gmail.com
>> mailto:alexxy at omrb.pnpi.spb.ru

-- 
Best Regards,
Alexey 'Alexxy' Shvetsov, PhD
Department of Molecular and Radiation Biophysics
FSBI Petersburg Nuclear Physics Institute, NRC Kurchatov Institute,
Leningrad region, Gatchina, Russia
mailto:alexxyum at gmail.com
mailto:alexxy at omrb.pnpi.spb.ru

