[gmx-developers] Possible bug in gmx

Mark Abraham mark.j.abraham at gmail.com
Sun Sep 28 16:17:13 CEST 2014


How about a redmine issue - this thread's not about GROMACS development,
per se ;-)

Mark

On Sun, Sep 28, 2014 at 3:18 PM, Alexey Shvetsov <alexxy at omrb.pnpi.spb.ru>
wrote:

> Hi Berk!
>
> It's not a cut-and-paste error, and there are no PDB dumps.
> I have also seen this error before with other systems.
>
> I can provide the tpr file for that system:
>
>
> https://biod.pnpi.spb.ru/~alexxy/gmx/psa_pep_ctrl.md_npt.tpr
>
>
> Berk Hess wrote on 28-09-2014 16:37:
>
>  Hi,
>>
>> I assume that your old and new coordinates being identical is correct
>> and not a cut-and-paste error.
>> That seems a bit strange; do you freeze part of the system?
>> The only things moving here are then the domain boundaries, and I don't
>> see an issue there, since they only moved a little.
>>
>> Do you have any more output besides the error message? PDB dump files
>> maybe?
>>
>> Cheers,
>>
>> Berk
>>
>> On 09/28/2014 02:22 PM, Alexey Shvetsov wrote:
>>
>>> Hi,
>>>
>>> Just to add: this error seems to be reproducible even on a single
>>> node. I also get the same error for GPU runs.
>>> However, I don't see it in large systems (800k+ atoms) running on a
>>> large number of CPUs (512+).
>>>
>>> Alexey Shvetsov wrote on 28-09-2014 13:44:
>>>
>>>> Hi,
>>>>
>>>> DD grid is
>>>>
>>>> Domain decomposition grid 4 x 1 x 1, separate PME ranks 0
>>>> PME domain decomposition: 4 x 1 x 1
>>>>
>>>> for 4 node setup
>>>>
>>>> and
>>>>
>>>> Domain decomposition grid 4 x 2 x 1, separate PME ranks 0
>>>> PME domain decomposition: 4 x 2 x 1
>>>>
>>>> for 8 node setup
>>>>
>>>> It's reproducible with the 5.0 release and the latest git master. I'll try to
>>>> check whether it's reproducible with 1 node. I can also provide the tpr file
>>>> for this system.
>>>>
>>>> Mark Abraham wrote on 28-09-2014 13:28:
>>>>
>>>>> Hi,
>>>>>
>>>>> It's hard to say from that information. There were some issues fixed in
>>>>> the lead-up to GROMACS 5 with DD not always working with 2 domains in
>>>>> a direction, but that's a pure guess. I'd assume you can reproduce
>>>>> this with the release-5-0 branch. Do you observe it with a single domain?
>>>>> If not, then it's surely a bug (and should be submitted to redmine).
>>>>>
>>>>> Mark
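
For the single-domain test Mark suggests, something along these lines should
work with the tpr linked above (command and flag names as of GROMACS 5.0; the
tpr file name and MPI launcher are taken from this thread and may need
adjusting to the local setup):

    # real-MPI build, as in the log: a single rank should run as one domain (no DD)
    mpirun -np 1 mdrun_mpi -s psa_pep_ctrl.md_npt.tpr

    # thread-MPI build: force a single thread-MPI rank
    gmx mdrun -ntmpi 1 -s psa_pep_ctrl.md_npt.tpr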
>>>>>
>>>>> On Sun, Sep 28, 2014 at 11:18 AM, Alexey Shvetsov
>>>>> <alexxy at omrb.pnpi.spb.ru> wrote:
>>>>>
>>>>>  Hi all!
>>>>>>
>>>>>> I'm doing some tests with a small peptide and constantly getting this
>>>>>> error. I get it with a few systems.
>>>>>>
>>>>>> The system sizes are around 10k or 20k atoms.
>>>>>> I run on 4 or 8 old nodes, each with two Xeon 54xx-series CPUs.
>>>>>>
>>>>>> starting mdrun '2ZCH_3 in water'
>>>>>> 50000000 steps, 100000.0 ps (continuing from step 1881000, 3762.0 ps).
>>>>>>
>>>>>> Step 13514000:
>>>>>> The charge group starting at atom 6608 moved more than the distance
>>>>>> allowed by the domain decomposition (1.112924) in direction X
>>>>>> distance out of cell -1.193103
>>>>>> Old coordinates: 5.467 0.298 3.636
>>>>>> New coordinates: 5.467 0.298 3.636
>>>>>> Old cell boundaries in direction X: 4.037 5.382
>>>>>> New cell boundaries in direction X: 4.089 5.452
>>>>>>
>>>>>> --------------------------------------------------------------------------
>>>>>> MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
>>>>>> with errorcode 1.
>>>>>>
>>>>>> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>>>>>> You may or may not see output from other processes, depending on
>>>>>> exactly when Open MPI kills them.
>>>>>>
>>>>>> --------------------------------------------------------------------------
>>>>>>
>>>>>> -------------------------------------------------------
>>>>>> Program mdrun_mpi, VERSION 5.1-dev-20140922-20c00a9-dirty-unknown
>>>>>> Source code file:
>>>>>> /var/tmp/alexxy/portage/sci-chemistry/gromacs-9999/work/gromacs-9999/src/gromacs/mdlib/domdec.cpp, line: 4388
>>>>>>
>>>>>> Fatal error:
>>>>>> A charge group moved too far between two domain decomposition steps
>>>>>> This usually means that your system is not well equilibrated
>>>>>> For more information and tips for troubleshooting, please check the
>>>>>> GROMACS
>>>>>> website at http://www.gromacs.org/Documentation/Errors [1]
>>>>>> -------------------------------------------------------
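
As a sanity check on the numbers printed above, and on Berk's remark that the
boundaries only moved a little, here is a minimal standalone sketch of the kind
of comparison behind this error. It is not the actual check in domdec.cpp
(which also handles periodic shifts and dynamic load balancing); it only plugs
in the values from this log:

    #include <cmath>
    #include <cstdio>

    int main()
    {
        /* Values copied from the error message above (all in nm). */
        const double x_old  = 5.467, x_new  = 5.467;  /* old/new X coordinate     */
        const double lo_old = 4.037, hi_old = 5.382;  /* old cell boundaries in X */
        const double lo_new = 4.089, hi_new = 5.452;  /* new cell boundaries in X */
        const double allowed = 1.112924;              /* distance allowed by DD   */

        /* How far the boundaries themselves shifted between DD steps. */
        std::printf("boundary shifts: %.3f and %.3f nm\n",
                    lo_new - lo_old, hi_new - hi_old);         /* 0.052 and 0.070 */

        /* How far the (unchanged) coordinate lies outside the new cell. */
        double out = 0.0;
        if (x_new > hi_new)
        {
            out = x_new - hi_new;   /* beyond the upper boundary            */
        }
        else if (x_new < lo_new)
        {
            out = x_new - lo_new;   /* below the lower boundary (negative)  */
        }
        std::printf("coordinate moved:     %.3f nm\n", std::fabs(x_new - x_old)); /* 0.000 */
        std::printf("distance out of cell: %.3f nm\n", out);                      /* 0.015 */
        std::printf("exceeds allowed %.6f nm: %s\n", allowed,
                    std::fabs(out) > allowed ? "yes" : "no");                     /* no   */
        return 0;
    }

Taken at face value, these numbers are nowhere near the allowed distance, and
they also do not reproduce the reported "distance out of cell -1.193103";
presumably that value involves a periodic image or internal state this sketch
does not model, which seems worth noting in the redmine issue.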
>>>>>>
>>>>>> --
>>>>>> Best Regards,
>>>>>> Alexey 'Alexxy' Shvetsov, PhD
>>>>>> Department of Molecular and Radiation Biophysics
>>>>>> FSBI Petersburg Nuclear Physics Institute, NRC Kurchatov Institute,
>>>>>> Leningrad region, Gatchina, Russia
>>>>>> mailto:alexxyum at gmail.com
>>>>>> mailto:alexxy at omrb.pnpi.spb.ru
>>>>>> --
>>>>>> Gromacs Developers mailing list
>>>>>>
>>>>>> * Please search the archive at
>>>>>> http://www.gromacs.org/Support/Mailing_Lists/GMX-developers_List [2]
>>>>>> before posting!
>>>>>>
>>>>>> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists [3]
>>>>>>
>>>>>> * For (un)subscribe requests visit
>>>>>>
>>>>>> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-developers [4]
>>>>>> or send a mail to gmx-developers-request at gromacs.org.
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> Links:
>>>>> ------
>>>>> [1] http://www.gromacs.org/Documentation/Errors
>>>>> [2] http://www.gromacs.org/Support/Mailing_Lists/GMX-developers_List
>>>>> [3] http://www.gromacs.org/Support/Mailing_Lists
>>>>> [4] https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-developers
>>>>>
>>>>
>>>> --
>>>> Best Regards,
>>>> Alexey 'Alexxy' Shvetsov, PhD
>>>> Department of Molecular and Radiation Biophysics
>>>> FSBI Petersburg Nuclear Physics Institute, NRC Kurchatov Institute,
>>>> Leningrad region, Gatchina, Russia
>>>> mailto:alexxyum at gmail.com
>>>> mailto:alexxy at omrb.pnpi.spb.ru
>>>>
>>>
>>>
> --
> Best Regards,
> Alexey 'Alexxy' Shvetsov, PhD
> Department of Molecular and Radiation Biophysics
> FSBI Petersburg Nuclear Physics Institute, NRC Kurchatov Institute,
> Leningrad region, Gatchina, Russia
> mailto:alexxyum at gmail.com
> mailto:alexxy at omrb.pnpi.spb.ru
> --
> Gromacs Developers mailing list
>
> * Please search the archive at
> http://www.gromacs.org/Support/Mailing_Lists/GMX-developers_List before posting!
>
> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>
> * For (un)subscribe requests visit
> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-developers
> or send a mail to gmx-developers-request at gromacs.org.
>

