[gmx-developers] Possible bug in gmx

Alexey Shvetsov alexxy at omrb.pnpi.spb.ru
Sun Sep 28 11:19:09 CEST 2014


Hi all!

I'm doing some tests with a small peptide and I am constantly getting
this error. I see it with a few different systems.

System sizes are around 10k or 20k atoms.
I run it on 4 or 8 old nodes, each with two Xeon 54xx-series CPUs.

starting mdrun '2ZCH_3 in water'
50000000 steps, 100000.0 ps (continuing from step 1881000,   3762.0 ps).

Step 13514000:
The charge group starting at atom 6608 moved more than the distance 
allowed by the domain decomposition (1.112924) in direction X
distance out of cell -1.193103
Old coordinates:    5.467    0.298    3.636
New coordinates:    5.467    0.298    3.636
Old cell boundaries in direction X:    4.037    5.382
New cell boundaries in direction X:    4.089    5.452
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

-------------------------------------------------------
Program mdrun_mpi, VERSION 5.1-dev-20140922-20c00a9-dirty-unknown
Source code file: /var/tmp/alexxy/portage/sci-chemistry/gromacs-9999/work/gromacs-9999/src/gromacs/mdlib/domdec.cpp, line: 4388

Fatal error:
A charge group moved too far between two domain decomposition steps
This usually means that your system is not well equilibrated
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors
-------------------------------------------------------
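What looks odd to me is that the old and new coordinates in the message
are identical, so the charge group does not actually seem to have jumped,
and the reported "distance out of cell" does not match the printed
boundaries. Here is a minimal standalone recheck of the numbers above
(just my reading of the message, assuming the distance should be how far
the X coordinate lies outside the new cell; this is not the GROMACS code):

#include <cstdio>

int main()
{
    const double x        = 5.467;  // old == new X coordinate from the message
    const double newLower = 4.089;  // new cell boundaries in direction X
    const double newUpper = 5.452;

    double distOut = 0.0;
    if (x < newLower)
    {
        distOut = x - newLower;     // negative: below the lower boundary
    }
    else if (x > newUpper)
    {
        distOut = x - newUpper;     // positive: above the upper boundary
    }

    // Prints 0.015000, not the -1.193103 reported by mdrun.
    std::printf("distance out of cell in X: %f\n", distOut);
    return 0;
}

Either the printed coordinates are not the ones actually used for the
check, or the distance is measured against something else (a PBC image?).
That is why I suspect a reporting or bookkeeping problem rather than an
equilibration issue.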


-- 
Best Regards,
Alexey 'Alexxy' Shvetsov, PhD
Department of Molecular and Radiation Biophysics
FSBI Petersburg Nuclear Physics Institute, NRC Kurchatov Institute,
Leningrad region, Gatchina, Russia
mailto:alexxyum at gmail.com
mailto:alexxy at omrb.pnpi.spb.ru

