[gmx-developers] Repartitioning Domain Decomposition after PBC Wrapping
Michael Quevillon
mquevill at nd.edu
Thu Aug 2 19:44:44 CEST 2018
Hello,
I am one of the developers of a suite of advanced sampling techniques
(SSAGES v0.8.2, https://github.com/MICCoM/SSAGES-public) that ties in with
several MD engines (including GROMACS). Ideally, our patching of GROMACS
would be minimal, to keep the code bases relatively clean, so we are
injecting some function definitions and calls into
src/programs/mdrun/md.cpp (among others) by means of a patch (
https://github.com/MICCoM/SSAGES-public/blob/master/hooks/gromacs/gmx_diff_2018.x.patch),
with most of the calculations happening in our own code.
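As a rough illustration of that pattern, here is a minimal, self-contained
sketch; doForceStub and postIntegrationHookStub are stand-in names for this
example only, not the real GROMACS loop or the SSAGES API:

// hook_sketch.cpp -- conceptual illustration with stand-in names.
#include <cstdio>
#include <vector>

// Stand-in for the engine's per-step force/integration work.
static void doForceStub(std::vector<double> &x)
{
    for (double &xi : x) { xi += 0.1; }
}

// Stand-in for the injected hook: the sampling method gets a chance to
// inspect or modify the configuration right after the engine's work.
static void postIntegrationHookStub(std::vector<double> &x)
{
    std::printf("hook sees %zu coords, x[0] = %g\n", x.size(), x[0]);
}

int main()
{
    std::vector<double> x = {1.0, 2.0, 3.0};
    for (int step = 0; step < 3; ++step)
    {
        doForceStub(x);             // engine work
        postIntegrationHookStub(x); // injected call added by the patch
    }
    return 0;
}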
The bulk of the changes our methods apply happens immediately after the
do_force() call, via our PostIntegrationHook(). A couple of the methods we
have implemented use techniques similar to Replica Exchange, in that they
swap configurations. When using domain decomposition, these simulations
fail due to wrapping around the periodic boundary conditions, causing a
cg_move_error of "Atom moved too far".
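To make "swap configurations" concrete, here is a minimal standalone
sketch, under the assumption of two walkers on two MPI ranks exchanging
whole coordinate buffers (toy data, not our actual implementation):

// swap_sketch.cpp -- illustrative only. Build: mpicxx swap_sketch.cpp
// Run with exactly two ranks: mpirun -np 2 ./a.out
#include <mpi.h>
#include <cstdio>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    if (size != 2) { MPI_Abort(MPI_COMM_WORLD, 1); }

    double x[3] = {1.0 * rank, 2.0 * rank, 3.0 * rank}; // toy "configuration"
    int partner = 1 - rank; // ranks 0 and 1 swap with each other

    // Exchange the entire coordinate buffer in place.
    MPI_Sendrecv_replace(x, 3, MPI_DOUBLE, partner, 0, partner, 0,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    std::printf("rank %d now holds x[0] = %g\n", rank, x[0]);
    MPI_Finalize();
    return 0;
}

After such a swap, a rank suddenly holds coordinates generated by a
different walker, which sets up the mismatch described next.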
The only thing that happens before this error is that part of the molecule
gets wrapped around the boundary, so the atom positions no longer match the
domain decomposition cells they are assigned to.
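A self-contained sketch of that disconnect, with a hypothetical box size
and cell count: wrapping a swapped-in coordinate back into the box can land
it in a cell far from the one that currently owns the atom.

// wrap_sketch.cpp -- standalone illustration, not GROMACS code.
#include <cstdio>
#include <cmath>

int main()
{
    const double box    = 4.0; // cubic box edge (nm), hypothetical
    const int    ncells = 4;   // DD cells along this axis, hypothetical

    double x    = 3.9;                     // atom position before the swap
    int    home = (int)(x / box * ncells); // cell that owns the atom

    // A swapped-in configuration moves the atom just across the boundary;
    // minimum-image wrapping maps it back into the box on the far side.
    double xNew = 4.1;
    xNew -= box * std::floor(xNew / box);  // wrap: 4.1 -> 0.1

    int cellNew = (int)(xNew / box * ncells);
    std::printf("home cell %d, cell after wrap %d\n", home, cellNew);
    // Unless the system is repartitioned, the owning rank still expects
    // the atom near cell 3, while its coordinate now belongs to cell 0 --
    // the kind of inconsistency that triggers "Atom moved too far".
    return 0;
}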
I have tried changing the condition
`if ( (bExchanged || bNeedRepartition) && DOMAINDECOMP(cr) )`
to TRUE, just to see if that would fix the issue, since the system would
then be repartitioned into the proper domain decomposition cells (via the
call to dd_partition_system). While this avoids the aforementioned crash,
the atoms then do not move at all (when visualized with VMD).
We also thought that triggering a neighbor list rebuild would fix this, but
setting `nstlist = 1` had no effect on the error.
If you have any suggestions or ideas about how we might alleviate this
issue, we would greatly appreciate them. Below are instructions for
building SSAGES and running a test case that exemplifies the issue.
Thanks in advance!
Michael Quevillon
Build instructions:
To compile SSAGES with GROMACS, run the following commands from the root
folder (./); the executable is built at ./build/ssages:
mkdir build
cd build
cmake .. -DGROMACS=2018.1 -DGMX_BUILD_OWN_FFTW=yes
make
Within our code, there is an example that fails relatively quickly:
alanine dipeptide in vacuum using the Swarm of Trajectories method. The
files are located in ./Examples/User/Swarm/GMX_ADP. By default, the
example is set up with 22 walkers, meaning you would need to run it on a
multiple of 22 cores. The problem still occurs with fewer walkers, which
also uses fewer resources, so please change 22 to 4 in Input_Generator.py,
Template_Input.json, and copytpr.py. Pre-compiled .tpr files are included
in the source, but they can be regenerated with `gmx grompp -f nvt.mdp -c
adp.gro -o adp.tpr`. Next, run the Python scripts Input_Generator.py and
copytpr.py to set up the input files. Finally, SSAGES can be run with
`mpirun -np 8 ./ssages Swarm.json`, or with any larger multiple of 4
processes.