[gmx-developers] Collect all coordinates/velocities/virial in one MPI rank in mdlib/constr.cpp::Impl::apply()
Berk Hess
hess at kth.se
Wed Jan 12 09:09:25 CET 2022
On 1/11/22 11:21, Lorién López Villellas wrote:
> Hi all.
>
> We are working on a new parallel bond-constraint solver based on
> SHAKE. We already have a working OpenMP version with promising
> results, and we now want to do a first MPI implementation. Due to
> some algorithm-inherent constraints, we cannot reuse the LINCS MPI
> implementation, in which only the needed coordinates are
> sent/received via dd_move_x_constraints() in constr.cpp and
> lincs.cpp. As a first approach, we want to collect all the
> coordinates on the master rank, execute the solver, and send the
> updated results back to the other ranks. However, every rank's arrays
> mix its local coordinates with the communicated ones. How can we
> distinguish between local and communicated coordinates in the x and
> xprime arrays? How can we tell which atom corresponds to each entry of
> the arrays? Is there an already-implemented method to send/receive all
> coordinates?
>
> We also need to collect the velocities and the virial. As far as I can
> see, there is no send/receive of this information before/in P-LINCS.
> How are the velocities and the virial kept up to date in P-LINCS?
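The gather-to-master bookkeeping asked about above can be sketched without any GROMACS internals: with domain decomposition, each rank holds a local-to-global atom index array, and only the first "home" entries are authoritative (the rest are communicated halo copies). A minimal illustration, assuming hypothetical names (`RankData`, `gatherToMaster`; this is not the GROMACS API):

```cpp
#include <cstddef>
#include <vector>

struct Vec3 { double x, y, z; };

// Hypothetical per-rank view: local coordinates plus the global atom
// index of each local entry. The first numHomeAtoms entries are the
// rank's own ("home") atoms; the rest are communicated copies.
struct RankData {
    std::vector<int>  globalIndex;  // local index -> global atom index
    std::vector<Vec3> coords;       // local coordinates (home + communicated)
    int numHomeAtoms;               // count of authoritative home entries
};

// Assemble the full coordinate array on the master by scattering each
// rank's home coordinates into a global array by global index. Halo
// copies are skipped so they never overwrite authoritative home values.
std::vector<Vec3> gatherToMaster(const std::vector<RankData>& ranks,
                                 std::size_t numGlobalAtoms)
{
    std::vector<Vec3> global(numGlobalAtoms, Vec3{0, 0, 0});
    for (const RankData& r : ranks) {
        for (int i = 0; i < r.numHomeAtoms; ++i) {
            global[r.globalIndex[i]] = r.coords[i];
        }
    }
    return global;
}
```

In a real MPI run the per-rank data would arrive via e.g. MPI_Gatherv of the index and coordinate arrays; the placement-by-global-index step is the same.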
When using leap-frog, all constraint algorithms in GROMACS correct the
velocities using the position correction divided by dt. As all current
constraint algorithms only modify local coordinates, only the local
velocities need to be passed to the constraint algorithms. With the
velocity Verlet integrator the velocities are constrained in a separate
step.
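The leap-frog velocity correction described above can be sketched in a few lines: each velocity component receives the position correction applied by the constraint algorithm, divided by dt. Array and function names here are illustrative, not the GROMACS implementation:

```cpp
#include <cstddef>
#include <vector>

// Sketch of the leap-frog velocity update after constraining:
// v += (x'_constrained - x'_unconstrained) / dt, per component.
// Only local atoms are touched, which is why only local velocities
// need to be passed to the constraint algorithm.
void correctVelocities(const std::vector<double>& xprimeUnconstrained,
                       const std::vector<double>& xprimeConstrained,
                       std::vector<double>& v,
                       double dt)
{
    const double invdt = 1.0 / dt;
    for (std::size_t i = 0; i < v.size(); ++i) {
        // The constraint moved this coordinate by
        // (constrained - unconstrained); apply the same shift to the
        // velocity, scaled by 1/dt.
        v[i] += (xprimeConstrained[i] - xprimeUnconstrained[i]) * invdt;
    }
}
```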
Note that we are moving away from constraining all bonds in
bio-molecular simulations, for two reasons. One is that most force
fields have been parametrized with only the bonds involving hydrogens
constrained. The other is that it is computationally much simpler to
constrain bonds that are coupled only through a single (heavy) atom.
Cheers,
Berk
>
> Thank you very much in advance.
>
> Regards,
> Lorién.
>