[gmx-developers] Reference coordinates in mdrun - Availability at all nodes
Carsten Kutzner
ckutzne at gwdg.de
Wed Jan 6 11:00:48 CET 2010
Hi Tsjerk,
what you want to do seems very similar to the procedure followed
in the 4.x essential dynamics module. I think you can copy and paste these
routines and modify them for your purpose. Everything you need is in
src/mdlib/edsam.c.
init_edsam reads and broadcasts the reference set of coordinates so that they
are available on all nodes in ed->sref.x. The reference positions are
communicated once, and only at the beginning of a simulation.
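In case it helps, the broadcast step boils down to roughly the following
sketch (plain MPI here instead of the GROMACS communication wrappers; the
names xref, nref, and comm are just placeholders, not the edsam ones):

#include <mpi.h>
#include <stdlib.h>

typedef float rvec[3];   /* stands in for the GROMACS rvec (real == float assumed) */

/* Broadcast the reference positions once, before the MD loop starts.
 * The master rank has read them already; the others allocate and receive. */
static void bcast_reference(rvec **xref, int *nref, int masterrank, MPI_Comm comm)
{
    int rank;

    MPI_Comm_rank(comm, &rank);

    /* first tell everybody how many reference positions there are ... */
    MPI_Bcast(nref, 1, MPI_INT, masterrank, comm);

    /* ... so that the non-master ranks can allocate before receiving them */
    if (rank != masterrank)
    {
        *xref = malloc((*nref) * sizeof(rvec));
    }
    MPI_Bcast(*xref, 3 * (*nref), MPI_FLOAT, masterrank, comm);
}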
During domain decomposition, the local indices of the coordinates that
need to be compared against their reference values are determined in
dd_make_local_ed_indices (and stored in s->anrs_loc[]).
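Conceptually, that bookkeeping looks roughly like the sketch below (the
struct and the lookup function are only illustrative placeholders, not the
actual edsam/DD API): after each repartitioning you walk over the group's
global atom numbers, ask the DD which of them are home on this node, and
record both the local atom index and the slot in the collective array.

/* illustrative bookkeeping only -- field and function names are placeholders */
typedef struct {
    int  nr;         /* number of atoms in the group (global)            */
    int *anrs;       /* their global atom numbers                        */
    int  nr_loc;     /* how many of them are home on this node           */
    int *anrs_loc;   /* local atom indices of the home atoms             */
    int *c_ind;      /* for each home atom: its slot in the collective
                      * (and thus reference) array, 0 ... nr-1           */
} group_t;

/* stand-in for the DD global-to-local lookup: returns the local index,
 * or -1 if the atom is not home on this node */
extern int global_to_local(int global_atom_number);

static void make_local_indices(group_t *s)
{
    int i, loc;

    s->nr_loc = 0;
    for (i = 0; i < s->nr; i++)
    {
        loc = global_to_local(s->anrs[i]);
        if (loc >= 0)
        {
            s->anrs_loc[s->nr_loc] = loc;   /* where the atom sits locally        */
            s->c_ind[s->nr_loc]    = i;     /* where it sits in sref / collective */
            s->nr_loc++;
        }
    }
}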
The actual values of the local coordinates are then assembled in
get_coordinates. There, the coordinates from all nodes are gathered into a
collective coordinate array that you can directly compare against the
reference array.
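The idea behind that assembly is roughly the following (again plain MPI
instead of the GROMACS summation routines, and the names are placeholders):
every node writes its local positions into the proper slots of a zeroed
collective array, and a global sum then gives everyone the complete group.

#include <string.h>
#include <mpi.h>

typedef float rvec[3];   /* as in the sketch above */

/* xcoll has room for all nr group atoms; x holds this node's local positions */
static void assemble_group_positions(rvec *xcoll, int nr, const rvec *x,
                                     const int *anrs_loc, const int *c_ind,
                                     int nr_loc, MPI_Comm comm)
{
    int i;

    memset(xcoll, 0, nr * sizeof(rvec));
    for (i = 0; i < nr_loc; i++)
    {
        /* put each home atom of the group into its collective slot */
        memcpy(xcoll[c_ind[i]], x[anrs_loc[i]], sizeof(rvec));
    }
    /* summing the sparse arrays from all nodes yields the complete group */
    MPI_Allreduce(MPI_IN_PLACE, xcoll, 3 * nr, MPI_FLOAT, MPI_SUM, comm);
}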
If you do not need all of the group's positions on all nodes, you do not even
need to call get_coordinates. In that case, c_ind[i] (for i = 0 ... s->nr_loc-1)
tells you which entry of sref each local atom of your group has to be compared
against, e.g.
// x is the local part of the coordinate array that enters the routine
rvec curr_x, ref_x;
int  i;

for (i = 0; i < s->nr_loc; i++)
{
    copy_rvec(x[s->anrs_loc[i]], curr_x);      // current local coordinate
    copy_rvec(s->sref.x[s->c_ind[i]], ref_x);  // reference to compare against
    ....
}
If you do it this way, DD is only used to find out which coordinates of the
reference structure are local (which will of course change with time), so there
is no overhead and no extra communication involved.
Hope that helps,
Carsten
On Jan 6, 2010, at 8:33 AM, Tsjerk Wassenaar wrote:
> Hi Mark,
>
> Thanks. It makes sense not to store the full array at each node, which
> would require more memory than strictly necessary. The drawback is
> doing DD every time on something that doesn't change, but I see that
> that's required to maintain consistency of indices. Using the
> t_forcerec seems a bit of a hack, but might indeed be the best way.
>
> Cheers,
>
> Tsjerk
>
> On Tue, Jan 5, 2010 at 11:00 PM, Mark Abraham <Mark.Abraham at anu.edu.au> wrote:
>> Tsjerk Wassenaar wrote:
>>>
>>> Hi,
>>>
>>> For some reason, I'm reading in a set of reference coordinates in
>>> mdrun (do_md). These should be available during the entire run, but
>>> obviously, I have to ensure that they are available at each node in
>>> order not to compromise efficiency. Could any of you give a pointer
>>> (no pun intended, but please, not a void pointer :p) on how to achieve
>>> that?
>>
>> It won't be straightforward. The DD algorithm doesn't replicate full
>> position arrays across processors, so there's a bunch of messy indexing that
>> goes on. Presumably you'll need to subject your reference coordinates to the
>> same decomposition so that indices match on the nodes. If so, you can just
>> add another rvec * to t_forcerec, load it earlier, and use the global vs
>> local machinery that's already used for the positions. I suggest stepping
>> through the (start of the) DD in the do_md loop.
>>
>> Mark
>
>
>
> --
> Tsjerk A. Wassenaar, Ph.D.
>
> Computational Chemist
> Medicinal Chemist
> Neuropharmacologist
--
Dr. Carsten Kutzner
Max Planck Institute for Biophysical Chemistry
Theoretical and Computational Biophysics
Am Fassberg 11, 37077 Goettingen, Germany
Tel. +49-551-2012313, Fax: +49-551-2012302
http://www.mpibpc.mpg.de/home/grubmueller/ihp/ckutzne