[gmx-developers] grand-canonical MC using gromacs
Mark Abraham
Mark.Abraham at anu.edu.au
Tue Dec 7 23:09:21 CET 2010
On 8/12/2010 1:18 AM, René Pool wrote:
> Hi,
>
> Thanks Mark, found it!
> For me the problem is
> static bool bFirst=TRUE;
> in function ns() (in force.c), which is set to FALSE after the neighbor
> lists have been initialized.
>
> Resetting this variable to TRUE seems to yield the correct average
> density in an MC simulation. The downside is that doing so creates a
> huge memory leak.
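>
> In simplified form the pattern looks something like this (a toy sketch
> of the idea with made-up names, not the literal 4.0.5 code):
>
>   #include <stdbool.h>
>   #include <stdlib.h>
>
>   static bool   bFirst = true;   /* in the real code: a static inside ns() */
>   static float *nblist = NULL;   /* stands in for the neighbor-list structs */
>
>   void toy_reset(void) { bFirst = true; }   /* what our patch effectively does */
>
>   void toy_ns(int natoms)
>   {
>       if (bFirst)
>       {
>           /* the "once per simulation" setup: allocate neighbor-list
>              storage.  If bFirst is reset before every call, this runs
>              every time and the previous allocation is never freed. */
>           nblist = malloc(natoms * sizeof(*nblist));
>           bFirst = false;
>       }
>       /* ... build and use the neighbor lists ... */
>   }
>
> Forcing bFirst back to TRUE makes the lists match the new particle
> number, but the old allocations are leaked.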
Yep. GROMACS is very bad at managing its "once per simulation" memory,
on the theory that it's only doing one simulation. I once had a version
of GROMACS based on 4.0.x that had a pile of memory-cleanup routines of
the kind you might need. I could make them available if anyone is
interested.
Hopefully the move to C++ will help a bit, with formal destructors
taking care of some of this.
Mark
> At least I now know where to find the solution!
>
> Cheers,
> René
>
> Mark Abraham wrote:
>>
>> On 12/07/10, René Pool <r.pool at vu.nl> wrote:
>>> Dear all,
>>>
>>> We're busy implementing grand-canonical Monte Carlo using gromacs.
>>> Particle translations are accomplished by NVT MD, making this a
>>> hybrid MC/MD method. Our approach is to wrap gromacs in a Python
>>> shell from which we can insert/remove particles and perform NVT MD
>>> moves. For efficiency we use the ctypes module, which allows us to
>>> keep the gromacs data structures in memory throughout the MC run.
>>>
>>> To this end we had to make minor modifications/additions to the
>>> gromacs source (version 4.0.5, since that is the version we started
>>> the project with). Most importantly, we have split mdrunner() into
>>> (1) mdrunner_initialize(), (2) mdrunner_integrate() and (3)
>>> mdrunner_finalize(). The first initializes the necessary gromacs
>>> data structures from python and keeps them in memory. The second
>>> performs the NVT MD Monte Carlo moves and the single-point energy
>>> evaluations of a given state. The third is called after the MC loop
>>> to neatly finalize the simulation.
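>>>
>>> Roughly, the split looks like this (our own additions, not stock
>>> gromacs; the handle type and argument lists are simplified here):
>>>
>>>   /* opaque bundle holding cr, inputrec, mtop, state, forcerec, ... */
>>>   typedef struct gc_handle gc_handle;
>>>
>>>   /* read the tpr and set up all data structures, once */
>>>   gc_handle *mdrunner_initialize(const char *tprfile);
>>>
>>>   /* run nsteps of NVT MD; nsteps = 0 gives a single-point energy */
>>>   double mdrunner_integrate(gc_handle *h, int nsteps);
>>>
>>>   /* free what can be freed and write the final output */
>>>   void mdrunner_finalize(gc_handle *h);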
>>>
>>> For each number of particles N, we have the corresponding "tpr" in
>>> memory. For an MC insertion or removal move, we copy the current
>>> state->x and state->v arrays into the trial state->x and state->v
>>> arrays. For insertion we add one molecule to these arrays (i.e. we
>>> replace state->x[last-1] and state->v[last-1] by a trial position
>>> and a velocity vector drawn from the Maxwell-Boltzmann
>>> distribution). For removal we select a molecule at random and leave
>>> its entry out when copying to the trial state. These "trial states"
>>> serve as input for mdrunner_integrate(), which computes the
>>> single-point energies (an MD run with nsteps=0).
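>>>
>>> For a single-site LJ particle the trial construction is essentially
>>> the following (a sketch with our own helper names, not gromacs code;
>>> velocities are handled the same way):
>>>
>>>   #include <string.h>
>>>
>>>   typedef float rvec[3];   /* same layout as the single-precision
>>>                               gromacs rvec */
>>>
>>>   /* insertion: copy the n current positions and put the trial
>>>      position in slot n of the (n+1)-particle trial array */
>>>   void make_trial_insert(const rvec *x, int n, const rvec xnew,
>>>                          rvec *xtrial)
>>>   {
>>>       memcpy(xtrial, x, n * sizeof(rvec));
>>>       memcpy(xtrial[n], xnew, sizeof(rvec));
>>>   }
>>>
>>>   /* removal: copy all positions except the randomly chosen index
>>>      imol into the (n-1)-particle trial array */
>>>   void make_trial_remove(const rvec *x, int n, int imol, rvec *xtrial)
>>>   {
>>>       memcpy(xtrial, x, imol * sizeof(rvec));
>>>       memcpy(xtrial + imol, x + imol + 1,
>>>              (n - imol - 1) * sizeof(rvec));
>>>   }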
>>>
>>> So far so good, everything works, BUT...
>>>
>>> The energies computed by mdrunner_integrate() do not correspond to
>>> the state, which screws up the MC sampling.
>>>
>>> My question therefore is:
>>> Are we overlooking some variable in the gromacs data structures that
>>> also needs to be set during insertion/removal, or is there some
>>> static (global) variable that stands in the way?
>>>
>>
>> When are you constructing and managing the neighbour lists? These are
>> not trivial to manage well.
>>
>> Mark
>>
>>
>>>
>>> Thanks in advance for your help!
>>>
>>> By the way, we're testing this on a simple LJ fluid system on a
>>> single CPU. So for now, no complicated stuff such as
>>> parallelization or charges.
>>>
>>> Cheers,
>>> Rene
>>>
>>> --
>>> =====================================================
>>> René Pool
>>> IBIVU/Bioinformatics
>>> Vrije Universiteit Amsterdam
>>> De Boelelaan 1081a
>>> 1081HV AMSTERDAM, the Netherlands
>>> Room P120
>>> E: r.pool at few.vu.nl
>>> T: +31 20 59 83714
>>> F: +31 20 59 87653
>>> =====================================================