[gmx-users] Very large fluctuations in dg/dl

David Mobley dmobley at gmail.com
Tue May 8 22:56:10 CEST 2007


> My first
> observation was that 1 ns seemed to be a minimum for certain lambda values
> (e.g. lambda=0.70). I sometimes read in the literature that some authors
> used only a few hundred ps, which seemed (to me) insufficient for
> proper convergence.

My experience is that the range of lambda=0.7 to 0.85 or so is the
hardest part with the particular set of soft-core parameters you're
using (the ones Michael Shirts and I recommend). In solvation free
energy calculations, this is usually the range at which water begins
to interpenetrate the solute, and this can initially be a bit "sticky"
(i.e. the solute overlaps with a water molecule and gets stuck for a
while before the two come apart again).

I suppose how much data you need partly depends on the accuracy you
want to achieve. But you'll notice in my tutorial I talk about running
these for 5 ns at each lambda value.

Of course, when one is doing binding simulations, it may be a lot
harder to get 5 ns at each lambda value than for solvation, so
sometimes one is willing to sacrifice some accuracy...

> Now, if we come back to the error estimate of the mean, I found (for
> lambda=0.00) 0.2 kJ/mol using block averaging (using the -ee option of
> g_analyze), which I imagine is reasonable (even if higher precisions
> have been reported in the literature). I'm not sure whether this is the
> same way of calculating the uncertainty as the one you proposed.

John would be better prepared than me to comment on whether block
averaging gives the same estimates of the autocorrelation time as
his strategy (from Janke). I think probably the most important thing
is to use some reasonable estimate of the autocorrelation time.
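For what it's worth, the basic idea behind a block-averaging error
estimate can be sketched in a few lines (this is a simplified
illustration of the general technique, not what g_analyze -ee actually
implements internally; the function name and block counts are just for
illustration):

```python
import numpy as np

def block_error(x, n_blocks):
    """Standard error of the mean from n_blocks equal, contiguous blocks."""
    n = len(x) // n_blocks  # samples per block (any remainder is dropped)
    means = np.array([x[i * n:(i + 1) * n].mean() for i in range(n_blocks)])
    return means.std(ddof=1) / np.sqrt(n_blocks)

# For a correlated time series (e.g. dg/dl), the estimate grows as the
# blocks get longer and levels off once each block spans many correlation
# times; the plateau value is the error bar to report.
```

For uncorrelated data, every block size gives (to within noise) the
naive sigma/sqrt(N); the plateau behavior only appears when successive
samples are correlated.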

John probably has a python implementation of the Janke routine
floating around on his website somewhere; I can try to dig up the
link if you'd like it, or you can ask him for it.
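I don't know the details of John's script, but a common
autocorrelation-based estimate in the same spirit is to compute the
statistical inefficiency g (roughly 1 + 2 times the integrated
autocorrelation time, in units of the sampling interval) and inflate
the naive error bar by sqrt(g). A minimal sketch, where truncating the
sum at the first non-positive correlation is one common heuristic and
not necessarily Janke's prescription:

```python
import numpy as np

def statistical_inefficiency(x):
    """g = 1 + 2*tau_int; multiply the naive error bar by sqrt(g)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    dx = x - x.mean()
    var = dx.dot(dx) / n
    g = 1.0
    for t in range(1, n // 2):
        # normalized autocorrelation at lag t
        c = dx[:n - t].dot(dx[t:]) / ((n - t) * var)
        if c <= 0:  # truncate once correlations decay into noise
            break
        g += 2.0 * c
    return g

def correlated_sem(x):
    """Standard error of the mean, corrected for autocorrelation."""
    g = statistical_inefficiency(x)
    return np.std(x, ddof=1) * np.sqrt(g / len(x))
```

On an uncorrelated series g comes out near 1 and you recover the naive
sigma/sqrt(N); on a strongly correlated dg/dl series g can easily be in
the tens, which is exactly why the naive estimate is too optimistic.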
