[gmx-users] Re: Question about Berendsen thermostat and Nose-Hoover temp coupling (chris.neale at utoronto.ca)

Michael Shirts michael.shirts at columbia.edu
Wed Jul 23 16:12:36 CEST 2008


Note that a velocity rescaling algorithm was recently developed in
the Parrinello group that does give the correct canonical ensemble.
See: http://arxiv.org/abs/0803.4060.  It might be nice to replace
Berendsen with this algorithm, which should keep the nice property
of Berendsen (it converges to the target temperature quickly) without
the serious drawbacks.
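
Roughly speaking, the scheme draws a random scaling factor for the
velocities at every coupling step.  A minimal sketch of that step in
Python (my own illustration of the idea, not GROMACS code -- the
function and variable names are just made up for the example):

import numpy as np

def csvr_scaling_factor(kin, kin_target, ndf, dt, tau, rng=None):
    # kin        : instantaneous kinetic energy of the coupled group
    # kin_target : target average kinetic energy, ndf * kB * T0 / 2
    # ndf        : number of degrees of freedom coupled to the thermostat
    # dt, tau    : MD time step and thermostat relaxation time
    if rng is None:
        rng = np.random.default_rng()
    c = np.exp(-dt / tau)        # deterministic, Berendsen-like relaxation
    r1 = rng.standard_normal()   # one Gaussian random number
    s = rng.chisquare(ndf - 1)   # sum of the other ndf-1 squared Gaussians
    f = (kin_target / (ndf * kin)) * (1.0 - c)
    alpha2 = c + f * (r1 * r1 + s) + 2.0 * r1 * np.sqrt(c * f)
    return np.sqrt(alpha2)       # multiply every velocity by this factor

# e.g.  v *= csvr_scaling_factor(kin, 0.5 * ndf * kB * T0, ndf, dt, tau)

If you replace the random numbers by their averages, alpha^2 reduces
to the usual Berendsen factor 1 + (dt/tau)*(T0/T - 1) for small
dt/tau, so the relaxation toward the target temperature should be just
as fast; the noise term is what restores the proper canonical
distribution of the kinetic energy.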

Cheers,
Michael

On Wed, Jul 23, 2008 at 10:10 AM, Michael Shirts
<michael.shirts at columbia.edu> wrote:
>> Have you seen any information to suggest that this is actually a
>> non-trivial concern? That is, given static point charges, an empirical
>> LJ force, short cutoffs, etc., do you believe that the application of
>> Nose-Hoover, Berendsen, or even the arbitrary velocity rescaling
>> significantly degrades the quality of the obtained dynamics?
>
> 1) I think there's an important distinction to be made here between
> accuracy and physical validity.  If you use a thermostat, the dynamic
> properties you obtain for the system will be different from the
> properties obtained without the thermostat, independent of which
> model you choose.  So I don't know that it's very useful to ask
> whether the differences from the true system due to the thermostat
> are large compared to the differences due to the choice of model --
> the results will depend on the thermostat either way.  That is why
> the field has settled on a standard definition of the dynamics, the
> one that most resembles the actual physical system (where there is no
> temperature rescaling every 2 fs, no piston coupled to a 10 nm cube
> of water, etc.).
>
> 2) If one uses the Berendsen thermostat, then the statistical
> mechanics of the canonical ensemble will not strictly apply (the
> kinetic-energy fluctuations, for instance, come out suppressed; see
> the sketch below), and one can't use many of the results one would
> like to (or is already using, now incorrectly).  One can do
> physics-based simulations of molecular models, or one can do
> non-physics-based simulations of molecular models.  In many cases the
> non-physics-based results will be statistically indistinguishable
> from the physics-based results.  But why bother with an uncontrolled
> approximation when you don't -have- to use one?  It just adds another
> chance that what one simulates is not reproducible or reliable, and
> heaven knows that simulation currently has enough of those already.
>
> It's like building a tower out of blocks, and each uncontrolled
> approximation, no matter how small, is a bit of unevenness in the
> block.  If you have just one wobbly block, you can stand on it pretty
> well -- you know the limits of the approximation and can get a pretty
> good sense of what's going on with the system.  Once there are a
> large number of wobbly blocks, though, good luck standing on top of
> the pile and trying to build good science.  As simulations get larger and more
> complicated, they are built out of more and more blocks, and we need
> to make sure we're paying careful attention to how well they all fit
> together.
>
> Best,
> Michael Shirts
> Research Fellow
> Columbia University
> Department of Chemistry
>
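
On point 2, one simple thing anyone can check on their own
trajectories is the size of the kinetic-energy fluctuations: in the
canonical ensemble Var(K) = (N_f/2)*(kB*T0)^2, and Berendsen coupling
suppresses these fluctuations.  A small diagnostic sketch in Python
(again my own illustration, not a GROMACS tool; units assume kJ/mol
and K):

import numpy as np

KB = 0.0083144621   # Boltzmann constant in kJ/(mol K), GROMACS-style units

def berendsen_lambda(temp_inst, temp_target, dt, tau_t):
    # Per-step Berendsen velocity-scaling factor (Berendsen et al., 1984).
    return np.sqrt(1.0 + (dt / tau_t) * (temp_target / temp_inst - 1.0))

def kinetic_fluctuation_ratio(kin_series, ndf, temp_target):
    # Ratio of the observed Var(K) to the canonical prediction
    # Var(K) = (ndf / 2) * (kB * T0)**2.  Close to 1 for a proper NVT
    # ensemble; well below 1 is typical of tightly Berendsen-coupled runs.
    var_obs = np.var(np.asarray(kin_series, dtype=float))
    var_canonical = 0.5 * ndf * (KB * temp_target) ** 2
    return var_obs / var_canonical

Feed it the kinetic-energy time series of an equilibrated run together
with the number of degrees of freedom of the thermostatted group; a
ratio well below one is the usual signature of the weak-coupling
(Berendsen) ensemble, while Nose-Hoover or the stochastic rescaling
above should give something close to one.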


