[gmx-users] Unexpected behavior of g_msd

darrellk at ece.ubc.ca darrellk at ece.ubc.ca
Thu Nov 26 20:43:00 CET 2009


Hi Berk,
I think you mean that the MSD is a function of the time differences and of the
number of time differences available. In other words, if the simulation is too
short, GROMACS does not have enough time differences to calculate a diffusion
constant accurately, since the slope of the MSD versus time within those
intervals varies too much. My understanding was therefore that, to get an
accurate measurement of diffusion, the simulation must run long enough to
reach a region where the MSD is linear in time, so that the regression fit is
improved. Is this correct?

In the paper "The Working Man's Guide to Obtaining Self Diffusion
Coefficients from Molecular Dynamics Simulations", the authors indicate that a
simulation time of approximately 2 ns is appropriate for gas phase
simulations, and that the time interval used for calculating the diffusion
constant should be the linear region near the end of the simulation.
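To make the fitting step concrete, here is a minimal sketch of extracting D
from the linear region of an MSD curve via the Einstein relation in 3D,
MSD(t) = 6*D*t. The MSD data below is synthetic and the fit window is an
assumption for illustration; this is not how g_msd is implemented internally.

```python
# Sketch: least-squares fit of the (assumed) linear region of an MSD curve,
# then D from the Einstein relation in 3D.  All numbers are synthetic.

def fit_slope(times, values):
    """Ordinary least-squares slope through the given points."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    num = sum((t - mt) * (v - mv) for t, v in zip(times, values))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

# Synthetic MSD in nm^2 over t in ps, generated with D = 0.002 nm^2/ps.
times = list(range(100, 200))      # restrict the fit to the late, linear region
msd = [6 * 0.002 * t for t in times]

D = fit_slope(times, msd) / 6.0    # Einstein relation: MSD = 6*D*t in 3D
print(D)                           # recovers approximately 0.002
```

In practice one would inspect the MSD curve first and choose the fit window
where it is visibly linear, as the paper above recommends.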

Thanks.

Darrell

>Date: Thu, 26 Nov 2009 11:34:02 +0100
>From: Berk Hess <gmx3 at hotmail.com>
>Subject: RE: [gmx-users] Unexpected behavior of g_msd
>To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>
>
>NO!
>The MSD is NOT a function of the trajectory time!
>The MSD is a function of time differences.
>For example:
>if trestart=10 and -b=60, then MSD(4) in the output is the average of the MSD
>between 64 and 68, 74 and 78, 84 and 88, etc.
>
>Berk
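The restart-averaging Berk describes can be sketched in a few lines: for a
given time difference, the squared displacement is averaged over time origins
spaced trestart frames apart. This is a toy illustration assuming evenly
spaced frames; the function and variable names are mine, not GROMACS code.

```python
# Sketch of averaging the MSD over multiple time origins (the -trestart idea).
# "positions" is a toy 1D trajectory; real trajectories are 3D coordinates.

def msd_with_restarts(positions, dt_frames, trestart_frames):
    """Squared displacement for lag dt_frames, averaged over origins
    spaced trestart_frames apart."""
    origins = range(0, len(positions) - dt_frames, trestart_frames)
    disps = [(positions[t0 + dt_frames] - positions[t0]) ** 2 for t0 in origins]
    return sum(disps) / len(disps)

# Toy ballistic trajectory x(t) = 0.5*t, so MSD(dt) = (0.5*dt)^2 exactly.
traj = [0.5 * t for t in range(100)]
print(msd_with_restarts(traj, dt_frames=4, trestart_frames=10))  # 4.0
```

The key point is that the lag dt_frames, not the absolute trajectory time,
indexes the MSD; the absolute times only supply the pool of origins.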


