[gmx-users] Shell (Drude) model for polarization in GROMACS
Eric Smoll
ericsmoll at gmail.com
Sun Jul 1 21:13:22 CEST 2018
Thank you,
I have compiled the "Drude" branch and am preparing my topology.
Unfortunately, the best reference for implementing shell molecular dynamics
in GROMACS is an extremely long and confusing 2014 gmx-users thread between
you and a confused GROMACS user (
https://mailman-1.sys.kth.se/pipermail/gromacs.org_gmx-users/2014-July/091066.html).
I have more questions.
I am interested in adding polarization to an OPLS-AA-style model. Let's
build a model system to add clarity to our communication. Consider a
linear, artificial molecule with 5 physical atoms (not gromacs-atoms). To
describe this artificial molecule in a shell molecular dynamics simulation,
we need to define 10 gromacs-particles: 5 of the gromacs-atom type (marked
with an A in the topology) and 5 of the gromacs-shell type (marked with an
S in the topology). Each gromacs-shell will be "attached" to a
corresponding gromacs-atom as discussed in (1) below.
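To make this concrete, here is roughly what I have in mind for the particle
definitions (a minimal sketch; the type names, charges, masses, and
Lennard-Jones parameters are made-up placeholders, not values from any real
force field):

[ atomtypes ]
; name  at.num    mass   charge  ptype  sigma  epsilon
  CX         6  12.011    1.000      A  0.350    0.276  ; gromacs-atom (core)
  CXS        0   0.000   -1.000      S  0.000    0.000  ; gromacs-shell

[ atoms ]
; nr  type  resnr  res  atom  cgnr  charge    mass
   1    CX      1  MOL    C1     1   1.000  12.011  ; physical atom 1
   2   CXS      1  MOL   C1S     1  -1.000   0.000  ; shell attached to 1
   3    CX      1  MOL    C2     2   1.000  12.011
   4   CXS      1  MOL   C2S     2  -1.000   0.000
; ... continuing through nr 9 (C5) and nr 10 (C5S)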
(1) As you clarified earlier, I create a [ polarization ] directive
specifying gromacs-atom/gromacs-shell pairs. For each entry in the
directive, I specify the function number as 1 and the polarizability in
units of nm^3. This "attaches" the gromacs-shell to the gromacs-atom with
a harmonic force. This interaction is not considered a "bonded
interaction," meaning that the resulting gromacs-shell/gromacs-atom
connectivity is not considered when excluding interactions at the distance
specified in the moleculetype directive. Only the network defined by the
gromacs-atom/gromacs-atom connectivity is considered when excluding
interactions. Assuming the exclusion distance is three bonds and I have
specified scaled 1-4 interactions (via the defaults directive settings and
pairs directive entries), what happens with the gromacs-particle
interactions as a function of distance? (See the topology sketch after the
list below.)
1 bond away: gromacs-particle/gromacs-particle non-bonded interactions
excluded.
2 bonds away: gromacs-particle/gromacs-particle non-bonded interactions
excluded.
3 bonds away: gromacs-particle/gromacs-particle non-bonded interactions
included but scaled by 0.5. Am I correct in assuming that
gromacs-atom/gromacs-atom, gromacs-atom/gromacs-shell, and
gromacs-shell/gromacs-shell coulomb interactions are all scaled down by a
factor of 0.5?
4 bonds away: gromacs-particle/gromacs-particle non-bonded interactions
included at full strength (gromacs-atom/gromacs-atom,
gromacs-atom/gromacs-shell, and gromacs-shell/gromacs-shell).
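To pin down what I mean in (1), here is the sketch referenced above
(physical atoms are gromacs-particles 1, 3, 5, 7, 9 in the listing; the
polarizability and the OPLS-style fudge factors are placeholder values):

[ defaults ]
; nbfunc  comb-rule  gen-pairs  fudgeLJ  fudgeQQ
       1          3        yes      0.5      0.5

[ moleculetype ]
; name  nrexcl
   MOL       3

[ polarization ]
; atom  shell  funct  alpha (nm^3)
     1      2      1       0.00068
     3      4      1       0.00068
     5      6      1       0.00068
     7      8      1       0.00068
     9     10      1       0.00068

[ pairs ]
; 1-4 pairs between physical atoms (three bonds apart)
     1      7      1
     3      9      1

Whether the shells need their own [ pairs ] entries (or exclusions) is
exactly what I am unsure about.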
(2) Do I need to create .gro coordinates for the 10 gromacs-particles or
just the 5 gromacs-atoms? If I need to declare the coordinates for all 10
gromacs-particles, can I specify each gromacs-shell particle with the same
coordinates as the gromacs-atom particle it is attached to?
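For example, would a .gro file along these lines be acceptable, with each
shell placed directly on top of its core (all names and coordinates are
placeholders)?

linear 5-atom test molecule with shells
10
    1MOL     C1    1   0.000   0.000   0.000
    1MOL    C1S    2   0.000   0.000   0.000
    1MOL     C2    3   0.150   0.000   0.000
    1MOL    C2S    4   0.150   0.000   0.000
    1MOL     C3    5   0.300   0.000   0.000
    1MOL    C3S    6   0.300   0.000   0.000
    1MOL     C4    7   0.450   0.000   0.000
    1MOL    C4S    8   0.450   0.000   0.000
    1MOL     C5    9   0.600   0.000   0.000
    1MOL    C5S   10   0.600   0.000   0.000
   2.00000   2.00000   2.00000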
(3) Now, according to the manual, Thole polarization is intended to
screen/damp/diminish coulomb interactions in shell molecular dynamics
simulations. The wording suggests that Thole polarization only impacts
gromacs-shell/gromacs-shell interactions (not gromacs-atom/gromacs-atom or
gromacs-atom/gromacs-shell interactions). Am I correct? Each entry in
the associated directive requires listing 4 numbers. The first two numbers
specify a gromacs-atom and its attached gromacs-shell. The last two
numbers specify a different gromacs-atom and its attached gromacs-shell.
Given my assumption in (1) above, I would only need to include these
entries for all 1-4, 1-5, etc. intramolecular gromacs-atom/gromacs-shell
pairs. I assume that 1-4 interactions are still diminished by 0.5. Am I
correct?
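In case it helps pin down my question, here is my best guess at the
directive's format, patterned after the other interaction tables in the
manual: two core/shell pairs, a function number, the Thole screening factor
a, and the two polarizabilities (all values below are placeholders):

[ thole_polarization ]
; core_i  shell_i  core_j  shell_j  funct    a  alpha_i  alpha_j
       1        2       7        8      1  2.6  0.00068  0.00068  ; 1-4 pair
       3        4       9       10      1  2.6  0.00068  0.00068  ; 1-4 pair
       1        2       9       10      1  2.6  0.00068  0.00068  ; 1-5 pair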
Best,
Eric
On Tue, Jun 26, 2018 at 8:40 AM, Justin Lemkul <jalemkul at vt.edu> wrote:
>
>
> On 6/26/18 10:37 AM, Eric Smoll wrote:
>
>> Justin,
>>
>> In a previous thread on core/shell optimization (
>> https://mailman-1.sys.kth.se/pipermail/gromacs.org_gmx-users/2014-August/091101.html)
>> you state:
>>
>> "No, I've made wholesale changes to the code in the way that energies and
>> forces
>> are computed. The current code has bugs. If you want a modified version
>> (beta
>> software, so caveat emptor), contact me off-list."
>>
>> Is this still true? Is core/shell optimization broken in GROMACS 2018.1?
>> If so, would you mind sharing your Drude implementation?
>>
>
> Check out git master and switch to the drude branch. Don't try to use
> domain decomposition, only OpenMP for parallelization. Documentation is
> unfortunately somewhat sparse but the code is reasonably commented.
>
> Note that everything I have done is for extended Lagrangian dynamics; I
> haven't tested much with massless shells or use of [ polarization ]
> directives.
>
> -Justin
>
>
>> Best,
>> Eric
>>
>> On Mon, Jun 18, 2018 at 1:14 PM, Justin Lemkul <jalemkul at vt.edu> wrote:
>>
>>
>>> On 6/18/18 4:05 PM, Eric Smoll wrote:
>>>
>>>> Justin,
>>>>
>>>> Thank you so much for the rapid and clear reply! Sorry to ask for a bit
>>>> more clarification.
>>>>
>>>> The thole_polarization directive isn't in the manual at all. Is it
>>>> structured the same way as the [ polarization ] directive in the manual:
>>>> [ thole_polarization ]
>>>> ; Atom i j type alpha
>>>> 1 2 1 0.001
>>>>
>>>> If I want Thole corrections, am I correct in assuming that I should
>>>> list *all shells* in the system under this thole_polarization directive
>>>> with (as you pointed out) "i" or "j" as the shell? If "i" is the shell,
>>>> "j" is the core. If "j" is the shell, "i" is the core.
>>>>
>>> You have to list everything explicitly, including the shielding factor,
>>> and between dipole pairs (Thole isn't between an atom and its shell,
>>> it's between neighboring dipoles). I honestly don't know what the format
>>> is; I completely re-wrote the Thole code for our Drude implementation
>>> (still not officially incorporated into a release due to DD issues, but
>>> we're close to a fix...)
>>>
>>>> The code for "init_shell_flexcon" was very helpful. Thank you!
>>>>
>>>> nstcalcenergy must be set to 1. The code says that domain
>>>> decomposition is not supported, so multi-node MPI calculations are not
>>>> allowed. I can still use an MPI-enabled GROMACS executable on a single
>>>> node for shell MD, correct? Thread parallelization is still permitted,
>>>> correct?
>>>>
>>> Presumably you're limited to OpenMP, but again I have no idea about
>>> this code. I've never actually used it.
>>>
>>>
>>> -Justin
>>>
>>> --
>>> ==================================================
>>>
>>> Justin A. Lemkul, Ph.D.
>>> Assistant Professor
>>> Virginia Tech Department of Biochemistry
>>>
>>> 303 Engel Hall
>>> 340 West Campus Dr.
>>> Blacksburg, VA 24061
>>>
>>> jalemkul at vt.edu | (540) 231-3129
>>> http://www.thelemkullab.com
>>>
>>> ==================================================
>>>
> --
> ==================================================
>
> Justin A. Lemkul, Ph.D.
> Assistant Professor
> Virginia Tech Department of Biochemistry
>
> 303 Engel Hall
> 340 West Campus Dr.
> Blacksburg, VA 24061
>
> jalemkul at vt.edu | (540) 231-3129
> http://www.thelemkullab.com
>
> ==================================================
>