[gmx-users] pcoupltype
ABEL Stephane
Stephane.ABEL at cea.fr
Wed Oct 31 19:25:57 CET 2018
Just my 2 cents
If I recall correctly, CHARMM36 and other force fields (such as Lipid14) were initially
parametrized and validated using the anisotropic pressure coupling scheme, so it makes sense
to use it instead of semi-isotropic pressure coupling. Moreover, I have found that if the
membrane system is well equilibrated (for instance, > 100 ns of NPT with semi-isotropic
coupling for POPC or DOPC membranes), you can safely switch to the anisotropic pressure
scheme without observing significant deformations of the bilayer. These observations are
based on my own simulations and are of course not general.
Stéphane
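In GROMACS .mdp terms, the switch described above only touches the pressure-coupling block.
A minimal sketch follows; the barostat choice, tau_p and compressibility values are common
illustrative settings, not values taken from these runs:

; stage 1: semi-isotropic equilibration (x/y coupled together, z independent)
pcoupl           = Parrinello-Rahman
pcoupltype       = semiisotropic
tau_p            = 5.0
ref_p            = 1.0 1.0
compressibility  = 4.5e-5 4.5e-5

; stage 2: anisotropic production (box dimensions fluctuate independently)
pcoupl           = Parrinello-Rahman
pcoupltype       = anisotropic
tau_p            = 5.0
ref_p            = 1.0 1.0 1.0 0.0 0.0 0.0
compressibility  = 4.5e-5 4.5e-5 4.5e-5 0.0 0.0 0.0

Setting the off-diagonal ref_p and compressibility entries to zero keeps the box rectangular
while still letting x, y and z scale independently.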
------------------------------
Message: 4
Date: Wed, 31 Oct 2018 15:03:55 +0000
From: "Gonzalez Fernandez, Cristina" <cristina.gonzalezfdez at unican.es>
To: "gmx-users at gromacs.org" <gmx-users at gromacs.org>
Subject: Re: [gmx-users] pcoupltype
Message-ID: <d93ec9bbd100439194fee0e52e147a7b at unican.es>
Content-Type: text/plain; charset="iso-8859-1"
Thank you very much, Kevin and Justin, for the information.
-----Original Message-----
From: gromacs.org_gmx-users-bounces at maillist.sys.kth.se [mailto:gromacs.org_gmx-users-bounces at maillist.sys.kth.se] On behalf of Justin Lemkul
Sent: Wednesday, October 31, 2018 13:53
To: gmx-users at gromacs.org
Subject: Re: [gmx-users] pcoupltype
On 10/30/18 3:42 PM, Kevin Boyd wrote:
> Hi,
>
> For membrane systems you typically want to use semi-isotropic pressure
> coupling. If instead you want to simulate *one* lipid (as a ligand)
> with a protein in solution, you should stick to isotropic pressure
> coupling. I've never heard of any anisotropic pressure coupling
> protocols in equilibrium NPT systems like what you describe.
Indeed, anisotropic coupling is best applied to solids/crystals. In the case of membranes, anisotropic coupling leads to deformation of the box over long periods of time.
-Justin
--
==================================================
Justin A. Lemkul, Ph.D.
Assistant Professor
Virginia Tech Department of Biochemistry
303 Engel Hall
340 West Campus Dr.
Blacksburg, VA 24061
jalemkul at vt.edu | (540) 231-3129
http://www.thelemkullab.com
==================================================
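In .mdp terms, the two recommendations above differ only in pcoupltype and in the number of
ref_p / compressibility values supplied. A minimal sketch with common illustrative values
(not taken from the posts above):

; membrane bilayer: box xy plane and z axis coupled separately
pcoupltype       = semiisotropic
ref_p            = 1.0 1.0
compressibility  = 4.5e-5 4.5e-5

; protein (with at most a single lipid as ligand) in solution: uniform box scaling
pcoupltype       = isotropic
ref_p            = 1.0
compressibility  = 4.5e-5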
------------------------------
Message: 5
Date: Wed, 31 Oct 2018 16:06:22 +0100
From: Ramon Guixà <ramonguixxa at gmail.com>
To: gmx-users at gromacs.org
Subject: Re: [gmx-users] gromacs+CHARMM36+GPUs induces protein-membrane system collapse?
Message-ID:
<CAAc09t4TkS+WYm_PUCeM6t4RjW1YrnVGk-kAoxw7wfTM1uEpcw at mail.gmail.com>
Content-Type: text/plain; charset="UTF-8"
OK, what you say makes sense. Still, I don't understand how this has not been
spotted before: I presume this combination (CHARMM + GROMACS + GPUs) is widely
used, and I have replicated the problem with GROMACS versions 5, 2016 and 2018.
I know I sound a little desperate, but is there any workaround I could use to
proceed until this is resolved?
Ramon
On Wed, Oct 31, 2018 at 3:51 PM Justin Lemkul <jalemkul at vt.edu> wrote:
>
>
> On 10/31/18 10:47 AM, Ramon Guixà wrote:
> > Hi Justin,
> >
> > I do not have access to GPUs with ECC, but I have actually run the same
> > tpr in different machines with different GPUs, including GTX 980, 1060
> > and 1080. In all cases, I get the same problem. Although they are all
> > non-ECC cards, it is quite unlikely that a bit flip is behind this,
> > isn't it?
>
> It certainly is. It wasn't clear whether you were just running over and
> over again on the same card.
>
> > In fact, I had simulated CHARMM-GUI systems with GTX cards before and
> > never had a problem. Since this has only been happening to me recently,
> > I am wondering whether I should blame the new force field (CHARMM36m)
> > or whether CHARMM-GUI changed something in the way they generate
> > GROMACS topologies.
>
> I sincerely doubt it's the force field. We validate that extensively
> across many compounds to verify that GROMACS and CHARMM compute the
> forces the same (I do this myself when we produce a new force field
> port). We don't (can't) do that easily on GPU, but if the forces are
> different, that points to a software bug, because everything about the
> CPU implementations is identical. Given that you have a sudden change in
> system behavior, that further points to a software bug, in my opinion,
> because if something was fundamentally wrong with the force field, (1)
> you would see it on CPU runs and (2) it would almost certainly occur
> much earlier in the simulation.
>
> -Justin
>
> --
> ==================================================
>
> Justin A. Lemkul, Ph.D.
> Assistant Professor
> Virginia Tech Department of Biochemistry
>
> 303 Engel Hall
> 340 West Campus Dr.
> Blacksburg, VA 24061
>
> jalemkul at vt.edu | (540) 231-3129
> http://www.thelemkullab.com
>
> ==================================================
>
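One practical way to act on the CPU/GPU force comparison Justin describes above is to
recompute the energies of an existing trajectory with the nonbonded kernels forced onto the
CPU and onto the GPU, and then compare the results. A sketch, assuming a finished run whose
files are named topol.tpr and traj.xtc (placeholder names):

# recompute energies/forces for each saved frame, nonbondeds on the CPU
gmx mdrun -s topol.tpr -rerun traj.xtc -nb cpu -deffnm rerun_cpu
# same rerun, nonbondeds on the GPU
gmx mdrun -s topol.tpr -rerun traj.xtc -nb gpu -deffnm rerun_gpu
# compare the two energy files term by term
gmx check -e rerun_cpu.edr -e2 rerun_gpu.edr

Energy terms that differ by much more than the last printed digit would point toward the
software side rather than the force field or the starting structure.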
------------------------------
End of gromacs.org_gmx-users Digest, Vol 174, Issue 82
******************************************************