[gmx-users] RE : Questions about REMD calculations
ABEL Stephane 175950
Stephane.ABEL at cea.fr
Thu Oct 14 22:54:51 CEST 2010
Dear all,
Thank you very much, Chris, Xavier and ms ;) for your comments. I will take your suggestions into account in my proposal (for GENCI, Chris ;)).
See you soon,
Stéphane
----------------------------------------------------------------------
Message: 1
Date: Thu, 14 Oct 2010 17:22:38 +0200
From: Florian Dommert <dommert at icp.uni-stuttgart.de>
Subject: Re: [gmx-users] Re: g_velacc problem (Florian Dommert)
To: Discussion list for GROMACS users <gmx-users at gromacs.org>
Message-ID: <4CB7203E.1020307 at icp.uni-stuttgart.de>
Content-Type: text/plain; charset=ISO-8859-1
On 10/14/2010 04:41 PM, Eudes Fileti wrote:
> Dear Florian, thanks for the help. I wonder about just one more thing.
> Is it possible to obtain the lateral diffusion coefficient in a specific
> plane (say xy) using g_velacc? Or is it only possible with g_msd?
I am not sure, but the MSD is essentially the integrated VACF, so if you
modify the code accordingly it should work. The modification shouldn't be
that hard, because you just have to exclude one direction of the velocity
from the sum.
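If you would rather post-process than touch the g_velacc source, here is a
minimal sketch of the same idea. It rests on my assumptions: the VACF was
built from the x and y velocity components only (exactly the code change
above), it is unnormalized, and it sits in a plain two-column time/ACF file
such as a stripped .xvg in GROMACS units. Treat it as a sketch, not as how
g_velacc or g_msd actually work internally.

# Green-Kubo sketch: D = (1/d) * integral_0^inf <v(0).v(t)> dt,
# with d = 2 for lateral diffusion, valid only if the VACF contains
# the x and y velocity components alone.
import numpy as np

def lateral_diffusion_from_vacf(path, dim=2):
    """Integrate an unnormalized VACF (time in ps, ACF in nm^2/ps^2)."""
    rows = []
    with open(path) as fh:
        for line in fh:
            if not line.strip() or line.startswith(('#', '@')):
                continue  # skip .xvg header lines and blanks
            t, c = line.split()[:2]
            rows.append((float(t), float(c)))
    data = np.asarray(rows)
    time, vacf = data[:, 0], data[:, 1]
    # trapezoidal integration of the VACF over time -> nm^2/ps
    integral = float(np.sum(0.5 * (vacf[1:] + vacf[:-1]) * np.diff(time)))
    d_lateral = integral / float(dim)   # nm^2/ps
    return d_lateral * 1e-2             # 1 nm^2/ps = 1e-2 cm^2/s

# e.g. print(lateral_diffusion_from_vacf("vacf_xy.xvg"))  # file name assumed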
/Flo
> Best,
> eef
> _______________________________________
> Eudes Eterno Fileti
> Condensed Matter Physics
> Computational Simulation of Nanostructures via Molecular Dynamics
--
Florian Dommert
Dipl.-Phys.
Institute for Computational Physics
University Stuttgart
Pfaffenwaldring 27
70569 Stuttgart
Phone: +49(0)711/685-6-3613
Fax: +49-(0)711/685-6-3658
EMail: dommert at icp.uni-stuttgart.de
Home: http://www.icp.uni-stuttgart.de/~icp/Florian_Dommert
------------------------------
Message: 2
Date: Thu, 14 Oct 2010 17:54:02 +0200
From: "ABEL Stephane 175950" <Stephane.ABEL at cea.fr>
Subject: [gmx-users] Questions about REMD calculations
To: <gmx-users at gromacs.org>
Message-ID:
<F654B3EE96986E4B8DC6EF0919C88DA301038BE9 at LODERI.intra.cea.fr>
Content-Type: text/plain; charset="iso-8859-1"
Dear all,
I come back to you with several questions about the future replica-exchange calculations that I would like to perform. The system of interest contains 12 peptides (7 residues each) and about 40000 water molecules; it comes from a previous MD simulation performed in the NPT ensemble. With this system, I would like to study the aggregation process between the peptides.
After reading several papers about the REMD method and playing with the "Temperature generator for REMD-simulations" web server (http://folding.bmc.uu.se/remd/), I suspect that this system is too big for REMD. Indeed, if I use the following parameters in the web server:
Pdes 0.2
Temperature range 290 - 600
Number of water molecules 41380
Number of protein atoms 1092
Including all H ~ 1656
Number of hydrogens in protein ~ 240
Number of constraints ~ 1092
Number of vsites ~ 0
Number of degrees of freedom ~ 250464
Energy loss due to constraints 520.59 (kJ/mol K)
I obtain 271 replicas (ouch!). If I assume approximately 16 CPUs for each replica, the simulations will be too big and will cost a lot of CPU time.
So my question is: can I safely reduce the number of water molecules in the system to reduce the number of replicas?
For example, with 10000 water molecules the number of replicas would be 135, which is not bad. Is this a good option to overcome this limitation?
I have also read that the number of replicas can be significantly reduced by using variants of REMD, for example replica exchange with solute tempering (REST) from Berne and co-workers. Is this method implemented in GROMACS?
Or can I use REMD in implicit solvent, for example with the coarse-grained OPEP force field as described in Chebaro et al. (2008), J. Phys. Chem. B 113(1): 267-274, or by Wang and Voth, J. Phys. Chem. B 112(41): 13079-13090?
Any advice and comments are welcome.
Stéphane
------------------------------
Stéphane Abel, PhD
CEA Saclay DSV/IBITEC-S/SB2SM
91191 Saclay, FRANCE
website: http://www.st-abel.com
------------------------------
------------------------------
Message: 3
Date: Thu, 14 Oct 2010 12:16:35 -0400
From: chris.neale at utoronto.ca
Subject: [gmx-users] Questions about REMD calculations
To: gmx-users at gromacs.org
Message-ID: <20101014121635.bg1lkn1bswkks4s4 at webmail.utoronto.ca>
Content-Type: text/plain; charset=ISO-8859-1; DelSp="Yes";
format="flowed"
Stéphane,
Why are you hesitant to reduce the size of your system? If you can
still address the same questions with a smaller system, then I'd say
the answer is nearly always to use the smaller system (REMD or not).
Besides, my gut feeling is that even if you had 271*16 CPUs
available, you would not get enough exchanges within a reasonable
wall-clock time to get a converged answer. I say this because you're
basically looking at a protein-folding problem over 12*7 residues
(larger than any system for which I have seen REMD successfully
reported), and you also have a big entropy problem because the 7-mers
are not tethered together, so this might be even harder than protein
folding. Sure, there could be some help from symmetry, but I assume
that you don't know that yet, or else why do REMD at all?
I am personally highly suspicious of REST: what ensemble does this
method sample, exactly?
Implicit solvent might be an option, but only if your implicit-solvent
aggregation energy landscape has important minima in the same places as
the explicit-solvent energy landscape, and how would you know that?
So I suggest that you make your box smaller and look at 2 peptides in
a box of water (try to get even fewer than 10K waters if you can).
Besides, this is your first REMD, right? That's another reason to keep
things simple at the start. If you get good results from this, you can
publish them and then move on to the more complex system if you
believe that you can converge it. Perhaps with your first results in
hand, you could get an allocation from GENCI (Grand Equipement
National de Calcul Intensif), or some similar HPC organization in
France, to compute the second part.
Chris.
------------------------------
Message: 5
Date: Thu, 14 Oct 2010 10:32:44 -0600
From: XAvier Periole <x.periole at rug.nl>
Subject: Re: [gmx-users] Questions about REMD calculations
To: Discussion list for GROMACS users <gmx-users at gromacs.org>
Message-ID: <A755F6B0-59AC-4737-8796-5A6200636FD2 at rug.nl>
Content-Type: text/plain; charset=ISO-8859-1; format=flowed; delsp=yes
Well, indeed, the more atoms (degrees of freedom) in your system, the
more replicas you need to ensure exchanges between temperatures ...
I would say that even 135 replicas * 16 CPUs = 2160 CPUs is still
an enormous amount of CPU time for 12 small peptides :))
Do you need to go up to 600 K? You can be sure that at that temperature
the force field is not accurate and the peptides will just go crazy ...
What about keeping the temperature below 400 K? You will still
increase your sampling considerably but need far fewer CPUs! You can
convince yourself by simulating your system at 400 K and 600 K and
looking at its behavior ...
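If you want a rough feel for the numbers, here is a little
back-of-the-envelope sketch. It is only an assumed scaling model
(geometric temperature ladder, energy fluctuations growing like
sqrt(Ndof), an arbitrary prefactor), not what the Uppsala web server
actually does, so treat its output as a scaling estimate only.

# Assumed model: relative temperature step between neighbouring replicas
# ~ spacing_factor / sqrt(Ndof) for a roughly constant exchange probability,
# so the ladder from t_min to t_max is geometric.
import math

def estimate_n_replicas(t_min, t_max, n_dof, spacing_factor=2.0):
    # spacing_factor is a crude constant tied to the target acceptance ratio
    rel_step = spacing_factor / math.sqrt(n_dof)
    return math.ceil(math.log(t_max / t_min) / math.log1p(rel_step)) + 1

print(estimate_n_replicas(290, 600, 250464))  # ~180 with these crude assumptions
                                              # (the web server, more careful, says 271)
print(estimate_n_replicas(290, 400, 250464))  # ~80 for the narrower temperature range
# Fewer waters -> fewer degrees of freedom -> wider relative steps -> fewer replicas.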
Then, I am not a fan of implicit solvents, so I'll pass on that. As
for coarse-grained force fields, the OPEP FF seems to be fine for your
application. You can also look at the ones from Deserno's group
(Bereau 2009) and from Feig's group (PRIMO: Gopal 2010). I am not sure
how transferable the one from Voth is ... the others are :))
XAvier.
------------------------------
Message: 6
Date: Thu, 14 Oct 2010 18:42:55 +0100
From: ms <devicerandom at gmail.com>
Subject: Re: [gmx-users] Questions about REMD calculations
To: gmx-users at gromacs.org
Message-ID: <4CB7411F.50606 at gmail.com>
Content-Type: text/plain; charset=ISO-8859-1; format=flowed
On 14/10/10 17:32, XAvier Periole wrote:
> Then, I am not a fan of implicit solvents, so I'll pass on that. As
> for coarse-grained force fields, the OPEP FF seems to be fine for your
> application. You can also look at the ones from Deserno's group
> (Bereau 2009) and from Feig's group (PRIMO: Gopal 2010). I am not sure
> how transferable the one from Voth is ... the others are :))
Is OPEP available as a Gromacs FF? I suspect that it is not trivial to
port to Gromacs: it includes explicit directional (angle-dependent)
hydrogen-bond interactions, for example, which to my knowledge Gromacs
does not support.
I'd like it to be available, though.
m.
------------------------------
Message: 7
Date: Thu, 14 Oct 2010 12:04:42 -0600
From: XAvier Periole <x.periole at rug.nl>
Subject: Re: [gmx-users] Questions about REMD calculations
To: Discussion list for GROMACS users <gmx-users at gromacs.org>
Message-ID: <2D97ADAF-1CDD-4875-91FA-74269BEFEACF at rug.nl>
Content-Type: text/plain; charset=US-ASCII; format=flowed; delsp=yes
On Oct 14, 2010, at 11:42 AM, ms wrote:
> Is OPEP available as a Gromacs FF? I suspect that it is not trivial to
> port to Gromacs: it includes explicit directional (angle-dependent)
> hydrogen-bond interactions, for example, which to my knowledge Gromacs
> does not support.
> I'd like it to be available, though.
They have their own code, and I am sure they would be happy to share it!
------------------------------
End of gmx-users Digest, Vol 78, Issue 103
******************************************