[gmx-users] Split DMPC bilayer
Justin A. Lemkul
jalemkul at vt.edu
Tue Apr 5 00:57:38 CEST 2011
Dr. Ramón Garduño-Juárez wrote:
> Again much obliged for your comments. They are most illustrative...
> I would like to make a final note on the issue of these many e-mails...
> I am sure that GROMACS is fast, but that fast?...
Yes. Your results prove it. With quality hardware, you get great performance.
> For the sake of knowing that we are doing the right things, this is our
> topol.top file, in which we eliminated all POSRES for the Protein and
> DMPC, but not for the WATER...
To what end, I do not know. One generally does not find much use in restraining
water while everything else moves, but syntactically, it is correct.
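For reference, a [ position_restraints ] block wrapped in #ifdef POSRES_WATER
is inert unless the define is actually passed to grompp through the .mdp file,
along the lines of:

   define = -DPOSRES_WATER   ; without this line the water restraints are never applied

so if your production .mdp does not set it, that block costs you nothing.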
> ; Include forcefield parameters
> #include "./gromos53a6_lipid.ff/forcefield.itp"
> ; Include chain topologies
> #include "topol_Protein_chain_A.itp"
> #include "topol_Protein_chain_B.itp"
> #include "topol_Protein_chain_C.itp"
> ; Include water topology
> #include "./gromos53a6_lipid.ff/spc.itp"
> #ifdef POSRES_WATER
> ; Position restraint for each water oxygen
> [ position_restraints ]
> ; i funct fcx fcy fcz
> 1 1 1000 1000 1000
> #endif
> ; Include topology for ions
> #include "./gromos53a6_lipid.ff/ions.itp"
> [ system ]
> ; Name
> [ molecules ]
> ; Compound #mols
> Protein_chain_A 1
> Protein_chain_B 1
> Protein_chain_C 1
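Incidentally, the [ molecules ] section looks truncated in the paste; given the
counts you list just below, I assume it really ends with something like:

   Protein_chain_A     1
   Protein_chain_B     1
   Protein_chain_C     1
   DMPC              123
   SOL              3205

with the species in the same order as they appear in the coordinate file, as
grompp requires.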
> In Protein_chain_A there are 342 atoms
> In Protein_chain_B there are 289 atoms
> In Protein_chain_C there are 715 atoms
> For DMPC there are 123 molecules of 46 atoms each
> For SOL there are 3205 molecules of 3 atoms each
> For a total of 16619 atoms
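Those counts are at least self-consistent: 342 + 289 + 715 = 1346 protein
atoms, 123 * 46 = 5658 DMPC atoms, and 3205 * 3 = 9615 water atoms, which
indeed sum to 16619.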
> I know that this is a medium-sized system, for which I was expecting a
> longer CPU time for a 20 ns MD run.
> I know that there was no "error"; what I meant is that I was surprised
> by the outcome...
> Maybe GROMACS is as fast as it is claimed...
> On 04/04/2011 05:27 p.m., Justin A. Lemkul wrote:
>> Dr. Ramón Garduño-Juárez wrote:
>>> Thank you for your comments after finishing the MD production run for
>>> up to 20 ns...
>>> Since this step was over very quickly, I now have a simple question:
>>> How long, in human time, should a production run last?
>> There is no way to answer that. It depends on the hardware, number of
>> atoms, system load, application of any number of the Gromacs
>> algorithms, .mdp settings...
>>> The production run was carried out on six Intel Xeon(R) E5405
>>> 2.00 GHz processors. The last few lines of md_0_1.log are:
>>> Parallel run - timing based on wallclock.
>>>
>>>                  NODE (s)   Real (s)      (%)
>>> Time:          180685.417 180685.417    100.0
>>>
>>>                  (Mnbf/s)   (GFlops)   (ns/day)  (hour/ns)
>>> Performance:      232.900     12.351      9.564      2.510
>>> Is this correct? In my opinion it should have lasted much longer...
>> Nope, Gromacs is just fast :)
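The arithmetic checks out, too: 180685.417 s of wall time is 2 d 02 h 11 m 25 s,
and 20 ns / (180685.417 s / 86400 s per day) = 9.56 ns/day, which is exactly
the figure in the Performance line above.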
>>> Before reaching this point, this is an update of what we did...
>>> First, we eliminated the SOL_SOL group, so the only special index
>>> group we created was Protein_DMPC.
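(For anyone following along: a merged group like Protein_DMPC is typically
built with make_ndx, roughly:

   make_ndx -f system.gro -o index.ndx
     1 | 13      <- merge the Protein and DMPC groups
     q

where system.gro and the group numbers are placeholders; use whatever numbers
make_ndx actually lists for your system.)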
>>> Since the NVT equilibration failed, we took option #2 of the
>>> "Advanced Troubleshooting" section for the first phase of equilibration.
>>> After this step we proceeded with equilibration phase 2, a 1-ns NPT
>>> equilibration, which ended fine.
>>> Next, we proceeded with a 20 ns production run. Thus, the modified
>>> lines of the .mdp file found in the tutorial page were:
>>> nsteps = 10000000 ; 2 fs * 10000000 = 20000 ps (20 ns)
>>> tc-grps = Protein DMPC SOL
>>> comm-grps = Protein_DMPC SOL
>>> With these instructions, the 20 ns simulation took 2d 02h 11:25.
>>> I believe the "error" comes from the line
>>> constraints = all-bonds, which surely must be changed to
>>> constraints = none or hbonds
>> Why do you say that? What error is occurring? You said your
>> simulations were running fine. You most certainly should not remove
>> constraints if you're sticking with a 2-fs timestep. The system will
>> be unstable without constraints. You might be able to get away with
>> hbonds, but certainly not "none."
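To be concrete, the tutorial's combination is the safe one here; schematically:

   dt                   = 0.002      ; 2 fs timestep...
   constraints          = all-bonds  ; ...is only safe with bonds constrained
   constraint_algorithm = lincs

With constraints = none you would have to drop dt to roughly 0.5-1 fs to keep
bond vibrations stable, which would at least double your run time; hbonds might
hold at 2 fs, but you would be gambling.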
>>> Looking forward to your comments...
>>> Much obliged,
Justin A. Lemkul
ICTAS Doctoral Scholar
Department of Biochemistry
jalemkul[at]vt.edu | (540) 231-9080