[gmx-users] implicit solvent LINCS errors
Michael D. Daily
mdaily.work at gmail.com
Tue May 31 03:45:45 CEST 2011
Thanks for the recommendations, everyone. I tried all of the mdp changes
recommended by Justin (increase rlist, rvdw, etc. to 2.0; change
T-coupling to V-rescale; and eliminate P-coupling). When I increased the
distance cutoffs, it ran for about 30 ps before crashing instead of
crashing immediately. Only when I also turned off P-coupling did it keep
running long-term (it's now up to 500K+ steps / 1 ns), so apparently the
problem was a lack of control over the system volume. Reducing the
timestep from 2 fs to 1 fs did not work by itself and ultimately was not
necessary to get it to run.
Josh, thanks for the physical explanation of why the protein might
explode. My starting structure was a minimized crystal structure, so I
wouldn't call it "extended." I'll try again with 4.5.4 before I scale
this up any further.
Mike
On 5/30/2011 8:02 PM, Mark Abraham wrote:
> On 31/05/2011 10:54 AM, Justin A. Lemkul wrote:
>>
>>
>> Joshua L. Phillips wrote:
>>> I've found that I often get LINCS warnings like this when starting from
>>> highly extended conformations in implicit solvent. The GBSA
>>> surface tension, combined with the lack of viscosity (due to the absence
>>> of explicit water), allows the protein to change conformation much
>>> faster than LINCS likes by default.
>>>
>>> Normally, I just run a short vacuum simulation (keep the same settings
>>> as Justin suggested, but set GBSA = No) to let the system relax a little
>>> more before starting a GBSA run; EM alone is often just not enough.
>>> Usually only about 50 ps is needed. Also, doing this with position
>>> restraints can help slow down the collapse, and usually results in just
>>> enough collapse that the following GBSA run will satisfy the default
>>> LINCS settings.
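>>> As a minimal sketch, the vacuum pre-run only needs a few .mdp changes
>>> (this assumes the usual -DPOSRES define from a pdb2gmx-generated
>>> posre.itp; everything else stays as in the GBSA run):
>>>
>>>   define           = -DPOSRES   ; position-restrain the heavy atoms
>>>   implicit_solvent = No         ; plain vacuum, no GBSA
>>>   nsteps           = 25000      ; ~50 ps at dt = 0.002
>>>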
>>>
>>> Another option might be to run a short simulation with GBSA turned on,
>>> but using Langevin dynamics to include some additional friction that
>>> should slow down the initial collapse.
>>>
>>
>> That's generally a good idea. We often use the sd integrator when
>> doing GBSA calculations.
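>> As an illustrative sketch (the values here are generic, not from this
>> thread), the relevant .mdp lines for an sd run would look like:
>>
>>   integrator = sd      ; Langevin dynamics; the friction term does the thermostatting
>>   tc-grps    = System
>>   tau_t      = 1.0     ; inverse friction constant in ps
>>   ref_t      = 300.0
>>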
>>
>>> Also, you could fudge with the environment variables associated with
>>> LINCS, but this seems a little dangerous compared to the above two
>>> suggestions.
>>>
>>
>> I wouldn't say "a little dangerous," I'd say "very dangerous" :) If
>> the constraints are failing, it's not necessarily (or usually) their
>> fault. The system is unstable, and trying to just override this will
>> give you a trajectory, but you should be concerned that this
>> trajectory is being ruled by spurious forces.
>>
>>> I would be interested in peoples' opinions and suggestions on how to
>>> handle this issue as it is quite common when starting from highly
>>> extended structures using GBSA. Some proteins are more susceptible than
>>> others...
>>>
>>
>> Reducing the timestep is a good option. If the dynamics are
>> occurring so fast that the constraints can't keep up, reducing dt can
>> help.
>
> Agreed. Even using all-vs-all kernels, I've lately observed symptoms
> similar to the OP's, with excessive rotation of v-site terminal methyl
> groups. I can generate stable trajectories by starting my
> equilibration with 0.5 fs timesteps and switching later.
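> As a rough sketch of what I mean (the numbers are illustrative):
>
>   dt     = 0.0005    ; 0.5 fs for the first stretch of equilibration
>   nsteps = 100000    ; ~50 ps
>
> and then a second grompp/mdrun pass with dt = 0.002 for production.
>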
>
> Mark
>
>>
>> -Justin
>>
>>> I also tend to find that version 4.5.4 is much more stable than even
>>> 4.5.3 for implicit solvent simulations.
>>>
>>> -- Josh
>>>
>>> On Mon, 2011-05-30 at 18:06 -0500, Michael D. Daily wrote:
>>>> Thanks Justin, this is very helpful. I'll attempt these fixes
>>>> tomorrow.
>>>>
>>>> Mike
>>>>
>>>> On 5/30/2011 5:50 PM, Justin A. Lemkul wrote:
>>>>>
>>>>> Michael D. Daily wrote:
>>>>>> Hi all,
>>>>>>
>>>>>> I'm trying to run implicit solvent calculations in gromacs 4.5
>>>>>> with the charmm forcefield. I am able to minimize successfully
>>>>>> and compile for
>>>>> When troubleshooting, it is always advisable to try the latest
>>>>> version (4.5.4) to see if the problem is reproducible. If a
>>>>> pertinent bug has been fixed, there's no use troubleshooting the
>>>>> broken version.
>>>>>
>>>>>> mdrun, but soon after starting, mdrun complains about excessive
>>>>>> rotation in LINCS (see the error printed below). I also
>>>>>> include my mdp file at the bottom. Can anyone advise me as to
>>>>>> the possible cause of such errors? It is difficult to diagnose
>>>>>> given that grompp worked fine.
>>>>>>
>>>>> For reference:
>>>>>
>>>>> http://www.gromacs.org/Documentation/Terminology/Blowing_Up#Diagnosing_an_Unstable_System
>>>>>
>>>>>
>>>>> If grompp worked, that just means your coordinate and topology
>>>>> matched and there were no internal conflicts within the .mdp
>>>>> file. It is no guarantee that the resulting simulation will
>>>>> actually work, unfortunately.
>>>>>
>>>>>> --- lincs error ---
>>>>>>
>>>>>> Step 1, time 0.002 (ps) LINCS WARNING
>>>>> Typically an instant LINCS failure indicates insufficient
>>>>> minimization. You said you minimized successfully, but what does
>>>>> this mean? What values did you achieve for Fmax and Epot?
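>>>>> For reference, a typical steepest-descent minimization block looks
>>>>> something like this (the emtol value is a common rule of thumb, not
>>>>> specific to your system):
>>>>>
>>>>>   integrator = steep
>>>>>   emtol      = 1000.0   ; stop when Fmax < 1000 kJ mol^-1 nm^-1
>>>>>   emstep     = 0.01     ; initial step size in nm
>>>>>   nsteps     = 50000
>>>>>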
>>>>>
>>>>>> relative constraint deviation after LINCS:
>>>>>> rms 0.000780, max 0.020692 (between atoms 880 and 881)
>>>>>> bonds that rotated more than 30 degrees:
>>>>>> atom 1 atom 2 angle previous, current, constraint length
>>>>>> 606 607 36.7 1.0527 0.1080 0.1080
>>>>>> 614 615 35.6 0.8972 0.1125 0.1111
>>>>>> 614 616 75.3 0.1054 0.1121 0.1111
>>>>>> 880 881 58.0 0.1068 0.1134 0.1111
>>>>>> 880 882 50.4 0.9168 0.1121 0.1111
>>>>>> 889 890 55.3 0.1066 0.1122 0.1111
>>>>>> 889 891 35.3 0.8588 0.1118 0.1111
>>>>>>
>>>>>> ---- mdp file ------------
>>>>>>
>>>>>> ; title and include files
>>>>>> title = 1EX6-S35P_md1
>>>>>> cpp = cpp
>>>>>> include =
>>>>>> -I/home/yoo2/myusr/gromacs-3.3.3/share/gromacs/top/ -I./
>>>>> Any reason you're including files from an ancient Gromacs version?
>>>>>
>>>>>> define =
>>>>>> ; integrator and input/output setting up
>>>>>> integrator = md
>>>>>> nsteps = 1000000 ; 2 ns
>>>>>> ;nsteps = 5000 ; 10 ps
>>>>>> dt = 0.002
>>>>>> nstxout = 5000
>>>>>> nstvout = 5000
>>>>>> nstenergy = 500
>>>>>> nstxtcout = 500
>>>>>> nstlog = 500
>>>>>> xtc_grps = System
>>>>>> energygrps = System
>>>>>> comm_mode = Linear
>>>>>>
>>>>>> ;implicit solvent
>>>>>> implicit_solvent = GBSA
>>>>>> gb_algorithm = Still
>>>>>> gb_saltconc = 0.15
>>>>> FYI, gb_saltconc is nonfunctional. Don't expect it to do anything :)
>>>>>
>>>>>> rgbradii = 1.0
>>>>>>
>>>>>> ; neighbor searching and vdw/pme setting up
>>>>>> nstlist = 10
>>>>>> ns_type = grid
>>>>>> pbc = xyz
>>>>>> ;rlist = 1.4
>>>>>> rlist = 1.0
>>>>>>
>>>>>> ;coulombtype = pme
>>>>>> coulombtype = Cut-Off
>>>>>> fourierspacing = 0.1
>>>>>> pme_order = 6
>>>>>> ;rcoulomb = 1.4
>>>>>> rcoulomb = 1.1
>>>>>>
>>>>>> ;vdwtype = switch
>>>>>> vdwtype = Cut-Off
>>>>>> rvdw_switch = 1.0
>>>>>> ;rvdw = 1.2
>>>>>> rvdw = 1.1
>>>>>>
>>>>> All of these are potentially problematic. Running implicit
>>>>> simulations typically requires longer cutoffs than would normally
>>>>> be needed for explicit solvent simulations. Try
>>>>> rlist=rvdw=rcoulomb=2.0 nm.
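>>>>> In .mdp terms, that change would look like (a sketch of only the
>>>>> lines that differ; rgbradii may need to be raised to match rlist as
>>>>> well):
>>>>>
>>>>>   rlist    = 2.0
>>>>>   rcoulomb = 2.0
>>>>>   rvdw     = 2.0
>>>>>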
>>>>>
>>>>>> ; cpt control
>>>>>> tcoupl = nose-hoover
>>>>> A better choice for initial equilibration would be either
>>>>> V-rescale or Berendsen. I know this can be an issue in explicit
>>>>> solvent, where velocities can oscillate a lot at the outset of a
>>>>> Nose-Hoover simulation and the box can explode; I don't know
>>>>> whether it is as big a deal with implicit solvent, but it can't
>>>>> hurt to try.
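>>>>> That is only a one-line change, e.g.:
>>>>>
>>>>>   tcoupl = V-rescale   ; or Berendsen for initial equilibration
>>>>>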
>>>>>
>>>>>> tc-grps = System
>>>>>> tau_t = 0.4
>>>>>> ref_t = 300.0
>>>>>> Pcoupl = parrinello-rahman
>>>>> I don't know how an implicit box will respond to pressure
>>>>> coupling, but it would be better to try NVT first and see if it's
>>>>> stable, then try NPT and see if things break down.
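>>>>> For the NVT test, that simply means:
>>>>>
>>>>>   Pcoupl = no   ; re-enable pressure coupling only once NVT is stable
>>>>>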
>>>>>
>>>>> One option that might be advantageous is to use the all-vs-all
>>>>> kernels for a speed upgrade. You can accomplish this with:
>>>>>
>>>>> rlist = 0
>>>>> nstlist = 0
>>>>> rvdw = 0
>>>>> rcoulomb = 0
>>>>> rgbradii = 0
>>>>> pbc = no
>>>>> comm-mode = angular
>>>>>
>>>>> You'd have to run with mdrun -pd (particle decomposition), but the
>>>>> end result can be quite fast and you avoid potential periodicity
>>>>> effects.
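>>>>> The invocation would then be something like (the file name is just a
>>>>> placeholder):
>>>>>
>>>>>   mdrun -pd -deffnm md1
>>>>>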
>>>>>
>>>>> -Justin
>>>>>
>>>>
>>>> --
>>>> Michael D. Daily, Ph.D.
>>>> Postdoctoral Fellow
>>>> Qiang Cui group
>>>> Department of Chemistry
>>>> University of Wisconsin-Madison
>>>>
>>>
>>
>
--
Michael D. Daily, Ph.D.
Postdoctoral Fellow
Qiang Cui group
Department of Chemistry
University of Wisconsin-Madison