[gmx-users] Fwd: KALP-15 in DPPC Tutorial Step 0 Segmentation Fault
Justin A. Lemkul
jalemkul at vt.edu
Sat Mar 12 18:51:40 CET 2011
Steve Vivian wrote:
> Based on a preliminary test using multiple threads, the issue is not
> resolved.
> This leads me to believe that my Unit Cell is not built properly.
>
> Below is the procedure used to build the unit cell. I have reviewed it many
> times, but I would appreciate any input regarding potential improvements,
> specifically on the line using trjconv in the EM/Shrink loop.
>
> Safe up to here (I hope)...
>
> cat KALP_newbox.gro dppc128_whole.gro > system.gro
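A caveat on this step: a bare cat leaves two .gro title/atom-count headers
and two box lines in system.gro, which grompp will reject until the extra
lines are deleted and the atom count is fixed by hand. A minimal sketch of
that bookkeeping in shell (file names from the thread, title string assumed):

  #!/bin/bash
  # A .gro file is: title line, atom count, atom records, box line.
  # Keep one title, sum the two counts, keep only the membrane's box.
  p=KALP_newbox.gro
  l=dppc128_whole.gro
  np=$(sed -n 2p "$p")            # protein atom count (line 2)
  nl=$(sed -n 2p "$l")            # lipid atom count (line 2)
  {
    echo "KALP-15 in DPPC"        # new title line (assumed)
    echo $(( np + nl ))           # combined atom count
    sed -e 1,2d -e '$d' "$p"      # protein atoms, header and box dropped
    sed -e 1,2d -e '$d' "$l"      # lipid atoms, header and box dropped
    tail -n 1 "$l"                # the membrane's box vectors
  } > system.gro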
>
> update minim.mdp with "define = -DSTRONG_POSRES" and add to topol.top:
> ; Strong position restraints for InflateGRO
> #ifdef STRONG_POSRES
> #include "strong_posre.itp"
> #endif
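The #ifdef only fires when grompp sees the matching define, i.e. (assuming
the tutorial's layout):

  ; in minim.mdp
  define = -DSTRONG_POSRES

Without that line the strong restraints are silently skipped and the protein
is free to drift during the shrinking EMs.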
>
> Create Strong Position Restraint for protein
> genrestr -f KALP_newbox.gro -o strong_posre.itp -fc 100000 100000 100000
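For reference, the resulting strong_posre.itp holds a [ position_restraints ]
section with one entry per atom of the selected group, roughly:

  [ position_restraints ]
  ;  i  funct       fcx        fcy        fcz
     1      1     100000     100000     100000
     2      1     100000     100000     100000
  ; ... one line per restrained atom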
>
> Scale Lipid positions by a factor of 4
> perl inflategro.pl system.gro 4 DPPC 14 system_inflated.gro 5 area.dat
>
> Begin loop of repeated Energy Minimizations and Shrinking
> (repeat the loop approximately 25 times, until the area per lipid is approx. 71 Å²)
> Begin LOOP (from n=1 to n = 26)
> grompp -f minim.mdp -c system_inf_n.gro -p topol.top -o em_n.tpr
> mdrun -v -deffnm em_n
> trjconv -s em_n.tpr -f em_n.gro -o em_n_out.gro -pbc mol -ur compact
> perl inflategro.pl em_n_out.gro 0.95 DPPC 0 sys_shr_1.gro 5 ar_shr1.dat
> End LOOP

One problem here: you start every pass of the loop from system_inf_n. What is
system_inf_n? It seems you should do one initial (non-loop) shrink and then
feed each subsequent shrinking step the output of the previous one. As
written, every pass writes to sys_shr_1.gro, which then never gets used
again.

-Justin
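A minimal sketch of the corrected wiring (file names illustrative, not
Steve's exact ones): minimize the inflated system once outside the loop, then
let each pass shrink the minimized output of the previous pass:

  #!/bin/bash
  # One EM of the inflated system before any shrinking
  grompp -f minim.mdp -c system_inflated.gro -p topol.top -o em_0.tpr
  mdrun -v -deffnm em_0
  echo 0 | trjconv -s em_0.tpr -f em_0.gro -o em_0_out.gro -pbc mol -ur compact  # 0 = System

  prev=em_0_out.gro
  for n in $(seq 1 26); do
      # shrink the previous minimized structure by 5%, then re-minimize it
      perl inflategro.pl "$prev" 0.95 DPPC 0 sys_shr_${n}.gro 5 ar_shr_${n}.dat
      grompp -f minim.mdp -c sys_shr_${n}.gro -p topol.top -o em_${n}.tpr
      mdrun -v -deffnm em_${n}
      echo 0 | trjconv -s em_${n}.tpr -f em_${n}.gro -o em_${n}_out.gro -pbc mol -ur compact
      prev=em_${n}_out.gro    # the next pass consumes this structure
  done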
>
> Add water
> Add ions
> Re-run EM
> Equilibrate (and watch it all explode)
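In 4.5-era commands those last steps might look like the following (a sketch;
input names illustrative, and the tutorial also swaps in a modified
vdwradii.dat before solvating so that water cannot slip into the bilayer):

  genbox -cp em_final.gro -cs spc216.gro -o system_solv.gro -p topol.top
  grompp -f minim.mdp -c system_solv.gro -p topol.top -o ions.tpr
  genion -s ions.tpr -o system_ions.gro -p topol.top -neutral   # choose the SOL group at the prompt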
>
>
> -----Original Message-----
> From: gmx-users-bounces at gromacs.org [mailto:gmx-users-bounces at gromacs.org] On Behalf Of Justin A. Lemkul
> Sent: Thursday, March 10, 2011 12:56 PM
> To: Discussion list for GROMACS users
> Subject: Re: [gmx-users] Fwd: KALP-15 in DPPC Tutorial Step 0 Segmentation Fault
>
> Steve Vivian wrote:
>> On 03/08/2011 10:23 PM, Justin A. Lemkul wrote:
>>>
>>> Steve Vivian wrote:
>>>> New to Gromacs.
>>>>
>>>> Worked my way through the tutorial with relatively few issues until
>>>> the Equilibration stage. My system blows up!!
>>>>
>>>> Returned to the Topology stage and rebuilt my system, ensuring that I
>>>> followed the procedure correctly for the InflateGRO process. It
>>>> appears to be correct: reasonable area per lipid, no water inside my
>>>> bilayer, and VMD shows a structure that appears normal (although I am
>>>> new to this). There are voids between the bilayer and the water
>>>> molecules, but this is to be expected, correct?
>>>>
>>>> Energy Minimization repeatedly produces results within the expected
>>>> range.
>>>>
>>>> Again the system blows up at equilibration with a step 0 segmentation
>>>> fault, regardless of whether I attempt the NVT or Anneal_Npt process
>>>> (using the provided .mdp files, including the updates for restraints
>>>> on the protein and the lipid molecules).
>>>>
>>>> I have attempted many variations of the nvt.mdp and anneal_npt.mdp
>>>> files hoping to resolve my issue, but with no success. I will post
>>>> the log output from the run with the nvt.mdp file included in the
>>>> tutorial.
>>>>
>>>> Started mdrun on node 0 Tue Mar 8 15:42:35 2011
>>>>
>>>>            Step           Time         Lambda
>>>>               0        0.00000        0.00000
>>>>
>>>>    Grid: 9 x 9 x 9 cells
>>>>    Energies (kJ/mol)
>>>>        G96Angle    Proper Dih.  Improper Dih.          LJ-14     Coulomb-14
>>>>     8.52380e+01    6.88116e+01    2.23939e+01   -3.03546e+01    2.71260e+03
>>>>         LJ (SR)  Disper. corr.   Coulomb (SR)   Coul. recip. Position Rest.
>>>>     1.49883e+04   -1.42684e+03   -2.78329e+05   -1.58540e+05    2.57100e+00
>>>>       Potential    Kinetic En.   Total Energy  Conserved En.    Temperature
>>>>    -4.20446e+05   *1.41436e+14    1.41436e+14    1.41436e+14    1.23343e+12*
>>>>  Pres. DC (bar) Pressure (bar)   Constr. rmsd
>>>>    -1.56331e+02    5.05645e+12    1.18070e+01
>>>>
>>>>
>>>> As you can see, the Potential Energy is reasonable, but the Kinetic
>>>> Energy and Temperature are clearly unphysical.
>>>>
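Those two numbers are at least mutually consistent: classically
T = 2*E_kin/(N_df*R), so E_kin ~ 1.4e+14 kJ/mol spread over a number of
degrees of freedom of order 10^4 gives T ~ 10^12 K, just as the log reports.
The atoms really are moving at absurd velocities; this is the classic
signature of a blow-up, not a reporting artifact.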
>>>> I am hoping that this is enough information for a more experienced
>>>> Gromacs user to provide guidance. Note that I have tried all of the
>>>> suggestions that I read on the mailing list and in the "blowing up"
>>>> section of the manual, specifically:
>>>> -reduced time steps in the Equilibration stages
>>>> -reduced Fmax during the EM stage (down as low as 100 kJ/mol/nm,
>>>>  which did not help)
>>>> -modified neighbour list parameters
>>>>
>>>> Any help is appreciated. I can attach and forward any further
>>>> information as required, please let me know.
>>>>
>>> Which Gromacs version are you using? It looks like you're running in
>>> serial, is that correct? Otherwise, please provide your mdrun command
>>> line. If you're using version 4.5.3 in serial, I have identified a very
>>> problematic bug, affecting a wide variety of systems, that could be
>>> related:
>>>
>> Yes I am currently using Gromacs 4.5.3 in serial.
>>
>>> http://redmine.gromacs.org/issues/715
>>>
>>> I have seen even the most robust tutorial systems fail as well, as
>>> some new lab members experienced the same problem. The workaround is
>>> to run in parallel.
>>>
>> If I understand you correctly, the recommended workaround is to
>> re-configure gromacs 4.5.3 with mpi enabled and complete the
>> Equilibration and Production simulation in parallel.
>>
>
> Strictly speaking, an external MPI library is no longer required. Gromacs now
> builds with internal threading support (as long as your hardware and compilers
> support such features). In fact, thread support is built by default if
> possible, so if your mdrun has an -nt flag, you don't need to do anything else
> except use "mdrun -nt (number of threads)" when running your command.
>
>> Do you have a recommendation for which MPI library to install (LAM/MPI
>> seems to be referenced in other articles on the mailing list)?
>>
>
> I've had good luck with OpenMPI in the past, but this is not strictly
> necessary in all cases.
>
>> Are there documented installation procedures for this process (upgrading
>> to Gromacs with MPI enabled)?
>>
>
> http://www.gromacs.org/Downloads/Installation_Instructions#Using_MPI
>
> -Justin
>
>> Thanks for your assistance.
>> Steve.
>>
>>> -Justin
>>>
>>>> Regards,
>>>> Steve Vivian.
>>>> svivian at uwo.ca
>>>>
>
--
========================================
Justin A. Lemkul
Ph.D. Candidate
ICTAS Doctoral Scholar
MILES-IGERT Trainee
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
========================================