[gmx-users] Segmentation Fault (Address not mapped)

darrellk at ece.ubc.ca darrellk at ece.ubc.ca
Sat Jul 18 00:48:02 CEST 2009


Hi Justin,
I froze the graphene sheet because, prior to freezing it, I noticed that
it was vibrating and thought that maybe its vibration was not allowing
the NH3 molecules to adsorb (bond) to it. But after freezing the
graphene sheet, I see that the NH3 molecules are still not bonding
to it. Physical experiments with NH3 and a graphene lattice connected to
electrodes have shown that NH3 does adsorb to graphene, but all I see
are NH3 molecules coming close to the graphene surface and then bouncing
away, which I assume is a result of repulsion between the negatively
charged N atom in the ammonia molecule and the pi electrons in the
graphene lattice. So I am not sure why experiments have shown adsorption,
unless adsorption is occurring as a result of a current flowing through
the graphene structure or as a result of edge effects at the interface
between the electrodes and the graphene lattice.

Could you tell me how freezing differs from position restraining, as
this is not completely clear to me?

I will try position restraining the graphene structure and see if that
resolves my problem.
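
For concreteness, here is roughly how I intend to set up the position
restraints (the force constant and the file name posre_grph.itp below
are just my own placeholders, so please correct me if I have
misunderstood the procedure). First generate a restraint file for the
Grph group defined in index.ndx:

genrestr -f graphene.gro -n index.ndx -fc 1000 1000 1000 -o posre_grph.itp

then include it at the end of the graphene [ moleculetype ] in the
topology:

#ifdef POSRES
#include "posre_grph.itp"
#endif

and finally uncomment
define          =-DPOSRES
in the .mdp file and remove the freezegrps/freezedim lines.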

I have been able to view a trajectory for simulations of fewer than
20,000 time steps and see the frozen graphene lattice and the NH3
molecules floating through space.

Note that the segmentation fault only occurs sometime between 20,000 and
30,000 time steps. Could it be that the "funky" behaviour associated
with freezing would take 20,000+ time steps to cause a segmentation
fault?
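
In case it helps, here is how I plan to inspect the trajectory around
the point of failure (topol.tpr below is just a placeholder for my
actual run input file, so please let me know if there is a better way):

gmxcheck -f mdtraj.trr

and, since with dt = 0.002 ps and nstxout = 500 a frame is written every
1 ps, steps 20,000 to 30,000 correspond to 40-60 ps, so I would dump
those frames for viewing with:

trjconv -f mdtraj.trr -s topol.tpr -b 40 -e 60 -o last_frames.pdb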

Thanks again for your help.

Darrell

>Date: Thu, 16 Jul 2009 16:20:04 -0400
>From: "Justin A. Lemkul" <jalemkul at vt.edu>
>Subject: Re: [gmx-users] Segmentation Fault (Address not mapped)
>To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>Message-ID: <4A5F8B74.6070404 at vt.edu>
>Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
>
>
>darrellk at ece.ubc.ca wrote:
>> Hi Justin,
>> Thanks for the explanation of the difference between EM & equilibration.
>> Since in my model, I: (i) only have the graphene sheet and ammonia
>> molecules spaced reasonably far apart from each other (1332 NH3
>> molecules in a 38x38x38 box) and from the graphene sheet (distance
>> between the closest ammonia molecule and the graphene sheet is greater
>> than the molecular diameter of ammonia - maybe this is too close and
>> could be causing my problem?); (ii) freeze the graphene sheet; I am
>> thinking equilibration is not required in my model. Please let me know
>> if you think I still need to perform equilibration.
>>
>> Yes, the EM did converge satisfactorily. Here is the output from EM:
>> Steepest Descents converged to Fmax < 250 in 61 steps
>> Potential Energy  =  4.6094102e+04
>> Maximum force     =  2.4543298e+02 on atom 1
>> Norm of force     =  7.5803179e+03
>>
>> Is this a reasonable value for FMax?
>>
>
>Your Fmax looks fine.  Why is it necessary to freeze the graphene sheet?  Why
>not use position restraints (to rule out funky behavior of being frozen)?
>
>Did you ever obtain a trajectory with enough frames that you could watch?  What
>happened?
>
>-Justin
>
>> Thanks again for your help.
>>
>> Darrell
>>
>>
>>> Date: Thu, 16 Jul 2009 07:15:12 -0400
>>> From: "Justin A. Lemkul" <jalemkul at vt.edu>
>>> Subject: Re: [gmx-users] Segmentation Fault (Address not mapped)
>>> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>>> Message-ID: <4A5F0BC0.4020200 at vt.edu>
>>> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>>>
>>>
>>>
>>> darrellk at ece.ubc.ca wrote:
>>>> Hi Mark,
>>>> I do not do any equilibration, I only do energy minimization as in the
>>>> "GROMACS Tutorial for Solvation Study of Spider Toxin Peptide". Please
>>>> let me know if I need to do equilibration and what is the difference
>>>> between energy minimization and equilibration as this is not clear to me.
>>>>
>>> Did the EM converge satisfactorily?  What was Fmax?
>>>
>>> Unlike EM, equilibration is an MD process; for a protein system, one generally
>>> position-restrains the protein and allows the surrounding solvent to optimize
>>> around the structure.
>>>
>>> -Justin
>>>
>>>> Here is an mdp file for a run that actually completed successfully:
>>>> title           =Graphene
>>>> ;warnings       =10
>>>> cpp             =cpp
>>>> ;define         =-DPOSRES
>>>> constraints     =none
>>>> integrator      =md
>>>> dt              =0.002 ; ps
>>>> nsteps          =10000
>>>> nstcomm         =100
>>>> nstxout         =100
>>>> ;nstvout                =1000
>>>> nstfout         =0
>>>> nstlog          =100
>>>> nstenergy       =100
>>>> nstlist         =100
>>>> ns_type         =grid
>>>> rlist           =2.0
>>>> coulombtype     =PME
>>>> rcoulomb        =2.0
>>>> vdwtype         =cut-off
>>>> rvdw            =5.0
>>>> fourierspacing  =0.12
>>>> fourier_nx      =0
>>>> fourier_ny      =0
>>>> fourier_nz      =0
>>>> pme_order       =4
>>>> ewald_rtol      =1e-5
>>>> optimize_fft    =yes
>>>>
>>>> ; This section freezes graphene lattice
>>>> energygrps      = Grph NH3
>>>> energygrp_excl  = Grph Grph
>>>> freezegrps      = Grph ; Freeze graphene lattice
>>>> freezedim       = Y Y Y; in all directions
>>>>
>>>> Tcoupl          =berendsen
>>>> tau_t           =0.5    0.5
>>>> tc-grps         =NH3    Grph
>>>> ref_t           =300    300
>>>>
>>>> ;coupl          = parrinello-rahman
>>>> ;tau_p          = 1.5
>>>> ;compressibility = 1.3
>>>> ;ref_p          = 0.061
>>>>
>>>> gen_vel = yes
>>>> gen_temp = 300.0
>>>> gen_seed = 173529
>>>>
>>>> And here is a copy of an mdp file for a run that did not complete
>>>> successfully:
>>>>
>>>> title           =Graphene
>>>> ;warnings       =10
>>>> cpp             =cpp
>>>> ;define         =-DPOSRES
>>>> constraints     =none
>>>> integrator      =md
>>>> dt              =0.002 ; ps
>>>> nsteps          =30000
>>>> nstcomm         =500
>>>> nstxout         =500
>>>> ;nstvout                =1000
>>>> nstfout         =0
>>>> nstlog          =500
>>>> nstenergy       =500
>>>> nstlist         =500
>>>> ns_type         =grid
>>>> rlist           =2.0
>>>> coulombtype     =PME
>>>> rcoulomb        =2.0
>>>> vdwtype         =cut-off
>>>> rvdw            =5.0
>>>> fourierspacing  =0.12
>>>> fourier_nx      =0
>>>> fourier_ny      =0
>>>> fourier_nz      =0
>>>> pme_order       =4
>>>> ewald_rtol      =1e-5
>>>> optimize_fft    =yes
>>>>
>>>> ; This section freezes graphene lattice
>>>> energygrps      = Grph NH3
>>>> energygrp_excl  = Grph Grph
>>>> freezegrps      = Grph ; Freeze graphene lattice
>>>> freezedim       = Y Y Y; in all directions
>>>>
>>>> Tcoupl          =berendsen
>>>> tau_t           =0.5    0.5
>>>> tc-grps         =NH3    Grph
>>>> ref_t           =300    300
>>>>
>>>> ;coupl          = parrinello-rahman
>>>> ;tau_p          = 1.5
>>>> ;compressibility = 1.3
>>>> ;ref_p          = 0.061
>>>>
>>>> gen_vel = yes
>>>> gen_temp = 300.0
>>>> gen_seed = 173529
>>>>
>>>> Please let me know what you think might be the problem.
>>>>
>>>> Thanks
>>>>
>>>> Darrell
>>>>
>>>>> Date: Thu, 16 Jul 2009 09:47:49 +1000
>>>>> From: Mark Abraham <Mark.Abraham at anu.edu.au>
>>>>> Subject: Re: [gmx-users] Segmentation Fault (Address not mapped)
>>>>> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>>>>> Message-ID: <4A5E6AA5.4040809 at anu.edu.au>
>>>>> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>>>>>
>>>>> darrellk at ece.ubc.ca wrote:
>>>>>> Hi Mark,
>>>>>> Yes, I know that the box dimensions are defined in the last line of the
>>>>>> .gro file and I have defined these dimensions as 38 nm x 38 nm x 38 nm
>>>>>> in the .gro file.
>>>>> OK.
>>>>>
>>>>>> I looked through my .gro file to ensure none of the atoms had coordinates
>>>>>> outside the 38x38x38 box. While I was reviewing the file I did notice
>>>>>> that some coordinates had negative values, slightly negative, but
>>>>>> negative none the less. Could this be causing the segmentation fault
>>>>>> between time step 10,000 and time step 30,000? Why wouldn't the
>>>>>> negative coordinates cause a segmentation fault much earlier?
>>>>> The absolute value of the coordinates is irrelevant.
>>>>>
>>>>> Your choice of 2.0 for rcoulomb is likely suboptimal for PME. Some
>>>>> smaller value is probably more efficient, but this will not be the cause
>>>>> of your problem.
>>>>>
>>>>> What is your system preparation regime? (i.e. EM + equilibration)
>>>>>
>>>>> Can you post a corrected and current .mdp file?
>>>>>
>>>>> Mark
>>>>>
>>>>>>> Date: Wed, 15 Jul 2009 16:59:21 +1000
>>>>>>> From: Mark Abraham <Mark.Abraham at anu.edu.au>
>>>>>>> Subject: Re: [gmx-users] Segmentation Fault (Address not mapped)
>>>>>>> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>>>>>>> Message-ID: <4A5D7E49.9020700 at anu.edu.au>
>>>>>>> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>>>>>>>
>>>>>>> darrellk at ece.ubc.ca wrote:
>>>>>>>> Hi Justin,
>>>>>>>> I was experiencing the problem before someone suggested using editconf so
>>>>>>>> I do not think the problem is being caused by editconf. But anyway here
>>>>>>>> is my editconf command. Let me know if you see a source of error in this
>>>>>>>> command line.
>>>>>>>>
>>>>>>>> editconf -f graphene.gro -n index.ndx -o graphene_ec.gro
>>>>>>>>
>>>>>>>> I did not want to add in additional space between the solvent and the box
>>>>>>>> as I saw no reason for doing so, which is why I originally did
>>>>>>>> not use editconf.
>>>>>>>>
>>>>>>>> My box dimensions are 38nm x 38nm x 38nm.
>>>>>>> The box dimensions are defined in the bottom line of the .gro file, and
>>>>>>> not by the positions of the atoms in that file. If you haven't ever set
>>>>>>> them to be suitable for your coordinates with editconf, then they might
>>>>>>> not be.
>>>>>>>
>>>>>>> Mark
>>>>>>>
>>>>>>>> I used cutoffs of 2 nm & 5 nm
>>>>>>>> for my system to ensure the cutoff occurred at a distance where the
>>>>>>>> potentials were stabilized (no longer changing). I guess I could use shorter
>>>>>>>> cutoffs such as 1.5 nm & 2 nm and this may decrease my computation time.
>>>>>>>> I also thought that I needed to use larger cut-offs since I am dealing
>>>>>>>> in the gas phase and there is greater distance between the atoms in my
>>>>>>>> simulation than in liquid-based simulations.
>>>>>>>>
>>>>>>>> In the .log files, I do not see any LINCS warnings or neighborlist
>>>>>>>> errors.
>>>>>>>>
>>>>>>>> I ran gmxcheck on a .trr file and was presented with the following
>>>>>>>> output:
>>>>>>>> *********************************************
>>>>>>>> Checking file mdtraj.trr
>>>>>>>> trn version: GMX_trn_file (single precision)
>>>>>>>> Reading frame 0 time 0.000
>>>>>>>> # Atoms 10482
>>>>>>>> Last frame 5 time 1.000
>>>>>>>>
>>>>>>>>
>>>>>>>> Item #frames Timestep (ps)
>>>>>>>> Step 6 0.2
>>>>>>>> Time 6 0.2
>>>>>>>> Lambda 6 0.2
>>>>>>>> Coords 6 0.2
>>>>>>>> Velocities 6 0.2
>>>>>>>> Forces 0
>>>>>>>> Box 6 0.2
>>>>>>>> *********************************************
>>>>>>>>
>>>>>>>> I ran two additional simulations with different values for nsteps and
>>>>>>>> nstxxxx parameters and have the following to report:
>>>>>>>>
>>>>>>>> When I run a simulation with the following parameters it completes
>>>>>>>> successfully and I see, in the log file, the system output every 100
>>>>>>>> time steps.
>>>>>>>> nsteps          =10000
>>>>>>>> nstcomm         =100
>>>>>>>> nstxout         =100
>>>>>>>> nstfout         =0
>>>>>>>> nstlog          =100
>>>>>>>> nstenergy       =100
>>>>>>>> nstlist         =100
>>>>>>>>
>>>>>>>> When I run a simulation with the following parameters it fails with a
>>>>>>>> segmentation fault and, in the log file, I do not see system output every
>>>>>>>> 500 time steps.
>>>>>>>> nsteps          =30000
>>>>>>>> nstcomm         =500
>>>>>>>> nstxout         =500
>>>>>>>> nstfout         =0
>>>>>>>> nstlog          =500
>>>>>>>> nstenergy       =500
>>>>>>>> nstlist         =500
>>>>>>>>
>>>>>>>> Please let me know what you think might be the problem.
>>>>>>>>
>>>>>>>> Thank you very much.
>>>>>>>>
>>>>>>>> Darrell
>>>>>>>>
>>>>>>>>
>>>>>>>>> Date: Mon, 13 Jul 2009 15:37:15 -0400
>>>>>>>>> From: "Justin A. Lemkul" <jalemkul at vt.edu>
>>>>>>>>> Subject: Re: [gmx-users] Segmentation Fault (Address not mapped)
>>>>>>>>> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>>>>>>>>> Message-ID: <4A5B8CEB.4020609 at vt.edu>
>>>>>>>>> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> darrellk at ece.ubc.ca wrote:
>>>>>>>>>> Hi Mark,
>>>>>>>>>> I used editconf on my .gro file with zero space between my solvent and
>>>>>>>>>> the box and the resulting box had the exact same dimension as the
>>>>>>>>>> initial box. I also performed a number of simulation runs with different
>>>>>>>>> If you're using editconf to define zero space, what's the point?  I only ask
>>>>>>>>> because it is a potential source of error if you think you're adding zero space,
>>>>>>>>> but something else might be going on.  Maybe you can post your editconf command
>>>>>>>>> line.
>>>>>>>>>
>>>>>>>>> What are your box dimensions?  Are cut-off lengths of 2.0 and 5.0 nm appropriate
>>>>>>>>> for your system?  How did you determine that these cut-off's should be used?
>>>>>>>>>
>>>>>>>>>> mdp parameters hoping this would provide me some indication of the cause
>>>>>>>>>> of the fault but to no avail. I looked through the log files, error
>>>>>>>>>> files, and output files and could not find any output to help me
>>>>>>>>>> identify the source of my error.
>>>>>>>>>>
>>>>>>>>> It is very odd that Gromacs isn't reporting anything at all.  No LINCS warnings?
>>>>>>>>> No neighborlist errors?  These would be in the .log file.
>>>>>>>>>
>>>>>>>>>> Could you please let me know how I can look at my structure at each point
>>>>>>>>>> as you indicate below, as I do not see any output files that allow me
>>>>>>>>>> to do so? I tried to look at the .trr file but when I try to load it
>>>>>>>>>> into VMD, it causes an error. I am assuming this error is caused because
>>>>>>>>>> the .trr file did not complete correctly due to the segmentation fault.
>>>>>>>>>> Please advise.
>>>>>>>>>>
>>>>>>>>> How early is the segmentation fault occurring?  I have found it useful sometimes
>>>>>>>>> to set nstxout (or nstxtcout) = 1 to try to catch the first few frames if the
>>>>>>>>> explosion is occurring early.  In any case, gmxcheck will help determine how
>>>>>>>>> many frames are present, as well as the integrity of the file (broken frames, etc).
>>>>>>>>>
>>>>>>>>> -Justin
>>>>>>>>>
>>>>>>>>>> Thanks.
>>>>>>>>>>
>>>>>>>>>> Darrell
>>>>>>>>>>
>>>>>>>>>>> Date: Tue, 07 Jul 2009 09:19:42 +1000
>>>>>>>>>>> From: Mark Abraham <Mark.Abraham at anu.edu.au>
>>>>>>>>>>> Subject: Re: [gmx-users] Segmentation Fault (Address not mapped)
>>>>>>>>>>> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>>>>>>>>>>> Message-ID: <4A52868E.6010807 at anu.edu.au>
>>>>>>>>>>> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>>>>>>>>>>>
>>>>>>>>>>> darrellk at ece.ubc.ca wrote:
>>>>>>>>>>>> Hi Mark,
>>>>>>>>>>>> I added the energy group exclusions as indicated in your previous
>>>>>>>>>>>> response but am still experiencing the same problem. I looked at the
>>>>>>>>>>>> .log files and see that in one log file it tells me that my box is
>>>>>>>>>>>> exploding. However, I do not have many molecules in my simulation and
>>>>>>>>>>>> therefore do not think that it is possible that my box is exploding from
>>>>>>>>>>>> pressure.
>>>>>>>>>>> Sure, but if there's something malformed with your model physics or
>>>>>>>>>>> starting configuration, then large forces can make anything explode.
>>>>>>>>>>>
>>>>>>>>>>> Look at your structures at each point and see where things start to go
>>>>>>>>>>> wrong. Make sure you've used editconf on your starting structure to
>>>>>>>>>>> provide the right box dimensions.
>>>>>>>>>>>
>>>>>>>>>>> Mark
>>>>>>>>>>>
>>>>>>>>>>>> Maybe if I re-state my simulation it will help you in providing me
>>>>>>>>>>>> direction on what might be causing the problem. My simulation consists
>>>>>>>>>>>> of a graphene lattice with a layer of ammonia molecules above it. The
>>>>>>>>>>>> box is very large and there is lots of empty space in the box. So I am a
>>>>>>>>>>>> little confused as to how the box could be exploding.
>>>>>>>>>>>>
>>>>>>>>>>>> Thanks again in advance for your help.
>>>>>>>>>>>>
>>>>>>>>>>>> Darrell Koskinen
>>>>>>>>>>>>
>>>>>>>>>>>>> Date: Fri, 03 Jul 2009 11:41:45 +1000
>>>>>>>>>>>>> From: Mark Abraham <Mark.Abraham at anu.edu.au>
>>>>>>>>>>>>> Subject: Re: [gmx-users] Segmentation Fault (Address not mapped)
>>>>>>>>>>>>> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>>>>>>>>>>>>> Message-ID: <4A4D61D9.6080700 at anu.edu.au>
>>>>>>>>>>>>> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>>>>>>>>>>>>>
>>>>>>>>>>>>> darrellk at ece.ubc.ca wrote:
>>>>>>>>>>>>>> Dear GROMACS Gurus,
>>>>>>>>>>>>>> I am experiencing a segmentation fault when mdrun executes. My simulation
>>>>>>>>>>>>>> has a graphene lattice with an array (layer) of ammonia molecules above
>>>>>>>>>>>>>> it. The box is three times the width of the graphene lattice, three
>>>>>>>>>>>>>> times the length of the graphene lattice, and three times the height
>>>>>>>>>>>>>> between the graphene lattice and the ammonia molecules. I am including
>>>>>>>>>>>>>> the mdp file and the error message.
>>>>>>>>>>>>> Probably your system is exploding when integration fails with excessive
>>>>>>>>>>>>> forces. You should look at the bottom of stdout, stderr, *and* the .log
>>>>>>>>>>>>> file to diagnose. The error message you give below is merely the
>>>>>>>>>>>>> diagnostic trace from the MPI library, and is not useful for finding out
>>>>>>>>>>>>> what GROMACS thinks the problem might be. Further advice below.
>>>>>>>>>>>>>
>>>>>>>>>>>>>> ***************************************************************************
>>>>>>>>>>>>>> .mdp file
>>>>>>>>>>>>>> title           =FWS
>>>>>>>>>>>>>> ;warnings       =10
>>>>>>>>>>>>>> cpp             =cpp
>>>>>>>>>>>>>> ;define         =-DPOSRES
>>>>>>>>>>>>>> ;constraints    =all-bonds
>>>>>>>>>>>>>> integrator      =md
>>>>>>>>>>>>>> dt              =0.002 ; ps
>>>>>>>>>>>>>> nsteps          =100000
>>>>>>>>>>>>>> nstcomm         =1000
>>>>>>>>>>>>>> nstxout         =1000
>>>>>>>>>>>>>> ;nstvout                =1000
>>>>>>>>>>>>>> nstfout         =0
>>>>>>>>>>>>>> nstlog          =1000
>>>>>>>>>>>>>> nstenergy       =1000
>>>>>>>>>>>>>> nstlist         =1000
>>>>>>>>>>>>>> ns_type         =grid
>>>>>>>>>>>>>> rlist           =2.0
>>>>>>>>>>>>>> coulombtype     =PME
>>>>>>>>>>>>>> rcoulomb        =2.0
>>>>>>>>>>>>>> vdwtype         =cut-off
>>>>>>>>>>>>>> rvdw            =5.0
>>>>>>>>>>>>>> fourierspacing  =0.12
>>>>>>>>>>>>>> fourier_nx      =0
>>>>>>>>>>>>>> fourier_ny      =0
>>>>>>>>>>>>>> fourier_nz      =0
>>>>>>>>>>>>>> pme_order       =4
>>>>>>>>>>>>>> ewald_rtol      =1e-5
>>>>>>>>>>>>>> optimize_fft    =yes
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> ; This section added in to freeze hydrogen atoms at edge of graphene
>>>>>>>>>>>>>> lattice to prevent movement of lattice
>>>>>>>>>>>>>> ;energygrp_excl = Edge Edge Edge Grph Grph Grph
>>>>>>>>>>>>>> freezegrps      = Edge Grph ; Hydrogen atoms in graphene lattice are
>>>>>>>>>>>>>> associated with the residue Edge
>>>>>>>>>>>>> See comments in 7.3.24 of manual. You need the energy group exclusions.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Mark
>>>>>>>>>>>>>
>>>>>>>>>>>>>> freezedim       = Y Y Y Y Y Y; Freeze hydrogen atoms in all directions
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> ;Tcoupl         =berendsen
>>>>>>>>>>>>>> ;tau_t          =0.1    0.1
>>>>>>>>>>>>>> ;tc-grps                =protein non-protein
>>>>>>>>>>>>>> ;ref_t = 300 300
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> ;Pcoupl = parrinello-rahman
>>>>>>>>>>>>>> ;tau_p = 0.5
>>>>>>>>>>>>>> ;compressibility = 4.5e-5
>>>>>>>>>>>>>> ;ref_p = 1.0
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> ;gen_vel = yes
>>>>>>>>>>>>>> ;gen_temp = 300.0
>>>>>>>>>>>>>> ;gen_seed = 173529
>>>>>>>>>>>>>> ***************************************************************************
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> ***************************************************************************
>>>>>>>>>>>>>> ERROR IN OUTPUT FILE
>>>>>>>>>>>>>> [node16:25758] *** Process received signal ***
>>>>>>>>>>>>>> [node16:25758] Signal: Segmentation fault (11)
>>>>>>>>>>>>>> [node16:25758] Signal code: Address not mapped (1)
>>>>>>>>>>>>>> [node16:25758] Failing at address: 0xfffffffe1233e230
>>>>>>>>>>>>>> [node16:25758] [ 0] /lib64/libpthread.so.0 [0x3834a0de80]
>>>>>>>>>>>>>> [node16:25758] [ 1] /usr/lib64/libmd_mpi.so.4(pme_calc_pidx+0xd6)
>>>>>>>>>>>>>> [0x2ba295dd0606]
>>>>>>>>>>>>>> [node16:25758] [ 2] /usr/lib64/libmd_mpi.so.4(do_pme+0x808)
>>>>>>>>>>>>>> [0x2ba295dd4058]
>>>>>>>>>>>>>> [node16:25758] [ 3] /usr/lib64/libmd_mpi.so.4(force+0x8de)
>>>>>>>>>>>>>> [0x2ba295dba5be]
>>>>>>>>>>>>>> [node16:25758] [ 4] /usr/lib64/libmd_mpi.so.4(do_force+0x5ef)
>>>>>>>>>>>>>> [0x2ba295ddeaff]
>>>>>>>>>>>>>> [node16:25758] [ 5] mdrun_mpi(do_md+0xe23) [0x411193]
>>>>>>>>>>>>>> [node16:25758] [ 6] mdrun_mpi(mdrunner+0xd40) [0x4142f0]
>>>>>>>>>>>>>> [node16:25758] [ 7] mdrun_mpi(main+0x239) [0x4146f9]
>>>>>>>>>>>>>> [node16:25758] [ 8] /lib64/libc.so.6(__libc_start_main+0xf4)
>>>>>>>>>>>>>> [0x3833e1d8b4]
>>>>>>>>>>>>>> [node16:25758] [ 9] mdrun_mpi [0x40429a]
>>>>>>>>>>>>>> [node16:25758] *** End of error message ***
>>>>>>>>>>>>>> mpirun noticed that job rank 7 with PID 25758 on node node16 exited on
>>>>>>>>>>>>>> signal 11 (Segmentation fault).
>>>>>>>>>>>>>> 7 processes killed (possibly by Open MPI)
>>>>>>>>>>>>>> ***************************************************************************
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Could you please let me know what you think may be causing the fault?
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Much thanks in advance.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Darrell Koskinen
>
>--
>========================================
>
>Justin A. Lemkul
>Ph.D. Candidate
>ICTAS Doctoral Scholar
>Department of Biochemistry
>Virginia Tech
>Blacksburg, VA
>jalemkul[at]vt.edu | (540) 231-9080
>http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>
>========================================
>
>


