[gmx-users] Segmentation fault error
Mark Abraham
mark.j.abraham at gmail.com
Fri Dec 12 04:38:41 CET 2014
Hi,
This is a generic MPI error message; by itself it does not say what caused
the crash. You need to look at the whole stdout and the mdrun .log file for
diagnostics, and you should also report your GROMACS version. Most likely
your system is just http://www.gromacs.org/Documentation/Terminology/Blowing_Up,
but there is also a known problem with 5.0.3 that we will fix ASAP.
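
If it helps, here is a minimal sketch for pulling the usual diagnostics out
of the log (the name md.log is only an assumption; adjust it to whatever you
passed to -deffnm or -g): the version header, any warnings, and how far the
run got before it died.

#!/usr/bin/env python
# Minimal sketch: pull the usual diagnostics out of an mdrun .log file.
# The log name "md.log" is an assumption; adjust it to your -deffnm/-g setting.
import sys

logfile = sys.argv[1] if len(sys.argv) > 1 else "md.log"

with open(logfile) as f:
    lines = f.readlines()

# The GROMACS version appears in the log header.
for line in lines[:60]:
    if "VERSION" in line or "GROMACS version" in line:
        print(line.rstrip())

# Warnings (LINCS, pressure scaling, etc.) usually point at a blowing-up system.
for line in lines:
    if "WARNING" in line or "LINCS" in line:
        print(line.rstrip())

# The tail of the log shows how far the run got before the crash.
print("".join(lines[-30:]))

Posting that output, together with the full stdout from the job, makes it
much easier to see which of the two it is.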
Mark
On Fri, Dec 12, 2014 at 3:55 AM, Seyed Mojtaba Rezaei Sani <
s.m.rezaeisani at gmail.com> wrote:
>
> Dear all,
>
> I am trying to simulate a drug-carrier system consisting of HSPC/CHOL in
> the form of a vesicle. The simulation runs fine when the CHOL molecules
> are not inserted, but as soon as I include CHOL molecules I get this error:
>
> starting mdrun 'Chol/HSPC VESICLE'
> 900000 steps, 27000.0 ps.
> step 0
> [compute-0-3:30916] *** Process received signal ***
> [compute-0-3:30916] Signal: Segmentation fault (11)
> [compute-0-3:30916] Signal code: Address not mapped (1)
> [compute-0-3:30916] Failing at address: 0x9a200b0
> [compute-0-3:30916] [ 0] /lib64/libpthread.so.0 [0x316940eb10]
> [compute-0-3:30916] [ 1] /opt/bio/gromacs/lib/libgmx_mpi.so.6 [0x2b291ac0ee2c]
> [compute-0-3:30916] *** End of error message ***
> --------------------------------------------------------------------------
> mpirun noticed that process rank 10 with PID 30916 on node compute-0-3.local
> exited on signal 11 (Segmentation fault).
> --------------------------------------------------------------------------
>
> Here is the mdp file:
>
> title                    = Martini
> integrator               = md
> dt                       = 0.03
> nsteps                   = 900000
> nstcomm                  = 100
> comm-grps                =
>
> nstxout                  = 1000
> nstvout                  = 1000
> nstfout                  = 1000
> nstlog                   = 1000  ; Output frequency for energies to log file
> nstenergy                = 100   ; Output frequency for energies to energy file
> nstxtcout                = 1000  ; Output frequency for .xtc file
> xtc_precision            = 100
> xtc-grps                 =
> energygrps               = HSPC CHOL W
>
> nstlist                  = 10
> ns_type                  = grid
> pbc                      = xyz
> rlist                    = 1.4
>
> coulombtype              = Shift  ; Reaction_field (for use with Verlet-pairlist) ; PME (especially with polarizable water)
> rcoulomb_switch          = 0.0
> rcoulomb                 = 1.2
> epsilon_r                = 15     ; 2.5 (with polarizable water)
> vdw_type                 = Shift  ; cutoff (for use with Verlet-pairlist)
> rvdw_switch              = 0.9
> rvdw                     = 1.2    ; 1.1 (for use with Verlet-pairlist)
>
> ;cutoff-scheme           = verlet
> ;coulomb-modifier        = Potential-shift
> ;vdw-modifier            = Potential-shift
> ;epsilon_rf              = 0      ; epsilon_rf = 0 really means epsilon_rf = infinity
> ;verlet-buffer-drift     = 0.005
>
> tcoupl                   = v-rescale
> tc-grps                  = HSPC CHOL W
> tau_t                    = 1.0 1.0 1.0
> ref_t                    = 323 323 323
> Pcoupl                   = berendsen  ; parrinello-rahman ; parrinello-rahman
> Pcoupltype               = isotropic  ; semiisotropic
> tau_p                    = 3.0        ; 12.0 12.0 ; parrinello-rahman is more stable with larger tau-p, DdJ, 20130422
> compressibility          = 3e-4       ; 3e-4
> ref_p                    = 1.0        ; 1.0 1.0
>
> gen_vel                  = yes
> gen_temp                 = 320
> gen_seed                 = 473529
>
> constraints              = none
> constraint_algorithm     = Lincs
> continuation             = no
> lincs_order              = 4
> lincs_warnangle          = 30
>
>
> I appreciate any help in advance.
>
> --
> Seyed Mojtaba Rezaei Sani
>
> Institute for Research in Fundamental Sciences (IPM)
> School of Nano-Science
> Shahid Farbin Alley
> Shahid Lavasani st
> P.O. Box 19395-5531
> Tehran, Iran
> Tel: +98 21 2310 (3069)