[gmx-users] mpirun noticed that process rank 7 with PID 19160 on node compute-0-28.local exited on signal 11 (Segmentation fault).

Mark Abraham mark.j.abraham at gmail.com
Mon Sep 18 15:06:33 CEST 2017


Hi,

You're not compiling GROMACS properly for your cluster, which might have
different hardware on different nodes. Read its docs and talk to your
admins.
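
For reference, configuring a real-MPI build for a cluster usually looks
something like the sketch below. It is only a sketch: the SIMD level
(here SSE4.1, a conservative choice) and the compiler wrapper names are
assumptions, and should come from your cluster's docs or admins.

    # Sketch of an MPI-enabled GROMACS build; the SIMD level and the
    # compiler wrappers are assumptions for your site.
    mkdir build && cd build
    cmake .. -DGMX_MPI=ON \
             -DGMX_SIMD=SSE4.1 \
             -DCMAKE_C_COMPILER=mpicc \
             -DCMAKE_CXX_COMPILER=mpicxx
    make -j 4 && make install

Picking a SIMD level supported by every compute node matters when the
nodes are not identical; a binary built for the newest node can crash on
an older one.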

Mark

On Mon, Sep 18, 2017 at 4:07 AM Vidya R <vidyadevi2811 at gmail.com> wrote:

> Thank you for your reply.
>
> When I try to run my job on a single processor through the qsub command
> (by putting the GROMACS mdrun command in the script file), it fails with
> SEGMENTATION FAULT, CORE DUMPED...
>
>
> But when I run my job on the login node (which we are not supposed to
> do), it works very well...
>
>
> Can you comment on this?
>
>
> Thanks,
> Vidya.R
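
For reference, a minimal single-slot SGE submission script might look
like the sketch below. The GMXRC path is a hypothetical install location;
use whatever makes gmx available on the compute nodes, which may differ
from what works on the login node.

    #!/bin/bash
    # Minimal single-slot SGE script (a sketch; the GMXRC path below is
    # an assumption -- adjust to your site's install).
    #$ -cwd
    #$ -N eql
    source /usr/local/gromacs/bin/GMXRC
    gmx mdrun -v -deffnm eql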
>
> On Mon, Sep 18, 2017 at 2:50 AM, Mark Abraham <mark.j.abraham at gmail.com>
> wrote:
>
> > Hi,
> >
> > You're running a thread-MPI version of GROMACS, which is probably not
> > what you want to do if you're running mpirun. It should work even so,
> > but whatever quirks exist with SGE are unfortunately between you, its
> > docs, and your cluster's docs and admins :-(
> >
> > Mark
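
The distinction matters for how mdrun is launched. A sketch, assuming the
conventional binary names (gmx for a thread-MPI build, gmx_mpi for a
real-MPI build):

    # Thread-MPI build: no mpirun; mdrun starts its own ranks.
    gmx mdrun -ntmpi 8 -v -deffnm eql

    # Real MPI build, conventionally installed as gmx_mpi: launched
    # under mpirun, one MPI rank per process.
    mpirun -np 8 gmx_mpi mdrun -v -deffnm eql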
> >
> > On Sun, Sep 17, 2017 at 7:23 AM Vidya R <vidyadevi2811 at gmail.com> wrote:
> >
> > > My log file is provided in the link below
> > >
> > > Can you please look into it and let me know why the error arises?
> > >
> > > I am submitting my commands on an SGE cluster. When I run it on my
> > > login node, gmx mdrun -v -deffnm eql runs well.
> > >
> > >
> > > But through the qsub command (with 8 processors), it says:
> > >
> > > mpirun noticed that process rank 7 with PID 19160 on node
> > > compute-0-28.local exited on signal 11 (Segmentation fault).
> > >
> > > Please help me.
> > >
> > > I am unable to figure out whether the problem is with the version
> > > of GROMACS or the method of compiling.
> > >
> > >
> > >
> > > https://drive.google.com/file/d/0BxGqxeGwTDLbQW9OZDFuM1doUlU/view?usp=sharing
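
One quick way to check is to inspect the version header GROMACS prints,
which reports how the binary was built; a sketch (exact field names vary
between releases):

    # Run this on a compute node (e.g., inside a test job). Look for the
    # "MPI library" line (thread_mpi vs. MPI) and the "SIMD instructions"
    # line in the output.
    gmx --version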