[gmx-users] Normal mode segmentation fault - memory problem ?
Marlon Sidore
marlon.sidore at gmail.com
Wed Jul 31 16:04:29 CEST 2019
Hopefully I'm answering the right way.
It also didn't work when using "-last 82737" and the segfault happened
right after "will try to allocate memory and convert to full matrix
representation".
I don't get what I'm doing wrong here.
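As a sanity check on the memory side (taking the dimensions from the log, nrow = ncols = 27579, and 8 bytes per double-precision element):
27579 * 27579 * 8 bytes is about 6.1 GB for the full matrix,
82737 * 82737 * 8 bytes would be about 55 GB if the dimension were 82737.
Both fit comfortably in the 100 GB I'm requesting. Also, a 27579 x 27579 matrix only has 27579 eigenpairs, so perhaps asking for -last 82737 is itself part of the problem.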
Best,
Marlon Sidore
06.69.24.81.94
PhD - Post-doctoral fellow
Institut d’Électronique et des Systèmes (UMR 5214)
860 rue de St Priest
34095 Montpellier cedex 5
FRANCE
Le mer. 24 juil. 2019 à 19:50, <
gromacs.org_gmx-users-request at maillist.sys.kth.se> a écrit :
> Send gromacs.org_gmx-users mailing list submissions to
> gromacs.org_gmx-users at maillist.sys.kth.se
>
> To subscribe or unsubscribe via the World Wide Web, visit
> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users
> or, via email, send a message with subject or body 'help' to
> gromacs.org_gmx-users-request at maillist.sys.kth.se
>
> You can reach the person managing the list at
> gromacs.org_gmx-users-owner at maillist.sys.kth.se
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of gromacs.org_gmx-users digest..."
>
>
> Today's Topics:
>
> 1. Normal mode segmentation fault - memory problem ? (Marlon Sidore)
> 2. Viscosity calculations (Nicholas Michelarakis)
> 3. Variation in free energy between GROMACS versions? (Kenneth Huang)
> 4. Re: Variation in free energy between GROMACS versions?
> (Mark Abraham)
> 5. Re: Normal mode segmentation fault - memory problem ?
> (David van der Spoel)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Wed, 24 Jul 2019 15:10:16 +0200
> From: Marlon Sidore <marlon.sidore at gmail.com>
> To: gmx-users at gromacs.org
> Subject: [gmx-users] Normal mode segmentation fault - memory problem ?
> Message-ID:
> <CAAASxxpWP53L9N3RvtDu35=
> kaymvnhcW3A6tOo1GNQYYE0GuaQ at mail.gmail.com>
> Content-Type: text/plain; charset="UTF-8"
>
> I would like a second opinion on the cause of this segfault.
>
> I have a protein in a vacuum. All atom. After extensive minimization, I
> produced a matrix .mtx.
>
> Then, on the cluster, a simple gmx_mpi nmeig -f normal.mtx -s normal.tpr
> leads to:
> Reading double precision matrix generated by GROMACS 2018.6
> Sparse matrix storage format, nrow=27579, ncols=27579
> [n2534:10432:0] Caught signal 11 (Segmentation fault)
> ==== backtrace ====
> 2 0x000000000006ba2c mxm_handle_error() /var/tmp/OFED_topdir/BUILD/mxm-3.7.3111/src/mxm/util/debug/debug.c:641
> 3 0x000000000006bf7c mxm_error_signal_handler() /var/tmp/OFED_topdir/BUILD/mxm-3.7.3111/src/mxm/util/debug/debug.c:616
> 4 0x0000000000036280 killpg() ??:0
> 5 0x0000000000642091 _ZN17_INTERNAL3a3192c818nma_sparse_hessianEP16gmx_sparsematrixiPK10t_topologyRKSt6vectorImSaImEEiPfSA_() /scratch/DEFAUT/akira/akira/installs/occigen/applications/gromacs/2018.6/intel/18.1/openmpi/intel/2.0.4/gromacs-2018.6/src/gromacs/gmxana/gmx_nmeig.cpp:220
> 6 0x0000000000642091 gmx_nmeig() /scratch/DEFAUT/akira/akira/installs/occigen/applications/gromacs/2018.6/intel/18.1/openmpi/intel/2.0.4/gromacs-2018.6/src/gromacs/gmxana/gmx_nmeig.cpp:471
> 7 0x0000000000431cb9 _ZN3gmx12_GLOBAL__N_122CMainCommandLineModule3runEiPPc() /scratch/DEFAUT/akira/akira/installs/occigen/applications/gromacs/2018.6/intel/18.1/openmpi/intel/2.0.4/gromacs-2018.6/src/gromacs/commandline/cmdlinemodulemanager.cpp:133
> 8 0x0000000000431cb9 _ZN3gmx24CommandLineModuleManager3runEiPPc() /scratch/DEFAUT/akira/akira/installs/occigen/applications/gromacs/2018.6/intel/18.1/openmpi/intel/2.0.4/gromacs-2018.6/src/gromacs/commandline/cmdlinemodulemanager.cpp:589
> 9 0x000000000040f788 main() /scratch/DEFAUT/akira/akira/installs/occigen/applications/gromacs/2018.6/intel/18.1/openmpi/intel/2.0.4/gromacs-2018.6/src/programs/gmx.cpp:60
> 10 0x00000000000223d5 __libc_start_main() ??:0
> 11 0x000000000040f5e9 _start() ??:0
> ===================
>
> Either I did something wrong while producing the matrix, or I'm doing
> something wrong with nmeig.
>
> Should I have done something to restrict the analysis to the backbone
> (somehow) if it's a memory problem? I'm allocating 100 GB to this
> analysis; I naively thought that was enough.
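>
> In case I messed something up when producing the matrix: my understanding
> is that the standard route is an .mdp with integrator = nm run on the
> minimized structure, then grompp and mdrun with -mtx, roughly like this
> (file names below are only illustrative, and normal modes are usually run
> in double precision):
>
> ; nm.mdp
> integrator = nm
>
> gmx grompp -f nm.mdp -c minimized.gro -p topol.top -o normal.tpr
> gmx mdrun -s normal.tpr -mtx normal.mtx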
>
> Best,
>
> Marlon Sidore
>
> PhD - Post-doctoral fellow
> Institut d'Électronique et des Systèmes
>
>
> ------------------------------
>
> Message: 2
> Date: Wed, 24 Jul 2019 16:26:12 +0200
> From: Nicholas Michelarakis <nick.mihelas at gmail.com>
> To: gromacs.org_gmx-users at maillist.sys.kth.se
> Subject: [gmx-users] Viscosity calculations
> Message-ID:
> <CAGTCbAT4=
> NJfo3Fi3r-ujFChAFAaPWGMPCy5wn9F2Jyyoq5MFQ at mail.gmail.com>
> Content-Type: text/plain; charset="UTF-8"
>
> Dear everyone,
>
> I am trying to use simulations to calculate the viscosity of a polymer
> solution. Although I have some experience with MD, this is a new area for
> me so apologies for the naive questions. Having gone through the mailing
> list and after reading Hess' paper (2002), I am a bit confused about the
> current best practices.
>
> 1. If I have understood correctly, the -vis flag in gmx energy is not the
> best way to calculate viscosity, due to its sensitivity to the system
> set-up, and it should be removed. I am assuming this is still the case?
>
> 2. Assuming one wants to use this method, what are the best (shifted)
> potential and electrostatic cut-offs to use?
>
> To use a non-equilibrium method, one has to use the acceleration options in
> the .mdp file to set up a shear flow, if I have understood correctly (a
> concrete fragment is sketched after question 5 below).
>
> 3. When performing NEMD, can gmx energy and the -vis flag be used to
> calculate the viscosity? The last line of the manual implies so (
>
> http://manual.gromacs.org/documentation/2019-rc1/reference-manual/special/viscosity-calculation.html
> )
> but I just want to make sure, given the criticisms above. If not, does that
> leave only the gmx tcaf option?
>
> 4. When choosing groups to be accelerated to create a flow in acc-grps,
> should one choose both the solvent and the solute (polymer in this case) or
> just the solvent?
>
> 5. I think I understand how the deform option works, but is it necessary
> for NEMD, and if so, why?
>
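> To make question 4 concrete, the sort of .mdp fragment I have in mind for
> the acceleration groups is the following (group name and value are
> placeholders, not from a real set-up):
>
> acc-grps    = SOL             ; group(s) to accelerate
> accelerate  = 0.1 0.0 0.0     ; constant acceleration in nm ps^-2, x only
>
> as opposed to cos-acceleration, which as far as I understand drives the
> periodic perturbation method described on the manual page above.
>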
> Thank you very much in advance!
>
> Best,
> Nick
>
>
> ------------------------------
>
> Message: 3
> Date: Wed, 24 Jul 2019 15:25:10 +0000
> From: Kenneth Huang <khuang8 at student.gsu.edu>
> To: "gromacs.org_gmx-users at maillist.sys.kth.se"
> <gromacs.org_gmx-users at maillist.sys.kth.se>
> Subject: [gmx-users] Variation in free energy between GROMACS
> versions?
> Message-ID:
> <
> BN6PR01MB2786C46521056DB0313783B8A0C60 at BN6PR01MB2786.prod.exchangelabs.com
> >
>
> Content-Type: text/plain; charset="iso-8859-1"
>
> Hi all,
>
> I'm trying to implement a variation of the simulated tempering procedure
> as described in a 2007 paper (Choosing weights for simulated tempering,
> Sanghyun Park and Vijay S. Pande, Phys. Rev. E 76, 016703) on GROMACS 2018,
> and am trying to verify that my implementation is correct compared to the
> system in the paper (Ala10). While I've managed to match a majority of the
> parameters that are described in the paper, the energy values I'm getting
> are all shifted by about 3,000 kJ/mol relative to the ones in the paper.
>
> I'm not familiar enough with how mdrun itself has changed within GROMACS,
> and don't expect to fully be able to reproduce the results of the paper, so
> I was wondering if that large of a difference is expected? Considering that
> I'm using as many of the same settings as in the paper (i.e., water model,
> forcefield, number of waters, thermostat, etc), is it unusual to see that
> magnitude of a change for a relatively simple system like Ala10 between an
> older and newer version of GROMACS? Or is this something that's expected,
> and I shouldn't worry too much about it?
>
> Best,
>
> Kenneth
>
>
> ------------------------------
>
> Message: 4
> Date: Wed, 24 Jul 2019 19:13:51 +0200
> From: Mark Abraham <mark.j.abraham at gmail.com>
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> Subject: Re: [gmx-users] Variation in free energy between GROMACS
> versions?
> Message-ID:
> <
> CAMNuMARzB9Up8NLZ34GKLoSViP3A4sPq8DFFxzvGyViC2WxiYQ at mail.gmail.com>
> Content-Type: text/plain; charset="UTF-8"
>
> Hi,
>
> On Wed, 24 Jul 2019 at 17:25, Kenneth Huang <khuang8 at student.gsu.edu>
> wrote:
>
> > Hi all,
> >
> > I'm trying to implement a variation of the simulated tempering procedure
> > as described in a 2007 paper (Choosing weights for simulated tempering,
> > Sanghyun Park and Vijay S. Pande, Phys. Rev. E 76, 016703) on GROMACS
> 2018,
> > and am trying to verify that my implementation is correct compared to the
> > system in the paper (Ala10). While I've managed to match a majority of
> the
> > parameters that are described in the paper, the energy values I'm getting
> > are all shifted by about 3,000 kJ/mol relative to the ones in the paper.
> >
> > I'm not familiar enough with how mdrun itself has changed within GROMACS,
> > and don't expect to fully be able to reproduce the results of the paper,
> so
> > I was wondering if that large of a difference is expected? Considering
> that
> > I'm using as many of the same settings as in the paper (i.e., water model,
> > forcefield, number of waters, thermostat, etc), is it unusual to see that
> > magnitude of a change for a relatively simple system like Ala10 between
> an
> > older and newer version of GROMACS? Or is this something that's expected,
> > and I shouldn't worry too much about it?
> >
>
> There are a couple of philosophical questions here, but the important one for
> the comparison is that such an old version of GROMACS probably didn't shift
> the potential to zero at the cutoff (several MD packages still don't), and
> even the values of the physical constants have been updated since then
> (this was also a problem in comparing single-point energies between MD
> packages in a SAMPL6? paper from Shirts & co). Given that the method used
> reaction-field, you could probably make some measurements on things and
> scale the potential shift by the number of short-range interactions...
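>
> As a very rough version of that estimate (the symbols below are just
> shorthand I'm introducing here, not output from any tool): with a
> potential-shift modifier every pair inside the cut-off r_c contributes
> V(r) - V(r_c) instead of V(r), so the offset between shifted and unshifted
> runs is roughly
>
> delta_E ~ N_pairs(r < r_c) * V(r_c)
>
> where N_pairs is the average number of short-range pair interactions in the
> system. A quick neighbour count times the value of the unshifted pair
> potential at the cut-off should tell you whether an offset of a few
> thousand kJ/mol is in the right ballpark.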
>
> Mark
>
> Best,
> >
> > Kenneth
> > --
> > Gromacs Users mailing list
> >
> > * Please search the archive at
> > http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before
> > posting!
> >
> > * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
> >
> > * For (un)subscribe requests visit
> > https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or
> > send a mail to gmx-users-request at gromacs.org.
> >
>
>
> ------------------------------
>
> Message: 5
> Date: Wed, 24 Jul 2019 19:49:42 +0200
> From: David van der Spoel <spoel at xray.bmc.uu.se>
> To: gmx-users at gromacs.org
> Subject: Re: [gmx-users] Normal mode segmentation fault - memory
> problem ?
> Message-ID: <56878790-4323-22b4-98bc-8ce2decb991d at xray.bmc.uu.se>
> Content-Type: text/plain; charset=utf-8; format=flowed
>
> Den 2019-07-24 kl. 15:10, skrev Marlon Sidore:
> > I would like a second opinion on the cause of this segfault.
> >
> > I have a protein in a vacuum. All atom. After extensive minimization, I
> > produced a matrix .mtx.
> >
> > Then, on the cluster, a simple gmx_mpi nmeig -f normal.mtx -s normal.tpr
> > leads to:
> > Reading double precision matrix generated by GROMACS 2018.6
> > Sparse matrix storage format, nrow=27579, ncols=27579
> > [n2534:10432:0] Caught signal 11 (Segmentation fault)
> > ==== backtrace ====
> > 2 0x000000000006ba2c mxm_handle_error() /var/tmp/OFED_topdir/BUILD/mxm-3.7.3111/src/mxm/util/debug/debug.c:641
> > 3 0x000000000006bf7c mxm_error_signal_handler() /var/tmp/OFED_topdir/BUILD/mxm-3.7.3111/src/mxm/util/debug/debug.c:616
> > 4 0x0000000000036280 killpg() ??:0
> > 5 0x0000000000642091 _ZN17_INTERNAL3a3192c818nma_sparse_hessianEP16gmx_sparsematrixiPK10t_topologyRKSt6vectorImSaImEEiPfSA_() /scratch/DEFAUT/akira/akira/installs/occigen/applications/gromacs/2018.6/intel/18.1/openmpi/intel/2.0.4/gromacs-2018.6/src/gromacs/gmxana/gmx_nmeig.cpp:220
> > 6 0x0000000000642091 gmx_nmeig() /scratch/DEFAUT/akira/akira/installs/occigen/applications/gromacs/2018.6/intel/18.1/openmpi/intel/2.0.4/gromacs-2018.6/src/gromacs/gmxana/gmx_nmeig.cpp:471
> > 7 0x0000000000431cb9 _ZN3gmx12_GLOBAL__N_122CMainCommandLineModule3runEiPPc() /scratch/DEFAUT/akira/akira/installs/occigen/applications/gromacs/2018.6/intel/18.1/openmpi/intel/2.0.4/gromacs-2018.6/src/gromacs/commandline/cmdlinemodulemanager.cpp:133
> > 8 0x0000000000431cb9 _ZN3gmx24CommandLineModuleManager3runEiPPc() /scratch/DEFAUT/akira/akira/installs/occigen/applications/gromacs/2018.6/intel/18.1/openmpi/intel/2.0.4/gromacs-2018.6/src/gromacs/commandline/cmdlinemodulemanager.cpp:589
> > 9 0x000000000040f788 main() /scratch/DEFAUT/akira/akira/installs/occigen/applications/gromacs/2018.6/intel/18.1/openmpi/intel/2.0.4/gromacs-2018.6/src/programs/gmx.cpp:60
> > 10 0x00000000000223d5 __libc_start_main() ??:0
> > 11 0x000000000040f5e9 _start() ??:0
> > ===================
> >
> > Either I did something wrong while producing the matrix, or I'm doing
> > something wrong with nmeig.
> >
> > Should I have done something to restrict the analysis to the backbone
> > (somehow) if it's a memory problem? I'm allocating 100 GB to this
> > analysis; I naively thought that was enough.
> 27500^2 * 8 bytes is about 6 GB, so it should not be a memory problem.
> Have you tried using
> -last 82737
> In that case the full matrix will be used. It may be slow, though.
> >
> > Best,
> >
> > Marlon Sidore
> >
> > PhD - Post-doctoral fellow
> > Institut d'Électronique et des Systèmes
> >
>
>
> --
> David van der Spoel, Ph.D., Professor of Biology
> Head of Department, Cell & Molecular Biology, Uppsala University.
> Box 596, SE-75124 Uppsala, Sweden. Phone: +46184714205.
> http://www.icm.uu.se
>
>
> ------------------------------
>
> --
> Gromacs Users mailing list
>
> * Please search the archive at
> http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before
> posting!
>
> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>
> * For (un)subscribe requests visit
> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or
> send a mail to gmx-users-request at gromacs.org.
>
> End of gromacs.org_gmx-users Digest, Vol 183, Issue 58
> ******************************************************
>
More information about the gromacs.org_gmx-users
mailing list