[gmx-users] gmx mdrun std::bad_alloc whilst using PLUMED
Nash, Anthony
a.nash at ucl.ac.uk
Wed Nov 18 09:29:42 CET 2015
Thanks Mark,
I threw an email across to the PLUMED group this morning and was surprised
to get a reply almost immediately. The cause *could* be the memory
allocation required to store the metadynamics grid at the spacing I defined
in the PLUMED input.
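
For what it's worth, a quick back-of-the-envelope estimate supports that
hypothesis. The sketch below is a rough model rather than PLUMED's actual
internals: it assumes the grid stores one double per point for the bias
plus one per CV for the spline derivatives. The bin counts are taken
straight from the log line "Grid bin 629 629 251 251 251":

    # Rough metadynamics grid-memory estimate (a sketch; the
    # doubles-per-point model is an assumption, not verified in PLUMED).
    bins = [629, 629, 251, 251, 251]  # phi, psi, COMdist, d1, d2

    def grid_gigabytes(bins, ncv):
        points = 1
        for b in bins[:ncv]:
            points *= b
        # bias value + ncv derivative components, 8 bytes per double
        return points * (1 + ncv) * 8 / 1e9

    for ncv in (3, 4, 5):
        print("%d CVs: ~%.0f GB" % (ncv, grid_gigabytes(bins, ncv)))

That works out to roughly 3 GB for three CVs, about 1,000 GB for four, and
about 300,000 GB for five, which would explain why the crash appears
exactly when the 4th CV is added and why 43.2 GB of RAM made no difference.
A coarser GRID_SPACING on the distance CVs, or omitting the GRID_* keywords
altogether (at the cost of slower hill summation), should shrink that
footprint enormously.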
Thanks
Anthony
Dr Anthony Nash
Department of Chemistry
University College London
On 17/11/2015 22:24, Mark Abraham <mark.j.abraham at gmail.com> wrote:
>Hi,
>
>GROMACS is apparently the first to notice that memory is a problem, but
>you should also be directing questions about memory use with different
>kinds of CVs to the PLUMED people. mdrun knows nothing at all about the
>PLUMED CVs. The most likely explanation is that they have some data
>structure that works OK on small-scale problems, but which doesn't do
>well as the number of atoms, CVs, CV complexity, and/or ranks increases.
>
>Mark
>
>On Tue, Nov 17, 2015 at 11:05 PM Nash, Anthony <a.nash at ucl.ac.uk> wrote:
>
>> Hi all,
>>
>> I am using PLUMED 2.2 and GROMACS 5.0.4. For a while I had been testing
>> the viability of three collective variables for PLUMED: two dihedral
>> angles and one centre-of-mass distance. After observing the helices of
>> my dimer rotate about each other, I decided I needed an intrahelical
>> distance between two of the dihedral atoms (A,B,C,D), per helix, to
>> sample the CV space whilst maintaining the 'regular' alpha-helical
>> structure (the dihedral sampling was coming from the protein uncoiling
>> rather than rotating). Note: it is likely that I will change these
>> distances to the built-in alpha-helical CV.
>>
>> The moment I increased the number of CVs from three to five, GROMACS
>> threw a memory error. When I remove just the 5th CV it still crashes;
>> when I remove the 4th as well it stops crashing.
>>
>> ---------------------------
>> CLUSTER OUTPUT FILE
>> ---------------------------
>>
>>
>> starting mdrun 'NEU_MUT in POPC in water'
>> 50000000 steps, 100000.0 ps.
>>
>> -------------------------------------------------------
>> Program: gmx mdrun, VERSION 5.0.4
>>
>> Memory allocation failed:
>> std::bad_alloc
>>
>> For more information and tips for troubleshooting, please check the
>> GROMACS website at http://www.gromacs.org/Documentation/Errors
>> -------------------------------------------------------
>> Halting parallel program mdrun_mpi_d on CPU 0 out of 12
>>
>> It halts all 12 processes and the job dies. I increased the memory so I
>> am now using 43.2 GB of RAM distributed over the 12 processes. The job
>> still fails (but then, failing to allocate memory is very different from
>> not having any memory at all).
>>
>> The gromacs.log file only reports the initialisation of GROMACS followed
>> by the initialisation of PLUMED; after this I would have expected the
>> regular MD stepping output. I've included the PLUMED initialisation
>> below, and would appreciate any help. I suspect the problem lies with
>> the 4th and 5th CVs, although systematically removing them and playing
>> around with the parameters hasn't yielded anything yet. Please ignore
>> the actual parameter values I have set: apart from the atom numbers,
>> everything else is the result of me trying to work out which ranges of
>> values cause PLUMED to exit and GROMACS to crash. PLUMED input file
>> below:
>>
>>
>> ---------------------------
>> PLUMED INPUT FILE
>> ---------------------------
>>
>> phi: TORSION ATOMS=214,230,938,922
>> psi: TORSION ATOMS=785,801,367,351
>>
>> c1: COM ATOMS=1-571
>> c2: COM ATOMS=572-1142
>> COMdist: DISTANCE ATOMS=c1,c2
>>
>> d1: DISTANCE ATOMS=214,367
>> d2: DISTANCE ATOMS=938,785
>>
>> UPPER_WALLS ARG=COMdist AT=2.5 KAPPA=1000 EXP=2.0 EPS=1.0 OFFSET=0
>> LABEL=COMuwall
>> LOWER_WALLS ARG=COMdist AT=1.38 KAPPA=1000 EXP=2.0 EPS=1.0 OFFSET=0
>> LABEL=COMlwall
>>
>> UPPER_WALLS ARG=d1 AT=1.260 KAPPA=1000 EXP=2.0 EPS=1.0 OFFSET=0
>> LABEL=d1uwall
>> LOWER_WALLS ARG=d1 AT=1.228 KAPPA=1000 EXP=2.0 EPS=1.0 OFFSET=0
>> LABEL=d1lwall
>>
>> UPPER_WALLS ARG=d2 AT=1.228 KAPPA=1000 EXP=2.0 EPS=1.0 OFFSET=0
>> LABEL=d2uwall
>> LOWER_WALLS ARG=d2 AT=1.196 KAPPA=1000 EXP=2.0 EPS=1.0 OFFSET=0
>> LABEL=d2lwall
>>
>> METAD ...
>> LABEL=metad
>> ARG=phi,psi,COMdist,d1,d2
>> PACE=1
>> HEIGHT=0.2
>> SIGMA=0.06,0.06,0.06,0.06,0.06
>> FILE=HILLS_neu_mut_meta_A
>> BIASFACTOR=10.0
>> TEMP=310.0
>> GRID_MIN=-pi,-pi,0,0,0
>> GRID_MAX=pi,pi,2.5,2.5,2.5
>> GRID_SPACING=0.01,0.01,0.01,0.01,0.01
>> ... METAD
>>
>>
>> PRINT STRIDE=100 ARG=phi,psi,COMdist,COMlwall.bias,COMuwall.bias,d1,d1lwall.bias,d1uwall.bias,d2,d2lwall.bias,d2uwall.bias,metad.bias FILE=COLVAR_neu_mut_meta_A
>>
>>
>>
>> ---------------------------
>> GROMACS LOGFILE
>> ---------------------------
>>
>> Center of mass motion removal mode is Linear
>> We have the following groups for center of mass motion removal:
>> 0: rest
>> There are: 53575 Atoms
>> Charge group distribution at step 0: 4444 4474 4439 4268 4913 4471 4298
>> 4519 4395 4584 4474 4296
>> Initial temperature: 311.436 K
>>
>>
>> PLUMED: PLUMED is starting
>> PLUMED: Version: 2.2.0 (git: Unknown) compiled on Nov 6 2015 at 11:15:41
>> PLUMED: Please cite this paper when using PLUMED [1]
>> PLUMED: For further information see the PLUMED web page at
>> http://www.plumed-code.org
>> PLUMED: Molecular dynamics engine: gromacs
>> PLUMED: Precision of reals: 8
>> PLUMED: Running over 12 nodes
>> PLUMED: Number of threads: 1
>> PLUMED: Cache line size: 512
>> PLUMED: Number of atoms: 53575
>> PLUMED: File suffix:
>> PLUMED: FILE: neu_mut_meta_A.dat
>> PLUMED: Action TORSION
>> PLUMED: with label phi
>> PLUMED: between atoms 214 230 938 922
>> PLUMED: using periodic boundary conditions
>> PLUMED: Action TORSION
>> PLUMED: with label psi
>> PLUMED: between atoms 785 801 367 351
>> PLUMED: using periodic boundary conditions
>> PLUMED: Action COM
>> PLUMED: with label c1
>> PLUMED: serial associated to this virtual atom is 53576
>>
>> <ATOMS FROM A RANGE - I have removed for clarity>
>>
>> PLUMED: PBC will be ignored
>> PLUMED: Action COM
>> PLUMED: with label c2
>> PLUMED: serial associated to this virtual atom is 53577
>>
>> <ATOMS FROM A RANGE - I have removed for clarity>
>> PLUMED: PBC will be ignored
>> PLUMED: Action DISTANCE
>> PLUMED: with label COMdist
>> PLUMED: between atoms 53576 53577
>> PLUMED: using periodic boundary conditions
>> PLUMED: Action DISTANCE
>> PLUMED: with label d1
>> PLUMED: between atoms 214 367
>> PLUMED: using periodic boundary conditions
>> PLUMED: Action DISTANCE
>> PLUMED: with label d2
>> PLUMED: between atoms 938 785
>> PLUMED: using periodic boundary conditions
>>
>> PLUMED: Action UPPER_WALLS
>> PLUMED: with label COMuwall
>> PLUMED: with stride 1
>> PLUMED: with arguments COMdist
>> PLUMED: at 2.500000
>> PLUMED: with an offset 0.000000
>> PLUMED: with force constant 1000.000000
>> PLUMED: and exponent 2.000000
>> PLUMED: rescaled 1.000000
>> PLUMED: added component to this action: COMuwall.bias
>> PLUMED: added component to this action: COMuwall.force2
>> PLUMED: Action LOWER_WALLS
>> PLUMED: with label COMlwall
>> PLUMED: with stride 1
>> PLUMED: with arguments COMdist
>> PLUMED: at 1.380000
>> PLUMED: with an offset 0.000000
>> PLUMED: with force constant 1000.000000
>> PLUMED: and exponent 2.000000
>> PLUMED: rescaled 1.000000
>> PLUMED: added component to this action: COMlwall.bias
>> PLUMED: added component to this action: COMlwall.force2
>> PLUMED: Action UPPER_WALLS
>> PLUMED: with label d1uwall
>> PLUMED: with stride 1
>> PLUMED: with arguments d1
>> PLUMED: at 1.260000
>> PLUMED: with an offset 0.000000
>> PLUMED: with force constant 1000.000000
>> PLUMED: and exponent 2.000000
>> PLUMED: rescaled 1.000000
>> PLUMED: added component to this action: d1uwall.bias
>> PLUMED: added component to this action: d1uwall.force2
>> PLUMED: Action LOWER_WALLS
>> PLUMED: with label d1lwall
>> PLUMED: with stride 1
>> PLUMED: with arguments d1
>> PLUMED: at 1.228000
>> PLUMED: with an offset 0.000000
>> PLUMED: with force constant 1000.000000
>> PLUMED: and exponent 2.000000
>> PLUMED: rescaled 1.000000
>> PLUMED: added component to this action: d1lwall.bias
>> PLUMED: added component to this action: d1lwall.force2
>>
>> PLUMED: Action UPPER_WALLS
>> PLUMED: with label d2uwall
>> PLUMED: with stride 1
>> PLUMED: with arguments d2
>> PLUMED: at 1.228000
>> PLUMED: with an offset 0.000000
>> PLUMED: with force constant 1000.000000
>> PLUMED: and exponent 2.000000
>> PLUMED: rescaled 1.000000
>> PLUMED: added component to this action: d2uwall.bias
>> PLUMED: added component to this action: d2uwall.force2
>> PLUMED: Action LOWER_WALLS
>> PLUMED: with label d2lwall
>> PLUMED: with stride 1
>> PLUMED: with arguments d2
>> PLUMED: at 1.196000
>> PLUMED: with an offset 0.000000
>> PLUMED: with force constant 1000.000000
>> PLUMED: and exponent 2.000000
>> PLUMED: rescaled 1.000000
>> PLUMED: added component to this action: d2lwall.bias
>> PLUMED: added component to this action: d2lwall.force2
>> PLUMED: Action METAD
>> PLUMED: with label metad
>> PLUMED: with stride 1
>> PLUMED: with arguments phi psi COMdist d1 d2
>> PLUMED: The number of bins will be estimated from GRID_SPACING
>> PLUMED: Gaussian width 0.060000 0.060000 0.060000 0.060000 0.060000 Gaussian height 0.200000
>> PLUMED: Gaussian deposition pace 1
>> PLUMED: Gaussian file HILLS_neu_mut_meta_A
>> PLUMED: Well-Tempered Bias Factor 10.000000
>> PLUMED: Hills relaxation time (tau) 0.231973
>> PLUMED: KbT 2.577483
>> PLUMED: Grid min -pi -pi 0 0 0
>> PLUMED: Grid max pi pi 2.5 2.5 2.5
>> PLUMED: Grid bin 629 629 251 251 251
>> PLUMED: Grid uses spline interpolation
>> PLUMED: added component to this action: metad.bias
>> PLUMED: added component to this action: metad.work
>>
>> Many thanks
>> Anthony
>>
>> Dr Anthony Nash
>> Department of Chemistry
>> University College London
>>