[gmx-users] grompp is using a very large amount of memory on a modestly-sized system

Sean Marks semarks at seas.upenn.edu
Mon Mar 11 19:21:39 CET 2019


Hi, Mark,

I'd be happy to, as soon as I get a chance.

I know very little about how GROMACS works internally, but I had a few
ideas I wanted to share in the hope that they might help. One is that
pairwise parameters for electrostatics could be specified and stored in the
same way that pairwise LJ parameters are. That would provide a tremendous
amount of flexibility for use cases far beyond my own. Alternatively, there
could simply be an FEP flag for scaling interactions only between molecule
types, rather than also between molecules of a given type. Again, you guys
are the experts and I know you have other priorities. Just my thoughts.
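
For reference, the existing LJ mechanism referred to above is the
[ nonbond_params ] section of the force field, where any pair of atom types
can be given its own LJ parameters; something analogous for pairwise
electrostatic scaling is what I have in mind. The atom type names and values
below are purely illustrative:

[ nonbond_params ]
;  i       j        func   sigma (nm)   epsilon (kJ/mol)
   OW_ice  OW_wat   1      0.31668      0.88211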

Best,
Sean

On Fri, Mar 8, 2019 at 3:44 PM Mark Abraham <mark.j.abraham at gmail.com>
wrote:

> Hi,
>
> I don't have a solution for the question at hand, but it would be great to
> have your inputs attached to a new issue at https://redmine.gromacs.org, so
> that we have an input case to test with and can improve the simplistic
> implementation! Please upload it if you can.
>
> Mark
>
> On Fri., 8 Mar. 2019, 19:24 Sean Marks, <semarks at seas.upenn.edu> wrote:
>
> > Scratch that comment about sparseness. I am short on sleep, and for a
> > moment thought I was talking about constraints, not electrostatics.
> >
> > On Fri, Mar 8, 2019 at 1:12 PM Sean Marks <semarks at seas.upenn.edu> wrote:
> >
> > > I understand now, thank you for the prompt response. While the matrix
> > > would actually be quite sparse (since the constraints are localized to
> > > each ice molecule), I take it that memory is being allocated for a dense
> > > matrix.
> > >
> > > That aside, is it feasible to accomplish my stated goal of scaling
> > > ice-water electrostatics while leaving other interactions unaffected? One
> > > alternative I considered was manually scaling down the charges
> > > themselves, but doing this causes the lattice to lose its form.
> > >
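> > > (For context, the FEP machinery in question is the mdp-level coupling
> > > options. A minimal sketch of that kind of setup, with illustrative values
> > > rather than the actual input, might be:
> > >
> > > free-energy      = yes
> > > couple-moltype   = ICE      ; the merged ice moleculetype
> > > couple-lambda0   = vdw-q    ; full interactions at lambda = 0
> > > couple-lambda1   = vdw      ; electrostatics switched off at lambda = 1
> > > couple-intramol  = no       ; leave ice-ice interactions untouched
> > > init-lambda      = 0.5      ; or init-lambda-state with a fep-lambdas vector
> > >
> > > With these settings, only the ICE molecules' interactions with the rest of
> > > the system are scaled between the two lambda end states.)
> > >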
> > > On Fri, Mar 8, 2019 at 12:28 PM Justin Lemkul <jalemkul at vt.edu> wrote:
> > >
> > >>
> > >>
> > >> On 3/8/19 11:04 AM, Sean Marks wrote:
> > >> > Hi, everyone,
> > >> >
> > >> > I am running into an issue where grompp is using a tremendous amount
> > >> > of memory and crashing, even though my system is not especially large
> > >> > (63,976 atoms).
> > >> >
> > >> > I am using GROMACS 2016.3.
> > >> >
> > >> > My system consists of liquid water (7,930 molecules) next to a block of
> > >> > ice (8,094 molecules). The ice oxygens are restrained to their lattice
> > >> > positions with a harmonic potential of strength k = 4,000 kJ/mol/nm^2.
> > >> > I am using the TIP4P/Ice model, which is a rigid 4-site model with a
> > >> > negative partial charge located on a virtual site rather than on the
> > >> > oxygen.
> > >> >
> > >> > My goal is to systematically reduce the electrostatic interactions
> > >> > between the water molecules and the position-restrained ice, while
> > >> > leaving water-water and ice-ice interactions unaffected.
> > >> >
> > >> > To accomplish this, I am modeling all of the ice molecules as a single
> > >> > moleculetype so that I can take advantage of GROMACS' FEP features to
> > >> > selectively scale interactions. I explicitly specify all constraints and
> > >> > exclusions in the topology file. This moleculetype contains one virtual
> > >> > site, 3 constraints, and 4 exclusions per "residue" (ice molecule).
> > >> >
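> > >> > (To make that layout concrete, the per-residue entries of such a merged
> > >> > moleculetype would look roughly like the sketch below, with the four
> > >> > sites of the first ice molecule numbered 1-4 (OW, HW1, HW2, MW). The
> > >> > distances and vsite coefficients are placeholders, not the actual
> > >> > topology.
> > >> >
> > >> > [ moleculetype ]
> > >> > ; name   nrexcl
> > >> >   ICE    3
> > >> >
> > >> > [ constraints ]
> > >> > ; ai  aj  funct  length (nm)
> > >> >    1   2  1      0.09572
> > >> >    1   3  1      0.09572
> > >> >    2   3  1      0.15139
> > >> >
> > >> > [ virtual_sites3 ]
> > >> > ; site  i  j  k   funct  a        b
> > >> >    4    1  2  3   1      0.13458  0.13458
> > >> >
> > >> > [ exclusions ]
> > >> > 1  2  3  4
> > >> > 2  1  3  4
> > >> > 3  1  2  4
> > >> > 4  1  2  3
> > >> >
> > >> > The same pattern repeats, with shifted indices, for each of the 8,094
> > >> > ice molecules in the single ICE moleculetype.)
> > >> >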
> > >> > When I run grompp, I receive the following error, which I think means
> > >> > that a huge block of memory (~9 GB) was requested but could not be
> > >> > allocated:
> > >> >
> > >> > =====
> > >> > Command line:
> > >> >    gmx grompp -f npt.mdp -c md.gro -p topol.top -n index.ndx -r
> > >> > initconf_packmol.gro -o input.tpr -maxwarn 2 -pp processed.top
> > >> >
> > >> > ...
> > >> >
> > >> > Generated 21 of the 21 non-bonded parameter combinations
> > >> > Generating 1-4 interactions: fudge = 0.5
> > >> > Generated 21 of the 21 1-4 parameter combinations
> > >> > Excluding 3 bonded neighbours molecule type 'ICE'
> > >> > turning H bonds into constraints...
> > >> > Excluding 3 bonded neighbours molecule type 'SOL'
> > >> > turning H bonds into constraints...
> > >> > Coupling 1 copies of molecule type 'ICE'
> > >> > Setting gen_seed to 1021640799
> > >> > Velocities were taken from a Maxwell distribution at 273 K
> > >> > Cleaning up constraints and constant bonded interactions with virtual sites
> > >> > Removing all charge groups because cutoff-scheme=Verlet
> > >> >
> > >> > -------------------------------------------------------
> > >> > Program:     gmx grompp, version 2016.3
> > >> > Source file: src/gromacs/utility/smalloc.cpp (line 226)
> > >> >
> > >> > Fatal error:
> > >> > Not enough memory. Failed to realloc -8589934588 bytes for
> > >> > il->iatoms, il->iatoms=25e55010
> > >> > (called from file
> > >> > /home/semarks/source/gromacs/2016.3/icc/serial/gromacs-2016.3/src/gromacs/gmxpreprocess/convparm.cpp,
> > >> > line 565)
> > >> >
> > >> > For more information and tips for troubleshooting, please check the
> > >> > GROMACS website at http://www.gromacs.org/Documentation/Errors
> > >> > -------------------------------------------------------
> > >> > =====
> > >> >
> > >> > In the hope that it helps with diagnosing the problem, here is my mdp
> > >> > file:
> > >>
> > >> The problem is this:
> > >> > couple-intramol = no    ; don't adjust ice-ice interactions
> > >> >
> > >> This setting causes the creation of a large exclusion matrix, which in
> > >> your case is approximately 32,376 x 32,376 elements. For small molecules
> > >> this generally isn't an issue, but because you are coupling a large
> > >> number of molecules merged into one huge [moleculetype], the memory
> > >> requirement grows quadratically with the number of atoms in that
> > >> moleculetype.
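> > >>
> > >> (As a rough check on the size: a single ICE moleculetype with 8,094 x 4 =
> > >> 32,376 atoms has about 32,376 x 32,375 / 2 ~ 5.2 x 10^8 intramolecular
> > >> atom pairs, so storing even a few integers per pair already amounts to
> > >> several gigabytes. That is consistent with the ~9 GB request in the error
> > >> above; the negative byte count reported there looks like a signed-integer
> > >> overflow of that oversized request.)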
> > >>
> > >> -Justin
> > >>
> > >> --
> > >> ==================================================
> > >>
> > >> Justin A. Lemkul, Ph.D.
> > >> Assistant Professor
> > >> Office: 301 Fralin Hall
> > >> Lab: 303 Engel Hall
> > >>
> > >> Virginia Tech Department of Biochemistry
> > >> 340 West Campus Dr.
> > >> Blacksburg, VA 24061
> > >>
> > >> jalemkul at vt.edu | (540) 231-3129
> > >> http://www.thelemkullab.com
> > >>
> > >> ==================================================
> > >>
> > >
> > >


-- 
Sean M. Marks
PhD Candidate
Dept. of Chemical and Biomolecular Engineering
University of Pennsylvania
SeanMarks1123 at gmail.com

