[gmx-users] Issues with huge system

Justin Lemkul jalemkul at vt.edu
Sat Nov 5 15:44:13 CET 2016



On 11/4/16 5:09 PM, Kevin Chen wrote:
> Hi Gromacs Users,
>
>
>
> We've observed some issues while creating huge simulation boxes (over 35
> nm * 35 nm * 35 nm in size). The commands used to build our benchmarks
> are as follows:
>
> gmx pdb2gmx -f 1AKI.pdb -o 1AKI_processed.gro -water spce
>
> gmx editconf -f 1AKI_processed.gro -o 1AKI_newbox.gro -c -box 33 33 33 -bt
> cubic
>
> gmx solvate -cp 1AKI_newbox.gro -cs spc216.gro -o 1AKI_solv.gro -p topol.top
>
> gmx grompp -f ions.mdp -c 1AKI_solv.gro -p topol.top -o ions.tpr
>
> gmx genion -s ions.tpr -o 1AKI_solv_ions.gro -p topol.top -pname NA -nname
> CL -nn 8
>
> gmx grompp -f minim.mdp -c 1AKI_solv_ions.gro -p topol.top -o em.tpr
>
> The problem only shows up with box sizes greater than 30 nm * 30 nm * 30
> nm. Once the box is bigger than that, the simulation always crashes
> during the EM stage, while everything runs fine with any smaller box. As
> such, we are eager to know whether we did something wrong or whether
> GROMACS has some issue handling large systems. Any suggestions and
> pointers are welcome!
>

Crashes with what error?  Are you simply running out of memory?
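For scale, a box that size holds on the order of millions of atoms, which makes memory exhaustion plausible. A rough back-of-the-envelope check (a sketch assuming a pure-water box at roughly 33 water molecules per nm^3; the protein's contribution is negligible at this scale):

```shell
# Estimate system size for a cubic water box (assumption: ~33 waters/nm^3).
BOX=33                               # box edge length in nm
NWATER=$((BOX * BOX * BOX * 33))     # approx. number of water molecules
NATOMS=$((NWATER * 3))               # SPC/E water has 3 atoms per molecule
echo "approx. waters: $NWATER, atoms: $NATOMS"
```

For a 33 nm box this lands in the range of several million atoms, so it is worth watching the machine's memory while grompp and mdrun are running.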

-Justin

-- 
==================================================

Justin A. Lemkul, Ph.D.
Ruth L. Kirschstein NRSA Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 629
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalemkul at outerbanks.umaryland.edu | (410) 706-7441
http://mackerell.umaryland.edu/~jalemkul

==================================================
