[gmx-users] Cannot write trajectory frame; maybe you are out of disk space
Mark Abraham
mark.j.abraham at gmail.com
Mon Apr 21 16:23:26 CEST 2014
If you can rule out file-permission issues (because you wrote files
earlier) and actually running out of disk, the usual remaining suspect is a
flaky network file system.
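A quick way to check each of those, run from the directory mdrun is
writing to (a sketch; the test file name is arbitrary):

  df -h .                              # free space on the filesystem actually holding the output
  df -i .                              # inode usage; "out of space" can also mean no free inodes
  ls -ld .                             # write permission on the output directory
  touch _writetest && rm _writetest    # confirm the directory is writable right now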
Mark
On Mon, Apr 21, 2014 at 4:09 PM, Andrew Bostick
<andrew.bostick1 at gmail.com> wrote:
> Dear gromacs users
>
> I am running an MD simulation of a protein with 190 residues.
>
> I completed minimization and equilibration without any problems or errors.
>
> But during the production run (the final MD stage, 15000000 steps),
> at step 6500000 I encountered the following error:
>
> File input/output error:
> Cannot write trajectory frame; maybe you are out of disk space.
>
> my mdp file is as follows:
>
> -----------------------------------------------------------------------------------------
> title = opls Protein MD
> ; Run parameters
> integrator = md ; leap-frog integrator
> nsteps = 15000000 ; 0.002 * 15000000 = 30000 ps = 30 ns
> dt = 0.002 ; 2 fs
> ; Output control
> nstxout = 3000 ; save coordinates every 6 ps
> nstvout = 3000 ; save velocities every 6 ps
> nstxtcout = 3000 ; xtc compressed trajectory output every 6 ps
> nstenergy = 3000 ; save energies every 6 ps
> nstlog = 3000 ; update log file every 6 ps
> energygrps = Protein Ion
> ; Bond parameters
> continuation = yes ; Restarting after NPT
> constraint_algorithm = lincs ; holonomic constraints
> constraints = all-bonds ; all bonds (even heavy atom-H bonds) constrained
> lincs_iter = 1 ; accuracy of LINCS
> lincs_order = 4 ; also related to accuracy
> ; Neighborsearching
> ns_type = grid ; search neighboring grid cells
> nstlist = 5 ; 10 fs
> rlist = 1.0 ; short-range neighborlist cutoff (in nm)
> rcoulomb = 1.0 ; short-range electrostatic cutoff (in nm)
> rvdw = 1.0 ; short-range van der Waals cutoff (in nm)
> ; Electrostatics
> coulombtype = PME ; Particle Mesh Ewald for long-range electrostatics
> pme_order = 4 ; cubic interpolation
> fourierspacing = 0.16 ; grid spacing for FFT
> ; Temperature coupling is on
> tcoupl = V-rescale ; modified Berendsen thermostat
> tc-grps = Protein Non-Protein ; two coupling groups - more accurate
> tau_t = 0.1 0.1 ; time constant, in ps
> ref_t = 300 300 ; reference temperature, one for each group, in K
> ; Pressure coupling is on
> pcoupl = Parrinello-Rahman ; Pressure coupling on in NPT
> pcoupltype = isotropic ; uniform scaling of box vectors
> tau_p = 2.0 ; time constant, in ps
> ref_p = 1.0 ; reference pressure, in bar
> compressibility = 4.5e-5 ; isothermal compressibility of water, bar^-1
> ; Periodic boundary conditions
> pbc = xyz ; 3-D PBC
> ; Dispersion correction
> DispCorr = EnerPres ; account for cut-off vdW scheme
> ; Velocity generation
> gen_vel = no ; Velocity generation is off
>
> -----------------------------------------------------------------------------------------
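For what it's worth, these output settings are unlikely to fill a disk of
this size. A back-of-the-envelope estimate, assuming a solvated system of
roughly 50000 atoms (a placeholder figure; the real count is in your log
file):

  frames    = 15000000 steps / 3000 steps per frame = 5000 frames
  .trr size ~ 5000 frames * 50000 atoms * 24 bytes/atom (x + v, single precision)
            ~ 6 GB
  .xtc adds only a fraction of that, since it is compressed.

That total is far below the free space reported below, which again points
away from a genuinely full disk.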
>
> If I use the df -h command to check disk space, I get the following
> output:
>
> Filesystem Size Used Avail Use% Mounted on
> /dev/sda3 220G 14G 196G 7% /
> none 4.0K 0 4.0K 0% /sys/fs/cgroup
> udev 3.0G 4.0K 3.0G 1% /dev
> tmpfs 607M 1.3M 606M 1% /run
> none 5.0M 0 5.0M 0% /run/lock
> none 3.0G 152K 3.0G 1% /run/shm
> none 100M 64K 100M 1% /run/user
> /dev/sda6 245G 61G 184G 25% /media/pdfco/my dear
> /dev/sda2 220G 85G 136G 39% /media/pdfco/Windows
> /dev/sda5 245G 43G 202G 18% /media/pdfco/nazif
>
> ------------------------------------------------------------------------------------------
>
> Is my computer system suitable for this MD simulation?
>
> How can I solve this error?
>
> Any help will be highly appreciated.
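One practical note on recovery (a sketch, assuming GROMACS 4.x and the
default file names topol.tpr / state.cpt; substitute your own): once the
underlying write failure is fixed, the run can be continued from the last
checkpoint rather than restarted from step 0:

  mdrun -s topol.tpr -cpi state.cpt

By default mdrun appends to the existing output files when continuing from
a checkpoint, so the 6500000 steps already completed are not lost.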