[gmx-users] FAQ: File too big
oliver at biop.ox.ac.uk
Wed May 21 22:23:01 CEST 2003
I wondered if anyone has finally found out how to fix the annoying
'File too big' problem (the latest post was
). David suggests the workaround of accessing the file through NFS
(which works for me), but that is not really what I want to do with a
few times 3+ GB of data.
Splitting the xtc is not a good solution as I need a continuous time
series (perhaps one could allow the analysis tools to read multiple
xtc's which they concatenate internally?).
For the archive:
The problem manifests itself as an error message when any of the
analysis tools (including gmxcheck) refuses to read a file which is
clearly there (and can be manipulated with standard Unix tools: cat,
cp, split, hexdump, ...).
ls -l big.xtc:
-rw-r--r-- 1 oliver oliver 2735936052 May 21 17:48 big.xtc
strace gmxcheck -f big.xtc:
open("big.xtc", O_RDONLY) = -1 EFBIG (File too large)
However, I was under the impression that my kernel, Linux
2.4.18-27.8.0smp (RedHat vanilla), has full large file support, and
configure's output looks promising:
checking for special C compiler options needed for large files... no
checking for _FILE_OFFSET_BITS value needed for large files... 64
checking for _LARGE_FILES value needed for large files... no
checking for _LARGEFILE_SOURCE value needed for large files... 1
Perhaps someone out there knows a simple solution?
Oliver Beckstein * oliver at bioch.ox.ac.uk