file too large
Ketil Z. Malde
ketil@ii.uib.no
12 May 2003 14:08:58 +0200
ketil@ii.uib.no (Ketil Z. Malde) writes:
> "Simon Marlow" <simonmar@microsoft.com> writes:
>> That's probably a bug, as long as the underlying OS uses 64-bit file
>> offsets. How does the 2Gb limit manifest itself?
On Solaris, the file grows until it reaches 2 GB, and then:
Fail: permission denied
Action: commitAndReleaseBuffer
Handle: {loc=/Home/stip/ketil/data/ug-250.seq.tmp_M,type=writable,binary=False,buffering=block (1024)}
Reason: File too large
File: /Home/stip/ketil/data/ug-250.seq.tmp_M
This file can be read from GHCi, but after I append "foo" to it and
try it again, I get:
Prelude> x <- readFile "/Home/stip/ketil/data/ug-250.seq.tmp_M"
*** Exception: failed
Action: openFile
Reason: Value too large for defined data type
File: /Home/stip/ketil/data/ug-250.seq.tmp_M
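For what it's worth, "Value too large for defined data type" looks like plain
errno EOVERFLOW from an open(2) call made without O_LARGEFILE.  A minimal C
sketch (my own guess at the failure mode, not anything from GHC's sources)
that should show the same failure on the already-too-large file:

  #include <stdio.h>
  #include <fcntl.h>
  #include <unistd.h>

  int main(void)
  {
      const char *path = "/Home/stip/ketil/data/ug-250.seq.tmp_M";
      int fd = open(path, O_RDONLY);   /* note: no O_LARGEFILE */
      if (fd < 0) {
          perror("open");              /* expect EOVERFLOW:
                                          "Value too large for defined data type" */
          return 1;
      }
      close(fd);
      return 0;
  }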
From the manual page for open(2):
O_LARGEFILE
        If set, the offset maximum in the open file description is
        the largest value that can be represented correctly in an
        object of type off64_t.
This is with SunOS 5.8.  It seems to behave quite similarly to Linux
in this respect.
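Just to illustrate the flag the man page describes, here is a corresponding
sketch (again my own; I'm assuming _LARGEFILE64_SOURCE is the right
feature-test macro to expose O_LARGEFILE) where the same file can be opened
and appended to once the flag is passed.  Presumably GHC's runtime needs to
do the equivalent, or be built with -D_FILE_OFFSET_BITS=64:

  #define _LARGEFILE64_SOURCE 1  /* for O_LARGEFILE on Linux; Solaris has it too */
  #include <stdio.h>
  #include <fcntl.h>
  #include <unistd.h>

  int main(void)
  {
      const char *path = "/Home/stip/ketil/data/ug-250.seq.tmp_M";
      int fd = open(path, O_WRONLY | O_APPEND | O_LARGEFILE);
      if (fd < 0) {
          perror("open with O_LARGEFILE");
          return 1;
      }
      /* the append lands past the 2 GB mark, which would otherwise
         fail with EFBIG ("File too large") */
      if (write(fd, "foo\n", 4) != 4)
          perror("write");
      close(fd);
      return 0;
  }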
-kzm
--
If I haven't seen further, it is by standing in the footprints of giants