file too large

Ketil Z. Malde <ketil@ii.uib.no>
09 May 2003 12:05:19 +0200


"Simon Marlow" <simonmar@microsoft.com> writes:

> That's probably a bug, as long as the underlying OS uses 64-bit file
> offsets.  How does the 2Gb limit manifest itself?

My program dumps a bunch of data to a file.  The file grows, I leave,
and the next morning I'm greeted with:

  % time ./xsact -k 25 -n 64 -L -p 3 -x ~/data/ug-250.seq > ug-250.L
  zsh: file size limit exceeded  ./xsact -k 25 -n 64 -L -p 3 -x ~/data/ug-250.seq > ug-250.L
  ./xsact -k 25 -n 64 -L -p 3 -x ~/data/ug-250.seq > ug-250.L  2556.40s user 39.50s system 96% cpu 44:58.50 total

I can still do 

  % echo "foo" >> ug-250.L

and the file grows a bit, without error.  Here's a quick test case
(quick apart from the dd, that is):

        % dd if=/dev/zero of=big_file bs=1M count=3000    
        % du -sh big_file  
        3.0G	big_file        
        % ghci
        Prelude> x <- readFile "big_file"
        *** Exception: permission denied
        Action: openFile
        Reason: File too large
        File: big_file
        Prelude> writeFile "big_file" "foo!"
        *** Exception: permission denied
        Action: openFile
        Reason: File too large
        File: big_file
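
The same thing presumably happens from a compiled program, not just
from ghci; here's a minimal sketch I'd expect to fail the same way,
assuming the 3G big_file created by the dd above:

        -- open-large.hs: try to read one character from the 3G file
        import System.IO

        main :: IO ()
        main = do
          h <- openFile "big_file" ReadMode  -- where I'd expect "File too large"
          c <- hGetChar h
          print c
          hClose h

Compiled with ghc and run in the same directory, I'd expect the same
openFile exception as in ghci.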

All the standard Unix tools seem to handle the file fine.  This is
Red Hat 8.0, with GHC 5.04.2 from RPMs.
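
For what it's worth, stat(2) is documented to fail with EOVERFLOW on
files whose size doesn't fit in a 32-bit off_t, so something like the
following (using getFileStatus from the unix package; I haven't tried
it) might show whether the problem is limited to openFile or affects
the file API more generally:

        -- stat-large.hs: does stat() on the 3G file work from Haskell?
        import System.Posix.Files (getFileStatus, fileSize)

        main :: IO ()
        main = do
          st <- getFileStatus "big_file"  -- may fail without large-file support
          print (fileSize st)             -- 3145728000 if 64-bit offsets work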

-kzm
-- 
If I haven't seen further, it is by standing in the footprints of giants