closing file handles
Hal Daume
hdaume@ISI.EDU
Sun, 09 Sep 2001 11:23:35 -0700
I'm working with data spanning a lot of separate files and am having
difficulty getting GHC to close the files. I've reduced my code to the
following snippet, which should show my problem:
> go filelist = do return (mapM go' filelist)
>   where go' f = do h <- openFile f ReadMode
>                    text <- hGetContents h
>                    let c = head $! text
>                    return (c `seq` (hClose h `seq` c))
Basically, you give "go" a list of files and it gives you back the first
character from each of them.
If I run this with 30-40 files, it's okay. But when I run it with a
list of 400 files, I get:
*** Exception: resource exhausted
Action: openFile
Reason: process file table full
File:
/nfs/isd/marcu/RST-CORPUS-LDC/RSTtrees-WSJ-main-1.0/DATA/wsj_1355.out.dis
which doesn't make sense to me, since (as far as I can tell) it should
be closing each of the handles. The yucky ``c `seq` (hClose h `seq`
c)`` is my way of making sure that it reads the first character before
it closes the file, and then makes sure to close the file before
returning the character -- if there is a better way, please let me know.
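(For what it's worth, one likely culprit -- and this is just a sketch
of an alternative, not a verdict: `seq` only evaluates its argument to
weak head normal form, and evaluating an IO action like `hClose h` does
not *run* it, so the handles may never actually be closed. A strict
one-character read inside a bracketing combinator sidesteps both the
lazy `hGetContents` and the `seq` issue. The function name `firstChars`
and the file paths are made up for illustration:)

```haskell
import System.IO

-- Sketch: read the first character of each file strictly, closing
-- each handle before moving on. `withFile` guarantees the close even
-- if an exception is thrown; `hGetChar` performs the read immediately,
-- so nothing lazy keeps the handle alive.
firstChars :: [FilePath] -> IO [Char]
firstChars = mapM firstChar
  where
    firstChar f = withFile f ReadMode $ \h -> do
      c <- hGetChar h   -- strict read of exactly one character
      return c          -- withFile closes h before returning
```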
If anyone can tell me what my problem is, that would be great!
Also, along the same lines: are there any good papers on using Haskell
with large amounts of data? I frequently run into problems like this,
and if there were a unified theory behind dealing with them, that would
be great. Any pointers would be helpful.
Thanks!
--
Hal Daume III
"Computer science is no more about computers | hdaume@isi.edu
than astronomy is about telescopes." -Dijkstra | www.isi.edu/~hdaume