closing file handles
Sun, 09 Sep 2001 12:23:43 -0700
I'm sorry, I had a small error in my code. It should have been:
>go filelist = mapM go' filelist          -- no "do return"
>  where go' f = do h <- openFile f ReadMode
>                   text <- hGetContents h
>                   let c = head $! text
>                   c `seq` hClose h      -- run hClose as an action; seq alone never executes it
>                   return c
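Worth spelling out: `seq` only evaluates its first argument to weak head normal form, and for an IO action like `hClose h` that means the action value is built but never run, so the original `hClose h `seq` c` below leaves every handle open. A minimal sketch of the same first-character traversal, using `bracket` from Control.Exception (an addition here, not part of the original post), which forces the read while the handle is still open:

```haskell
import System.IO
import Control.Exception (bracket)

-- Read the first character of each file, closing every handle promptly.
-- bracket guarantees hClose runs even if an exception escapes, and
-- `c `seq` return c` forces the read before the handle is closed.
firstChars :: [FilePath] -> IO [Char]
firstChars = mapM firstChar
  where
    firstChar f =
      bracket (openFile f ReadMode) hClose $ \h -> do
        text <- hGetContents h
        let c = head text
        c `seq` return c
```

A side benefit: `bracket` also closes the handle if `head` throws on an empty file, which the `seq`-only version does not.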
Hal Daume wrote:
> I'm working with data spanning a lot of separate files and am having
> difficulty getting ghc to close the files. I've reduced my code to the
> following bit of code which should show my problem:
> >go filelist = do return (mapM go' filelist)
> >  where go' f = do h <- openFile f ReadMode
> >                   text <- hGetContents h
> >                   let c = head $! text
> >                   return (c `seq` (hClose h `seq` c))
> Basically, you give "go" a list of files and it gives you the first
> character of each.
> If I run this with 30-40 files it's okay, but when I run it with a
> list of 400 files I get:
> *** Exception: resource exhausted
> Action: openFile
> Reason: process file table full
> which doesn't make sense to me, since (as far as I can tell) it should
> be closing each of the handles. The yucky "c `seq` (hClose h `seq`
> c)" is my way of making sure it reads the first character before it
> closes the file, and closes the file before returning the character --
> if there is a better way, please let me know.
> If anyone can tell me what my problem is, that would be great!
> Also, along the same lines, are there any good papers on using Haskell
> with large amounts of data? I run into problems like this frequently,
> and a unified theory for dealing with them would be a big help. Any
> pointers would be appreciated.
> Hal Daume III
> "Computer science is no more about computers | firstname.lastname@example.org
> than astronomy is about telescopes." -Dijkstra | www.isi.edu/~hdaume
> Haskell mailing list
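As for a better way for this particular pattern: one character per file does not need hGetContents at all. A hedged sketch (assuming every file is non-empty) that reads strictly with hGetChar, so no lazy thunk ever holds the handle open:

```haskell
import System.IO

-- Strict alternative: hGetChar performs the read immediately, so there
-- is no unevaluated thunk keeping the handle alive and no seq tricks.
-- (Assumes each file contains at least one character.)
firstCharOf :: FilePath -> IO Char
firstCharOf f = do
  h <- openFile f ReadMode
  c <- hGetChar h   -- the character is read here and now
  hClose h          -- safe: nothing lazy still depends on h
  return c
```

Because nothing here is lazy, hClose runs immediately after the read, and mapping `firstCharOf` over 400 files never fills the process file table.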