[Haskell-cafe] Iteratees again

Brandon Allbery allbery.b at gmail.com
Fri Jun 3 18:09:41 CEST 2011

On Fri, Jun 3, 2011 at 03:53, Ketil Malde <ketil at malde.org> wrote:
> dm-list-haskell-cafe at scs.stanford.edu writes:
>> leaking file descriptors
> ...until they are garbage collected.  I tend to consider the OS fd
> limitation an OS design error - I've no idea why there should be some
> arbitrary limit on open files, as long as there is plenty of memory
> around to store them.  But, well, yes, it is a real concern.

In the case of Unix, available memory was indeed the motivating
factor.  The DEC minicomputers it was developed on didn't have a whole
lot of memory, plus older Unix preallocated the per-process file
table as part of the (struct proc) for speed (again, old slow
hardware).

The modern reason for limits is mostly to avoid runaway processes:
the hard limit is usually set fairly high, while the soft limit is
lower and can be raised by the process itself, up to the hard limit.
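(For what it's worth, you can inspect both limits from Haskell with the `unix` package's `System.Posix.Resource`, which wraps getrlimit(2). A small sketch:)

```haskell
import System.Posix.Resource
  (Resource (ResourceOpenFiles), ResourceLimit (..),
   ResourceLimits (..), getResourceLimit)

-- Print the current soft and hard limits on open file descriptors.
showFdLimits :: IO ()
showFdLimits = do
  ResourceLimits soft hard <- getResourceLimit ResourceOpenFiles
  putStrLn $ "soft: " ++ render soft ++ ", hard: " ++ render hard
  where
    render (ResourceLimit n)     = show n
    render ResourceLimitInfinity = "unlimited"
    render ResourceLimitUnknown  = "unknown"
```

(`setResourceLimit` is the corresponding setter if a process wants to raise its soft limit toward the hard one.)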
