[Haskell-cafe] Haskell maximum stack depth

Adrian Hey ahey at iee.org
Mon Feb 4 14:23:09 EST 2008


Hello Simon,

Simon Peyton-Jones wrote:
> | Sorry, but if what you say is true then things are even worse than I
> | thought :-( This behaviour seems really bad to me, especially for
> | concurrent programs.
> 
> Which behaviour precisely?  Can you say what is wrong and what behaviour you expect?

Roughly...

First bad thing:
Stack size (memory consumed) doubles each time it overflows.

Second bad thing:
Arbitrary limit on stack size unrelated to overall (heap) memory
available.
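
(The limit I mean is the one set with the RTS -K flag, if I remember
the flag correctly; the size below is just for illustration.)

  $ ./myprog +RTS -K64m -RTS

That raises the maximum stack size to 64M for this run, but exceeding
it still aborts the program with a stack overflow, no matter how much
heap is free.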

Third bad thing (the really bad thing):
If a stack has temporarily grown (to 64M say), it will never shrink
back down again to something more typical (< 4K say). If I understand
correctly, it will continue to take 64M from the heap regardless.
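
To make this concrete, here's a contrived example I made up: a
non-tail-recursive function briefly needs a deep stack, after which
the program needs hardly any, yet (on my understanding) the grown
stack is never given back.

  -- Contrived illustration: deepSum is not tail recursive, so
  -- evaluating it needs one stack frame per element.
  deepSum :: Int -> Int
  deepSum 0 = 0
  deepSum n = n + deepSum (n - 1)

  main :: IO ()
  main = do
    print (deepSum 100000)  -- stack grows (by doubling) to a few MB
    putStrLn "done"         -- needs almost no stack, but the grown
                            -- stack stays allocated for the whole run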

What I would like is to be able to set an upper limit on total memory
usage and allow the program to freely use this memory as either stack
or heap. At least that should be the default behaviour, but maybe
also allow +RTS restrictions for "debugging" (though I don't think this
is a very good way of investigating a program's stack use).
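
For comparison, the heap can already be capped with -M (if I have the
flag right), but as far as I know there is no single budget shared by
stack and heap:

  $ ./myprog +RTS -M256m -RTS

What I'm asking for is one 256M ceiling under which stack and heap
could grow at each other's expense, rather than separate -M and -K
limits.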

I would also like stack memory allocation to increase (and decrease :-)
in some sane-sized linear increment, not to double each time. With the
current scheme, as I understand it, if 65M is needed then 128M will be
allocated.
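
A little sketch of the arithmetic (the function names and the 1M
increment are my own invention):

  -- Doubling policy: keep doubling the current size until the
  -- request fits, so a 65M request starting from 64M lands on 128M.
  grownByDoubling :: Int -> Int -> Int
  grownByDoubling current needed
    | current >= needed = current
    | otherwise         = grownByDoubling (current * 2) needed

  -- Linear policy: round the request up to the next multiple of a
  -- fixed chunk (1M here), so a 65M request stays 65M.
  chunk :: Int
  chunk = 1024 * 1024

  grownLinearly :: Int -> Int
  grownLinearly needed = ((needed + chunk - 1) `div` chunk) * chunk

So grownByDoubling (64*1024*1024) (65*1024*1024) gives 128M, where
grownLinearly (65*1024*1024) gives exactly 65M.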

Stefan O'Rear suggested an alternative. I don't know how hard it would
be to implement though (I don't really know anything about the GHC RTS).

  http://haskell.org/pipermail/glasgow-haskell-users/2007-May/012472.html

Regards
--
Adrian Hey