[Haskell-cafe] Haskell maximum stack depth
simonpj at microsoft.com
Tue Feb 5 03:42:23 EST 2008
| First bad thing:
| Stack size (memory consumed) doubles each time it overflows.
| Second bad thing:
| Arbitrary limit on stack size unrelated to overall (heap) memory use.
| Third bad thing (the really bad thing):
| If a stack has temporarily grown (to 64M say), it will never shrink
| back down again to something more typical (< 4K say). If I understand
| correctly, it will continue to take 64M from the heap regardless.
| What I would like is to be able to set an upper limit on total memory
| usage and allow the program to freely use this memory as either stack
| or heap. At least that should be the default behaviour, but maybe
| also allow +RTS restrictions for "debugging" (though I don't think this
| is a very good way of investigating a program's stack use).
| I would also like stack memory allocation to increase (and decrease :-)
| in some sane sized linear increment, not double each time. With the
| current scheme, as I understand it, if 65M is needed then 128M will be
| allocated.
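The complaints above are about deep non-tail recursion. A minimal sketch (my own example, not from the original post) of the kind of code that grows the stack one frame per list element, and so runs into the RTS stack limit:

```haskell
-- Non-tail-recursive sum: forcing the result needs one stack frame
-- per list element, so a long enough list exhausts the stack.
sumR :: [Int] -> Int
sumR []       = 0
sumR (x : xs) = x + sumR xs

main :: IO ()
main = print (sumR [1 .. 1000000])
```

With GHC's RTS, the maximum stack size can be set with `+RTS -K<size>` (e.g. `+RTS -K4m`); a list long enough to exceed that limit makes the program abort with a stack overflow rather than spill into the heap.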
Would you like to create a ticket for this? I don't know how many people it bites, and how often, but it'd be good to have it recorded. Changing to a linear increment would be relatively straightforward, but would have quadratic copying cost as the stack grew big. Shrinking stacks would not be hard, I think, at the cost of perhaps copying them again if they grew big.
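The quadratic-copying point can be made concrete with a toy cost model (invented here; this is not the RTS code): if the whole stack is copied on every resize, doubling copies a total that is linear in the final size, while a fixed linear increment copies a quadratic total.

```haskell
-- Total words copied while a stack grows from `start` to at least
-- `need`, assuming the entire live stack is copied on each resize.
copyCost :: (Integer -> Integer) -> Integer -> Integer -> Integer
copyCost grow start need = go start 0
  where
    go sz copied
      | sz >= need = copied
      | otherwise  = go (grow sz) (copied + sz)  -- copy sz words, then resize

main :: IO ()
main = do
  let need     = 64 * 1024 * 1024                       -- grow to 64M
      doubling = copyCost (* 2) 4096 need               -- double each time
      linear   = copyCost (+ (1024 * 1024)) 4096 need   -- grow by 1M each time
  print (doubling, linear)  -- roughly 64M copied vs roughly 2.1G copied
```

So doubling keeps total copying proportional to the final stack size, while a 1M linear increment copies about 2G of data to reach 64M, which is the quadratic cost Simon mentions.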
| Stefan O'Rear suggested an alternative. I don't know how hard it would
| be to implement though (don't really know anything about ghc rts).
Yes, this is the standard solution, and it's a good one because it has a robust cost model (no quadratic costs). However, it's tricky to get right; copying is simpler. If a significant fraction of runtime (for some interesting program(s)) turned out to be consumed by copying stacks then we could consider this.
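The alternative being discussed is the standard chunked-stack scheme: keep the stack as a list of fixed-size chunks, so growing never copies existing frames and shrinking just frees chunks. A rough model (all names invented here, purely illustrative) of why the cost model is robust:

```haskell
-- Illustrative model of a chunked stack: newest chunk first,
-- each chunk holds at most `chunkSize` elements.
data ChunkedStack a = ChunkedStack
  { chunkSize :: Int
  , chunks    :: [[a]]
  } deriving (Eq, Show)

empty :: Int -> ChunkedStack a
empty n = ChunkedStack n []

-- Push is O(1): when the current chunk is full, start a new one.
-- Nothing already on the stack is ever copied.
push :: a -> ChunkedStack a -> ChunkedStack a
push x (ChunkedStack n cs) = case cs of
  (c : rest) | length c < n -> ChunkedStack n ((x : c) : rest)
  _                         -> ChunkedStack n ([x] : cs)

-- Pop frees an emptied chunk immediately, so a stack that grew
-- temporarily shrinks back down instead of holding the memory.
pop :: ChunkedStack a -> Maybe (a, ChunkedStack a)
pop (ChunkedStack n cs) = case cs of
  ((x : c) : rest) -> Just (x, ChunkedStack n (dropEmpty (c : rest)))
  _                -> Nothing
  where
    dropEmpty ([] : rest) = rest
    dropEmpty cs'         = cs'
```

The trickiness Simon alludes to is in the real RTS, not this model: frames and stack-walking code must cope with chunk boundaries, which is why copying the stack into one contiguous block is the simpler implementation.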