[Haskell-cafe] Haskell maximum stack depth

Adrian Hey ahey at iee.org
Mon Feb 4 12:26:32 EST 2008


Simon Peyton-Jones wrote:
> | Yes, using lots of stack is clearly bad with ghc, but this is a ghc
> | "bug". In fact the only reason these programs do use lots of stack
> | (vs. heap) is just a peculiarity of ghc rts implementation, so it
> | really should be ghc that fixes the problem, or at least admits
> | responsibility :-)
> 
> I don't think there's anything fundamental here. GHC allocates the stack in the heap, and it can grow as big as you like.  The size limit is simply to catch infinite recursion with a more helpful message than "heap overflow".  I think.  There is one peculiarity though: I don't think we ever shrink the stack, so once it gets big it stays big.  This could be fixed, though.

Yikes!

Sorry, but if what you say is true then things are even worse than I
thought :-( This behaviour seems really bad to me, especially for
concurrent programs.

Also, I can't help thinking that the common justification for the
current limit (that it helps find alleged bugs) is a little lame.
It only helps find bugs if one expects one's program to use less than
8M of stack (hence if it's using more, it's a bug by one's *own*
definition). But if a program or library is deliberately designed to
make use of stack (in preference to heap) for efficiency reasons
(or even just to avoid the awkwardness of using explicit CPS style),
then this is a source of bugs in otherwise perfectly correct and
reasonable programs.
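
To make the kind of code I have in mind concrete, here's a minimal
sketch (names and sizes are purely illustrative, not from any real
library): a naive right-to-left sum is perfectly correct, but needs one
stack frame per list element, so with the default 8M limit it will very
likely be killed on a list of a few million elements, whereas the
accumulator version does the same job in constant stack by shifting the
work onto the heap.

    module Main where

    -- Correct, but deliberately "stack-shaped": one frame per element.
    sumRight :: [Int] -> Int
    sumRight []     = 0
    sumRight (x:xs) = x + sumRight xs

    -- The accumulator alternative: constant stack; the running total
    -- (kept strict with seq) lives in the heap instead.
    sumAcc :: Int -> [Int] -> Int
    sumAcc acc []     = acc
    sumAcc acc (x:xs) = let acc' = acc + x in acc' `seq` sumAcc acc' xs

    main :: IO ()
    main = do
      print (sumAcc 0 [1 .. 3000000])   -- fine
      print (sumRight [1 .. 3000000])   -- very likely a stack overflow

The point is that the first version is the natural one to write, and it
is only the arbitrary cap, not any semantic error, that turns it into a
crash.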

If we want some way of investigating a program's stack use, there must
be a better way of doing it than deliberately inducing a crash in any
program that exceeds 8M of stack.
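
For completeness, the only control we have at the moment (as far as I
know) is to raise the cap per run with the RTS's -K flag, e.g.

    $ ghc --make Main.hs
    $ ./Main +RTS -K64m -RTS

but that only moves the point at which the program gets killed; it
tells you nothing about how much stack it actually used.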

Thanks for the answer though. I think I'll write a ticket about this :-)

Regards
--
Adrian Hey



