[Haskell-cafe] Haskell maximum stack depth
Derek Elkins
derek.a.elkins at gmail.com
Wed Jan 30 00:50:53 EST 2008
On Tue, 2008-01-29 at 08:18 +0000, Adrian Hey wrote:
> Derek Elkins wrote:
> > While perhaps for a simple throw-away program it may be beneficial to
> > write code that allocates unnecessary stack, I personally consider
> > unnecessary stack use a bug. A stack overflow, to me, is always
> > indicative of a bug.
>
> The "bug" is in ghc stack management. Why is it so important that the
> stack size is arbitrarily limited? It's just an intermediate data
> structure, no different from any other intermediate data structure
you may build on the heap (well, apart from its efficiency). But I guess
> we would be in danger of having our programs run too fast if folk were
> silly enough to make use of the stack.
I said -unnecessary- stack use. It makes no difference if the stack
size is only "limited" by how much core+swap space you have. If you
have no restrictions on stack use and you get a stack overflow, I still
consider it a bug. It's the unnecessary use of the stack, not the
overflow that is the bug; as I said, the stack overflow is -indicative-
of a bug.
However, a question comes up: why is unnecessary "stack" use bad but
unnecessary heap use acceptable? The answer is, for me, it isn't, but
heap and stack are different: length shouldn't take O(n) space, period,
whether it's stack space or heap space. I consider an "Out of Memory"
error to be indicative of a bug too unless you are operating over large
data sets or knowingly using a particularly memory-hungry algorithm. A
program whose live set is in the gigabyte range on "small" inputs is
buggy to me. Space leaks are bugs whether it's stack space or heap
space. But.
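To make the length example concrete, here is a minimal sketch (the
names lengthLeaky and lengthStrict are mine, not from the thread): with
a plain lazy left fold the accumulator becomes a chain of thunks, heap
space that turns into stack space when the result is finally forced,
while Data.List.foldl' keeps the count in constant space.

```haskell
import Data.List (foldl')

-- With a lazy left fold the accumulator is a growing chain of
-- unevaluated (+1) thunks: O(n) heap while folding, and O(n) stack
-- when the final Int is eventually forced.
lengthLeaky :: [a] -> Int
lengthLeaky = foldl (\n _ -> n + 1) 0

-- foldl' forces the accumulator at each step, so the count runs in
-- constant space no matter how long the list is.
lengthStrict :: [a] -> Int
lengthStrict = foldl' (\n _ -> n + 1) 0
```

Either version is O(n) time, of course; the point is that neither the
stack nor the heap should grow with the input here.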
But, there is a difference (theoretically) on what goes on the heap and
what goes on the stack. The stack is the -control- stack. It, in
theory, holds (only) "control" information (which may include parameters
which are waiting for other parameters to be evaluated, and other such
things). Stack overflows are usually caused by "linear" stack use, and
you usually need pathological control flow for that; so either you were
unlucky and your data was pathological, or you have set things up so
that the actual case is the pathological one. [As
you might guess, I consider the pathological (left associative) use of
(++) to be a bug as well, albeit more of a "time leak" than a space
leak.]
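The (++) remark is the classic instance of this. As a sketch (the
helper names are mine): left-nested appends re-walk the accumulated
prefix on every step, while right-nested appends touch each element
only once.

```haskell
-- Left-nested appends, ((a ++ b) ++ c) ++ d, re-traverse the growing
-- prefix at every step: O(n^2) time overall -- the "time leak".
concatLeft :: [[a]] -> [a]
concatLeft = foldl (++) []

-- Right-nested appends, a ++ (b ++ (c ++ d)), visit each element
-- once: O(n) time overall.
concatRight :: [[a]] -> [a]
concatRight = foldr (++) []
```

The standard fix when right-nesting isn't natural is a difference
list: represent a list as a function of type [a] -> [a], so that
append becomes function composition and costs O(1) per use.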
Practically speaking, stack overflows almost always indicate a (stack)
space leak or an unintended infinite loop.
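For instance (a sketch of my own, not from the thread): summing with
foldr holds n pending applications of (+) on the control stack before
the first addition can fire, exactly the "linear" stack use described
above, whereas a strict left fold needs only constant stack.

```haskell
import Data.List (foldl')

-- foldr (+) 0 must descend to the end of the list before any addition
-- happens, holding one pending (+) frame per element on the control
-- stack: a stack space leak for large inputs.
sumViaFoldr :: [Int] -> Int
sumViaFoldr = foldr (+) 0

-- A strict left fold carries a single evaluated accumulator and runs
-- in constant stack.
sumViaFoldl' :: [Int] -> Int
sumViaFoldl' = foldl' (+) 0
```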
> So perhaps the current ghc defaults are too generous. What limit do you
> think should be placed on the stack size that a non buggy program can
> use?
Hopefully you can see why the particular limit, if any, is irrelevant
to my point. All other things being equal, memory being the limit is
ideal. Usually, all other things aren't equal.
[From a different email]
> As nobody has provided any sensible justification for the assertion
> that using lots of stack (vs. heap) inherently is bad (and that
> programs which do [use lots of stack] have bugs, by definition), it
> seems to me this is
> one of those quasi-religious beliefs, like (so called) "global
> variables" or the cyclic module dependencies being a bug (by
> definition).
Unnecessarily using lots of stack -or- heap is inherently bad, though
it may perhaps be the lesser of two evils.