[Haskell-cafe] puzzling memory leak? (GHC)
simonmar at microsoft.com
Tue Oct 11 05:04:19 EDT 2005
On 11 October 2005 07:34, Young Hyun wrote:
> What intuition should be my guide about the "proper way" of employing
> lazy evaluation? It's not yet clear to me what you mean by "the
> system needs to hang onto all the input data ...". It seems
> counterintuitive for the evaluator to hang onto partially evaluated
> functions and input data when it can be proven (through static
> analysis) that references to that data can no longer exist in the
> remainder of the program execution; for example, consider
>   case head $ drop 500000 $ parseArts ... of
>     Right x -> hPutArts stderr x
This example should not have a space leak (try it).
Space leaks arise when large unevaluated expressions build up, or there
are unevaluated expressions that refer to large amounts of data. The
solution is usually to insert some strictness, to force evaluation of
the offending expressions, assuming the evaluated representation is
smaller than the unevaluated one.
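As a minimal sketch of what "insert some strictness" looks like (not code from the original thread), compare a lazy and a strict accumulator in a hand-written sum:

```haskell
-- The lazy version builds up the unevaluated expression
-- (((0 + x1) + x2) + x3) ... before anything forces it, so it needs
-- space proportional to the list length.  The strict version forces
-- the accumulator at each step, so it stays a plain evaluated Int.
sumLazy :: [Int] -> Int
sumLazy = go 0
  where
    go acc []     = acc
    go acc (x:xs) = go (acc + x) xs            -- acc grows as a thunk

sumStrict :: [Int] -> Int
sumStrict = go 0
  where
    go acc []     = acc
    go acc (x:xs) = let acc' = acc + x
                    in acc' `seq` go acc' xs   -- force before recursing
```

The evaluated representation (a machine Int) is much smaller than the chain of additions, so this is exactly the case where strictness helps.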
I suggest you try out space profiling (with GHC or nhc98), and try
various modifications to your code to get a handle on what's happening.
It's hard to tell exactly where the space leak is in your original code,
because it depends on the actual definition of parseArts. Ben's
analysis is definitely plausible. Take a look at the code and ask
yourself: does each element of the list refer to previous elements, or
to an expression accumulated while traversing the list? If this is the
case, and you never actually evaluate the head of the list, then the
entire list will be retained.
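A hypothetical sketch of that problematic shape (the names here are made up, since parseArts's definition isn't shown): a traversal that threads state through the list, where each output element closes over the lazily-updated state.

```haskell
-- Each output element is built from a state threaded through the
-- whole traversal.  If the state is only updated lazily, the state
-- at element n is a thunk chaining back through all earlier input,
-- so even after dropping the first 500000 elements, everything
-- before them is still reachable and cannot be collected.
parseWithState :: s -> (s -> a -> (s, b)) -> [a] -> [b]
parseWithState s0 step = go s0
  where
    go _ []     = []
    go s (x:xs) = let (s', y) = step s x
                  in y : go s' xs
    -- the fix: force the state at each step, e.g.
    --   in s' `seq` (y : go s' xs)
```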
A good way to begin to understand space leaks is to study foldl (see
many previous threads on this list).
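The foldl case in miniature, using the strict variant from Data.List for contrast:

```haskell
import Data.List (foldl')

-- foldl builds the entire expression (((0+1)+2)+...) before anything
-- demands it, so it allocates one thunk per list element.  foldl'
-- forces the accumulator at each step and runs in constant space.
leaky, fine :: Int
leaky = foldl  (+) 0 [1 .. 1000000]   -- ~10^6 nested thunks
fine  = foldl' (+) 0 [1 .. 1000000]   -- accumulator stays evaluated
```

Both produce the same result; only their space behaviour differs, which is why heap profiling (rather than testing outputs) is the right tool for finding this class of bug.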