[Haskell-cafe] Re: Problems with strictness analysis?

Luke Palmer lrpalmer at gmail.com
Wed Nov 5 18:34:36 EST 2008


On Wed, Nov 5, 2008 at 1:24 AM, Claus Reinke <claus.reinke at talk21.com> wrote:
>> How can we support analysis of time and space complexity without
>> descending into the complicated, dirty world of operational
>> thinking?
>
>   equational /= denotational

Nonetheless, Haskell has an equational semantics, derived from its
denotational one.  But when I said "equational semantics" I really
meant something more like "equations" :-).

>   operational /= bad
>
> Sometimes, denotational is too abstract, offering no guidelines on
> behaviour, just as abstract-machine-based operational thinking is too
> concrete, hiding the insights in a flood of details.  Sometimes,
> operational semantics based on directed equations give you the best of
> both worlds: equational reasoning and behavioural guidelines, both at
> a suitably "clean" and abstract level.

By directed equations, do you mean unidirectional rewrite rules?
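
If so, here is a toy sketch of the kind of reasoning I imagine
(mySum is just a hypothetical stand-in for the Prelude's sum):

  mySum :: [Integer] -> Integer
  mySum []     = 0
  mySum (x:xs) = x + mySum xs

Reading those defining equations left-to-right as rewrite rules:

  mySum [1,2,3]
  --> 1 + mySum [2,3]
  --> 1 + (2 + mySum [3])
  --> 1 + (2 + (3 + mySum []))
  --> 1 + (2 + (3 + 0))
  --> 6

The nested context grows linearly with the list, so the O(n)
evaluation cost is visible in the rewriting itself, without
descending to an abstract machine.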

But I'd like to back up a bit.  I may have been too quick to attach
the judgement "dirty" to operational thinking.  I am not asking this
question as a complaint, as a challenge to the language, or as any
other such rhetorical device: my question is earnest.  I would like to
know, or to develop, a way to do abstract analysis of time and space
complexity.

In my time working with Haskell, I've developed a fairly accurate
"intuition" about the performance of programs.  It is some meld of
thinking in terms of the lazy evaluation mechanism, some higher-level
rules of thumb about when new thunks are allocated, and probably some
other stuff hiding in the depths of my brain.  I know it, but I am not
satisfied with it, because I can't formalize it: I wouldn't be able
to write these rules down and explain to a fellow mathematician how I
reason about Haskell programs.
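
To give one such rule of thumb concretely (an illustrative sketch,
not a formal claim): a lazy left fold allocates a new thunk per
element, while its strict variant forces the accumulator as it goes:

  import Data.List (foldl')

  lazySum, strictSum :: [Integer] -> Integer
  lazySum   = foldl  (+) 0   -- builds ((0+1)+2)+... as thunks: O(n) heap
  strictSum = foldl' (+) 0   -- accumulator forced each step: O(1) heap

  main :: IO ()
  main = print (strictSum [1 .. 10 ^ 6])

I can state that, and I believe it, but I can't yet derive it inside
a clean theory.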

The example being discussed in this thread is a good one:

  sum [1..10^8] + length [1..10^8]

With Haskell's semantics, we know we can write this as:

  let xs = [1..10^8] in sum xs + length xs

But this rewrite changes the memory *complexity*: naming the list
shares it between sum and length, so the whole spine is retained
during the first traversal, turning O(1) space into O(n).  In some
sense, then, these two expressions are not equal.  What is the theory
in which we can observe this fact?  Is it possible to give a simple
denotational semantics which captures it?
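
To make the comparison concrete, here is a sketch of both variants
(assuming GHC without optimisation; with -O2, full laziness or CSE
may add or remove sharing on its own, and I'm also assuming sum
itself runs in constant space):

  -- Two syntactically separate lists: each can be garbage-collected
  -- as it is consumed, so this runs in O(1) space.
  separate :: Int
  separate = sum [1 .. 10 ^ 8] + length [1 .. 10 ^ 8]

  -- One shared list: while sum traverses xs, the pending length xs
  -- keeps the spine alive, so the whole list is retained: O(n) space.
  shared :: Int
  shared = let xs = [1 .. 10 ^ 8] in sum xs + length xs

  main :: IO ()
  main = print separate >> print shared

Denotationally, separate and shared are equal; operationally they are
worlds apart, and that gap is exactly what I want a theory to capture.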

