# [Haskell-cafe] Functional progr., images, laziness and all the rest

jerzy.karczmarczuk at info.unicaen.fr
Wed Jun 21 19:13:47 EDT 2006

```Usually once a year somebody starts a discussion on the merits of
the functional/lazy paradigms in an applicative context, and it is
usually quite good. People compare Haskell and OCaml; from time to time
somebody says that - apparently - Clean has better handling of strictness
issues [while admitting at the same time that he/she doesn't use
Clean...]; people divide into different philosophical branches; some
complain that it would be nice to have a strict Haskell; others say that
they don't care, that what matters is provable/enforced correctness, and
that laziness helps there. Some say that laziness permits treating
conditionals as functions; others ask what for; and the discussion is
usually quite interesting.

And here I am, apparently one of the rare people - not proud of it,
rather quite sad about it - who defend laziness as an *algorithmisation
tool*, one which makes it easy and elegant to construct co-recursive
code: circular programs, run-away infinite streams, hidden backtracking,
etc.

In the domain of images this can be helpful for making filters, especially
infinite-impulse-response, recursive filters. For two-dimensional data this
might be clumsy, but in 1D, for example in sound generation/processing,
you may transform a recurrence equation yielding Y out of X:
y[n+1] = a*x[n+1] + b*y[n]
usually implemented (imperatively) as a loop, into a stream definition:

filtr a b x@(x0:xq) = y
  where
  y  = x0 : yq
  yq = a*xq + b*y

with (*) and (+) conveniently overloaded (or replaced by specific
obvious ops).
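Concretely, a minimal runnable sketch of the same filter without the
overloading - the elementwise ops written out with zipWith and map (the
name filtr is kept from above; the type signature is my addition):

  filtr :: Num a => a -> a -> [a] -> [a]
  filtr a b (x0:xq) = y
    where
      y  = x0 : yq                                   -- y[0] = x[0]
      yq = zipWith (+) (map (a *) xq) (map (b *) y)  -- y[n+1] = a*x[n+1] + b*y[n]

The circularity (yq defined in terms of y) is exactly the co-recursion
in question; e.g. take 4 (filtr 1 2 (1 : repeat 0)) gives the impulse
response [1,2,4,8].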

In such a way you can program, in 2 - 6 lines, some quite exquisite
musical instruments (for example the Karplus-Strong "guitar", or a flute),
construct reverberation filters, make the ever-rising Shepard/Risset
paradoxical sounds, etc. etc. With laziness it is sheer pleasure and fun;
without it, a pain. If you wish, find my PADL talk on it...
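For instance, the Karplus-Strong pluck fits in a couple of such lines -
a sketch under my own assumptions (a two-point averaging feedback loop,
and a caller-supplied noise burst of length >= 2, whose length fixes the
pitch):

  karplus :: [Double] -> [Double]
  karplus burst = y
    where
      -- the delay line feeds back through a two-point average,
      -- which low-passes the loop and makes the tone decay
      y = burst ++ zipWith (\u v -> 0.5 * (u + v)) y (tail y)

Again a circular program: the output stream y is appended to a filtered
copy of itself.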

In this context, I found Clean more helpful than Haskell, for ONE reason.
Clean has a primitive datatype: unboxed, spine-lazy but head-strict lists.
The co-recursion works, as the construction of the tail is postponed, but
there is no pollution of the space by thunks - unevaluated list *elements*.

This I really do miss in Haskell... But perhaps I simply don't know how to
obtain a similar behaviour?
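For the record, one way to approximate part of that behaviour in plain
Haskell is a list type with a strict head field (the names below are
mine, not a library's). The spine stays lazy, so co-recursion remains
productive, but each element is forced when its cell is built, so no
thunks accumulate in the elements:

  data SList a = Nil | Cons !a (SList a)

  sIterate :: (a -> a) -> a -> SList a
  sIterate f x = Cons x (sIterate f (f x))  -- x forced by the strict field

  sTake :: Int -> SList a -> [a]
  sTake n _ | n <= 0  = []
  sTake _ Nil         = []
  sTake n (Cons x xs) = x : sTake (n - 1) xs

This is head-strict and spine-lazy, but not unboxed: it removes the
element thunks, not the boxing that Clean's primitive type also avoids.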

For image processing (rather: construction, but with incremental
algorithms), Clean's use of unique arrays felt more natural to me than
the monadic stuff in Haskell, but this is probably just a question of
style. I agree that here laziness is secondary...

Jerzy Karczmarczuk

```