[Haskell-cafe] What's the deal with Clean?

Jason Dagit dagit at codersbase.com
Thu Nov 5 11:26:02 EST 2009


On Thu, Nov 5, 2009 at 6:46 AM, brian <briand at aracnet.com> wrote:

>
> On Nov 5, 2009, at 1:49 AM, David Virebayre wrote:
>
>  I think that's in a way what Bulat is saying: for Haskell to really
>> compete with C *in his view*, if I understand it, the compiler has to be
>> able to take idiomatic Haskell code and translate it into idiomatic C code
>> or better.
>>
>> Or, put another way, we have to be able to write things like SDL, JPEG, or
>> MPEG processing in Haskell, instead of writing bindings to C libraries,
>> without losing performance.
>>
>>
> And this is confusing to those of us who are not compiler experts.
>
> Haskell knows when I have a list of Doubles, you know, because it's
> strongly typed.
>
> Then it proceeds to box them. Huh ?
>

Imagine a computation that will yield a Double when evaluated, but has not
yet been evaluated.  How do you store that in a list?
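
Here is a minimal sketch of that point (my example, not from the original
post; the names are just for illustration).  Each list cell holds a pointer
to a thunk, so nothing is computed until demanded:

import Debug.Trace (trace)

-- A list of Doubles whose elements have not been computed yet: each
-- cell holds a boxed thunk rather than a raw machine double.
xs :: [Double]
xs = [trace "evaluating 1st" (sqrt 2), trace "evaluating 2nd" (pi * pi)]

main :: IO ()
main = do
  print (length xs)  -- forces only the list's spine; no trace fires
  print (head xs)    -- "evaluating 1st" is printed now, exactly once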


> The laziness thing has many examples of _reducing_ efficiency, but there
> seems to be a real lack of examples where it helps.  In fact it seems to
> _always_ hurt.  People sure seem excited about it.  Me, not so excited.
>
> I've asked this question before and the answer, apparently, is
> polymorphism.
>

I can't really think of how laziness and polymorphism are related.  For me
the big win with laziness is composability.  Laziness lets us express things
in ways that are more natural.  The Prelude function 'take' is a perfect
example: it lets you use finite portions of infinite lists.  You can express
an infinite series very naturally and decouple from it the logic that
processes finite parts.  The implication is that laziness allows you to use
data structures for control flow.  This all works together to enable
separation of concerns, which is generally a very good thing if you want to
reason about your source code.
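
As a small illustration of that decoupling (a sketch I'm adding, not from
the original post): the series is written once as an infinite list, and
'take' decides, in a completely separate place, how much of it to compute.

-- The infinite series 1/0! + 1/1! + 1/2! + ..., defined on its own,
-- with no mention of how much of it anyone will consume.
eTerms :: [Double]
eTerms = map (1 /) factorials
  where factorials = scanl (*) 1 [1 ..]  -- 0!, 1!, 2!, ...

-- A separate concern entirely: how many terms to use.
approxE :: Int -> Double
approxE n = sum (take n eTerms)

main :: IO ()
main = print (approxE 15)  -- 2.718281828...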

Laziness can also be thought of as a transformation on the time complexity
of algorithms.  Sure, the worst-case complexity remains, but you can often
get a better average case by computing only as much as you need.
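
For instance (again my example): searching a lazily transformed list stops
as soon as a match is found, so an expensive per-element function runs only
on the prefix the search actually inspects.

import Data.List (find)

-- Stand-in for per-element work that is costly in a real program.
expensive :: Int -> Int
expensive n = n * n

main :: IO ()
main =
  -- 'map expensive' over a million elements looks like a million calls,
  -- but laziness means 'find' forces only the prefix up to the first
  -- hit: here 'expensive' is evaluated just 4 times.
  print (find (> 10) (map expensive [1 .. 1000000]))  -- Just 16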

I hope that helps,
Jason

