[Haskell-cafe] *GROUP HUG*
waldmann at imn.htwk-leipzig.de
Tue May 24 14:41:22 CEST 2011
Yves Parès <limestrael <at> gmail.com> writes:
> For instance, one of my friends asked me once why the operation of
> calculating the length of a list has O(n) complexity,
> since, in his opinion, you could just store the size inside the list
> and increment it when elements are appended.
Then tell me: why does calculating the length of a (Haskell)
list have O(n) complexity?
I could just store the length of the list - as an additional argument
to the "Cons" constructor that is automatically initialized on construction
(and you never need to change it later, since Haskell objects
are "immutable", in the words of Java programmers).
In Haskell, the reason for not doing this (besides simplicity and inertia)
is, I think, laziness: you don't want to evaluate
the "length" field of the second argument of "cons" prematurely.
Well, unless you look at the "length" field of the constructed object,
it won't be evaluated, but a pile of thunks will be constructed instead,
and I think that's what you really want to avoid.
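The length-in-the-constructor idea above can be sketched as follows (a hypothetical `LenList` type invented for illustration, not anything from the standard library):

```haskell
-- Hypothetical sketch: a list that caches its length in every Cons cell.
-- The length field is set once at construction and never changed.
data LenList a = Nil | Cons Int a (LenList a)

len :: LenList a -> Int
len Nil          = 0
len (Cons n _ _) = n          -- O(1): just read the cached field

cons :: a -> LenList a -> LenList a
cons x xs = Cons (1 + len xs) x xs
-- Because the Int field is lazy, repeated conses build up the thunk
-- 1 + (1 + (1 + ...)), which is only forced when someone asks for len.
```

Making the field strict (`!Int`) would avoid the thunk pile, but then every `cons` would force the tail's cached length immediately, which is exactly the premature evaluation mentioned above.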
On the other hand, there are implementations of strict sequences
with an O(1) size operation.
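One such implementation is Data.Sequence from the containers package (a finger tree), which caches subtree sizes at every node, so Seq.length is O(1). A minimal sketch:

```haskell
import qualified Data.Sequence as Seq
import Data.Sequence (Seq)

-- Seq caches sizes internally, so Seq.length reads a stored value
-- rather than walking the structure.
bigSeq :: Seq Int
bigSeq = Seq.fromList [1 .. 100000]
```

Here `Seq.length bigSeq` answers in constant time, unlike `length` on an ordinary `[a]`, which must walk the whole spine.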