Performance horrors
Duncan Coutts
duncan.coutts at worc.ox.ac.uk
Fri Aug 29 09:49:11 EDT 2008
On Fri, 2008-08-29 at 02:56 -0400, Cale Gibbard wrote:
> On a side note related to the request for inclusion of complexities,
> since Haskell is a lazy language, we really ought to have complexities
> written in terms of the demanded portion of the result. Of course,
> Data.Set and Data.Map are (structure) strict, so it doesn't affect
> them so much, but it would certainly be nice for the functions in
> Data.List to know the answer to "if the input is size n and I only
> demand k of the elements of the result, then what is the complexity?",
> especially for things like sort, where a lazy implementation can, for
> instance, make the head of the list available in just O(n) time.
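To make that last point about sort concrete, here's a toy bottom-up
merge sort, in the spirit of (though not identical to) the Data.List
one, where demanding only the head really is O(n):

msort :: Ord a => [a] -> [a]
msort = mergeAll . map (: [])
  where
    mergeAll []   = []
    mergeAll [xs] = xs
    mergeAll xss  = mergeAll (mergePairs xss)

    mergePairs (a : b : rest) = merge a b : mergePairs rest
    mergePairs xss            = xss

    merge [] ys = ys
    merge xs [] = xs
    merge xxs@(x : xs) yys@(y : ys)
      | x <= y    = x : merge xs yys
      | otherwise = y : merge xxs ys

Demanding only head (msort xs) builds the singletons and merge spines
in O(n) and then does one comparison per merge node, so O(n) in total,
whereas demanding the whole result is the usual O(n log n).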
Yeah, that's fairly important, though it's quite subtle. For example,
we'd probably say that if we demand k elements of the result of map
then it costs O(k) (not accounting for the cost of applying the
function to each element). Technically, though, that's true of cons or
tail too, even though they only do O(1) work and do not change the
'potential'/'cost' of the remainder of the list.
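For instance, here's a quick Debug.Trace experiment (just an
illustration, nothing we'd put in the docs):

import Debug.Trace (trace)

-- Printing 'demo' in ghci runs f only three times, not a million:
-- the undemanded part of the map result is never forced.
demo :: [Int]
demo = take 3 (map f [1 .. 1000000 :: Int])
  where
    f x = trace "applying f" (x * 2)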
We can probably gloss over the details and give a simple indication in
the docs, but just for fun, here's a spanner in the works:
foldr (\x xs -> x `seq` (x:xs)) []
It's only O(1) to evaluate to WHNF, and it adds O(n) work to the whole
structure, but that property alone doesn't distinguish it from
foldr (\x xs -> (x:xs)) []
The difference, of course, is in where the costs sit within the
structure: with the seq version, a function that demands the spine also
has to pay the cost of evaluating the elements up front.
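To see the difference concretely, with undefined standing in for an
element that is expensive to evaluate:

cheap, costly :: [Int]
cheap  = foldr (\x xs ->          x:xs ) [] [undefined, undefined]
costly = foldr (\x xs -> x `seq` (x:xs)) [] [undefined, undefined]

-- length cheap  ==> 2 (length demands only the spine; the elements
--                      are never evaluated)
-- length costly ==> throws Prelude.undefined, because demanding the
--                      spine now forces the elements as well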
Duncan