[Haskell] Quicksearch vs. laziness
westondan at imageworks.com
Mon Mar 19 09:35:18 EDT 2007
I left my copy of Chris Okasaki's "Purely Functional Data Structures" at
home, but I seem to recall (vaguely) that his heap sort algorithm was best
for sorting/searching the first k elements of an orderable sequence.
If you don't have a copy of this book, you should definitely get one. It
is his dissertation, cleaned up and supplemented (on the website) with
Haskell code (the basis of the Edison library), that examines in detail
the role of laziness in data structures and the algorithms on them. His
notions of Banker's and Physicist's credits are an easily teachable way
of evaluating computational complexity.
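As a concrete illustration of the point about lazily sorting only the
first k elements, here is a minimal sketch (the names mergeSort and merge
are mine, not Okasaki's) of a lazy bottom-up merge sort: because each
merge produces its output incrementally, `take k (mergeSort xs)` forces
only about O(n + k log n) work rather than sorting the whole list.

```haskell
-- Lazy bottom-up merge sort.  The merge tree is built eagerly, but each
-- individual merge yields its elements on demand, so consuming only a
-- prefix of the result forces only a prefix of every merge.
mergeSort :: Ord a => [a] -> [a]
mergeSort [] = []
mergeSort xs = head (until single mergePairs (map (: []) xs))
  where
    single [_] = True
    single _   = False
    -- Merge adjacent runs pairwise until one run remains.
    mergePairs (a:b:rest) = merge a b : mergePairs rest
    mergePairs rest       = rest

-- Standard lazy merge of two sorted lists.
merge :: Ord a => [a] -> [a] -> [a]
merge [] ys = ys
merge xs [] = xs
merge (x:xs) (y:ys)
  | x <= y    = x : merge xs (y:ys)
  | otherwise = y : merge (x:xs) ys
```

For example, `take 3 (mergeSort [5,2,8,1,9,3,7])` yields [1,2,3] while
touching only a few elements of each intermediate merge.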
Steffen Mazanek wrote:
> say, we want to find the k'th element in a sorted list.
> In imperative languages it is much more efficient
> not to quicksort first and take the k'th element afterwards,
> but to use a quicksearch algorithm that sorts only
> the relevant parts of the array.
> In Haskell one could implement this idea as follows:
> qsearch k []     = error ...
> qsearch k (x:xs)
>   | lsmaller >= k          = qsearch k smaller
>   | lsmaller == 0 && k > 1 = qsearch (k-1) greater
>   | lsmaller == 0          = x
>   | otherwise              = qsearch (k-lsmaller) (x:greater)
>   where smaller  = filter (<= x) xs
>         lsmaller = length smaller
>         greater  = filter (> x) xs
> However, I was wondering whether it might be possible
> to implement quicksort in a way that quicksearch is
> done out of the box by lazy evaluation. Is there any way
> to do this? From my understanding for small k's lazy
> evaluation already does the trick for the naive quicksort
> algorithm (quicksort smaller ++ [x] ++ quicksort greater),
> doesn't it? Is there a search algorithm that makes better
> use of lazy evaluation out of the box?
> Best regards,
> Haskell mailing list
> Haskell at haskell.org
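For what it's worth, the naive quicksort Steffen refers to can be written
out and used directly for selection; the helper name kth below is mine.
Because (++) and quicksort are lazy, demanding only the k'th element of
the result sorts just the partitions on the path to that element, which
is roughly O(n + k log k) on average for small k, rather than a full sort.

```haskell
-- Naive quicksort from the question (first element as pivot).
quicksort :: Ord a => [a] -> [a]
quicksort []     = []
quicksort (x:xs) = quicksort smaller ++ [x] ++ quicksort greater
  where smaller = filter (<= x) xs
        greater = filter (>  x) xs

-- k'th smallest element (1-indexed), "for free" via lazy evaluation:
-- (!!) demands only the first k cells of the sorted spine, so the
-- greater-than partitions beyond the answer are never sorted.
kth :: Ord a => Int -> [a] -> a
kth k xs = quicksort xs !! (k - 1)
```

For example, `kth 3 [5,2,8,1,9]` evaluates to 5 without ever fully
sorting the sublist [8,9].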
More information about the Haskell mailing list