[Haskell-cafe] Re: exercise - a completely lazy sorting algorithm
Heinrich Apfelmus
apfelmus at quantentunnel.de
Tue Jul 7 03:37:09 EDT 2009
Petr Pudlak wrote:
> about a month ago, we were discussing sorting in Haskell with a friend. We
> realized a nice property of lazy merge-sort (which is AFAIK the implementation
> of Data.List.sort): Getting the first element of the sorted list actually
> requires O(n) time, not O(n * log n) as in imperative languages.
Similarly, taking the first k elements will take O(n + k log n)
time. And this also works for the standard lazy quicksort. See also
http://apfelmus.nfshost.com/quicksearch.html
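For concreteness, the lazy quicksort I have in mind is the usual short version, sketched below (the name qsort is mine; this is not the actual code of Data.List.sort, which is a merge sort):

    import Data.List (partition)

    -- The standard lazy quicksort.  Thanks to lazy evaluation,
    --     take k (qsort xs)
    -- only sorts as much of the input as is needed, giving the
    -- O(n + k log n) bound mentioned above (expected time, assuming
    -- reasonably balanced pivots).
    qsort :: Ord a => [a] -> [a]
    qsort []     = []
    qsort (x:xs) = qsort smaller ++ [x] ++ qsort larger
      where
        (smaller, larger) = partition (< x) xs

In particular,  head (qsort xs)  yields the minimum of the list in O(n) expected time, because the right-hand calls  qsort larger  along the leftmost path are never evaluated.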
> And immediately we asked: Would it be possible to create a lazy
> selection/sorting algorithm so that getting any element of the sorted
> list/array by its index would require just O(n) time, and getting all the
> elements would still be in O(n * log n)?
>
> More precisely: The result of sorting an n-element list or array should be a
> structure that allows asking for the i-th element of the result (for example,
> a lazy array). Asking for k arbitrary elements should take no more than
> O(min(n * log n, k * n)) time.
If you only want to take elements from the beginning, then a list
as the result type is enough. But you want random access, and you rightly
note that this requires some other data structure to store the sorted
result, simply because
    xs !! k    takes at least O(k) time
So, a tree like the one Matthias implements is the way to go. Basically, it
reifies the recursive calls of quicksort as a lazy data structure which
can be evaluated piecemeal.
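For concreteness, such a tree could look like the following sketch (my own illustration, not necessarily the exact code from Matthias' post). Each node stores the pivot together with the size of its left subtree; thanks to laziness, the partition at a node is only computed when a lookup actually inspects that node:

    -- A binary tree that reifies quicksort's recursion.  The Int field
    -- caches the size of the left subtree, so that indexing knows which
    -- branch to descend into.
    data Tree a = Leaf
                | Node (Tree a) Int a (Tree a)

    -- Build the tree lazily: the sub-lists 'smaller' and 'larger' are
    -- only evaluated when a lookup forces the corresponding node.
    build :: Ord a => [a] -> Tree a
    build []     = Leaf
    build (x:xs) = Node (build smaller) (length smaller) x (build larger)
      where
        smaller = [y | y <- xs, y <  x]
        larger  = [y | y <- xs, y >= x]

    -- Return the i-th smallest element (0-based).  Only the nodes on the
    -- path from the root to the answer are evaluated.
    index :: Tree a -> Int -> a
    index Leaf           _ = error "index out of range"
    index (Node l k x r) i
      | i <  k    = index l i
      | i == k    = x
      | otherwise = index r (i - k - 1)

If the tree is shared, say  t = build xs , then a single  index t i  takes O(n) expected time, because only the partitions along one root-to-leaf path are forced. Each node's partition is evaluated at most once, so asking for all n indices still costs O(n log n) expected time in total, just as desired.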
(If you don't care about the O(k) inherent in xs !! k and only ask
about the number of comparisons it takes to evaluate xs !! k, then it
is possible to make the standard quicksort slightly lazier so that
this works, too. Details are in the link given above.)
Regards,
apfelmus
--
http://apfelmus.nfshost.com