[Haskell-cafe] Re: Red-Blue Stack
apfelmus at quantentunnel.de
Sat Sep 27 07:33:19 EDT 2008
Josef Svenningsson wrote:
> Stephan Friedrichs wrote:
>> My question is: Is there a case, where finding a persistent
>> solution that performs equally well is *impossible* rather than just
>> harder? I mean might there be a case where (forced) persistence (as we
>> have in pure Haskell) is a definite disadvantage in terms of big-O
>> notation? Do some problems even move from P to NP in a persistent setting?
> The only result I'm aware of is that of Nicholas Pippenger where he
> shows that there are algorithms which are slower by a factor of log n
> if one is not allowed to use mutation.
Note, however, that Pippenger has to make the additional assumption
that the computation is online, i.e. that it operates on an infinite
stream of input:

    data Inf a = a :> Inf a

    doPerms :: Int -> [Int] -> Inf a -> Inf a
    doPerms ~= \n ps -> concat . map (perm ps) . group n
      -- pseudocode: 'group n' cuts the stream into blocks of size n,
      -- 'perm ps' permutes each block, and 'concat' splices the
      -- permuted blocks back into one stream
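To make the online setting concrete, here is one runnable reading of
that sketch. The helpers `splitInf`, `perm` (which I take to be given a
list of 0-based block indices) and `takeInf` are my own illustrative
choices, not anything from Pippenger's paper:

```haskell
-- Infinite streams, as in the sketch above.
data Inf a = a :> Inf a
infixr 5 :>

-- Split off the first n elements of the stream.
splitInf :: Int -> Inf a -> ([a], Inf a)
splitInf 0 xs        = ([], xs)
splitInf n (x :> xs) = let (ys, rest) = splitInf (n - 1) xs
                       in (x : ys, rest)

-- Rearrange a block according to a list of 0-based indices.
perm :: [Int] -> [a] -> [a]
perm ps xs = map (xs !!) ps

-- Consume the stream block by block, emitting each permuted block.
doPerms :: Int -> [Int] -> Inf a -> Inf a
doPerms n ps xs = foldr (:>) (doPerms n ps rest) (perm ps block)
  where (block, rest) = splitInf n xs

-- Observe a finite prefix of a stream.
takeInf :: Int -> Inf a -> [a]
takeInf 0 _         = []
takeInf n (x :> xs) = x : takeInf (n - 1) xs

-- Example input: the stream 0, 1, 2, ...
nats :: Inf Int
nats = let go k = k :> go (k + 1) in go 0
```

For example, `takeInf 6 (doPerms 3 [2,1,0] nats)` reverses each block
of three, yielding `[2,1,0,5,4,3]`.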
I am not aware of any result that was able to lift this restriction.
In section 3, Pippenger essentially shows that every ephemeral data
structure that runs in T(n) time can be made persistent in
O(T(n) * log T(n)) time, basically by making the storage explicit,
i.e. simulating the RAM with a purely functional array such as a
binary tree. So we can at least say that problems in P stay in P.
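For illustration, here is a minimal toy version of that simulation (my
own sketch, not Pippenger's construction verbatim): a complete binary
tree over the addresses gives O(log n) reads and writes, and a write
shares every untouched subtree with the old tree, so all old versions
remain valid for free:

```haskell
import Data.Bits (testBit)

-- A complete binary tree of depth d simulates a RAM of 2^d cells.
data Tree a = Leaf a | Node (Tree a) (Tree a)

-- Build a tree of depth d with every cell initialised to x.
build :: Int -> a -> Tree a
build 0 x = Leaf x
build d x = let t = build (d - 1) x in Node t t

-- Read cell i in a tree of depth d: follow the bits of i, O(d) steps.
readT :: Int -> Int -> Tree a -> a
readT _ _ (Leaf x) = x
readT d i (Node l r)
  | testBit i (d - 1) = readT (d - 1) i r
  | otherwise         = readT (d - 1) i l

-- Write cell i, copying only the O(d) nodes on the path and sharing
-- all other subtrees -- this sharing is what makes it persistent.
writeT :: Int -> Int -> a -> Tree a -> Tree a
writeT _ _ x (Leaf _) = Leaf x
writeT d i x (Node l r)
  | testBit i (d - 1) = Node l (writeT (d - 1) i x r)
  | otherwise         = Node (writeT (d - 1) i x l) r
```

Each O(1) RAM step becomes an O(log n) tree operation, which is where
the log factor in the T(n) * log T(n) bound comes from.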
> Interestingly enough, this particular result does not carry over to
> Haskell. The particular algorithm that he uses can actually be
> implemented optimally using lazy evaluation, as shown in the following
> So a pure strict language is less efficient than both a strict
> language with mutation and a pure lazy language. Although intuitively
> a pure lazy language should also be less efficient than a strict
> language with mutation, I'm not aware of any such results.
Yes, lazy evaluation makes persistent data structures much easier to
build, and is sometimes the very thing that makes matching the
ephemeral bounds possible at all. It only gives amortized bounds,
though.
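A standard example of the technique is Okasaki's banker's queue (my
sketch of it below): the expensive reverse of the rear list is
suspended inside (++), so its O(n) cost is spread over the operations
that later force it, giving amortized O(1) per queue operation:

```haskell
-- Front length, front list, rear length, reversed rear list.
data Queue a = Queue Int [a] Int [a]

empty :: Queue a
empty = Queue 0 [] 0 []

-- Invariant: the rear is never longer than the front. When it would
-- be, the rear is scheduled to be reversed onto the front -- lazily,
-- so the reverse only runs as the front is actually consumed.
check :: Queue a -> Queue a
check q@(Queue lf f lr r)
  | lr <= lf  = q
  | otherwise = Queue (lf + lr) (f ++ reverse r) 0 []

-- Enqueue at the rear.
snoc :: Queue a -> a -> Queue a
snoc (Queue lf f lr r) x = check (Queue lf f (lr + 1) (x : r))

-- Dequeue from the front (the invariant guarantees that an empty
-- front means an empty queue).
uncons :: Queue a -> Maybe (a, Queue a)
uncons (Queue _ [] _ _)      = Nothing
uncons (Queue lf (x:f) lr r) = Just (x, check (Queue (lf - 1) f lr r))

-- Drain a queue into a list, for testing.
toList :: Queue a -> [a]
toList q = case uncons q of
  Nothing      -> []
  Just (x, q') -> x : toList q'
```

Note that the amortized bound survives persistent use precisely because
the suspended reverse is memoized: every version that forces it pays
for it at most once.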