[Haskell-cafe] Hyper-recursion?

Albert Y. C. Lai trebla at vex.net
Tue Jan 17 01:54:58 UTC 2017


There is an idea called "open recursion". Let me illustrate with 
Fibonacci and factorial, each written the direct but naively slow way 
(for genuinely fast factorial algorithms, see
http://www.luschny.de/math/factorial/FastFactorialFunctions.htm
):

Instead of writing "fac x = x * fac (x-1)", we write:

     fac x = fac_essence fac x

     fac_essence f 0 = 1
     fac_essence f x = x * f (x-1)

And instead of writing "fib x = fib (x-1) + fib (x-2)", we write:

     fib x = fib_essence fib x

     fib_essence f 0 = 0
     fib_essence f 1 = 1
     fib_essence f x = f (x-1) + f (x-2)

The general pattern: Write a helper essence function that takes one more 
parameter, a function parameter; don't recurse yet, use the function 
parameter instead. Leave the recursion to the main function.
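
As an aside of my own, not in the original message: the step of tying 
the recursion back together can itself be packaged up with "fix" from 
Data.Function, which feeds a function its own fixed point. A minimal 
sketch (the primed names are mine):

     import Data.Function (fix)   -- fix f = f (fix f)

     fac' :: Integer -> Integer
     fac' = fix fac_essence       -- same as: fac' x = fac_essence fac' x

     fib' :: Integer -> Integer
     fib' = fix fib_essence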

Why would we do this?

One reason is philosophical, like you said, to factor out the recursion 
from the essence of one iteration of the process.

Another reason is that you don't have to merely put the recursion back 
into the main function. You can now play a dependency-injection trick 
before recursing, or perhaps not recurse at all.
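
To make that concrete, here is a small sketch of my own (the names 
fib_traced and fib_one_step are made up for illustration): inject a 
wrapper that logs each call before recursing, or inject a stub that 
never recurses, which lets you exercise one layer of the essence in 
isolation.

     import Debug.Trace (trace)

     -- inject a tracing wrapper, then recurse as usual
     fib_traced :: Int -> Integer
     fib_traced x = fib_essence recurse x
       where recurse y = trace ("fib called at " ++ show y) (fib_traced y)

     -- inject a stub instead: one layer of the essence, no recursion
     fib_one_step :: Int -> Integer
     fib_one_step x = fib_essence (const 0) x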

For example, how about injecting a dependency on a lookup-list, with the 
content of said lookup-list coming from that very injection: that is 
where the recursion hides.

     fib_list = [fib_essence (fib_list !!) i | i <- [0..]]

or maybe you prefer

     fib_list = map (fib_essence (fib_list !!)) [0..]

You will find that "fib_list !! 100" is less slow than "fib 100".
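
For instance, a GHCi session along these lines (with the usual 
defaulting to Integer):

     -- ghci> fib_list !! 100
     -- 354224848179261915075       (answers promptly)
     -- ghci> fib 100
     -- (the same value eventually, but the naive recursion makes an
     --  astronomical number of calls, so do not wait for it)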

Sometimes you prefer a function-application interface. So we go on to write:

     fib_notslow x = fib_list !! x

In fact sometimes we turn it around, hide away the lookup table, and write:

     fib_notslow = (fib_list !!)
       where
         fib_list = [fib_essence fib_notslow i | i <- [0..]]
         -- I could write "fib_essence (fib_list !!)",
         -- but now that (fib_list !!) has a name, "fib_notslow"...

Thus we have obtained a revisionist-history definition of dynamic 
programming and memoization: Dependency injection of a lookup table into 
a recursion.

A list is still not fast enough, because indexing into it takes linear 
time; a lazily generated binary search tree is even better. The 
"memoize" library (http://hackage.haskell.org/package/memoize) does that.
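
For the curious, here is a minimal sketch of that lazy-tree idea in the 
same style as above (my own code, not the memoize library's internals): 
an infinite tree holding every natural number, built lazily, so that 
looking up index n costs O(log n) steps instead of the list's O(n).

     -- an infinite binary tree; "index" walks it guided by the bits of n
     data Tree a = Tree (Tree a) a (Tree a)

     instance Functor Tree where
       fmap f (Tree l x r) = Tree (fmap f l) (f x) (fmap f r)

     index :: Tree a -> Int -> a
     index (Tree _ x _) 0 = x
     index (Tree l _ r) n = case (n - 1) `divMod` 2 of
       (q, 0) -> index l q
       (q, _) -> index r q

     -- the tree with n at position n, i.e. index nats n == n
     nats :: Tree Int
     nats = go 0 1
       where go n s = Tree (go (n + s) s') n (go (n + 2 * s) s')
               where s' = 2 * s

     -- dependency-inject the tree lookup into fib_essence, as before
     fib_tree :: Tree Integer
     fib_tree = fmap (fib_essence (index fib_tree)) nats

     fib_faster :: Int -> Integer
     fib_faster = index fib_tree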

Your next food for thought: You have seen open recursion for recursive 
programs. How about open recursion for recursive data?
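
A hint at one possible answer, in the same spirit as fib_essence (my own 
sketch, not part of the original message): factor the recursive 
occurrence out of the data type into an extra type parameter, and tie 
the knot separately.

     -- the "essence" of a list: r stands where the recursive tail used to
     -- be, just as the parameter f stood for the recursive call in
     -- fib_essence
     data ListF a r = NilF | ConsF a r

     -- tie the knot to recover an ordinary list type
     newtype Fix f = Fix (f (Fix f))
     type List a = Fix (ListF a)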


