[Haskell-beginners] Function application versus function composition performance

Matt Ford matt at dancingfrog.co.uk
Wed Jun 20 00:01:32 CEST 2012


Hi All,

My last question got me thinking (always dangerous): an expression
that doubles every element of a list twice and takes the 2nd element
(index 1) goes, I assume, through the following steps.

double (double [1,2,3,4]) !! 1
double (1*2 : double [2,3,4]) !! 1
(1*2*2 : double (double [2,3,4])) !! 1
(1*2*2 : double (2*2 : double [3,4])) !! 1
(1*2*2 : 2*2*2 : double (double [3,4])) !! 1
2*2*2
8
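
(For concreteness, I'm assuming double is defined by hand along the
lines below; the exact definition is my guess, but it's what the trace
above is based on.)

double :: [Int] -> [Int]
double []     = []
double (x:xs) = x*2 : double xs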

Up to the element requested, all the preceding elements have to have
their expressions built up as Haskell processes the calculation.

Now I want to compare this to using function composition:

(double . double) [1,2,3,4] !! 1

This is the bit I'm unsure of: what does the composition look like?
It is easy to see that it could be simplified to something like:

(double . double) (x:xs) = x*4 : (double . double) xs
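
(I don't think that equation is something you can literally write as a
definition in Haskell, so here it is as a standalone function, under a
name I've made up, doubleDouble:)

-- hypothetical fused version: what I imagine (double . double) could
-- simplify to
doubleDouble :: [Int] -> [Int]
doubleDouble []     = []
doubleDouble (x:xs) = x*4 : doubleDouble xs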

This would mean the steps could be written:

(double . double) [1,2,3,4] !! 1
(1*4 : (double . double) [2,3,4]) !! 1
(1*4 : 2*4 : (double . double) [3,4]) !! 1
2*4
8
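
(One way I thought of actually counting the forced steps is the sketch
below using Debug.Trace; doubleT is just my own instrumented name.
Running both expressions in GHCi and counting the "double step" lines
printed should show how many cons cells each version really forces.)

import Debug.Trace (trace)

-- instrumented double: prints a line each time a cons cell is forced
doubleT :: [Int] -> [Int]
doubleT []     = []
doubleT (x:xs) = trace "double step" (x*2 : doubleT xs)

-- e.g. in GHCi:
--   doubleT (doubleT [1,2,3,4]) !! 1
--   (doubleT . doubleT) [1,2,3,4] !! 1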

Ignoring the first and last steps, that is half the number of steps
compared to the application version, which would be significant over
long lists.

So is this true: are composed functions simplified like this in Haskell
in general, so that they can have better performance than plain
function application?

I've seen posts on the internet that say it's a matter of style only:
http://stackoverflow.com/questions/3030675/haskell-function-composition-and-function-application-idioms-correct-us.
 But my reasoning suggests it could be more than that.

Perhaps function application is similarly optimised, maybe by replacing
all function applications with compositions and then simplifying?  Or
maybe the simplifying/optimisation step never happens?  As you can see,
I'm just guessing at things :-)  But it's nice to wonder.
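
(One crude thing I could try is timing a compiled program, something
like the sketch below using getCPUTime; I'm not at all sure this is a
sound way to benchmark, and the numbers are just raw picoseconds, but
it might hint at whether there's any real difference.)

import System.CPUTime (getCPUTime)

double :: [Int] -> [Int]
double []     = []
double (x:xs) = x*2 : double xs

main :: IO ()
main = do
  let xs = [1 .. 1000000] :: [Int]
  t0 <- getCPUTime
  print (double (double xs) !! 999999)    -- application version
  t1 <- getCPUTime
  print ((double . double) xs !! 999999)  -- composition version
  t2 <- getCPUTime
  print (t1 - t0, t2 - t1)                -- CPU time in picoseconds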

Many thanks for any pointers,

Matt.


