[Haskell-cafe] GHC magic optimization ?

Evan Laforge qdunkan at gmail.com
Thu Dec 3 23:32:46 EST 2009


> I will take an example:
>
> f x y = x + y
>
> The program asks the user to enter two numbers and prints the sum.  If the
> user enters "1 2", is "f 1 2 = 3" stored, with a garbage collector used to
> remove this dangling expression later?
> If the user enters "1 2" again, does GHC search those dangling results to
> try to find the result without computing it again?

If you do 'let x = f 1 2' then 'x' will be computed once it's
demanded, and at that point the thunk is overwritten with the
resulting number.  Further references just return that number.  But
once 'x' goes out of scope it gets GCed.  So whether the result is
reused depends on whether you keep 'x' in scope, just as in strict
languages.
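
Here's a minimal sketch of that sharing (my own names; Debug.Trace just
makes the evaluation visible; run it unoptimized or in ghci, since -O
may constant-fold the whole thing away):

import Debug.Trace (trace)

f :: Int -> Int -> Int
f x y = trace "computing" (x + y)

main :: IO ()
main = do
    let x = f 1 2      -- only builds a thunk, nothing is computed yet
    print x            -- forces the thunk: "computing" appears once
    print (x + 10)     -- reuses the already-evaluated 3, no second trace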

The interesting thing is CAFs (constant applicative forms), which at
the top level never go out of scope and hence live forever.  However,
it's my vague understanding that due to optimizations, a CAF that
doesn't look global can wind up at the top level.
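
To make the "live forever" part concrete, here's a hypothetical
top-level CAF (my own example):

-- 'table' takes no arguments, so once it has been forced, as much of
-- it as was demanded stays reachable for the rest of the program's run
table :: [Integer]
table = map (^ 2) [1 .. 1000000]

main :: IO ()
main = do
    print (table !! 999999)   -- first use builds and retains the list
    print (table !! 999999)   -- second use just walks the cached cells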

For example, given these two programs:

import qualified Data.Map as Map
import Debug.Trace (trace)

expensive arg = trace "exp" arg

-- first program:
f args key = Map.lookup key fm
  where fm = Map.fromList (expensive args)
main = let g = f [('a', 1), ('b', 2)] in print (g 'a', g 'c')

-- second program (compiled separately, hence the second 'main'):
g args = (v, args)
  where v = expensive 42
main = print (g 12, g 90)

'trace' shows that expensive is only called once in both cases, but
only when I pass -O2; otherwise it gets called twice.  That implies
inner definitions with no free variables are only promoted to the top
level (I believe it's called "let floating"?) with -O2.

I think the 'f' example is impressive.  The GHC optimizer is pretty neat!

http://www.haskell.org/haskellwiki/Constant_applicative_form

