[Haskell] Compilation of big, computed tables
Simon Peyton Jones
simonpj at microsoft.com
Thu Feb 23 06:13:05 EST 2006
http://haskell.org/haskellwiki/GHC:FAQ
(search for cse)
 Original Message
 From: haskell-bounces at haskell.org [mailto:haskell-bounces at haskell.org] On Behalf Of Bjorn Lisper
 Sent: 23 February 2006 10:28
 To: haskell at list.mightyreason.com
 Cc: Haskell at haskell.org
 Subject: Re: [Haskell] Compilation of big, computed tables

 Chris Kuklewicz:
 >Stefan Karrmann wrote:
 >> Dear all,
 >>
 >> can ghc compile huge tables into efficient code if they are constant at
 >> compile time?
 >>
 >> Two examples may clearify the question:
 >>
 >> big1 :: UArray Int Char
 >> big1 = array (0,1000) $! map (\i -> (i, toEnum i)) [0..1000]
 >>
 >> big2 = sum [0..10000] :: Int  -- == 50005000 == n*(n+1)/2 where n = 10000
 >>
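For reference, the first example can be written self-contained roughly as follows (a sketch; the module name is mine, and note that an unboxed array is already strict in its elements, so forcing the association list with $! gains little):

```haskell
module Big1 where

import Data.Array.Unboxed (UArray, listArray, (!))

-- listArray builds the array from the element list directly;
-- because UArray is unboxed, each Char is forced as it is stored.
big1 :: UArray Int Char
big1 = listArray (0, 1000) (map toEnum [0 .. 1000])
```

With this definition, big1 ! 65 yields 'A', but the array itself is still built at run time, on first use, which is what the question below is about.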
 >
 >GHC does not compute such values at compile time because
 >*) The computation may not finish in reasonable time (infinite/not halting)
 >*) The computation may generate a runtime error and crash the compilation
 >(divide by zero, pattern match failure, etc.)
 >*) Haskell is supposed to be lazy, things are computed when needed, not before
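When compile-time evaluation is genuinely wanted, it can be requested explicitly with a Template Haskell splice rather than relied on as an optimization (a sketch; the module name is mine, and it assumes GHC with the TemplateHaskell extension):

```haskell
{-# LANGUAGE TemplateHaskell #-}
module Big2 where

import Language.Haskell.TH.Syntax (lift)

-- The splice is run by GHC while compiling this module; the
-- resulting literal is embedded in the object code. If the
-- spliced expression loops or raises an error, so does the
-- compilation -- exactly the risk listed above.
big2 :: Int
big2 = $(lift (sum [0 .. 10000] :: Int))
```

This makes the programmer, not the compiler, responsible for the termination and error risks listed above.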

 Does this mean that GHC does not evaluate constant subexpressions at
 compile time? Or does it evaluate only some subclass of surely terminating
 and non-erroneous subexpressions at compile time?

 Constant subexpression evaluation is a common compiler optimization
 technique, so I would be a bit surprised if GHC does not do it.
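The difference is observable by inspecting GHC's simplified Core (a sketch; behavior assumed for GHC with -O, which I believe constant-folds small literal arithmetic but leaves large top-level computations as run-time CAFs):

```haskell
module Fold where

-- ghc -O -ddump-simpl should show this reduced to the literal 5:
-- simple arithmetic on literals is folded at compile time.
small :: Int
small = 2 + 3

-- This stays a thunk in the compiled code: it is evaluated at run
-- time on first use, then the result is shared (it is a CAF).
big :: Int
big = sum [0 .. 10000]
```

So GHC does do ordinary constant folding; what it avoids is running arbitrary (possibly non-terminating or failing) Haskell at compile time.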

 Björn Lisper
 _______________________________________________
 Haskell mailing list
 Haskell at haskell.org
 http://www.haskell.org/mailman/listinfo/haskell