[Haskell] Re: Compilation of big, computed tables

Simon Marlow simonmarhaskell at gmail.com
Tue Oct 24 07:14:13 EDT 2006

Bjorn Lisper wrote:
> Chris Kuklewicz: 
>>Stefan Karrmann wrote:
>>>Dear all,
>>>can ghc compile huge tables into efficient code if they are constant at
>>>compile time?
>>>Two examples may clarify the question:
>>>big1 :: UArray Int Char
>>>big1 = array (0,1000) $! map (\i -> (i,toEnum i)) [0..1000]
>>>big2 = sum [0..10000]::Int -- == 50005000 == n*(n+1)/2 where n = 10000
>>GHC does not compute such values at compile time because
>>*) The computation may not finish in reasonable time (infinite/non-halting)
>>*) The computation may generate a run-time error and crash the compilation
>>(divide by zero, pattern match failure, etc.)
>>*) Haskell is supposed to be lazy; things are computed when needed, not before
> Does this mean that GHC does not evaluate constant subexpressions at
> compile-time?

No, constant folding is definitely one of the optimisations that GHC 
performs.
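
For small, fully literal expressions you can see this in the simplified
Core.  A minimal sketch, assuming GHC with -O (module and binding names
here are just for illustration, not from the original message):

module Fold where

-- With -O, GHC's built-in rules fold this literal arithmetic, so the
-- generated Core (visible with -ddump-simpl) contains the result 25.
small :: Int
small = 2 * 10 + 5

-- This is *not* reduced at compile time: it is compiled as a CAF that
-- is evaluated once, on first demand, at run time.
big2 :: Int
big2 = sum [0..10000]

Compiling with something like "ghc -O -ddump-simpl Fold.hs" should show
the difference between the two bindings.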

> Or does it evaluate only some subclass of surely terminating
> and non-erroneous subexpressions at compile-time?

Yes, the compiler will not evaluate expressions that raise exceptions or 
fail.  For example, integer division is only performed at compile time 
when the divisor is non-zero.
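
A small sketch of that caveat, again assuming GHC with -O (the binding
names are made up for illustration):

module DivFold where

-- The divisor is a non-zero literal, so the folding rule can rewrite
-- this to 5 at compile time.
ok :: Int
ok = 10 `div` 2

-- The divisor is the literal 0, so no folding happens; the
-- divide-by-zero exception is raised at run time, and only if 'bad'
-- is ever demanded.
bad :: Int
bad = 10 `div` 0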

