[Haskell] Compilation of big, computed tables
Bjorn Lisper
lisper at it.kth.se
Thu Feb 23 05:27:42 EST 2006
Chris Kuklewicz:
>Stefan Karrmann wrote:
>> Dear all,
>>
>> can ghc compile huge tables into efficient code if they are constant at
>> compile time?
>>
>> Two examples may clarify the question:
>>
>> big1 :: UArray Int Char
>> big1 = array (0,1000) $! map (\i -> (i,toEnum i)) [0..1000]
>>
>> big2 = sum [0..10000]::Int -- == 50005000 == n*(n+1)/2 where n = 10000
>>
>
>GHC does not compute such values at compile time because
>*) The computation may not finish in reasonable time (infinite/not halting)
>*) The computation may generate a run-time error and crash the compilation
>(divide by zero, pattern match failure, etc.)
>*) Haskell is supposed to be lazy, things are computed when needed, not before
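[A sketch of one workaround, not part of the original exchange: the computation can be forced to happen at compile time by running it inside a Template Haskell splice, assuming GHC with the TemplateHaskell extension. The compiler evaluates the splice and embeds the resulting literal in the generated code; a non-terminating or erroneous expression then fails the build rather than the program.]

```haskell
{-# LANGUAGE TemplateHaskell #-}
module Main where

-- 'lift' turns a value with a Lift instance into a quoted expression,
-- so the sum below is computed once, by the compiler, and the
-- literal result is spliced into the object code.
import Language.Haskell.TH.Syntax (lift)

big2 :: Int
big2 = $(lift (sum [0..10000] :: Int))

main :: IO ()
main = print big2
```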
Does this mean that GHC does not evaluate constant subexpressions at
compile-time? Or does it evaluate only some subclass of surely terminating
and non-erroneous subexpressions at compile-time?
Constant subexpression evaluation is a common compiler optimization
technique, so I would be a bit surprised if GHC does not do it.
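[An editorial sketch of the distinction at issue: GHC's simplifier does fold constant arithmetic on literals, but it does not in general evaluate an arbitrary constant applicative form. Compiling the hypothetical module below with `ghc -O -c -ddump-simpl ConstFold.hs` and inspecting the Core output is one way to observe the difference.]

```haskell
-- ConstFold.hs: a small module for inspecting GHC's constant folding.
module ConstFold where

-- Literal arithmetic like this is folded by the simplifier;
-- the Core output contains the literal 5.
small :: Int
small = 2 + 3

-- A CAF like this is not evaluated at compile time; the sum
-- is computed once, on first demand, at run time.
big :: Int
big = sum [0..10000]
```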
Björn Lisper