[Haskell] Compilation of big, computed tables

Stefan Karrmann S.Karrmann at web.de
Wed Feb 22 15:42:33 EST 2006


Dear all,

Can GHC compile huge tables into efficient code if they are constant at
compile time?

Two examples may clarify the question:

import Data.Array.Unboxed

big1 :: UArray Int Char
big1 = array (0,1000) $! map (\i -> (i, toEnum i)) [0..1000]

big2 :: Int
big2 = sum [0..10000]  -- == 50005000 == n*(n+1)/2 where n = 10000

Both values are constant at compile time. As they are given by pure
functions, the compiler could evaluate them and write the *result* into the
object file 'foo.o'. This would save code size and run time.
I peeked into 'foo.hc' but I found neither 0x2fb0408 (the hex form of
50005000) nor an array {0, 1, 2, 3, ..., 1000} nor anything similar.


The function big2 should show that the computation of the value can be
very time-consuming. If the compiler does not compute it, the source file
could be generated by a helper program (or Template Haskell?); a sketch of
the latter follows.
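
If Template Haskell is an option, the sum can already be forced at compile
time today. A minimal sketch (the module name Big2 is my invention), using
litE and integerL from Language.Haskell.TH with the TemplateHaskell
extension enabled:

{-# LANGUAGE TemplateHaskell #-}
module Big2 where

import Language.Haskell.TH (litE, integerL)

-- The splice runs while GHC compiles the module, so the object code
-- contains only the literal 50005000, not the summation.
big2 :: Int
big2 = $( litE (integerL (sum [0..10000])) )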

The function big1 should show that converting the data into an array can
cost run time, code size and heap space: you need the list (perhaps written
out explicitly; think of 1000 fixed pseudo-random numbers) and then have to
convert it into an array at run time. If the compiler generated the unboxed
array directly, it could be rather efficient. A sketch of the
helper-program route follows.
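
A minimal sketch of such a generator (the file and module name Big1Table
are made up): it writes the table out as a string literal, which GHC can
compile to static data.

-- Generator: run once, before compiling the real program.
main :: IO ()
main = writeFile "Big1Table.hs" $ unlines
  [ "module Big1Table where"
  , ""
  , "-- Precomputed table, baked into the source as a string literal."
  , "table :: String"
  , "table = " ++ show (map toEnum [0..1000] :: String)
  ]

The real program could then build the array with
listArray (0,1000) Big1Table.table :: UArray Int Char, leaving only the
list-to-array conversion for run time.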


PS: I compiled with: ghc6 -c -O2 -keep-tmp-files -keep-hc-files foo.hs

Regards,
-- 
Stefan Karrmann