robin.houston at gmail.com
Thu Jan 31 18:05:32 EST 2008
Recently I wrote a short program that builds an array containing the
number of divisors of the first 10^7 natural numbers. (Some of you can
guess why I did this, but I won't say explicitly lest this post become
a Googleable spoiler.)
Here's the relevant part of the code.
-- Strictly increment element i of mutable array a
incElem a i = readArray a i >>= \n -> writeArray a i $! n+1
-- Create an array with elements initialised to 0
a <- newArray (1,n) 0 :: ST s (STUArray s Int Word16)
-- Populate the array
forM_ [2..n] $ \i -> forM_ [i, i+i .. n] (incElem a)
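For context, a self-contained version of the snippet might look like the following sketch. (`divisorCounts` is a name made up for this example, and the outer loop here starts at 1 rather than 2 so that the divisor 1 is counted as well.)

```haskell
import Control.Monad (forM_)
import Data.Array.ST (newArray, readArray, writeArray, runSTUArray)
import Data.Array.Unboxed (UArray, (!))
import Data.Word (Word16)

-- For each i in [1..n], bump every multiple of i, so that slot k
-- ends up holding the number of divisors of k.
divisorCounts :: Int -> UArray Int Word16
divisorCounts n = runSTUArray $ do
  a <- newArray (1, n) 0
  forM_ [1 .. n] $ \i ->
    forM_ [i, i + i .. n] $ \j -> do
      c <- readArray a j
      writeArray a j $! c + 1
  return a
```

For example, `divisorCounts 12 ! 12` evaluates to 6, since 12 has the divisors 1, 2, 3, 4, 6 and 12.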
Of course there are more efficient algorithms for this, but this one
has the merit of being extremely simple without being unreasonably
inefficient.

Except that it is a lot slower than it needs to be. After digging
around a bit, I suspected that the reason for this is that the general
form of enumeration [from, then .. to] does not benefit from the
usual list fusion (deforestation) optimisation.

When I added some deforestation rules to GHC/Enum.lhs and rebuilt the
library, I found my suspicion confirmed: after the change, my code
runs roughly twice as fast.
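For readers unfamiliar with what the fusion buys: without it, [i, i+i .. n] is built as an actual cons-list which forM_ then consumes. What the deforestation rules arrange is, in effect, the direct loop one would write by hand, something like this sketch (`loopFromThenTo` is an illustrative name, and it assumes a positive step):

```haskell
-- Hand-fused equivalent of forM_ [from, thn .. to]: no intermediate
-- list is ever allocated. Assumes thn > from, i.e. a positive step.
loopFromThenTo :: Monad m => Int -> Int -> Int -> (Int -> m ()) -> m ()
loopFromThenTo from thn to f = go from
  where
    step = thn - from
    go i
      | i > to    = return ()
      | otherwise = f i >> go (i + step)
```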
Is there any reason not to do this? I attach a diff of the change I
made, though I expect it could be done better by someone with
experience of this sort of thing.
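For anyone curious what such rules look like: GHC's list fusion is driven by RULES pragmas. A toy example of the mechanism (not the contents of the attached diff) is the classic map/map rule:

```haskell
module Fuse where

-- At -O, GHC rewrites any occurrence of the left-hand side into the
-- right-hand side, eliminating the intermediate list. The real
-- enumeration rules instead express [from, then .. to] via build,
-- so that foldr/build fusion can fire on the surrounding consumer.
{-# RULES
"map/map" forall f g xs. map f (map g xs) = map (f . g) xs
  #-}
```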
-------------- next part --------------
A non-text attachment was scrubbed...
Size: 2762 bytes
Desc: not available
Url : http://www.haskell.org/pipermail/glasgow-haskell-users/attachments/20080131/2d8bf663/deforest-efdtInt.obj