[Haskell-cafe] Question regarding deepseq (Control.DeepSeq)

Frank Moore fmoore at gmail.com
Thu Jun 24 20:57:31 EDT 2010


Hello Haskellers,

I am new to programming in Haskell, and I am having trouble understanding
exactly when expressions get evaluated.  My goal is to measure how long a
computation takes without having to use a show function.  The code I am
trying to use is below (taken in part from RWH, chapter 25):

----------------------------------
import Data.List (foldl')
import Data.Time.Clock (diffUTCTime, getCurrentTime)
import Control.DeepSeq (deepseq)

mean :: [Double] -> Double
mean xs = s / fromIntegral n where
    (n,s) = foldl' k (0,0) xs
    k (n,s) x = n `seq` s `seq` (n+1,s+x)

main = do
  let as = [1..1e7] :: [Double]
  start <- getCurrentTime
  let meanOver2 = deepseq (mean as) `seq` (mean as) / fromIntegral 2
  end <- getCurrentTime
  putStrLn (show (end `diffUTCTime` start))
  putStrLn (show meanOver2)
-------------------------------------

My understanding of deepseq was that it would evaluate (mean as) completely
before the second getCurrentTime, so the measured interval would cover the
computation and the show would take no time.  Instead, all the time is spent
in the show meanOver2 call.  I feel like I am missing something fundamental
here.  Any suggestions?  Thanks for your help.
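
For reference, here is a sketch of what I think forcing the value inside the
IO action would look like, using Control.Exception.evaluate together with an
infix deepseq (I am not sure this is the idiomatic approach):

----------------------------------
import Control.DeepSeq (deepseq)
import Control.Exception (evaluate)
import Data.List (foldl')
import Data.Time.Clock (diffUTCTime, getCurrentTime)

mean :: [Double] -> Double
mean xs = s / fromIntegral n where
    (n,s) = foldl' k (0,0) xs
    k (n,s) x = n `seq` s `seq` (n+1,s+x)

main :: IO ()
main = do
  let as = [1..1e7] :: [Double]
      m0 = mean as
  start <- getCurrentTime
  -- evaluate makes the forcing part of the IO action, so the work happens
  -- here, between the two getCurrentTime calls; deepseq forces m0 fully
  -- (for a flat type like Double, evaluate m0 alone would already suffice)
  m <- evaluate (m0 `deepseq` m0)
  end <- getCurrentTime
  print (end `diffUTCTime` start)
  print (m / 2)
-------------------------------------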

Frank