[Haskell-cafe] Re: Haskell performance
simonmarhaskell at gmail.com
Thu Dec 20 08:16:08 EST 2007
Malcolm Wallace wrote:
> Simon Peyton-Jones <simonpj at microsoft.com> wrote:
>> What would be very helpful would be a regression suite aimed at
>> performance, that benchmarked GHC (and perhaps other Haskell
>> compilers) against a set of programs, regularly, and published the
>> results on a web page, highlighting regressions.
> Something along these lines already exists - the nobench suite.
> darcs get http://www.cse.unsw.edu.au/~dons/code/nobench
> It originally compared ghc, ghci, hugs, nhc98, hbc, and jhc.
> (Currently the results posted there compare only variations of ghc
> fusion rules.)
> I have just been setting up my own local copy - initial results are up,
> where I intend to compare ghc from each of the 6.4, 6.6 and 6.8
> branches, against nhc98 and any other compilers I can get working.
> I have powerpc, intel, and possibly sparc machines available.
That's great. BTW, GHC has a performance bug affecting the calendar
benchmark at the moment.
The best GHC options for this program might therefore be -O2
-fno-state-hack. Or perhaps just -O0.
>> Like Hackage, it should be easy to add a new program.
> Is submitting a patch against the darcs repo sufficiently easy?
> Should we move the master darcs repo to somewhere more accessible, like
Yes, please do. When I have a chance I'd like to help out.
>> It'd be good to measure run-time,
>> but allocation count, peak memory use, code size, and
>> compilation time are also good (and rather more stable) numbers to
>> track.
> Nobench does already collect code size, but does not yet display it in
> the results table. I specifically want to collect compile time as well.
> Not sure what the best way to measure allocation and peak memory use is.
With GHC you need to use "+RTS -s" and then slurp in the <prog>.stat file.
You can also get allocations, peak memory use, and separate mutator/GC
times this way.
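As a sketch of that workflow: the snippet below imitates the kind of
lines a GHC "+RTS -s" stats file contains (the exact figures and the
program name are made up for illustration; the real file would come from
something like `./prog +RTS -sprog.stat -RTS`) and shows one way to
slurp the allocation and peak-residency numbers out of it.

```shell
# Fabricated sample in the style of a GHC "+RTS -s" stats file;
# a real run writes this for you.
cat > prog.stat <<'EOF'
   1,234,567 bytes allocated in the heap
      12,345 bytes maximum residency (2 sample(s))
  MUT   time    0.42s  (  0.43s elapsed)
  GC    time    0.05s  (  0.05s elapsed)
EOF

# Pull out the figures a benchmark harness would want,
# stripping the thousands separators so they are plain integers.
alloc=$(awk '/bytes allocated/ { gsub(",", "", $1); print $1 }' prog.stat)
peak=$(awk '/maximum residency/ { gsub(",", "", $1); print $1 }' prog.stat)
echo "allocated=$alloc peak=$peak"
```

The mutator and GC times on the MUT/GC lines can be extracted the same
way, which gives the separate mutator/GC breakdown mentioned above.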