[Haskell-cafe] Haskell performance

Malcolm Wallace Malcolm.Wallace at cs.york.ac.uk
Thu Dec 20 05:58:17 EST 2007


Simon Peyton-Jones <simonpj at microsoft.com> wrote:

> What would be v helpful would be a regression suite aimed at
> performance, that benchmarked GHC (and perhaps other Haskell
> compilers) against a set of programs, regularly, and published the
> results on a web page, highlighting regressions.

Something along these lines already exists - the nobench suite.
    darcs get http://www.cse.unsw.edu.au/~dons/code/nobench
It originally compared ghc, ghci, hugs, nhc98, hbc, and jhc.
(Currently the results at
    http://www.cse.unsw.edu.au/~dons/nobench.html
compare only variations of ghc fusion rules.)

I have just been setting up my own local copy - initial results at
    http://www.cs.york.ac.uk/fp/nobench/powerpc/results.html
where I intend to compare ghc from each of the 6.4, 6.6, and 6.8
branches against nhc98 and any other compilers I can get working.
I have powerpc, intel, and possibly sparc machines available.

> Like Hackage, it should be easy to add a new program.

Is submitting a patch against the darcs repo sufficiently easy?
Should we move the master darcs repo to somewhere more accessible, like
code.haskell.org?
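
For anyone wanting to contribute, the darcs workflow would be roughly
the following (the benchmark file name is just a placeholder):

    darcs get http://www.cse.unsw.edu.au/~dons/code/nobench
    cd nobench
    darcs add MyBench.hs                  # the new benchmark program
    darcs record -m "Add MyBench benchmark"
    darcs send

darcs send bundles up the patch and mails it to the repository
maintainer (or, with -o, writes it to a file to attach by hand), so no
write access to the master repository is needed.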

> It'd be good to measure run-time,

Done...

> but allocation count, peak memory use, code size,
> compilation time are also good (and rather more stable) numbers to
> capture.

Nobench does already collect code size, but does not yet display it in
the results table.  I specifically want to collect compile time as well.
I'm not yet sure of the best way to measure allocation and peak memory
use.
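
One possibility, at least for GHC-compiled binaries, would be the RTS
statistics output, with compile time captured by time(1).  A rough
sketch (program and module names are just examples):

    # compile time
    time ghc -O2 --make Bench.hs -o bench

    # run-time statistics, printed to stderr
    ./bench +RTS -sstderr

The -sstderr summary includes "bytes allocated in the heap" and
"maximum residency", which should correspond to allocation count and
peak memory use; other compilers would need their own equivalents.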

Regards,
    Malcolm
