[Haskell] [Fwd: Re: Computer Language Shootout]
simonmarhaskell at gmail.com
Thu Jan 25 03:55:37 EST 2007
Forwarding on behalf of Andrzej Jaworski <himself at poczta.nom.pl>:
-------- Original Message --------
From: Andrzej Jaworski <himself at poczta.nom.pl>
It is ironic that just after SPJ shared Brent Fulgham's comments on
Haskell and the shootout, the situation changed radically for the worse.
Not knowing this, I committed a blunder by citing that benchmark,
together with multicore-support arguments, while trying to convert a prominent
OCaml programmer to Haskell. Now they know more about us :-)
What kind of language jumps 30% up and down on a benchmark while other
languages stay gracefully in line? Can any of you explain the reason for
this disgraceful degradation in the Computer Language Shootout?
Clean has also declined in these benchmarks, but not as much as Haskell.
According to John van Groningen, Clean's binary-trees program in the previous
shootout version used a lazy data structure, which resulted in lower memory
usage and much faster execution. That program was removed by the maintainer of
the shootout and replaced by a much slower one using a strict data structure.
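The lazy-versus-strict point can be illustrated with a minimal Haskell sketch
(an assumed illustration, not the actual shootout entry): with lazy
constructor fields, a traversal that demands only part of the tree never
allocates the rest, whereas strict fields force every subtree as soon as a
node is constructed.

```haskell
-- Minimal sketch (illustrative only, not the shootout program):
-- a lazy tree versus a strict tree whose fields are forced on construction.

data LazyTree   = LazyNil   | LazyNode LazyTree Int LazyTree
data StrictTree = StrictNil | StrictNode !StrictTree !Int !StrictTree

-- Build a complete lazy tree of the given depth; nothing is evaluated
-- until a consumer demands it.
lazyTree :: Int -> LazyTree
lazyTree 0 = LazyNil
lazyTree d = LazyNode (lazyTree (d - 1)) d (lazyTree (d - 1))

-- Walk only the left spine, summing the labels.  Under laziness the
-- right subtrees are never forced, so this runs in time and space
-- proportional to the depth, even though the full tree would have
-- 2^depth - 1 nodes.
leftSpineSum :: LazyTree -> Int
leftSpineSum LazyNil          = 0
leftSpineSum (LazyNode l x _) = x + leftSpineSum l

main :: IO ()
main = print (leftSpineSum (lazyTree 50))
-- Prints 1275 essentially instantly.  The strict analogue would have
-- to materialise all 2^50 - 1 nodes before the sum could begin.
```

Replacing the lazy program with a strict one therefore changes not just
constant factors but the asymptotic memory behaviour, which is one plausible
source of the large swings discussed above.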
I fear that if laziness accounted for GHC's previous good scores, then
algorithms in which laziness is downplayed must be responsible for the
anecdotal opinion that Haskell can be extremely slow. For example, on the
Royal Road problem in genetic algorithms, Haskell was found to be on average
over 500 times slower than SML.
If such extreme variations in performance are inherent to Haskell, then
multicore support, rather than boosting its relative performance (against,
e.g., one-core-bound OCaml), may merely amplify Haskell's unevenness to the
point where programming becomes more of an art than a science.
Perhaps making a collective effort towards benchmarking Haskell programs and
analyzing the results in some methodical way could prove helpful?