nofib regressions in HEAD since 7.6.2 release
Johan Tibell
johan.tibell at gmail.com
Thu Feb 7 22:18:19 CET 2013
Hi all,
I just ran nofib on current HEAD and compared it to 7.6.2 on my 64-bit
Linux machine. There are some regressions that I think we should look into
before the next release (percentages are relative to 7.6.2; runtimes too
short for a meaningful percentage are shown as absolute times in seconds):
--------------------------------------------------------------------------------
Program               Size    Allocs   Runtime   Elapsed  TotalMem
--------------------------------------------------------------------------------
anna                 -0.8%     +1.5%      0.08      0.08     +0.0%
ansi                 -0.5%     +0.5%      0.00      0.00     +0.0%
atom                 -0.9%     -0.0%     -3.1%     -0.8%     +0.0%
awards               -0.9%     +0.1%      0.00      0.00     +0.0%
banner               +0.1%     +2.2%      0.00      0.00     +0.0%
bernouilli           -0.6%     +2.7%      0.12      0.13     +0.0%
boyer                -0.7%     +0.0%      0.03      0.04     +0.0%
boyer2               -0.1%     +0.0%      0.01      0.01     +0.0%
bspt                 -0.6%     -0.0%      0.01      0.02     +0.0%
cacheprof            -0.9%     +6.9%     +0.8%     +4.0%     +0.0%
calendar             -0.9%     +0.1%      0.00      0.00     +0.0%
cichelli             -0.1%     -0.0%      0.06      0.06     +9.4%
circsim              -0.7%     +0.1%     +1.6%     +7.0%    +13.6%
clausify             -0.8%     +0.0%      0.03      0.03     +0.0%
comp_lab_zift        -0.7%     +0.0%      0.14      0.14    -12.5%
compress             -0.1%     +0.0%      0.12      0.13     +0.0%
compress2            -0.0%     +0.0%      0.12      0.14     +2.3%
constraints          -0.8%     +0.0%     -3.3%     -0.7%     +0.0%
cryptarithm1         -0.1%     +0.0%     +2.1%     +4.1%     +0.0%
cryptarithm2         -0.0%     -0.8%      0.01      0.01     +0.0%
cse                  -0.0%     -0.0%      0.00      0.00     +0.0%
eliza                +0.1%     +7.7%      0.00      0.00     +0.0%
event                -0.8%     +0.0%      0.09      0.10     -8.7%
exp3_8               -0.8%     +0.0%      0.15      0.15   +100.0%
expert               -0.2%    +18.4%      0.00      0.00     +0.0%
fem                  +1.5%     +0.5%      0.02      0.02     +0.0%
fft                  -0.9%     +0.0%      0.02      0.03     +0.0%
fft2                 +3.7%    +31.9%      0.05      0.05    +20.0%
fibheaps             -1.0%     +0.3%      0.03      0.03     +0.0%
fish                 -0.0%     -0.0%      0.01      0.01     +0.0%
fluid                -1.3%    +13.0%      0.01      0.01     +0.0%
fulsom               -0.2%     -0.0%      0.19      0.20     +8.3%
gamteb               -0.7%     -0.2%      0.03      0.04     +0.0%
gcd                  -0.8%     +0.0%      0.02      0.03     +0.0%
gen_regexps          +0.0%     +2.2%      0.00      0.00     +0.0%
genfft               -0.8%     -0.3%      0.03      0.03     +0.0%
gg                   -0.6%    +40.4%      0.01      0.02    +50.0%
grep                 -1.8%     +2.3%      0.00      0.00     +0.0%
hidden               -0.7%     +3.5%     +4.1%     +7.6%     +0.0%
hpg                  -1.3%     -1.3%      0.05      0.10     +0.0%
ida                  -0.7%     -1.0%      0.07      0.08    +11.1%
infer                -0.9%     +0.5%      0.05      0.05    +30.0%
integer              -0.8%     +1.1%     +1.1%     +2.3%     +0.0%
integrate            -0.9%    +56.2%      0.20      0.23     +1.0%
kahan                -0.9%   +144.9%    +77.6%    +78.4%     +0.0%
knights              -0.1%     -0.4%      0.01      0.01     +0.0%
lcss                 -0.7%     +0.7%    -24.3%    -18.4%     +1.9%
life                 -0.2%     +0.0%      0.16      0.16     +0.0%
lift                 -0.0%     +0.1%      0.00      0.00     +0.0%
listcompr            -0.1%     -0.0%      0.06      0.06     +0.0%
listcopy             -0.1%     -0.0%      0.06      0.06     +0.0%
maillist             +0.0%     +1.9%      0.02      0.04    +16.5%
mandel               -0.8%     +0.0%      0.05      0.05     +0.0%
mandel2              -0.1%     -4.0%      0.00      0.01     +0.0%
minimax              -0.2%     -0.0%      0.00      0.00     +0.0%
mkhprog              -0.1%     +1.1%      0.00      0.00     +0.0%
multiplier           -1.2%     +0.0%      0.07      0.08     +0.0%
nucleic2             -3.3%    +19.5%      0.05      0.05     +0.0%
para                 -0.0%    +25.0%      0.22      0.23     +0.0%
paraffins            -0.8%     +0.0%      0.06      0.08     +7.5%
parser               -1.3%    +22.2%      0.03      0.03     +0.0%
parstof              -0.5%     +4.6%      0.01      0.00     +0.0%
pic                  +0.5%     +0.0%      0.00      0.00     +0.0%
power                -1.0%     +0.0%     -0.5%     +1.0%     +0.0%
pretty               -0.2%     +0.0%      0.00      0.00     +0.0%
primes               -0.8%     -0.0%      0.04      0.05     +0.0%
primetest            -0.7%     +0.0%      0.07      0.07     +0.0%
prolog               -0.2%    +16.0%      0.00      0.00     +0.0%
puzzle               -0.1%     -2.1%      0.09      0.10     +0.0%
queens               -0.8%     +0.0%      0.02      0.02     +0.0%
reptile              -0.8%     +0.6%      0.01      0.02     +0.0%
rewrite              -0.8%     +0.7%      0.02      0.02     +0.0%
rfib                 -1.0%     +0.4%      0.02      0.02     +0.0%
rsa                  -0.7%     +2.5%      0.02      0.02     +0.0%
scc                  -0.1%     +0.0%      0.00      0.00     +0.0%
sched                -0.8%     +0.0%      0.01      0.02     +0.0%
scs                  -1.9%     +0.8%     -5.3%     -2.7%     +0.0%
simple               -0.3%     -0.0%      0.15      0.16     +6.9%
solid                -0.7%     +0.0%      0.09      0.09     +0.0%
sorting              +0.0%    +55.9%      0.00      0.00     +0.0%
sphere               -0.9%     -1.8%      0.04      0.04     +0.0%
symalg               -0.8%     +0.3%      0.01      0.01     +0.0%
tak                  -0.8%     +0.9%      0.01      0.01     +0.0%
transform            -0.8%     +0.0%     -3.3%     -5.7%     +0.0%
treejoin             +0.1%   +109.3%      0.15      0.17     -7.4%
typecheck            -0.8%     +0.0%      0.14      0.15     +0.0%
veritas              -0.8%     +0.0%      0.00      0.00     +0.0%
wang                 -0.9%     +0.0%      0.07      0.08     +0.0%
wave4main            -0.8%     +1.8%      0.18      0.19     -7.1%
wheel-sieve1         -0.7%     +0.0%     +0.0%     +1.5%    -12.5%
wheel-sieve2         -0.7%     +0.0%      0.11      0.12     +2.1%
x2n1                 +7.4%    +43.3%      0.01      0.01   +200.0%
--------------------------------------------------------------------------------
Min                  -3.3%     -4.0%    -24.3%    -18.4%    -12.5%
Max                  +7.4%   +144.9%    +77.6%    +78.4%   +200.0%
Geometric Mean       -0.5%     +5.5%     +1.8%     +4.3%     +3.2%
I haven't had time to look through the regressions yet, so if someone has
time, please grab a benchmark that looks bad and have a look at the Core to
see what's going on. I suggest starting with "kahan", which is a relatively
simple benchmark.
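If it helps anyone get started, below is a rough sketch of a Kahan-summation
style loop (my own toy program, not the actual nofib source, so treat it only
as a starting point). Compiling something like this with
"ghc -O2 -fforce-recomp -ddump-simpl -ddump-to-file" under both 7.6.2 and HEAD
and diffing the two Core dumps should make it easy to see where the compilers
diverge; the real benchmark is presumably more involved, but the interesting
part should be a similar tight loop over Doubles.

{-# LANGUAGE BangPatterns #-}
module Main (main) where

-- Kahan (compensated) summation: 's' is the running sum and 'c' the
-- running compensation for the low-order bits lost so far.
kahanSum :: [Double] -> Double
kahanSum = go 0 0
  where
    go !s _  []     = s
    go !s !c (x:xs) =
      let y  = x - c          -- apply the compensation to the next input
          s' = s + y
          c' = (s' - s) - y   -- recover the rounding error of s + y
      in go s' c' xs

main :: IO ()
main = print (kahanSum (map fromIntegral [1 :: Int .. 1000000]))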
-- Johan