[Haskell-cafe] How far compilers are allowed to go with optimizations?
Johan Holmquist
holmisen at gmail.com
Sat Feb 9 09:56:12 CET 2013
As a software developer who typically inherits code to work on rather
than writing new code from scratch, I see a potential for aggressive
compiler optimizations to cause trouble. It goes like this:
Programmer P inherits some application/system to improve upon. One day
he spots a piece of rather badly written code, so he sets out and
rewrites it, happy with the improvement this brings to clarity and
likely also to efficiency.
The code goes into production and then: disaster. The new "improved"
version runs three times slower than the old one, making it practically
unusable. It has to be rolled back, with loss of uptime and
functionality, and management is not happy with P.
It just so happened that the old code triggered some aggressive
optimization unbeknownst to everyone, **including the original
developer**, while the new code did not. (That optimization may even
have been triggered only by a certain version of the compiler, the one
that happened to be in use at the time.)
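
To make the scenario concrete, here is a minimal, hypothetical Haskell
sketch of how it could play out with GHC's list fusion. The names are
made up, and the NOINLINE pragma merely stands in for whatever hides a
definition from the optimiser in real code (a module boundary, a size
threshold, a different compiler version):

  module Main where

  -- Old style: producer, map and consumer are all visible in one
  -- expression, so with -O GHC's foldr/build rewrite rules typically
  -- fuse the pipeline into a single loop with no intermediate list.
  totalOld :: Int -> Int
  totalOld n = sum (map (\x -> x * x) [1 .. n])

  -- "Cleaner" rewrite: the squares are factored out into their own
  -- definition.  With the definition hidden from the optimiser, the
  -- producer and consumer no longer fuse, so the intermediate list
  -- really is allocated.
  squares :: Int -> [Int]
  squares n = map (\x -> x * x) [1 .. n]
  {-# NOINLINE squares #-}

  totalNew :: Int -> Int
  totalNew n = sum (squares n)

  main :: IO ()
  main = do
    print (totalOld 1000000)   -- same result ...
    print (totalNew 1000000)   -- ... but far more allocation without fusion

Whether the first version actually fuses depends on the GHC version and
the flags in use, which is exactly the kind of fragility I have in mind.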
I fear being P some day.
Maybe this is something that would never happen in practice, but how
can one be sure...
/Johan
2013/2/6 Jan Stolarek <jan.stolarek at p.lodz.pl>:
> You're right; somehow it didn't occur to me that DPH is doing exactly the same thing. Well, I
> think this is a convincing argument.
>
> Janek