upgrading GHC

Henning Thielemann lemming at henning-thielemann.de
Thu Feb 21 02:12:19 EST 2008


 I have now spent some days(!) making only some of my packages
compatible with both GHC-6.4.1 and GHC-6.8.2. The amount of adaptation
work has increased with every GHC update, partly because the number of
installed packages grows constantly. I will hardly be able to manage this
work for GHC-6.10; many packages will then go 'outdated', maybe only
weeks after their release. Some people wonder why one does not simply
upgrade. There are many reasons. This way you can easily fall into a gap
between dependent packages that have not yet been updated to GHC-6.8.2
and others that are already updated but not backwards compatible.
Compiler versions also differ in usability, bugs and annoyances. Namely,
GHC-6.4.1 introduced wrong warnings about apparently superfluous imports,
and a bug that made one of my modules uncompilable because the compiler
ran out of memory; GHC-6.6 replaced working filename completion with only
partially working identifier completion (it was certainly not a good idea
to remove the old behaviour completely before the new one worked
reliably, but it happened and we have to cope with it); GHC-6.8.1 had a
bug in compilation.
So after investing much time in an upgrade you may find that your
programs no longer work or that usability has decreased considerably, and
you have the choice of waiting for the next compiler release, trying to
compile the HEAD version from the repository yourself (good luck!), or
turning everything back to the old version. Even if the compiler only
gets better with respect to features, you might decide not to upgrade,
because the newer version consumes more memory or is slower due to the
additional features the compiler must handle.
 Every GHC update so far has forced me to recompile my packages and has
broken some code, whether through new class instances, through modules
being replaced by newer ones, or through modules being shifted between
packages. Sometimes the update helped improve the code, either when the
compiler emitted new warnings or when internal functions were changed and
I became aware that I was using internal functions. But it is very hard
to get a library to compile on different compiler versions, not to
mention different compilers. This is especially nasty if you are working
in an institute (like the universities I worked at in the past) with
different machines and very different software installations. We have
some Solaris machines here with GHC-5, which I do not administer, Linux
machines with GHC-6.4.1, GHC-6.6.1 and so on. I cannot simply push
patches around with darcs because every machine needs separate package
adaptation.
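One common (if clumsy) way to keep a module compiling across such different GHC versions is to guard the imports with CPP on __GLASGOW_HASKELL__. A minimal sketch, assuming the only difference is Data.Map (GHC-6.4 and later) versus the older Data.FiniteMap; the Table/lookupTable names are made up for illustration:

```haskell
{-# LANGUAGE CPP #-}

#if __GLASGOW_HASKELL__ >= 604
-- GHC-6.4 and later ship Data.Map
import qualified Data.Map as M

type Table k v = M.Map k v

fromListTable :: Ord k => [(k, v)] -> Table k v
fromListTable = M.fromList

lookupTable :: Ord k => k -> Table k v -> Maybe v
lookupTable = M.lookup
#else
-- older GHCs only provide Data.FiniteMap
import Data.FiniteMap

type Table k v = FiniteMap k v

fromListTable :: Ord k => [(k, v)] -> Table k v
fromListTable = listToFM

lookupTable :: Ord k => k -> Table k v -> Maybe v
lookupTable k t = lookupFM t k
#endif

main :: IO ()
main = print (lookupTable "b" (fromListTable [("a", 1), ("b", 2)]))
```

The rest of the code then uses only the compatibility names, so the version test lives in one place; of course every new package reshuffle still means another #if branch.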
 It was said that Cabal would also work with GHC-6.2. I did not get it
running and then switched to GHC-6.4. It was said that multiple versions
of GHC can be installed on the same machine. That is somewhat true, but
e.g. runhaskell cannot be told which actual GHC binary to use, and thus
it is not possible to run Cabal with a compiler or a compiler version
different from the compiler to be used for the package.
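There is at least a partial workaround on the Cabal side (though not for runhaskell itself): the configure step accepts a --with-compiler flag naming the GHC binary that should build the package, independently of the GHC that runs Setup. A sketch, assuming versioned binaries such as ghc-6.4.1 and ghc-6.8.2 are on the PATH:

```shell
# Build Setup with whichever GHC happens to run Cabal ...
ghc-6.4.1 --make Setup.hs -o setup

# ... then point Cabal at the compiler the package itself should use.
./setup configure --with-compiler=ghc-6.8.2
./setup build
```

This only helps once Cabal itself compiles on both versions, which is exactly the chicken-and-egg problem described above.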

 I decided to upgrade to Cabal-1.2, which also required installing
filepath. I know that installation could be simplified with cabal-install,
which has even more dependencies, and thus I cancelled this installation
project. Then I equipped my Cabal files with a switch on splitBase,
which merely duplicates the globally known information that the former
base-1.0 package is now split into base-2.0 or base-3.0 and satellites.
It does not give the user any new value, but costs the package maintainer
a lot of time. I wonder whether it would have been simpler to ship
GHC-6.8 with a base-1.0 package, or to provide one on Hackage, that just
re-exports the old modules in the known way. This would allow the use of
packages that are in different states of adaptation, and it would reduce
the amount of work for package maintainers considerably. I also predict
that the switching on different package arrangements in Cabal files will
grow in the future, eventually becoming error-prone and unmaintainable.
How many GHC versions do you have installed simultaneously in order to
test them all?
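For readers who have not met it: the splitBase switch is a Cabal-1.2 configuration flag that every affected package has to repeat. A typical sketch (the exact satellite packages depend on what the library imports):

```
Flag splitBase
  Description: Choose the new smaller, split-up base package.

Library
  If flag(splitBase)
    Build-Depends: base >= 3, containers, directory
  Else
    Build-Depends: base < 3
```

Cabal tries the flag both ways until the dependencies resolve, so the same information about the base split is re-encoded, slightly differently, in every single package.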

 Don't misunderstand me. I embrace tidying the libraries, but I urge that
it be done in a more compatible manner. Deprecated packages do not need
to be banned from the internet. It is not necessary to force programmers
to adapt to changes immediately; it is better to provide ways to make the
changes later, when the time has come, in a smooth manner. I thought it
was a good idea to adapt to FunctorM in GHC-6.4 quickly instead of
rolling my own class. Then, two GHC releases later, this module had
disappeared, replaced by Traversable. I thought it was good style to
rewrite code from List.lookup to FiniteMap in GHC-6.0; in GHC-6.4 it had
already disappeared, replaced by Data.Map. Why is it necessary to make
working libraries obsolete so quickly? I thought using standard modules
was more reliable (because of more testers and more possible maintainers)
than using custom modules. If libraries change so quickly, this forces
programmers to fork to their own modules.
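For the record, the functionality of FunctorM's fmapM lives on in Data.Traversable, where traverse (and its monadic sibling mapM) plays the same role. A small sketch of what the replacement looks like; `positive` is a made-up example function:

```haskell
import Data.Traversable (traverse)

-- Validate an Int, failing (Nothing) on non-positive input.
positive :: Int -> Maybe Int
positive x = if x > 0 then Just x else Nothing

main :: IO ()
main = do
  -- traverse applies the effectful function to every element;
  -- this is roughly what FunctorM's fmapM used to do.
  print (traverse positive [1, 2, 3])   -- every element succeeds
  print (traverse positive [1, -2, 3])  -- one failure poisons the whole result
```

So the migration is mechanical, which makes it all the more frustrating that each package has to discover and perform it independently.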


More information about the Libraries mailing list