A-posteriori .cabal updates don't help (much) with optimistic-upper-bounds (was: Gearing up for a cabal-install-1.16.0 release)
Herbert Valerio Riedel
hvr at gnu.org
Mon Dec 3 12:35:49 CET 2012
Duncan Coutts <duncan.coutts at googlemail.com> writes:
[...]
> I mentioned I'd written a feature to update the .cabal file we use
> when building to match the one in the package archive index. I've
> now pushed that to the head branch. I think it would be good to have
> on the 1.16 branch and included in the cabal-install release.
>
> For context, as you know there's been a lot of discussion about
> dependency version bounds and whether we should use conservative or
> optimistic bounds. Something that we've been thinking about for a
> while that would help here is if we could adjust the dependencies
> after a release.
>
> For example, suppose you used conservative bounds on a dependency
> and later on a new version of that dependency is released, and it so
> happens that you were lucky this time and your package still builds
> and works with the new version. It'd be nice if you (or some other
> helper monkey) could just adjust the version constraints to reflect
> reality.
>
> Or the other way around, if you were using optimistic bounds and a
> new version of a dep is released and you were unlucky and it now
> fails, then again we can tighten the version bounds.
IMO, being able to modify .cabal files server-side after the fact helps
the situation with (overly) conservative version bounds, but it isn't
practical for optimistic bounds, and here's why:
As everybody knows, with conservative version bounds the Hackage
database is by definition always in a "consistent" state, meaning that
any combination of packages satisfying the respective inter-package
version constraints is guaranteed to build properly[1]. The worst thing
that can happen is that the solver can't find a set of compatible
packages. When a new package is added to the Hackage database, this
property/invariant is preserved (which is essential for having
reproducible builds).
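
To make the terminology concrete, a conservatively bounded dependency
in a .cabal file could look like this (package and version numbers are
just illustrative):

  -- conservative: the dependency carries an upper bound, so a future
  -- text-0.11 is excluded until someone has verified that the package
  -- actually builds against it
  build-depends: text >= 0.10 && < 0.11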
With optimistic bounds on the other hand, not every combination of
packages for which the inter-package constraints are satisfied is also
guaranteed to build properly. In fact, it only takes one fundamental
package being uploaded with an incompatible API change to break the
compilation of almost every package (until those are fixed by adding
version bounds (which I claim doesn't work well) or by adapting to the
new incompatible API (which could lead to avalanche effects)).
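
The optimistic variant of the same field simply leaves off the upper
bound, implicitly claiming compatibility with every future release of
text (again just an illustrative sketch):

  -- optimistic: no upper bound; the constraint remains satisfied by
  -- any future text release, whether or not the package still
  -- compiles against it
  build-depends: text >= 0.10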
But let's assume a simpler scenario involving only two packages:
1) Hackage contains the package text-0.10
2) there's a package text-foo which depends on 'text >= 0.10' (i.e. with no
upper bound), for which N versions exist on Hackage (all with
the same 'text >= 0.10' constraint):
0.1.0, 0.1.0.1, 0.1.0.2, 0.1.1, 0.1.2, 0.2, ...some more... ,
0.7.2.2, 0.7.2.3
3) now text-0.11 gets uploaded, which has an incompatible API change
   (for instance the function 'breakOn' which text-foo uses internally
   gets renamed to 'breakBy'; a minimal sketch of the resulting
   breakage follows after this list)
4) text-foo-0.7.2.3's .cabal file is updated server-side to have the
   version constraint tightened to 'text >= 0.10 && < 0.11'
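
To make 3) concrete, imagine text-foo contains a module along these
lines (a minimal sketch; the breakOn/breakBy rename is of course only
the hypothetical API change from above, and 'splitHeader' is a made-up
function):

  {-# LANGUAGE OverloadedStrings #-}
  module TextFoo.Internal (splitHeader) where

  import Data.Text (Text)
  import qualified Data.Text as T

  -- split an input line at the first ':' (e.g. for header parsing)
  splitHeader :: Text -> (Text, Text)
  splitHeader = T.breakOn ":"

  -- against the hypothetical text-0.11, 'T.breakOn' no longer exists
  -- (it was renamed to 'breakBy'), so this module fails to compile in
  -- every text-foo version whose constraint still admits text-0.11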
Alas, 4) isn't enough, as it leaves N-1 versions of text-foo with an
inaccurate version constraint, and if the cabal solver tries to find a
compatible version of 'text-foo' it will simply backtrack to
text-foo-0.7.2.2, which still has the optimistic 'text >= 0.10'
constraint. In order to really fix 'text-foo' one would have to edit
the .cabal files of all N of its versions server-side.
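
Schematically, after step 4) the solver is faced with something like
this (only the relevant constraints are sketched):

  text-foo-0.7.2.3   build-depends: text >= 0.10 && < 0.11   (edited)
  text-foo-0.7.2.2   build-depends: text >= 0.10             (unchanged)
  ...
  text-foo-0.1.0     build-depends: text >= 0.10             (unchanged)

  available: text-0.10, text-0.11

  "valid" install plan: text-foo-0.7.2.2 + text-0.11
    -> satisfies all declared constraints, yet fails to compile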
So what I'm trying to say is that the current design of the cabal
solver and the proposed server-side .cabal edit feature lend themselves
better to relaxing version constraints than to tightening them.
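
For comparison, here's why the relaxing direction needs only a single
edit (again just a sketch, this time of the conservative scenario): if
text-0.11 turns out to be compatible after all, it suffices to relax
the bound of the latest text-foo release; the unedited older versions
remain consistent and can at worst be passed over by the solver, never
silently broken.

  text-foo-0.7.2.3   build-depends: text >= 0.10 && < 0.12   (edited)
  text-foo-0.7.2.2   build-depends: text >= 0.10 && < 0.11   (unchanged)

  every install plan the solver can still produce builds; the unedited
  versions merely can't be paired with text-0.11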
Does this make any sense?
Cheers,
hvr
[1] By "building properly" I mean that the build doesn't fail due to
inter-package API incompatibilities.