[Haskell-cafe] Tool to brute-force test against hackage libraries to determine lower bounds?
rrnewton at gmail.com
Thu Nov 10 06:16:20 CET 2011
> What about dependency interactions? If you depend on foo and bar there
> might be versions of foo and bar that don't build together that you might
> not discover by varying their versions independently.
Indeed. But assuming for a moment that foo & bar have correctly specified
their own dependency bounds, won't the constraint solver make up for some of
this deficiency? That is, you specify too low a version for foo, but the
range gets further restricted by cabal's constraint solver and you end up ok?
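For what it's worth, one way to probe this by hand is cabal's `--constraint` flag: pin foo at the lowest version you claim to support and let the solver pick everything else. (The version number here is just an illustration, not a real package bound.)

```shell
# Hypothetical: claim foo >= 0.1 in the .cabal file, then force the
# solver to actually try that lower bound. --dry-run shows the install
# plan (or a solver failure) without building anything.
cabal install --constraint='foo == 0.1' --dry-run
```

If the solver can still find a consistent plan, the extra restriction came from foo's own bounds rather than ours.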
I proposed the greedy approach only because, given current compile times, I
don't think it would be feasible to try all combinations ;-). **
Though I suppose a decent heuristic would compute the total number of
combinations and -- if it is manageable -- do them all. If not, either
resort to greedy/independent testing or bring out the more complex
strategies for sampling the version space...
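To make the heuristic concrete, here is a rough sketch of what I mean. The package names, version lists, and the budget threshold are all made up for illustration; the point is just the shape of the decision:

```haskell
-- Hypothetical version ranges per dependency (names are illustrative).
versionSpace :: [(String, [String])]
versionSpace =
  [ ("foo", ["0.1", "0.2", "0.3"])
  , ("bar", ["1.0", "1.1"])
  ]

-- Total size of the full Cartesian product of version choices.
totalCombinations :: [(String, [String])] -> Int
totalCombinations = product . map (length . snd)

-- Exhaustive plan: every combination of one version per dependency.
allCombinations :: [(String, [String])] -> [[(String, String)]]
allCombinations deps = sequence [ [ (n, v) | v <- vs ] | (n, vs) <- deps ]

-- Greedy/independent plan: vary one dependency at a time, holding
-- every other dependency at its newest listed version.
independentPlan :: [(String, [String])] -> [[(String, String)]]
independentPlan deps =
  [ [ (n', if n' == n then v else last vs') | (n', vs') <- deps ]
  | (n, vs) <- deps, v <- vs ]

-- Heuristic: exhaust the space if it is small enough, else go greedy.
plan :: Int -> [(String, [String])] -> [[(String, String)]]
plan budget deps
  | totalCombinations deps <= budget = allCombinations deps
  | otherwise                        = independentPlan deps
```

For the toy space above the exhaustive plan is 3 * 2 = 6 builds, while the independent plan is 3 + 2 = 5 -- the gap obviously grows fast with more dependencies, which is the whole motivation for the greedy fallback.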
But enough idle speculation! I know people have studied this problem in
earnest, and I haven't read any of that work.
** P.S. If one could carefully control how the compiler output is managed, I
guess you could cut way down on the number of actual module compilations
needed to explore a given set of combinations. (A particular module should
only need to be compiled once for each unique combination of its own
dependencies present in the set of combinations being examined, right?)
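A tiny sketch of that last point, with a made-up version space: a module that imports only foo sees just foo's version, so across all (foo, bar) combinations it only needs one build per distinct foo version.

```haskell
import Data.List (nub, sort)

-- Hypothetical: all combinations of (foo, bar) versions under test.
combos :: [[(String, String)]]
combos = sequence [ [ ("foo", v) | v <- ["0.1", "0.2", "0.3"] ]
                  , [ ("bar", v) | v <- ["1.0", "1.1"] ] ]

-- Builds needed for a module that imports only the named dependencies:
-- project each combination onto those dependencies, count distinct results.
buildsNeeded :: [String] -> [[(String, String)]] -> Int
buildsNeeded deps =
  length . nub . map (sort . filter ((`elem` deps) . fst))
```

Here `buildsNeeded ["foo"] combos` is 3 while `length combos` is 6, so sharing compiled output across combinations would halve the work for that module even in this tiny example.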