qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Greg Weber greg at gregweber.info
Thu Feb 27 20:36:51 UTC 2014


On Thu, Feb 27, 2014 at 9:35 AM, Austin Seipp <austin at well-typed.com> wrote:

> Hi Greg,
>
> On Thu, Feb 27, 2014 at 10:30 AM, Greg Weber <greg at gregweber.info> wrote:
> > I actually think work on the cabal solver has been a distraction
> > from more pressing issues: the need for sandboxes (that is done now)
> > and reproducible builds (frozen dependencies). If you look at Ruby's
> > Bundler, which has been extremely successful, it has historically
> > been a dumb tool in terms of its solver (maybe they have a better
> > solver now), and it works extremely well. I think 90+% of this
> > conversation is pretty wasteful, because once we have reproducible
> > builds everything is going to change. If the energy could be
> > re-directed to being able to create reproducible builds in Haskell,
> > then we could figure out what the next most important priority is.
>
> I'd like to carefully point out, however, that it is not a zero-sum
> game - work dedicated to improving the constraint solver is not work
> implicitly taken away from any other set of tools, like a 'freeze'
> command. There is no 'distraction' IMO - it is a set of individuals
> (or companies, even), each with their own priorities. I think this is
> actually the sign of a healthy community - one that places importance
> on its tools and seeks out ways to improve them on a variety of
> fronts. A freeze command and an improved solver are both excellent
> (and worthy) improvements.
>

I agree that it is not zero-sum, but I do think that at some point the
wrong priorities must have been chosen, since I still have to go to
special effort to produce a consistent build. This is also all getting
mixed up with a lot of talk about the PVP and other things whose
relevance changes once the underlying installation machinery supports
what every application developer should be doing.
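
To make it concrete, the workflow I have in mind looks roughly like the
sketch below. This is hypothetical: it assumes a "cabal freeze"-style
command and the cabal.config constraints file it would generate, and
the package versions are just placeholders.

    # build the dependencies once, then record the exact versions chosen
    $ cabal install --only-dependencies
    $ cabal freeze

    -- cabal.config (generated): every transitive dependency pinned
    -- to the exact version that was just built
    constraints: aeson ==0.7.0.1,
                 bytestring ==0.10.4.0,
                 text ==1.1.0.0

Check that file in, and every developer and CI machine resolves to the
same install plan instead of whatever the solver happens to pick that
day.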


> In reality, Bundler works precisely for the reason you said it did: it
> avoids all the actually difficult problems. But that comes at a cost,
> because Bundler, for example, can't actually tell me when things *are*
> going to break. If I bump my dependencies, create a new Gemfile.lock,
> and test - it could all simply explode later on at runtime, even if it
> could have been concluded from the constraints that it was all invalid
> in the first place. The only thing Bundler buys me is that this
> explosion won't potentially extend to the rest of my global
> environment when it happens. Which is a good thing, truth be told, and
> why it is so popular - otherwise this happens constantly.
>

This wasn't my experience using Bundler. Bundler supports conservative
upgrades that keep the package set consistent: if you want to upgrade
something, you place a version range on it in the Gemfile and ask
Bundler to upgrade just that gem. I don't doubt, though, that it may
let you manually subvert the system.
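
The cabal analogue of that conservative upgrade would look something
like the following - again a hypothetical sketch, building on the
frozen cabal.config above: relax only the pin you want to move,
re-solve with everything else still held fixed, and re-freeze.

    -- cabal.config: loosen just the one constraint
    -- (was: aeson ==0.7.0.1)
    constraints: aeson >=0.7 && <0.8,
                 bytestring ==0.10.4.0,
                 text ==1.1.0.0

    # re-solve with the remaining pins in force, then record the result
    $ cabal install --only-dependencies
    $ cabal freeze

Everything outside the relaxed range stays exactly where it was, which
is what keeps the package set consistent.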


> These two concerns are, as far as I can see, in no way opposed in
> spirit or practice, and suggesting one is essentially wasted effort
> that distracts people - when I see no evidence of that - strikes me as
> odd.
>
>
I think the Industrial Haskell Group supported work on a better solver,
which was definitely helpful, but I just think it would have been wiser
to support work on consistent builds first. I agree that they can be
worked on independently.