<div dir="ltr"><div class="gmail_quote"><div dir="ltr">On Tue, Oct 20, 2015 at 2:24 PM Ivan Perez <<a href="mailto:ivan.perez@keera.co.uk" target="_blank">ivan.perez@keera.co.uk</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div text="#000000" bgcolor="#FFFFFF">
<div>On 20/10/15 19:47, Mike Meyer wrote:<br>
</div>
<blockquote type="cite">
<div dir="ltr">
<div class="gmail_quote">
<div dir="ltr">On Tue, Oct 20, 2015 at 1:35 PM Gregory Collins
<<a href="mailto:greg@gregorycollins.net" target="_blank">greg@gregorycollins.net</a>>
wrote:</div>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div dir="ltr">
<div class="gmail_extra">
<div class="gmail_quote">The point Johan is trying to
make is this: if I'm thinking of using Haskell, then
I'm taking on a lot of project risk to get a
(hypothetical, difficult to quantify) X% productivity
benefit. If choosing it actually <b>costs</b> me a
(real, obvious, easy to quantify) Y% tax because I
have to invest K hours every other quarter fixing all
my programs to cope with random/spurious changes in
the ecosystem and base libraries, then unless we can
clearly convince people that X >> Y, the
rationale for choosing to use it is degraded or even
nullified altogether.</div>
</div>
</div>
</blockquote>
<div><br>
</div>
<div>So I'll rephrase a question I asked earlier that never
got an answer: if I'm developing a commercial project based
on ghc and some ecosystem, what would possibly cause me to
change either the ghc version or any part of the ecosystem
every other quarter? Or ever, for that matter?</div>
</div>
</div>
</blockquote></div><div text="#000000" bgcolor="#FFFFFF">
I don't know about them, but I can tell you about my personal experience.<br>
<br>
If GHC and all libraries were perfect, free from bugs, and fully
optimized, then you'd be right: there would be no reason ever to
change.<br>
<br>
But if you ever hit a bug in GHC or in a library that is only fixed in
a later version, or if you want an improvement that only a later
version provides, you may have to update the compiler.<br>
<br>
Library creators and maintainers do not always keep their libraries
compatible with very old or very new versions of the compiler. In an
ecosystem like ours, with three versions of the compiler in
simultaneous use, each with different language features and changes
to the base APIs, maintaining compatibility requires a lot of work.<br>
<br>
This problem is transitive: if you depend on (a new version of a
library that depends on)* a new version of base or a new language
feature, you may have to update GHC. If you do not have the
resources to backport those fixes and improvements yourself, you'll
be forced to update. Large projects are likely to use hundreds of
auxiliary libraries, so this is very likely to happen.<br>
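<br>
To make that concrete, here is a hypothetical .cabal fragment (the
package name and version numbers are invented for illustration): if
foo-1.1 contains the fix you need but requires base-4.8, then bumping
foo alone forces you onto GHC 7.10, even though your own code is
unchanged.<br>
<br>
    -- our-project.cabal (illustrative fragment only)<br>
    -- foo >= 1.1 has the fix we need, but it requires base >= 4.8,<br>
    -- and base-4.8 ships only with GHC 7.10, so bumping foo forces<br>
    -- a compiler upgrade as well.<br>
    build-depends: foo >= 1.1, base >= 4.8 && < 4.9<br>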
<br>
I recently had to do this because one library I use would only
compile with a newer version of GHC. This project had 30K lines
of Haskell split across dozens of libraries and a few commercial
projects in production. Updating meant fixing, recompiling,
packaging, and testing everything again, which takes days, and it
is not unattended work :( It could easily happen again if I depend
on anything that stops compiling with this version of GHC because
someone considers it "outdated" or does not have the resources to
maintain two versions of his/her library.<br>
<br>
Does that more or less answer your question?<br></div></blockquote><div><br></div></div><div dir="ltr"><div class="gmail_quote">Not really. IIUC, your fundamental complaint is that the cost of tracking changes to the Haskell ecosystem outweighs any potential gains from using Haskell. But the choices that lead to your needing to track those changes don't make sense to me.</div>
<div class="gmail_quote"><br></div>
<div class="gmail_quote">For instance, you talk about compatibility requiring a lot of work, which I presume means between projects. Yes, having to swap out ecosystems and tool sets when you change projects can be a PITA, but even maintaining those environments by hand is less work than trying to keep all your projects compatible across multiple environments. So why do that, especially when you have tools like virtual environments and stack to take away the pain of maintaining multiple environments? (See the sketch at the end of this message.)</div>
<div class="gmail_quote"><br></div>
<div class="gmail_quote">And yes, if some part of the ecosystem has a bug you need fixed and an update contains the fix, updating is one option. But it comes with a cost: you have to verify that the update didn't introduce any new bugs while fixing the old one, and you have to deal with possible changes in the API. And as you note, if the update forces you to update some other part of the ecosystem, all of that work propagates to those other parts as well. It adds up to a lot of work. Enough that I have to question whether it's really less work than backporting a fix, or even developing a new one from scratch.</div>
<div class="gmail_quote"><br></div>
<div class="gmail_quote">Over a couple of decades of building commercial projects in the P languages, whenever I faced the alternatives you outline here, updating anything major was never the choice if more than one person was actively writing code. That held even with a language that put a priority on not breaking old code precisely to minimize the cost of such updates.</div>
<div class="gmail_quote"><br></div>
<div class="gmail_quote">Maybe there's something I'm missing about Haskell that makes fixing somebody else's code take a lot more resources than it does in other languages. If so, then that, not the changing ecosystem, is the argument against Haskell.</div></div></div>
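<div dir="ltr"><div class="gmail_quote">Regarding the sketch promised above: a minimal stack.yaml (the resolver name below is just an example; use whichever one your project builds with) pins both the compiler and a fixed snapshot of package versions per project, so nothing in a project's environment changes until you change it yourself:</div>
<div class="gmail_quote"><br></div>
<div class="gmail_quote">    # stack.yaml (illustrative)<br>
    resolver: lts-3.11  # pins GHC 7.10.2 plus one fixed, tested set of package versions<br>
    packages:<br>
    - '.'<br>
    extra-deps: []      # anything outside the snapshot gets pinned here by exact version<br></div></div>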