[Haskell-cafe] An interesting paper from Google
andrewcoppin at btinternet.com
Fri Oct 15 18:08:14 EDT 2010
On 15/10/2010 10:43 PM, Iustin Pop wrote:
> On Fri, Oct 15, 2010 at 09:28:09PM +0100, Andrew Coppin wrote:
>> I'm sure some of you have seen this already. For those who lack the
>> time or inclination to read through the (six) pages of this report,
>> here's the summary...
> Nice summary, I hope you found the paper interesting!
I often find it interesting to see newcomers' opinions of Haskell.
Usually that's in the form of a blog post that's just a braindump of
what a person has learned in half an hour of tinkering, following a tutorial.
(And usually it either says "oh wow, this is amazing! I never knew
writing programs could be so much FUN!" or it says "oh my God, I can't
believe Haskell SUCKS so much! It doesn't even have an IDE yet. How
primitive is that? And what the heck is a monomorphism anyway?!") It's a
first for me to see a coherently written, measured account from people
who actually do software "for real", as it were.
>> - The parts of Haskell stolen by Python aren't as nice in Python as
>> they are in Haskell. [Well, duh.]
> I'd say unfortunately, not just duh…
Well, yeah, I can see that angle. I'm just a hopeless Haskell fanatic,
>> The paper also contains an interesting section that basically says
>> "we tried porting the Python implementation of XYZ into Haskell, but
>> there wasn't really any advantage because it's all I/O". In my
>> humble opinion, "it's all I/O" is a common beginner's mistake.
>> Reading between the lines, it sounds like they wrote the whole thing
>> in the IO monad, and then decided it looked just like the existing
>> Python code so there wasn't much point in continuing.
> Not quite (not all was in the I/O monad). It doesn't make sense to
> rewrite 40K of lines from language A into language B just for fun.
...you're talking to somebody who just spent an entire day implementing
code to generate SVG (hell, I didn't even know that was physically possible!) in
order to do a real-time demonstration of Huffman coding. Just to see if
I could do it. Just for the hell of it.
Really, the notion of actually getting *paid* to write software is quite
alien to me. I'd imagine you prioritise things quite differently.
> the advantages were not as strong as for the balancing algorithms to
> justify any potential conversion. They were strong, just not strong
> enough.
> Basically, if you take a random, numerical/algorithmic problem, and you
> write it in FP/Haskell, it's easy to show to most non-FP programmers why
> Haskell wins on many accounts. But if you take a heavy I/O problem
> (networking code, etc.), while Haskell is as good as Python, it is less
> easy to show the strengths of the language. Yes, all the nice bits are
> still there, but when you marshal data between network and your
> internal structures the type system is less useful than when you just
> write algorithms that process the internal data. Similar with the other
> nice parts.
I forget whether it was Galois or Well-Typed who claimed that "every
program we write is either a compiler or an interpreter". It depends on
exactly how much low-level bit-fiddling your program domain actually
requires of course, but most problems aren't nearly as I/O-centric as
they look. (Then again, you're the one who's seen the code, not me.)
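For what it's worth, that "compiler or interpreter" shape can be sketched in a few lines. (This is a made-up illustration, nothing from the paper; `Request`, `parseRequest` and `cost` are names I've invented here.) The marshalling edge is where the type system helps less, because the input is untyped text that can fail to parse; everything after that is the pure core where the types do real work:

```haskell
import Data.Char (isDigit)

-- Internal, typed representation (hypothetical).
data Request = Get String | Put String Int
  deriving (Show, Eq)

-- Marshalling layer: the wire format is untyped, so the best the
-- types can do here is force you to handle failure with Maybe.
parseRequest :: String -> Maybe Request
parseRequest s = case words s of
  ["GET", k]                    -> Just (Get k)
  ["PUT", k, v] | all isDigit v -> Just (Put k (read v))
  _                             -> Nothing

-- Pure core: here the compiler checks every case for us.
cost :: Request -> Int
cost (Get _)   = 1
cost (Put _ n) = 1 + n

main :: IO ()
main = print (fmap cost (parseRequest "PUT counter 41"))
```

The thinner the untyped edge, the more of the program lives in the half where Haskell's guarantees actually bite.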
> Now, if I were to start from scratch… :)
Hasn't every programmer said *that* before? ;-)
>> I'm surprised about the profiler. They seem really, really impressed
>> with it. Which is interesting to me, since I can never seem to get
>> anything sensible out of it. It always seems to claim that my
>> program is spending 80% of its runtime executing zipWith or
>> something equally absurd.
> I'm surprised that you're surprised :) The profiler is indeed awesome,
> and in general I can manage to get one order of magnitude speedup on my
> initial algorithms, if not more.
> Even if it just tells me that zipWith is the slow part, that's enough.
> I'd even say it's a very good hint where to start.
zipWith is a generic library function which always takes exactly the
same amount of time. Unless you're using it so extensively that it's
allocating huge amounts of memory or something, it would seem infinitely
more likely that whatever function zipWith is *applying* should be the
actual culprit, not zipWith itself.
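To illustrate the attribution point (a minimal sketch, assuming GHC's cost-centre profiler): if you compile with `-fprof-auto`, each top-level and local binding becomes its own cost centre, so the time spent in the function zipWith applies is charged to that function rather than to zipWith itself.

```haskell
-- Hypothetical example: compile with  ghc -prof -fprof-auto Main.hs
-- and run with  ./Main +RTS -p  to produce Main.prof.
main :: IO ()
main = print (sum (zipWith expensive xs xs))
  where
    xs = [1 .. 1000] :: [Int]
    -- With -fprof-auto this binding gets its own cost centre, so the
    -- profile attributes the work here, not to zipWith.
    expensive :: Int -> Int -> Int
    expensive x y = x * y
```

Without such annotations, costs for a fully-inlined or library function can indeed be lumped under whatever cost centre encloses the call, which may be why zipWith shows up at 80%.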
Of course, I'm talking about profiling in time. GHC also enables you to
profile in space as well. I'm not actually sure to which one you're
referring. I haven't had much success with either. It's just too hard to
figure out what the sea of numbers actually represents. (Since it's quite
new, I'm assuming it's not the new ThreadScope functionality - which I
haven't tried yet, but looks extremely cool...)
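For the record, the space side is driven by heap-profiling RTS flags (e.g. `+RTS -hc` after compiling with `-prof`). A hypothetical toy showing the kind of thing it catches:

```haskell
-- Two ways to sum a list. Under a heap profile, the lazy foldl shows a
-- growing mountain of unevaluated (+) thunks; foldl' stays flat.
import Data.List (foldl')

main :: IO ()
main = do
  print (foldl  (+) 0 [1 .. 100000 :: Int])  -- lazy: builds thunks first
  print (foldl' (+) 0 [1 .. 100000 :: Int])  -- strict: constant space
```

Both print the same answer; only the heap graph tells them apart, which is exactly what the time profile's sea of numbers won't show you.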
> Thanks again for the summary :)
No problem. There's no charge for this service. ;-)