[Haskell] Re: Comments from Brent Fulgham on Haskell and the shootout

John Meacham john at repetae.net
Tue Jun 27 22:17:29 EDT 2006


On Tue, Jun 27, 2006 at 02:58:05PM +0100, Simon Marlow wrote:
> >.. (and jhc already generates native C code, so it will have at least
> > one substantial advantage over GHC) ...
>
> Compiling via C is a dead end.  We realised this about 5 years ago, and
> yet we still haven't managed to shake off the C backend from GHC, but I
> remain hopeful that one day we will.  /me heads for the cafe...

Out of curiosity, compiling via C as opposed to what? C--? Parrot?
The JVM? I am certainly looking forward to compiling to C-- when
that becomes feasible. In any case, jhc's design was not overly
influenced by the need to compile to C; compiling to C was easy as
a side effect of other design choices. Of course, the advantages of
C-- over C aren't as pronounced for jhc as they are for GHC, but
there are certainly features I can use.

Jhc's performance has been in the dumps recently since I completely
broke the strictness analyzer. Pretty much no optimizations occur
without good strictness transformations.
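
To illustrate what is at stake, here is a toy example of mine (not
jhc code or output) of the kind of function that only runs well
once a strictness analyzer has done its job:

    -- If the compiler cannot prove 'go' strict in 'acc', each
    -- recursive call builds a thunk for 'acc + n'; once strictness
    -- analysis shows the accumulator is demanded, the addition can
    -- happen eagerly (e.g. via a worker over unboxed ints).
    sumTo :: Int -> Int
    sumTo = go 0
      where
        go acc 0 = acc
        go acc n = go (acc + n) (n - 1)

    main :: IO ()
    main = print (sumTo 1000000)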

I have been writing a new one based on the simple usage
polymorphism algorithm,
 http://research.microsoft.com/~simonpj/Papers/usage-types/usage.htm

but with the 2-point strict/not-strict lattice rather than the
used-at-most-once/used-many-times one. Something I have noticed is
some strong parallels to the other strictness algorithm I played
with, the one using Horn constraints:

 http://citeseer.ist.psu.edu/493889.html
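
Concretely, the swap is between two two-point lattices. A minimal
sketch (the names and the orientation of the ordering are mine, not
the papers'):

    -- The paper's usage lattice vs. the strictness lattice
    -- substituted for it; both are two-point lattices.
    data Usage      = Once      | Many   deriving (Eq, Ord, Show)
    data Strictness = NotStrict | Strict deriving (Eq, Ord, Show)

    -- With only two points, the join under the derived ordering
    -- is simply 'max'.
    lub :: Ord a => a -> a -> a
    lub = max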

In fact, the formulas derived end up pretty much exactly the same
if you consider the two-point lattice to be Boolean values and <=
to be Boolean implication. The act of generalization in the simple
usage polymorphism algorithm, where you find the variables that
occur in both contravariant and covariant positions, is similar to
the satisfiability problem over Boolean formulas: only variables
that appear in both positions may be 'pulled strict' by another
demanded value.
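
To make the parallel concrete, here is a toy sketch (the encoding
and names are mine, not jhc's): read a constraint a <= b over the
two-point lattice as the Horn clause "a implies b" with Strict =
True; the least solution then falls out of the usual forward
propagation for Horn clauses.

    import qualified Data.Set as S

    -- Strictness variables as Ints; (a, b) means a <= b in the
    -- two-point lattice, i.e. the Horn clause "a implies b" once
    -- Strict is read as True.
    type Var        = Int
    type Constraint = (Var, Var)

    -- Least solution by forward propagation, as in unit
    -- propagation for Horn clauses: start from the variables known
    -- to be demanded and pull everything they imply up to Strict.
    solve :: [Constraint] -> S.Set Var -> S.Set Var
    solve cs demanded
      | next == demanded = demanded
      | otherwise        = solve cs next
      where
        next = demanded `S.union`
               S.fromList [ b | (a, b) <- cs, a `S.member` demanded ]

    main :: IO ()
    main = print (solve [(1,2), (2,3), (4,5)] (S.fromList [1]))
    -- prints fromList [1,2,3]; variable 4's chain is never demanded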

The simple usage analysis algorithm is significantly faster, which
makes me think the Horn constraint version is as strong as (and
perhaps even equivalent to) constrained polymorphism rather than
just simple polymorphism. But this is all speculation..

This might also imply that extending the simple usage analysis to
the full 7-point lattice, combining strictness/absence/one-shot/
linear analysis in one pass, would require full constrained
polymorphism to get good strictness results.
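
In type-scheme terms the difference looks roughly like this (a
sketch under my own naming; none of it is jhc's actual
representation, and the 7-point lattice itself is left abstract):

    -- Simple polymorphism quantifies over annotation variables and
    -- then forgets the ordering constraints between them, which is
    -- only safe when the lattice is simple enough to solve the
    -- constraints away at generalization time.
    type AnnVar = String
    data Point  = Point           -- stand-in for a point of the
                                  -- richer demand lattice
    data Ann    = AVar AnnVar | APoint Point
    data Ty     = TyVar String | Fun Ann Ty Ty  -- annotated arrows

    -- forall as . ty
    data SimpleScheme = SimpleScheme [AnnVar] Ty

    -- forall as . (a1 <= b1, ...) => ty; instantiation must
    -- respect the recorded inequalities.
    data ConstrainedScheme = ConstrainedScheme [AnnVar] [(Ann, Ann)] Ty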

Most of this is just idle speculation, though, based on some
perceived parallels...

Of course, this is probably obvious and not very interesting...
but I found it pretty neat. I'm odd like that.

Hmm.. what a tangent.

        John

--
John Meacham - ⑆repetae.net⑆john⑈

