kind inference question
Mark P Jones
mpj@cse.ogi.edu
Mon, 12 Feb 2001 12:10:59 -0800
Hi Bernie,
You ask why Haskell infers kinds for datatypes in dependency order.
As you point out, if Haskell tried instead to infer kinds for all of
the datatypes in a program at the same time, then it would sometimes
accept programs that are currently rejected. For example:
| data C x = Foo (B x) (x Int)
| data B y = Bar
|
| According to the report (and ghc and hugs) this is ill-kinded.
| The reason is that kinds must be inferred in dependency order,
| and when parts of a kind are not fully determined they default
| to * (star).
| ...
| However, an alternative kind inference algorithm might allow the
| above declarations, by performing kind inference of C and B together
| (effectively unifying the kind of x and the kind of y). This would
| result in:
|
| kind (C) = (* -> *) -> *
| kind (B) = (* -> *) -> *
|
| I'm not suggesting that allowing this inference would be a good
| idea, but I am wondering why Haskell requires the dependency ordering.
| Perhaps there are better examples that elucidate the motivation for
| dependency ordering. Basically I'm just curious.
I think the motivation for using dependency ordering was that it
bases inferred kinds on only the minimum amount of information that
is needed. At the opposite end of the spectrum, you could, in general,
wait until the whole program (i.e., every single module) has been seen.
But then the kind that we infer for a given datatype definition would,
in general, depend on the context in which the definition appeared
(and hence vary from one program to the next) and not depend on the
datatype alone. That would be very strange!
Should Haskell's current, conservative position be modified to take
into account all of the definitions in a given module? Or perhaps
all of the definitions in a collection of mutually recursive modules?
Each of these pushes us closer to the problem I described previously.
For my money, they are not a good solution. I think it would be better
to infer polymorphic kinds for datatype definitions instead of the
current "default to *" wart.
All the best,
Mark