[Haskell-cafe] Equality constraints in type families
Claus Reinke
claus.reinke at talk21.com
Wed Mar 26 06:17:04 EDT 2008
> Well, we still need normal subject reduction (i.e., types don't change
> under value term reduction), and we have established that for System
> FC (in the TLDI paper).
>
> In addition, type term normalisation (much like value term
> normalisation) needs to be confluent; otherwise, you won't get a
> complete type inference/checking algorithm.
If you have separate rules for generating constraints (decomposition)
and for type term normalisation, then confluence needs to hold for
the combined system as well; that was the issue here, I believe.
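To make that concrete, here is a minimal sketch (my example, not one
from the TLDI paper) of how the two rule orders can disagree for a
non-injective family:

    {-# LANGUAGE TypeFamilies #-}

    type family F a
    type instance F Int  = Char
    type instance F Bool = Char

    -- Consider the constraint  F Int ~ F Bool:
    --  * normalise first: both sides reduce to Char, so it holds;
    --  * decompose first: treating both sides as applications yields
    --    F ~ F and Int ~ Bool, which is unsatisfiable.
    -- The two strategies disagree, so the combined system is only
    -- confluent if decomposition is restricted to injective heads.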
>> As far as I could determine, the TLDI paper does not deal with
>> the full generality of GHC's type extensions, so it doesn't stumble
>> over this issue, and might need elaboration.
>
> System FC as described in the paper is GHC's current Core language.
> What are you missing?
For now, the interaction with pre-Core features, such as generalised
type synonyms?-) I just tend to program in GHC Front, not GHC Core;-)
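For instance, a sketch of the kind of pre-Core synonym use I mean
(assuming LiberalTypeSynonyms; the names are illustrative):

    {-# LANGUAGE LiberalTypeSynonyms, RankNTypes #-}

    type Id x    = x
    type Nat f g = forall x. f x -> g x

    -- 'Id' appears partially applied as an argument to 'Nat'; GHC
    -- accepts this by expanding the synonyms away before checking
    -- proper, so none of it survives into Core:
    listify :: Nat Id []
    listify x = [x]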
Later, there'll be other fun: e.g., my extensible record code depends
on GHC's particular interpretation of the interaction between FDs
and overlap resolution, and much of Oleg's code depends on a trick
that he has isolated in a small group of type equality predicates,
carefully exploiting GHC's idiosyncrasies (see the sketch below).
Neither his nor my code works in Hugs, even though we nominally use
only language features supported by both GHC and Hugs, some even
originating in Hugs.
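For flavour, a minimal sketch of such a type equality predicate (my
rendering, using an equality constraint where Oleg's original uses
his TypeCast trick; not his actual code):

    {-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies,
                 FlexibleInstances, UndecidableInstances,
                 OverlappingInstances, TypeFamilies #-}

    data HTrue  = HTrue
    data HFalse = HFalse

    -- decide type equality via overlap: the more specific instance
    -- wins when both arguments are the same type; otherwise the
    -- general instance fires and improvement pins r to HFalse.
    class TypeEq a b r | a b -> r
    instance               TypeEq a a HTrue
    instance r ~ HFalse => TypeEq a b r

    -- e.g., TypeEq Int Int r improves r to HTrue,
    --       TypeEq Int Bool r improves r to HFalse.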
Probably, closed type families can provide better answers to these
issues, but until they are here, and their interaction with other GHC
type system features has been studied, the idea of mapping FDs to
type families (sketched below) has to be taken with a grain of salt,
I guess. As with FDs, it isn't a single feature that causes trouble,
it is the interaction of many features, combined with real-life
programs that stretch the language boundaries they run into.
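The mapping in question, sketched on a standard example (the class
and method names are illustrative):

    {-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies,
                 TypeFamilies #-}

    -- FD version: the collection type determines the element type
    class CollectsFD e c | c -> e where
      insertFD :: e -> c -> c

    -- type family version: the FD becomes an associated type
    class CollectsTF c where
      type Elem c
      insertTF :: Elem c -> c -> c

    instance CollectsTF [a] where
      type Elem [a] = a
      insertTF = (:)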
> I don't think we can avoid losing that expressiveness, as you
> demonstrated that it leads to non-confluence of type term
> normalisation - see also my reply to SimonPJ's message in this thread.
I know that we sometimes have to give up features that look
nice but have hidden traps - that doesn't mean I have to like
it, though!-)
> Haskell always had the decomposition rule. Maybe we confused the
> matter by always showing a particular instance of the decomposition
> rule, rather than the general rule. Here it is in all its glory:
>
> t1 t2 ~ t1' t2' <==> t1 ~ t1', t2 ~ t2'
>
> No mention of a type family. However, the rule obviously still needs
> to be correct if we allow any of the type terms (including t1 and t1')
> to include *saturated* applications of type families.
Before type families, t1 and t1' were straightforward type
constructors (unless you include type synonyms, for which the
decomposition doesn't apply). With type families, either may be a
type-level function, and suddenly it looks as if you are matching on
a function application. Compare the expression level:

    f (Just x) = x   -- ok: 'Just x' is irreducible, Just is injective
    g (f x)    = x   -- oops? matching on a reducible application

(cf. the F Int ~ F Bool sketch above for the type-level analogue)
So what I was missing was that you had arranged matters to exclude
some problematic cases from that decomposition rule (a sketch of both
exclusions follows below):

- partially applied type synonyms (implicitly, by working in Core)
- partially applied (in terms of their type indices) type families
  (by a side condition not directly visible in the rule system)

In other words, the rule looks too general for what you had in mind,
not too specific!-)
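A sketch of the two excluded forms (illustrative declarations):

    {-# LANGUAGE TypeFamilies #-}

    type S a = Maybe a   -- a type synonym
    type family G a b    -- a type family with two indices

    -- 1. partially applied type synonyms: synonyms are expanded away
    --    before Core, so 'S' on its own never reaches the rule.
    -- bad1 :: S            -- rejected: 'S' must be fully applied

    -- 2. partially applied type families: 'G Int' supplies fewer
    --    arguments than G has indices; GHC rejects it, so the rule
    --    never decomposes an unsaturated family application.
    -- bad2 :: G Int -> Int -- rejected: 'G' needs two arguments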
claus