Problem with functional dependencies
Mark P Jones
Wed, 3 Jan 2001 09:05:07 -0800
| I think you can simplify the example. Given
| class HasFoo a b | a -> b where
| foo :: a -> b
| instance HasFoo Int Bool where ...
| Is this legal?
| f :: HasFoo Int b => Int -> b
| f x = foo x
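For concreteness, here is a self-contained version of the quoted
example. The instance body (even) is a hypothetical filler of my
own, since the original elides it; the rest is as quoted:

```haskell
{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies #-}

-- The class from the quoted message: the dependency "a -> b" says
-- that the instance chosen for a uniquely determines b.
class HasFoo a b | a -> b where
  foo :: a -> b

-- Hypothetical instance body (the original elides it): test evenness.
instance HasFoo Int Bool where
  foo = even

-- The "more accurate" type that improvement produces.  The contested
-- alternative is the satisfiably equivalent
--   f :: HasFoo Int b => Int -> b
-- whose acceptance is the design decision discussed below.
f :: Int -> Bool
f x = foo x
```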
The theoretical foundation for functional dependencies goes back to
the work I did on "Simplifying and Improving Qualified Types".
(Interested parties can find a 1994 report on this on my web pages;
email me if you need a pointer.)
According to that theory, the type above is a "principal satisfiable
type" for f, as is the more accurate Int -> Bool: under the
satisfiability ordering described in the report, these two types
are (satisfiably) equivalent. There is, therefore, no technical
reason why the function f could not be treated as having the
polymorphic type shown above.
On the other hand, from a practical perspective, one can argue that
the polymorphic type is misleading, obfuscating, and cumbersome:
Misleading because f doesn't really have a polymorphic type as the
declaration pretends; Obfuscating because it forces a reader to
study instance declarations that are not included in the type;
and Cumbersome because it includes an unnecessary (HasFoo Int b)
constraint that could be eliminated to produce a shorter, simpler type.
So it comes down to a language design *decision* for which functional
dependencies, by themselves, do not force a particular choice.
- The current Hugs implementation does not allow the polymorphic
type; the intention in that implementation was to infer more
accurate, less complex types. The idea here is to make programs
easier for programmers to read, write, and understand.
- Marcin indicates that he would prefer the more relaxed approach
that allows polymorphic types; he is writing a preprocessor that
generates type signatures, and his task is easier if he doesn't
have to worry about the "improvement" of class constraints.
The idea here is to make programs easier for generators to read,
write, and manipulate.
Clearly, some compromise is needed because neither approach is right
for all purposes. If we look to other aspects of the language for
inspiration, then the best way to deal with this is (probably):
(i) to infer simpler types whenever possible, but
(ii) to allow more polymorphic types when they are requested by
means of an explicit type signature.
(Incidentally, in the interests of consistency, such a system should
also allow programmers to use types like Num Int => Int -> Bool.)
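To illustrate that last point: a constraint like Num Int mentions no
type variables and is always satisfiable, so it adds no information
to the type. In GHC such signatures need the FlexibleContexts
extension; the function name and body below are my own invention,
chosen only to make the example compile:

```haskell
{-# LANGUAGE FlexibleContexts #-}

-- Num Int always holds, so this constraint could be discharged
-- immediately; the type is satisfiably equivalent to the plain
-- Int -> Bool.
positive :: Num Int => Int -> Bool
positive n = n > 0
```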
All the best,