[Haskell-cafe] Strange type error with associated type synonyms
Claus Reinke
claus.reinke at talk21.com
Thu Apr 16 08:11:25 EDT 2009
| But the core part of my suggestion (which this example was meant
| to help explain) remains attractive, at least to me: somewhere during
| type inference, GHC *does* unify the *apparently free* 'd' with an
| internal type variable (let's call it 'd1', as in the type error message)
> You are speculating about the *algorithm*.
There is no need to reason about the algorithm, only about its observable
results ('d1' appeared in the error message, but not in the inferred type).
But I've finally figured out what had me confused: sadly, it is
something that has come up before; I had just forgotten to apply it here.
First, the code again:
{-# LANGUAGE NoMonomorphismRestriction #-}
{-# LANGUAGE TypeFamilies #-}

class Fun d where
  type Memo d :: * -> *
  abst :: (d -> a) -> Memo d a
  appl :: Memo d a -> (d -> a)

-- f :: (Fun d) => Memo d a -> Memo d a   -- (1)
f = abst . appl
(1) is the type inferred by GHCi. If we uncomment it, we get this
error:
D:\home\Haskell\tmp\desktop\types.hs:11:11:
    Couldn't match expected type `Memo d1'
           against inferred type `Memo d'
    In the second argument of `(.)', namely `appl'
    In the expression: abst . appl
    In the definition of `f': f = abst . appl
My _erroneous_ reasoning was that some unknown 'd1' was being
unified with the 'd' from the signature. So I wanted that 'd1' to be
represented in the signature inferred by GHCi.
Of course, the error message really says something quite different,
and if I had recalled that 'Memo' is a type synonym, I would have
noticed that. If GHC ever provides an option to bracket fully applied
type synonyms, I'd probably switch it on by default.
(What one actually wants is a way to distinguish type-level general
functions from type-level constructors that can be decomposed during
unification, similar to the way Haskell uses capitalization at the
expression level to distinguish general functions from constructors
that can be decomposed during pattern matching.)
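At the expression level, that distinction is baked into the syntax.
A two-line reminder (standard Haskell; 'unwrap' is just 'fromJust'
under another name):

unwrap :: Maybe a -> a
unwrap (Just x) = x   -- 'Just' is a constructor: decomposable by matching

A pattern like (g x), for an unknown function 'g', is simply rejected:
an application of a general function cannot be decomposed, just as
'Memo d' cannot be decomposed during unification.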
With such an option, the inferred type would look like this (er, hi Conor):
f :: (Fun d) => {Memo d} a -> {Memo d} a
The first parameter of 'Memo' is not directly available for unification,
so this signature is ambiguous as it stands, as you tried to point out
(with no way to pin down 'd', 'Memo d' can never unify with anything
other than itself, identically).
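(To make the ambiguity concrete, a small sketch, with two made-up
instances that are not part of the original example; the class 'Fun'
is as above. Different index types may share the same memo structure,
so 'Memo d1 ~ Memo d' implies nothing about 'd1' and 'd':)

data Pair a = Pair a a

newtype Bit = Bit Bool

instance Fun Bool where
  type Memo Bool = Pair
  abst f = Pair (f False) (f True)
  appl (Pair x _) False = x
  appl (Pair _ y) True  = y

instance Fun Bit where
  type Memo Bit = Pair   -- same memo structure as for Bool
  abst f = Pair (f (Bit False)) (f (Bit True))
  appl (Pair x _) (Bit False) = x
  appl (Pair _ y) (Bit True)  = y

Both 'Memo Bool a' and 'Memo Bit a' reduce to 'Pair a', so a solver
faced with 'Memo d a ~ Pair a' cannot conclude anything about 'd'.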
Apart from bracketing fully applied type synonyms, the error message
could be improved by providing the missing bit of information about
'Memo':
D:\home\Haskell\tmp\desktop\types.hs:11:11:
    Couldn't match expected type `Memo d1'
           against inferred type `Memo d'
      (type Memo d :: * -> *)
    In the second argument of `(.)', namely `appl'
    In the expression: abst . appl
    In the definition of `f': f = abst . appl
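(For the record, one way to make the definition typecheck, a sketch
assuming ScopedTypeVariables: annotate the intermediate function type,
which mentions 'd' bare and can therefore be decomposed. The signature
stays ambiguous at use sites unless something else pins down 'd'.)

{-# LANGUAGE ScopedTypeVariables #-}

f :: forall d a. Fun d => Memo d a -> Memo d a
f x = abst (appl x :: d -> a)   -- 'd -> a' fixes appl's index to 'd'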
Sorry about letting myself get confused again by this known issue.
Claus
PS.
I thought I'd try this alternative signature, to make the ambiguity
explicit:
f :: (Memo d ~ md, Fun d) => md a -> md a   -- (2)
but the error message is even less helpful:
D:\home\Haskell\tmp\desktop\types.hs:11:11:
    Couldn't match expected type `Memo d' against inferred type `md'
      `md' is a rigid type variable bound by
        the type signature for `f'
          at D:\home\Haskell\tmp\desktop\types.hs:10:14
    In the second argument of `(.)', namely `appl'
    In the expression: abst . appl
    In the definition of `f': f = abst . appl
Shouldn't the inferred type still be 'Memo d1'? And why is one part
of the _declared_ type labelled "expected" ('Memo d') while another
part is labelled "inferred" ('md')? Could GHC perhaps be more precise
about 'expected', 'inferred', and 'declared', and use these terms from
a user's perspective? Even in the original error message, shouldn't
'expected' and 'inferred' be the other way round?
And if we add the missing contexts for the types mentioned in the
message, the error message could really be:
D:\home\Haskell\tmp\desktop\types.hs:11:11:
    Couldn't match inferred type `Memo d1'
        (f :: (Fun d1) => {Memo d1} a -> {Memo d1} a)
      against declared type `Memo d'
        (f :: (Fun d) => {Memo d} a -> {Memo d} a)
      (type {Memo d} :: * -> *)
    In the second argument of `(.)', namely `appl'
    In the expression: abst . appl
    In the definition of `f': f = abst . appl
Perhaps then I would have seen what was going on?-)