Inferring instance constraints with DeriveAnyClass
Simon Peyton Jones
simonpj at microsoft.com
Fri Jun 17 11:43:24 UTC 2016
| My question is then: why does DeriveAnyClass take the bizarre approach
| of co-opting the DeriveFunctor algorithm? Andres, you originally
| proposed this in #7346 [2], but I don't quite understand why you
| wanted to do it this way. Couldn't we infer the context simply from
| the contexts of the default method type signatures?
That last suggestion makes perfect sense to me. After all, we are going to generate an instance looking like
  instance .. => C (T a) where
    op1 = <default-op1>
    op2 = <default-op2>
so all we need in ".." is enough context to satisfy the needs of <default-op1> etc.
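For concreteness, here is a small compilable sketch of what such a derived instance amounts to (the class C, its methods, and the type T below are made-up stand-ins, and the default bodies are placeholders): the instance body is empty, so every method falls back to its class default.

  {-# LANGUAGE DeriveAnyClass, DefaultSignatures #-}
  module Sketch1 where

  class C a where
    op1 :: a -> String
    default op1 :: Show a => a -> String
    op1 = show            -- generic default: needs Show

    op2 :: a -> a
    op2 = id              -- ordinary default

  data T = MkT Int
    deriving (Show, C)    -- C is derived via DeriveAnyClass

  -- The derived instance is, in effect,
  --   instance C T       -- op1 = <default-op1>, op2 = <default-op2>
  -- T is monomorphic here, so no interesting context arises yet.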
Well, you need to take account of the class op type sig too:
  class C a where
    op :: Eq a => a -> a
    default op :: (Eq a, Show a) => a -> a
We effectively define
  default_op :: (Eq a, Show a) => a -> a
Now with DeriveAnyClass for lists, we effectively get
  instance ... => C [a] where
    op = default_op
What is ".."? Well, we need (Eq [a], Show [a]); but we are given Eq [a] (because that's the context of op's instantiated type). That leaves Show [a], which the Show instance for lists reduces to Show a; so Show a is all we need in the end.
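To see that end to end, here is the whole example in compilable form (a sketch: the default body for op is a made-up placeholder, there only so the signatures have something to check against); with the context written as Show a, the filled-in default type-checks by exactly the reasoning above.

  {-# LANGUAGE DefaultSignatures #-}
  module Sketch2 where

  class C a where
    op :: Eq a => a -> a
    default op :: (Eq a, Show a) => a -> a
    op x = if x == x then x else error (show x)   -- placeholder: uses both Eq and Show

  -- The instance for lists, with the context the reasoning above calls for:
  -- Eq [a] comes from op's own signature, Show [a] follows from Show a.
  instance Show a => C [a]      -- op = default_op

In GHCi, op [1,2,3 :: Int] then goes through this instance (and simply returns the list).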
Simon