[Haskell-cafe] Counterintuitive ScopedTypeVariables behaviour
Anthony Clayden
anthony_clayden at clear.net.nz
Mon Aug 17 05:08:10 UTC 2020
Hi Tom, I see nothing bizarre or counterintuitive here. Perhaps you need
to retrain your intuitions ;-)
> I have just noticed that with ScopedTypeVariables one can write the
> following bizarre definition
>
> -- Inferred signature:
> -- add :: Num a => a -> a -> a
> add (x :: a) (y :: b) = x + y
What do you think is bizarre? GHC makes a reasonable attempt to use
the tyvar names you put in a signature -- remembering that
alpha-renaming means the name is not really significant. I can get
`add :: Num c => c -> c -> c`, etc.
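For instance (a sketch of my own, not from your message), you can give the
same definition an explicit signature using yet another name, and GHC
accepts it, because the pattern-bound tyvars just end up naming the
signature's tyvar:

{-# LANGUAGE ScopedTypeVariables #-}

-- 'c' in the signature, 'a' and 'b' in the patterns: all three end up
-- naming the same type variable, so this typechecks.
add :: Num c => c -> c -> c
add (x :: a) (y :: b) = x + y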
> The reason for this behaviour is that
>
>> in all patterns other than pattern bindings, a pattern type
>> signature may mention a type variable that is not in scope; in this
>> case, the signature brings that type variable into scope.
OK, so your `PatternSignatures` (a part of `ScopedTypeVariables` that
existed long before the more dubious parts of that extension) are
bringing `a` and `b` into scope. It might have been clearer if you'd
used `b` and `c`, to avoid confusing yours with some `a` that might have
been lurking in the environment.
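Bringing them into scope matters because you can then refer to them in
the body. A small sketch of my own (names hypothetical): without an
explicit `forall`, the `b` written in a separate signature is not itself
in scope, but a pattern signature gives you a local name for that same
type:

{-# LANGUAGE ScopedTypeVariables #-}

-- The pattern signature binds 'c' to the (rigid) type of the argument,
-- so the 'where' clause can mention it.
double :: Num b => b -> b
double (y :: c) = twice
  where
    twice :: c
    twice = y + y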
Anyhoo, your definition is equivalent to this, once you drop the (purely
decorative) pattern signatures and eta-reduce:
add = (+)
and `(+) :: Num a => a -> a -> a` from the Prelude. So the inferred
signature for `add` must be the same, possibly alpha-renamed.
Perhaps you think you could add an Int to a Float, as you can in
languages (COBOL, Algol 68) with implicit type coercion? That's not
what the signatures of the Prelude's `Num` methods support. And with
Haskell's proclivity for partial application, type inference would
rapidly become incoherent with implicit coercions; so we don't do that.
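If you do want mixed-type arithmetic, you say so explicitly. A minimal
sketch (the function name is my own):

-- The conversion must be written out: fromIntegral converts the Int to
-- whatever Num type the context wants, here Float.
addIntFloat :: Int -> Float -> Float
addIntFloat i f = fromIntegral i + f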
> But this leads to a rather puzzling user experience. Was it really
> not possible to design this extension in a way that would allow
> bringing new type variables into scope locally but not allow them to
> unify with other type variables from outer scopes?
>
> To be more concrete, what I would like to see is a design where `k` is
> allowed because the type `a` (bound within a pattern) does not need to
> unify with anything but `k2` is not allowed because `a` is forbidden
> to unify with `b`. ...
No: any tyvar might unify with any other; you can't ban tyvars from
unifying. Unification would be useless if it couldn't operate globally.
That's why all tyvars are (implicitly) universally quantified. (The
exception is the existentially-quantified tyvars you quote, which are
another can of worms we don't want to open here.) You can use as many
distinct tyvars as you like, but if type inference resolves them to the
same type, the inferred signature must use the same tyvar for all of
them.
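To illustrate with an example of my own (not the `k`/`k2` from your
message): even though the two pattern signatures bring distinct tyvars
into scope, inference is free to unify them:

{-# LANGUAGE ScopedTypeVariables #-}

-- 'a' and 'b' are brought into scope separately, but putting x and y
-- into one list forces them to be the same type, so GHC infers
--     pairUp :: c -> c -> [c]    (up to alpha-renaming)
pairUp (x :: a) (y :: b) = [x, y]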
> I believe this design might lead to much less puzzling behaviour.
> Would it be possible (within a new extension, of course) or have I
> overlooked something?
There are the `GADTs` and `TypeFamilies` extensions, which introduce `~`
as a constraint-level equality on types. I suppose we might use that, but
(thinking creatively) any of these would be a legitimate resulting
signature; how should GHC choose amongst them?
add :: (Num a, a ~ b) => a -> b -> a
add :: (Num b, a ~ b) => a -> b -> b
add :: (Num a, Num b, a ~ b) => a -> b -> a
The point is: the return type of `add`, aka `(+)`, must be the same as
the first argument's type, which must be the same as the second
argument's type, and that single type is the one the `Num` constraint
applies to.
Any of those three signatures I've given (and there could be several
more in the same vein) obfuscates what's going on.
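Each of them does typecheck if you write it yourself, by the way. A
quick sketch, assuming `TypeFamilies` (or `GADTs`) is switched on to
allow the `~` syntax:

{-# LANGUAGE TypeFamilies #-}

-- Legal, but just a roundabout spelling of  Num a => a -> a -> a;
-- GHC would never infer it.
addEq :: (Num a, a ~ b) => a -> b -> a
addEq x y = x + y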
Actually, thinking a bit more: we want to infer a 'Principal type'
(see Wikipedia). Precisely because there's no good reason to choose
amongst those I gave, none of them can be Principal.
AntC