[Haskell-cafe] Counterintuitive ScopedTypeVariables behaviour

Viktor Dukhovni ietf-dane at dukhovni.org
Sat Aug 22 00:29:23 UTC 2020


On Fri, Aug 21, 2020 at 06:13:23PM -0500, Alexis King wrote:

> > On Aug 19, 2020, at 18:22, Viktor Dukhovni <ietf-dane at dukhovni.org> wrote:
> > 
> > Though perhaps much too careful/complete a read of the documentation is
> > needed to find support for the conclusions I ultimately reached, I do
> > believe that the text is on some level there
> 
> I do not agree with your interpretation. :)

Fair enough; the text is ultimately not clear.  [ TL;DR the rest is
inessential detail for the morbidly curious. ]

> > But when we get to "9.17.4. Pattern type signatures" […] we see that
> > they are rather different beasts that deviate from the description in
> > the overview.  In particular, "2" no longer holds, because these are
> > neither generalised nor universally quantified.
> 
> It is true that such variables are not generalized, but the
> documentation does not say “generalized”—it says rigid. If we consider
> that the primary purpose of this feature is to bring existentials into
> scope, then there is no violation of the principle when used in the
> intended way. After all, existentials brought into scope via `case`
> are very much rigid (and distinct from all other variables).
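
Indeed, to make that concrete (a quick sketch of mine, not from the
docs or your message):

    {-# LANGUAGE ExistentialQuantification, ScopedTypeVariables #-}

    data Some = forall e. Show e => Some e

    -- The pattern signature binds "e" to the hidden existential type;
    -- "e" is rigid, so e.g. (x :: Int) in the branch would be rejected.
    describe :: Some -> String
    describe s = case s of
        Some (x :: e) -> show (x :: e)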

The leap I was making is that it is specifically universal
quantification that leaves the design no choice but to make the
variables rigid.  With "forall a b.", "a" and "b" naturally vary
*independently*, and so must be kept distinct.
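
For instance (an illustration of mine, not from the thread), GHC must
reject:

    {-# LANGUAGE ScopedTypeVariables #-}

    -- Rejected: "a" and "b" are rigid and may be instantiated
    -- independently, so the body cannot assume a ~ b.
    bad :: forall a b. a -> b -> (a, a)
    bad x y = (x, y)

Here GHC reports that it cannot match the rigid type variable "b"
with "a".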

On the other hand, given that Pattern Type Signatures are not
universally quantified, there is no longer a *prior requirement* to
keep them distinct, and so I am ultimately not surprised that they need
not be rigid in the way that variables brought into scope via "forall"
are.  Indeed, we can now (as of GHC 8.8) write:

    {-# LANGUAGE ScopedTypeVariables, TypeApplications #-}

    foo :: Int -> Int -> String
    foo (x :: a) (y :: b) = "(" ++ (show @a x) ++ ", " ++ (show @b y) ++ ")"

    λ> foo 1 2
    "(1, 2)"

where "a" becomes a synonym for "Int" and so does "b", and so are surely
not distinct types.  But even with GHC 8.0 (7.10 and perhaps older would
also be OK were this example changed to avoid TypeApplications), we can
write:

    {-# LANGUAGE ScopedTypeVariables, TypeApplications #-}

    foo :: Show c => c -> c -> String
    foo (x :: a) (y :: b) = "(" ++ (show @a x) ++ ", " ++ (show @b y) ++ ")"

    λ> foo 1 2
    "(1, 2)"

where "a" and "b" are aliases for "c".  I don't think you disagree that
this is what the actual implementation is doing.  My point was just that
with some effort one might read the actual behaviour as consistent with
the spirit of the documentation, and that the leap required to read it
that way may not be too great.
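
To make the aliasing concrete (another sketch of mine), "a" and "b"
are freely interchangeable here:

    {-# LANGUAGE ScopedTypeVariables #-}

    -- Typechecks only because "a" and "b" are both aliases for the
    -- same "c", and hence for each other.
    baz :: c -> c -> [c]
    baz (x :: a) (y :: b) = [x :: b, y :: a]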

> It is true that GHC implements a much more flexible behavior that
> allows entirely different things to be brought into scope. But that is
> just part of the divergence from the documentation—there’s nothing
> that says pattern signatures are exempt from that restriction. It only
> states that pattern signatures that refer to unbound variables bring
> those variables into scope, nothing more.

Yes, the literal text does not actually carve out an exception; I'm
making a retroactive extrapolation from the constraints on the
implementation that I inferred from the context.

> Of course, this is all somewhat immaterial—there is no way to
> objectively determine which interpretation is “correct,” and the docs
> are unambiguously unclear regardless, so they should be changed. But
> purely as a matter of friendly debate, I stand by my interpretation.

Yes, it would be nice for someone to craft a proposed change to the
docs.  Any volunteers?

My attempt to read the docs liberally was largely an effort to better
understand the actual implementation, to intuit how and why the actual
choice was made, and to see how it might be consistent with the
overall design.

-- 
    Viktor.

