Question about retaining / deferring / defaulting ambiguous predicates (was Question about ambiguous predicates in pattern bindings)

Benjamin Redelings benjamin.redelings at
Tue Jan 18 20:55:39 UTC 2022


1. I think I have clarified my problem a bit.  It is actually not 
related to pattern bindings. Here's an example:

h = let f c   i = if i > 10 then c else g c 'b'
        g 'a' w = f 'b' 10
        g z   w = z
    in (f 'a' (1::Int), f 'a' (1.0::Double))

If I am understanding the Haskell type system correctly,

* the definitions of f and g form a recursive group

* the monomorphism restriction is not invoked

* the inner binding (to f and g) leads to a local value environment (LVE):

{ f :: Char -> a -> Char; g :: Char -> Char -> Char }

with predicates (Num a, Ord a)
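To see what retaining those predicates and quantifying over `a` would mean operationally, here is a hand-written dictionary-passing sketch of the group (the names fD/gD are mine, and each class dictionary is reduced to the single method actually used, so this is an illustration rather than GHC's actual elaboration):

```haskell
-- Hypothetical hand-desugaring: the (Ord a, Num a) "dictionaries"
-- become explicit arguments, here just (>) and fromInteger.
fD :: (a -> a -> Bool)   -- (>) from Ord a
   -> (Integer -> a)     -- fromInteger from Num a
   -> Char -> a -> Char
fD gt fromInt c i = if i `gt` fromInt 10 then c else gD c 'b'
  where
    gD 'a' _ = fD gt fromInt 'b' (fromInt 10)
    gD z   _ = z

-- Both instantiations of the pair from point 1, now with the
-- dictionaries supplied explicitly:
hD :: (Char, Char)
hD = ( fD (>) fromInteger 'a' (1 :: Int)
     , fD (>) fromInteger 'a' (1.0 :: Double) )
```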

2. In this situation, Typing Haskell in Haskell suggests that we should 
NOT apply the predicates to the environment because the type for g does 
not contain 'a', and would become ambiguous (section 11.6.2).  Instead, 
we should only apply predicates to the environment if their type 
variables are present in ALL types in the current declaration group.

Since the predicates (Num a, Ord a) are not retained, we cannot 
quantify over a.

It seems like this should make `f` monomorphic in a, and thus we should 
not be able to apply (f 'a') to both (1::Int) and (1.0::Double).

Does that make any sense?

3. GHC, however, compiles this just fine.  But if I add "default ()", 
it no longer compiles.
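For reference, here is point 1's example as a complete module (my transcription).  As noted above, GHC accepts it as written, and uncommenting the `default ()` line reproduces the failure described in point 3:

```haskell
module Main where

-- Uncommenting the next line makes the ambiguous constraints on g
-- fail to default, per point 3:
-- default ()

h :: (Char, Char)
h = let f c   i = if i > 10 then c else g c 'b'
        g 'a' w = f 'b' 10
        g z   w = z
    in (f 'a' (1 :: Int), f 'a' (1.0 :: Double))

main :: IO ()
main = print h
```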

4. On further reflection, Typing Haskell in Haskell applies defaulting 
rules when evaluating each binding, and not just at the top level.  So 
this might be part of where my code is going wrong.
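Regarding the question at the end of the quoted message below (where does x's `dn` argument come from, and can it be `undefined`?), here is a runnable sketch of that desugaring, with `tup` taking the fromInteger method explicitly.  This is my own elaboration, not the paper's: because x's type mentions no `a`, the dictionary argument in x's definition is never forced, so even `undefined` works.

```haskell
-- 'dn' plays the role of the Num-a dictionary, reduced to fromInteger.
tup :: (Integer -> a) -> (Char, a)
tup dn = ('a', dn 2)

-- x never demands the second component, so laziness means the bogus
-- "dictionary" is never evaluated:
x :: Char
x = case tup undefined of (x', _) -> x'

-- y really uses the dictionary, so it keeps the Num constraint:
y :: Num a => a
y = case tup fromInteger of (_, y') -> y'
```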


On 1/15/22 11:09 AM, Benjamin Redelings wrote:
> Hi,
> 1. I'm reading "A Static semantics for Haskell" and trying to code it 
> up.  I came across some odd behavior with pattern bindings, and I was 
> wondering if someone could explain or point me in the right direction.
> Suppose you have the declaration
>     (x,y) = ('a',2)
> My current code is yielding:
>     x :: Num a => Char
>     y :: Num a => a
> However, I notice that ghci gives x the type Char, with no constraint, 
> which is what I would expect.  It also gives y the type 'Num b => b', 
> so I don't think it is defaulting a to Int here.
> The weird results from my code stem from rule BIND-PRED in Figure 15 
> of 
>     E  |-  bind ~~> \dicts : theta -> monobinds in bind : (LIE_{encl}, theta => LVE)
> Here theta = ( Num a ) and LVE = { x :: Char, y :: a }.  So, theta => 
> LVE is
>     { x :: Num a => Char, y :: Num a => a }
> The obvious thing to do is avoid changing a type T to Num a => T if T 
> does not contain a, though I'm not totally sure that trick gets to the 
> bottom of the issue.  In any case, the paper doesn't define theta => 
> LVE that way.  Is there something else I should read on this?
> 2. If we just chop out predicates which mention variables not in the 
> type ( == ambiguous predicates?) then I'm not totally sure how to 
> create code for this.
> I would imagine that we have something like
>     tup dn = ('a', fromInteger dn 2)
>     x = case (tup dn) of (x,y) -> x
>     y dn = case (tup dn) of (x,y) -> y
> In this case it's not clear where to get the `dn` argument of `tup` 
> from, in the definition of `x`.  Can we pass in `undefined`?  Should we 
> do something else?
> If anyone can shed light on this, I would be grateful :-)
> -BenRI

More information about the ghc-devs mailing list