[GHC] #16082: Sort out treatment of underscores in types
GHC
ghc-devs at haskell.org
Fri Dec 21 15:43:25 UTC 2018
#16082: Sort out treatment of underscores in types
-------------------------------------+-------------------------------------
Reporter: goldfire | Owner: (none)
Type: bug | Status: new
Priority: normal | Milestone:
Component: Compiler | Version: 8.7
Keywords: | Operating System: Unknown/Multiple
Architecture: Unknown/Multiple | Type of failure: None/Unknown
Test Case: | Blocked By:
Blocking: | Related Tickets:
Differential Rev(s): | Wiki Page:
-------------------------------------+-------------------------------------
I can count 4 different meanings for underscores in Haskell programs:
1. A pattern for which we don't care to write a name. (This dates back to
antiquity.)
Example:
{{{#!hs
const x _ = x
}}}
2. An expression for which we want GHC to tell us what its expected type
should be. (Relatively new: these are typed holes.)
Example:
{{{#!hs
plus x y = x + _
}}}
3. A type which we want GHC to infer, by looking at the expression.
(Relatively new: these are wild cards in partial type signatures.)
Example:
{{{#!hs
plus :: forall a. Num a => a -> a -> _
plus x y = x + y
}}}
4. A type which we want GHC to infer, by looking at the underscore's
context. (Relatively new: these are wild cards in type applications.)
Example:
{{{#!hs
x = const @_ @Bool 'x' -- the _ is inferred to mean Char
}}}
Problems arise with the advent of visible kind application (#12045): in
type signatures, three of these meanings make sense (2, 3, and 4); in
type/data family patterns, three of them make sense (1, 2, and 4).
Ideally, the user should have the opportunity to choose which meaning
they want. Instead, right now we use heuristics: in visible type/kind
applications, we always use (4); otherwise, we use (1) (in patterns) or
(3) (in types).
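To see the current heuristics side by side, here is a sketch (the module
and function names are made up for illustration; it compiles with the
extensions shown, except for the commented-out typed hole, which is an
error by default):
{{{#!hs
{-# LANGUAGE TypeApplications, PartialTypeSignatures, TypeFamilies #-}
module Wildcards where

-- (1) In a term-level pattern, _ silently discards the argument.
ignore :: a -> Bool
ignore _ = True

-- (2) In an expression, _ is a typed hole; GHC reports the type it
--     expects (Bool here). Commented out so the module compiles.
-- hole :: Bool
-- hole = _

-- (3) In a type signature, _ is a partial-type-signature wild card;
--     GHC infers Bool here and warns.
notNot :: _ -> Bool
notNot x = not (not x)

-- (4) In a visible type application, _ is inferred from context
--     (Char here), with no report at all.
c :: Char
c = const @_ @Bool 'x' True

-- (1) again, now at the type level: in a closed-type-family
-- equation, _ is a pattern that matches any type.
type family F a where
  F _ = Int
}}}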
This is a mess, for at least three reasons:
A. Users might conceivably want different behavior from what we provide.
For example, perhaps a user is writing an intricate pattern (at either
the term or the type level) and wants to know the type (resp. kind) of
the next bit of the pattern. Or maybe the user wants to do this in a
visible type application. Right now, there's just no way to do this.
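For instance (a made-up illustration; `isJust'` is a hypothetical name):
{{{#!hs
-- The _ below can only mean (1). There is no pattern analogue of a
-- typed hole that would report, say, "this _ has type Char".
isJust' :: Maybe Char -> Bool
isJust' (Just _) = True
isJust' Nothing  = False
}}}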
B. It causes trouble for pretty-printing. Aside from term-level patterns,
all the uses of underscores above are stored identically in the AST, so
they are printed identically. But that's strange. For example, a
signature might mix uses (3) and (4), with different underscores standing
for different types. Should we number the underscores? But that would be
silly for usage (1). It's all a bit muddy.
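For example (a made-up signature; `g` is hypothetical), the two wild
cards below stand for different types, yet both are stored, and printed,
as a bare underscore:
{{{#!hs
{-# LANGUAGE PartialTypeSignatures #-}
-- The first _ is inferred as Int, the second as Bool, but GHC's
-- warning text shows both simply as "_".
g :: _ -> _ -> Int
g n b = if b then n else 0
}}}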
C. This causes awkwardness in the implementation. #12045 has to twiddle
DynFlags to get its desired behavior, and that's sad.
This ticket is to track resolutions to these problems.
--
Ticket URL: <http://ghc.haskell.org/trac/ghc/ticket/16082>
GHC <http://www.haskell.org/ghc/>
The Glasgow Haskell Compiler