Are anonymous type classes the right model at all? (replying to Re: Are fundeps the right model at all?)
Marcin 'Qrczak' Kowalczyk
qrczak@knm.org.pl
24 Dec 2000 20:25:12 GMT
Thu, 21 Dec 2000 21:20:46 +0100, George Russell <ger@tzi.de> writes:
> So if you agree with me up to here, perhaps you are agreed that it is worth
> while trying to find a middle way, in which we try to combine both approaches.
I am thinking about a yet different approach: leave classes and SML
structures as they are, and make *records* more flexible, so that they
can be used instead of classes when instances are to be manipulated
explicitly, instead of structures when we are using Haskell rather
than SML or OCaml, instead of objects when we are using Haskell rather
than some OO language, and generally as a way of expressing things
that behave like fixed dictionaries of values.
I have yet to play with it more, but I already have some thoughts and
a working preprocessor which translates my extensions to Haskell
(with multi-parameter classes and fundeps).
-------- GOALS --------
* Replace the current record mechanism with a better one.
* Don't require the sets of fields of different record types to be
disjoint. This is not only to avoid inventing unique field names, but
also to allow functions polymorphic over all records containing
specific fields of specific types.
* Provide a way to specialize existing record types to new types that
behave similarly except for small changes, i.e. a kind of inheritance.
* Since Haskell does not have subtyping, have coercions up the
inheritance tree. Overloading functions on record types is not
always enough, e.g. to put records in a heterogeneous collection
they must be coerced to a common type.
* Don't constrain the implementation of field access for different
record types. As long as it behaves like a record, it is a record.
* Don't constrain the implementation of methods even for the same
record type. Since Haskell does not have subtyping, records which
would have different types in other languages can have the same
type in Haskell, as long as the same interface suffices.
* Express keyword parameters of functions. A function might take many
parameters that refine its behavior and usually have default values.
Old code using that function must not break when more parameters
are added.
* A piece of code should be understandable locally, independently
of definitions and instances present elsewhere.
* Have a nice syntax.
* Keep it simple and easily translatable to the core language.
Fields and methods are really the same thing. Moreover, inheritance is
really delegation, and coercions are the same thing as field accesses.
Record types are not anonymous, unlike TREX. Field names are born
implicitly and live in a separate namespace. Each field name is
associated with a class of record types having that field. Instances
of these classes are defined implicitly for types defined as records,
but can also be given explicitly for any type.
-------- FIELD SELECTION --------
A field selection expression of the form
expr.label
is equivalent to
(.label) expr
where
(.label) :: (r.label :: a) => r -> a
is an overloaded selector function.
(rec.label :: a) is syntax for Has_label rec a, where Has_label is
the implicitly defined class for this label. Such a class would look
like this if it could be defined as a normal class:
class Has_label r a | r -> a where
    (.label) :: r -> a
    set_label :: r -> a -> r
except that neither Has_label nor set_label exists as a real name.
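To make the translation concrete: under GHC's multi-parameter classes
and functional dependencies (-fglasgow-exts), the class generated for a
label such as zero (used in the examples below) might look roughly like
this; Has_zero, get_zero and set_zero are placeholder names of my own,
since in the proposal the getter is written (.zero) and none of these
names is visible to the user:

class Has_zero r a | r -> a where
    get_zero :: r -> a        -- plays the role of (.zero)
    set_zero :: r -> a -> r   -- plays the role of the implicit setter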
-------- DEFINITION OF RECORD TYPES --------
The definition of a record type:
data Monoid e = record
    zero :: e
    plus :: e -> e -> e
defines the appropriate single-constructor algebraic type and
obvious instances:
instance (Monoid e).zero :: e where ...
instance (Monoid e).plus :: e -> e -> e where ...
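With the placeholder naming scheme from the previous section, the
generated type and instances could come out roughly as follows
(MkMonoid, the monoid_* selectors and the Has_plus class, analogous to
Has_zero, are again only illustrative names, not literal preprocessor
output):

data Monoid e = MkMonoid
    { monoid_zero :: e
    , monoid_plus :: e -> e -> e
    }

instance Has_zero (Monoid e) e where
    get_zero     = monoid_zero
    set_zero r x = r { monoid_zero = x }

instance Has_plus (Monoid e) (e -> e -> e) where
    get_plus     = monoid_plus
    set_plus r f = r { monoid_plus = f }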
We can construct values of this type thus:
numAddMonoid :: Num e => Monoid e
numAddMonoid = record
    zero = 0
    plus = (+)
The meaning of such overloaded record creation expressions will be
specified later.
-------- INHERITANCE --------
Here is another example of a record type definition:
data Group e = record
    monoid :: Monoid e
    minus :: e -> e -> e
    neg :: e -> e
    monoid (zero, plus)
    x `minus` y = x `plus` neg y
    neg y = zero `minus` y
This record type has three direct members: monoid, minus, and neg.
monoid holds its zero and plus.
We want to be able to extract zero and plus of a group directly,
instead of going through the underlying monoid. We could define
appropriate instances:
instance (Group e).zero :: e where ...
instance (Group e).plus :: e -> e -> e where ...
and this is what the inheritance declaration
monoid (zero, plus)
does automatically for us.
So groups too have zero and plus, which are delegated to the monoid.
Seen from outside, these fields are indistinguishable from Group's
own fields.
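In terms of the placeholder classes sketched earlier, the delegation
could be expressed roughly like this (get_monoid and set_monoid are
the assumed accessors generated for the monoid field itself):

instance Has_zero (Group e) e where
    get_zero     = get_zero . get_monoid
    set_zero g x = set_monoid g (set_zero (get_monoid g) x)

instance Has_plus (Group e) (e -> e -> e) where
    get_plus     = get_plus . get_monoid
    set_plus g f = set_monoid g (set_plus (get_monoid g) f)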
-------- DEFAULT DEFINITIONS --------
minus and neg in Group have default definitions expressed in terms
of each other. When making a Group we can provide the definition of
either one of them or of both; otherwise both will diverge.
We could provide default definitions for inherited methods too. If
they already had default definitions in the supertype, those would
be overridden.
This is how the system expresses OO methods belonging to a type: by
default definitions. They can be overridden in subtypes or at object
creation time.
How can the default definition of minus refer to the definition of
neg, which will only be supplied later? At the point of the type
definition it is not yet known which fields will be specified at
creation time. OTOH at creation time it is not known which fields
have default definitions, because the creation expression is
polymorphic over record types containing specific fields and will be
instantiated based on the context.
There is a standard class defined as follows:
class Record r where
    bless :: r -> r
A record creation expression, say:
record
    zero = 0
    plus = (+)
is syntactic sugar for a recursively defined object:
let this = bless this `set_zero` 0 `set_plus` (+)
in this
The bless function, named after Perl's mechanism used in a similar
context, returns a record with all fields initialized using their
default definitions, or bottoms for fields with no defaults. Default
definitions refer to other fields through the parameter of bless.
As seen above, bless is applied to the record to be constructed, and
then fields with values specified at creation time are overridden.
That way all field definitions can find right versions of other fields,
no matter which were defined together with the type and which were
supplied at the creation time.
The type of the above record creation expression is
(Record r, Num a, Num b, r.zero :: a, r.plus :: b -> b -> b) => r
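Spelled out with the placeholder names used in the earlier sketches,
the desugared creation expression and its type would read roughly
(this is only my rendering, not literal preprocessor output):

example :: (Record r, Num a, Num b,
            Has_zero r a, Has_plus r (b -> b -> b)) => r
example =
    let this = set_plus (set_zero (bless this) 0) (+)
    in  this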
-------- DEFINITION OF BLESS --------
Definition of a record type automatically makes it an instance of
the class Record.
A field from which some other fields are inherited is initialized to
the blessed value of the same field taken from the parameter of bless,
modified by setting those of the inherited fields which have default
definitions. It sounds complicated, but this is what yields the right
bindings of all definitions.
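Under the same placeholder names (with MkGroup, the group_* selectors
and the getters for minus, neg and monoid being the analogous
inventions), my reading of the description above gives roughly these
generated instances:

instance Record (Monoid e) where
    -- Monoid has no default definitions, so the initial field
    -- values are bottoms and must be overridden at creation time.
    bless _this = MkMonoid
        { monoid_zero = error "Monoid: no definition of zero"
        , monoid_plus = error "Monoid: no definition of plus"
        }

instance Record (Group e) where
    -- minus and neg get their default definitions; they refer to
    -- plus, neg, zero and minus through the finished record "this".
    -- The monoid field, from which zero and plus are inherited,
    -- is itself blessed from the corresponding field of "this".
    bless this = MkGroup
        { group_monoid = bless (get_monoid this)
        , group_minus  = \x y -> get_plus this x (get_neg this y)
        , group_neg    = \y   -> get_minus this (get_zero this) y
        }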
If a type behaves like a record, it is a record. You can make Record
instances of arbitrary types, making them constructible using the
record syntax.
bless should be lazy. Field setters can be strict.
-------- UPDATING FIELDS --------
If fields represent state changing over time, they can be mutable
references. Fields can also be updated in a functional style, but
this is really construction of new objects based on old ones.
Field update syntax is as follows:
expr.record
    label1 = value1
    label2 = value2
It is equivalent to simple nested set_label applications.
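That is, with the placeholder setters, the update above corresponds
roughly to:

set_label2 (set_label1 expr value1) value2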
Fields initialized with default definitions will not switch to refer
to updated values of other fields! All magic already happened at
record creation time.
This can be changed in at least two ways. First, you can define
instances of the appropriate Has_label classes yourself and associate
arbitrary magic with field updates. Second, for a field that you want
to be a function of other fields, you can define such an instance
instead of putting the field in the record directly.
Definitions of two methods of Has_label classes have special syntax:
instance (a,b).fst :: a where
    (a,_).fst = a
    (_,b).record {fst = a} = (a,b)
instance (a,b).snd :: b where
    (_,b).snd = b
    (a,_).record {snd = b} = (a,b)
I.e. pattern.label is equivalent to (.label) pattern and defines the
getter function, and pattern1.record {label = pattern2} defines the
setter, when applied to a record matching pattern1 and a field value
matching pattern2. Braces can be omitted, but they make the syntax
clearer.
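With the placeholder names, these two instances would translate
roughly to:

instance Has_fst (a, b) a where
    get_fst (a, _)   = a
    set_fst (_, b) a = (a, b)

instance Has_snd (a, b) b where
    get_snd (_, b)   = b
    set_snd (a, _) b = (a, b)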
-------- SYNTAX DETAILS --------
The record keyword triggers the layout rules. Value definitions after
the record keyword look like let bindings. They can be defined by
cases with argument patterns on the left of the equal sign.
In record type definitions, record creations and record updates, the
definitions of fields can refer to all fields mentioned in those
constructs in unqualified form. They can also refer to a special
variable called this, which holds the whole record after construction
or update.
-------- EXAMPLE --------
This example introduces a feature of renaming fields while inheriting.
> data Monoid e = record
>     zero :: e
>     plus :: e -> e -> e
>
> numAddMonoid :: Num e => Monoid e
> numAddMonoid = record
>     zero = 0
>     plus = (+)
>
> numMulMonoid :: Num e => Monoid e
> numMulMonoid = record
>     zero = 1
>     plus = (*)
>
> data Group e = record
>     monoid :: Monoid e
>     minus :: e -> e -> e
>     neg :: e -> e
>     monoid (zero, plus)
>     x `minus` y = x `plus` neg y
>     neg y = zero `minus` y
>
> numAddGroup :: Num e => Group e
> numAddGroup = record
>     monoid = numAddMonoid
>     minus = (-)
>     neg = negate
>
> numMulGroup :: Fractional e => Group e
> numMulGroup = record
>     monoid = numMulMonoid
>     minus = (/)
>     neg = recip
>
> data Ring e = record
>     addGroup :: Group e
>     mulMonoid :: Monoid e
>     addGroup (monoid as addMonoid, zero, plus, minus, neg)
>     mulMonoid (zero as one, plus as times)
>
> numRing :: Num e => Ring e
> numRing = record
>     addGroup = numAddGroup
>     mulMonoid = numMulMonoid
>
> data Field e = record
>     addGroup :: Group e
>     mulGroup :: Group e
>     addGroup (monoid as addMonoid, zero, plus, minus, neg)
>     mulGroup (monoid as mulMonoid, zero as one, plus as times,
>               minus as div, neg as recip)
>
> instance (Field e).ring :: Ring e where
>     f.ring = record
>         addGroup = f.addGroup
>         mulMonoid = f.mulMonoid
>     f.record {ring = r} = f.record
>         addGroup = r.addGroup
>         mulMonoid = r.mulMonoid
>
> -- Alternatively a Field could consist of a Ring and div + recip.
> -- The difference is an implementation detail not visible outside.
> -- The following definition will work with either variant:
>
> numField :: Fractional e => Field e
> numField = record
>     addGroup = numAddGroup
>     mulGroup = numMulGroup
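As a small usage example of my own (not part of the development
above), fields of these records are selected with the proposed
dot syntax:

> -- x*x + y*y, using the plus and times fields of a ring;
> -- e.g. norm2 numRing 3 4 == 25
> norm2 :: Ring e -> e -> e -> e
> norm2 r x y = r.plus (r.times x x) (r.times y y)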
-------- PROBLEMS --------
If those records are to simulate classes, they should be able to have
polymorphic fields. Unfortunately overloaded setters do not work in
this case. I don't know a good solution.
Similarly we would want to have records with existentially quantified
types. Again, overloaded getters and setters do not work.
Listing all inherited fields can be annoying. It could not really
work otherwise, though, as arbitrary instances for supertypes can be
added at any time. It is not necessary to list all fields: other
fields are still available through the field we inherit from anyway.
It would be desirable to selectively export instances.
-------- PROTOTYPE IMPLEMENTATION --------
I have an implementation of this in the form of a preprocessor,
based on hssource from ghc-4.11's hslibs. I will polish it and put
it up for download to let people play with my records. I hope to have
more interesting examples.
The difference between this implementation and the above proposal
is that the types of inherited fields must be given explicitly. This
is because the delegation instances would otherwise have types that
ghc does not accept, and they would require
-fallow-undecidable-instances even if they were accepted (which is no
surprise, because cyclic inheritance makes it impossible to determine
the type of the field).
I reported the problem under the subject "Problem with functional
dependencies" on December 17th. I believe that both problems can
be fixed, especially if handling those constructs were inside the
compiler.
-------- THE REST OF MY REPLY TO GEORGE RUSSELL --------
> (1) We extend type classes to allow them to introduce types.
If your classes were expressed as my records, it would roughly
correspond to existential quantification. But there are big problems
with typechecking in this approach.
I hope somebody will invent a solution.
--
__("< Marcin Kowalczyk * qrczak@knm.org.pl http://qrczak.ids.net.pl/
\__/
^^ PLACEHOLDER SIGNATURE
QRCZAK