Integer Enum?

Matt Harden matth@mindspring.com
Sat, 15 Dec 2001 17:04:19 -0600


> Frank Dellaert wrote:
> 
> Hi
> I'm not entirely clear how Integer can be an Enum instance. I thought
> Integer was arbitrary-size, while for the Enum class you need to
> define a mapping from and to Int, which is bounded (in a
> machine-dependent way, even?). I'm probably missing something obvious...

You're right to wonder about that.  It's even worse than you thought. 
Float and Double are also Enum instances!

IMHO, this is a wart in the Haskell definition.  Enum is used for two
purposes: to support the [x,y..z] syntactic sugar, and to define
conversion to/from Int.  I think everybody agrees that the [x,y..z]
syntax should support Integers, so Integer has to be an instance of
Enum.  I suspect that, for convenience, the fromEnum and toEnum
functions were put in the class to allow easier definitions of new Enum
instances for small bounded enumerations.  This has the undesirable
effect of forcing the implementer of an Enum instance to produce a
mapping to/from Int even when it doesn't make any sense.  My preference
would be to define them as errors in that case; the Haskell Report does
not.
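
To make the two roles concrete, here's a minimal sketch (the names are
my own, purely for illustration): the [x,y..z] sugar desugars to the
enumFromThenTo method, while deriving Enum on a small bounded
enumeration supplies the "convenient" Int mapping for free:

    -- The list sugar is just Enum methods in disguise:
    xs :: [Integer]
    xs = [0, 2 .. 10]      -- means: enumFromThenTo 0 2 10

    -- For a small bounded enumeration, deriving Enum gives the
    -- Int mapping automatically:
    data Color = Red | Green | Blue deriving (Show, Enum, Bounded)

    pair :: (Int, Color)
    pair = (fromEnum Blue, toEnum 1)   -- (2, Green)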

The Report defines fromEnum for Float and Double; it is silent on
Integer, and the Ratio module in the Library Report defines fromEnum for
Ratio.  In all cases, the reports specify truncation to integer, and
there is a comment that the conversion may overflow.  Complex does not
define an Enum instance, even though it would make sense to me to be
able to write [0, 1:+2 .. 5:+10] or the like.  I guess fromEnum for
Complex, if it existed, would have to just "truncate" the imaginary
part!
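
Just to make that speculation concrete, here is roughly what such an
instance might look like.  The Report defines no Enum instance for
Complex, so everything below is hypothetical; in particular, comparing
only the real parts for termination is merely the simplest choice:

    import Data.Complex (Complex(..), realPart)
    -- (the Library Report calls this module Complex)

    -- Hypothetical: fromEnum truncates the real part and silently
    -- discards the imaginary part; the step is the complex
    -- difference y - x.
    instance RealFloat a => Enum (Complex a) where
      toEnum n   = fromIntegral n :+ 0
      fromEnum z = truncate (realPart z)
      enumFromThenTo x y z =
        takeWhile (\c -> realPart c <= realPart z)
                  (iterate (+ (y - x)) x)

With that, [0, 1:+2 .. 5:+10] would step by 1:+2 and stop at 5:+10.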

Regarding the Integer instance, Hugs raises an error on fromEnum x only
when x is out of range.  GHC just returns the value modulo 2^32, or
something like that.
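
For example, a tiny probe (the behavior is implementation-dependent;
the comments below just restate what was observed above):

    main :: IO ()
    main = print (fromEnum (2 ^ 32 + 5 :: Integer))
      -- Hugs: runtime error (argument out of range)
      -- GHC (32-bit): prints 5, i.e. the value modulo 2^32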

My advice is to ignore fromEnum and toEnum for Integer, Float, Double,
and Ratio, and, if you need to define your own Enum instance, to pick
some reasonable mapping to Int and not worry much about truncation and
overflow... the Haskell designers didn't!  :)
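
Here's one way to follow that advice (EvenInt is a made-up example
type): pick a sensible Int mapping and let the default Enum methods do
the rest:

    -- A made-up type: the even integers.
    newtype EvenInt = EvenInt Integer deriving (Show, Eq, Ord)

    instance Enum EvenInt where
      toEnum n             = EvenInt (2 * fromIntegral n)
      fromEnum (EvenInt i) = fromIntegral (i `div` 2)
        -- may overflow for huge i; per the advice above, don't sweat it

    -- The default methods now give us the list sugar:
    evens :: [EvenInt]
    evens = [EvenInt 0, EvenInt 4 .. EvenInt 20]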


Matt Harden

P.S.  I hope the Haskell designers don't take offense; in reality, they
have nothing but my deepest respect!