[Haskell-cafe] Data.Binary and little endian encoding
Don Stewart
dons at galois.com
Fri May 15 00:37:22 EDT 2009
timd:
> On a related matter, I am using Data.Binary to serialise data from
> Haskell for use from other languages. The Data.Binary encoding of a
> Double is an arbitrary-precision integer for the mantissa and an int
> for the exponent. This doesn't work too well for interacting with
> other languages, as I'd need an arbitrary-precision int type there to
> decode/encode. The CORBA CDR standard encodes doubles in a big-endian
> fashion like this (excuse my possibly incorrect ascii art):
>
>
> | byte | msb                         lsb |
> |------+---------------------------------|
> |    0 | S   E6  E5  E4  E3  E2  E1  E0  |
> |    1 | E10 E9  E8  E7  F3  F2  F1  F0  |
> |    2 | F11 F10 F9  F8  F7  F6  F5  F4  |
> |    3 | F19 F18 F17 F16 F15 F14 F13 F12 |
> |    4 | F27 F26 F25 F24 F23 F22 F21 F20 |
> |    5 | F35 F34 F33 F32 F31 F30 F29 F28 |
> |    6 | F43 F42 F41 F40 F39 F38 F37 F36 |
> |    7 | F51 F50 F49 F48 F47 F46 F45 F44 |
>
> Up until now, my code has been pure Haskell. Is it possible to get at
> the internal bits of a Double/CDouble in GHC, or should I use the FFI
> and write C to encode something like the above?
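
(For reference, the stock instance boils down to roughly the sketch
below: it serialises the decodeFloat pair, i.e. an arbitrary-precision
Integer mantissa plus an Int exponent, rather than the raw IEEE 754
bits. This is an illustration of the idea, not the library's literal
source.)

    -- Roughly what the quoted behaviour amounts to: the stock
    -- Binary Double instance writes the decodeFloat pair, not the
    -- raw IEEE 754 bit pattern.
    import Data.Binary     (get, put)
    import Data.Binary.Get (Get)
    import Data.Binary.Put (Put)

    putDoubleAsPair :: Double -> Put
    putDoubleAsPair d = put (decodeFloat d)  -- (mantissa :: Integer, exponent :: Int)

    getDoubleAsPair :: Get Double
    getDoubleAsPair = do
      (m, e) <- get
      return (encodeFloat m e)               -- rebuild the Double
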
Yep, it's possible, just not portably so. Google for Data.Binary IEEE
discussions.
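
One common, explicitly non-portable trick (it assumes the host's Double
is a 64-bit IEEE 754 value, which is true for GHC on the usual
platforms) is to reinterpret the Double's storage as a Word64 via
Foreign marshalling and then hand that word to Data.Binary's big-endian
writer. A minimal sketch under that assumption; the names
putDoubleBE/getDoubleBE are illustrative, not an existing API:

    -- Non-portable: assumes Double is stored as a 64-bit IEEE 754 value.
    -- The Double is poked into a scratch buffer and read back as a
    -- Word64, which Data.Binary can then emit big-endian (CDR-style).
    import Data.Binary.Get       (Get, getWord64be)
    import Data.Binary.Put       (Put, putWord64be)
    import Data.Word             (Word64)
    import Foreign.Marshal.Alloc (alloca)
    import Foreign.Ptr           (castPtr)
    import Foreign.Storable      (peek, poke)
    import System.IO.Unsafe      (unsafePerformIO)

    doubleToWord64 :: Double -> Word64
    doubleToWord64 d = unsafePerformIO $ alloca $ \p -> do
      poke p d
      peek (castPtr p)

    word64ToDouble :: Word64 -> Double
    word64ToDouble w = unsafePerformIO $ alloca $ \p -> do
      poke p w
      peek (castPtr p)

    putDoubleBE :: Double -> Put
    putDoubleBE = putWord64be . doubleToWord64

    getDoubleBE :: Get Double
    getDoubleBE = fmap word64ToDouble getWord64be

Data.Binary.Put also provides putWord64le, so the same bit pattern can
be written little-endian if that is what the other side expects.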