Thu, 4 Oct 2001 20:47:36 +0200
On Sunday, 30 September 2001 20:01, John Meacham wrote:
> sorry for the me too post, but this has been a major pet peeve of mine
> for a long time. 16-bit Unicode should be gotten rid of, being the worst
> of both worlds: not backwards compatible with ASCII, endianness issues,
> and no constant-length encoding. UTF-8 externally and UTF-32 when
> working with individual characters is the way to go.
I totally agree with you.
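To make the "UTF-8 externally" point concrete, here is a minimal sketch (not from the original mail) of encoding a single Unicode code point into UTF-8 bytes; the function name `utf8Encode` is my own and the code assumes code points in the range 0-0x10FFFF:

```haskell
import Data.Word (Word8)
import Data.Bits (shiftR, (.&.), (.|.))

-- Hypothetical sketch: encode one code point (0..0x10FFFF) as UTF-8 bytes.
-- ASCII stays a single byte, which is exactly the backwards compatibility
-- that UTF-16 lacks.
utf8Encode :: Int -> [Word8]
utf8Encode c
  | c < 0x80    = [fromIntegral c]
  | c < 0x800   = [0xC0 .|. fromIntegral (shiftR c 6), cont c]
  | c < 0x10000 = [0xE0 .|. fromIntegral (shiftR c 12), cont (shiftR c 6), cont c]
  | otherwise   = [0xF0 .|. fromIntegral (shiftR c 18), cont (shiftR c 12),
                   cont (shiftR c 6), cont c]
  where
    -- continuation byte: 10xxxxxx
    cont x = 0x80 .|. fromIntegral (x .&. 0x3F)

main :: IO ()
main = print (utf8Encode 0x20AC)  -- Euro sign U+20AC -> [226,130,172]
```

Note that no byte-order mark or endianness handling is needed: UTF-8 is a byte stream, so the issues quoted above simply do not arise.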
> seeing as how the Haskell standard is horribly vague when it comes to
> character set encodings anyway, I would recommend that we just omit any
> reference to the bit size of Char, and just say abstractly that each
> Char represents one Unicode character, but that the entire range of
> Unicode is not guaranteed to be expressible -- which must be true, since
> Haskell 98 implementations can be written now, but Unicode can change in
> the future. The only range guaranteed to be expressible in any
> representation is the values 0-127, US-ASCII (or perhaps Latin-1).
This also sounds very good.
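As a concrete reading of "only 0-127 is guaranteed", a portable program could treat code points above US-ASCII as possibly unrepresentable. This is my own illustrative sketch, not anything from the proposal; `portableChr` is a hypothetical name:

```haskell
import Data.Char (chr)

-- Hypothetical sketch: only code points 0-127 (US-ASCII) are assumed
-- expressible in every implementation's Char; anything above may not be.
portableChr :: Int -> Maybe Char
portableChr n
  | n >= 0 && n <= 127 = Just (chr n)
  | otherwise          = Nothing

main :: IO ()
main = print (map portableChr [65, 955])  -- 'A' is safe, U+03BB may not be
```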