[Haskell-cafe] opengl type confusion
allbery.b at gmail.com
Sun Jun 16 22:15:25 CEST 2013
On Sun, Jun 16, 2013 at 4:03 PM, <briand at aracnet.com> wrote:
> Changing the declaration to GLdouble -> GLdouble -> GLdouble -> IO() and
> (0.0::GLdouble) fixes it, and I'm not clear on why it's not automagic.
> There are many times I see the
Haskell never "automagic"s types in that context; if it expects GLdouble,
it expects GLdouble. Pretending it's Double will not work. It "would" in
the specific case that GLdouble were actually a type synonym for Double;
however, for performance reasons it is not. Haskell Double is not directly
usable from the C-based API used by OpenGL, so GLdouble is a type synonym
for CDouble, which is.
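The distinction is visible with CDouble from base (Foreign.C.Types), which is what the OpenGL bindings of that era defined GLdouble to be; a minimal sketch, with no OpenGL dependency:

```haskell
import Foreign.C.Types (CDouble)

-- GLdouble is a synonym for CDouble, not for Double, so an existing
-- Double must be converted explicitly; Haskell never inserts the
-- conversion for you.
vertexX :: Double
vertexX = 0.25

asGL :: CDouble
asGL = realToFrac vertexX  -- explicit Double -> CDouble conversion

main :: IO ()
main = print asGL
```

Dropping the realToFrac and writing `asGL = vertexX` is a type error, which is exactly the situation in the original post.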
> compiler doing type conversion on numeric arguments, although sometimes
> the occasional fracSomethingIntegralorOther is required.
I presume the reason a type annotation is required on the numeric literal is
that there is no defaulting (and probably can't be, without introducing other
strange type issues) for GLdouble.
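A sketch of why the annotation matters, again using CDouble to stand in for GLdouble and a hypothetical `translate` function (not the real OpenGL API):

```haskell
import Foreign.C.Types (CDouble)

-- Hypothetical stand-in for an OpenGL call taking GLdouble arguments.
translate :: CDouble -> CDouble -> CDouble -> IO ()
translate x y z = putStrLn ("translate " ++ show (x, y, z))

main :: IO ()
main = do
  -- A bare `let z = 0.0` would be monomorphised and defaulted to
  -- Double, which then fails to unify with CDouble; the explicit
  -- annotation pins the literal to the type the callee expects.
  let z = 0.0 :: CDouble
  translate z z z
```

When the literal appears directly in the call, as in `translate 0.0 0.0 0.0`, inference fixes its type from the signature and no annotation is needed; the annotation is for contexts where the literal's type is otherwise ambiguous.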
In any case, the very fact that you refer to "automagic" and "type
conversion" indicates that you don't really have an understanding of how
Haskell's numeric types work; this will lead you into not only this kind of
confusion, but worse problems later. In particular, you're going to get
into dreadful messes where you expect Haskell to transparently deal with
strange combinations of numeric types as if Haskell were (almost-typeless)
Perl or something, and you'll have real trouble getting that code to work
until you sit down and figure out how strong typing and Haskell's numeric
classes interact.
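Concretely, the "fracSomethingIntegralorOther" functions alluded to above are fromIntegral and realToFrac, and every numeric conversion in Haskell goes through one of them explicitly; a small sketch:

```haskell
import Foreign.C.Types (CDouble)

n :: Int
n = 3

-- d = n + 0.5            -- rejected: no implicit Int -> Double conversion
d :: Double
d = fromIntegral n + 0.5  -- explicit conversion from any Integral type

c :: CDouble
c = realToFrac d          -- explicit Real -> Fractional conversion

main :: IO ()
main = print (d, c)       -- prints (3.5,3.5)
```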
brandon s allbery kf8nh sine nomine associates
allbery.b at gmail.com ballbery at sinenomine.net
unix, openafs, kerberos, infrastructure, xmonad http://sinenomine.net