[Haskell-beginners] GLfloat cast... have I got it right?
Arlen Cuss
celtic at sairyx.org
Fri Jul 8 02:04:16 CEST 2011
> Right. Precisely, it forces the fields to WHNF *when the entire value is
> forced to WHNF*.
A-ha! That gives a good definition for me to work from.
Thanks for the seq examples; I'm only just starting to get to the point
where (I think) I can consider these issues with any insight.
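To check my understanding, here is a tiny example of my own (not one of
the ones from your mail): forcing the value to WHNF forces the strict
fields, but only to WHNF each.

    -- first field a strict Int, second field a strict String
    data P = P !Int !String

    main :: IO ()
    main = do
      let p = P (1 + 2) ('a' : undefined)
      -- Forcing p to WHNF evaluates 1 + 2 fully (an Int in WHNF is
      -- fully evaluated), but the String field is only evaluated to
      -- its first (:) cell; the undefined tail is never touched, so
      -- this prints without an error.
      p `seq` putStrLn "p is in WHNF"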
> So putting a `!' on a component of, say, type String only forces it enough
> to see whether it's empty or not. `!' is most useful for types where WHNF
> implies sufficient evaluation, which means the constructors of that type
> need to have strict fields too (if any). Types like Integer, Int, Word,
> Double, Float have (in GHC) strict fields in the form of "raw bytes",
> Data.Map has `!'-ed fields (but the values are lazy), so with
>
> data Quux a b = Q ... !(Map a b) ...
>
> forcing a value of such a type to WHNF also forces the entire spine of the
> Map, which often is sufficient, but not always. If you also need the values
> in the Map to be evaluated, you have to use other methods (normally it's
> best to make sure the values are evaluated when they are inserted into the
> Map; doing that later tends to be expensive).
That makes a lot of sense.
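So for the Map case, I picture something like this (a sketch with my
own names, loosely following the Quux declaration above):

    import qualified Data.Map as Map

    -- The `!' field means forcing a Quux to WHNF forces the whole
    -- spine of the Map, but the Int values inside stay as whatever
    -- thunks were inserted.
    data Quux = Quux !(Map.Map String Int)

    -- Force each value just before insertion, so spine-strictness is
    -- enough: once a Quux is in WHNF, nothing unevaluated is left.
    insertStrict :: String -> Int -> Quux -> Quux
    insertStrict k v (Quux m) = v `seq` Quux (Map.insert k v m)

    main :: IO ()
    main = case insertStrict "answer" (6 * 7) (Quux Map.empty) of
             Quux m -> print (Map.toList m)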
> Re a): Number literals come with an implicit conversion function,
> 1234 and 5.678 stand for "fromInteger 1234" and "fromRational 5.678" respectively,
> and a type signature tells the compiler which fromInteger/fromRational to
> invoke. Number literals are polymorphic expressions and polymorphic
> expressions can be "cast" to a specific type by a type signature [which
> tells the compiler which fromInteger, return, ... to use]. To convert a
> monomorphic expression, however, the conversion function has to be
> explicitly invoked.
Polymorphic expressions are handy :-)
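Which is, I think, exactly the situation I hit with GLfloat: a literal
needs only a type annotation, but a monomorphic Int needs an explicit
conversion. Roughly (using Float here; GLfloat behaves the same way,
being a Num/Fractional instance):

    -- A literal is a polymorphic expression: a signature alone picks
    -- which fromRational/fromInteger to use.
    half :: Float
    half = 0.5                 -- elaborated as `fromRational 0.5'

    -- A monomorphic Int has no implicit conversion; it must be asked
    -- for explicitly.
    scale :: Int -> Float
    scale n = fromIntegral n * half

    main :: IO ()
    main = print (scale 3)     -- 1.5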
One of the things that surprised me the most - particularly coming to
Haskell by way of ML - was terms like `minBound', `maxBound', `read x'
and so on.
Seeing how any term could be defined per-instance in a typeclass -- not
just functions -- was a key moment; it's easy to get stuck, very hard,
in a given mindset. Coming from ML, polymorphic functions made sense,
but *plain values*?
(Or not so plain.)
Of course, where typeclasses are concerned, there's not much difference,
but I didn't even consider that at first.
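For instance (with a made-up class of my own; minBound already works
this way in Bounded):

    -- A class member that is a plain value, not a function: each
    -- instance simply picks its own `def', the way Bounded picks
    -- minBound and maxBound.
    class HasDefault a where
      def :: a

    instance HasDefault Int where
      def = 0

    instance HasDefault Bool where
      def = False

    main :: IO ()
    main = do
      print (minBound :: Int)    -- an instance-chosen plain value
      print (def :: Bool)        -- and one from the made-up class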
A