[Haskell-cafe] FW: Haskell
Andrew Bagdanov
bagdanov at gmail.com
Tue Apr 1 12:17:46 EDT 2008
On Tue, Apr 1, 2008 at 4:55 PM, Loup Vaillant <loup.vaillant at gmail.com> wrote:
> 2008/4/1, Andrew Bagdanov <bagdanov at gmail.com>:
>
> >
> > In short, I think the original question must be asked in context. For
> > some problems, types are just a natural way to start thinking about
> > them. For others dynamic typing, with _judicious_ use of macros to
> > model key aspects, is the most natural approach.
>
> Do you have any example? I mean, you had to choose between Scheme and
> Ocaml, sometimes, right? Ocaml is not Haskell, but maybe the reasons
> which influenced your choices would have been similar if you knew
> Haskell instead of Ocaml.
>
Sure. This may not be the best example, but it's the most immediate
one for me. I'll try to be brief and hopefully still clear... Years
ago I implemented an image processing system based on algorithmic
patterns of IP defined over algebraic pixel types (algebraic in the
ring, field, vector space sense). Here's a link to the chapter from
my dissertation, for the chronically bored:
http://www.micc.unifi.it/~bagdanov/thesis/thesis_08.pdf
This was partially motivated by the observation that a lot of image
processing is about _types_ and about getting them _right_. There's a
complex interplay between the numerical, computational and perceptual
semantics of the data you need to work with. A functional programming
language with strict typing and type inference seemed ideal for
modeling this. You get plenty of optimizations for "free" when
lifting primitive operations to work on images (except OCaml functors
really let me down here), and you don't have to worry about figuring out
what someone means when convolving a greyscale image with a color
image -- unless you've already defined an instantiation of the
convolution on these types that has a meaningful interpretation.
Where "meaningful" is of course up to the implementor.
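To make that concrete, here's a toy Haskell sketch of the idea (names like `Pixel` and `lift2` are made up for illustration, nothing from the thesis): pixels sit behind a small class, a binary primitive gets lifted pointwise over images, and an ill-typed combination of images simply fails to compile:

```haskell
-- Purely illustrative sketch, not the thesis code: an image is a map
-- from coordinates to pixels, and a binary pixel primitive is lifted
-- pointwise over two images.
module PixelLift where

import qualified Data.Map as M

-- Assumed pixel interface; names are invented for this example.
class Pixel p where
  pAdd  :: p -> p -> p
  pMul  :: p -> p -> p
  pZero :: p

instance Pixel Double where
  pAdd = (+); pMul = (*); pZero = 0

type Image p = M.Map (Int, Int) p

-- Lift a binary primitive to images, pointwise on shared coordinates.
lift2 :: (p -> p -> p) -> Image p -> Image p -> Image p
lift2 = M.intersectionWith

main :: IO ()
main = print (M.toList (lift2 pAdd a b))
  where
    a = M.fromList [((0,0), 1.0), ((0,1), 2.0)] :: Image Double
    b = M.fromList [((0,0), 3.0), ((0,1), 4.0)]
```

Since both arguments to `lift2 pAdd` must be images over the *same* pixel type, convolving a greyscale image with a color image is rejected by the compiler unless you've defined an instantiation for that pair of types.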
In retrospect, if I had it all to do over again, I might choose Scheme
over OCaml specifically because of dynamic typing. Or more flexible
typing, rather. To completely define a new pixel datatype it is
necessary to define a flotilla of primitive operations on it (e.g.
add, mul, neg, div, dot, abs, mag, ...) but for many higher-level
operations, only a handful were necessary. For example, for a
standard convolution, mul and add are sufficient. In cases like this,
I would rather explicitly dispatch at a high level -- in a way that
admits partial implementations of datatypes to still play in the
system. In retro-retrospect, the structural typing of OCaml objects
could probably do this pretty well too... Oh well.
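In Haskell, one way to get that "partial implementations still play" behavior is to split the flotilla into one-method type classes, so each operation's constraint names exactly the primitives it uses. A hedged sketch, with invented class names (`PixAdd`, `PixMul`), purely to illustrate the shape:

```haskell
-- Illustrative only: split the "flotilla" into one-method classes so
-- that an operation's constraint lists exactly the primitives it uses.
module MinimalPixel where

class PixAdd p where padd :: p -> p -> p
class PixMul p where pmul :: p -> p -> p

-- The accumulation at the heart of a convolution needs only mul and
-- add, so only those two classes appear in the constraint.
dotKernel :: (PixAdd p, PixMul p) => [p] -> [p] -> p
dotKernel xs ks = foldr1 padd (zipWith pmul xs ks)

-- A pixel type that implements only these two ops still participates;
-- no neg, div, dot, abs, or mag instances are required.
newtype Grey = Grey Double deriving (Show, Eq)

instance PixAdd Grey where padd (Grey a) (Grey b) = Grey (a + b)
instance PixMul Grey where pmul (Grey a) (Grey b) = Grey (a * b)

main :: IO ()
main = print (dotKernel [Grey 1, Grey 2] [Grey 3, Grey 4])
```

A type providing only `padd` and `pmul` can be convolved, while operations needing `neg` or `div` remain unavailable for it -- and the compiler, not a runtime dispatch, enforces the boundary.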
This is a case where the resulting system was difficult to use in the
exploratory, experimental way it was intended to be used, in my opinion
because typing got in the way. Strict typing and type inference were
a huge boon for the design and implementation. I would consider
Haskell this time around too (I think I did all those years ago too),
as I think lazy evaluation semantics, direct support of monadic style,
and yes, even its terse syntax, could address other aspects of the
domain that are untouched. I don't have a clear enough understanding
of or experience with Haskell type classes, but my intuition is that
I'd have the same problems with typing as I did with OCaml.
Cheers,
-Andy
> Cheers,
> Loup
>