ideas for compiler project
Jerzy Karczmarczuk
karczma@info.unicaen.fr
Fri, 01 Feb 2002 13:18:56 +0100
MOVED HERE from the Haskell list.
Eray Ozkural about numerical/matrix stuff in Matlab/Haskell:
> ... I don't think that it's feasible to write a Haskell library that
> does it, or extend Haskell such that it becomes "linear algebra" aware.
>
> I suppose the right direction is to write a compiler/interpreter for a linear
> algebra/numerical language in Haskell!
>
> That language can be made very mathematical, and still much more capable and
> efficient than Matlab. Otherwise all you're going to have is another Matlab
> clone. The hard part here is of course the design of this specific
> language...
>
> Nevertheless, writing a Matlab clone in Haskell would be fun as well! It
> could surely be more extensible and reliable than Matlab itself.
My 2 euro-cents:
Compiler? Yes, why not?
Interpreter? You mean a virtual machine able to do all those array
manipulations FAST?
But you will get into the same problem as with a Haskell library...
The "kernel" with fast matrix multiplication, with really damn fast
submatrix extraction, with blitzing convolutions, Fourierisms etc. need
a rather low-level access tools to the memory.
The same story holds for bitmap processing.
Look at Smalltalk. Its compiler and a *good part* of the virtual machine
are written in Smalltalk. But when you have to snap an image from the screen,
copy it back, or move it around, there is no way around it: the PixBlt
primitives are written in C.
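Just to make this concrete: if the kernel primitives end up written in C,
the Haskell layer would reach them through the FFI, roughly like this (a
minimal sketch only; the name kernel_dgemm and its signature are invented
here purely for illustration, they are not any real library):

    {-# LANGUAGE ForeignFunctionInterface #-}
    module Kernel where

    import Foreign.Ptr     (Ptr)
    import Foreign.C.Types (CInt, CDouble)

    -- Hypothetical C primitive: c := a * b, where a is m x k and b is k x n,
    -- all three stored row-major in flat buffers owned by the kernel.
    foreign import ccall unsafe "kernel_dgemm"
      c_dgemm :: CInt        -- m
              -> CInt        -- k
              -> CInt        -- n
              -> Ptr CDouble -- a
              -> Ptr CDouble -- b
              -> Ptr CDouble -- c (output)
              -> IO ()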
So, I presume that a decent strategy would be the following, with points
A and B below developed in parallel.
A. Design a kernel which is as dumb as Napoleon's hat (as far as the
algebra is concerned), but which performs fast indexing and block transfers
in many typical cases: row extraction from a matrix, all the simple iterators
(sums, pair-wise products, etc.), you know what I mean. Such patterns
are not very numerous.
Make all of this primitive.
B. Design a sound, functional, typed layer for matrix algebra, but using
those blocks, slices, rows, sub-matrices, Kronecker products, etc. as
primitives (a small sketch of what I mean follows below).
Test both together on every algorithm you can, and when something,
I don't know, some Householder algorithm, some Lanczos iteration, turns out
to be too slow, analyze the performance critically and augment the
kernel layer.
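To show roughly what I mean by B sitting on top of A, here is a tiny,
deliberately naive sketch (all the kernel* functions below are stand-ins;
in reality they would be the fast primitives of the kernel layer, not
these do-nothing placeholders):

    module Matrix
      ( Matrix, rows, cols, row, submatrix, (|*|), kron ) where

    -- Opaque handle onto a block of storage managed by the low-level kernel.
    data KernelBlock = KernelBlock

    -- The typed layer only keeps the bookkeeping (dimensions); the data
    -- itself lives behind the kernel handle.
    data Matrix = Matrix { rows :: Int, cols :: Int, payload :: KernelBlock }

    -- Row extraction: the kernel does the actual block transfer.
    row :: Int -> Matrix -> Matrix
    row i m = Matrix 1 (cols m) (kernelRow i (payload m))

    submatrix :: (Int, Int) -> (Int, Int) -> Matrix -> Matrix
    submatrix (r0, r1) (c0, c1) m =
      Matrix (r1 - r0 + 1) (c1 - c0 + 1) (kernelSlice r0 r1 c0 c1 (payload m))

    -- Matrix product and Kronecker product, again delegated to the kernel.
    (|*|) :: Matrix -> Matrix -> Matrix
    a |*| b = Matrix (rows a) (cols b) (kernelMul (payload a) (payload b))

    kron :: Matrix -> Matrix -> Matrix
    kron a b = Matrix (rows a * rows b) (cols a * cols b)
                      (kernelKron (payload a) (payload b))

    -- Stand-ins for the primitive entry points of point A.
    kernelRow :: Int -> KernelBlock -> KernelBlock
    kernelRow _ k = k
    kernelSlice :: Int -> Int -> Int -> Int -> KernelBlock -> KernelBlock
    kernelSlice _ _ _ _ k = k
    kernelMul, kernelKron :: KernelBlock -> KernelBlock -> KernelBlock
    kernelMul _ k = k
    kernelKron _ k = k

Something like a Householder step would then be written entirely in the
typed layer, and only when it proves too slow would one push a new
primitive down into the kernel.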
======
Last thing. It is easy to criticize Matlab by saying that its replacement might
be better. Often such statements come from people who don't actually use it.
Now.
Although I am a declared believer in the Glory of Haskell and the Salvation
of the Universe by Functional Paradigms, I have used quite a lot of integrated
scientific environments.
I had a look at Rlab, Yorick, Tela, and *of course* at Scilab, etc., and I must
say that they never manage to catch up with Matlab in all domains: the interfacing
and its open architecture; the plotting, which turns Matlab into a 3D design
system (something its designers never initially dreamt of); and the object-oriented
layers of programming, with overloadable operations.
Matlab is extensible as few other systems are.
I won't praise it here; they are degenerating as well: version 6 is
a memory hog, slower than version 5 (on my machine), and its super-goodies
are sometimes too baroque.
As a programming language it is worse than Fortran (save for vectorized
arithmetic). So, linguistically, a functional scientific programming tool
would be really very nice. But performance is another issue.
Jerzy Karczmarczuk
Caen, France.
===
PS. Is it good English: "save for vectorized arithmetic"? It looks like
a French calque, but I found this "save" in Tolkien as well.
But I am not Tolkien...