[Haskell-cafe] matrix computations based on the GSL
Jacques Carette
carette at mcmaster.ca
Wed Jun 29 19:42:40 EDT 2005
karczma at info.unicaen.fr wrote:
> One of the things I appreciate and I hate simultaneously in your postings
> is that you are so categorical.
'tis indeed simultaneously one of my strengths and one of my weaknesses
;-) I also like to play Devil's Advocate, to draw out the interesting
arguments. Luckily for me (but not my readers), I readily admit when
I'm wrong...
> This time you will *not* convince me that there is "one concept:
> multiplication", moreover "abstracted over unimportant details".
> If matrices represent operators, their multiplication is a *group*
> operation, the op. composition. Acting of a matrix on a vector is not.
> "Multiplication" of two vectors giving a scalar (their contration) is
> yet another beast.
They are all operations that have signatures a -> b -> c where either
a=b or b=c, is that enough structure? ;-)
More seriously, what about
forall A. 0.A = 0 [0 matrix, 0 matrix]
forall x. 0.x = 0 [0 matrix, 0 vector]
forall x. 0.x = 0 [0 vector, 0 scalar]
where x is a vector and A a matrix. Also there is something strangely
similar in
forall x y. <A.x, B^T.y> = <B.A.x,y> [matrix, matrix]
forall x y. <I.x, B^T.y> = <B.x,y> [matrix, vector]
forall x y. <I.x, I.y> = <x,y> [vector, vector]
where x,y are vectors, A,B matrices. There is some structure relating to
the identity matrix too, but it is a little more contrived.
That's a lot of similar structure for 3 ``unrelated'' operations, isn't
it? However, you might still be able to convince me that I was
stretching this a bit too far on that particular point.
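(As a concrete sketch of that ``a -> b -> c'' reading: in Haskell one could
overload a single multiplication across all three operations with a
multi-parameter type class. The class name Mul, the operator (.*), and the
naive list representations below are my own illustrative choices, not from
any existing library.)

```haskell
{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies, FlexibleInstances #-}
-- Illustrative sketch only: Mul, (.*), and the list-based Matrix/Vector
-- representations are invented for this example.
import Data.List (transpose)

type Vector = [Double]
type Matrix = [[Double]]

-- One overloaded "multiplication" with signature a -> b -> c; the
-- functional dependency says the argument types determine the result type.
class Mul a b c | a b -> c where
  (.*) :: a -> b -> c

-- matrix . matrix = matrix (composition of operators)
instance Mul Matrix Matrix Matrix where
  m .* n = [ [ sum (zipWith (*) row col) | col <- transpose n ] | row <- m ]

-- matrix . vector = vector (action of an operator on a vector)
instance Mul Matrix Vector Vector where
  m .* v = [ sum (zipWith (*) row v) | row <- m ]

-- vector . vector = scalar (contraction / inner product)
instance Mul Vector Vector Double where
  v .* w = sum (zipWith (*) v w)
```

With this in place all three operations share one name, and identities
like <A.x, B^T.y> = <(B.A).x, y> can be checked on concrete data, e.g.
(a .* x) .* (transpose b .* y) == ((b .* a) .* x) .* y.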
> I believe that some progress has been done in math,
> when people discovered that mixing-up things is not necessarily a good
> thing, and different entities should be treated differently.
I agree. I certainly like going back to Newton to see that he made a
difference between derivatives and fluxions (i.e. static vs. dynamic
derivatives), and to Grassmann for the difference between different
kinds of vectors (and linear operators, and ...). Cauchy also knew some
fascinating things about solutions to real linear ODEs [if you ``line
up'' the singularities of the coefficients of an ODE with the ODE's own
singularities, you can get more solutions than the order of the ODE --
see his 1821 Ecole Polytechnique lecture notes] that are not included
in most theorems about ODEs, because few appreciate the real
qualitative ``difference'' it makes to allow functions with
singularities in the coefficients of an ODE.
But just as much progress has been made when ``different'' things were
found to have a lot of similar structure. Or at least that is the main
lesson I draw from category theory. I draw similar lessons from Euler's
total disregard for convergence (with Tauberian theorems and the work of
Ecalle justifying him).
I like to find whatever scraps of underlying structure are present
between disparate looking concepts, just as much as I like seeing subtle
differences between concepts that had not been noticed before.
Jacques