# [Haskell-cafe] matrix computations based on the GSL

Henning Thielemann lemming at henning-thielemann.de
Fri Jul 8 12:18:34 EDT 2005

```
On Fri, 8 Jul 2005, David Roundy wrote:

> I don't particularly care what API you use for svd, since it's trivial to
> convert from one API to the other.  It's matrix arithmetic I care about,
> since that's the complicated part of the API.

Of course I want to use the results of more complicated routines with
basic matrix arithmetic, and I'd like to reduce the number of conversions.
The other reason for the debate is that if you don't like the extra vector
type, you will not use it. If I want to apply one of your routines to,
say, my audio data, I will have to decide whether to store it as a column
or as a row vector/matrix, although this decision seems to me rather
irrelevant and annoying.

> On the other hand, the most natural return value for svd would be a
> diagonal matrix, since that is what the objects are, right?

Hm, since SVD means Singular Value Decomposition, I'd like to have the
singular values as they are. I don't want to have to search for them in a
sparse matrix.

> svd returns three matrices, which when multiplied together give the
> original matrix ...

This would be a nice property, though. I could achieve it by converting
the list of singular values to a diagonal matrix. So I need a conversion, hm.
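A minimal sketch of such a conversion, using plain lists of lists as a
stand-in for the library's matrix type (diag, matMul and the list
representation are assumptions for illustration, not the actual API):

import Data.List (transpose)

type Mat = [[Double]]

-- build a diagonal matrix from the list of singular values
diag :: [Double] -> Mat
diag xs = [ [ if i == j then x else 0 | j <- [0 .. length xs - 1] ]
          | (i, x) <- zip [0 ..] xs ]

matMul :: Mat -> Mat -> Mat
matMul a b = [ [ sum (zipWith (*) row col) | col <- transpose b ]
             | row <- a ]

-- reconstruction:  a == u `matMul` diag s `matMul` transpose v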

> or at least that's how I think of it.  But I'll grant that a diagonal
> matrix isn't the most convenient representation, and certainly is far
> from the most efficient, unless we introduce a diagonal matrix
> constructor (which would certainly be nice).

I would not like to obtain a value of the general Matrix type, since then
it is not statically guaranteed that it is always diagonal.

>  I guess you'd prefer that svd returns a list of doubles and two lists
> of vectors? Or a list of triplets of a double and two vectors?

Either there is a function to scale the columns of a matrix separately,
in which case two matrices and a list/vector of doubles are ok. Or there
is a function to multiply two vectors into a matrix of rank 1 (outer
product or tensor product might be the right name; by the way, the (<>)
operator has the problem that it is always interpreted as a scalar
product, but it could also mean this kind of multiplication). Then adding
such products of left and right singular vectors, scaled by the
corresponding singular values, lets me reconstruct the matrix.
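This rank-1 reconstruction can be sketched directly (again with lists of
lists standing in for real matrix and vector types; outer and reconstruct
are illustrative names, not proposed API):

outer :: [Double] -> [Double] -> [[Double]]
outer u v = [ [ x * y | y <- v ] | x <- u ]

-- sum the sigma-scaled outer products of the (sigma, u, v) triplets
reconstruct :: [(Double, [Double], [Double])] -> [[Double]]
reconstruct ts =
  foldr1 (zipWith (zipWith (+)))
         [ map (map (s *)) (outer u v) | (s, u, v) <- ts ]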
The triplet approach has the advantage that it is statically guaranteed
that the number of singular values and singular vectors match. The matrix
approach has the advantage that it is statically guaranteed that the
dimensions of all singular vectors match.
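The trade-off can be seen in the two candidate type signatures (the
names and the list-based types are placeholders for illustration, not
the library's actual interface):

-- triplets: the number of values and vectors matches by construction
svdTriplets :: [[Double]] -> [(Double, [Double], [Double])]
svdTriplets = undefined

-- factored form: all singular vectors share their dimension by construction
svdFactored :: [[Double]] -> ([[Double]], [Double], [[Double]])
svdFactored = undefined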
With the (orthogonal) singular vector matrices we map vectors from one
basis to their representation in another basis. So these matrices
naturally represent linear maps, and it makes sense to use them.

```