[Haskell-cafe] Neural Nets

Marco Righele marco_righele at yahoo.it
Sat May 22 21:39:32 EDT 2004


schrum2 at southwestern.edu wrote:

>Hello all. I was wondering if anyone knows of ways to implement neural nets in 
>Haskell, and if there are any sites or projects that you could direct me to. 
>I'm specifically interested in how to represent the neural net in Haskell, and 
>then how to represent the neural net as genes in a genetic algorithm.
Hi,
It mostly depends on which kind of nets you want to work with.
If feed-forward neural nets are enough for you, the best solution is 
probably to layer them and use vectors to represent activation values 
and matrices to represent the weights between neurons. The result will 
be both elegant and efficient.
For example, suppose we have some Vector and Matrix data types (there 
should be some implementations around) with the common operations defined.
(What follows is not exactly Haskell, as I'm not going to write down 
everything, but it should give you the idea):
mmulv :: Matrix a -> Vector a -> Vector a
mmulm :: Matrix a -> Matrix a -> Matrix a
transpose :: Matrix a -> Matrix a
mmap :: Matrix a -> (a -> b) -> Matrix b
instance Functor Matrix where ...
etc.
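
To make this concrete, here is one possible minimal implementation of 
such a library, purely my own illustrative choice (a real one would more 
likely use packed arrays for efficiency), representing a Vector as a 
list and a Matrix as a list of rows:

import qualified Data.List as L

newtype Vector a = Vector [a]   deriving Show
newtype Matrix a = Matrix [[a]] deriving Show   -- stored as a list of rows

instance Functor Vector where
  fmap f (Vector xs) = Vector (map f xs)

instance Functor Matrix where
  fmap f (Matrix rs) = Matrix (map (map f) rs)

-- matrix-vector product: one dot product per row
mmulv :: Num a => Matrix a -> Vector a -> Vector a
mmulv (Matrix rs) (Vector xs) = Vector [sum (zipWith (*) r xs) | r <- rs]

-- matrix-matrix product, via the transposed right operand
mmulm :: Num a => Matrix a -> Matrix a -> Matrix a
mmulm (Matrix a) (Matrix b) =
  Matrix [[sum (zipWith (*) r c) | c <- L.transpose b] | r <- a]

transpose :: Matrix a -> Matrix a
transpose (Matrix rs) = Matrix (L.transpose rs)

mmap :: Matrix a -> (a -> b) -> Matrix b
mmap m f = fmap f m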

For example, let's suppose we have a network with two layers: the input 
one with n neurons and the output one with m. All the weights can then 
be represented by an m x n matrix. In mathematical terms, the output of 
the net will be o = net(W * net(i)), where net is the activation function.
The following bit of code evaluates the output of the net given the 
input value:
eval :: Matrix Double -> Vector Double -> Vector Double
eval w i =
  let o  = fmap net i     -- activations of the input layer
      i' = w `mmulv` o    -- weighted sums reaching the output layer
  in fmap net i'
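
With the list-based sketch above and tanh as the activation, a toy 
2-input, 1-output net could be run like this (the weight values here 
are arbitrary, just for illustration):

net :: Double -> Double
net = tanh

-- one output neuron with weights for two inputs (a 1x2 matrix)
w :: Matrix Double
w = Matrix [[0.5, -0.3]]

main :: IO ()
main = print (eval w (Vector [1.0, 0.0]))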

If we want to be a bit more general, we can represent a network in the 
following way:

type ANN = ([Double -> Double], [Double -> Double], [Matrix Double])

where the first element of the tuple is the list of activation 
functions and the second is the list of their derivatives; both lists 
have one element more than the weights list (because we have one more 
layer than weight matrices).

evalAnn :: ANN -> Vector Double -> Vector Double
evalAnn (funs, dfuns, ws) = evalAnn' funs ws

evalAnn' :: [Double -> Double] -> [Matrix Double] -> Vector Double -> Vector Double
evalAnn' [f]    []     v = fmap f v   -- last layer: just apply its activation
evalAnn' (f:fs) (w:ws) v = evalAnn' fs ws (w `mmulv` fmap f v)
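
For instance (my own illustrative choice of shapes and activations): a 
2-4-1 network has three layers, hence three activation functions and 
their derivatives, but only two weight matrices:

-- hypothetical 2-4-1 network: identity on the input layer, tanh elsewhere
exampleNet :: ANN
exampleNet = ( [id, tanh, tanh]
             , [const 1, dtanh, dtanh]
             , [w1, w2] )
  where
    dtanh x = 1 - tanh x ^ 2
    w1 = Matrix (replicate 4 [0.1, -0.1])  -- 4x2: input -> hidden
    w2 = Matrix [replicate 4 0.25]         -- 1x4: hidden -> output

output :: Vector Double
output = evalAnn exampleNet (Vector [1.0, 0.0])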

You can get the training function in a similar way.
What you need is just a "train" function that takes the training 
examples and a partially trained network, and returns an adjusted network:

type Example = (Vector Double, Vector Double)
train :: [Example] -> ANN -> ANN
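
As a sketch of just the simplest case (my own, with an assumed learning 
rate eta and an assumed `outer` product helper): one gradient-descent 
step for the single-matrix net used by `eval` above. For the full ANN 
type, backpropagation would push a similar "delta" backwards through 
the weight list.

-- outer product of two vectors, giving a matrix of the weights' shape
outer :: Num a => Vector a -> Vector a -> Matrix a
outer (Vector u) (Vector v) = Matrix [[x * y | y <- v] | x <- u]

-- one delta-rule step on a single example (x, t) with learning rate eta;
-- net' is the derivative of the activation, e.g. net' x = 1 - tanh x ^ 2
step :: Double -> Example -> Matrix Double -> Matrix Double
step eta (x, t) w =
  let Vector hs = fmap net x               -- input-layer activations
      Vector zs = w `mmulv` Vector hs      -- weighted sums
      Vector ts = t
      delta     = Vector [ (net z - tv) * net' z | (z, tv) <- zip zs ts ]
      Matrix gs = outer delta (Vector hs)  -- gradient of the squared error
      Matrix rs = w
  in Matrix (zipWith (zipWith (\wij g -> wij - eta * g)) rs gs)

-- folding the step over all examples gives one (simplified) train
trainSimple :: [Example] -> Matrix Double -> Matrix Double
trainSimple exs w = foldl (flip (step 0.1)) w exs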

Iterating this function, you can get the list of all the "states" of the 
network during training and decide when to stop in any way you want 
(for example, by validating on a separate set).
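
Concretely, `iterate` already gives you that lazy list of states, and 
stopping becomes a simple list search. The scoring function and the 
threshold below are just one hypothetical choice, and this assumes a 
`train` with the type above:

-- sum of squared errors of the net over a held-out set
validationError :: [Example] -> ANN -> Double
validationError exs ann =
  sum [ err (evalAnn ann x) t | (x, t) <- exs ]
  where
    err (Vector os) (Vector ts) = sum (zipWith (\o t' -> (o - t') ^ 2) os ts)

-- train until the validation error falls below a (hypothetical) threshold
trained :: [Example] -> [Example] -> ANN -> ANN
trained trainSet validSet net0 =
  head (dropWhile ((> 0.01) . validationError validSet)
                  (iterate (train trainSet) net0))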

A final warning: I wrote all the code "on the fly", so expect it to be 
broken and far from working.
Hope it helps anyway.

Marco.


