[Haskell-cafe] Backpropagation implementation for a neural net library

Trin Trin trin.cz at gmail.com
Mon Jun 15 11:04:20 EDT 2009


Hi Alp,
- Even with a correctly programmed back-propagation, it is often hard to
make the net converge.
- When working with back-propagation, you usually initialize the neuron
weights with small random values (see the sketch below).
- Print the net error while training, so you can see whether it is
actually decreasing (also in the sketch below).
- The XOR function cannot be learned by a single-layer neural net; you
need at least one hidden layer!
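
Here is a minimal sketch of the second and third points. It assumes a net
represented as nested weight lists, which is a guess on my part; randomNet,
trainEpoch and netError are hypothetical names, so adapt them to whatever
net.hs actually defines:

import System.Random (mkStdGen, randomRs)

-- Hypothetical representation (probably not exactly yours): a net is a
-- list of layers, a layer a list of neurons, a neuron its incoming weights.
type Net = [[[Double]]]

-- Start from small random weights, here uniform in [-0.5, 0.5].  With
-- constant initial weights, all neurons in a layer stay identical forever.
-- Sizes [2, 2, 1] give a 2-input, one-hidden-layer net suitable for xor.
randomNet :: Int -> [Int] -> Net
randomNet seed sizes = go ws0 (zip sizes (tail sizes))
  where
    ws0 = randomRs (-0.5, 0.5) (mkStdGen seed)
    go _  []                     = []
    go ws ((nIn, nOut) : shapes) = layer : go ws' shapes
      where
        (layer, ws') = neurons nOut ws
        neurons 0 rest = ([], rest)
        neurons k rest =
          let (w,  rest')  = splitAt (nIn + 1) rest  -- +1 for the bias weight
              (ns, rest'') = neurons (k - 1) rest'
          in  (w : ns, rest'')

-- Placeholders so this compiles; replace with the real net.hs functions.
trainEpoch :: Net -> Net
trainEpoch = id          -- one back-propagation pass over the training set
netError :: Net -> Double
netError _ = 1.0         -- total error over the training set

-- Run a bounded number of epochs and print the error after each one, so
-- you can see whether training is actually making progress (and so a net
-- that never reaches the error threshold cannot loop forever).
trainVerbose :: Int -> Net -> IO Net
trainVerbose 0 net = return net
trainVerbose n net = do
  let net' = trainEpoch net
  putStrLn ("error: " ++ show (netError net'))
  trainVerbose (n - 1) net'

main :: IO ()
main = do
  _ <- trainVerbose 100 (randomNet 42 [2, 2, 1])
  return ()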
Cheers,
Martin
PS: I did not check the back-propagation algorithm itself.
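For reference, though, these are the textbook rules for a net with sigmoid
activations and squared error, against which the net.hs code can be checked
(notation: o_j is neuron j's output, t_j its target, eta the learning rate):

  output-layer delta:  delta_j = o_j * (1 - o_j) * (t_j - o_j)
  hidden-layer delta:  delta_j = o_j * (1 - o_j) * sum_k (delta_k * w_jk)
  weight update:       w_ij <- w_ij + eta * delta_j * o_i

where the sum runs over the neurons k that neuron j feeds into, and o_i is
the input carried by weight w_ij.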


On Mon, Jun 15, 2009 at 9:58 AM, Alp Mestan <alp at mestan.fr> wrote:

> Dear List,
>
> I'm working with a friend of mine on a Neural Net library in Haskell.
>
> There are three files: neuron.hs, layer.hs and net.hs.
> neuron.hs defines the Neuron data type and many utility functions, all of
> which have been tested and work well.
> layer.hs defines layer-level functions (computing the output of a whole
> layer of neurons, etc.). Tested and working.
> net.hs defines net-level functions (computing the output of a whole neural
> net) and the famous -- but annoying -- back-propagation algorithm.
>
> You can find them here: http://mestan.fr/haskell/nn/html/
>
> The problem is that when I ask for final_net or test_output (anything
> after the train call, in net.hs), it seems to loop forever, as if the
> error never gets below 0.1.
>
> So I was just wondering whether there were any neural-net and Haskell
> wizards around who could check the back-propagation implementation, given
> in net.hs, which seems to be wrong.
>
> Thanks a lot!
>
> --
> Alp Mestan
>