[Haskell-cafe] Re: Haskell idioms for hierarchical memory modeling ?

Dmitri O.Kondratiev dokondr at gmail.com
Mon May 26 14:41:48 EDT 2008

On Mon May 26 09:02:01 EDT 2008
Marc Weber [marco-oweber at gmx.de] wrote:

>Searching haskell.org only gives one match:
>(I haven't read it)
>is the other most commonly used source of finding already existing code
>(There is package about nn)
>Should your nn also support kind of learning by giving feedback?

Marc, thanks for explaining how to use State and the links!
As for your question about my NN learning, I need to explain a bit more
about how it all works.
My classifier NN (Memory Tree - MT - as I call it), that I previously
described may work in two modes:

1) Unsupervised learning. In this mode each node learns new categories when
it encounters input vectors that the classifier has not seen before. These
input vectors become new categories and get stored in the node's category
memory (CM). When MT starts in this mode, all nodes have empty CMs. The size
of the CM defines the number of categories that a node may learn.
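To make the unsupervised mode concrete, here is a minimal sketch in Haskell.
All names (Node, observe, cmCapacity) are my own illustration, not part of
any described implementation; it assumes exact vector equality decides
whether an input is "new", and a full CM simply stops learning.

```haskell
type InputVector = [Double]

-- A node with a bounded category memory (CM); the capacity is the
-- number of categories the node may ever learn (hypothetical names).
data Node = Node
  { cmCapacity :: Int            -- size of the CM
  , categories :: [InputVector]  -- learned categories, oldest first
  } deriving Show

-- A node starts with an empty CM, as in the unsupervised mode.
emptyNode :: Int -> Node
emptyNode cap = Node cap []

-- Present an input vector to a node. A vector already in the CM yields
-- its category index; an unseen vector becomes a new category if the
-- CM still has room, otherwise the node cannot learn it.
observe :: InputVector -> Node -> (Maybe Int, Node)
observe v node@(Node cap cm) =
  case lookup v (zip cm [0 ..]) of
    Just i -> (Just i, node)                            -- known category
    Nothing
      | length cm < cap -> (Just (length cm), Node cap (cm ++ [v]))
      | otherwise       -> (Nothing, node)              -- CM is full
```

For example, `observe [1,0] (emptyNode 2)` stores `[1,0]` as category 0,
and presenting `[1,0]` again to the resulting node returns `Just 0`
without changing the CM.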

2) Supervised learning. In this mode MT also has a category node. The
category node feeds category numbers to the top-level node. Learning is done
with a training data set prepared in advance. Training data consists of
pairs (input vector, category number). Bottom nodes read the input vector
from this pair while the category node simultaneously sends the
corresponding category number to the top-level node. Thus when the input
reaches the top node, it already knows what category this input belongs to.
After MT consumes all training data, it is believed to be fully learned and
ready to infer categories from 'work' data.
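The supervised mode can be sketched similarly. This is my own illustrative
reading, not the actual MT code: the top node simply records each (input
vector, category number) pairing announced by the category node, and
inference on 'work' data is reduced to an exact lookup (a real MT would
presumably match by similarity rather than equality).

```haskell
type InputVector = [Double]
type Category    = Int

-- The top node's memory: input vectors paired with the category number
-- the category node supplied during training (hypothetical representation).
type TopMemory = [(InputVector, Category)]

-- Consume one training pair; an input seen before is not stored again.
train :: TopMemory -> (InputVector, Category) -> TopMemory
train mem pair@(v, _)
  | any ((== v) . fst) mem = mem
  | otherwise              = mem ++ [pair]

-- Fully train on a prepared data set by folding over all pairs.
trainAll :: [(InputVector, Category)] -> TopMemory
trainAll = foldl train []

-- After training, infer a category for 'work' data.
infer :: TopMemory -> InputVector -> Maybe Category
infer mem v = lookup v mem
```

With `mem = trainAll [([1,0], 7), ([0,1], 3)]`, the call `infer mem [1,0]`
returns `Just 7`, while an input absent from training yields `Nothing`.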

Hope this answers your question about learning.

Dmitri O. Kondratiev
dokondr at gmail.com

More information about the Haskell-Cafe mailing list