<br><div><span class="gmail_quote">On 9/26/05, <b class="gmail_sendername">Joel Reymont</b> <<a href="mailto:firstname.lastname@example.org">email@example.com</a>> wrote:</span><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
Folks,<br><br>I got a project where I have a large number of variables and an<br>outcome and I need to figure out which 20% of the variables have the<br>largest effect on the outcome. Of course I also need to optimize the<br>
20% of variables I end up with.<br><br>This sounds like a job for a neural network to me, with arguments<br>possibly optimized through genetic algorithms. I'm wondering, though,<br>if I'm complicating things for myself and there's an easier approach
<br>to this. If not, I'm wondering if there are ready-made NN or GA<br>libraries for Haskell.<br><br>Also, I'm curious if Haskell is really the best language for this<br>type of problem and if lazy evaluation brings any specific advantages
<br>to the solution or would be a hindrance.</blockquote><div><br>
Check this paper - it seems they solved a similar problem with a hill-climbing approach.<br>
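To make the hill-climbing idea concrete, here is a minimal sketch of greedy feature-subset selection in plain Haskell. The `score` function below is a made-up stand-in (a toy where only the first three variables matter, with a size penalty); in your project it would be whatever fitness your model assigns to a variable subset.

```haskell
-- Greedy hill-climbing over variable subsets: repeatedly flip the one
-- inclusion bit that most improves the score, until no flip helps.
import Data.List (maximumBy)
import Data.Ord (comparing)

type Subset = [Bool]  -- True = variable included

-- Toy stand-in score (assumption, not your real objective):
-- only the first three variables carry weight, and every included
-- variable costs 0.3, which pushes the search toward small subsets.
score :: Subset -> Double
score s = relevant - 0.3 * size
  where
    relevant = sum [w | (w, True) <- zip weights s]
    size     = fromIntegral (length (filter id s))
    weights  = [5, 4, 3, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]

-- All one-flip neighbours of a subset.
neighbours :: Subset -> [Subset]
neighbours s = [flipAt i | i <- [0 .. length s - 1]]
  where flipAt i = take i s ++ [not (s !! i)] ++ drop (i + 1) s

-- Climb until no single flip improves the score (a local optimum).
hillClimb :: Subset -> Subset
hillClimb s
  | score best > score s = hillClimb best
  | otherwise            = s
  where best = maximumBy (comparing score) (neighbours s)

main :: IO ()
main = print (hillClimb (replicate 10 False))
  -- selects exactly the three high-weight variables
```

Laziness neither helps nor hurts much here; the real choices are the scoring function and how you escape local optima (random restarts, or the GA you mention).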
</div><br><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">I would welcome any pointers and feedback, yes, someone is actually<br>paying me to do this :-).
<br><br> Thanks, Joel<br><br><br>