[Haskell-cafe] How to make way into a hadoop infrastructure
C K Kashyap
ckkashyap at gmail.com
Fri Apr 30 21:28:20 EDT 2010
Thanks Mike. I'll look at the pointers you've given.
On Fri, Apr 30, 2010 at 8:01 PM, Mike Dillon <mike at embody.org> wrote:
> I think you'll want to look at the Hadoop Streaming or Hadoop Pipes API.
> Further down the line, I think somebody will want to implement a Haskell
> library to deal with the Avro serialization protocol when it becomes
> possible to write non-JVM mappers and reducers directly. There is a JIRA
> issue covering the RPC part of the Avro-Hadoop integration work.
> Looks like folks have already implemented support for Thrift and
> Protocol Buffers, so implementing a library for Avro would likely be
> pretty similar.
> begin C K Kashyap quotation:
> > Dear Haskellers,
> > A big part of my new job requires tuning apps on Hadoop. I was wondering
> > if there is a way to push some Haskell code into the mix. I did some
> > googling for "Hadoop/Haskell" and came across Holumbus - but it looks
> > like that is an alternative to Hadoop.
> > I was thinking along the lines of doing a Haskell implementation that
> > could run in a Hadoop cluster - has anyone tried anything like that?
> > --
> > Regards,
> > Kashyap
> > _______________________________________________
> > Haskell-Cafe mailing list
> > Haskell-Cafe at haskell.org
> > http://www.haskell.org/mailman/listinfo/haskell-cafe
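For reference, the Streaming approach Mike mentions treats the mapper and reducer as plain executables that read lines on stdin and write tab-separated key/value lines on stdout, so a compiled Haskell program can slot in directly. A minimal word-count sketch (the function names here are illustrative, not part of any Hadoop library):

```haskell
-- Sketch of a Hadoop Streaming mapper/reducer pair in Haskell.
module Main where

import Data.Function (on)
import Data.List (groupBy)

-- Mapper side: emit "word\t1" for every word in the input.
mapper :: String -> [String]
mapper input = [ w ++ "\t1" | w <- words input ]

-- Reducer side: Hadoop Streaming delivers mapper output sorted by key,
-- so equal keys arrive on adjacent lines and can be summed with groupBy.
reducer :: [String] -> [String]
reducer ls =
  [ key ++ "\t" ++ show (sum counts)
  | grp <- groupBy ((==) `on` fst) (map parse ls)
  , let key    = fst (head grp)
        counts = map snd grp ]
  where
    parse l = let (k, v) = break (== '\t') l
              in (k, read (drop 1 v) :: Int)

main :: IO ()
main = interact (unlines . mapper)
-- For the reduce binary, use instead:
--   main = interact (unlines . reducer . lines)
```

Compiled into two binaries, these would be handed to the streaming jar with -mapper and -reducer (and shipped to the cluster with -file); the sort-by-key that Hadoop performs between the two phases is what the groupBy in the reducer relies on.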