[Haskell-cafe] How to make way into a hadoop infrastructure
mike at embody.org
Fri Apr 30 10:31:25 EDT 2010
I think you'll want to look at the Hadoop Streaming or Hadoop Pipes API.
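With Hadoop Streaming the mapper and reducer are just executables that read stdin and write tab-separated key/value lines to stdout, so a Haskell program slots in directly. Here is a minimal word-count sketch (the `map`/`reduce` argument names and the program name are my own choices, not anything Hadoop requires):

```haskell
-- Minimal word-count mapper/reducer pair for Hadoop Streaming.
-- Streaming talks to the executables over stdin/stdout: the mapper
-- emits "key\tvalue" lines, and the reducer receives the mapper
-- output grouped (sorted) by key.
import Data.List (foldl')
import qualified Data.Map.Strict as Map
import System.Environment (getArgs)

-- Mapper: one "word\t1" line per word of input.
mapLine :: String -> [String]
mapLine line = [ w ++ "\t1" | w <- words line ]

-- Reducer: sum the counts for each key. Hadoop delivers the lines
-- sorted by key, but folding into a Map also handles unsorted input
-- and keeps the example short.
reduceLines :: [String] -> [String]
reduceLines ls =
  [ k ++ "\t" ++ show n
  | (k, n) <- Map.toAscList (foldl' step Map.empty ls) ]
  where
    step m l = case break (== '\t') l of
      (k, '\t' : v) -> Map.insertWith (+) k (read v :: Int) m
      _             -> m

main :: IO ()
main = do
  args <- getArgs
  body <- getContents
  case args of
    ["map"]    -> mapM_ putStrLn (concatMap mapLine (lines body))
    ["reduce"] -> mapM_ putStrLn (reduceLines (lines body))
    _          -> putStrLn "usage: wordcount (map|reduce)"
```

You would compile this with GHC, ship the binary to the cluster, and point Streaming at it with something like `hadoop jar hadoop-streaming.jar -input in -output out -mapper 'wordcount map' -reducer 'wordcount reduce' -file wordcount` (paths and jar name will vary with your installation).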
Further down the line, I think somebody will want to implement a Haskell
library to deal with the Avro serialization protocol when it becomes
possible to write non-JVM mappers and reducers directly. This JIRA issue
covers the RPC part of the Avro-Hadoop integration work:
Looks like folks have already implemented support for Thrift and
Protocol Buffers, so implementing a library for Avro would likely be
feasible.

C K Kashyap wrote:
> Dear Haskellers,
> A big part of my new job requires tuning apps on Hadoop. I was wondering if
> there is a way to push some Haskell code into the mix. I did some googling on
> "Hadoop/Haskell" and came across Holumbus - but it looks like that is a
> parallel effort to Hadoop rather than something that runs on it.
> I was thinking along the lines of a Haskell implementation that could run
> in a Hadoop cluster - has anyone tried anything like that?