[Haskell-cafe] Accelerating Automatic Differentiation
cb307 at st-andrews.ac.uk
Sat Mar 24 17:20:03 UTC 2018
Thanks for the response, Michal.
Yes, this did cross my mind - and I wouldn't expect to outperform those frameworks in the timeframe available! I assumed the project was suggested because perhaps:
a) there is some intrinsic value in implementing these algorithms natively in Haskell (presumably why the 'ad' library was developed in the first place), so that those who want parallel automatic differentiation, or the machine learning algorithms built on top of it, can have them without leaving the Haskell ecosystem,
and b) the challenges of implementing parallel AD in a purely functional language are a little different from those in OO/imperative languages - so it might be interesting from that angle as well?
So perhaps my aim would not be to do something unique, but rather to do something that has already been done well in other languages but has not yet been provided as a Haskell library. Does this sound like a reasonable approach, or do I need to find a slightly more unique angle?
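As a concrete illustration of point (a): forward-mode AD falls out of Haskell's typeclasses almost for free. A dual-number type with overloaded arithmetic differentiates any function written polymorphically over Floating. This is only a minimal toy sketch - the `Dual` type and `diff` below are my own names for exposition, not the 'ad' library's actual API or implementation:

```haskell
-- Toy forward-mode AD via dual numbers: each value carries its
-- derivative alongside it, and the numeric instances propagate
-- derivatives by the chain rule.
data Dual = Dual { primal :: Double, tangent :: Double }

instance Num Dual where
  Dual a a' + Dual b b' = Dual (a + b) (a' + b')
  Dual a a' - Dual b b' = Dual (a - b) (a' - b')
  Dual a a' * Dual b b' = Dual (a * b) (a' * b + a * b')  -- product rule
  negate (Dual a a')    = Dual (negate a) (negate a')
  abs    (Dual a a')    = Dual (abs a) (a' * signum a)
  signum (Dual a _)     = Dual (signum a) 0
  fromInteger n         = Dual (fromInteger n) 0

instance Fractional Dual where
  recip (Dual a a') = Dual (recip a) (negate a' / (a * a))
  fromRational r    = Dual (fromRational r) 0

instance Floating Dual where
  pi                = Dual pi 0
  exp  (Dual a a')  = Dual (exp a) (a' * exp a)
  log  (Dual a a')  = Dual (log a) (a' / a)
  sqrt (Dual a a')  = Dual (sqrt a) (a' / (2 * sqrt a))
  sin  (Dual a a')  = Dual (sin a) (a' * cos a)
  cos  (Dual a a')  = Dual (cos a) (negate (a' * sin a))
  -- remaining Floating methods omitted in this sketch
  sinh = undefined; cosh = undefined; asin = undefined; acos = undefined
  atan = undefined; asinh = undefined; acosh = undefined; atanh = undefined

-- Differentiate any function polymorphic over the numeric classes
-- by seeding the tangent with 1.
diff :: (Dual -> Dual) -> Double -> Double
diff f x = tangent (f (Dual x 1))

main :: IO ()
main = print (diff (\x -> x * x + sin x) 2.0)  -- 2*2 + cos 2
```

The real 'ad' library is far more general (reverse mode, gradients, Jacobians), but the point is the same: the host language's overloading does the bookkeeping, which is exactly the kind of thing that seems worth having natively in the Haskell ecosystem.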
From: Michal J Gajda <mgajda at mimuw.edu.pl>
Sent: 24 March 2018 16:56:35
To: Dominic Steinitz; Marco Zocca; accelerate-haskell at googlegroups.com; Charles Blake; haskell-cafe at haskell.org
Subject: Re: Accelerating Automatic Differentiation
It certainly looks like an exciting project, but the bar is currently set very high.
The TensorFlow package not only provides automatic differentiation for whole programs, but also optimizes data processing on the GPU and data loading, in order to achieve large batches.
This field sees a lot of hot development, so you would either need to propose something unique to Haskell, or you risk being outclassed by the PyTorch and TensorFlow bindings.
Maybe Dominic can suggest something too.