[Haskell-cafe] Accelerating Automatic Differentiation

Michal J Gajda mgajda at mimuw.edu.pl
Sun Mar 25 08:08:54 UTC 2018


Great! If they all agree, we can just add Mikhail, and we're done?
On Sun, 25 Mar 2018 at 15:17, <dominic at steinitz.org> wrote:

> Hi Michal,
>
> I didn’t volunteer to be a mentor for this. The project already lists:
>
> *Mentor*: Fritz Henglein, Gabriele Keller, Trevor McDonell, Edward Kmett,
> Sacha Sokoloski
>
>
> I doubt there is much I could add to such an illustrious list.
>
> Dominic Steinitz
> dominic at steinitz.org
> http://idontgetoutmuch.wordpress.com
> Twitter: @idontgetoutmuch
>
>
>
> On 25 Mar 2018, at 06:11, Michal J Gajda <mgajda at mimuw.edu.pl> wrote:
>
> Given that Marco did not confirm, I am just now confirming that we can get
> you mentored by Mikhail Baikov as the third mentor (besides Dominic and me).
> Both Mikhail and I (Michal) are performance optimization experts: my
> expertise is in parsers and data analytics, while Mikhail's is in real-time
> systems, and he has his own top-notch serialization library, Beamable, which
> outperforms Cereal in both data size and speed. Dominic is an expert in
> numerical computing (ODEs and Julia, among other things).
>
> I believe that with these three excellent mentors you have a very good
> chance of making an outstanding contribution.
> We will just make sure that you prepare your application by the deadline on
> the 27th.
> --
>   Cheers
>     Michal
>
> On Sun, Mar 25, 2018 at 6:32 AM Michal J Gajda <mgajda at mimuw.edu.pl>
> wrote:
>
>> As a mentor, I would say it is certainly possible to outperform existing
>> mega-solutions in some narrow domain,
>> just as I did with hPDB: https://hackage.haskell.org/package/hPDB
>> But it requires a lot of skill and patience.
>>
>> Please proceed with this project with the current list of mentors.
>> I think Dominic and I have already declared our commitment.
>>
>> You might also start by making a table of the best competing solutions in
>> other languages, their respective strengths, and ways that we could
>> possibly improve on them!
>>
>> Where do you keep your application draft? Ideally it should be a shared
>> space where you can add mentors as co-editors.
>> --
>>   Cheers
>>     Michal
>> On Sun, 25 Mar 2018 at 02:32, <dominic at steinitz.org> wrote:
>>
>>> The list of mentors for this project looks great to me. I am not sure I
>>> can add much, other than that I think this is a nice project. Perhaps it
>>> would be best to get the advice of some of the mentors?
>>>
>>> For some very simple tests with an ODE solver, I concluded that
>>> accelerate can perform at least as well as Julia. It would certainly be
>>> very helpful to be able to get Jacobians for ODE solving and for other
>>> applications.
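>>>
>>> For concreteness, a minimal sketch of what I mean, assuming the 'ad'
>>> package (jacobian comes from Numeric.AD), applied to a small ODE
>>> right-hand side - a toy Lotka-Volterra system with coefficients chosen
>>> purely for illustration:
>>>
>>>   import Numeric.AD (jacobian)
>>>
>>>   -- RHS of a toy Lotka-Volterra system, kept polymorphic in the
>>>   -- numeric type so that 'ad' can instantiate it with its AD types.
>>>   lotkaVolterra :: Num a => [a] -> [a]
>>>   lotkaVolterra [x, y] = [x * (2 - y), y * (x - 3)]
>>>   lotkaVolterra _      = error "expected two state variables"
>>>
>>>   -- Jacobian of the RHS at a given state, as needed by implicit
>>>   -- (stiff) ODE solvers.
>>>   jac :: [Double] -> [[Double]]
>>>   jac = jacobian lotkaVolterra
>>>
>>>   -- jac [1, 1] == [[1.0, -1.0], [1.0, -2.0]]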
>>>
>>> Dominic Steinitz
>>> dominic at steinitz.org
>>> http://idontgetoutmuch.wordpress.com
>>> Twitter: @idontgetoutmuch
>>>
>>>
>>>
>>> On 24 Mar 2018, at 17:20, Charles Blake <cb307 at st-andrews.ac.uk> wrote:
>>>
>>> Thanks for the response, Michal,
>>>
>>> Yes, this did cross my mind - and I wouldn't be expecting to outperform
>>> those frameworks in the timeframe available! I assumed that the reasons
>>> this project was suggested were perhaps:
>>>
>>> a) there is some intrinsic value in implementing these algorithms
>>> natively in Haskell (which is presumably why the 'ad' library was
>>> developed in the first place), so that those who want to use parallel
>>> automatic differentiation, or the machine learning algorithms built on
>>> top of it, can do so without leaving the Haskell ecosystem,
>>>
>>> and b) because the challenges involved in implementing parallel AD in a
>>> purely functional language are a little different to those involved in
>>> doing so in OO/imperative languages - so it might be interesting from
>>> that angle as well?
>>>
>>> So perhaps my aim would not be to do something unique, but rather to do
>>> something that has already been done well in other languages but has not
>>> yet been provided as a Haskell library. Does this sound like a reasonable
>>> approach, or do I need to find a slightly more unique angle?
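>>>
>>> As a toy illustration of the purely functional angle in b): a minimal
>>> forward-mode AD built from dual numbers. This is just a sketch of the
>>> idea, not how the 'ad' library actually implements things:
>>>
>>>   -- A value paired with its derivative; arithmetic propagates both.
>>>   data Dual = Dual { val :: Double, tangent :: Double }
>>>
>>>   instance Num Dual where
>>>     Dual x dx + Dual y dy = Dual (x + y) (dx + dy)
>>>     Dual x dx * Dual y dy = Dual (x * y) (x * dy + dx * y)  -- product rule
>>>     negate (Dual x dx)    = Dual (negate x) (negate dx)
>>>     fromInteger n         = Dual (fromInteger n) 0
>>>     abs    = error "not needed for this sketch"
>>>     signum = error "not needed for this sketch"
>>>
>>>   -- Derivative of f at x: seed the tangent with 1 and read it back out.
>>>   diff :: (Dual -> Dual) -> Double -> Double
>>>   diff f x = tangent (f (Dual x 1))
>>>
>>>   -- diff (\x -> x * x + 3 * x) 2 == 7.0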
>>>
>>> Thanks,
>>> Charlie
>>>
>>> ------------------------------
>>> *From:* Michal J Gajda <mgajda at mimuw.edu.pl>
>>> *Sent:* 24 March 2018 16:56:35
>>> *To:* Dominic Steinitz; Marco Zocca; accelerate-haskell at googlegroups.com;
>>> Charles Blake; haskell-cafe at haskell.org
>>> *Subject:* Re: Accelerating Automatic Differentiation
>>>
>>>
>>> Hi Charlie,
>>>
>>> It certainly looks like an exciting project, but the bar is currently
>>> set very high.
>>> The TensorFlow package not only provides automatic differentiation for
>>> whole programs, but also optimizes data processing, both on the GPU and
>>> when reading input, in order to achieve large batches.
>>> This field has a lot of hot developments, so you would either need to
>>> propose something unique to Haskell, or you risk being outclassed by the
>>> PyTorch and TensorFlow bindings.
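>>>
>>> For reference, here is the flavour of code such a project would target:
>>> the classic dot product in accelerate, roughly as in its documentation.
>>> This is only a sketch; an AD layer would be built on combinators like
>>> these.
>>>
>>>   import Data.Array.Accelerate as A
>>>   import Data.Array.Accelerate.Interpreter (run)
>>>
>>>   -- Dot product expressed in Accelerate's embedded array DSL; the same
>>>   -- program can also run on a GPU backend instead of the interpreter.
>>>   dotp :: Vector Float -> Vector Float -> Scalar Float
>>>   dotp xs ys = run $ A.fold (+) 0 (A.zipWith (*) (A.use xs) (A.use ys))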
>>> Maybe Dominic can suggest something too.
>>>  Cheers
>>>       Michal
>>>
>>>
>

