[Haskell-cafe] ANNOUNCE: fad 1.0 -- Forward Automatic Differentiation library
Bjorn Buckwalter
bjorn.buckwalter at gmail.com
Thu Apr 2 22:28:52 EDT 2009
I'm pleased to announce the initial release of the Haskell fad
library, developed by Barak A. Pearlmutter and Jeffrey Mark Siskind.
Fad provides Forward Automatic Differentiation (AD) for functions
polymorphic over instances of 'Num'. There have been many Haskell
implementations of forward AD, with varying levels of completeness,
published in papers and blog posts[1], but alarmingly few of these
have made it onto Hackage -- to date Conal Elliott's vector-space[2]
package is the only one I am aware of.
Fad is an attempt to make as comprehensive and usable a forward AD
package as is possible in Haskell. However, correctness is given
priority over ease of use, and this is in my opinion the defining
quality of fad. Specifically, Fad leverages Haskell's expressive
type system to tackle the problem of _perturbation confusion_,
brought to light in Pearlmutter and Siskind's 2005 paper "Perturbation
Confusion and Referential Transparency"[3]. Fad prevents perturbation
confusion by employing the type-level "branding" I proposed in a 2007
post to haskell-cafe[4]. To the best of our knowledge all
other forward AD implementations in Haskell are susceptible to
perturbation confusion.
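For readers unfamiliar with the trick, here is a minimal,
self-contained sketch of the branding idea (my own illustration, not
fad's actual implementation, whose types and instances are more
general):

    {-# LANGUAGE RankNTypes #-}

    -- Sketch only: a dual number carries a primal value and a
    -- perturbation, and the phantom 'tag' brands every perturbation
    -- introduced by a single invocation of 'diff'.
    data D tag a = D a a

    instance Num a => Num (D tag a) where
      D x x' + D y y' = D (x + y) (x' + y')
      D x x' - D y y' = D (x - y) (x' - y')
      D x x' * D y y' = D (x * y) (x * y' + x' * y)
      negate (D x x') = D (negate x) (negate x')
      abs    (D x x') = D (abs x)    (x' * signum x)
      signum (D x _ ) = D (signum x) 0
      fromInteger n   = D (fromInteger n) 0

    -- The rank-2 type forces the argument to be polymorphic in 'tag',
    -- so a perturbation can neither escape its own invocation of
    -- 'diff' nor be mixed with one from a nested invocation.
    diff :: Num a => (forall tag. D tag a -> D tag a) -> a -> a
    diff f x = let D _ x' = f (D x 1) in x'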
As this library has been in the works for quite some time it is
worth noting that it hasn't benefited from Conal's ground-breaking
work[5] in the area. Once we wrap our heads around his beautiful
constructs perhaps we'll be able to borrow some tricks from him.
As mentioned already, fad was developed primarily by Barak A.
Pearlmutter and Jeffrey Mark Siskind. My own contribution has been
providing Haskell infrastructure support and wrapping up loose ends
in order to get the library into a releasable state. Many thanks
to Barak and Jeffrey for permitting me to release fad under the BSD
license.
Fad resides on GitHub[6] and Hackage[7] and is only a "cabal install
fad" away! What follows is fad's README; refer to the haddocks for
detailed documentation.
Thanks,
Bjorn Buckwalter
[1] http://www.haskell.org/haskellwiki/Functional_differentiation
[2] http://www.haskell.org/haskellwiki/Vector-space
[3] http://www.bcl.hamilton.ie/~qobi/nesting/papers/ifl2005.pdf
[4] http://thread.gmane.org/gmane.comp.lang.haskell.cafe/22308/
[5] http://conal.net/papers/beautiful-differentiation/
[6] http://github.com/bjornbm/fad/
[7] http://hackage.haskell.org/cgi-bin/hackage-scripts/package/fad
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Copyright : 2008-2009, Barak A. Pearlmutter and Jeffrey Mark Siskind
License : BSD3
Maintainer : bjorn.buckwalter at gmail.com
Stability : experimental
Portability: GHC only?
Forward Automatic Differentiation via overloading to perform a
nonstandard interpretation that replaces the original numeric type
with a corresponding generalized dual number type.
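As a concrete illustration of the nonstandard interpretation (my own
worked example, using the same function as the Examples section
below), evaluate f x = 6 - 5*x + x^2 at the dual number 2 + eps,
where eps*eps = 0, using the overloaded arithmetic:

    6 - 5*(2 + eps) + (2 + eps)^2
      = 6 - 10 - 5*eps + 4 + 4*eps + 0
      = 0 - eps

The primal part recovers f(2) = 0 and the perturbation part recovers
f'(2) = -1.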
Each invocation of the differentiation function introduces a
distinct perturbation, which requires a distinct dual number type.
In order to prevent these from being confused, tagging, called
branding in the Haskell community, is used. This seems to prevent
perturbation confusion, although it would be nice to have an actual
proof of this. The technique does require adding invocations of
lift at appropriate places when nesting is present.
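As a hedged illustration of where lifting is needed (assuming 'diff'
and 'lift' are the operations exported by Numeric.FAD as described
above; their exact signatures may differ), here is the nested example
from the perturbation-confusion paper, in which the outer variable x
must be lifted before it can be combined with the inner variable y:

    import Numeric.FAD (diff, lift)

    -- d/dx [ x * (d/dy (x + y) at y = 1) ] at x = 1
    -- should be 1; an implementation suffering from perturbation
    -- confusion typically returns 2 instead.
    nested :: Double
    nested = diff (\x -> x * diff (\y -> lift x + y) 1) 1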
For more information on perturbation confusion and the solution
employed in this library see:
<http://www.bcl.hamilton.ie/~barak/papers/ifl2005.pdf>
<http://thread.gmane.org/gmane.comp.lang.haskell.cafe/22308/>
Installation
============
To install:
cabal install
Or:
runhaskell Setup.lhs configure
runhaskell Setup.lhs build
runhaskell Setup.lhs install
Examples
========
Define an example function 'f':
> import Numeric.FAD
> f x = 6 - 5 * x + x ^ 2 -- Our example function
Basic usage of the differentiation operator:
> y = f 2 -- f(2) = 0
> y' = diff f 2 -- First derivative f'(2) = -1
> y'' = diff (diff f) 2 -- Second derivative f''(2) = 2
List of derivatives:
> ys = take 3 $ diffs f 2 -- [0, -1, 2]
Example optimization method; find a zero using Newton's method:
> y_newton1 = zeroNewton f 0 -- converges to first zero at 2.0.
> y_newton2 = zeroNewton f 10 -- converges to second zero at 3.0.
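For reference, zeroNewton presumably performs the classic Newton
iteration x_{n+1} = x_n - f(x_n)/f'(x_n), with fad supplying the
derivative automatically. Here is a self-contained sketch of that
iteration with the derivative written out by hand (an illustration
only, not fad's actual implementation, which may differ in how it
obtains the derivative and when it stops):

    -- Hand-rolled Newton steps for comparison; fad's zeroNewton
    -- obtains f' via AD rather than taking it as an argument.
    newtonSteps :: Fractional a => (a -> a) -> (a -> a) -> a -> [a]
    newtonSteps f f' = iterate (\x -> x - f x / f' x)

    -- take 10 (newtonSteps (\x -> 6 - 5*x + x^2) (\x -> 2*x - 5) 0)
    -- approaches 2.0; starting from 10 it approaches 3.0.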
Credits
=======
Authors: Copyright 2008,
Barak A. Pearlmutter <barak at cs.nuim.ie> &
Jeffrey Mark Siskind <qobi at purdue.edu>
Work started as stripped-down version of higher-order tower code
published by Jerzy Karczmarczuk <jerzy.karczmarczuk at info.unicaen.fr>
which used a non-standard version of the standard Prelude.
Initial perturbation-confusing code is a modified version of
<http://cdsmith.wordpress.com/2007/11/29/some-playing-with-derivatives/>
Tag trick, called "branding" in the Haskell community, from
Bjorn Buckwalter <bjorn.buckwalter at gmail.com>
<http://thread.gmane.org/gmane.comp.lang.haskell.cafe/22308/>
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~