Fast, transparent first- and second-order automatic differentiation

Overview

The ad package allows you to easily and transparently perform first- and second-order automatic differentiation. Advanced mathematical functions (trigonometric, logarithmic, hyperbolic, etc.) can also be evaluated directly using the admath sub-module.

All base numeric types are supported (int, float, complex, etc.). The package is designed so that the underlying numeric types interact with each other as they normally do in any calculation. Thus, this package acts more like a “wrapper” that simply keeps track of derivatives while preserving the original behavior of the numeric calculations.
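
For example, here is a minimal sketch (the ad(...) reprs follow the examples later in this page) of an AD object mixing with a plain int exactly as a float would, while the derivative is tracked behind the scenes:

>>> from ad import adnumber
>>> x = adnumber(2.0)
>>> y = x + 1  # a plain int mixes in as usual
>>> y
ad(3.0)
>>> y.d(x)  # d(x + 1)/dx
1.0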

From the Wikipedia entry on Automatic differentiation (AD):

“AD exploits the fact that every computer program, no matter how complicated, executes a sequence of elementary arithmetic operations (addition, subtraction, multiplication, division, etc.) and elementary functions (exp, log, sin, cos, etc.). By applying the chain rule repeatedly to these operations, derivatives of arbitrary order can be computed automatically, and accurate to working precision.”

See the package documentation for details and examples.

Basic examples

Let’s start with the main import that all numbers use to track derivatives:

>>> from ad import adnumber

Creating AD objects (either a scalar or an N-dimensional array is acceptable):

>>> x = adnumber(2.0)
>>> x
ad(2.0)

>>> y = adnumber([1, 2, 3])
>>> y
[ad(1), ad(2), ad(3)]

>>> z = adnumber(3, tag='z')  # tags can help track variables
>>> z
ad(3, z)

Now for some math:

>>> square = x**2
>>> square
ad(4.0)

>>> sum_value = sum(y)
>>> sum_value
ad(6)

>>> w = x*z**2
>>> w
ad(18.0)

Using more advanced math functions like those in the standard math and cmath modules:

>>> from ad.admath import *  # sin, cos, log, exp, sqrt, etc.
>>> sin(1 + x**2)
ad(-0.9589242746631385)
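
As a quick check that the chain rule is being applied, the analytic derivative of sin(1 + x**2) is 2*x*cos(1 + x**2); a minimal sketch comparing the two at x = 2.0:

>>> f = sin(1 + x**2)
>>> f.d(x)  # AD result via the chain rule
1.134648741852905
>>> import math
>>> 2*2.0*math.cos(1 + 2.0**2)  # analytic value for comparison
1.134648741852905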

Calculating derivatives (evaluated at the given input values):

>>> square.d(x)  # get the first derivative wrt x
4.0

>>> square.d2(x)  # get the second derivative wrt x
2.0

>>> z.d(x)  # returns 0.0 since z does not depend on x
0.0

>>> w.d2c(x, z)  # second cross-derivatives, order doesn't matter
6.0

>>> w.d2c(z, z)  # equivalent to "w.d2(z)"
4.0

>>> w.d()  # with no argument, returns a dict of all relevant first derivatives
{ad(2.0): 9.0, ad(3, z): 12.0}

Some convenience functions (useful in optimization):

>>> w.gradient([x, z])  # show the gradient in the order given
[9.0, 12.0]

>>> w.hessian([x, z])
[[0.0, 6.0], [6.0, 4.0]]

>>> sum_value.gradient(y)  # works well with input arrays
[1.0, 1.0, 1.0]

>>> from ad import jacobian  # multiple dependents, multiple independents, first derivatives
>>> jacobian([w, square], [x, z])
[[9.0, 12.0], [4.0, 0.0]]

Working with NumPy arrays (many functions should work out-of-the-box):

>>> import numpy as np
>>> arr = np.array([1, 2, 3])
>>> a = adnumber(arr)

>>> a.sum()
ad(6)

>>> a.max()
ad(3)

>>> a.mean()
ad(2.0)

>>> a.var()  # array variance
ad(0.6666666666666666)

>>> print(sqrt(a))  # vectorized operations supported with ad operators
[ad(1.0) ad(1.4142135623730951) ad(1.7320508075688772)]
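
Derivatives propagate elementwise through such vectorized operations. A minimal sketch, using the array a defined above:

>>> s = sqrt(a)
>>> s[0].d(a[0])  # d/dx sqrt(x) = 1/(2*sqrt(x)) = 0.5 at x = 1
0.5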

Interfacing with scipy.optimize

To make it easier to work with the scipy.optimize module, the package provides a convenient way to wrap an objective function and generate the corresponding gradient and Hessian functions:

>>> from ad import gh  # the gradient and hessian function generator

>>> def objective(x):
...     return (x[0] - 10.0)**2 + (x[1] + 5.0)**2

>>> grad, hess = gh(objective)  # now gradient and hessian are automatic!

>>> from scipy.optimize import minimize
>>> x0 = np.array([24, 17])
>>> bnds = ((0, None), (0, None))
>>> method = 'L-BFGS-B'
>>> res = minimize(objective, x0, method=method, jac=grad, bounds=bnds,
...                options={'ftol': 1e-8, 'disp': False})
>>> res.x  # optimal parameter values
array([ 10.,   0.])
>>> res.fun  # optimal objective
25.0
>>> res.jac  # gradient at optimum
array([  7.10542736e-15,   1.00000000e+01])
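
The generated hess function can likewise be handed to solvers that accept a Hessian. A minimal sketch (output illustrative) using the unconstrained Newton-CG method, which takes jac and hess but does not support bounds:

>>> res2 = minimize(objective, x0, method='Newton-CG', jac=grad, hess=hess)
>>> res2.x  # unconstrained optimum of the objective
array([ 10.,  -5.])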

Main Features

  • Transparent calculations with derivatives: little or no modification of existing code is needed, including when using the NumPy module.

  • Almost all mathematical operations are supported, including functions from the standard math module (sin, cos, exp, erf, etc.) and cmath module (phase, polar, etc.) with additional convenience trigonometric, hyperbolic, and logarithmic functions (csc, acoth, ln, etc.). Comparison operators follow the same rules as the underlying numeric types.

  • Real and complex arithmetic are handled seamlessly. Treat objects as you normally would with the math and cmath functions, simply using their admath counterparts instead.

  • Automatic gradient and Hessian function generation for optimization studies using scipy.optimize routines, via gh(your_func_here).

  • Compatible linear algebra routines in the ad.linalg submodule, similar to those found in NumPy’s linalg submodule, that do not depend on LAPACK (see the usage sketch after this list). They currently include:

    1. Decompositions

      1. chol: Cholesky Decomposition

      2. lu: LU Decomposition

      3. qr: QR Decomposition

    2. Solving equations and inverting matrices

      1. solve: General solver for linear systems of equations

      2. lstsq: Least-squares solver for linear systems of equations

      3. inv: Solve for the (multiplicative) inverse of a matrix
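
These routines are designed to work with AD objects. A minimal usage sketch for solve, assuming a NumPy-like solve(A, b) call signature (the function names come from the list above; the signature and output are assumptions for illustration):

>>> import numpy as np
>>> from ad import adnumber
>>> from ad.linalg import solve
>>> A = adnumber(np.array([[3.0, 1.0], [1.0, 2.0]]))
>>> b = np.array([9.0, 8.0])
>>> x = solve(A, b)  # x is [2.0, 3.0]; its entries carry derivatives wrt the entries of A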

Installation

You have several easy, convenient options to install the ad package (administrative privileges may be required):

  1. Download the package files below, unzip to any directory, and run python setup.py install from the command-line.

  2. Simply copy the unzipped ad-XYZ directory to any other location that Python can find, and rename it ad.

  3. If setuptools is installed, run easy_install --upgrade ad from the command-line.

  4. If pip is installed, run pip install --upgrade ad from the command-line.

  5. Download the bleeding-edge version on GitHub

Python 3

Download the file below, unzip it to any directory, and run:

$ python setup.py install

or:

$ python3 setup.py install

If bugs continue to pop up, please email the author.

Contact

Please send feature requests, bug reports, or feedback to Abraham Lee.

Acknowledgements

The author expresses his thanks to:

  • Eric O. LEBIGOT (EOL), author of the uncertainties package, for providing code insight and inspiration.

  • Stephen Marks, professor at Pomona College, for useful feedback concerning the interface with optimization routines in scipy.optimize.

Download files

Download the file for your platform.

Source Distribution

ad-1.2.2.tar.gz (24.1 kB)
