ad 1.1.7
Fast, transparent first- and second-order automatic differentiation
Latest Version: 1.3.1
Overview
The ad package allows you to easily and transparently perform first- and second-order automatic differentiation. Advanced math functions (trigonometric, logarithmic, hyperbolic, etc.) can also be evaluated directly using the admath submodule.
All base numeric types are supported (int, float, complex, etc.). This package is designed so that the underlying numeric types will interact with each other as they normally do when performing any calculations. Thus, this package acts more like a “wrapper” that simply helps keep track of derivatives while maintaining the original functionality of the numeric calculations.
From the Wikipedia entry on Automatic differentiation (AD):
“AD exploits the fact that every computer program, no matter how complicated, executes a sequence of elementary arithmetic operations (addition, subtraction, multiplication, division, etc.) and elementary functions (exp, log, sin, cos, etc.). By applying the chain rule repeatedly to these operations, derivatives of arbitrary order can be computed automatically, and accurate to working precision.”
See the package documentation for details and examples.
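The chain-rule mechanism described in the quote can be illustrated with a minimal forward-mode "dual number" class in plain Python (a pedagogical sketch only, not how the ad package is implemented):

```python
import math

class Dual(object):
    """A value paired with its first derivative: forward-mode AD."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        if not isinstance(other, Dual):
            other = Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        if not isinstance(other, Dual):
            other = Dual(other)
        # product rule: (u*v)' = u'*v + u*v'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

    def __pow__(self, n):
        # power rule (integer exponent): (u**n)' = n*u**(n-1)*u'
        return Dual(self.val ** n, n * self.val ** (n - 1) * self.der)

def sin(u):
    # chain rule: sin(u)' = cos(u) * u'
    return Dual(math.sin(u.val), math.cos(u.val) * u.der)

x = Dual(2.0, 1.0)      # seed the derivative: dx/dx = 1
y = sin(1 + x**2)       # same expression as the admath example below
print(y.val, y.der)     # sin(5) = -0.9589..., and 4*cos(5) by the chain rule
```

Every elementary operation propagates both a value and a derivative, so derivatives of arbitrarily composed expressions fall out automatically.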
Basic examples
Let’s start with the main import that all numbers use to track derivatives:
>>> from ad import adnumber
Creating AD objects (either a scalar or an N-dimensional array is acceptable):
>>> x = adnumber(2.0)
>>> x
ad(2.0)
>>> y = adnumber([1, 2, 3])
>>> y
[ad(1), ad(2), ad(3)]
>>> z = adnumber(3, tag='z')  # tags can help track variables
>>> z
ad(3, z)
Now for some math:
>>> square = x**2
>>> square
ad(4.0)
>>> sum_value = sum(y)
>>> sum_value
ad(6)
>>> w = x*z**2
>>> w
ad(18.0)
Using more advanced math functions like those in the standard math and cmath modules:
>>> from ad.admath import *  # sin, cos, log, exp, sqrt, etc.
>>> sin(1 + x**2)
ad(-0.9589242746631385)
Calculating derivatives (evaluated at the given input values):
>>> square.d(x)   # get the first derivative wrt x
4.0
>>> square.d2(x)  # get the second derivative wrt x
2.0
>>> z.d(x)        # returns zero if the derivative doesn't exist
0.0
>>> w.d2c(x, z)   # second cross-derivatives, order doesn't matter
6.0
>>> w.d2c(z, z)   # equivalent to "w.d2(z)"
4.0
>>> w.d()         # a dict of all relevant derivatives shown if no input
{ad(2.0): 9.0, ad(3, z): 12.0}
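These values agree with the analytic derivatives of w = x*z**2 at (x, z) = (2, 3): dw/dx = z**2 = 9, dw/dz = 2*x*z = 12, the cross-derivative is 2*z = 6, and d2w/dz2 = 2*x = 4. A central-difference check in plain Python (independent of the ad package) confirms them:

```python
# w(x, z) = x*z**2, evaluated at the same point as in the session above
def w(x, z):
    return x * z**2

x0, z0, h = 2.0, 3.0, 1e-4

# first derivatives: dw/dx = z**2 = 9, dw/dz = 2*x*z = 12
dw_dx = (w(x0 + h, z0) - w(x0 - h, z0)) / (2 * h)
dw_dz = (w(x0, z0 + h) - w(x0, z0 - h)) / (2 * h)

# second derivatives: d2w/dxdz = 2*z = 6, d2w/dz2 = 2*x = 4
d2w_dxdz = (w(x0 + h, z0 + h) - w(x0 + h, z0 - h)
            - w(x0 - h, z0 + h) + w(x0 - h, z0 - h)) / (4 * h**2)
d2w_dz2 = (w(x0, z0 + h) - 2 * w(x0, z0) + w(x0, z0 - h)) / h**2

print(dw_dx, dw_dz, d2w_dxdz, d2w_dz2)  # approximately 9, 12, 6, 4
```

Unlike these finite differences, the AD results are exact to working precision.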
Some convenience functions (useful in optimization):
>>> w.gradient([x, z])     # show the gradient in the order given
[9.0, 12.0]
>>> w.hessian([x, z])
[[0.0, 6.0], [6.0, 4.0]]
>>> sum_value.gradient(y)  # works well with input arrays
[1.0, 1.0, 1.0]
>>> # multiple dependents, multiple independents, first derivatives
>>> from ad import jacobian
>>> jacobian([w, square], [x, z])
[[9.0, 12.0], [4.0, 0.0]]
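The jacobian output can be sanity-checked with a small finite-difference routine in plain Python (illustrative only; the ad package computes these derivatives exactly via AD rather than by differencing):

```python
def fd_jacobian(funcs, x, h=1e-4):
    """Jacobian J[i][j] = d funcs[i] / d x[j], by central differences."""
    J = []
    for f in funcs:
        row = []
        for j in range(len(x)):
            xp = list(x); xp[j] += h
            xm = list(x); xm[j] -= h
            row.append((f(xp) - f(xm)) / (2 * h))
        J.append(row)
    return J

w      = lambda v: v[0] * v[1]**2   # w = x*z**2
square = lambda v: v[0]**2          # square = x**2

J = fd_jacobian([w, square], [2.0, 3.0])
print(J)   # approximately [[9.0, 12.0], [4.0, 0.0]]
```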
Working with NumPy arrays (many functions should work out of the box):
>>> import numpy as np
>>> arr = np.array([1, 2, 3])
>>> a = adnumber(arr)
>>> a.sum()
ad(6)
>>> a.max()
ad(3)
>>> a.mean()
ad(2.0)
>>> a.var()  # array variance
ad(0.6666666666666666)
>>> print sqrt(a)  # vectorized operations supported with ad operators
[ad(1.0) ad(1.4142135623730951) ad(1.7320508075688772)]
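Note that a.var() returns 2/3 because NumPy's var uses the population definition (dividing by N, i.e. ddof=0). A quick plain-Python check of the mean and variance values above:

```python
data = [1, 2, 3]

mean = sum(data) / float(len(data))  # 2.0, matching a.mean()
# population variance: divide by N (ddof=0), as np.var does by default
var = sum((v - mean) ** 2 for v in data) / float(len(data))

print(mean, var)  # 2.0 0.6666666666666666, matching a.var()
```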
Interfacing with scipy.optimize
To make it easier to work with the scipy.optimize module, there’s a convenient way to wrap functions that will generate appropriate gradient and hessian functions:
>>> from ad import gh  # the gradient and hessian function generator
>>> def objective(x):
...     return (x[0] - 10.0)**2 + (x[1] + 5.0)**2
>>> grad, hess = gh(objective)  # now gradient and hessian are automatic!
>>> from scipy.optimize import minimize
>>> x0 = np.array([24, 17])
>>> bnds = ((0, None), (0, None))
>>> method = 'L-BFGS-B'
>>> res = minimize(objective, x0, method=method, jac=grad, bounds=bnds,
...                options={'ftol': 1e-8, 'disp': False})
>>> res.x    # optimal parameter values
array([ 10.,   0.])
>>> res.fun  # optimal objective
25.0
>>> res.jac  # gradient at optimum
array([  7.10542736e-15,   1.00000000e+01])
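For reference, the gradient that gh(objective) generates automatically can be derived by hand for this simple objective. The sketch below (hand-coded, not using ad) also shows why res.jac has a nonzero second component at the optimum: the bound x[1] >= 0 is active there, and the unconstrained gradient in that direction is 2*(0 + 5) = 10.

```python
def objective(x):
    return (x[0] - 10.0)**2 + (x[1] + 5.0)**2

# hand-derived gradient (what gh(objective) produces automatically via AD)
def grad(x):
    return [2.0 * (x[0] - 10.0), 2.0 * (x[1] + 5.0)]

g = grad([10.0, 0.0])              # gradient at the reported optimum
print(objective([10.0, 0.0]), g)   # 25.0 [0.0, 10.0]
```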
Main Features
- Transparent calculations with derivatives: little or no modification of existing code is needed, including when using the NumPy module.
- Almost all mathematical operations are supported, including functions from the standard math module (sin, cos, exp, erf, etc.) and cmath module (phase, polar, etc.), with additional convenience trigonometric, hyperbolic, and logarithmic functions (csc, acoth, ln, etc.). Comparison operators follow the same rules as the underlying numeric types.
- Real and complex arithmetic are handled seamlessly: treat objects as you normally would with the math and cmath functions, but use their admath counterparts instead.
- Automatic gradient and hessian function generation for optimization studies using scipy.optimize routines, via gh(your_func_here).
- Compatible linear algebra routines in the ad.linalg submodule, similar to those found in NumPy's linalg submodule, that are not dependent on LAPACK. There are currently:
  - Decompositions
    - chol: Cholesky Decomposition
    - lu: LU Decomposition
    - qr: QR Decomposition
  - Solving equations and inverting matrices
    - solve: General solver for linear systems of equations
    - lstsq: Least-squares solver for linear systems of equations
    - inv: Solve for the (multiplicative) inverse of a matrix
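To illustrate what a LAPACK-free routine such as chol computes, here is a minimal pure-Python Cholesky factorization (a sketch only, not the ad.linalg implementation):

```python
import math

def chol(A):
    """Lower-triangular L with A = L * L^T, for a symmetric
    positive-definite matrix A given as a list of lists."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # diagonal entry
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]  # below-diagonal entry
    return L

L = chol([[4.0, 2.0], [2.0, 3.0]])
print(L)   # [[2.0, 0.0], [1.0, 1.4142135623730951]]
```

Because the algorithm uses only arithmetic and sqrt, it works unmodified when the matrix entries are AD numbers, which is the point of providing LAPACK-free routines.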
Installation
You have several easy, convenient options to install the ad package (administrative privileges may be required):
- Download the package files below, unzip to any directory, and run python setup.py install from the command line.
- Simply copy the unzipped ad-X.Y.Z directory to any other location that Python can find and rename it ad.
- If setuptools is installed, run easy_install --upgrade ad from the command line.
- If pip is installed, run pip install --upgrade ad from the command line.
- Download the bleeding-edge version on GitHub.
Python 3
Download the file below, unzip it to any directory, and run:
$ python setup.py install
or:
$ python3 setup.py install
If bugs continue to pop up, please email the author.
Contact
Please send feature requests, bug reports, or feedback to Abraham Lee.
Acknowledgements
The author expresses his thanks to:
- Eric O. LEBIGOT (EOL), author of the uncertainties package, for providing code insight and inspiration
- Stephen Marks, professor at Pomona College, for useful feedback concerning the interface with optimization routines in scipy.optimize.
File | Type | Py Version | Uploaded on | Size
ad-1.1.7.tar.gz (md5) | Source | | 2013-09-26 | 18KB
- Downloads (All Versions):
  - 64 downloads in the last day
  - 335 downloads in the last week
  - 1792 downloads in the last month
- Author: Abraham Lee
- Documentation: ad package documentation
- Home Page: http://pythonhosted.org/ad
- Keywords: automatic differentiation, first order, second order, derivative, algorithmic differentiation, computational differentiation, optimization
- License: BSD License

Categories
- Development Status :: 5 - Production/Stable
- Intended Audience :: Education
- Intended Audience :: Science/Research
- License :: OSI Approved :: BSD License
- Operating System :: OS Independent
- Programming Language :: Python
- Programming Language :: Python :: 2.6
- Programming Language :: Python :: 2.7
- Programming Language :: Python :: 3.0
- Programming Language :: Python :: 3.1
- Programming Language :: Python :: 3.2
- Programming Language :: Python :: 3.3
- Topic :: Education
- Topic :: Scientific/Engineering
- Topic :: Scientific/Engineering :: Mathematics
- Topic :: Scientific/Engineering :: Physics
- Topic :: Software Development
- Topic :: Software Development :: Libraries
- Topic :: Software Development :: Libraries :: Python Modules
- Topic :: Utilities
- Package Index Owner: tisimst.myopenid.com
- DOAP record: ad-1.1.7.xml