numdifftools package

Step generators

MinStepGenerator([base_step, step_ratio, ...]) Generates a sequence of steps
MaxStepGenerator([step_max, step_ratio, ...]) Generates a sequence of steps
MinMaxStepGenerator([step_min, step_max, ...]) Generates a sequence of steps
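
A step generator can be passed as the step argument of the difference approximators to control how the sequence of step sizes is chosen. A minimal sketch, where the base_step and num_steps values are arbitrary choices for illustration:

>>> import numpy as np
>>> import numdifftools as nd
>>> step = nd.MinStepGenerator(base_step=1e-4, num_steps=10)
>>> df = nd.Derivative(np.exp, step=step)
>>> np.allclose(df(0), 1)
True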

Utility functions

Finite difference approximations

fornberg_weights_all(x, x0[, M]) Return finite difference weights for derivatives of all orders up to M.
fornberg_weights(x, x0[, m]) Return weights for finite difference approximation of the m’th derivative U^m(x0), evaluated at x0, based on n values of U at x[0], x[1],...
Derivative(f[, step, method, order, n, ...]) Calculate n-th derivative with finite difference approximation
Gradient(f[, step, method, order, full_output]) Calculate Gradient with finite difference approximation
Jacobian(f[, step, method, order, full_output]) Calculate Jacobian with finite difference approximation
Hessdiag(f[, step, method, order, full_output]) Calculate Hessian diagonal with finite difference approximation
Hessian(f[, step, method, full_output]) Calculate Hessian with finite difference approximation
directionaldiff(f, x0, vec, **options) Return directional derivative of a function of n variables
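
As an illustration of directionaldiff, the sketch below differentiates a Rosenbrock-like function at [2, 3] in the direction [1, -1]; the reference value 1052/sqrt(2) follows from the analytic gradient (842, -210):

>>> import numpy as np
>>> import numdifftools as nd
>>> v = np.array([1.0, -1.0]) / np.sqrt(2)   # unit direction vector
>>> rosen = lambda x: (1 - x[0])**2 + 105 * (x[1] - x[0]**2)**2
>>> dd = nd.directionaldiff(rosen, [2, 3], v)
>>> np.allclose(dd, 1052 / np.sqrt(2))
True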

numdifftools.extrapolation module

convolve(sequence, rule, **kwds) Wrapper around scipy.ndimage.convolve1d that allows complex input.
Dea([limexp]) Extrapolate a slowly convergent sequence with the epsilon algorithm; limexp is the maximum number of elements the epsilon table can contain.
dea3(v0, v1, v2[, symmetric]) Extrapolate a slowly convergent sequence
Richardson([step_ratio, step, order, num_terms]) Extrapolates a sequence with Richardson's method
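
As an illustration of dea3, a slowly convergent sequence of trapezoidal approximations to the integral of sin(x) on [0, pi/2] (exact value 1) can be accelerated; this sketch follows the spirit of the dea3 docstring:

>>> import numpy as np
>>> from numdifftools.extrapolation import dea3
>>> Ei = np.zeros(3)
>>> for k in range(3):
...     x = np.linspace(0, np.pi / 2, 2**(k + 5) + 1)
...     Ei[k] = np.trapz(np.sin(x), x)
>>> En, err = dea3(Ei[0], Ei[1], Ei[2])
>>> np.allclose(En, 1)
True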

numdifftools.limits module

MinStepGenerator([base_step, step_ratio, ...]) Generates a sequence of steps
Limit(f[, step, method, order, full_output]) Compute limit of a function at a given point
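
Limit evaluates f at points approaching x0, never at x0 itself, which makes it possible to compute limits of expressions such as sin(x)/x at 0. A minimal sketch:

>>> import numpy as np
>>> from numdifftools.limits import Limit
>>> def f(x):
...     return np.sin(x) / x
>>> np.allclose(Limit(f)(0), 1)
True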

numdifftools.multicomplex module

bicomplex(z1, z2) Creates an instance of a bicomplex object.
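
The bicomplex class is mainly used internally: it powers the 'multicomplex' method of the finite difference classes, which avoids subtractive cancellation. A sketch of its indirect use, here for a second derivative where the multicomplex approach is typically most useful (assuming the default settings are otherwise adequate):

>>> import numpy as np
>>> import numdifftools as nd
>>> d2exp = nd.Derivative(np.exp, n=2, method='multicomplex')
>>> np.allclose(d2exp(0), 1)
True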

numdifftools.nd_algopy module

Jacobian(f[, method]) Calculate Jacobian with Algorithmic Differentiation method
Hessdiag(f[, method]) Calculate Hessian diagonal with Algorithmic Differentiation method
Hessian(f[, method]) Calculate Hessian with Algorithmic Differentiation method
directionaldiff(f, x0, vec, **options) Return directional derivative of a function of n variables

numdifftools.run_benchmark module

Module contents

Introduction to Numdifftools

Numdifftools is a suite of tools written in Python to solve automatic numerical differentiation problems in one or more variables. Finite differences are used in an adaptive manner, coupled with a Richardson extrapolation methodology, to provide a maximally accurate result. The user can configure many options, such as the order of the method or of the extrapolation, and whether complex, multicomplex, central, forward or backward differences are used. The methods provided are:

Derivative:
Computes the derivative of order 1 through 10 of any scalar function.
Gradient:
Computes the gradient vector of a scalar function of one or more variables.
Jacobian:
Computes the Jacobian matrix of a vector valued function of one or more variables.
Hessian:
Computes the Hessian matrix of all 2nd partial derivatives of a scalar function of one or more variables.
Hessdiag:
Computes only the diagonal elements of the Hessian matrix.

All of these methods also produce error estimates on the result.
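
The error estimate is obtained by passing full_output=True, in which case the call returns both the value and an info object carrying, among other things, the estimated error. A small sketch (the 1e-8 bound is just an illustrative tolerance):

>>> import numpy as np
>>> import numdifftools as nd
>>> df = nd.Derivative(np.exp, full_output=True)
>>> val, info = df(1)
>>> np.allclose(val, np.exp(1))
True
>>> np.all(info.error_estimate < 1e-8)
True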

Numdifftools also provides an easy-to-use interface to derivatives calculated with AlgoPy, which stands for Algorithmic Differentiation in Python. The purpose of AlgoPy is the evaluation of higher-order derivatives in the forward and reverse modes of Algorithmic Differentiation (AD) of functions that are implemented as Python programs.

Documentation is at: http://numdifftools.readthedocs.org/

The code and issue tracker are at https://github.com/pbrod/numdifftools.

Latest stable release is at http://pypi.python.org/pypi/Numdifftools.

To test if the toolbox is working, paste the following in an interactive Python session:

import numdifftools as nd
nd.test(coverage=True, doctests=True)

Getting Started

Compute the 1st and 2nd derivative of exp(x) at x == 1:

>>> import numpy as np
>>> import numdifftools as nd
>>> fd = nd.Derivative(np.exp)        # 1'st derivative
>>> fdd = nd.Derivative(np.exp, n=2)  # 2'nd derivative
>>> np.allclose(fd(1), 2.7182818284590424)
True
>>> np.allclose(fdd(1), 2.7182818284590424)
True

Nonlinear least squares:

>>> xdata = np.reshape(np.arange(0,1,0.1),(-1,1))
>>> ydata = 1+2*np.exp(0.75*xdata)
>>> fun = lambda c: (c[0]+c[1]*np.exp(c[2]*xdata) - ydata)**2
>>> Jfun = nd.Jacobian(fun)
>>> np.allclose(np.abs(Jfun([1,2,0.75])), 0) # should be numerically zero
True

Compute gradient of sum(x**2):

>>> fun = lambda x: np.sum(x**2)
>>> dfun = nd.Gradient(fun)
>>> dfun([1,2,3])
array([ 2.,  4.,  6.])
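
A Hessian (or Hessdiag) object is built the same way; for sum(x**2) the Hessian is twice the identity:

>>> Hfun = nd.Hessian(lambda x: np.sum(x**2))
>>> np.allclose(Hfun([1, 2, 3]), 2 * np.eye(3))
True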

Compute the same using the easy-to-use interface to AlgoPy:

>>> import numdifftools.nd_algopy as nda
>>> import numpy as np
>>> fd = nda.Derivative(np.exp)        # 1'st derivative
>>> fdd = nda.Derivative(np.exp, n=2)  # 2'nd derivative
>>> np.allclose(fd(1), 2.7182818284590424)
True
>>> np.allclose(fdd(1), 2.7182818284590424)
True

Nonlinear least squares:

>>> xdata = np.reshape(np.arange(0,1,0.1),(-1,1))
>>> ydata = 1+2*np.exp(0.75*xdata)
>>> fun = lambda c: (c[0]+c[1]*np.exp(c[2]*xdata) - ydata)**2
>>> Jfun = nda.Jacobian(fun, method='reverse')
>>> np.allclose(np.abs(Jfun([1,2,0.75])), 0) # should be numerically zero
True

Compute gradient of sum(x**2):

>>> fun = lambda x: np.sum(x**2)
>>> dfun = nda.Gradient(fun)
>>> dfun([1,2,3])
array([ 2.,  4.,  6.])

See also

scipy.misc.derivative