Development of a new Polynomial class for Scipy
Motivation
Currently numpy provides a class, poly1d, for representing polynomials in one variable. This class always represents a polynomial as an array of coefficients in the power basis 1, x, x**2, ..., x**n. Unfortunately this basis yields very poor numerical representations of many useful polynomials, so users who want to work with polynomial objects are often stuck at low degrees. For example, scipy.special includes several families of orthogonal polynomials defined in terms of recurrence relations, but the objects it constructs are poly1d instances; as a result the Chebyshev polynomials become useless by degree 35, even though the underlying recurrences, with an appropriate evaluation scheme, would allow much higher degrees.
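To make the problem concrete, here is a minimal sketch (assuming scipy.special provides chebyt and eval_chebyt) comparing power-basis evaluation of a high-degree Chebyshev polynomial with recurrence-based evaluation and the closed form cos(n*arccos(x)):

    import numpy as np
    from scipy.special import chebyt, eval_chebyt

    n, x = 60, 0.9
    p = chebyt(n)                      # poly1d: coefficients in the power basis
    print(p(x))                        # ruined by cancellation at this degree
    print(eval_chebyt(n, x))           # recurrence-based evaluation
    print(np.cos(n * np.arccos(x)))    # exact value of T_n(x) on [-1, 1]

At this degree the power-basis result is badly wrong, while the other two values agree closely.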
Scope
- One-dimensional polynomials only.
- Vector-valued polynomials? Good for splines and the like; bad for zero-finding; makes for messy code.
- Double-precision reals only.
- Complex variables and coefficients? Convergence and approximation issues become much more complicated. Roots and root finding may require complex variables.
- Arbitrary-precision coefficients? May be possible with type-checking. Needs tunable parameters.
- Single precision? No, polynomials are unstable enough already.
- Integers? Integer polynomials belong in the domain of symbolic computation.
Use cases
- Working with truncated Taylor series: Derivatives, antiderivatives, composition; specified center point.
- Function approximation: Chebyshev polynomials, fast evaluation, integration/differentiation, root finding, least-squares fitting, multiplication, possibly composition, truncation to lower order; specified interval (see the sketch following this list).
- Curve fitting: least-squares fitting, fast evaluation, integration/differentiation; specified interval.
- Splines: specification by values and derivatives (Hermite problem), root finding, integration/differentiation; specified interval.
- Quadrature: roots and weights of orthogonal families; specified interval (possibly unbounded).
- Orthogonal polynomials: ufunc-like evaluation, roots, maxima/minima, high orders, special algorithms for calculation, usable as bases for series; specified interval (possibly unbounded).
- As building blocks for rational functions: evaluation, gcd, division with remainder, root finding, function fitting, Taylor/Laurent series extraction.
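As a concrete illustration of the function-approximation and curve-fitting use cases, here is a minimal sketch written against numpy.polynomial.chebyshev (used purely for illustration; the proposed class would expose equivalent operations through a Chebyshev-basis representation):

    import numpy as np
    from numpy.polynomial import chebyshev as C

    x = np.linspace(-1, 1, 201)
    y = np.exp(x) * np.sin(5 * x)               # function to approximate

    c = C.chebfit(x, y, 20)                     # degree-20 least-squares fit
    print(np.max(np.abs(C.chebval(x, c) - y)))  # max fit error on the interval
    dc = C.chebder(c)                           # derivative, same basis
    print(C.chebval(0.0, dc))                   # derivative of the fit at 0
    print(C.chebroots(c))                       # roots of the approximant

Everything here stays in the Chebyshev basis; no conversion to power-basis coefficients is needed.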
Requirements
- More than one representation: For some applications 1, x, x**2, ... is the correct basis; it's certainly the easiest one to understand. Chebyshev basis functions allow easy construction of efficient polynomial approximations. When working with ODEs, any of the classical families of orthogonal polynomials may be a natural basis. When finding roots or doing other polynomial computations, the literature seems to argue for Bernstein polynomials (Bezier splines are based on these) or Lagrange polynomials (specifying polynomials by (x, y) pairs), possibly implemented in barycentric form (see the barycentric sketch after this list). Conversion between representations is obviously a necessary operation.
- Fast vectorized evaluation: Clearly compiled code will be necessary.
- Basic polynomial operations: addition, subtraction, multiplication by a scalar, multiplication by another polynomial.
- Composition (?): substituting one polynomial into another.
- Division and gcds (?): this is a numerically tricky operation, but valuable for implementing change-of-basis among other things.
- Root-finding: can be numerically nasty.
- Calculus: derivatives and antiderivatives at a point; derivative and antiderivative polynomials.
- Construction of interpolating polynomials: from point and/or derivative information (can be numerically nasty, but need not be; see the Hermite sketch after this list).
- Construction of approximating polynomials.
- Degree reduction: when is a polynomial of apparent degree n really of lower degree? ("Nearest Polynomial of Lower Degree"; algorithms exist.)
- Least-squares fitting of polynomials.
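As a sketch of the Lagrange/barycentric representation mentioned above, scipy.interpolate.BarycentricInterpolator (assumed available here) represents a polynomial by its values at chosen points and evaluates a degree-50 interpolant at Chebyshev points stably, well beyond what the power basis can handle:

    import numpy as np
    from scipy.interpolate import BarycentricInterpolator

    n = 50
    xi = np.cos(np.pi * np.arange(n + 1) / n)   # Chebyshev points on [-1, 1]
    yi = np.exp(xi)                             # polynomial specified by its values
    p = BarycentricInterpolator(xi, yi)         # Lagrange basis, barycentric form

    x = np.linspace(-1, 1, 5)
    print(np.max(np.abs(p(x) - np.exp(x))))     # interpolation error, near machine precision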
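And, for construction from point and derivative information (the Hermite problem), a minimal sketch using scipy.interpolate.KroghInterpolator (assumed available; repeated abscissae indicate successive derivatives):

    import numpy as np
    from scipy.interpolate import KroghInterpolator

    # Cubic matching value and slope at both ends of [0, 1]:
    # p(0) = 0, p'(0) = 1, p(1) = 1, p'(1) = 0.
    xi = [0.0, 0.0, 1.0, 1.0]
    yi = [0.0, 1.0, 1.0, 0.0]
    p = KroghInterpolator(xi, yi)

    x = np.linspace(0.0, 1.0, 5)
    print(p(x))                 # values of the Hermite cubic
    print(p.derivative(x))      # its first derivative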
Code
A prototype is available on GitHub as scikits.polynomial.
Literature
(Mostly behind paywalls, I'm afraid.)
- Corless, R. M. and Watt, S. M., "Bernstein bases are optimal, but, sometimes, Lagrange bases are better." Proceedings of SYNASC, Timisoara, 2004. The authors describe Lagrange and Bernstein bases as the two main approaches for representing polynomials in a numerically-stable manner, and show that Lagrange polynomials can be better for root-finding than Bernstein polynomials if the points chosen are close to the zeros of interest.
- Amiraslani, A., Corless, R. M., and Gonzalez-Vega, L., "Polynomial Algebra by Values." The authors describe algorithms for working with polynomials in the Lagrange basis.
- Farouki, R. T., "Computing with Barycentric Polynomials." The Mathematical Intelligencer, 1991. The author argues for the use of the Bernstein basis as a natural and numerically robust representation for polynomials.
- Tsai, Y. F., and Farouki, R. T., "BPOLY: An object-oriented library of algorithms for polynomials in Bernstein form." The authors present a C++ class for working with polynomials in Bernstein form.
- Minimair, M., "Basis-Independent Polynomial Division Applied to Division in Lagrange and Bernstein Bases." Lecture Notes in Artificial Intelligence, 2008. The author describes efficient polynomial division in arbitrary bases in terms of a short list of elementary operations, and then shows how to implement these basic operations for Lagrange and Bernstein bases.