Runge's phenomenon
In the mathematical field of numerical analysis, Runge's phenomenon is a problem that occurs when using polynomial interpolation with polynomials of high degree. It was discovered by Carle David Tolmé Runge while exploring the behaviour of errors when using polynomial interpolation to approximate certain functions.
Problem
Consider the function:
- <math>f(x) = \frac{1}{1+25x^2}</math>
Runge found that if this function is interpolated at equidistant points x<sub>i</sub> between −1 and 1 such that:
- <math>x_i = -1 + (i-1)\frac{2}{n}, \quad i \in \left\{ 1, 2, \ldots, n+1 \right\}</math>
with a polynomial <math>P_n(x)</math> of degree <math>\leq n</math>, the resulting interpolant oscillates toward the ends of the interval, i.e. close to −1 and 1. It can even be proven that the interpolation error tends toward infinity as the degree of the polynomial increases:
- <math>\lim_{n \rightarrow \infty} \left( \max_{-1 \leq x \leq 1} | f(x) - P_n(x)| \right) = \infty.</math>
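This divergence is easy to observe numerically. The following is a minimal sketch, assuming NumPy and SciPy are available; SciPy's BarycentricInterpolator is used here only as a numerically stable way to evaluate the interpolating polynomial and is not part of Runge's original analysis:

```python
# Sketch: Runge's phenomenon at equidistant nodes.
# Assumes NumPy and SciPy are installed.
import numpy as np
from scipy.interpolate import BarycentricInterpolator

def f(x):
    """Runge's function."""
    return 1.0 / (1.0 + 25.0 * x**2)

# Dense grid on [-1, 1] used to estimate the maximum error.
xs = np.linspace(-1.0, 1.0, 2001)

for n in (5, 10, 15, 20):
    nodes = np.linspace(-1.0, 1.0, n + 1)          # equidistant nodes
    p = BarycentricInterpolator(nodes, f(nodes))   # degree-n interpolant
    max_err = np.max(np.abs(f(xs) - p(xs)))
    print(f"n = {n:2d}: max |f - P_n| = {max_err:.3g}")
```

The printed maximum error grows rather than shrinks as n increases, reflecting the limit above.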
Solutions to the problem of Runge's phenomenon
The oscillation can be minimized by using Chebyshev nodes instead of equidistant nodes. In this case the maximum error is guaranteed to diminish with increasing polynomial order. The phenomenon demonstrates that high-degree polynomials are generally unsuitable for interpolation at equidistant nodes. The problem can also be avoided by using spline curves, which are piecewise polynomials. When trying to decrease the interpolation error, one can increase the number of polynomial pieces used to construct the spline instead of increasing the degree of its polynomials. Both remedies are illustrated in the sketch below.
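A minimal sketch of both remedies, under the same assumptions as above (BarycentricInterpolator and CubicSpline are standard SciPy classes; the node formula is the usual definition of Chebyshev nodes of the first kind):

```python
# Sketch: two mitigations of Runge's phenomenon.
# Assumes NumPy and SciPy are installed.
import numpy as np
from scipy.interpolate import BarycentricInterpolator, CubicSpline

def f(x):
    """Runge's function."""
    return 1.0 / (1.0 + 25.0 * x**2)

xs = np.linspace(-1.0, 1.0, 2001)  # dense grid for the error estimate
n = 20

# Remedy 1: interpolate at Chebyshev nodes of the first kind on [-1, 1].
cheb = np.cos((2 * np.arange(n + 1) + 1) * np.pi / (2 * (n + 1)))
p_cheb = BarycentricInterpolator(cheb, f(cheb))

# Remedy 2: a cubic spline through the same number of equidistant nodes.
equi = np.linspace(-1.0, 1.0, n + 1)
spline = CubicSpline(equi, f(equi))

for name, g in (("Chebyshev interpolant", p_cheb), ("cubic spline", spline)):
    max_err = np.max(np.abs(f(xs) - g(xs)))
    print(f"{name}: max error = {max_err:.3g}")
```

Both approximations keep the maximum error small, in contrast to the diverging equidistant interpolant of the same degree.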
See also
- Compare with the Gibbs phenomenon for sinusoidal basis functions