Cauchy distribution
The Cauchy-Lorentz distribution, named after Augustin Cauchy and Hendrik Lorentz, is a continuous probability distribution with probability density function
- <math> f(x|x_0,\gamma) = \frac{1}{\pi\gamma \left[1 + \left(\frac{x-x_0}{\gamma}\right)^2\right]} \!</math>
where x0 is the location parameter, specifying the location of the peak of the distribution, and γ is the scale parameter, which specifies the half-width at half-maximum (HWHM). In statistics it is known as the Cauchy distribution, while among physicists it is known as the Lorentz distribution or the Breit-Wigner distribution. Its importance in physics is largely due to the fact that it is the solution to the differential equation describing forced resonance. In spectroscopy it describes the shape of spectral lines that are broadened by many mechanisms, including resonance broadening. The statistical term Cauchy distribution will be used in the following discussion.
The special case when x0 = 0 and γ = 1 is called the standard Cauchy distribution with the probability density function
- <math> f(x|0,1) = \frac{1}{\pi (1 + x^2)}. \!</math>
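The roles of the two parameters can be illustrated with a short sketch in Python (the function name and the sample values are illustrative, not from any particular library):

```python
import math

def cauchy_pdf(x, x0=0.0, gamma=1.0):
    """Density of the Cauchy distribution with location x0 and scale gamma."""
    return 1.0 / (math.pi * gamma * (1.0 + ((x - x0) / gamma) ** 2))

# The peak value at x = x0 is 1/(pi * gamma); at x = x0 +/- gamma the
# density drops to half the peak, which is why gamma is called the
# half-width at half-maximum (HWHM).
peak = cauchy_pdf(2.0, x0=2.0, gamma=3.0)
half = cauchy_pdf(2.0 + 3.0, x0=2.0, gamma=3.0)
```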
Properties
Being a probability density function, it integrates to unity:
- <math>\int_{-\infty}^\infty f(x|x_0,\gamma)\,dx=1. \!</math>
The cumulative distribution function is:
- <math>F(x|x_0,\gamma)=\frac{1}{\pi} \arctan\left(\frac{x-x_0}{\gamma}\right)+\frac{1}{2}</math>
and the inverse cumulative distribution function of the Cauchy distribution is
- <math>x = x_0 - \frac{\gamma}{\tan(\pi\,F)}.\,</math>
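The inverse CDF gives a simple way to generate Cauchy variates by inverse-transform sampling. A minimal Python sketch (the function names are illustrative):

```python
import math
import random

def cauchy_quantile(p, x0=0.0, gamma=1.0):
    """Inverse CDF of the Cauchy distribution: x = x0 - gamma / tan(pi * p),
    valid for 0 < p < 1."""
    return x0 - gamma / math.tan(math.pi * p)

def sample_cauchy(x0=0.0, gamma=1.0):
    # Inverse-transform sampling: push a uniform(0, 1) draw through the quantile.
    return cauchy_quantile(random.random(), x0, gamma)
```

Applying the CDF to the quantile recovers the original probability, which serves as a round-trip check of the two formulas.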
The Cauchy distribution is often cited as an example of a distribution which has no mean, variance or higher moments defined, although its mode and median are well defined and are both equal to x0.
The characteristic function of the Cauchy distribution is well defined:
- <math>\phi_X(t|x_0,\gamma) = \mathrm{E}(e^{i\,X\,t}) = \exp(i\,x_0\,t-\gamma\,|t|). \!</math>
When U and V are two independent normally distributed random variables with expected value 0 and variance 1, then the ratio U/V has the standard Cauchy distribution.
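This can be checked by simulation: the quartiles of the standard Cauchy distribution are at −1 and +1 (where the CDF equals 1/4 and 3/4), so the empirical quartiles of the ratio should land near those values. A rough sketch with a fixed seed:

```python
import random

random.seed(0)
n = 200_000
# The ratio of two independent standard normal draws is standard Cauchy.
samples = sorted(random.gauss(0, 1) / random.gauss(0, 1) for _ in range(n))
# Empirical first and third quartiles; for the standard Cauchy these are -1 and 1.
q1, q3 = samples[n // 4], samples[3 * n // 4]
```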
If X1, …, Xn are independent and identically distributed random variables, each with a standard Cauchy distribution, then the sample mean (X1 + … + Xn)/n has the same standard Cauchy distribution. To see that this is true, compute the characteristic function of the sample mean:
- <math>\phi_{\overline{X}}(t) = \mathrm{E}\left(e^{i\,\overline{X}\,t}\right) = \left[\phi_X\!\left(\tfrac{t}{n}\right)\right]^n = \left(e^{-|t|/n}\right)^n = e^{-|t|} \!</math>
where <math>\overline{X}</math> is the sample mean; the second equality uses the independence of the Xi, and the third substitutes the standard Cauchy characteristic function with x0 = 0 and γ = 1. The result is again the characteristic function of the standard Cauchy distribution. This example shows that the hypothesis of finite variance in the central limit theorem cannot be dropped. It is also an example of a more general version of the central limit theorem that applies to all stable distributions, of which the Cauchy distribution is a special case.
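A simulation illustrates the point: the empirical quartiles of the sample means stay near −1 and +1 for any n, instead of shrinking toward 0 as the central limit theorem would predict for a finite-variance distribution. A sketch with a fixed seed:

```python
import math
import random

random.seed(1)

def std_cauchy():
    # One standard Cauchy draw via the inverse CDF (quantile) transform.
    return -1.0 / math.tan(math.pi * random.random())

m, n = 40_000, 10
# Each entry is the mean of n independent standard Cauchy draws.
means = sorted(sum(std_cauchy() for _ in range(n)) / n for _ in range(m))
q1, q3 = means[m // 4], means[3 * m // 4]
# For finite variance the interquartile range of the means would shrink
# like 1/sqrt(n); here it stays at the standard Cauchy value of about 2.
```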
The Cauchy distribution is an infinitely divisible probability distribution.
The standard Cauchy distribution coincides with the Student's t-distribution with one degree of freedom.
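The coincidence with the t-distribution can be verified directly from the densities: for ν = 1 the Student's t normalizing constant Γ(1)/(√(π) Γ(1/2)) reduces to 1/π, since Γ(1/2) = √π. A quick numerical check:

```python
import math

def student_t_pdf(x, nu):
    """Student's t density with nu degrees of freedom."""
    c = math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))
    return c * (1 + x * x / nu) ** (-(nu + 1) / 2)

def std_cauchy_pdf(x):
    """Standard Cauchy density 1 / (pi * (1 + x^2))."""
    return 1.0 / (math.pi * (1.0 + x * x))

# With nu = 1 the two densities agree at every point.
```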
Why the mean of the Cauchy distribution is undefined
If a probability distribution has a density function f(x) then the mean or expected value is
- <math>\int_{-\infty}^\infty x f(x)\,dx. \qquad\qquad (1)\!</math>
The question is now whether this is the same thing as
- <math>\int_0^\infty x f(x)\,dx-\int_{-\infty}^0 |{x}| f(x)\,dx.\qquad\qquad (2) \!</math>
If both the positive and negative terms in (2) are finite, then (1) equals (2). If exactly one of the two terms is infinite, then (2) is infinite with the corresponding sign, and (1) again equals (2). But in the case of the Cauchy distribution, both terms are infinite, so (2) has the indeterminate form ∞ − ∞ and is undefined.
Moreover, if (1) is construed as a Lebesgue integral, then (1) is also undefined, since (1) is then defined simply as the difference (2) between positive and negative parts.
However, if (1) is construed as an improper integral rather than a Lebesgue integral, then (2) is undefined, and (1) is not necessarily well-defined either: its value depends on how the limits of integration approach infinity. We may take (1) to mean
- <math>\lim_{a\to\infty}\int_{-a}^a x f(x)\,dx, \!</math>
and this is its Cauchy principal value, which is zero, but we could also take (1) to mean, for example,
- <math>\lim_{a\to\infty}\int_{-2a}^a x f(x)\,dx, \!</math>
which is not zero: the antiderivative of x f(x) = x/(π(1 + x²)) is ln(1 + x²)/(2π), so the integral equals (1/(2π)) ln((1 + a²)/(1 + 4a²)), which tends to −(ln 2)/π as a → ∞.
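The dependence on how the limits are taken can be made concrete with the antiderivative ln(1 + x²)/(2π) of x f(x). A small sketch (helper name is illustrative):

```python
import math

def mean_integral(lo, hi):
    """Integral of x / (pi * (1 + x^2)) over [lo, hi] via its antiderivative."""
    def F(x):
        return math.log(1.0 + x * x) / (2.0 * math.pi)
    return F(hi) - F(lo)

# Symmetric limits (-a, a) give the Cauchy principal value, zero;
# the limits (-2a, a) instead converge to -ln(2)/pi as a grows.
```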
Various results in probability theory about expected values, such as the strong law of large numbers, will not work in such cases.
Why the second moment of the Cauchy distribution is infinite
Without a defined mean, it is impossible to consider the variance or standard deviation of a standard Cauchy distribution. But the second moment about zero can be considered. It turns out to be infinite:
- <math>\mathrm{E}(X^2) = \frac{1}{\pi}\int_{-\infty}^{\infty} \frac{x^2}{1+x^2}\,dx = \frac{1}{\pi}\left(\int_{-\infty}^{\infty} dx - \int_{-\infty}^{\infty} \frac{1}{1+x^2}\,dx\right) = \frac{1}{\pi}\left(\infty - \pi\right) = \infty. \!</math>
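The divergence can also be seen from the truncated integral: restricting to [−a, a] gives the closed form (2/π)(a − arctan a), which grows without bound as a increases. A small check (helper name is illustrative):

```python
import math

def truncated_second_moment(a):
    """Integral of x^2 / (pi * (1 + x^2)) over [-a, a] = (2/pi) * (a - atan(a))."""
    return (2.0 / math.pi) * (a - math.atan(a))

# Grows roughly like 2a/pi, so the full second moment is infinite.
```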
External links
- MathWorld Cauchy Distribution (http://mathworld.wolfram.com/CauchyDistribution.html)
- GNU Scientific Library - Reference Manual (http://www.gnu.org/software/gsl/manual/gsl-ref.html#SEC294)