# Chebyshev's inequality

Chebyshev's inequality (also known as Tchebysheff's inequality, Chebyshev's theorem, or the Bienaymé-Chebyshev inequality) is a theorem of probability theory named in honor of Pafnuty Chebyshev. Chebyshev's inequality gives a lower bound for the probability that a value of a random variable of any distribution with finite variance lies within a certain distance from the variable's mean; equivalently, the theorem provides an upper bound for the probability that values lie outside the same distance from the mean.


## The theorem

**Theorem.** Let X be a random variable with mean μ and finite variance σ². Then for any real number k > 0,

[itex]P(\left|X-\mu\right|\geq k\sigma)\leq\frac{1}{k^2}.[itex]

Only the cases k > 1 provide useful information: for k ≤ 1 the right-hand side is at least 1, and every probability is at most 1, so the bound is trivially true.
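As an illustrative sketch, the bound can be checked numerically against a concrete distribution. The choice of a standard exponential distribution (mean μ = 1, standard deviation σ = 1) and the sample size here are assumptions for illustration, not part of the theorem.

```python
import random

# Monte Carlo check of Chebyshev's inequality for a standard exponential
# distribution, which has mu = 1 and sigma = 1.
random.seed(0)
mu = sigma = 1.0
samples = [random.expovariate(1.0) for _ in range(100_000)]

results = {}
for k in (1.5, 2.0, 3.0):
    # Empirical estimate of P(|X - mu| >= k*sigma).
    tail = sum(abs(x - mu) >= k * sigma for x in samples) / len(samples)
    bound = 1 / k**2  # Chebyshev's bound
    results[k] = (tail, bound)
    print(f"k={k}: empirical tail {tail:.4f} <= bound {bound:.4f}")
```

For this distribution the actual tail probabilities are far below the bound, illustrating how loose the inequality typically is.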

Another consequence of the theorem is that for any distribution with mean μ and finite standard deviation σ, at least half of the values lie in the interval (μ − √2 σ, μ + √2 σ).

Typically, the theorem provides rather loose bounds. However, the bounds given by Chebyshev's inequality cannot, in general, be improved: no sharper bound holds for every distribution with finite variance. For example, for any k > 1, the following distribution (with mean 0 and standard deviation σ = 1/k) attains the bound exactly.

[itex]\begin{matrix}P(X=-1) & = & \frac{1}{2k^2} \\ P(X=0) & = & 1 - \frac{1}{k^2} \\ P(X=1) & = & \frac{1}{2k^2} \end{matrix}[itex]
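A short sketch verifying that this three-point distribution attains the bound with equality; the particular value k = 2 is an assumption chosen for illustration.

```python
import math

# The extremal three-point distribution for k = 2, so sigma should be 1/k = 0.5.
k = 2.0
pmf = {-1.0: 1 / (2 * k**2), 0.0: 1 - 1 / k**2, 1.0: 1 / (2 * k**2)}

mean = sum(x * p for x, p in pmf.items())             # equals 0
var = sum((x - mean)**2 * p for x, p in pmf.items())  # equals 1/k^2
sigma = math.sqrt(var)                                # equals 1/k

# P(|X - mean| >= k*sigma) = P(|X| >= 1), which equals the bound 1/k^2.
tail = sum(p for x, p in pmf.items() if abs(x - mean) >= k * sigma)
print(tail, 1 / k**2)
```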

The theorem can be useful despite loose bounds because it applies to random variables of any distribution, and because these bounds can be calculated knowing no more about the distribution than the mean and variance.

Chebyshev's inequality is used for proving the weak law of large numbers.

### Variants

A one-tailed variant (also known as Cantelli's inequality), for k > 0, is

[itex]P(X-\mu \geq k\sigma)\leq\frac{1}{1+k^2}.[itex]
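The one-tailed bound can be sketched numerically in the same way; the standard exponential distribution (μ = σ = 1) and the values of k below are assumptions for illustration.

```python
import random

# Monte Carlo check of the one-tailed bound P(X - mu >= k*sigma) <= 1/(1 + k^2)
# for a standard exponential distribution (mu = 1, sigma = 1).
random.seed(1)
mu = sigma = 1.0
samples = [random.expovariate(1.0) for _ in range(100_000)]

one_tailed = {}
for k in (0.5, 1.0, 2.0):
    upper_tail = sum(x - mu >= k * sigma for x in samples) / len(samples)
    bound = 1 / (1 + k**2)
    one_tailed[k] = (upper_tail, bound)
    print(f"k={k}: upper tail {upper_tail:.4f} <= bound {bound:.4f}")
```

Note that, unlike the two-sided bound, this variant gives nontrivial information even for 0 < k ≤ 1.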

A stronger result applicable to unimodal probability distributions is the Vysochanskiï-Petunin inequality.

## Example application

For illustration, assume Wikipedia articles are on average 1000 characters long with a standard deviation of 200 characters. From Chebyshev's inequality we can then deduce that at least 75% of Wikipedia articles have a length between 600 and 1400 characters (k = 2).
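The arithmetic behind this example, written out as a small sketch:

```python
# Article-length example: mu = 1000 characters, sigma = 200 characters, k = 2.
mu, sigma, k = 1000, 200, 2

low, high = mu - k * sigma, mu + k * sigma  # interval (600, 1400)
at_least = 1 - 1 / k**2                     # Chebyshev lower bound: 0.75

print(f"At least {at_least:.0%} of articles are between {low} and {high} characters.")
```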

## Proof

Chebyshev's inequality follows from Markov's inequality applied to the non-negative random variable [itex](X-\mu)^2[itex] with [itex]a=k^2\sigma^2[itex]:

[itex]P(|X-\mu|\geq k\sigma) = P((X-\mu)^2\geq k^2\sigma^2) \leq \frac{E[(X-\mu)^2]}{k^2\sigma^2} = \frac{\sigma^2}{k^2\sigma^2} = \frac{1}{k^2}.[itex]

A direct proof shows why the resulting bounds are quite loose:

[itex]P(|X-\mu| \geq k\sigma) = \int_{\left\{\omega :\, \left|X(\omega)-\mu\right| \geq k\sigma\right\}} 1 \, dP \leq \int_{\left\{\omega :\, \left|X(\omega)-\mu\right| \geq k\sigma\right\}} \left(\frac{X-\mu}{k\sigma}\right)^2 dP \leq \frac{1}{k^2\sigma^2}\int_\Omega (X-\mu)^2\, dP = \frac{1}{k^2}.[itex]

First, the integrand 1 is replaced by [itex]\left(\frac{X-\mu}{k\sigma}\right)^2[itex], which is at least 1 on the event being integrated over; then the domain of integration is enlarged to the whole sample space. Each step can only increase the value, so any probability mass away from the mean and the boundary points ±kσ loosens the inequality.
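Each quantity in the chain above can be estimated numerically to see where the slack comes from. The choice of a standard normal distribution and k = 2 here is an assumption for illustration.

```python
import random

# Estimate the three members of the chain in the direct proof for a
# standard normal X (mu = 0, sigma = 1) and k = 2.
random.seed(2)
k, n = 2.0, 200_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]

# P(|X - mu| >= k*sigma): the left-hand side.
p_tail = sum(abs(x) >= k for x in xs) / n

# Integral of ((X - mu)/(k*sigma))^2 over the tail event only.
restricted = sum((x / k)**2 for x in xs if abs(x) >= k) / n

# Integral of ((X - mu)/(k*sigma))^2 over everything; equals 1/k^2 in the limit.
full = sum((x / k)**2 for x in xs) / n

print(p_tail, "<=", restricted, "<=", full)
```

Both inequalities hold sample-by-sample: on the tail event the integrand is at least 1, and the full integral only adds non-negative terms.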
