Markov's inequality
In probability theory, Markov's inequality gives an upper bound for the probability that a non-negative function of a random variable is greater than or equal to some positive constant. It is named after the Russian mathematician Andrey Markov.
Markov's inequality (and other similar inequalities) relates probabilities to expectations, and provides bounds that are frequently loose but still useful for the cumulative distribution function of a random variable.
Definition
Markov's inequality states that if X is a random variable and a is some positive constant, then
- <math>\textrm{Pr}(|X| \geq a) \leq \frac{\textrm{E}(|X|)}{a}.</math>
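The bound can be checked empirically by simulation. The sketch below, using an exponential random variable as an arbitrary illustrative choice, compares the empirical tail probability Pr(|X| ≥ a) with the Markov bound E(|X|)/a:

```python
import random

# Empirical check of Markov's inequality, Pr(|X| >= a) <= E(|X|)/a.
# The exponential(1) distribution is a hypothetical choice for illustration;
# any non-negative random variable would do.
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]

a = 2.0
prob = sum(1 for x in samples if x >= a) / len(samples)   # empirical Pr(X >= a)
bound = (sum(samples) / len(samples)) / a                  # empirical E(X) / a

print(prob, bound)
```

For the exponential(1) distribution the true tail probability at a = 2 is e⁻² ≈ 0.135, well below the Markov bound of 0.5, which illustrates how loose the inequality can be.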
A generalisation
Markov's inequality is one of a wider class of inequalities relating probabilities and expectations, all of which are instances of a single theorem.
Theorem
Let X be a random variable and a be some positive constant (a > 0). If
- <math>h:\mathbb{R} \rightarrow [0,\infty),</math>
then
- <math>\textrm{Pr}(h(X) \geq a) \leq \frac{\textrm{E}(h(X))}{a}.</math>
Proof
Let A be the set {x : h(x) ≥ a}, and let I_A(x) be the indicator function of A. (That is, I_A(x) = 1 if x ∈ A, and I_A(x) = 0 otherwise.) Then,
- <math>aI_A(x) \leq h(x)</math>
for all real x: if x ∈ A then h(x) ≥ a = aI_A(x), and otherwise aI_A(x) = 0 ≤ h(x), since h is non-negative.
The theorem follows by taking the expectation of both sides of this inequality, and observing that
- <math>\textrm{E}(I_A(X)) = \textrm{Pr}(h(X) \geq a).</math>
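Spelled out, the expectation step is the chain
- <math>\textrm{E}(h(X)) \geq \textrm{E}(aI_A(X)) = a\,\textrm{E}(I_A(X)) = a\,\textrm{Pr}(h(X) \geq a),</math>
and dividing both sides by the positive constant a gives the stated bound.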
Examples
- Markov's inequality is recovered by setting h(x) = |x|.
- If h(x) = x², we obtain a version of Chebyshev's inequality; applying it to X − E(X) with a² in place of a gives the usual form Pr(|X − E(X)| ≥ a) ≤ Var(X)/a².
- If X is a non-negative integer-valued random variable (as is often the case in combinatorics), then taking a = 1 in Markov's inequality gives that <math>\textrm{Pr}(X \neq 0) \leq \textrm{E}(X).</math>
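The integer-valued corollary can also be checked by simulation. The sketch below uses, as an arbitrary illustrative choice, a count of successes in 20 independent trials with success probability 0.01, and compares the empirical Pr(X ≠ 0) with the empirical mean:

```python
import random

# Empirical check of the corollary Pr(X != 0) <= E(X) for a
# non-negative integer-valued random variable. The trial count (20)
# and success probability (0.01) are hypothetical illustrative values.
random.seed(1)

def draw():
    # number of successes in 20 independent Bernoulli(0.01) trials
    return sum(1 for _ in range(20) if random.random() < 0.01)

samples = [draw() for _ in range(50_000)]
prob_nonzero = sum(1 for x in samples if x != 0) / len(samples)  # Pr(X != 0)
mean = sum(samples) / len(samples)                                # E(X)

print(prob_nonzero, mean)
```

Here E(X) = 0.2 while Pr(X ≠ 0) = 1 − 0.99²⁰ ≈ 0.182, so the bound holds and is fairly tight, as is typical when X rarely exceeds 1.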