Expected value
In probability (and especially gambling), the expected value (or expectation) of a random variable is the sum, over all possible outcomes of the experiment, of each outcome's probability multiplied by its payoff ("value"). Thus, it represents the average amount one "expects" to win per bet if bets with identical odds are repeated many times. Note that the value itself may not be "expected" in the ordinary sense; it may be unlikely or even impossible.
For example, an American roulette wheel has 38 equally likely outcomes. A bet placed on a single number pays 35-to-1 (meaning that you are paid 35 times your bet and your bet is returned, so you receive 36 times your bet). So the expected value of the profit resulting from a $1 bet on a single number is, considering all 38 possible outcomes: <math>( -$1 \times \frac{37}{38} ) + ( $35 \times \frac{1}{38} )<math>, which is about -$0.0526. Therefore one expects, on average, to lose over five cents for every dollar bet.
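As a sanity check on this arithmetic, here is a minimal Python sketch (not part of the original article; the variable names are illustrative) that computes the expected profit directly as a probability-weighted sum over the two outcomes:

```python
# Expected profit of a $1 straight-up bet on American roulette,
# computed as the probability-weighted sum of the two possible profits.
outcomes = [
    (-1.0, 37 / 38),  # lose the $1 bet: 37 losing pockets out of 38
    (35.0, 1 / 38),   # win: paid 35-to-1, so the profit is $35
]

expected_profit = sum(value * prob for value, prob in outcomes)
print(f"Expected profit per $1 bet: ${expected_profit:.4f}")  # about -0.0526
```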
Mathematical definition
In general, if <math>X<math> is a random variable defined on a probability space <math>(\Omega, P)<math>, then the expected value of <math>X<math> (denoted <math>\mathrm{E}(X)<math> or sometimes <math>\langle X \rangle<math> or <math>\mathbb{E}X<math>) is defined as
- <math>\mathrm{E}(X) = \int_\Omega X\, dP<math>
where the Lebesgue integral is employed. Note that not all random variables have an expected value, since the integral may not exist (e.g., Cauchy distribution). Two variables with the same probability distribution will have the same expected value, if it is defined.
If <math>X<math> is a discrete random variable with values <math>x_1<math>, <math>x_2<math>, ... and corresponding probabilities <math>p_1<math>, <math>p_2<math>, ... which add up to 1, then <math>\mathrm{E}(X)<math> can be computed as the sum or series
- <math>\mathrm{E}(X) = \sum_i p_i x_i<math>
as in the gambling example mentioned above.
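As a small illustration (an added sketch, not from the original text), the same discrete formula can be written as a short Python function and applied to a fair six-sided die, whose expected value is 3.5:

```python
def discrete_expectation(values, probs):
    """Expected value of a discrete random variable: the sum of p_i * x_i."""
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return sum(p * x for p, x in zip(probs, values))

# Fair six-sided die: each face 1..6 has probability 1/6, so E(X) = 3.5.
faces = [1, 2, 3, 4, 5, 6]
print(discrete_expectation(faces, [1 / 6] * 6))  # 3.5, up to rounding
```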
If the probability distribution of <math>X<math> admits a probability density function <math>f(x)<math>, then the expected value can be computed as
- <math>\mathrm{E}(X) = \int_{-\infty}^\infty x f(x)\, dx.<math>
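For a concrete continuous example (an illustrative sketch with assumed numbers, not from the original text), the exponential density <math>f(x) = \lambda e^{-\lambda x}<math> for <math>x \ge 0<math> has expected value <math>1/\lambda<math>, which a crude numerical integration reproduces:

```python
import numpy as np

# Numerically approximate E(X) = integral of x * f(x) dx for an exponential
# density f(x) = lam * exp(-lam * x) on x >= 0, whose exact mean is 1 / lam.
lam = 2.0
dx = 1e-4
x = np.arange(0.0, 50.0, dx)         # truncate the infinite upper limit
density = lam * np.exp(-lam * x)
expected = np.sum(x * density) * dx  # simple Riemann-sum approximation
print(expected)                      # close to 1 / lam = 0.5
```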
It follows directly from the discrete case definition that if <math>X<math> is a constant random variable, i.e. <math>X = b<math> for some fixed real number <math>b<math>, then the expected value of <math>X<math> is also <math>b<math>.
Properties
Linearity
The expected value operator (or expectation operator) <math>\mathrm{E}<math> is linear in the sense that
- <math>\mathrm{E}(a X + b Y) = a \mathrm{E}(X) + b \mathrm{E}(Y)<math>
for any two random variables <math>X<math> and <math>Y<math> (which need to be defined on the same probability space) and any real numbers <math>a<math> and <math>b<math>.
Functional non-invariance
In general, the expectation operator and functions of random variables do not commute; that is
- <math>\mathrm{E}(g(X)) = \int_{\Omega} g(X)\, dP \neq g(\operatorname{E}X),<math>
except in special cases, such as when <math>g<math> is an affine function, for which equality follows from the linearity property above.
Non-multiplicativity
In general, the expected value operator is not multiplicative, i.e. <math>\mathrm{E}(X Y)<math> is not necessarily equal to <math>\mathrm{E}(X) \mathrm{E}(Y)<math>, unless <math>X<math> and <math>Y<math> are independent or, more generally, uncorrelated. This lack of multiplicativity gives rise to the study of covariance and correlation.
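As a quick numerical illustration (a sketch with assumed parameters, not part of the original article), take <math>Y = X<math> for a non-constant <math>X<math>: then <math>\mathrm{E}(XY) = \mathrm{E}(X^2)<math> exceeds <math>\mathrm{E}(X)\mathrm{E}(Y)<math>, and the gap is exactly the variance of <math>X<math>, i.e. the covariance of <math>X<math> with itself:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=1_000_000)  # X ~ Normal(mean 1, sd 2)
y = x                                               # perfectly dependent: Y = X

e_xy = np.mean(x * y)              # estimates E(XY) = E(X^2) = 1 + 4 = 5
e_x_e_y = np.mean(x) * np.mean(y)  # estimates E(X) * E(Y) = 1
print(e_xy - e_x_e_y)              # about 4: Cov(X, Y) = Var(X)
```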
Iterated expectation
For any two random variables <math>X,Y<math> one may define the conditional expectation:
- <math> \mathbb{E}[X|Y](y) = \mathbb{E}[X|Y=y] = \sum_x x\, \mathbb{P}(X=x|Y=y).<math>
Then the expectation of <math>X<math> satisfies
- <math>\mathbb{E}[X] = \mathbb{E} \left( \mathbb{E}[X|Y] \right).<math>
The right hand side of this equation is referred to as the iterated expectation. This proposition is treated in law of total expectation.
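A small discrete check of this identity (an added sketch with a made-up joint distribution, not from the original article):

```python
from collections import defaultdict

# Toy joint distribution P(X = x, Y = y); the probabilities sum to 1.
joint = {(0, 0): 0.1, (1, 0): 0.3, (0, 1): 0.4, (1, 1): 0.2}

# Direct expectation: E[X] = sum over (x, y) of x * P(X=x, Y=y).
e_x = sum(x * p for (x, _y), p in joint.items())

# Iterated expectation: E[ E[X | Y] ] = sum over y of P(Y=y) * E[X | Y=y].
p_y = defaultdict(float)
num = defaultdict(float)   # for each y, the sum over x of x * P(X=x, Y=y)
for (x, y), p in joint.items():
    p_y[y] += p
    num[y] += x * p

iterated = sum(p_y[y] * (num[y] / p_y[y]) for y in p_y)

print(e_x, iterated)   # both approximately 0.5: E[X] = E[E[X | Y]]
```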
Uses and applications of the expected value
The expected values of the powers of <math>X<math> are called the moments of <math>X<math>; the moments about the mean of <math>X<math> are expected values of powers of <math>X - \mathrm{E}(X)<math>. The moments of some random variables can be used to specify their distributions, via their moment generating functions.
To empirically estimate the expected value of a random variable, one repeatedly measures observations of the variable and computes the arithmetic mean of the results. This estimates the true expected value in an unbiased manner and has the property of minimizing the sum of the squares of the residuals (the sum of the squared differences between the observations and the estimate). The law of large numbers demonstrates that (under fairly mild conditions) as the size of the sample gets larger, the variance of this estimate gets smaller.
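A brief simulation sketch (added for illustration; the die example and random seed are assumptions) showing the sample mean settling toward the true expected value as the sample grows:

```python
import numpy as np

rng = np.random.default_rng(42)
true_mean = 3.5  # expected value of a fair six-sided die

for n in (10, 1_000, 100_000):
    rolls = rng.integers(1, 7, size=n)  # n simulated die rolls (faces 1..6)
    estimate = rolls.mean()             # arithmetic mean of the sample
    print(n, estimate, abs(estimate - true_mean))  # error shrinks as n grows
```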
In classical mechanics, the center of mass is an analogous concept to expectation. For example, suppose <math>X<math> is a discrete random variable with values <math>x_i<math> and corresponding probabilities <math>p_i<math>. Now consider a weightless rod on which are placed weights, at locations <math>x_i<math> along the rod and having masses <math>p_i<math> (whose sum is one). The point at which the rod balances (its center of gravity) is <math>\mathrm{E}(X)<math>.
Expectation of matrices
If <math>X<math> is an <math>m \times n<math> matrix whose entries are random variables, then the expected value of the matrix is the matrix of the expected values of its entries:
- <math>
\mathrm{E}[X] = \mathrm{E} \begin{bmatrix}
 x_{1,1} & x_{1,2} & \cdots & x_{1,n} \\
 x_{2,1} & x_{2,2} & \cdots & x_{2,n} \\
 \vdots  & \vdots  & \ddots & \vdots  \\
 x_{m,1} & x_{m,2} & \cdots & x_{m,n}
\end{bmatrix} = \begin{bmatrix}
 \mathrm{E}(x_{1,1}) & \mathrm{E}(x_{1,2}) & \cdots & \mathrm{E}(x_{1,n}) \\
 \mathrm{E}(x_{2,1}) & \mathrm{E}(x_{2,2}) & \cdots & \mathrm{E}(x_{2,n}) \\
 \vdots              & \vdots              & \ddots & \vdots              \\
 \mathrm{E}(x_{m,1}) & \mathrm{E}(x_{m,2}) & \cdots & \mathrm{E}(x_{m,n})
\end{bmatrix} <math>
This property is used, for example, in the definition of covariance matrices.
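As an illustrative sketch (the dimensions and per-entry means here are assumptions, not from the original article), the element-wise character of matrix expectation can be checked by averaging many independent draws of a random matrix:

```python
import numpy as np

rng = np.random.default_rng(7)

# X is a 2 x 3 matrix whose entries are independent normal random variables
# with the per-entry means below; E[X] should reproduce this mean matrix.
mean_matrix = np.array([[0.0, 1.0, 2.0],
                        [3.0, 4.0, 5.0]])

samples = rng.normal(loc=mean_matrix, scale=1.0, size=(100_000, 2, 3))
print(samples.mean(axis=0))  # element-wise sample means, close to mean_matrix
```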
See also
- An inequality on location and scale parameters.
- Expected value is also a key concept in economics and finance.
- The general term expectation.