Moment-generating function
In probability theory and statistics, the moment-generating function of a random variable X is
- <math>M_X(t)=E\left(e^{tX}\right), \quad t \in \mathbb{R},</math>
wherever this expectation exists. The moment-generating function generates the moments of the probability distribution, as follows. Provided the moment-generating function exists in an open interval around t = 0,
- <math>E\left(X^n\right)=M_X^{(n)}(0)=\left.\frac{\mathrm{d}^n}{\mathrm{d}t^n}\right|_{t=0} M_X(t).</math>
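As an illustration of this formula (not part of the original text), the short sketch below uses sympy and an exponential distribution with rate λ, whose moment-generating function is λ/(λ − t) for t < λ, and recovers the moments n!/λ^n by differentiating at t = 0. The distribution, the symbol names, and the use of sympy are choices made for the example.
```python
# Illustrative sketch (sympy assumed): recover E[X^n] = M_X^{(n)}(0) for an
# exponential distribution with rate lambda, whose MGF is lambda/(lambda - t)
# for t < lambda.  The expected output is n!/lambda**n.
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)

M = lam / (lam - t)          # MGF of Exp(lambda), valid for t < lambda

for n in range(1, 4):
    moment = sp.simplify(sp.diff(M, t, n).subs(t, 0))
    print(n, moment)         # 1/lambda, 2/lambda**2, 6/lambda**3
```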
If X has a continuous probability density function f(x), then the moment-generating function is given by
- <math>M_X(t) = \int_{-\infty}^\infty e^{tx} f(x)\,\mathrm{d}x</math>
- <math> = \int_{-\infty}^\infty \left( 1 + tx + \frac{t^2x^2}{2!} + \cdots\right) f(x)\,\mathrm{d}x</math>
- <math> = 1 + tm_1 + \frac{t^2m_2}{2!} + \cdots,</math>
where <math>m_i</math> is the ith moment.
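The series expansion above can also be read in reverse: the nth moment is n! times the coefficient of t^n in the Taylor expansion of the moment-generating function. A minimal sketch of this, where the standard normal distribution and the use of sympy are assumptions made for the example:
```python
# Illustrative sketch: read moments off the Taylor expansion
# M_X(t) = 1 + m_1 t + m_2 t^2/2! + ...   For a standard normal variable the
# MGF is exp(t**2/2), so m_2 = 1, m_4 = 3, m_6 = 15 and the odd moments vanish.
import sympy as sp

t = sp.symbols('t')
M = sp.exp(t**2 / 2)                       # MGF of N(0, 1)

poly = sp.series(M, t, 0, 7).removeO()     # Taylor polynomial about t = 0
for n in range(7):
    m_n = sp.factorial(n) * poly.coeff(t, n)
    print(n, m_n)                          # 1, 0, 1, 0, 3, 0, 15
```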
Regardless of whether the probability distribution is continuous or not, the moment-generating function is given by the Riemann–Stieltjes integral
- <math>M_X(t) = \int_{-\infty}^\infty e^{tx}\,\mathrm{d}F(x),</math>
where F is the cumulative distribution function.
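For a purely discrete distribution this integral reduces to a sum over the support, <math>\sum_x e^{tx}\,P(X=x)</math>. A minimal sketch, with a Bernoulli(p) variable and sympy used as assumed choices for the example:
```python
# Illustrative sketch: for a discrete X the Riemann-Stieltjes integral becomes
# sum_x exp(t*x) * P(X = x).  A Bernoulli(p) variable is used as the example.
import sympy as sp

t, p = sp.symbols('t p', positive=True)

pmf = {0: 1 - p, 1: p}                              # x -> P(X = x)
M = sum(sp.exp(t * x) * prob for x, prob in pmf.items())
print(sp.simplify(M))                               # p*exp(t) - p + 1, the Bernoulli MGF

# The first derivative at t = 0 recovers the mean E[X] = p.
print(sp.simplify(sp.diff(M, t).subs(t, 0)))        # p
```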
If <math>X_1, X_2, \ldots, X_n</math> is a sequence of independent (and not necessarily identically distributed) random variables, and
- <math>S_n = \sum_{i=1}^n a_i X_i,</math>
where the <math>a_i</math> are constants, then the probability density function of <math>S_n</math> is the convolution of the probability density functions of the <math>a_i X_i</math>, and the moment-generating function of <math>S_n</math> is given by
- <math>M_{S_n}(t)=M_{X_1}(a_1t)M_{X_2}(a_2t)\cdots M_{X_n}(a_nt).</math>
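A hedged sketch of this product rule, in which the choice of two independent normal variables, the weights a1 and a2, and the use of sympy are assumptions made for the example; the product of the individual MGFs evaluated at a_i t reproduces the mean and variance of the weighted sum:
```python
# Illustrative sketch: for independent X_1, X_2 the MGF of S = a1*X_1 + a2*X_2
# is M_{X_1}(a1*t) * M_{X_2}(a2*t).  Normal variables are used as the example,
# with MGF exp(mu*t + sigma**2 * t**2 / 2).
import sympy as sp

t, a1, a2 = sp.symbols('t a1 a2')
mu1, mu2 = sp.symbols('mu1 mu2', real=True)
s1, s2 = sp.symbols('sigma1 sigma2', positive=True)

def normal_mgf(u, mu, sigma):
    """MGF of N(mu, sigma**2) evaluated at u."""
    return sp.exp(mu * u + sigma**2 * u**2 / 2)

M_S = normal_mgf(a1 * t, mu1, s1) * normal_mgf(a2 * t, mu2, s2)

mean = sp.diff(M_S, t).subs(t, 0)
second = sp.diff(M_S, t, 2).subs(t, 0)
print(sp.simplify(mean))                           # a1*mu1 + a2*mu2
print(sp.simplify(sp.expand(second - mean**2)))    # a1**2*sigma1**2 + a2**2*sigma2**2
```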
Related to the moment-generating function are a number of other transforms that are common in probability theory, including the characteristic function and the probability-generating function.
The cumulant-generating function is the logarithm of the moment-generating function.
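For instance (a sketch using a Poisson distribution and sympy as assumed choices, not taken from the text), differentiating the cumulant-generating function K_X(t) = log M_X(t) at t = 0 gives the cumulants, all of which equal λ for a Poisson(λ) variable:
```python
# Illustrative sketch: the cumulant-generating function is K_X(t) = log M_X(t);
# its derivatives at t = 0 are the cumulants.  For a Poisson(lambda) variable
# every cumulant equals lambda.
import sympy as sp

t = sp.symbols('t', real=True)
lam = sp.symbols('lambda', positive=True)

M = sp.exp(lam * (sp.exp(t) - 1))    # MGF of Poisson(lambda)
K = sp.log(M)                        # cumulant-generating function

for n in range(1, 4):
    print(n, sp.simplify(sp.diff(K, t, n).subs(t, 0)))   # lambda each time
```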