In statistics, the Cramér-Rao inequality, named in honor of Harald Cramér and Calyampudi Radhakrishna Rao, states that the reciprocal of the Fisher information, <math>\mathcal{I}(\theta)</math>, of a parameter <math>\theta</math>, is a lower bound on the variance of an unbiased estimator of the parameter (denoted <math>\hat{\theta}</math>).

<math>
\mathrm{var} \left(\hat{\theta}\right) \geq \frac{1}{\mathcal{I}(\theta)} = \frac{1}{
\mathrm{E}
\left[
 \left[
  \frac{\partial}{\partial\theta} \log f(X;\theta)
 \right]^2
\right]
}
</math>

In some cases, no unbiased estimator exists that attains the lower bound.

The Cramér-Rao inequality is also known as the Cramér-Rao bound (CRB) or Cramér-Rao lower bound (CRLB), because it places a lower bound on the variance of an estimator <math>\hat{\theta}</math>.
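
As a numerical illustration (a Monte Carlo sketch added here, with arbitrary parameter values): for <math>n</math> independent observations from <math>N(\theta, \sigma^2)</math> with known <math>\sigma^2</math>, the Fisher information is <math>\mathcal{I}(\theta) = n/\sigma^2</math>, and the sample mean is an unbiased estimator whose variance equals the bound <math>\sigma^2/n</math>.

import numpy as np

rng = np.random.default_rng(0)

theta, sigma, n, reps = 2.0, 1.5, 25, 200_000   # arbitrary example values
crlb = sigma**2 / n                             # 1 / I(theta), since I(theta) = n / sigma^2

# Simulate many data sets and look at the variance of the sample mean across them
samples = rng.normal(theta, sigma, size=(reps, n))
var_mean = samples.mean(axis=1).var()

print(f"CRLB             : {crlb:.5f}")
print(f"var(sample mean) : {var_mean:.5f}")     # should be close to the CRLB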


Regularity conditions

This inequality relies on two weak regularity conditions on the probability density function, <math>f(x; \theta)</math>, and the estimator <math>T(X)</math>:

  • The Fisher information is always defined; equivalently, for all <math>x</math> such that <math>f(x; \theta) > 0</math>,
<math> \frac{\partial}{\partial\theta} \ln f(x;\theta)</math>
is finite.
  • The operations of integration with respect to <math>x</math> and differentiation with respect to <math>\theta</math> can be interchanged in the expectation of <math>T</math>; that is,
<math>
\frac{\partial}{\partial\theta}
\left[
 \int T(x) f(x;\theta) \,dx
\right]
=
\int T(x)
 \left[
  \frac{\partial}{\partial\theta} f(x;\theta)
 \right]
\,dx
</math>

whenever the right-hand side is finite.
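
This interchange can be checked numerically for a specific model. The sketch below (an added illustration, not part of the original article) uses the hypothetical example in which <math>f(x;\theta)</math> is the <math>N(\theta, 1)</math> density and <math>T(x) = x</math>, so both sides should equal <math>\frac{\partial}{\partial\theta}\mathrm{E}[X] = 1</math>; it uses SciPy quadrature and a finite-difference derivative.

import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Hypothetical example: f(x; theta) is the N(theta, 1) density and T(x) = x.
def expectation_of_T(theta):
    # integral of T(x) f(x; theta) dx  (equals theta for this model)
    return quad(lambda x: x * norm.pdf(x, loc=theta), -np.inf, np.inf)[0]

theta, h = 0.7, 1e-4   # arbitrary parameter value; step for the finite-difference derivative

# Left-hand side: differentiate the integral with respect to theta
lhs = (expectation_of_T(theta + h) - expectation_of_T(theta - h)) / (2 * h)

# Right-hand side: integrate T(x) times the derivative of f with respect to theta
rhs = quad(lambda x: x * (norm.pdf(x, theta + h) - norm.pdf(x, theta - h)) / (2 * h),
           -np.inf, np.inf)[0]

print(lhs, rhs)   # both should be close to 1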

In some cases, a biased estimator can have both a variance and a mean squared error that are below the Cramér-Rao lower bound (the lower bound applies only to estimators that are unbiased). See bias (statistics).
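
The following sketch (an added illustration with arbitrary values, not part of the original article) shows this with a simple shrinkage estimator of a normal mean: multiplying the sample mean by a hypothetical factor <math>a < 1</math> always lowers the variance below <math>\sigma^2/n</math>, and when the true <math>\theta</math> is small the mean squared error is below the bound as well.

import numpy as np

rng = np.random.default_rng(1)

theta, sigma, n, reps = 0.5, 1.0, 20, 200_000   # arbitrary example values
a = 0.8                                         # hypothetical shrinkage factor
crlb = sigma**2 / n

samples = rng.normal(theta, sigma, size=(reps, n))
estimates = a * samples.mean(axis=1)            # biased estimator: shrink the sample mean toward 0

print(f"CRLB     : {crlb:.5f}")
print(f"variance : {estimates.var():.5f}")                   # a^2 sigma^2 / n, always below the CRLB
print(f"MSE      : {np.mean((estimates - theta)**2):.5f}")   # also below the CRLB for this theta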

If the second regularity condition extends to the second derivative, then an alternative form of the Fisher information can be used, yielding an equivalent form of the Cramér-Rao inequality

<math>
\mathrm{var} \left(\hat{\theta}\right) \geq \frac{1}{\mathcal{I}(\theta)} = \frac{1}{
-\mathrm{E}
\left[
 \frac{\partial^2}{\partial\theta^2} \log f(X;\theta)
\right]
}
</math>

In some cases, it may be easier to take the expectation with respect to the second derivative than to take the expectation of the square of the first derivative.
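
For example (a standard worked case, added here for concreteness), suppose <math>X</math> is a single Poisson observation with mean <math>\theta</math>, so that <math>\log f(x;\theta) = x \log\theta - \theta - \log x!</math>. Then

<math>\frac{\partial}{\partial\theta} \log f(x;\theta) = \frac{x}{\theta} - 1, \qquad \frac{\partial^2}{\partial\theta^2} \log f(x;\theta) = -\frac{x}{\theta^2}.</math>

The first form of the Fisher information is <math>\mathrm{E}\left[\left(\frac{X}{\theta} - 1\right)^2\right] = \frac{\mathrm{var}(X)}{\theta^2} = \frac{1}{\theta}</math>, while the second form is <math>-\mathrm{E}\left[-\frac{X}{\theta^2}\right] = \frac{\mathrm{E}(X)}{\theta^2} = \frac{1}{\theta}</math>; the two agree, but the second requires only the mean of <math>X</math> rather than a second moment.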

Multiparameter

To extend the Cramér-Rao inequality to multiple parameters, define a parameter column vector

<math>\boldsymbol{\theta} = \left[ \theta_1, \theta_2, \dots, \theta_d \right]^T \in \mathbb{R}^d</math>

with probability density function (pdf) <math>f(x; \boldsymbol{\theta})</math> that satisfies the above two regularity conditions.

The Fisher information matrix is a <math>d \times d</math> matrix with element <math>\mathcal{I}_{m, k}</math> defined as

<math>
\mathcal{I}_{m, k} = \mathrm{E} \left[
\frac{\partial}{\partial\theta_m} \log f\left(x; \boldsymbol{\theta}\right)
\frac{\partial}{\partial\theta_k} \log f\left(x; \boldsymbol{\theta}\right)
\right]
</math>

Then the Cramér-Rao inequality is

<math>
\mathrm{cov}_{\boldsymbol{\theta}}\left(\boldsymbol{T}(X)\right) \geq
\frac{\partial \boldsymbol{\psi} \left(\boldsymbol{\theta}\right)}{\partial \boldsymbol{\theta}^T}
\mathcal{I}\left(\boldsymbol{\theta}\right)^{-1}
\frac{\partial \boldsymbol{\psi}\left(\boldsymbol{\theta}\right)^T}{\partial \boldsymbol{\theta}}
</math>

where

  • <math>\boldsymbol{T}(X) = \begin{bmatrix} T_1(X) & T_2(X) & \cdots & T_d(X) \end{bmatrix}^T</math>

  • <math>\boldsymbol{\psi}\left(\boldsymbol{\theta}\right) = \mathrm{E}\left[\boldsymbol{T}(X)\right] = \begin{bmatrix}
\psi_1\left(\boldsymbol{\theta}\right) &
\psi_2\left(\boldsymbol{\theta}\right) &
\cdots &
\psi_d\left(\boldsymbol{\theta}\right)
\end{bmatrix}^T</math>

  • <math>\frac{\partial \boldsymbol{\psi}\left(\boldsymbol{\theta}\right)}{\partial \boldsymbol{\theta}^T}
= \begin{bmatrix}
\psi_1 \left(\boldsymbol{\theta}\right) \\
\psi_2 \left(\boldsymbol{\theta}\right) \\
\vdots \\
\psi_d \left(\boldsymbol{\theta}\right)
\end{bmatrix} \begin{bmatrix}
\frac{\partial}{\partial \theta_1} &
\frac{\partial}{\partial \theta_2} &
\cdots &
\frac{\partial}{\partial \theta_d}
\end{bmatrix} = \begin{bmatrix}
\frac{\partial \psi_1 \left(\boldsymbol{\theta}\right)}{\partial \theta_1} & \frac{\partial \psi_1 \left(\boldsymbol{\theta}\right)}{\partial \theta_2} & \cdots & \frac{\partial \psi_1 \left(\boldsymbol{\theta}\right)}{\partial \theta_d} \\
\frac{\partial \psi_2 \left(\boldsymbol{\theta}\right)}{\partial \theta_1} & \frac{\partial \psi_2 \left(\boldsymbol{\theta}\right)}{\partial \theta_2} & \cdots & \frac{\partial \psi_2 \left(\boldsymbol{\theta}\right)}{\partial \theta_d} \\
\vdots & \vdots & \ddots & \vdots \\
\frac{\partial \psi_d \left(\boldsymbol{\theta}\right)}{\partial \theta_1} & \frac{\partial \psi_d \left(\boldsymbol{\theta}\right)}{\partial \theta_2} & \cdots & \frac{\partial \psi_d \left(\boldsymbol{\theta}\right)}{\partial \theta_d}
\end{bmatrix}</math>

  • <math>\frac{\partial \boldsymbol{\psi}\left(\boldsymbol{\theta}\right)^T}{\partial \boldsymbol{\theta}} = \begin{bmatrix}
\frac{\partial}{\partial \theta_1} \\
\frac{\partial}{\partial \theta_2} \\
\vdots \\
\frac{\partial}{\partial \theta_d}
\end{bmatrix} \begin{bmatrix}
\psi_1 \left(\boldsymbol{\theta}\right) &
\psi_2 \left(\boldsymbol{\theta}\right) &
\cdots &
\psi_d \left(\boldsymbol{\theta}\right)
\end{bmatrix} = \begin{bmatrix}
\frac{\partial \psi_1 \left(\boldsymbol{\theta}\right)}{\partial \theta_1} & \frac{\partial \psi_2 \left(\boldsymbol{\theta}\right)}{\partial \theta_1} & \cdots & \frac{\partial \psi_d \left(\boldsymbol{\theta}\right)}{\partial \theta_1} \\
\frac{\partial \psi_1 \left(\boldsymbol{\theta}\right)}{\partial \theta_2} & \frac{\partial \psi_2 \left(\boldsymbol{\theta}\right)}{\partial \theta_2} & \cdots & \frac{\partial \psi_d \left(\boldsymbol{\theta}\right)}{\partial \theta_2} \\
\vdots & \vdots & \ddots & \vdots \\
\frac{\partial \psi_1 \left(\boldsymbol{\theta}\right)}{\partial \theta_d} & \frac{\partial \psi_2 \left(\boldsymbol{\theta}\right)}{\partial \theta_d} & \cdots & \frac{\partial \psi_d \left(\boldsymbol{\theta}\right)}{\partial \theta_d}
\end{bmatrix}</math>

Here the matrix inequality <math>A \geq B</math> is understood to mean that <math>A - B</math> is positive semidefinite, that is,

<math> x^{T} \left( A - B \right) x \geq 0 \quad \forall x \in \mathbb{R}^d</math>

If <math>\boldsymbol{T}(X) = \begin{bmatrix} T_1(X) & T_2(X) & \cdots & T_d(X) \end{bmatrix}^T</math> is an unbiased estimator (i.e., <math>\boldsymbol{\psi}\left(\boldsymbol{\theta}\right) = \boldsymbol{\theta}</math>), then the Cramér-Rao inequality is

<math>
\mathrm{cov}_{\boldsymbol{\theta}}\left(\boldsymbol{T}(X)\right) \geq \mathcal{I}\left(\boldsymbol{\theta}\right)^{-1}
</math>
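
As an illustration of the matrix form (a Monte Carlo sketch with arbitrary values, added here, not part of the original article): for <math>n</math> i.i.d. observations from <math>N(\mu, \sigma^2)</math> with <math>\boldsymbol{\theta} = (\mu, \sigma^2)^T</math>, the Fisher information matrix is <math>\mathcal{I}(\boldsymbol{\theta}) = \mathrm{diag}\left(n/\sigma^2,\; n/(2\sigma^4)\right)</math>, and the covariance of the jointly unbiased estimator <math>(\bar{x}, s^2)</math> should dominate <math>\mathcal{I}(\boldsymbol{\theta})^{-1}</math>; the sample mean attains its bound while the unbiased sample variance does not.

import numpy as np

rng = np.random.default_rng(2)

mu, sigma2, n, reps = 1.0, 2.0, 10, 300_000     # arbitrary example values
samples = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))

# Jointly unbiased estimators T(X) = (sample mean, unbiased sample variance)
T = np.column_stack([samples.mean(axis=1), samples.var(axis=1, ddof=1)])
cov_T = np.cov(T, rowvar=False)

# Fisher information matrix for (mu, sigma^2) and its inverse (the matrix CRLB)
fim = np.diag([n / sigma2, n / (2 * sigma2**2)])
crlb = np.linalg.inv(fim)

print("cov(T):\n", cov_T)
print("CRLB:\n", crlb)
# cov(T) - CRLB should be (approximately) positive semidefinite
print("eigenvalues of cov(T) - CRLB:", np.linalg.eigvalsh(cov_T - crlb))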

Single-parameter proof

First, a more general version of the inequality will be proven; namely, that if the expectation of <math>T</math> is denoted by <math>\psi (\theta)</math>, then for all <math>\theta</math>

<math>{\rm var}(t(X)) \geq \frac{[\psi^\prime(\theta)]^2}{I(\theta)}</math>

The Cramér-Rao inequality will then follow as a consequence.

Let <math>X</math> be a random variable with probability density function <math>f(x; \theta)</math>. Here <math>T = t(X)</math> is a statistic, which is used as an estimator for <math>\theta</math>. If <math>V</math> is the score, i.e.

<math>V = \frac{\partial}{\partial\theta} \log f(X;\theta)</math>

then the expectation of <math>V</math>, written <math>{\rm E}(V)</math>, is zero: differentiating the identity <math>\int f(x;\theta)\,dx = 1</math> with respect to <math>\theta</math> and interchanging derivative and integral gives <math>{\rm E}(V) = \int \frac{\partial}{\partial\theta} f(x;\theta)\,dx = 0</math>. If we consider the covariance <math>{\rm cov}(V, T)</math> of <math>V</math> and <math>T</math>, we have <math>{\rm cov}(V, T) = {\rm E}(V T)</math>, because <math>{\rm E}(V) = 0</math>. Expanding this expression we have

<math>
{\rm cov}(V,T) = {\rm E} \left(
T \cdot \frac{\partial}{\partial\theta} \ln f(X;\theta)
\right)
</math>

This may be expanded using the chain rule

<math>\frac{\partial}{\partial\theta} \ln Q = \frac{1}{Q}\frac{\partial Q}{\partial\theta}</math>

and the definition of expectation gives, after cancelling <math>f(x; \theta)<math>,

<math>
{\rm E} \left(
T \cdot \frac{\partial}{\partial\theta} \ln f(X;\theta)
\right) = \int
t(x)
\left[
 \frac{\partial}{\partial\theta} f(x;\theta)
\right]
\, dx = \frac{\partial}{\partial\theta} \left[
\int t(x)f(x;\theta)\,dx
\right] = \psi^\prime(\theta)
</math>

because the integration and differentiation operations commute (second condition).

The Cauchy-Schwarz inequality shows that

<math>
\sqrt{ {\rm var} (T) {\rm var} (V)} \geq {\rm cov}(V,T) = \psi^\prime (\theta)
</math>

therefore, since <math>{\rm var}(V) = {\rm E}(V^2) = I(\theta)</math> (using <math>{\rm E}(V) = 0</math>),

<math>
{\rm var\ } T \geq \frac{[\psi^\prime(\theta)]^2}{{\rm var} (V)} = \frac{[\psi^\prime(\theta)]^2}{I(\theta)} = \left[
\frac{\partial}{\partial\theta}
{\rm E} (T)
\right]^2 \frac{1}{I(\theta)}
</math>

Q.E.D.

If <math>T</math> is an unbiased estimator of <math>\theta</math>, that is, <math>{\rm E}(T) = \theta</math>, then <math>\psi'(\theta) = 1</math>; the inequality then becomes

<math>
{\rm var}(T) \geq \frac{1}{I(\theta)}
</math>

This is the Cramér-Rao inequality.

The efficiency of <math>T</math> is defined as

<math>e(T) = \frac{\frac{1}{I(\theta)}}{{\rm var}(T)}</math>

or the minimum possible variance for an unbiased estimator divided by its actual variance. The Cramér-Rao lower bound thus gives <math>e(T) \le 1</math>.
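
As an illustrative example (added here, with arbitrary parameter choices): for <math>n</math> i.i.d. observations from <math>N(\theta, \sigma^2)</math>, the sample median is an unbiased estimator of <math>\theta</math> with large-sample variance <math>\pi\sigma^2/(2n)</math>, so its efficiency is approximately <math>2/\pi \approx 0.64</math>. The Monte Carlo sketch below estimates this efficiency.

import numpy as np

rng = np.random.default_rng(3)

theta, sigma, n, reps = 0.0, 1.0, 101, 200_000   # arbitrary example values
crlb = sigma**2 / n                              # 1 / I(theta)

samples = rng.normal(theta, sigma, size=(reps, n))
medians = np.median(samples, axis=1)

efficiency = crlb / medians.var()
print(f"estimated efficiency of the sample median: {efficiency:.3f}")  # roughly 2/pi ~ 0.64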

Multivariate normal distribution

For the case of a d-variate normal distribution

<math>
\boldsymbol{x} \sim N_d \left(
\boldsymbol{\mu} \left( \boldsymbol{\theta} \right)
,
C \left( \boldsymbol{\theta} \right)
\right)
</math>

with a probability density function

<math>
f\left( \boldsymbol{x}; \boldsymbol{\theta} \right) = \frac{1}{\sqrt{ (2\pi)^d \left| C \right| }} \exp \left(
-\frac{1}{2}
\left(
 \boldsymbol{x} - \boldsymbol{\mu}
\right)^{T}
C^{-1}
\left(
 \boldsymbol{x} - \boldsymbol{\mu}
\right)
\right).
</math>

The Fisher information matrix has elements

<math>
\mathcal{I}_{m, k} = \frac{\partial \boldsymbol{\mu}^T}{\partial \theta_m} C^{-1} \frac{\partial \boldsymbol{\mu}}{\partial \theta_k} + \frac{1}{2} \mathrm{tr} \left(
C^{-1}
\frac{\partial C}{\partial \theta_m}
C^{-1}
\frac{\partial C}{\partial \theta_k}
\right)
</math>

where "tr" is the trace.
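
As an illustration of this formula (a sketch with a hypothetical model, added here): consider a DC level <math>A</math> in white Gaussian noise, <math>x[n] = A + w[n]</math> for <math>n = 0, \dots, N-1</math>, with parameters <math>\boldsymbol{\theta} = (A, \sigma^2)^T</math>, so that <math>\boldsymbol{\mu} = A \boldsymbol{1}</math> and <math>C = \sigma^2 I</math>. For this model the formula reduces to <math>\mathcal{I} = \mathrm{diag}\left(N/\sigma^2,\; N/(2\sigma^4)\right)</math>, which the NumPy sketch below reproduces term by term.

import numpy as np

# Hypothetical model: x[n] = A + w[n], w ~ N(0, sigma2 * I), parameters theta = (A, sigma2)
N, A, sigma2 = 8, 1.3, 0.7          # arbitrary example values

mu = A * np.ones(N)                 # mean vector mu(theta)
C = sigma2 * np.eye(N)              # covariance matrix C(theta)
Cinv = np.linalg.inv(C)

# Derivatives of mu and C with respect to (A, sigma2)
dmu = [np.ones(N), np.zeros(N)]
dC = [np.zeros((N, N)), np.eye(N)]

# General Gaussian Fisher information formula, element by element
fim = np.empty((2, 2))
for m in range(2):
    for k in range(2):
        fim[m, k] = dmu[m] @ Cinv @ dmu[k] + 0.5 * np.trace(Cinv @ dC[m] @ Cinv @ dC[k])

print(fim)                                           # computed from the formula
print(np.diag([N / sigma2, N / (2 * sigma2**2)]))    # known closed form for this model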

Let <math>w[n]</math> be white Gaussian noise (a sample of <math>N</math> independent observations) with variance <math>\sigma^2</math>

<math>w[n] \sim N_N \left( 0, \sigma^2 I \right).</math>

Then the Fisher information matrix is 1 × 1

<math>
\mathcal{I}(\sigma^2) = \frac{1}{2} \mathrm{tr} \left(
C^{-1}
\frac{\partial C}{\partial \sigma^2}
C^{-1}
\frac{\partial C}{\partial \sigma^2}
\right) = \frac{1}{2 \sigma^4} \mathrm{tr} \left(I\right) = \frac{N}{2 \sigma^4},
</math>

and so the Cramér-Rao inequality for any unbiased estimator <math>\hat{\sigma}^2</math> of <math>\sigma^2</math> is

<math>
\mathrm{var}\left(\hat{\sigma}^2\right) \geq \frac{2 \sigma^4}{N}.
</math>
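
A quick Monte Carlo check of this bound (a sketch with arbitrary values, added here): since the mean is known to be zero, <math>\hat{\sigma}^2 = \frac{1}{N}\sum_{n} w[n]^2</math> is unbiased, and its variance equals <math>2\sigma^4/N</math>, so it attains the bound.

import numpy as np

rng = np.random.default_rng(4)

sigma2, N, reps = 3.0, 50, 200_000                    # arbitrary example values
w = rng.normal(0.0, np.sqrt(sigma2), size=(reps, N))

sigma2_hat = np.mean(w**2, axis=1)                    # unbiased since the mean is known to be zero
print(f"var(sigma2_hat) : {sigma2_hat.var():.4f}")
print(f"CRLB 2 sigma^4/N: {2 * sigma2**2 / N:.4f}")   # the two numbers should be close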
