Chi-square distribution


For any positive integer <math>k</math>, the chi-square distribution with <math>k</math> degrees of freedom is the probability distribution of the random variable

<math>X=Z_1^2 + \cdots + Z_k^2</math>

where the <math>Z_i</math> are independent standard normal variables (zero expected value and unit variance). This distribution is usually written

<math>X\sim\chi^2_k</math>

The chi-square test can be used to test independence as well as goodness of fit.

An example of a test of independence is asking whether sex and political affiliation are associated. You gather a sample, compute the expected counts under the null hypothesis of independence, find the critical value for the appropriate number of degrees of freedom, and compare: if the chi-square statistic exceeds the critical value, you reject the null hypothesis; otherwise, you fail to reject it (you never accept the null).
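The procedure above can be sketched in Python. The counts, the row/column labels, and the 5% critical value (3.841 for one degree of freedom) are illustrative assumptions, not data from any real survey:

```python
# Chi-square test of independence on a hypothetical 2x2 table
# (sex vs. political affiliation); all counts are made up.
observed = [[30, 20],   # e.g. male:   party A, party B
            [20, 30]]   # e.g. female: party A, party B

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

# Expected counts under the null hypothesis of independence:
# E[i][j] = row_total[i] * col_total[j] / grand_total
expected = [[r * c / grand_total for c in col_totals] for r in row_totals]

# Test statistic: sum over all cells of (observed - expected)^2 / expected
chi2 = sum((o - e) ** 2 / e
           for o_row, e_row in zip(observed, expected)
           for o, e in zip(o_row, e_row))

# 5% critical value for a 2x2 table, i.e. (2-1)*(2-1) = 1 degree of freedom
critical = 3.841
reject_null = chi2 > critical
print(chi2, reject_null)
```

For this made-up table the statistic is 4.0, which exceeds 3.841, so the null hypothesis of independence would be rejected at the 5% level.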

The chi-square probability density function is


<math>f_k(x)= \frac{(1/2)^{k/2}}{\Gamma(k/2)} x^{k/2 - 1} e^{-x/2}</math>

where <math>x \ge 0</math> and <math>f_k(x) = 0</math> for <math>x < 0</math>. Here <math>\Gamma</math> denotes the Gamma function. The cumulative distribution function is:


<math>F_k(x)=\frac{\gamma(k/2,\,x/2)}{\Gamma(k/2)}</math>

where <math>\gamma(k,z)</math> is the lower incomplete Gamma function.

Tables of this distribution — usually in its cumulative form — are widely available (see the External links below for online versions), and the function is included in many spreadsheets (for example Calc or Microsoft Excel) and all statistical packages.

If <math>p</math> independent linear homogeneous constraints are imposed on these variables, the distribution of <math>X</math> conditional on these constraints is <math>\chi^2_{k-p}</math>, justifying the term "degrees of freedom". The characteristic function of the chi-square distribution is

<math>\phi(t)=\left(1-2it\right)^{-k/2}</math>


The chi-square distribution has numerous applications in inferential statistics, for instance in chi-square tests and in estimating variances. It enters the problem of estimating the mean of a normally distributed population and the problem of estimating the slope of a regression line via its role in Student's t-distribution. It enters all analysis of variance problems via its role in the F-distribution, which is the distribution of the ratio of two independent chi-square random variables.



The normal approximation

If <math>X\sim\chi^2_k</math>, then as <math>k</math> tends to infinity, the distribution of <math>X</math> tends to normality. However, the tendency is slow (the skewness is <math>\sqrt{8/k}</math> and the excess kurtosis is <math>12/k</math>) and two transformations are commonly considered, each of which approaches normality faster than <math>X</math> itself:

Fisher showed that <math>\sqrt{2X}</math> is approximately normally distributed with mean <math>\sqrt{2k-1}</math> and unit variance.

Wilson and Hilferty showed in 1931 that <math>\sqrt[3]{X/k}</math> is approximately normally distributed with mean <math>1-2/(9k)</math> and variance <math>2/(9k)</math>.
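A quick numeric comparison of the two transformations, sketched in Python under stated assumptions: the choice of k = 100 and x = 120 is arbitrary, and the "exact" value uses the closed-form CDF available when k is even.

```python
import math

def chi2_cdf_even(x, k):
    """Exact chi-square CDF for even k, via the closed form
    P(X > x) = e^{-x/2} * sum_{j=0}^{k/2-1} (x/2)^j / j!"""
    tail = math.exp(-x / 2) * sum((x / 2) ** j / math.factorial(j)
                                  for j in range(k // 2))
    return 1.0 - tail

def std_normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

k, x = 100, 120.0   # arbitrary example values
exact = chi2_cdf_even(x, k)

# Fisher: sqrt(2X) is approximately N(sqrt(2k - 1), 1)
fisher = std_normal_cdf(math.sqrt(2 * x) - math.sqrt(2 * k - 1))

# Wilson-Hilferty: (X/k)^(1/3) is approximately N(1 - 2/(9k), 2/(9k))
wh = std_normal_cdf(((x / k) ** (1.0 / 3.0) - (1 - 2 / (9 * k)))
                    / math.sqrt(2 / (9 * k)))

print(exact, fisher, wh)
```

Both approximations land within about one percentage point of the exact probability at k = 100, with the cube-root transformation somewhat closer.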

The expected value of a random variable having chi-square distribution with <math>k</math> degrees of freedom is <math>k</math> and the variance is <math>2k</math>. The median is given approximately by

<math>k-\frac{2}{3}</math>


Note that 2 degrees of freedom leads to an exponential distribution.
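A minimal check of this special case in Python, using the density formula above (the test point x = 1.7 is arbitrary):

```python
import math

def chi2_pdf(x, k):
    """Chi-square density f_k(x) = (1/2)^{k/2} / Gamma(k/2) * x^{k/2-1} * e^{-x/2}."""
    if x < 0:
        return 0.0
    return (0.5 ** (k / 2)) / math.gamma(k / 2) * x ** (k / 2 - 1) * math.exp(-x / 2)

# With k = 2 the density reduces to (1/2) e^{-x/2}, i.e. an
# exponential distribution with mean 2.
x = 1.7
chi2_value = chi2_pdf(x, 2)
exponential_value = 0.5 * math.exp(-x / 2)
print(chi2_value, exponential_value)
```

The two values agree to floating-point precision, as the algebra predicts.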

The chi-square distribution is a special case of the gamma distribution: <math>\chi^2_k</math> is a gamma distribution with shape parameter <math>k/2</math> and scale parameter 2.

The information entropy is given by:


<math>H = -\int_{-\infty}^\infty f_k(x)\ln(f_k(x))\, dx = \frac{k}{2} + \ln\!\left(2\,\Gamma\!\left(\frac{k}{2}\right)\right) + \left(1 - \frac{k}{2}\right) \psi\!\left(\frac{k}{2}\right)</math>

where <math>\psi(x)</math> is the Digamma function.
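As a sanity check, the closed form can be compared against numeric integration of <math>-f\ln f</math>. The sketch below fixes k = 4, where <math>\psi(2) = 1 - \gamma</math> (with <math>\gamma</math> Euler's constant), so the formula reduces to <math>1 + \ln 2 + \gamma</math>; the integration range and step count are arbitrary choices:

```python
import math

k = 4
euler_gamma = 0.5772156649015329  # Euler-Mascheroni constant

def f(x):
    """Chi-square density for k = 4: (1/4) x e^{-x/2}."""
    return 0.25 * x * math.exp(-x / 2)

def integrand(x):
    fx = f(x)
    return -fx * math.log(fx) if fx > 0.0 else 0.0

# Numeric entropy H = -integral of f ln f, via the trapezoidal rule.
n, upper = 200_000, 80.0
h = upper / n
numeric = h * (0.5 * (integrand(0.0) + integrand(upper))
               + sum(integrand(i * h) for i in range(1, n)))

# Closed form for k = 4: psi(2) = 1 - gamma, so
# H = 2 + ln(2 * Gamma(2)) + (1 - 2) * (1 - gamma) = 1 + ln 2 + gamma
closed = 1.0 + math.log(2.0) + euler_gamma
print(numeric, closed)
```

The numeric estimate and the closed form agree to roughly four decimal places.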

Related distributions

  • <math>X \sim \mathrm{Exponential}(\lambda = 1/2)</math> is an exponential distribution with rate <math>\lambda = 1/2</math> (mean 2) if <math>X \sim \chi_2^2</math> (chi-square with 2 degrees of freedom).
  • <math>Y \sim \chi_k^2</math> is a chi-square distribution if <math>Y = \sum_{m=1}^k X_m^2</math> where the <math>X_m \sim N(0,1)</math> are independent standard normal variables.
  • <math>Y \sim \mathrm{F}(\nu_1, \nu_2)</math> is an F-distribution if <math>Y = (X_1 / \nu_1)/(X_2 / \nu_2)</math> where <math>X_1 \sim \chi_{\nu_1}^2</math> and <math>X_2 \sim \chi_{\nu_2}^2</math> are independent with their respective degrees of freedom.
  • <math>Y \sim \chi^2(\bar{\nu})</math> is a chi-square distribution if <math>Y = \sum_{m=1}^N X_m</math> where the <math>X_m \sim \chi^2(\nu_m)</math> are independent and <math>\bar{\nu} = \sum_{m=1}^N \nu_m</math>.
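The defining sum of squared standard normals can be simulated directly; the sample size, seed, and choice of k = 5 below are arbitrary illustration values. The sample mean and variance should land near k and 2k respectively:

```python
import random

random.seed(0)  # fixed seed so the illustration is reproducible

k = 5      # degrees of freedom
N = 20000  # number of simulated chi-square draws

# Each draw is a sum of k squared independent standard normals.
draws = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(k))
         for _ in range(N)]

mean = sum(draws) / N
variance = sum((d - mean) ** 2 for d in draws) / (N - 1)
print(mean, variance)  # close to k = 5 and 2k = 10
```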

See also

External links


