Poisson distribution

In probability theory and statistics, the Poisson distribution is a discrete probability distribution discovered by Siméon-Denis Poisson (1781–1840) and published, together with his probability theory, in 1838 in his work Recherches sur la probabilité des jugements en matière criminelle et en matière civile. It describes random variables N that count, among other things, a number of discrete occurrences (sometimes called "arrivals") that take place during a time interval of given length. The probability that there are exactly k occurrences (k being a non-negative integer, k = 0, 1, 2, ...) is
 <math>P(N=k)=\frac{e^{-\lambda} \lambda^k}{k!},\,\!</math>
where
 e is the base of the natural logarithm (e = 2.71828...),
 k! is the factorial of k,
 λ is a positive real number, equal to the expected number of occurrences during the given interval. For instance, if events occur on average every 2 minutes and you are interested in the number of events occurring in a 10-minute interval, you would use as a model a Poisson distribution with λ = 5.
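The probability mass function above can be evaluated directly. A minimal sketch using only the Python standard library (the helper name poisson_pmf is ours):

```python
import math

def poisson_pmf(k, lam):
    """P(N = k) = exp(-lam) * lam**k / k! for a Poisson distribution with mean lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Events arriving on average every 2 minutes, observed over 10 minutes -> lam = 5:
print(round(poisson_pmf(5, 5.0), 4))  # -> 0.1755
```

Note that the probabilities over all k sum to 1, as they must for a probability distribution.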
Poisson processes
Sometimes λ is taken to be the rate, i.e., the average number of occurrences per unit time. In that case, if N_{t} is the number of occurrences before time t then we have
 <math>P(N_t=k)=\frac{e^{-\lambda t} (\lambda t)^k}{k!},\,\!</math>
and the waiting time T until the first occurrence is a continuous random variable with an exponential distribution (with parameter λ). This probability distribution may be deduced from the fact that
 <math>P(T>t)=P(N_t=0)=e^{-\lambda t}.\,</math>
When time becomes involved, we have a 1-dimensional Poisson process, which involves both the discrete Poisson-distributed random variables that count the number of arrivals in each time interval, and the continuous Erlang-distributed waiting times. There are also Poisson processes of dimension higher than 1.
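The connection between exponential waiting times and Poisson counts can be illustrated by simulation. This is a sketch using Python's standard library; the name count_arrivals is ours, and the check is stochastic, so the tolerance is loose:

```python
import random

def count_arrivals(rate, t, rng):
    """Count arrivals in [0, t) of a Poisson process by summing
    exponential inter-arrival times with mean 1/rate."""
    n, elapsed = 0, rng.expovariate(rate)
    while elapsed < t:
        n += 1
        elapsed += rng.expovariate(rate)
    return n

rng = random.Random(0)
rate, t, trials = 2.0, 3.0, 10000
mean = sum(count_arrivals(rate, t, rng) for _ in range(trials)) / trials
# mean should be close to rate * t = 6
```

The empirical mean of the counts approaches λt, matching the formula for P(N_t = k) above.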
Related distributions
 <math>Y \sim \mathrm{Poisson}(\bar{\lambda})</math> is Poisson distributed if <math>Y = \sum_{m=1}^N X_m</math> for independent <math>X_m \sim \mathrm{Poisson}(\lambda_m)</math> and <math>\bar{\lambda} = \sum_{m=1}^N \lambda_m</math>.
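This additivity can be checked empirically. A sketch with the standard library only; poisson_sample is our own helper, which draws a variate by counting unit-rate exponential arrivals that occur before time lam:

```python
import random

def poisson_sample(lam, rng):
    """Draw one Poisson(lam) variate: count unit-rate exponential
    arrivals occurring before time lam."""
    n, total = 0, rng.expovariate(1.0)
    while total < lam:
        n += 1
        total += rng.expovariate(1.0)
    return n

rng = random.Random(42)
# Sum of independent Poisson(1.0) and Poisson(2.5) behaves like Poisson(3.5):
samples = [poisson_sample(1.0, rng) + poisson_sample(2.5, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)  # close to 1.0 + 2.5 = 3.5
```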
Occurrence
The Poisson distribution arises in connection with Poisson processes. It applies to various phenomena of discrete nature (that is, those that may happen 0, 1, 2, 3, ... times during a given period of time or in a given area) whenever the probability of the phenomenon happening is constant in time or space. Examples include:
 The number of unstable nuclei that decayed within a given period of time in a piece of radioactive substance.
 The number of cars that pass through a certain point on a road during a given period of time.
 The number of spelling mistakes a secretary makes while typing a single page.
 The number of phone calls at a call center per minute.
 The number of times a web server is accessed per minute.
 For instance, the number of edits per hour recorded on Wikipedia's Recent Changes page follows approximately a Poisson distribution.
 The number of roadkill found per unit length of road.
 The number of mutations in a given stretch of DNA after a certain amount of radiation.
 The number of pine trees per unit area of mixed forest.
 The number of stars in a given volume of space.
 The number of soldiers killed by horse kicks each year in each corps in the Prussian cavalry. This example was made famous by a book of Ladislaus Josephovich Bortkiewicz (1868–1931).
 The number of bombs falling on each unit area of London during a German air raid in the early part of the Second World War. Statistician Roger Mexico uses the Poisson distribution to study this topic in Gravity's Rainbow.
 The distribution of visual receptor cells in the retina of the human eye.
How does this distribution arise? – The law of rare events
The binomial distribution with parameters n and λ/n, i.e., the probability distribution of the number of successes in n trials, with probability λ/n of success on each trial, approaches the Poisson distribution with expected value λ as n approaches infinity. This is sometimes known as the law of rare events.
Here are the details. First, recall from calculus that
 <math>\lim_{n\to\infty}\left(1-{\lambda \over n}\right)^n=e^{-\lambda}.</math>
Let p = λ/n. Then we have
 <math>\lim_{n\to\infty} P(X=k)=\lim_{n\to\infty}{n \choose k} p^k (1-p)^{n-k}
=\lim_{n\to\infty}{n! \over (n-k)!\,k!} \left({\lambda \over n}\right)^k \left(1-{\lambda\over n}\right)^{n-k}</math>
 <math>=\lim_{n\to\infty} \underbrace{\left({n \over n}\right)\left({n-1 \over n}\right)\left({n-2 \over n}\right) \cdots \left({n-k+1 \over n}\right)}\ \underbrace{\left({\lambda^k \over k!}\right)}\ \underbrace{\left(1-{\lambda \over n}\right)^n}\ \underbrace{\left(1-{\lambda \over n}\right)^{-k}}.</math>
As n approaches ∞, the expression over the first of the four <math>\underbrace{\mathrm{underbraces}}</math> approaches 1; the expression over the second underbrace remains constant since n does not appear in it at all; the expression over the third underbrace approaches e^{−λ}; and the one over the fourth underbrace approaches 1.
Consequently the limit is
 <math>{\lambda^k e^{-\lambda} \over k!}.\,\!</math>
More generally, whenever a sequence of binomial random variables with parameters n and p_{n} is such that
 <math>\lim_{n\rightarrow\infty} np_n = \lambda,</math>
the sequence converges in distribution to a Poisson random variable with mean λ (see, e.g., law of rare events (http://planetmath.org/?op=getobj&from=objects&id=6252)).
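The convergence of the binomial pmf to the Poisson pmf can be observed numerically. A sketch using the standard library (the helper names are ours; λ = 4 and k = 3 are arbitrary choices):

```python
import math

def binom_pmf(k, n, p):
    """Binomial probability of exactly k successes in n trials."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

lam, k = 4.0, 3
errors = [abs(binom_pmf(k, n, lam / n) - poisson_pmf(k, lam))
          for n in (10, 100, 1000, 10000)]
# The gap between the binomial and Poisson probabilities shrinks as n grows.
```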
Properties
The expected value of a Poisson distributed random variable is equal to λ and so is its variance. The higher moments of the Poisson distribution are Touchard polynomials in λ, whose coefficients have a combinatorial meaning.
The mode of a Poisson-distributed random variable with non-integer λ is equal to <math>\lfloor \lambda \rfloor</math>, the largest integer less than or equal to λ, also written floor(λ). When λ is a positive integer, the modes are λ and λ − 1.
For sufficiently large values of λ (say, λ > 1000), the normal distribution with mean λ and variance λ is an excellent approximation to the Poisson distribution. If λ is greater than about 10, the normal distribution is a good approximation provided an appropriate continuity correction is performed, i.e., P(X ≤ x), where (lower-case) x is a non-negative integer, is replaced by P(X ≤ x + 0.5).
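A quick numerical check of the continuity correction, sketched with the standard library (the normal CDF is built from math.erf; λ = 30 and x = 25 are arbitrary choices):

```python
import math

def poisson_cdf(x, lam):
    """Exact P(X <= x) for integer x, by summing the pmf."""
    return sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(x + 1))

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

lam, x = 30.0, 25
exact = poisson_cdf(x, lam)
approx = normal_cdf((x + 0.5 - lam) / math.sqrt(lam))  # continuity-corrected
# exact and approx agree to within a few thousandths at this lambda
```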
If N and M are two independent random variables, both following a Poisson distribution with parameters λ and μ, respectively, then N + M follows a Poisson distribution with parameter λ + μ.
The moment-generating function of the Poisson distribution with expected value λ is
 <math>E\left(e^{tX}\right)=\sum_{k=0}^\infty e^{tk} P(X=k)=\sum_{k=0}^\infty e^{tk} {\lambda^k e^{-\lambda} \over k!} =e^{\lambda(e^t-1)}.</math>
All of the cumulants of the Poisson distribution are equal to the expected value λ. The nth factorial moment of the Poisson distribution is λ^{n}.
The Poisson distributions are infinitely divisible probability distributions.
Parameter estimation
Given a sample of N measured values <math>k_i</math>, we wish to estimate the value of the parameter <math>\lambda</math> of the Poisson population from which the sample was drawn. To find the maximum likelihood value, we form the likelihood function
 <math>L(\lambda)=\prod_{i=1}^N f(k_i) = \prod_{i=1}^N \frac{e^{-\lambda}\lambda^{k_i}}{k_i!}
= \frac{e^{-N\lambda}\lambda^{\Sigma k_i}}{\prod k_i!}</math>
where the sums and products run from <math>i=1</math> to <math>N</math>. Taking the logarithm of L, differentiating with respect to <math>\lambda</math>, and equating to zero yields the maximum likelihood estimate of <math>\lambda</math>:
 <math>\lambda_\mathrm{MLE}=\frac{1}{N}\sum_{i=1}^N k_i</math>
From the properties of characteristic functions, it is seen that the characteristic function of the distribution of <math>\lambda_\mathrm{MLE}</math> is
 <math>\varphi_\mathrm{MLE}(t)=\left(\prod_{i=1}^N \varphi(t/N)\right)=\varphi^N(t/N)=\exp(N\lambda(e^{it/N}-1))</math>
The mean value of <math>\lambda_\mathrm{MLE}<math> is then found to be:
 <math>\langle \lambda_\mathrm{MLE}\rangle =
-i\left(\frac{d}{dt}\,\varphi_\mathrm{MLE}(t)\right)_{t=0}=\lambda</math>
Since the average value of <math>\lambda_\mathrm{MLE}</math> equals <math>\lambda</math>, it is an unbiased estimator of <math>\lambda</math>.
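In code, the maximum likelihood estimate is simply the sample mean. A stochastic sketch (standard library only; poisson_sample draws variates by counting unit-rate exponential arrivals, and the true λ = 4.2 is an arbitrary choice):

```python
import random

def poisson_sample(lam, rng):
    """Draw one Poisson(lam) variate from unit-rate exponential arrivals."""
    n, total = 0, rng.expovariate(1.0)
    while total < lam:
        n += 1
        total += rng.expovariate(1.0)
    return n

rng = random.Random(1)
true_lam = 4.2
data = [poisson_sample(true_lam, rng) for _ in range(50000)]
lam_mle = sum(data) / len(data)  # the MLE is the sample mean, close to 4.2
```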
The "law of small numbers"
The word law is sometimes used as a synonym of probability distribution, and convergence in law means convergence in distribution. Accordingly, the Poisson distribution is sometimes called the law of small numbers because it is the probability distribution of the number of occurrences of an event that happens rarely but has very many opportunities to happen. The Law of Small Numbers is a book by Ladislaus Bortkiewicz about the Poisson distribution, published in 1898. Some historians of mathematics have argued that the Poisson distribution should have been called the Bortkiewicz distribution.
See also
 Compound Poisson distribution
 Poisson process
 Erlang distribution, which describes the waiting time until n events have occurred. For temporally distributed events, the Poisson distribution is the probability distribution of the number of events occurring within a preset time, while the Erlang distribution is the probability distribution of the amount of time until the nth event.
 Skellam distribution, the distribution of the difference of two Poisson variates, not necessarily from the same parent distribution.
External links
 Queueing Theory Basics (http://www.eventhelix.com/RealtimeMantra/CongestionControl/queueing_theory.htm)
 M/M/1 Queueing System (http://www.eventhelix.com/RealtimeMantra/CongestionControl/m_m_1_queue.htm)