Mahalanobis distance
In statistics, the Mahalanobis distance is a distance measure introduced by P. C. Mahalanobis in 1936. It is based on the correlations between variables, by which different patterns can be identified and analysed. It is a useful way of determining the similarity of an unknown sample set to a known one. It differs from Euclidean distance in that it takes into account the correlations of the data set.
Formally, the Mahalanobis distance of a multivariate vector <math>x = ( x_1, x_2, x_3, \dots, x_p )</math> from a group of values with mean <math>\mu = ( \mu_1, \mu_2, \mu_3, \dots , \mu_p )</math> and covariance matrix <math>\Sigma</math> is defined as:
- <math>D_M(x) = \sqrt{(x - \mu)' \Sigma^{-1} (x-\mu)}.\,</math>
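For illustration, the quadratic form above can be evaluated directly with NumPy. The following is a minimal sketch, not a canonical implementation; the function name and the sample data are made up for the example:

```python
import numpy as np

def mahalanobis_distance(x, data):
    """Mahalanobis distance of the point x from the sample in `data`
    (rows are observations, columns are variables)."""
    mu = data.mean(axis=0)              # sample mean vector
    sigma = np.cov(data, rowvar=False)  # sample covariance matrix
    diff = x - mu
    # D_M(x) = sqrt((x - mu)' Sigma^{-1} (x - mu))
    return np.sqrt(diff @ np.linalg.inv(sigma) @ diff)

# hypothetical sample: 100 correlated 3-dimensional observations
rng = np.random.default_rng(0)
data = rng.multivariate_normal(
    mean=[0.0, 0.0, 0.0],
    cov=[[2.0, 0.5, 0.0], [0.5, 1.0, 0.3], [0.0, 0.3, 1.5]],
    size=100,
)
print(mahalanobis_distance(np.array([1.0, 1.0, 1.0]), data))
```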
Mahalanobis distance can also be defined as a dissimilarity measure between two random vectors <math>\vec{x}</math> and <math>\vec{y}</math> of the same distribution with covariance matrix <math>\Sigma</math>:
- <math> d(\vec{x},\vec{y})=\sqrt{(\vec{x}-\vec{y})'\Sigma^{-1} (\vec{x}-\vec{y})}.\,</math>
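For this two-vector form, SciPy provides `scipy.spatial.distance.mahalanobis`, which takes the inverse covariance matrix as its third argument. A short sketch with made-up values:

```python
import numpy as np
from scipy.spatial.distance import mahalanobis

x = np.array([1.0, 2.0])
y = np.array([3.0, 1.0])
sigma = np.array([[2.0, 0.3],
                  [0.3, 1.0]])  # shared covariance matrix

# SciPy expects the *inverse* covariance matrix, not the covariance itself
d = mahalanobis(x, y, np.linalg.inv(sigma))
print(d)  # sqrt((x - y)' Sigma^{-1} (x - y))
```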
If the covariance matrix is the identity matrix, the Mahalanobis distance reduces to the Euclidean distance. If the covariance matrix is diagonal, the resulting distance measure is called the normalized Euclidean distance:
- <math> d(\vec{x},\vec{y})=\sqrt{\sum_{i=1}^p {(x_i - y_i)^2 \over \sigma_i^2}},</math>
where <math>\sigma_i</math> is the standard deviation of <math>x_i</math> over the sample set.
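This special case can be checked numerically: with a diagonal covariance matrix, the general quadratic form agrees with the explicit sum. The values below are arbitrary and only for the check:

```python
import numpy as np

x = np.array([1.0, 4.0, 2.0])
y = np.array([2.0, 2.0, 5.0])
sigma = np.array([0.5, 1.0, 2.0])  # per-coordinate standard deviations

# normalized Euclidean distance: sqrt(sum((x_i - y_i)^2 / sigma_i^2))
d_norm = np.sqrt(np.sum((x - y) ** 2 / sigma ** 2))

# same result from the general formula with a diagonal covariance matrix
cov = np.diag(sigma ** 2)
diff = x - y
d_general = np.sqrt(diff @ np.linalg.inv(cov) @ diff)

print(d_norm, d_general)  # identical up to floating-point rounding
```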