Matrix inversion
Matrix inversion is the following problem in linear algebra: given a square n-by-n matrix A, find a square n-by-n matrix B (if one exists) such that AB = BA = In, the n-by-n identity matrix.
Gauss-Jordan elimination is an algorithm that can be used to determine whether a given matrix is invertible and to find the inverse. An alternative is the Cholesky decomposition, which, for a symmetric positive-definite matrix, produces a lower triangular matrix and its (upper triangular) transpose; triangular matrices are easier to invert. For special purposes, it may be convenient to invert matrices by treating mn-by-mn matrices as m-by-m matrices of n-by-n matrices, and applying one or another formula recursively (matrices of other sizes can be padded out with dummy rows and columns). For other purposes, a variant of Newton's method may be convenient (particularly when dealing with families of related matrices, so that the inverses of earlier matrices can be used to seed the computation of the inverses of later matrices).
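As a sketch, Gauss-Jordan elimination can invert a matrix by reducing the augmented matrix [A | I] until the left half becomes the identity, at which point the right half is the inverse. The helper name `invert` and the singularity tolerance are illustrative, not from the original text:

```python
def invert(a):
    """Invert a square matrix (list of lists of floats) by Gauss-Jordan
    elimination with partial pivoting. Raises ValueError if A is singular."""
    n = len(a)
    # Build the augmented matrix [A | I].
    aug = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
           for i, row in enumerate(a)]
    for col in range(n):
        # Partial pivoting: bring the largest entry in this column to the top.
        pivot = max(range(col, n), key=lambda r: abs(aug[r][col]))
        if abs(aug[pivot][col]) < 1e-12:
            raise ValueError("matrix is singular")
        aug[col], aug[pivot] = aug[pivot], aug[col]
        # Scale the pivot row so the pivot entry becomes 1.
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        # Eliminate this column from every other row.
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    # The left half is now I, so the right half is A^{-1}.
    return [row[n:] for row in aug]
```

If a pivot column contains only (near-)zero entries, the matrix has no inverse, which is how the same procedure decides invertibility.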
Writing the transpose of the matrix of cofactors, known as the adjugate (or classical adjoint) matrix, can also be an efficient way to calculate the inverse of small matrices; since this method is essentially recursive, it becomes inefficient for large matrices. To determine the inverse, we calculate a matrix of cofactors:
<math>A^{-1}={1 \over \begin{vmatrix}A\end{vmatrix}}\left(C_{ij}\right)^{T}={1 \over \begin{vmatrix}A\end{vmatrix}}
\begin{pmatrix} C_{11} & C_{21} & \cdots & C_{n1} \\ C_{12} & C_{22} & \cdots & C_{n2} \\ \vdots & \vdots & \ddots & \vdots \\ C_{1n} & C_{2n} & \cdots & C_{nn} \end{pmatrix}</math>
where |A| is the determinant of A, C_ij is the (i, j) cofactor of A, and the superscript T denotes the matrix transpose.
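The cofactor formula can be implemented directly for small matrices. A minimal sketch, assuming a determinant computed by Laplace (cofactor) expansion; the helper names `minor`, `det`, and `inverse_adjugate` are illustrative:

```python
def minor(m, i, j):
    """The matrix m with row i and column j removed."""
    return [row[:j] + row[j + 1:] for r, row in enumerate(m) if r != i]

def det(m):
    """Determinant by cofactor (Laplace) expansion along the first row.
    This recursion is what makes the method inefficient for large matrices."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det(minor(m, 0, j)) for j in range(len(m)))

def inverse_adjugate(m):
    """Inverse via A^{-1} = (1/|A|) * (C_ij)^T."""
    d = det(m)
    if d == 0:
        raise ValueError("matrix is singular")
    n = len(m)
    # Entry (i, j) of the adjugate is the (j, i) cofactor of A.
    return [[(-1) ** (i + j) * det(minor(m, j, i)) / d for j in range(n)]
            for i in range(n)]
```

Note the index swap in the last line: the transpose in the formula means the (i, j) entry of the inverse comes from the (j, i) cofactor.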
In most practical applications, it is in fact not necessary to invert a matrix, but only to solve a system of linear equations. This can be done in general using techniques like LU decomposition. Various fast algorithms for special classes of such systems have been developed.
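For example, a single system Ax = b can be solved without ever forming A^{-1}, using one forward-elimination pass (the elimination step of an LU factorization) followed by back substitution. A sketch, with the helper name `solve` and the singularity tolerance chosen for illustration:

```python
def solve(a, b):
    """Solve A x = b by Gaussian elimination with partial pivoting,
    then back substitution, without computing A^{-1}."""
    n = len(a)
    # Augment A with the right-hand side b.
    aug = [row[:] + [b[i]] for i, row in enumerate(a)]
    # Forward elimination to upper triangular form.
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(aug[r][col]))
        if abs(aug[pivot][col]) < 1e-12:
            raise ValueError("matrix is singular")
        aug[col], aug[pivot] = aug[pivot], aug[col]
        for r in range(col + 1, n):
            f = aug[r][col] / aug[col][col]
            aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    # Back substitution on the triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (aug[i][n] - sum(aug[i][j] * x[j]
                                for j in range(i + 1, n))) / aug[i][i]
    return x
```

This costs roughly one third as much as a full inversion followed by a matrix-vector product, and is also numerically preferable.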