LDU decomposition
In linear algebra, the LDU decomposition states that for every invertible square matrix A there exist:
- a permutation matrix P
- a lower-triangular matrix L with ones on the diagonal
- a diagonal matrix D
- and an upper-triangular matrix U with ones on the diagonal
such that <math>PA = LDU</math>.
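The factorization above can be sketched numerically. The function below is a minimal illustration (not a production routine): it performs Gaussian elimination without pivoting, so it assumes a matrix whose leading principal minors are all nonzero, which is the case where P can be taken to be the identity.

```python
import numpy as np

def ldu(A):
    """LDU factorization by Gaussian elimination without pivoting.

    Assumes all leading principal minors of A are nonzero,
    so no permutation (P = I) is needed.
    """
    A = np.array(A, dtype=float)
    n = A.shape[0]
    L = np.eye(n)
    U = A.copy()
    for k in range(n):
        for i in range(k + 1, n):
            # eliminate entry below the pivot, recording the multiplier in L
            L[i, k] = U[i, k] / U[k, k]
            U[i] -= L[i, k] * U[k]
    D = np.diag(np.diag(U))          # diagonal part of U becomes D
    U_unit = np.linalg.solve(D, U)   # rescale rows so U has ones on the diagonal
    return L, D, U_unit

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
L, D, U = ldu(A)
print(np.allclose(L @ D @ U, A))  # True
```

Here L keeps the elimination multipliers, D holds the pivots, and U is the eliminated matrix with each row divided by its pivot.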
The permutation matrix P may be equal to the identity matrix I. In that case the decomposition is unique: if <math>A = LDU = L_*D_*U_*</math> are two such decompositions, then <math>L = L_*</math>, <math>D = D_*</math>, and <math>U = U_*</math>.
This can be shown by contradiction. Suppose there are two such decompositions:
- <math>LDU = L_*D_*U_*</math>
Multiplying both sides by <math>L_*^{-1}</math> on the left and by <math>U^{-1}</math> on the right gives:
- <math>L_*^{-1}LD = D_*U_*U^{-1}</math>
The product of two lower-triangular matrices is lower-triangular, and the inverse of a lower-triangular matrix is also lower-triangular. Multiplying a lower-triangular matrix by a diagonal matrix again gives a lower-triangular matrix, so the left side of the equation above is lower-triangular. The same arguments show that the right side is upper-triangular. A matrix that is both lower- and upper-triangular must be diagonal, so both sides are diagonal. Since L and <math>L_*</math> have ones on the diagonal, so does <math>L_*^{-1}L</math>; because D is invertible, <math>L_*^{-1}L</math> must itself be diagonal, and a diagonal matrix with ones on the diagonal is the identity. Therefore <math>L = L_*</math>, and by the same reasoning <math>U = U_*</math>. The equation then reduces to <math>D = D_*</math>.
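The closure properties used in this argument can be checked numerically. The sketch below builds an arbitrary unit lower-triangular matrix (the specific random values are illustrative, not from the article) and confirms that its inverse is again unit lower-triangular, and that multiplying by a diagonal matrix keeps it lower-triangular.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# arbitrary unit lower-triangular matrix: ones on the diagonal, random below
L = np.tril(rng.normal(size=(n, n)), k=-1) + np.eye(n)
# arbitrary invertible diagonal matrix
D = np.diag(rng.normal(size=n) + 2.0)

Linv = np.linalg.inv(L)
# the inverse of a unit lower-triangular matrix is unit lower-triangular
print(np.allclose(Linv, np.tril(Linv)))          # True
# multiplying by a diagonal matrix keeps it lower-triangular
print(np.allclose(Linv @ D, np.tril(Linv @ D)))  # True
```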