System of linear equations
In mathematics and linear algebra, a system of linear equations is a set of linear equations such as
- 3x1 + 2x2 − x3 = 1
- 2x1 − 2x2 + 4x3 = −2
- −x1 + ½x2 − x3 = 0.
The problem is to find values of the unknowns x1, x2 and x3 that satisfy all three equations simultaneously.
Systems of linear equations are among the oldest problems in mathematics and have many applications, for example in digital signal processing, estimation, forecasting, linear programming, and the approximation of non-linear problems in numerical analysis. Efficient ways to solve systems of linear equations are given by Gauss-Jordan elimination and by the Cholesky decomposition.
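As a concrete illustration (not part of the original article), the example system above can be solved numerically in Python with NumPy; the use of NumPy and of floating-point coefficients is an assumption of this sketch, not something the article prescribes:

```python
import numpy as np

# Coefficient matrix and right-hand side of the example system above.
A = np.array([[ 3.0,  2.0, -1.0],
              [ 2.0, -2.0,  4.0],
              [-1.0,  0.5, -1.0]])
b = np.array([1.0, -2.0, 0.0])

# np.linalg.solve uses an LU factorization (a relative of Gaussian elimination).
x = np.linalg.solve(A, b)
print(x)   # -> [ 1. -2. -2.]
```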
In general, a system with m linear equations and n unknowns can be written as
- a11x1 + a12x2 + ⋯ + a1nxn = b1
- a21x1 + a22x2 + ⋯ + a2nxn = b2
- ⋮
- am1x1 + am2x2 + ⋯ + amnxn = bm,
where x1, ..., xn are the unknowns and the numbers aij are the coefficients of the system. We can collect the coefficients into a matrix and write the system as follows:
- <math>
\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_m \end{bmatrix}
</math>
If we represent each matrix by a single letter, this becomes
- Ax = b,
where A is the m-by-n matrix above, x is a column vector with n entries and b is a column vector with m entries. The above-mentioned Gauss-Jordan elimination applies to all these systems, even if the coefficients come from an arbitrary field.
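Because Gauss-Jordan elimination uses only the field operations, it can be carried out exactly over the rationals. The following sketch (the function name and the restriction to systems with a unique solution are assumptions of this illustration) uses Python's fractions module for exact arithmetic:

```python
from fractions import Fraction

def gauss_jordan(A, b):
    """Solve Ax = b by Gauss-Jordan elimination over the rationals.

    A is a list of m rows with n coefficients each, b a list of m
    right-hand sides.  Only the unique-solution case is handled here.
    """
    m, n = len(A), len(A[0])
    # Augmented matrix [A | b] with exact rational entries.
    M = [[Fraction(A[i][j]) for j in range(n)] + [Fraction(b[i])]
         for i in range(m)]
    row = 0
    for col in range(n):
        # Find a row at or below `row` with a nonzero entry in this column.
        pivot = next((r for r in range(row, m) if M[r][col] != 0), None)
        if pivot is None:
            continue
        M[row], M[pivot] = M[pivot], M[row]
        # Scale the pivot row so the pivot becomes 1.
        M[row] = [v / M[row][col] for v in M[row]]
        # Eliminate the pivot column from every other row.
        for r in range(m):
            if r != row and M[r][col] != 0:
                factor = M[r][col]
                M[r] = [M[r][j] - factor * M[row][j] for j in range(n + 1)]
        row += 1
    if row < n:
        raise ValueError("the system has no unique solution")
    return [M[i][n] for i in range(n)]

# The introductory example: solution (1, -2, -2).
print(gauss_jordan([[3, 2, -1], [2, -2, 4], [-1, Fraction(1, 2), -1]],
                   [1, -2, 0]))
```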
If the field is infinite (as in the case of the real or complex numbers), then only the following three cases are possible for any given system of linear equations:
- the system has no solution
- the system has a single solution
- the system has infinitely many solutions.
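A small computer-algebra illustration of the three cases (SymPy and the particular toy systems are assumptions of this sketch, not part of the article):

```python
from sympy import Matrix, symbols, linsolve

x, y = symbols('x y')

# A unique solution: x + y = 2, x - y = 0
print(linsolve((Matrix([[1, 1], [1, -1]]), Matrix([2, 0])), x, y))   # -> {(1, 1)}

# No solution: x + y = 2, x + y = 3
print(linsolve((Matrix([[1, 1], [1, 1]]), Matrix([2, 3])), x, y))    # -> EmptySet

# Infinitely many solutions: the single equation x + y = 2
print(linsolve((Matrix([[1, 1]]), Matrix([2])), x, y))               # -> {(2 - y, y)}
```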
A system of the form
- Ax = 0
is called a homogeneous system of linear equations. The set of all solutions of such a homogeneous system is called the null space of the matrix A and is written Nul A.
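For instance (a SymPy-based sketch; the particular rank-one matrix is just an assumed example), a basis of the null space can be computed as follows:

```python
from sympy import Matrix

# A rank-one matrix: every row is a multiple of (1, 2, 3),
# so Nul A is a two-dimensional subspace of R^3.
A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [3, 6, 9]])

for v in A.nullspace():   # a basis of Nul A
    print(v.T)            # -> Matrix([[-2, 1, 0]]) and Matrix([[-3, 0, 1]])
```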
Especially in view of the above applications, several more efficient alternatives to Gauss-Jordan elimination have been developed for a variety of special cases. Many of these improved algorithms have complexity O(n²). Some of the most common special cases are:
- For problems of the form Ax = b, where A is a symmetric Toeplitz matrix, we can use Levinson recursion or one of its derivatives. One commonly used Levinson-like derivative is the Schur recursion, which appears in many digital signal processing applications.
- For problems of the form Ax = b, where A is singular or nearly singular, the matrix A is decomposed into the product of three matrices in a process called singular-value decomposition: the two outer matrices contain the left and right singular vectors, and the middle matrix is a diagonal matrix containing the singular values. The matrix can then be inverted by reversing the order of the three factors, transposing the singular-vector matrices, and taking the reciprocals of the diagonal elements of the middle matrix. Any singular value that is too close to zero (which is what makes the matrix nearly singular) is set to zero instead of being reciprocated, as sketched below.
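The pseudo-inverse recipe described in the last point can be written down compactly with NumPy; the tolerance value and the rank-deficient example matrix below are assumptions of this sketch:

```python
import numpy as np

def svd_solve(A, b, tol=1e-10):
    """Solve Ax = b for a singular or nearly singular A via the SVD:
    transpose the singular-vector factors, take reciprocals of the
    singular values, and treat values below `tol` as exactly zero."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)   # A = U @ diag(s) @ Vt
    s_inv = np.where(s > tol, 1.0 / s, 0.0)            # drop tiny singular values
    # Pseudo-inverse A+ = V @ diag(s_inv) @ U^T, applied directly to b.
    return Vt.T @ (s_inv * (U.T @ b))

# A rank-deficient example: the third row is the sum of the first two.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])
b = np.array([6.0, 15.0, 21.0])
print(svd_solve(A, b))               # minimum-norm solution
print(np.linalg.pinv(A) @ b)         # same result via NumPy's built-in pseudo-inverse
```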
External links
- Online linear solver (http://wims.unice.fr/wims/wims.cgi?module=tool/linear/linsolver.en) from WIMS