Linear algebra

Linear algebra is the branch of mathematics concerned with the study of vectors, vector spaces (or linear spaces), linear transformations, and systems of linear equations. Vector spaces are a central theme in modern mathematics; thus, linear algebra is widely used in both abstract algebra and functional analysis. Linear algebra also has a concrete representation in analytic geometry. It has extensive applications in the natural sciences and the social sciences, since nonlinear models can often be approximated by a linear model.
History
The history of modern linear algebra dates back to the years 1843 and 1844. In 1843, William Rowan Hamilton (from whom the term vector stems) discovered the quaternions. In 1844, Hermann Grassmann published his book Die lineale Ausdehnungslehre (see References).
Elementary introduction
Linear algebra had its beginnings in the study of vectors in Cartesian 2-space and 3-space. A vector, here, is a directed line segment, characterized by both its length (or magnitude) and its direction. Vectors can then be used to represent certain physical entities such as forces, and they can be added and multiplied by scalars, thus forming the first example of a real vector space.
Modern linear algebra has been extended to consider spaces of arbitrary or infinite dimension. A vector space of dimension n is called an n-space. Most of the useful results from 2- and 3-space can be extended to these higher-dimensional spaces. Although many people cannot easily visualize vectors in n-space, such vectors or n-tuples are useful in representing data. Since vectors, as n-tuples, are ordered lists of n components, most people can summarize and manipulate data efficiently in this framework. For example, in economics, one can create and use, say, 8-dimensional vectors or 8-tuples to represent the Gross National Product of 8 countries. One can decide to display the GNP of 8 countries for a particular year, where the countries' order is specified, for example, (United States, United Kingdom, France, Germany, Spain, India, Japan, Australia), by using a vector (v_{1}, v_{2}, v_{3}, v_{4}, v_{5}, v_{6}, v_{7}, v_{8}) where each country's GNP is in its respective position.
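The n-tuple operations described above are easy to express in code. A minimal sketch in plain Python, representing vectors as tuples (the function names are illustrative, not standard library):

```python
# A sketch of the two vector-space operations on n-tuples:
# componentwise addition and scalar multiplication.

def vec_add(u, v):
    """Add two vectors of the same dimension componentwise."""
    assert len(u) == len(v), "vectors must have the same dimension"
    return tuple(a + b for a, b in zip(u, v))

def scalar_mul(c, v):
    """Multiply every component of v by the scalar c."""
    return tuple(c * a for a in v)

u = (1, 2, 3)
v = (4, 5, 6)
print(vec_add(u, v))     # (5, 7, 9)
print(scalar_mul(2, u))  # (2, 4, 6)
```

The same two functions work unchanged for 8-tuples such as the GNP vector mentioned above, since nothing in them depends on the dimension.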
A vector space (or linear space), as a purely abstract concept about which we prove theorems, is part of abstract algebra, and well integrated into this field. Some striking examples of this are the group of invertible linear maps or matrices, and the ring of linear maps of a vector space. Linear algebra also plays an important part in analysis, notably, in the description of higher order derivatives in vector analysis and the study of tensor products and alternating maps.
A vector space is defined over a field, such as the field of real numbers or the field of complex numbers. Linear operators take elements from a linear space to another (or to itself), in a manner that is compatible with the addition and scalar multiplication given on the vector space(s). The set of all such transformations is itself a vector space. If a basis for a vector space is fixed, every linear transform can be represented by a table of numbers called a matrix. The detailed study of the properties of and algorithms acting on matrices, including determinants and eigenvectors, is considered to be part of linear algebra.
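Once a basis is fixed, applying a linear transformation amounts to a matrix-vector product. A small Python sketch, using a 90-degree rotation of the plane as the transformation (names are ours, chosen for illustration):

```python
def mat_vec(A, x):
    """Apply the linear transformation represented by matrix A
    (a list of rows) to the vector x: each output component is the
    dot product of a row of A with x."""
    return tuple(sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A)

# The matrix of a 90-degree counterclockwise rotation of the plane,
# with respect to the standard basis.
R = [[0, -1],
     [1,  0]]

print(mat_vec(R, (1, 0)))  # (0, 1): the x-axis basis vector rotates onto the y-axis
print(mat_vec(R, (0, 1)))  # (-1, 0)
```

Compatibility with addition and scalar multiplication is what makes the tabular representation work: applying R to a sum of vectors gives the sum of the individual results.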
One can say quite simply that the linear problems of mathematics, those that exhibit linearity in their behaviour, are those most likely to be solved. For example, differential calculus does a great deal with linear approximation to functions. The difference from nonlinear problems is very important in practice.
The general method of finding a linear way to look at a problem, expressing this in terms of linear algebra, and solving it, if need be by matrix calculations, is one of the most generally applicable in mathematics.
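As a concrete instance of "solving by matrix calculations", here is a sketch of Gaussian elimination in plain Python, using the standard library's Fraction type for exact arithmetic (the function name and interface are ours):

```python
from fractions import Fraction

def solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting.
    A is a list of n rows of n numbers; returns x as a list of Fractions.
    Raises ValueError if A is singular."""
    n = len(A)
    # Build the augmented matrix [A | b] with exact rational entries.
    M = [[Fraction(x) for x in row] + [Fraction(bi)] for row, bi in zip(A, b)]
    for col in range(n):
        # Choose a pivot row with a nonzero entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        if M[pivot][col] == 0:
            raise ValueError("matrix is singular")
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate the entries below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution on the resulting upper-triangular system.
    x = [Fraction(0)] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# The system 2x + y = 3, x + 3y = 5:
print(solve([[2, 1], [1, 3]], [3, 5]))  # [Fraction(4, 5), Fraction(7, 5)]
```

This is the "if need be by matrix calculations" step in miniature: the problem is expressed as a matrix equation and then reduced mechanically.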
Some useful theorems
 Every linear space has a basis.
 A square matrix A with n rows and n columns is invertible if and only if there exists a matrix B that satisfies AB = BA = I, where I is the identity matrix.
 A matrix is invertible if and only if its determinant is different from zero.
 A matrix is invertible if and only if the linear transformation represented by the matrix is an isomorphism.
 A symmetric matrix is positive semidefinite if and only if each of its eigenvalues is greater than or equal to zero.
 A symmetric matrix is positive definite if and only if each of its eigenvalues is greater than zero.
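The determinant criterion above can be checked by hand for small matrices. A minimal sketch for the 2-by-2 case, where det([[a, b], [c, d]]) = ad - bc and the inverse has a closed form (function names are ours):

```python
from fractions import Fraction

def det2(A):
    """Determinant of a 2-by-2 matrix [[a, b], [c, d]]: ad - bc."""
    (a, b), (c, d) = A
    return a * d - b * c

def inv2(A):
    """Inverse of a 2-by-2 matrix, defined exactly when det(A) != 0:
    (1/det) * [[d, -b], [-c, a]]."""
    (a, b), (c, d) = A
    det = Fraction(det2(A))
    if det == 0:
        raise ValueError("matrix is not invertible")
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [1, 3]]
print(det2(A))  # 5, nonzero, so A is invertible
print(inv2(A))
```

Multiplying A by inv2(A) in either order gives the identity matrix, which is exactly the invertibility condition stated above.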
Equivalent statements for square matrices
A theorem in linear algebra states that for any n by n matrix A, all the following statements are equivalent (for a given matrix, they are either all true or all false):
 A is invertible.
 <math>\det(A) \neq 0</math>.
 rank(A)=n.
 nullity(A)=0.
 A does not have 0 as an eigenvalue.
 For any b <math>\in</math> F^{n}, Ax=b has exactly one solution for x.
 Ax=0 only has the trivial solution.
 A^{T}A is invertible (this equivalence holds when F is the field of real numbers).
 A is row (column) equivalent to the identity matrix.
 The rows and columns of A span F^{n}.
 The null space of A contains only the zero vector.
 The range of A is F^{n}.
 The rows (columns) of A form a linearly independent set of vectors in F^{n}.
Here F is the field from which the matrix entries are taken. Often, this field is the real numbers or the complex numbers.
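When F is a small finite field, the equivalence of two of these statements can even be verified exhaustively. The sketch below (plain Python; the field F_5 and the 2-by-2 case are chosen purely for illustration) confirms that det(A) ≠ 0 and the existence of a two-sided inverse coincide for every matrix:

```python
from itertools import product

P = 5  # work over the finite field F_5: arithmetic modulo the prime 5

def det2(A):
    """Determinant of a 2-by-2 matrix over F_5."""
    return (A[0][0] * A[1][1] - A[0][1] * A[1][0]) % P

def mat_mul(A, B):
    """Product of two 2-by-2 matrices over F_5."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) % P for j in range(2)]
            for i in range(2)]

I = [[1, 0], [0, 1]]
# All 5^4 = 625 matrices with entries in {0, 1, 2, 3, 4}.
matrices = [[[a, b], [c, d]] for a, b, c, d in product(range(P), repeat=4)]

for A in matrices:
    has_inverse = any(mat_mul(A, B) == I and mat_mul(B, A) == I
                      for B in matrices)
    assert has_inverse == (det2(A) != 0)

print("det(A) != 0 and invertibility agree for all 625 matrices over F_5")
```

Such brute-force checks prove nothing in general, of course; the theorem covers every n and every field, but a finite field makes the equivalence tangible.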
Generalization and related topics
Since linear algebra is a successful theory, its methods have been developed in other parts of mathematics. In module theory one replaces the field of scalars by a ring. In multilinear algebra one deals with the 'several variables' problem of mappings linear in each of a number of different variables, inevitably leading to the tensor concept. In the spectral theory of operators, control of infinite-dimensional matrices is gained by applying mathematical analysis in a theory that is not purely algebraic. In all these cases the technical difficulties are much greater.
External links
 Linear Algebra Toolkit (http://www.math.odu.edu/~bogacki/lat/).
 Linear Algebra Workbench (http://www.algebra.com/algebra/college/linear/): multiply and invert matrices, solve systems, eigenvalues, etc.
 Linear Algebra (http://mathworld.wolfram.com/topics/LinearAlgebra.html) on MathWorld.
References
 Beezer, Rob, A First Course in Linear Algebra (http://linear.ups.edu/index.html), licensed under GFDL.
 Fearnley-Sander, Desmond, Hermann Grassmann and the Creation of Linear Algebra (http://www.maths.utas.edu.au/People/dfs/Papers/GrassmannLinAlgpaper/GrassmannLinAlgpaper.html), American Mathematical Monthly 86 (1979), pp. 809–817.
 Grassmann, Hermann, Die lineale Ausdehnungslehre ein neuer Zweig der Mathematik: dargestellt und durch Anwendungen auf die übrigen Zweige der Mathematik, wie auch auf die Statik, Mechanik, die Lehre vom Magnetismus und die Krystallonomie erläutert, O. Wigand, Leipzig, 1844.