Projection (linear algebra)
In linear algebra, a projection is a linear transformation P such that P² = P, i.e., an idempotent transformation. A matrix is called a projection if the transformation it represents is a projection. An m × m projection matrix maps an m-dimensional vector space onto a k-dimensional subspace (k ≤ m). A special class of projections is the class of orthogonal projections, which are the self-adjoint projections.
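As a concrete numeric illustration (a minimal sketch, assuming NumPy; the example matrix is ours), the matrix below projects R² onto its x-axis, a 1-dimensional subspace, and is idempotent:

```python
import numpy as np

# P projects R^2 onto the x-axis: a 2 x 2 projection matrix with k = 1.
P = np.array([[1.0, 0.0],
              [0.0, 0.0]])

# Idempotence: applying P twice equals applying it once.
assert np.allclose(P @ P, P)

# The image is a k-dimensional subspace; here k = rank(P) = 1.
assert np.linalg.matrix_rank(P) == 1
```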
A common instance is the projection of one vector in R^n onto another. For example, projecting the vector (1/2, 1/2)^T onto the vector (0, 1)^T gives the vector (0, 1/2)^T. In general, the projection of a vector u onto a vector v is given by
- <math>\mathrm{proj}_{\mathbf{v}}\,\mathbf{u} = {\mathbf{v}\cdot\mathbf{u}\over\mathbf{v}\cdot\mathbf{v}}\mathbf{v}</math>
where the dot denotes the dot product. Since the inner product generalizes the dot product, the same formula holds in any inner product space:
- <math>\mathrm{proj}_{\mathbf{v}}\,\mathbf{u} = {\langle \mathbf{v}, \mathbf{u}\rangle\over\langle \mathbf{v}, \mathbf{v}\rangle}\mathbf{v}</math>
where ⟨·, ·⟩ denotes the inner product.
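As a check (a minimal NumPy sketch; the helper name proj is ours, not a library function), the formula reproduces the example above of projecting (1/2, 1/2)^T onto (0, 1)^T:

```python
import numpy as np

def proj(v, u):
    """Project u onto v, using the dot product as the inner product."""
    return (np.dot(v, u) / np.dot(v, v)) * v

u = np.array([0.5, 0.5])
v = np.array([0.0, 1.0])
print(proj(v, u))  # [0.  0.5], as in the example above
```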
To see that this map is indeed a projection, that is, idempotent, observe that
- <math>\mathrm{proj}_{\mathbf{w}}\,\mathbf{x}={\langle\mathbf{w},\mathbf{x}\rangle\over\langle\mathbf{w}, \mathbf{w}\rangle}\mathbf{w}</math>
by definition, and applying the map a second time gives
- <math>\mathrm{proj}_{\mathbf{w}}\,\left({\langle\mathbf{w},\mathbf{x}\rangle\over\langle\mathbf{w}, \mathbf{w}\rangle}\mathbf{w}\right)={\langle\mathbf{w},{\langle\mathbf{w},\mathbf{x}\rangle\over\langle\mathbf{w}, \mathbf{w}\rangle}\mathbf{w}\rangle\over\langle\mathbf{w}, \mathbf{w}\rangle}\mathbf{w}={{\langle\mathbf{w},\mathbf{x}\rangle\over\langle\mathbf{w}, \mathbf{w}\rangle}\langle\mathbf{w},\mathbf{w}\rangle\over\langle\mathbf{w}, \mathbf{w}\rangle}\mathbf{w}={\langle\mathbf{w},\mathbf{x}\rangle\over\langle\mathbf{w}, \mathbf{w}\rangle}\mathbf{w}=\mathrm{proj}_{\mathbf{w}}\,\mathbf{x}</math>
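The same computation can be spot-checked numerically (a sketch reusing the hypothetical proj helper from above, with arbitrary vectors):

```python
import numpy as np

def proj(v, u):
    return (np.dot(v, u) / np.dot(v, v)) * v

w = np.array([3.0, 4.0])
x = np.array([2.0, -1.0])

once = proj(w, x)
twice = proj(w, once)
assert np.allclose(once, twice)  # projecting a second time changes nothing
```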
This projection is linear:
- <math>\mathrm{proj}_{\mathbf{w}}\,({\alpha\mathbf{a}+\beta\mathbf{b}}) ={\langle\mathbf{w},\alpha\mathbf{a}+\beta\mathbf{b}\rangle\over\langle\mathbf{w},\mathbf{w}\rangle}\mathbf{w} ={\langle\mathbf{w},\alpha\mathbf{a}\rangle\over\langle\mathbf{w},\mathbf{w}\rangle}\mathbf{w}+{\langle\mathbf{w},\beta\mathbf{b}\rangle\over\langle\mathbf{w},\mathbf{w}\rangle}\mathbf{w}</math>
- <math>=\alpha{\langle\mathbf{w},\mathbf{a}\rangle\over\langle\mathbf{w},\mathbf{w}\rangle}\mathbf{w}+\beta{\langle\mathbf{w},\mathbf{b}\rangle\over\langle\mathbf{w},\mathbf{w}\rangle}\mathbf{w} =\alpha\,\mathrm{proj}_{\mathbf{w}}\,\mathbf{a}+\beta\,\mathrm{proj}_{\mathbf{w}}\,\mathbf{b}</math>
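Linearity, too, can be verified numerically (same assumed proj helper; the vectors and scalars are arbitrary):

```python
import numpy as np

def proj(v, u):
    return (np.dot(v, u) / np.dot(v, v)) * v

w = np.array([1.0, 2.0])
a = np.array([3.0, 0.0])
b = np.array([-1.0, 5.0])
alpha, beta = 2.0, -0.5

lhs = proj(w, alpha * a + beta * b)
rhs = alpha * proj(w, a) + beta * proj(w, b)
assert np.allclose(lhs, rhs)  # projection of a combination = combination of projections
```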
Projections (orthogonal and otherwise) play a major role in algorithms for certain linear algebra problems:
- QR decomposition (see Householder transformation and the Gram–Schmidt process; a sketch of the latter follows this list);
- singular value decomposition;
- reduction to Hessenberg form (the first step in many eigenvalue algorithms).
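To illustrate how projections enter the QR decomposition, here is a minimal sketch of the classical Gram–Schmidt process built from the projection formula above (the names proj and gram_schmidt are ours; production QR routines typically prefer Householder reflections or modified Gram–Schmidt for numerical stability):

```python
import numpy as np

def proj(v, u):
    return (np.dot(v, u) / np.dot(v, v)) * v

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors by subtracting,
    from each one, its projections onto the vectors already kept."""
    basis = []
    for u in vectors:
        for q in basis:
            u = u - proj(q, u)          # remove the component along q
        basis.append(u / np.linalg.norm(u))
    return basis

q1, q2 = gram_schmidt([np.array([3.0, 1.0]), np.array([2.0, 2.0])])
print(np.dot(q1, q2))  # ~0.0: the outputs are orthonormal
```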