Self-adjoint operator

On a finite-dimensional inner product space, a self-adjoint operator is one that is its own adjoint, or, equivalently, one whose matrix is Hermitian, where a Hermitian matrix is one which is equal to its own conjugate transpose. By the finite-dimensional spectral theorem, such an operator admits an orthonormal basis of eigenvectors in which it is represented as a diagonal matrix with real entries. In this article, we consider generalizations of this concept to operators on Hilbert spaces of arbitrary dimension.

Self-adjoint operators are used in functional analysis and quantum mechanics. In quantum mechanics their importance lies in the fact that in the Dirac-von Neumann formulation of quantum mechanics, physical observables such as position, momentum, angular momentum and spin are represented by self-adjoint operators on a Hilbert space. Of particular significance is the Hamiltonian

<math> H \psi = - \frac{\hbar^2}{2 m} \Delta \psi + V \psi <math>

which as an observable corresponds to the total energy of a particle of mass m in a potential field V.

The structure of self-adjoint operators on infinite-dimensional Hilbert spaces is complicated somewhat by the fact that the operators may be partial functions, meaning that they are defined only on a proper subspace of the Hilbert space. Such is the case for differential operators.

Symmetric operators

A partially defined linear operator A on a Hilbert space H is called symmetric iff

<math> \langle Ax \mid y \rangle = \langle x \mid Ay \rangle <math>

for all elements x and y in the domain of A. This usage is fairly standard in the functional analysis literature.

By the Hellinger-Toeplitz theorem, a symmetric everywhere defined operator is bounded.

Bounded symmetric operators are also called Hermitian.

The previous definition agrees with the one for matrices given in the introduction to this article, if we take as H the Hilbert space Cⁿ with the standard dot product and interpret a square matrix as a linear operator on this Hilbert space. It is, however, much more general, since there are important infinite-dimensional Hilbert spaces.

The spectrum of any bounded symmetric operator is real; in particular all its eigenvalues are real, although a symmetric operator may not have any eigenvalues.

A general version of the spectral theorem which also applies to bounded symmetric operators is stated below; while the eigenvectors corresponding to different eigenvalues are orthogonal, in general it is not true that the Hilbert space H admits an orthonormal basis consisting only of eigenvectors of the operator. In fact, bounded symmetric operators need not have any eigenvalues or eigenvectors at all.

Example. Consider the complex Hilbert space L²[0,1] and the differential operator

<math> A = - \frac{d^2}{dx^2} <math>

defined on the subspace consisting of all complex-valued infinitely differentiable functions f on [0,1] with the boundary conditions:

<math> f(0) = f(1) = 0 \quad <math>

Then integration by parts shows that A is symmetric. Its eigenfunctions are the sinusoids

<math> f_n(x) = \sin(n \pi x) \quad n= 1,2, \ldots <math>

with the real eigenvalues n²π²; the well-known orthogonality of the sine functions then follows as a consequence of the symmetry of A.
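Concretely, for f and g in the domain of A, integrating by parts twice and using the boundary conditions (which make all boundary terms vanish) gives

<math> \langle A f \mid g \rangle = \int_0^1 \overline{-f''(x)}\, g(x)\, dx = \int_0^1 \overline{f'(x)}\, g'(x)\, dx = \int_0^1 \overline{f(x)}\, (-g''(x))\, dx = \langle f \mid A g \rangle . <math>

The eigenvalue equation A f_n = n²π² f_n follows by differentiating sin(nπx) twice.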

We consider generalizations of this operator below.

Self-adjoint operators

Given a densely defined linear operator A on H, its adjoint A* is defined as follows:

  • The domain of A* consists of vectors x in H such that
<math> y \mapsto \langle x \mid A y \rangle <math>
(which is a densely defined linear map) is a continuous linear functional. By continuity and density of the domain of A, it extends to a unique continuous linear functional on all of H.
  • By the Riesz representation theorem for linear functionals, if x is in the domain of A*, there is a unique vector z in H such that
<math> \langle x \mid A y \rangle = \langle z \mid y \rangle \quad \forall y \in \operatorname{dom} A <math>
This vector z is defined to be A* x. It can be shown that the dependence of z on x is linear.

There is a useful geometric way of looking at the adjoint of an operator A on H: consider the graph G(A) of A, defined by

<math> \operatorname{G}(A) = \{(\xi, A \xi): \xi \in \operatorname{dom}(A)\} \subseteq H \oplus H .<math>

Theorem. Let J be the mapping

<math> H \oplus H \rightarrow H \oplus H <math>

given by

<math> \operatorname{J}: (\xi, \eta) \mapsto (-\eta, \xi). <math>

Then

<math> \operatorname{G}(A^*) = (\operatorname{J}\operatorname{G}(A))^{\perp}. <math>
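Since orthogonal complements are closed subspaces of H ⊕ H, an immediate consequence is that A* is always a closed operator, whether or not A itself is closed.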

The following is an alternative characterization of a symmetric operator: a densely defined operator A is symmetric iff

<math> A \subseteq A^*.<math>

where A ⊆ A* means that A* extends A, i.e. the graph of A is contained in the graph of A*. An operator A is self-adjoint iff A = A*.

Example. Consider the complex Hilbert space L²(R), and the operator which multiplies a given function by x:

<math> A f(x) = xf(x) <math>

The domain of A is the space of all L² functions f for which the right-hand side is also square-integrable. A is a symmetric operator without any eigenvalues or eigenfunctions. In fact, it turns out that the operator is self-adjoint, as follows from the theory outlined below.
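The symmetry of A and the absence of eigenvalues can be checked directly: since x is real,

<math> \langle A f \mid g \rangle = \int_{\mathbb{R}} x\, \overline{f(x)}\, g(x)\, dx = \langle f \mid A g \rangle <math>

for all f, g in the domain of A, so A is symmetric; and if Af = λf for some scalar λ, then (x − λ) f(x) = 0 for almost every x, which forces f = 0 almost everywhere, so A has no eigenvalues.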

As we will see later, self-adjoint operators have very important spectral properties; they are in fact multiplication operators on general measure spaces.

Spectral theorem

Partially defined operators A, B on Hilbert spaces H, K are unitarily equivalent iff there is a unitary operator U : H → K such that

  • <math> B U \xi = U A \xi ,\quad \xi \in \operatorname{dom}A. <math>

A multiplication operator is defined as follows: Let <math> (X, \mathcal{A}, \mu) <math> be a σ-finite countably additive measure space and f a real-valued measurable function on X. An operator T of the form

<math> [T \psi] (x) = f(x) \psi(x) \quad <math>

whose domain is the space of ψ for which the right-hand side above is in L², is called a multiplication operator.

Theorem. Any multiplication operator is a (densely defined) self-adjoint operator. Any self-adjoint operator is unitarily equivalent to a multiplication operator.

Remark. Note that the σ-finiteness property is needed in order for T to be densely defined.
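For example, the operator of multiplication by x on L²(R) considered earlier is already of this form: take X = R with Lebesgue measure μ and f(x) = x.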

This version of the spectral theorem for self-adjoint operators can be proved by reduction to the spectral theorem for unitary operators. This reduction uses the Cayley transform for self-adjoint operators which is defined in the next section.

Borel functional calculus

Given the representation of T as a multiplication operator, it is then easy to explain how the Borel functional calculus should operate: If h is a bounded real-valued Borel function on R, then h(T) is the operator of multiplication by the composition <math> h \circ f <math>. In order for this to be well-defined, we need to show that it is the unique operation on bounded real-valued Borel functions satisfying a number of conditions.
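For example, if h is the indicator function of an interval [a, b], then h(T) is the operator of multiplication by the indicator function of f⁻¹([a, b]):

<math> [h(T) \psi](x) = \mathbf{1}_{[a,b]}(f(x)) \, \psi(x), <math>

which is the orthogonal projection onto the subspace of those ψ that vanish almost everywhere outside the set where f takes values in [a, b].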

Resolution of the identity

It has been customary to introduce the following notation

<math> \operatorname{E}_T(\lambda) = \mathbf{1}_{(-\infty, \lambda]} (T) <math>

The family of operators E_T(λ) is called the resolution of the identity for T. Moreover, the following Stieltjes integral representation for T can be proved:

<math> T = \int_{-\infty}^{+\infty} \lambda \, d\operatorname{E}_T(\lambda). <math>

In more modern treatments, this representation is usually avoided, since most technical problems can be dealt with by the functional calculus.

Extensions of symmetric operators

If an operator A on the Hilbert space H is symmetric, when does it have self-adjoint extensions? The answer is provided by the Cayley transform of a symmetric operator and the deficiency indices.

Theorem. Suppose A is a symmetric operator. Then there is a unique partially defined linear operator

<math>\operatorname{W}(A):\operatorname{ran}(A+i) \rightarrow \operatorname{ran}(A-i)<math>

such that

<math> \operatorname{W}(A)(Ax + ix) = Ax - ix \quad x \in \operatorname{dom}(A). <math>

W(A) is isometric on its domain. Moreover, the range of 1 - W(A) is dense in H.

Conversely, given any partially defined operator U which is isometric on its domain (which is not necessarily closed) and such that the range of 1 - U is dense, there is a (unique) operator S(U)

<math>\operatorname{S}(U):\operatorname{ran}(1 - U) \rightarrow \operatorname{ran}(1+U)<math>

such that

<math> \operatorname{S}(U)(x - Ux)= i(x + U x) \quad x \in \operatorname{dom}(U). <math>

The operator S(U) is densely defined and symmetric.

The mappings W and S are inverses of each other.

The mapping W is called the Cayley transform. It associates a partially defined isometry to any symmetric densely-defined operator. Note that the mappings W and S are monotone: This means that if B is a symmetric operator that extends the densely defined symmetric operator A, then W(B) extends W(A), and similarly for S.

Theorem. A necessary and sufficient condition for A to be self-adjoint is that its Cayley transform W(A) be unitary.
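For instance, for the operator A of multiplication by x on L²(R) considered earlier, A + i is the operator of multiplication by x + i, which has the bounded inverse of multiplication by 1/(x + i); hence W(A) is the operator of multiplication by the unimodular function

<math> x \mapsto \frac{x - i}{x + i} , <math>

which is unitary on L²(R), consistent with the fact that A is self-adjoint.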

This immediately gives us a necessary and sufficient condition for A to have a self-adjoint extension, as follows:

Theorem. A necessary and sufficient condition for A to have a self adjoint extension is that W(A) have a unitary extension.

A partially defined isometric operator V on a Hilbert space H has a unique isometric extension to the norm closure of dom(V). A partially defined isometric operator with closed domain is called a partial isometry.

Given a partial isometry V, the deficiency indices of V are defined as follows:

<math> n_+(V) = \operatorname{dim}\ \operatorname{dom}(V)^{\perp}<math>
<math> n_-(V) = \operatorname{dim} \ \operatorname{ran}(V)^{\perp}<math>

Theorem. A partial isometry V has a unitary extension iff the deficiency indices are identical. Moreover, V has a unique unitary extension iff both deficiency indices are zero.

An operator which has a unique self-adjoint extension is said to be essentially self-adjoint. Such operators have a well-defined Borel functional calculus. Symmetric operators which are not essentially self-adjoint may still have a canonical self-adjoint extension. Such is the case for non-negative symmetric operators (or, more generally, operators which are bounded below). These operators always have a canonically defined Friedrichs extension, and for these operators we can define a canonical functional calculus. Many operators that occur in analysis are bounded below (such as the negative of the Laplacian operator), so the issue of essential self-adjointness for these operators is less critical.

Von Neumann's formulas

Suppose A is symmetric; then any symmetric extension of A is a restriction of A*. Indeed, if B is symmetric then, since taking adjoints reverses containment,

<math> A \subseteq B \implies B \subseteq B^* \subseteq A^* <math>

Theorem. Suppose A is a densely defined symmetric operator. Let

<math> N_+ = \operatorname{ran}(A+i)^{\perp}<math>
<math> N_- = \operatorname{ran}(A-i)^{\perp}<math>

Then

<math> N_+ = \operatorname{ker}(A^*-i)<math>
<math> N_- = \operatorname{ker}(A^*+i)<math>

and

<math> \operatorname{dom}(A^*) = \overline{\operatorname{dom}(A)} \oplus N_+ \oplus N_- <math>

where the decomposition is orthogonal relative to the graph inner product of dom(A*), and the closure of dom(A) is taken with respect to this graph norm:

<math> \langle \xi | \eta \rangle_\mathrm{graph} = \langle \xi | \eta \rangle + \langle A^*\xi | A^* \eta \rangle. <math>

These are referred to as von Neumann's formulas in the Akhiezer and Glazman reference.
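Since the domain and range of the Cayley transform W(A) are ran(A + i) and ran(A − i) respectively, the dimensions of N+ and N− are exactly the deficiency indices associated with W(A):

<math> n_+ = \operatorname{dim} \operatorname{ran}(A+i)^{\perp} = \operatorname{dim} N_+ , \qquad n_- = \operatorname{dim} \operatorname{ran}(A-i)^{\perp} = \operatorname{dim} N_- . <math>

In particular, A is essentially self-adjoint iff N+ = N− = {0}, and A has self-adjoint extensions iff dim N+ = dim N−. These criteria are used in the examples below.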

Examples

We first consider the differential operator

<math> D: \phi \mapsto \frac{1}{i} \phi' <math>

defined on the space of complex-valued C^∞ functions on [0,1] vanishing near 0 and 1. D is a symmetric operator, as can be shown by integration by parts. The spaces N+, N− are given respectively by the distributional solutions in L²[0,1] of the equations D u = i u and D u = − i u, that is, of

<math> u' = - u \quad <math>
<math> u' = u \quad <math>

One can show that each one of these solution spaces is 1-dimensional, generated by the functions x → e^{-x} and x → e^{x} respectively. This shows that D is not essentially self-adjoint, but does have self-adjoint extensions. These self-adjoint extensions are parametrized by the space of unitary mappings

<math> N_+ \rightarrow N_- <math>

which in this case happens to be the unit circle T.

This simple example illustrates a general fact about self-adjoint extensions of symmetric differential operators P on an open set M. They are determined by the unitary maps between the eigenvalue spaces

<math> N_\pm = \{u \in L^2(M): P_{\operatorname{dist}} u = \pm i u\} <math>

where Pdist is the distributional extension of P.
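For instance, the same operator D = (1/i) d/dx considered on the space of C^∞ functions of compact support in (0, ∞), as an operator on L²[0, ∞), has deficiency spaces

<math> N_+ = \mathbb{C}\, e^{-x}, \qquad N_- = \{0\}, <math>

since e^{x} is not square-integrable on [0, ∞). The deficiency indices are therefore 1 and 0, and this operator has no self-adjoint extension at all.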

We next give the example of differential operators with constant coefficients. Let

<math> P(\vec{x}) = \sum_\alpha c_\alpha x^\alpha <math>

be a polynomial on Rⁿ with real coefficients, where α ranges over a (finite) set of multi-indices. Thus

<math> \alpha = (\alpha_1, \alpha_2, \ldots, \alpha_n) <math>

and

<math> x^\alpha = x_1^{\alpha_1} x_2^{\alpha_2} \cdots x_n^{\alpha_n}. <math>

We also use the notation:

<math> D^\alpha = \frac{1}{i^{|\alpha|}} \partial_{x_1}^{\alpha_1}\partial_{x_2}^{\alpha_2} \cdots \partial_{x_n}^{\alpha_n}. <math>

Then the operator P(D) defined on the space of infinitely differentiable functions of compact support on Rⁿ by

<math> P(\operatorname{D}) \phi = \sum_\alpha c_\alpha D^\alpha \phi <math>

is essentially self-adjoint on L²(Rⁿ).

Theorem. Let P be a polynomial function on Rⁿ with real coefficients, and F the Fourier transform considered as a unitary map L²(Rⁿ) → L²(Rⁿ). Then F* P(D) F is essentially self-adjoint and its unique self-adjoint extension is the operator of multiplication by the function P.
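For example, taking

<math> P(\vec{x}) = x_1^2 + x_2^2 + \cdots + x_n^2 <math>

gives P(D) = −Δ, since D_j² = −∂_{x_j}². Thus the negative Laplacian is essentially self-adjoint on the smooth functions of compact support on Rⁿ, and under the Fourier transform it becomes the operator of multiplication by |ξ|²; this is the diagonalization referred to in the example on the Laplacian below.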

More generally, consider linear differential operators acting on infinitely differentiable complex-valued functions of compact support. If M is an open subset of Rⁿ, such an operator P has the form

<math> P \phi(x) = \sum_\alpha a_\alpha (x) [D^\alpha \phi](x) \quad <math>

where the a_α are (not necessarily constant) infinitely differentiable functions. P is a linear operator

<math> C_0^\infty(M) \rightarrow C_0^\infty(M). \quad <math>

Corresponding to P there is another differential operator called the formal adjoint of P

<math> P^{\mathrm{*form}} \phi = \sum_\alpha D^\alpha (\overline{a_\alpha} \phi) \quad <math>

Theorem. The operator theoretic adjoint P* of P is a restriction of the distributional extension of the formal adjoint. Specifically:

<math> \operatorname{dom} P^* = \{u \in L^2(M): P^{\mathrm{*form}}u \in L^2(M)\}. <math>
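In particular, P defines a symmetric operator on the smooth functions of compact support precisely when it coincides with its formal adjoint, since repeated integration by parts gives

<math> \langle P \phi \mid \psi \rangle = \langle \phi \mid P^{\mathrm{*form}} \psi \rangle \quad \phi, \psi \in C_0^\infty(M). <math>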

Spectral multiplicity theory

The multiplication representation of a self-adjoint operator, though extremely useful, is not a canonical representation. This suggests that it is not easy to extract from this representation a criterion to determine when self-adjoint operators A and B are unitarily equivalent. The finer-grained representation that we now discuss involves spectral multiplicity.

We first define uniform multiplicity:

Definition. A self-adjoint operator A has uniform multiplicity n, where n is such that 1 ≤ n ≤ ω, iff A is unitarily equivalent to the operator M_f of multiplication by the function f(λ) = λ on

<math> L^2_{\mu}(\mathbb{R}, \mathbf{H}_n)= \{\psi: \mathbb{R} \rightarrow \mathbf{H}_n: \psi \mbox{ measurable and } \int_{\mathbb{R}} \|\psi(t)\|^2 d \mu(t) < \infty\} <math>

where H_n is a Hilbert space of dimension n and μ is a countably additive measure on R. The domain of M_f consists of those vector-valued functions ψ on R for which

<math> \int_{\mathbb{R}} |\lambda|^2 \ \| \psi(\lambda)\|^2 \, d \mu(\lambda) < \infty. <math>
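For example, the operator of multiplication by x on L²(R) considered earlier has uniform multiplicity 1: take μ to be Lebesgue measure and H_1 = C, so that L²_μ(R, H_1) is just L²(R) and M_f is multiplication by λ.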

Non-negative countably additive measures μ, ν are mutually singular iff they are supported on disjoint Borel sets.

Theorem. Let A be a self-adjoint operator on a separable Hilbert space H. Then there is an ω sequence of countably additive finite measures on R (some of which may be identically 0)

<math> \{\mu_\ell\}_{1 \leq \ell \leq \omega} <math>

such that the measures are pairwise singular and A is unitarily equivalent to the operator of multiplication by the function f(λ) = λ on

<math> \bigoplus_{1 \leq \ell \leq \omega} L^2_{\mu_\ell}(\mathbb{R}, \mathbf{H}_\ell). <math>

This representation is unique in the following sense: For any two such representations of the same A, the corresponding measures are equivalent in the sense that they have the same sets of measure 0.

The spectral multiplicity theorem can be reformulated using the language of direct integrals of Hilbert spaces:

Theorem. Any self-adjoint operator on a separable Hilbert space is unitarily equivalent to multiplication by the function λ → λ on

<math> \int_\mathbb{R}^\oplus H_x d \mu(x). <math>

The class of μ is uniquely determined and the measurable family {Hx}x is determined almost everywhere.

Example: structure of the Laplacian

The Laplacian on Rⁿ is the operator

<math> \Delta = \sum_{i=1}^n \partial_{x_i}^2. <math>

As remarked above, the Laplacian is diagonalized by the Fourier transform. Actually it is more natural to consider the negative of the Laplacian, −Δ, since as an operator it is non-negative (see elliptic operator).

Theorem. If n = 1, then −Δ has uniform multiplicity mult = 2; otherwise −Δ has uniform multiplicity mult = ω. Moreover, the measure μ_mult may be taken to be Lebesgue measure on [0, ∞).
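To sketch the case n = 1: under the Fourier transform, −Δ becomes the operator of multiplication by ξ² on L²(R). Splitting a function of ξ into its restrictions to ξ > 0 and ξ < 0 and substituting λ = ξ² identifies L²(R) with L²_μ([0, ∞), C²) for a measure μ equivalent to Lebesgue measure, and under this identification −Δ acts as multiplication by λ; this exhibits the uniform multiplicity 2.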

Pure point spectrum

A simple but important class of operators is that of operators with pure point spectrum. A self-adjoint operator A on H has pure point spectrum iff H has an orthonormal basis {e_i}_{i ∈ I} consisting of eigenvectors of A.

Example. The Hamiltonian for the harmonic oscillator has a quadratic potential V, that is

<math> -\Delta + |x|^2 \quad <math>

This Hamiltonian has pure point spectrum.
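In one dimension, for instance, the eigenfunctions of −d²/dx² + x² are the Hermite functions, which form an orthonormal basis of L²(R), with eigenvalues

<math> \lambda_n = 2n + 1, \quad n = 0, 1, 2, \ldots <math>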

References

  • N.I. Akhiezer and I. M. Glazman, Theory of Linear Operators in Hilbert Space (two volumes), Pitman, 1981.
  • K. Yosida, Functional Analysis, Academic Press, 1965.
