Gauss-Markov process
This article is not about the Gauss-Markov theorem of mathematical statistics.
As one would expect, Gauss-Markov stochastic processes (named after Carl Friedrich Gauss and Andrey Markov) are stochastic processes that satisfy the requirements for both Gaussian processes and Markov processes.
Every Gauss-Markov process X(t) possesses the following three properties:
1. If h(t) is a non-zero scalar function of t, then Z(t) = h(t)X(t) is also a Gauss-Markov process
2. If f(t) is a non-decreasing scalar function of t, then Z(t) = X(f(t)) is also a Gauss-Markov process
3. There exists a non-zero scalar function h(t) and a non-decreasing scalar function f(t) such that X(t) = h(t)W(f(t)), where W(t) is the standard Wiener process.
Property (3) means that every Gauss-Markov process can be synthesized from the standard Wiener process (SWP).
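As a numerical illustration of this synthesis (a sketch, not from the article), one standard choice is h(t) = σe^(−βt) and f(t) = e^(2βt), which turns the standard Wiener process into a stationary Ornstein–Uhlenbeck process, the canonical stationary Gauss-Markov process with variance σ² and time constant 1/β:

```python
import numpy as np

# Illustrative sketch: property (3) with the assumed choices
# h(t) = sigma*exp(-beta*t) and f(t) = exp(2*beta*t), giving the
# stationary Ornstein-Uhlenbeck process X(t) = h(t) * W(f(t)).
rng = np.random.default_rng(0)

sigma, beta = 1.0, 0.5
t = np.linspace(0.0, 5.0, 1001)

# Sample W on the warped time grid f(t): independent Gaussian increments
# with variance equal to the steps of f(t); W(f(0)) = W(1) ~ N(0, 1).
f = np.exp(2.0 * beta * t)
w0 = rng.normal(0.0, np.sqrt(f[0]))
dW = rng.normal(0.0, np.sqrt(np.diff(f)))
W = w0 + np.concatenate(([0.0], np.cumsum(dW)))

X = sigma * np.exp(-beta * t) * W   # one stationary Gauss-Markov sample path
```

Each marginal X(t) is N(0, σ²) under this construction, so the path starts in (and remains in) the stationary distribution.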
Properties
A stationary Gauss-Markov process with variance <math>\textbf{E}(X^{2}(t)) = \sigma^{2}</math> (independent of t, by stationarity) and time constant <math>\beta^{-1}</math> has the following properties.
Exponential autocorrelation:
- <math>\textbf{R}_{x}(\tau) = \sigma^{2}e^{-\beta |\tau|}.\,</math>
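This autocorrelation can be checked numerically (a sketch, assuming the exact AR(1) discretization of a stationary Gauss-Markov process on a uniform grid, X[n+1] = φX[n] + σ√(1−φ²)ξ[n] with φ = e^(−βΔt)):

```python
import numpy as np

# Hedged numerical check: simulate the exact AR(1) discretization and
# compare the sample autocovariance at lag tau against
# R(tau) = sigma^2 * exp(-beta * tau).
rng = np.random.default_rng(1)

sigma, beta, dt, n = 1.0, 2.0, 0.01, 100_000
phi = np.exp(-beta * dt)

x = np.empty(n)
x[0] = rng.normal(0.0, sigma)                 # start in the stationary law
noise = rng.normal(0.0, sigma * np.sqrt(1.0 - phi**2), n - 1)
for i in range(n - 1):
    x[i + 1] = phi * x[i] + noise[i]

lag = 50                                       # tau = lag * dt = 0.5
sample = np.mean(x[:-lag] * x[lag:])           # sample autocovariance at tau
theory = sigma**2 * np.exp(-beta * lag * dt)   # R(tau) from the formula above
```

For a long enough path the sample value settles near the theoretical exponential decay.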
(Power) spectral density function:
- <math>\textbf{S}_{x}(j\omega) = \frac{2\sigma^{2}\beta}{\omega^{2} + \beta^{2}}.\,</math>
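This density is simply the Fourier transform of the exponential autocorrelation above (a standard Wiener-Khinchin computation, included here for completeness):

- <math>\textbf{S}_{x}(j\omega) = \int_{-\infty}^{\infty} \sigma^{2}e^{-\beta |\tau|} e^{-j\omega\tau}\,d\tau = 2\sigma^{2}\int_{0}^{\infty} e^{-\beta\tau}\cos(\omega\tau)\,d\tau = \frac{2\sigma^{2}\beta}{\omega^{2} + \beta^{2}}.</math>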
The above yields the following spectral factorisation:
- <math>\textbf{S}_{x}(s) = \frac{2\sigma^{2}\beta}{-s^{2} + \beta^{2}}
= \frac{\sqrt{2\beta}\,\sigma}{(s + \beta)} \cdot\frac{\sqrt{2\beta}\,\sigma}{(-s + \beta)}.
</math>
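The factorisation is easy to verify symbolically (an illustrative check, not part of the article): the two first-order factors multiply back to S(s), and substituting s = jω recovers the power spectral density above.

```python
import sympy as sp

# Hedged algebraic check of the spectral factorisation:
#   S(s) = 2*sigma^2*beta / (beta^2 - s^2)
# splits into a stable (left half-plane pole) and an antistable factor.
sigma, beta = sp.symbols('sigma beta', positive=True)
s, omega = sp.symbols('s omega')

S = 2 * sigma**2 * beta / (beta**2 - s**2)
plus = sp.sqrt(2 * beta) * sigma / (s + beta)    # causal (stable) factor
minus = sp.sqrt(2 * beta) * sigma / (beta - s)   # anticausal factor

# Factors multiply back to S(s); s = j*omega gives the spectral density.
assert sp.simplify(S - plus * minus) == 0
assert sp.simplify(S.subs(s, sp.I * omega)
                   - 2 * sigma**2 * beta / (omega**2 + beta**2)) == 0
```

The stable factor √(2β)σ/(s + β) is the transfer function of the first-order shaping filter that produces this process from white noise.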