A review of the autocovariance function of a stationary process

Keywords: Non-Negative Definiteness, Autocovariance Function, Sample Autocovariance Function, Kolmogorov’s theorem

The autocovariance comes from the transition probability between states in $\mathcal{F}_t$, i.e. from the probability measure of the underlying probability space. Therefore, like the martingale property, the Markov property and independence, the autocovariance is a consequence of the specific probability measure. (Kolmogorov's theorem to some extent settles a crucial link in the logical construction of a stochastic process: it lets us judge whether the universe of discourse is empty, and so is an important safety rope that keeps us from falling into logical traps.)

1. Stationary Process

1.1 The existence of a stochastic process

The Distribution of a Stochastic Process

Let $\mathcal{T}$ be the set of all strictly increasing discrete-time vectors $\{t = (t_1, t_2, \dots, t_n)^\prime \in T^n : t_1 < t_2 < \dots < t_n,\ n = 1, 2, \dots\}$. Then the (finite-dimensional) distribution functions of $\{X_t, t \in T\}$ are the functions $\{F_t(\cdot), t \in \mathcal{T}\}$ defined for $t = (t_1, t_2, \dots, t_n)^\prime$ by

$F_t(\textbf{x}) = \Pr(X_{t_1} \leq x_1, \dots, X_{t_n} \leq x_n)$

The distribution functions of a stochastic process depend heavily on the time set. We consider only the discrete-time problem here.

Kolmogorov’s Theorem

The probability distribution functions $\{F_t(\cdot), t \in \mathcal{T}\}$ are the distribution functions of some stochastic process if and only if for any $n \in \{1, 2, \dots\}$, $t \in \mathcal{T}$ and $1 \leq i \leq n$,

$\lim\limits_{x_i \to \infty} F_t(\textbf{x}) = F_{t(i)}(\textbf{x}(i))$

where $t(i)$ and $x(i)$ are the $(n - 1)-$component vectors obtained by deleting the $i$th components of $t$ and $x$ respectively.

If $\phi_t(\cdot)$ is the characteristic function corresponding to $F_t(\cdot)$, Kolmogorov's theorem can equivalently be written as:

$\lim\limits_{u_i \to 0} \phi_t(\textbf{u}) = \phi_{t(i)}(\textbf{u}(i))$

Remark: Note that whether a stochastic process as defined actually exists is always a question. Even the most common stochastic processes, such as Brownian motion and the Poisson process, carry strong constraints and properties. Checking the definition is a must, since most logical errors originate from an empty universe of discourse (i.e., the marginal distributions must exist and be consistent).
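As a sanity check on the consistency condition, the limit can be verified numerically for a bivariate Gaussian: sending the deleted coordinate to (numerical) infinity recovers the one-dimensional marginal. This is only an illustrative sketch; it assumes scipy is available, and the covariance matrix below is an arbitrary example, not one from the notes.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

# An arbitrary bivariate Gaussian F_t with correlated components.
cov = np.array([[1.0, 0.6],
                [0.6, 1.0]])
F = multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf

x1 = 0.3
# lim_{x_2 -> inf} F_t(x_1, x_2) should equal the marginal F_{t(2)}(x_1);
# x_2 = 50 serves as "infinity" for numerical purposes.
joint_limit = F(np.array([x1, 50.0]))
marginal = norm.cdf(x1)
assert abs(joint_limit - marginal) < 1e-4
```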

1.2 Stationarity

The time series $\{X_t, t \in \mathbb{Z}\}$ is said to be (weakly) stationary if

$(i)\ E(|X_t|^2) < \infty$ for all $t \in \mathbb{Z}$

$(ii)\ E X_t = m$ for all $t \in \mathbb{Z}$

$(iii)\ \gamma_X(r, s) = \gamma_X(r + t, s + t)$

for all $r, s, t \in \mathbb{Z}$, where $\gamma_X(r, s) := \operatorname{cov}(X_r, X_s)$. Condition $(iii)$ implies $\gamma_X(r, s) = \gamma_X(r - s, 0)$, so the autocovariance depends only on the lag and we may write $\gamma_X(h) := \gamma_X(h, 0)$.

Remark: Stationarity constrains only part of the stochastic process (the first and second moments), while strict stationarity constrains the entire distribution of the process. When we conduct statistical inference, the LLN, the CLT, and Lévy stable limits rest only on these second-order properties. Both trend and seasonality are non-stationary features.
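To make condition $(iii)$ concrete, here is a small sketch (not from the original notes) using the MA(1) process $X_t = Z_t + \theta Z_{t-1}$ with white noise $\{Z_t\}$: the covariance, computed exactly from the defining linear combination, is invariant under time shifts and depends only on the lag. The helper `ma1_cov` is a hypothetical name introduced here.

```python
def ma1_cov(r, s, theta=0.5, sigma2=1.0):
    """Cov(X_r, X_s) for the MA(1) process X_t = Z_t + theta * Z_{t-1}."""
    # Cov(Z_i, Z_j) = sigma2 if i == j, else 0, so expand the bilinear form.
    coeffs_r = {r: 1.0, r - 1: theta}   # X_r as a combination of the Z's
    coeffs_s = {s: 1.0, s - 1: theta}
    return sigma2 * sum(coeffs_r[i] * coeffs_s.get(i, 0.0) for i in coeffs_r)

# Shift-invariance: gamma(r, s) == gamma(r + t, s + t)
assert ma1_cov(3, 1) == ma1_cov(10, 8)
# Depends only on the lag: gamma(0) = (1 + theta^2) sigma2, gamma(1) = theta sigma2
assert ma1_cov(0, 0) == 1.25 and ma1_cov(1, 0) == 0.5 and ma1_cov(2, 0) == 0.0
```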

2. The autocovariance function of a stationary process

2.1 Elementary Properties

$\gamma(0) \ge 0$ (obvious)

$\gamma(h) = \gamma(-h)$ (the integers form an abelian group; easy to show)

$|\gamma(h)| \le \gamma(0)$ (Cauchy-Schwarz inequality)

Proof:

$|\operatorname{cov}(X_{t+h}, X_t)| \le (\operatorname{Var}(X_t))^{\frac{1}{2}}(\operatorname{Var}(X_{t+h}))^{\frac{1}{2}} = \gamma(0)$
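These three properties can also be sanity-checked numerically on the sample autocovariances of any observed series (using the divisor-$n$ estimator of Section 2.4). A minimal sketch, assuming numpy is available:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(500)     # any observed series will do
n, xbar = len(x), x.mean()

def acov(h):
    # sample autocovariance at lag h, with divisor n
    return np.sum((x[h:] - xbar) * (x[: n - h] - xbar)) / n

g0 = acov(0)
assert g0 >= 0                                               # gamma(0) >= 0
assert all(abs(acov(h)) <= g0 + 1e-12 for h in range(1, n))  # |gamma(h)| <= gamma(0)
```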

2.2 Non-Negative Definiteness

A real-valued function on the integers, $\kappa: \mathbb{Z} \to \mathbb{R}$, is said to be non-negative definite if and only if

$\sum_{i, j = 1}^n a_i \kappa(t_i - t_j) a_j \ge 0$

for all positive integers n and for all vectors $\textbf{a}=(a_1, \dots, a_n) \in \mathbb{R}^n$ and $\textbf{t} = (t_1, \dots, t_n) \in \mathbb{Z}^n$ .
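In practice, non-negative definiteness of a given $\kappa$ can be probed by checking that the matrices $K = [\kappa(t_i - t_j)]$ (here for $t = (1, \dots, n)$, giving a Toeplitz matrix) have no negative eigenvalues. A hedged sketch assuming numpy; the helper `nnd_check` and both example functions are ours, not from the notes:

```python
import numpy as np

def nnd_check(kappa, n):
    """Check kappa on the Toeplitz matrix K = [kappa(i - j)] of size n."""
    K = np.array([[kappa(i - j) for j in range(n)] for i in range(n)])
    return np.linalg.eigvalsh(K).min() >= -1e-10

# The MA(1) autocovariance kappa(0)=1.25, kappa(+-1)=0.5 is non-negative definite,
ok = nnd_check(lambda h: {0: 1.25, 1: 0.5, -1: 0.5}.get(h, 0.0), 20)
# while kappa(0)=1, kappa(+-1)=0.9 is not: a function vanishing beyond lag 1
# must satisfy |kappa(1)| <= kappa(0) / 2.
bad = nnd_check(lambda h: {0: 1.0, 1: 0.9, -1: 0.9}.get(h, 0.0), 20)
assert ok and not bad
```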

2.3 Characterization of Autocovariance Functions

A real-valued function defined on the integers is the autocovariance function of a stationary time series if and only if it is even and non-negative definite.

Remark: This theorem spells out the relationship between non-negative definite functions and stationary processes. From the proof below we can also see that every non-negative definite function defined on discrete time admits a corresponding stochastic process on discrete time (not a unique one, but at least one necessarily exists), and conversely the autocovariance function of a stationary process must be non-negative definite.

Proof:

$(i)$ For any $a \in \mathbb{R}^n$ and $t \in \mathbb{Z}^n$, let $Z_t = (X_{t_1} - \mathbb{E}(X_{t_1}), X_{t_2} - \mathbb{E}(X_{t_2}), \dots, X_{t_n} - \mathbb{E}(X_{t_n}))^\prime$ and $\Gamma_n = [\gamma(t_i - t_j)]_{i, j = 1}^n$. Then

$0 \le \operatorname{Var}(a^\prime Z_t) = a^\prime \mathbb{E}(Z_t Z_t^\prime) a = a^{\prime} \Gamma_n a$

$(ii)$ Let $K = [\kappa(t_i - t_j)]^n_{i, j = 1}$. Since $\kappa$ is non-negative definite, $K$ is a non-negative definite matrix, so $\phi_t(u) = \exp(-u^\prime K u / 2)$ is a valid characteristic function (that of a mean-zero Gaussian vector). It is clear that

$\phi_{t(i)}(u(i)) = \lim\limits_{u_i \to 0} \phi_t(u)$ (a linear-algebra fact: taking this limit amounts to replacing the $i$th row and $i$th column of $K$ by zeros, leaving the other entries unaffected). By Kolmogorov's theorem, there then exists a stochastic process whose covariance matrix is $K$.

Remark: For any non-negative definite function, we can find a joint Gaussian distribution with that covariance structure. (This is a constructive proof; intuitively, different distributions can surely share the same ACF, so finding one is enough.)
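The construction in part $(ii)$ can be replayed numerically: draw a Gaussian vector with covariance $K = [\kappa(t_i - t_j)]$ and confirm that its empirical covariance matches $\kappa$. A sketch assuming numpy; the particular $\kappa$ is an arbitrary non-negative definite example of ours:

```python
import numpy as np

# An MA(1)-type non-negative definite function (theta = 0.5, sigma^2 = 1).
kappa = lambda h: {0: 1.25, 1: 0.5, -1: 0.5}.get(h, 0.0)

n = 5
K = np.array([[kappa(i - j) for j in range(n)] for i in range(n)])

# Draw many realizations of the Gaussian vector (X_{t_1}, ..., X_{t_n}).
rng = np.random.default_rng(1)
X = rng.multivariate_normal(np.zeros(n), K, size=200_000)

# The empirical covariance recovers K up to Monte Carlo error.
emp = np.cov(X, rowvar=False)
assert np.allclose(emp, K, atol=0.05)
```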

2.4 Sample Autocovariance Function of an Observed Series

The sample ACF of $\{x_1, x_2, \dots, x_n\}$ is defined by:

$\hat{\gamma}(h) := n^{-1}\sum_{j = 1}^{n - h}(x_{h + j} - \bar{x})(x_{j} - \bar{x}), \quad 0 \le h < n$, with $\hat{\gamma}(-h) := \hat{\gamma}(h)$.

Note that the divisor must be $n$ (rather than $n - h$) to guarantee the non-negative definiteness of the sample ACF.

Remark: Although many estimators are possible, the sample ACF here is fixed by definition. That is, once we speak of the sample ACF, this is what is meant, without any need to consider topics such as whether it is UMVUE, admissible, or minimax.
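The role of the divisor can be seen by building the full matrix $[\hat{\gamma}(i - j)]$: with divisor $n$ its eigenvalues are non-negative, whereas the "unbiased-looking" divisor $n - h$ yields a matrix that need not be non-negative definite. A sketch assuming numpy; `sample_acov` is our helper name:

```python
import numpy as np

def sample_acov(x, h, divisor_n=True):
    # sample autocovariance at lag h >= 0
    n, xbar = len(x), x.mean()
    s = np.sum((x[h:] - xbar) * (x[: n - h] - xbar))
    return s / n if divisor_n else s / (n - h)

rng = np.random.default_rng(2)
x = rng.standard_normal(30)
n = len(x)

G  = np.array([[sample_acov(x, abs(i - j)) for j in range(n)] for i in range(n)])
Gu = np.array([[sample_acov(x, abs(i - j), divisor_n=False)
                for j in range(n)] for i in range(n)])

assert np.linalg.eigvalsh(G).min() >= -1e-10   # divisor n: always NND
min_eig_unbiased = np.linalg.eigvalsh(Gu).min()  # divisor n - h: may be negative
```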
