# Stationarity, Autocorrelation, White Noise, and Linear Time Series

This tutorial introduces basic concepts about stationarity, autocorrelation, white noise, and linear time series.

### 1  Stationarity

#### 1.1  Strict stationarity

A time series $$\{r_t\}$$ is said to be strictly stationary if the joint distribution of $$(r_{t_1},\cdots,r_{t_k})$$ is identical to that of $$(r_{t_1+l},\cdots,r_{t_k+l})$$ for all $$l$$, where $$k$$ is an arbitrary positive integer and $$(t_1,\cdots,t_k)$$ is a collection of $$k$$ positive integers.

In other words, strict stationarity requires that the joint distribution of $$(r_{t_1},\cdots,r_{t_k})$$ be invariant under time shifts. This is a strong condition that is hard to verify empirically.
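The shift-invariance idea can be illustrated with a short simulation sketch (the model and parameter values here are assumptions for illustration, not part of the tutorial): an i.i.d. Gaussian series is strictly stationary, so the joint distribution of $$(r_t, r_{t+1})$$ should not depend on $$t$$. We approximate a check of this by simulating an ensemble of realizations and comparing sample moments at two different time origins.

```python
import numpy as np

# Simulate an ensemble of i.i.d. Gaussian series (strictly stationary):
# 10,000 independent realizations, each of length 100.
rng = np.random.default_rng(42)
ensemble = rng.normal(size=(10_000, 100))

# Compare the sample mean and covariance of (r_t, r_{t+1})
# at two different time origins; they should be nearly identical.
for t in (10, 60):
    pair = ensemble[:, [t, t + 1]]
    print(f"t={t}: mean={pair.mean(axis=0).round(3)}, "
          f"cov={np.cov(pair.T).round(3).tolist()}")
```

Both origins give sample means near zero and a sample covariance near the identity matrix, consistent with the joint distribution being invariant under the shift. Note that agreement of a few moments at a few origins is only a necessary-condition check, not a verification of strict stationarity.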

# R in Time Series: Autocorrelation in R

This tutorial presents basics of autocorrelation in R.

#### 1  Stationarity vs Ergodicity

The mean function of a time series model is

$$\mu(t)=E(x_t)$$

which is, in general, a function of time $$t$$. Since $$x_t$$ can take a different realization at each $$t$$, the above expectation is an average taken across the ensemble of all the possible time series that might have been generated by the time series model.
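The ensemble average can be sketched in a few lines of Python (the linear-trend model below is an assumption chosen purely for illustration): simulate many realizations of the same model and average across them at each time point to estimate $$\mu(t)$$.

```python
import numpy as np

# Illustrative sketch: estimate mu(t) = E(x_t) as an ensemble average.
# Assumed model for illustration: x_t = 0.5 * t + Gaussian white noise,
# so the true mean function is mu(t) = 0.5 * t.
rng = np.random.default_rng(0)
n_series, length = 5_000, 20
t = np.arange(length)
ensemble = 0.5 * t + rng.normal(size=(n_series, length))

# Average across realizations (axis 0) at each time point t.
mu_hat = ensemble.mean(axis=0)
print(mu_hat.round(2))  # tracks the trend 0.5 * t
```

Each column of `ensemble` holds the values of $$x_t$$ at one fixed $$t$$ across all simulated realizations, so averaging down a column is exactly the ensemble average the definition describes.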

The ensemble constitutes the entire population. Given a time series model, we can simulate as many time series as we like. With historical data, however, we usually have only a single time series. So, without assuming a mathematical structure for the trend, all we can do is estimate the mean at each sample point by the corresponding observed value.
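This is where ergodicity matters: for a stationary, ergodic process, the time average of a single long realization converges to the ensemble mean, which is what justifies estimating the mean from one historical series. A minimal sketch, assuming an AR(1) model chosen only for illustration:

```python
import numpy as np

# Sketch: for an ergodic stationary process, the time average of one
# long realization approximates the ensemble mean E(x_t).
# Assumed model for illustration: AR(1), x_t = 0.6 * x_{t-1} + e_t,
# whose stationary mean is 0.
rng = np.random.default_rng(1)
n = 100_000
e = rng.normal(size=n)
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = 0.6 * x[i - 1] + e[i]

# Time average over the single realization, in place of an
# (unavailable) ensemble average.
time_average = x.mean()
print(round(time_average, 3))  # near the theoretical mean of 0
```

The key contrast with the previous sketch is the direction of averaging: there we averaged across many series at a fixed $$t$$; here we average along one series over $$t$$, and ergodicity is the property that makes the two agree in the limit.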