Introduction to Autoregressive models

This tutorial presents an introduction to autoregressive models and their implementation in R.

1  AR(1) model

1.1  Model definition

\[r_t=\phi_0+\phi_1 r_{t-1}+a_t\]

where \(\{a_t\}\) is a white noise series with mean zero and variance \(\sigma_a^2\).

Notes:

  • The AR(1) model is widely used not only for returns, as with \(r_t\) here, but also for volatility, with \(r_t\) replaced by \(\sigma_t\).
  • Conditional on the past return \(r_{t-1}\), the conditional mean and variance are\[\begin{gather*}\mathbb{E}[r_t\mid r_{t-1}]=\phi_0+\phi_1 r_{t-1}\\\text{Var}[r_t\mid r_{t-1}]=\sigma_a^2\end{gather*}\]This is the Markov property: conditional on \(r_{t-1}\), the return \(r_t\) is not correlated with \(r_{t-i}\) for \(i>1\).
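As a minimal sketch of the model above (the parameter values are made up for illustration), an AR(1) series can be simulated in R and its parameters recovered with `arima()`:

```r
# Simulate r_t = phi_0 + phi_1 * r_{t-1} + a_t
# (phi_0 = 0.5, phi_1 = 0.6, sigma_a = 1 are illustrative values)
set.seed(42)
phi0 <- 0.5; phi1 <- 0.6; sigma_a <- 1
n <- 10000
a <- rnorm(n, mean = 0, sd = sigma_a)
r <- numeric(n)
r[1] <- phi0 / (1 - phi1)                 # start at the unconditional mean
for (t in 2:n) r[t] <- phi0 + phi1 * r[t - 1] + a[t]

# Sample mean should be close to phi0 / (1 - phi1) = 1.25
mean(r)

# Fit an AR(1); note that the "intercept" reported by arima() is the
# series mean phi0 / (1 - phi1), not phi_0 itself
fit <- arima(r, order = c(1, 0, 0))
fit$coef
```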

Read more Introduction to Autoregressive models

Stationarity, Autocorrelation, White Noise, and Linear Time Series

This tutorial introduces basic concepts about stationarity, autocorrelation, white noise, and linear time series.

1  Stationarity

1.1  Strict stationarity

A time series \(\{r_t\}\) is said to be strictly stationary if the joint distribution of \((r_{t_1},\cdots,r_{t_k})\) is identical to that of \((r_{t_1+l},\cdots,r_{t_k+l})\) for all \(l\), where k is an arbitrary positive integer and \((t_1,\cdots,t_k)\) is a collection of k positive integers.

In other words, strict stationarity requires that the joint distribution of \((r_{t_1},\cdots,r_{t_k})\) be invariant under time shifts. This is a very strong condition that is hard to verify empirically.

Read more Stationarity, Autocorrelation, White Noise, and Linear Time Series

Asset Return and Distributions

This post talks about asset returns and their distributions, covering various definitions of returns and the relationships among them, return distributions, and tests of returns.

1  Asset returns

Most financial studies involve returns, instead of prices, for two reasons:

  1. Return is a complete and scale-free summary of investment opportunity;
  2. Return has more attractive statistical properties than price.
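As a quick illustration of the definitions involved (the price series below is invented), the relationship between simple and log returns can be checked in R:

```r
# Hypothetical price series
p <- c(100, 102, 101, 105)

simple_ret <- p[-1] / p[-length(p)] - 1   # simple return R_t = P_t/P_{t-1} - 1
log_ret    <- diff(log(p))                # log return r_t = log(P_t/P_{t-1})

# The two are linked by r_t = log(1 + R_t)
all.equal(log_ret, log(1 + simple_ret))
```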

Read more Asset Return and Distributions

Download financial data using R’s quantmod package

This tutorial gives a short introduction to using R’s quantmod package to retrieve financial time series data from the internet.

1  Overview of quantmod package

The quantmod package for R is designed to assist the quantitative trader in the development, testing, and deployment of statistically based trading models. It provides a rapid prototyping environment, where quant traders can quickly and cleanly explore and build trading models. Quantmod makes modelling easier by removing the repetitive workflow issues surrounding data management, modelling interfaces, and performance analysis.

However, quantmod is not a replacement for anything statistical. It has no ‘new’ modelling routines or analysis tools to speak of. It does offer charting not currently available elsewhere in R, but almost everything else is a wrapper around what you already know and love about the language and packages you currently use.
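A minimal sketch of the basic workflow, using quantmod's `getSymbols()` and `chartSeries()` (this requires an internet connection, and the ticker symbol and date range are only illustrative):

```r
library(quantmod)

# Download daily prices for a ticker from Yahoo Finance
# (symbol and dates are illustrative)
getSymbols("AAPL", src = "yahoo", from = "2020-01-01", to = "2020-12-31")

head(AAPL)          # an xts object with OHLC, volume, and adjusted columns
chartSeries(AAPL)   # quantmod's built-in charting
```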

Read more Download financial data using R’s quantmod package

R in Time Series: Linear Regression With Harmonic Seasonality

This tutorial talks about linear regression with harmonic seasonality.

1  Underlying mathematics

In regression modelling with seasonality, we can use one parameter for each season: for instance, 12 parameters for the 12 months in a year. However, seasonal effects often vary smoothly over the seasons, so it may be more parameter-efficient to use a smooth function instead of separate indices. Sine and cosine functions can be used to build smooth variation into a seasonal model.

Read more R in Time Series: Linear Regression With Harmonic Seasonality
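The parameter savings can be sketched as follows (a simulated monthly series; the frequencies and coefficients here are made up for illustration):

```r
# Build harmonic (sine/cosine) regressors for monthly data
set.seed(1)
n    <- 120                          # 10 years of monthly observations
time <- 1:n
x    <- 5 + 2 * sin(2 * pi * time / 12) + 1 * cos(2 * pi * time / 12) +
        rnorm(n, sd = 0.5)

SIN <- sin(2 * pi * time / 12)
COS <- cos(2 * pi * time / 12)
fit <- lm(x ~ SIN + COS)             # 2 seasonal parameters instead of 11 dummies
coef(fit)                            # should be close to 5, 2, and 1
```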

R in Time Series: Linear Regression with Seasonal Variables

This tutorial gives a short introduction about linear regression with seasonal variables.

A time series is a set of observations measured sequentially in time. Seasonal effects are often present in such data, especially annual cycles caused directly or indirectly by the Earth’s movement around the sun. Here we will present a linear regression model with additive seasonal indicator variables included.

Suppose a time series contains s seasons. For example

  • For time series measured over each calendar month, s = 12.
  • For time series measured in six-month intervals, corresponding to summer and winter, s = 2.
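A minimal sketch of the s = 12 case, using R's `factor()` to create one indicator per season (the simulated trend and seasonal effects are made up for illustration):

```r
# Regression with additive seasonal indicator variables
set.seed(2)
n      <- 120
trend  <- 0.05 * (1:n)
season <- factor(rep(1:12, length.out = n))      # s = 12 seasons
x      <- trend + as.numeric(season) * 0.1 + rnorm(n, sd = 0.2)

# One parameter per season (no common intercept), plus a linear trend
fit <- lm(x ~ 0 + season + I(1:n))
length(coef(fit))                                # 12 seasonal indices + 1 slope
```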

Read more R in Time Series: Linear Regression with Seasonal Variables

R in Time Series: Linear Regression

This tutorial talks about linear regression on time series and implementations in R.

1  Trend: stochastic vs deterministic

  • We may consider a trend to be stochastic when it shows inexplicable changes in direction, and we attribute apparent transient trends to high serial correlation in the random errors.
  • When we have some plausible physical explanation for a trend, we usually wish to model it in some deterministic manner. Deterministic trends and seasonal variations can be modelled using regression.
  • The practical difference between stochastic and deterministic trends is that we extrapolate the latter when we make forecasts. We justify short-term extrapolation by claiming that underlying trends will usually change slowly in comparison with the forecast lead time. For the same reason, short-term extrapolation should be based on a line, maybe fitted to the more recent data only, rather than a high-order polynomial.

Read more R in Time Series: Linear Regression

Autocorrelation Affects Regression on Time Series

This post talks about how autocorrelation affects regressions on time series.

Time series regression usually differs from a standard regression analysis because the residuals form a time series and therefore tend to be serially correlated.

  • When the residual correlation is positive, the estimated standard deviations of the parameter estimates, read from the computer output of a standard regression analysis, will tend to be less than their true values. \(\,\Longrightarrow\,\) This will lead to erroneously high statistical significance being attributed to statistical tests in standard computer output. In other words, the obtained p values will be smaller than they should be.
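This effect can be seen in a small simulation (the parameters below are made up): fit a trend through data with AR(1) errors, and compare the nominal standard error reported by `lm()` with the empirical spread of the slope estimate across replications.

```r
# Sketch: positively autocorrelated residuals make lm() standard errors too small
set.seed(3)
n    <- 200
nrep <- 500
x    <- 1:n
slope_est <- replicate(nrep, {
  e <- as.numeric(arima.sim(list(ar = 0.8), n = n))  # AR(1) errors, rho = 0.8
  y <- 2 + 0.01 * x + e
  coef(lm(y ~ x))[2]
})

# Nominal SE from one standard regression vs the true sampling spread
e <- as.numeric(arima.sim(list(ar = 0.8), n = n))
y <- 2 + 0.01 * x + e
nominal_se <- summary(lm(y ~ x))$coefficients[2, 2]
c(nominal = nominal_se, empirical = sd(slope_est))   # nominal is well below empirical
```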

Read more Autocorrelation Affects Regression on Time Series

R in Time Series: Autoregressive model

This tutorial gives a brief introduction to the autoregressive model in time series analysis.

1  Definition

A time series \(\{x_t\}\,\) is an autoregressive process of order p, denoted by AR(p), if

\[x_t=\alpha_1x_{t-1}+\alpha_2x_{t-2}+\cdots+\alpha_px_{t-p}+w_t=\sum_{i=1}^p\alpha_ix_{t-i}+w_t\]

Read more R in Time Series: Autoregressive model
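A minimal sketch of fitting this model with R's built-in `ar()`, which can choose the order p by AIC (the simulated AR(2) coefficients below are illustrative):

```r
# Simulate an AR(2) process and fit an AR(p) by Yule-Walker
set.seed(4)
x   <- arima.sim(list(ar = c(0.5, 0.3)), n = 1000)
fit <- ar(x, method = "yule-walker")   # order selected by AIC by default
fit$order                              # should usually select p = 2
fit$ar                                 # estimated alpha_1, ..., alpha_p
```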

R in Time Series: Holt-Winters Smoothing and Forecast

This tutorial shows how to do Holt-Winters smoothing and forecasting in R.

1  Basics of Holt-Winters method

1.1  Additive model

\[\text{Level:    }a_t=\alpha(x_t-s_{t-p})+(1-\alpha)(a_{t-1}+b_{t-1})\]

\[\text{Trend (or slope):    }b_t=\beta(a_t-a_{t-1})+(1-\beta)b_{t-1}\]

\[\text{Seasonal effect:    }s_t=\gamma(x_t-a_t)+(1-\gamma)s_{t-p}\]

where \(a_t\), \(b_t\), and \(s_t\) are the estimated level, slope, and seasonal effect at time t, and \(\alpha\), \(\beta\), and \(\gamma\) are the smoothing parameters.

Read more R in Time Series: Holt-Winters Smoothing and Forecast
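A minimal sketch using R's built-in `HoltWinters()` on the `AirPassengers` dataset (the additive form is used here to match the equations above; for this particular series a multiplicative seasonal model is often preferred):

```r
# Holt-Winters smoothing on a built-in seasonal series
fit <- HoltWinters(AirPassengers, seasonal = "additive")
c(fit$alpha, fit$beta, fit$gamma)     # estimated smoothing parameters

# Forecast 12 months ahead with prediction intervals
pred <- predict(fit, n.ahead = 12, prediction.interval = TRUE)
head(pred)
```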