
Time Series Analysis

Eavy Junior
August 2024



Comprehensive Guide to Time Series Analysis

Contents

1 Introduction

2 Autoregressive (AR) Model
   2.1 Definition
   2.2 Example
   2.3 Properties
   2.4 Solved Problem

3 Moving Average (MA) Model
   3.1 Definition
   3.2 Example
   3.3 Properties
   3.4 Solved Problem

4 Autoregressive Moving Average (ARMA) Model
   4.1 Definition
   4.2 Example
   4.3 Properties
   4.4 Solved Problem

5 Autoregressive Integrated Moving Average (ARIMA) Model
   5.1 Definition
   5.2 Example
   5.3 Properties
   5.4 Solved Problem

6 Backshift Operator
   6.1 Definition
   6.2 Usage

7 Lag Operator
   7.1 Definition
   7.2 Usage

8 Stochastic Model vs Deterministic Model
   8.1 Stochastic Model
   8.2 Deterministic Model
   8.3 Example

9 Stationarity Test
   9.1 Definition
   9.2 Augmented Dickey-Fuller (ADF) Test
   9.3 Phillips-Perron (PP) Test

10 Autocorrelation Function (ACF) & Partial Autocorrelation Function (PACF)
   10.1 Autocorrelation Function (ACF)
   10.2 Partial Autocorrelation Function (PACF)

11 Unit Root Test
   11.1 Dickey-Fuller (DF) Test
   11.2 Phillips-Perron (PP) Test

12 Diagnostic Tests
   12.1 Jarque-Bera (JB) Test
   12.2 Ljung-Box (LB) Test
   12.3 Heteroscedasticity Test
   12.4 Autocorrelation Test

13 Forecasting
   13.1 Granger Causality Test
   13.2 Impulse Response
   13.3 Variance Decomposition

14 Conclusion

1 Introduction
Time series analysis involves methods for analyzing time-ordered data points.
The primary goal is to model the underlying structure of the series to understand
its behavior and make accurate predictions. This guide covers essential topics
in time series analysis, including models, tests, and forecasting techniques.

2 Autoregressive (AR) Model
2.1 Definition
An Autoregressive (AR) model is a representation of a type of random process; as such, it is used to describe certain time-varying processes in nature, economics, etc. The AR model specifies that the output variable depends linearly on its own previous values and a stochastic term, which accounts for randomness.

Yt = ϕ1 Yt−1 + ϕ2 Yt−2 + . . . + ϕp Yt−p + ϵt

where:
• Yt is the value at time t,
• ϕ1 , ϕ2 , . . . , ϕp are the parameters of the model,
• ϵt is white noise with mean zero and constant variance.
The order of the AR model is determined by the number of lagged observations included, which is denoted by p.

2.2 Example
Consider an AR(1) model:

Yt = 0.5Yt−1 + ϵt

This implies that the current value Yt is determined by half of the previous value
Yt−1 plus some random noise ϵt .

2.3 Properties
• Stationarity: For the AR model to be stationary, the roots of the char-
acteristic equation must lie outside the unit circle.
• Autocorrelation Function (ACF): The ACF of an AR model typically
decays exponentially or in a damped sinusoidal fashion.

2.4 Solved Problem


Given the series Yt = 0.7Yt−1 + ϵt and a set of data points, estimate ϕ1 .
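As a quick numerical sketch of this problem in plain Python (the data are simulated here, since no data set accompanies the problem, and the variable names are ours): the least-squares estimate of ϕ1 comes from regressing Yt on Yt−1.

```python
import random

random.seed(42)  # reproducible simulation

# Simulate Y_t = 0.7 * Y_{t-1} + e_t with standard-normal white noise e_t
n = 2000
y = [0.0]
for _ in range(n - 1):
    y.append(0.7 * y[-1] + random.gauss(0.0, 1.0))

# Least-squares estimate of phi_1: regress Y_t on Y_{t-1} (no intercept)
num = sum(y[t] * y[t - 1] for t in range(1, n))
den = sum(y[t - 1] ** 2 for t in range(1, n))
phi_hat = num / den  # should be close to the true value 0.7
```

With 2000 observations the sampling error of phi_hat is small, so the estimate lands close to 0.7.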

3 Moving Average (MA) Model


3.1 Definition
A Moving Average (MA) model expresses the output variable as a linear combination of current and past stochastic terms (errors). Unlike the AR model, which regresses the series on its past values, the MA model regresses the series on past forecast errors.

Yt = ϵt + θ1 ϵt−1 + θ2 ϵt−2 + . . . + θq ϵt−q

where:
• ϵt is white noise,
• θ1 , θ2 , . . . , θq are the parameters of the model.
The order of the MA model is denoted by q, which indicates the number of
lagged forecast errors in the model.

3.2 Example
Consider an MA(1) model:

Yt = ϵt + 0.5ϵt−1

This implies that the current value Yt is determined by the current error ϵt and
half of the previous error ϵt−1 .

3.3 Properties
• Stationarity: The MA model is inherently stationary.
• Autocorrelation Function (ACF): The ACF of an MA model cuts off
after lag q.

3.4 Solved Problem


Given a series Yt = ϵt + 0.3ϵt−1 + 0.2ϵt−2 , estimate the parameters θ1 and θ2 .
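The cut-off property from Section 3.3 can be checked numerically. The following is a pure-Python sketch (the helper `sample_acf` is ours, not a library function): for a simulated MA(1) series with θ1 = 0.5, the sample ACF should be near θ1 /(1 + θ1 2 ) = 0.4 at lag 1 and near zero at lag 2.

```python
import random

random.seed(0)

# Simulate an MA(1) series Y_t = e_t + 0.5 * e_{t-1}
n = 5000
e = [random.gauss(0.0, 1.0) for _ in range(n + 1)]
y = [e[t + 1] + 0.5 * e[t] for t in range(n)]

def sample_acf(x, k):
    """Sample autocorrelation of x at lag k."""
    m = sum(x) / len(x)
    var = sum((v - m) ** 2 for v in x)
    cov = sum((x[t] - m) * (x[t + k] - m) for t in range(len(x) - k))
    return cov / var

rho1 = sample_acf(y, 1)  # theory: 0.5 / (1 + 0.5**2) = 0.4
rho2 = sample_acf(y, 2)  # theory: 0, since the ACF cuts off after lag q = 1
```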

4 Autoregressive Moving Average (ARMA) Model


4.1 Definition
An ARMA model combines both AR and MA models to describe a time series
that exhibits both autoregressive and moving average characteristics.

Yt = ϕ1 Yt−1 + . . . + ϕp Yt−p + ϵt + θ1 ϵt−1 + . . . + θq ϵt−q

where the notations are as defined previously.

4.2 Example
Consider an ARMA(1,1) model:

Yt = 0.5Yt−1 + ϵt + 0.3ϵt−1

This model indicates that the current value Yt is influenced by both its immediate past value Yt−1 and the past error ϵt−1 .

4.3 Properties
• Stationarity: The AR part of the model must satisfy the stationarity
condition.
• Invertibility: The MA part of the model must satisfy the invertibility
condition, ensuring that the model can be expressed as an infinite AR
process.

• ACF and PACF: The behavior of the ACF and PACF of an ARMA
process depends on the orders p and q.

4.4 Solved Problem


Given the series Yt = 0.4Yt−1 + ϵt − 0.2ϵt−1 , find the best estimates for ϕ1 and
θ1 .

5 Autoregressive Integrated Moving Average (ARIMA) Model
5.1 Definition
The ARIMA model generalizes the ARMA model to accommodate non-stationary series by including a differencing step. The model is denoted as ARIMA(p, d, q),
where:
• p is the order of the autoregressive part,
• d is the degree of differencing required to make the series stationary,
• q is the order of the moving average part.

ARIMA(p, d, q) : ϕ(B)∆d Yt = θ(B)ϵt

where ϕ(B) and θ(B) are the AR and MA polynomials in the backshift operator B, and ∆d Yt represents the d-th differencing of Yt .

5.2 Example
Consider an ARIMA(1,1,1) model:

∆Yt = 0.5∆Yt−1 + ϵt + 0.3ϵt−1

This model indicates that after differencing the series once, it follows an ARMA(1,1)
process.

5.3 Properties
• Stationarity: The differencing step ensures that the series becomes stationary.
• ACF and PACF: The ACF and PACF can help determine the appropriate values of p, d, and q.

5.4 Solved Problem


Given the series Yt , perform differencing and find the best estimates for ϕ1 and
θ1 for an ARIMA(1,1,1) model.
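A minimal sketch of the differencing step in plain Python (the series is a simulated random walk with drift, an assumption made purely for illustration): a non-stationary level series becomes stationary after one difference, whose mean recovers the drift.

```python
import random

random.seed(1)

# A random walk with drift is non-stationary; first differencing (d = 1)
# removes the stochastic trend, leaving a stationary series.
n = 500
y = [0.0]
for _ in range(n - 1):
    y.append(y[-1] + 0.2 + random.gauss(0.0, 1.0))  # drift of 0.2 per step

# First difference: dY_t = Y_t - Y_{t-1}
dy = [y[t] - y[t - 1] for t in range(1, n)]

mean_dy = sum(dy) / len(dy)  # close to the drift, 0.2
```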

6 Backshift Operator
6.1 Definition
The backshift operator B is a convenient notation in time series analysis that
shifts a time series back by one period. It is defined as:

BYt = Yt−1

More generally, B k Yt = Yt−k .

6.2 Usage
The backshift operator is used in expressing AR, MA, and ARMA models compactly. For example, an AR(1) model can be written as:

(1 − ϕ1 B)Yt = ϵt
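On a concrete series the operator amounts to a simple shift. A pure-Python illustration (`backshift` is our own helper, not a standard function):

```python
def backshift(series, k=1):
    """Apply B^k: shift the series back k periods (None where undefined)."""
    return [None] * k + list(series[:-k])

y = [10, 12, 11, 13, 14]
by = backshift(y)      # B Y_t   = Y_{t-1} -> [None, 10, 12, 11, 13]
b2y = backshift(y, 2)  # B^2 Y_t = Y_{t-2} -> [None, None, 10, 12, 11]
```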

7 Lag Operator
7.1 Definition
The lag operator is essentially the same as the backshift operator and is used
to represent lags in time series models.

7.2 Usage
The lag operator helps simplify the notation of complex models. For instance,
an MA(2) model can be written as:
Yt = (1 + θ1 B + θ2 B 2 )ϵt

8 Stochastic Model vs Deterministic Model


8.1 Stochastic Model
A stochastic model incorporates random variables or processes. The outcome
is not determined entirely by the initial conditions, and the model accounts for
inherent randomness.

8.2 Deterministic Model


In contrast, a deterministic model produces the same output from a given set
of initial conditions. It does not account for randomness, and the outcome is
fully determined by the inputs.

8.3 Example
An AR model is an example of a stochastic model, while a simple linear regression without a stochastic error term is a deterministic model.

9 Stationarity Test
9.1 Definition
A time series is stationary if its statistical properties such as mean, variance,
and autocorrelation are constant over time. Stationarity is a crucial assumption
in many time series models.

9.2 Augmented Dickey-Fuller (ADF) Test


The ADF test is used to test the null hypothesis that a unit root is present in
a time series sample. The regression equation for the ADF test is:
∆Yt = α + βt + γYt−1 + δ1 ∆Yt−1 + . . . + δk ∆Yt−k + ϵt
where ∆Yt = Yt − Yt−1 is the differenced series.

9.3 Phillips-Perron (PP) Test


The Phillips-Perron test is another test for a unit root, similar to the ADF test
but with a different method of handling serial correlation and heteroscedasticity
in the error term.

10 Autocorrelation Function (ACF) & Partial
Autocorrelation Function (PACF)
10.1 Autocorrelation Function (ACF)
The ACF measures the correlation between observations in a time series separated by k time periods. The ACF of a time series Yt is defined as:
ρ(k) = Cov(Yt , Yt+k ) / Var(Yt )

10.2 Partial Autocorrelation Function (PACF)


The PACF measures the correlation between Yt and Yt+k after controlling for
the effects of intermediate lags (i.e., Yt+1 , Yt+2 , . . . , Yt+k−1 ).

11 Unit Root Test


11.1 Dickey-Fuller (DF) Test
The Dickey-Fuller test is used to determine whether a unit root is present in an
autoregressive model. The test statistic is derived from the following regression:
Yt = ρYt−1 + ϵt
where the null hypothesis is H0 : ρ = 1, indicating a unit root.
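Equivalently, one can regress ∆Yt on Yt−1 and test γ = ρ − 1 = 0. A stripped-down pure-Python sketch of that regression (no lagged-difference terms or critical values, so this only illustrates the idea, not the full test; `gamma_hat` is our own helper):

```python
import random

random.seed(7)

def gamma_hat(y):
    """OLS slope from regressing dY_t on Y_{t-1} with an intercept
    (gamma = rho - 1; gamma near 0 suggests a unit root)."""
    x = y[:-1]
    dy = [y[t] - y[t - 1] for t in range(1, len(y))]
    mx = sum(x) / len(x)
    md = sum(dy) / len(dy)
    sxx = sum((v - mx) ** 2 for v in x)
    sxd = sum((x[i] - mx) * (dy[i] - md) for i in range(len(x)))
    return sxd / sxx

n = 2000
rw = [0.0]  # random walk: has a unit root
ar = [0.0]  # stationary AR(1) with phi = 0.5
for _ in range(n - 1):
    rw.append(rw[-1] + random.gauss(0.0, 1.0))
    ar.append(0.5 * ar[-1] + random.gauss(0.0, 1.0))

g_rw = gamma_hat(rw)  # close to 0 (rho close to 1)
g_ar = gamma_hat(ar)  # close to phi - 1 = -0.5
```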

11.2 Phillips-Perron (PP) Test


The Phillips-Perron test extends the Dickey-Fuller test to allow for serial correlation and heteroscedasticity in the errors.

12 Diagnostic Tests
12.1 Jarque-Bera (JB) Test
The JB test is used to test whether the sample data have the skewness and
kurtosis matching a normal distribution. The test statistic is:
JB = (n/6) [ S 2 + (K − 3)2 /4 ]
where S is skewness, K is kurtosis, and n is the sample size.
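The statistic can be computed directly from sample moments. A pure-Python sketch (the function `jarque_bera` is ours, not a library call), comparing a simulated normal sample against a skewed exponential one:

```python
import random

random.seed(3)

def jarque_bera(x):
    """Jarque-Bera statistic from sample skewness S and kurtosis K."""
    n = len(x)
    m = sum(x) / n
    m2 = sum((v - m) ** 2 for v in x) / n
    m3 = sum((v - m) ** 3 for v in x) / n
    m4 = sum((v - m) ** 4 for v in x) / n
    s = m3 / m2 ** 1.5  # skewness
    k = m4 / m2 ** 2    # kurtosis
    return n / 6.0 * (s ** 2 + (k - 3.0) ** 2 / 4.0)

normal = [random.gauss(0.0, 1.0) for _ in range(3000)]
skewed = [random.expovariate(1.0) for _ in range(3000)]

jb_normal = jarque_bera(normal)  # small: consistent with normality
jb_skewed = jarque_bera(skewed)  # very large: normality clearly rejected
```

Under normality JB is approximately chi-square with 2 degrees of freedom, so small values are consistent with the null.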

12.2 Ljung-Box (LB) Test


The LB test is a type of portmanteau test for checking the lack of fit in a time
series model. It tests the null hypothesis that the residuals are independently
distributed (i.e., no autocorrelation).
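The statistic is Q = n(n + 2) Σk ρ̂k 2 /(n − k) over lags k = 1, ..., h, compared against a chi-square distribution with h degrees of freedom. A pure-Python sketch (`ljung_box` is our own helper):

```python
import random

random.seed(5)

def ljung_box(x, h):
    """Ljung-Box Q statistic over lags 1..h."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x)
    q = 0.0
    for k in range(1, h + 1):
        rho = sum((x[t] - m) * (x[t + k] - m) for t in range(n - k)) / var
        q += rho ** 2 / (n - k)
    return n * (n + 2) * q

white = [random.gauss(0.0, 1.0) for _ in range(1000)]
ar = [0.0]
for _ in range(999):
    ar.append(0.8 * ar[-1] + random.gauss(0.0, 1.0))

q_white = ljung_box(white, 10)  # small: roughly chi-square with 10 df
q_ar = ljung_box(ar, 10)        # very large: strong autocorrelation
```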

12.3 Heteroscedasticity Test
Heteroscedasticity tests check whether the variance of the errors is constant
across observations. Common tests include the Breusch-Pagan test and the
White test.

12.4 Autocorrelation Test


Autocorrelation tests, such as the Durbin-Watson test, check for the presence
of autocorrelation in the residuals of a regression model.

13 Forecasting
13.1 Granger Causality Test
The Granger causality test is used to determine whether one time series can
predict another. If a time series X Granger-causes Y , then past values of X
contain information that helps predict Y beyond the information contained in
past values of Y alone.

13.2 Impulse Response


Impulse response functions measure the effect of a one-time shock to one of the
innovations on current and future values of the endogenous variables. They are
particularly useful in Vector Autoregressive (VAR) models.

13.3 Variance Decomposition


Variance decomposition quantifies the contribution of each shock to the variance
of the forecast error for each variable in a VAR model. It provides insights into
the relative importance of each shock in affecting the variables in the system.

14 Conclusion
This guide has provided an in-depth look at the fundamental models, tests, and
forecasting methods in time series analysis. By understanding these concepts,
you will be better prepared to analyze and interpret time series data effectively.
