Introduction to Time Series Analysis. Lecture 5.
1. AR(1) as a linear process
2. Causality
3. Invertibility
4. AR(p) models
5. ARMA(p,q) models
AR(1) as a linear process
Let $\{X_t\}$ be the stationary solution to $X_t - \phi X_{t-1} = W_t$, where $W_t \sim WN(0, \sigma^2)$.
If $|\phi| < 1$,
$$X_t = \sum_{j=0}^{\infty} \phi^j W_{t-j}$$
is the unique solution:
• This infinite sum converges in mean square, since $|\phi| < 1$ implies $\sum_j |\phi^j| < \infty$.
• It satisfies the AR(1) recurrence.
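Both claims are easy to check numerically. A minimal sketch (not from the lecture; numpy, with illustrative parameter values): build $X_t$ from the truncated linear-process sum and verify that it satisfies the AR(1) recurrence up to the truncation error.

```python
import numpy as np

rng = np.random.default_rng(0)
phi = 0.6                      # |phi| < 1, so the sum converges
n, trunc = 500, 100            # series length; truncation point of the infinite sum

w = rng.standard_normal(n + trunc)        # white noise W_t
weights = phi ** np.arange(trunc + 1)     # coefficients phi^j

# np.convolve computes exactly X_t = sum_{j=0}^{trunc} phi^j W_{t-j};
# drop the start-up stretch where past lags are missing.
x = np.convolve(w, weights)[: n + trunc][trunc:]
w_aligned = w[trunc:]

# Residual of the recurrence X_t - phi X_{t-1} = W_t; only the
# O(phi^trunc) truncation error should remain.
max_err = float(np.abs(x[1:] - phi * x[:-1] - w_aligned[1:]).max())
```

The residual is of order $\phi^{\text{trunc}+1}$, i.e. numerically zero here.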
AR(1) in terms of the back-shift operator
We can write
$$X_t - \phi X_{t-1} = W_t \;\Leftrightarrow\; \underbrace{(1 - \phi B)}_{\phi(B)}\, X_t = W_t \;\Leftrightarrow\; \phi(B) X_t = W_t.$$
Recall that $B$ is the back-shift operator: $B X_t = X_{t-1}$.
AR(1) in terms of the back-shift operator
Also, we can write
$$X_t = \sum_{j=0}^{\infty} \phi^j W_{t-j} \;\Leftrightarrow\; X_t = \underbrace{\left( \sum_{j=0}^{\infty} \phi^j B^j \right)}_{\pi(B)} W_t \;\Leftrightarrow\; X_t = \pi(B) W_t.$$
AR(1) in terms of the back-shift operator
With these definitions:
$$\pi(B) = \sum_{j=0}^{\infty} \phi^j B^j \quad\text{and}\quad \phi(B) = 1 - \phi B,$$
we can check that $\pi(B) = \phi(B)^{-1}$:
$$\pi(B)\phi(B) = \left( \sum_{j=0}^{\infty} \phi^j B^j \right)(1 - \phi B) = \sum_{j=0}^{\infty} \phi^j B^j - \sum_{j=1}^{\infty} \phi^j B^j = 1.$$
Thus,
$$\phi(B) X_t = W_t \;\Rightarrow\; \pi(B)\phi(B) X_t = \pi(B) W_t \;\Leftrightarrow\; X_t = \pi(B) W_t.$$
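The telescoping identity $\pi(B)\phi(B) = 1$ can be verified numerically: multiplying power series in $B$ corresponds to convolving their coefficient sequences. A small sketch (numpy; the value of $\phi$ is illustrative):

```python
import numpy as np

phi = 0.6
N = 50                                   # keep coefficients of B^0, ..., B^N

pi_coeffs = phi ** np.arange(N + 1)      # pi(B) = 1 + phi B + phi^2 B^2 + ...
phi_coeffs = np.array([1.0, -phi])       # phi(B) = 1 - phi B

# Multiplying the two operators = convolving their coefficient sequences.
# The coefficients of B^1, ..., B^N should all telescope to zero.
product = np.convolve(pi_coeffs, phi_coeffs)[: N + 1]
const_term = float(product[0])
max_higher_term = float(np.abs(product[1:]).max())
```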
AR(1) in terms of the back-shift operator
Notice that manipulating operators like $\phi(B)$, $\pi(B)$ is like manipulating polynomials:
$$\frac{1}{1 - \phi z} = 1 + \phi z + \phi^2 z^2 + \phi^3 z^3 + \cdots,$$
provided $|\phi| < 1$ and $|z| \le 1$.
AR(1) and Causality
Let $\{X_t\}$ be the stationary solution to
$$X_t - \phi X_{t-1} = W_t,$$
where $W_t \sim WN(0, \sigma^2)$.
If $|\phi| < 1$,
$$X_t = \sum_{j=0}^{\infty} \phi^j W_{t-j}.$$
What if $\phi = 1$? $\phi = -1$? $|\phi| > 1$?
AR(1) and Causality
If $|\phi| > 1$, $\pi(B) W_t$ does not converge.
But we can rearrange
$$X_t = \phi X_{t-1} + W_t \quad\text{as}\quad X_{t-1} = \frac{1}{\phi} X_t - \frac{1}{\phi} W_t,$$
and we can check that the unique stationary solution is
$$X_t = -\sum_{j=1}^{\infty} \phi^{-j} W_{t+j}.$$
But... $X_t$ depends on future values of $W_t$.
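This backward-looking solution can also be checked numerically with a truncated sum. A sketch (numpy; $\phi = 2$ and the truncation length are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
phi = 2.0                      # |phi| > 1: the forward expansion diverges
n, trunc = 300, 60             # series length; truncation of the sum over future noise

w = rng.standard_normal(n + trunc + 1)
decay = phi ** -np.arange(1, trunc + 1)     # phi^{-j}, j = 1, ..., trunc

# X_t = -sum_{j=1}^{trunc} phi^{-j} W_{t+j}: each X_t uses FUTURE noise only.
x = np.array([-(decay @ w[t + 1 : t + trunc + 1]) for t in range(n)])

# It still satisfies the AR(1) recurrence X_t - phi X_{t-1} = W_t,
# up to the phi^{-trunc} truncation error.
max_err = float(np.abs(x[1:] - phi * x[:-1] - w[1:n]).max())
```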
Causality
A linear process $\{X_t\}$ is causal (strictly, a causal function of $\{W_t\}$) if there is a
$$\psi(B) = \psi_0 + \psi_1 B + \psi_2 B^2 + \cdots \quad\text{with}\quad \sum_{j=0}^{\infty} |\psi_j| < \infty$$
and $X_t = \psi(B) W_t$.
AR(1) and Causality
• Causality is a property of {Xt } and {Wt }.
• Consider the AR(1) process defined by $\phi(B) X_t = W_t$ (with $\phi(B) = 1 - \phi B$):
$\phi(B) X_t = W_t$ is causal
iff $|\phi| < 1$
iff the root $z_1$ of the polynomial $\phi(z) = 1 - \phi z$ satisfies $|z_1| > 1$.
AR(1) and Causality
• Consider the AR(1) process $\phi(B) X_t = W_t$ (with $\phi(B) = 1 - \phi B$):
If $|\phi| > 1$, we can define an equivalent causal model,
$$X_t - \phi^{-1} X_{t-1} = \tilde{W}_t,$$
where $\tilde{W}_t$ is a new white noise sequence.
AR(1) and Causality
• Is an MA(1) process causal?
MA(1) and Invertibility
Define
$$X_t = W_t + \theta W_{t-1} = (1 + \theta B) W_t.$$
If $|\theta| < 1$, we can write
$$(1 + \theta B)^{-1} X_t = W_t \;\Leftrightarrow\; (1 - \theta B + \theta^2 B^2 - \theta^3 B^3 + \cdots) X_t = W_t \;\Leftrightarrow\; \sum_{j=0}^{\infty} (-\theta)^j X_{t-j} = W_t.$$
That is, we can write Wt as a causal function of Xt .
We say that this MA(1) is invertible.
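The inverted representation can be exercised numerically: simulate an MA(1), then recover the noise from the observed series via the truncated expansion. A sketch (numpy; $\theta$ and the truncation length are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
theta = 0.5                    # |theta| < 1: invertible
n, trunc = 400, 100

w = rng.standard_normal(n)
x = np.empty(n)
x[0] = w[0]                    # start the MA(1) with W_{-1} treated as 0
x[1:] = w[1:] + theta * w[:-1]

# W_t = sum_{j=0}^{trunc} (-theta)^j X_{t-j}  (truncated inverted representation)
coeffs = (-theta) ** np.arange(trunc + 1)
w_hat = np.convolve(x, coeffs)[:n]

# After a burn-in, the reconstructed noise should match the true noise,
# up to the theta^trunc truncation error.
max_err = float(np.abs(w_hat[trunc:] - w[trunc:]).max())
```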
MA(1) and Invertibility
$$X_t = W_t + \theta W_{t-1}.$$
If $|\theta| > 1$, the sum $\sum_{j=0}^{\infty} (-\theta)^j X_{t-j}$ diverges, but we can write
$$W_{t-1} = -\theta^{-1} W_t + \theta^{-1} X_t.$$
Just like the noncausal AR(1), we can show that
$$W_t = -\sum_{j=1}^{\infty} (-\theta)^{-j} X_{t+j}.$$
That is, we can write Wt as a linear function of Xt , but it is not causal.
We say that this MA(1) is not invertible.
Invertibility
A linear process $\{X_t\}$ is invertible (strictly, an invertible function of $\{W_t\}$) if there is a
$$\pi(B) = \pi_0 + \pi_1 B + \pi_2 B^2 + \cdots \quad\text{with}\quad \sum_{j=0}^{\infty} |\pi_j| < \infty$$
and $W_t = \pi(B) X_t$.
MA(1) and Invertibility
• Invertibility is a property of {Xt } and {Wt }.
• Consider the MA(1) process defined by $X_t = \theta(B) W_t$ (with $\theta(B) = 1 + \theta B$):
$X_t = \theta(B) W_t$ is invertible
iff $|\theta| < 1$
iff the root $z_1$ of the polynomial $\theta(z) = 1 + \theta z$ satisfies $|z_1| > 1$.
MA(1) and Invertibility
• Consider the MA(1) process $X_t = \theta(B) W_t$ (with $\theta(B) = 1 + \theta B$):
If $|\theta| > 1$, we can define an equivalent invertible model in terms of a new white noise sequence.
• Is an AR(1) process invertible?
AR(p): Autoregressive models of order p
An AR(p) process $\{X_t\}$ is a stationary process that satisfies
$$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = W_t,$$
where $\{W_t\} \sim WN(0, \sigma^2)$.
Equivalently, $\phi(B) X_t = W_t$, where $\phi(B) = 1 - \phi_1 B - \cdots - \phi_p B^p$.
AR(p): Constraints on φ
Recall: For $p = 1$ (AR(1)), $\phi(B) = 1 - \phi_1 B$.
This is an AR(1) model only if there is a stationary solution to $\phi(B) X_t = W_t$, which is equivalent to $|\phi_1| \neq 1$.
This is equivalent to the following condition on $\phi(z) = 1 - \phi_1 z$:
$$\forall z \in \mathbb{R}, \quad \phi(z) = 0 \;\Rightarrow\; z \neq \pm 1;$$
equivalently,
$$\forall z \in \mathbb{C}, \quad \phi(z) = 0 \;\Rightarrow\; |z| \neq 1,$$
where $\mathbb{C}$ is the set of complex numbers.
AR(p): Constraints on φ
Stationarity: $\forall z \in \mathbb{C},\; \phi(z) = 0 \Rightarrow |z| \neq 1$, where $\mathbb{C}$ is the set of complex numbers.
$\phi(z) = 1 - \phi_1 z$ has one root, at $z_1 = 1/\phi_1 \in \mathbb{R}$.
But the roots of a degree $p > 1$ polynomial might be complex.
For stationarity, we want the roots of $\phi(z)$ to avoid the unit circle, $\{z \in \mathbb{C} : |z| = 1\}$.
AR(p): Stationarity and causality
Theorem: A (unique) stationary solution to $\phi(B) X_t = W_t$ exists iff
$$\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p = 0 \;\Rightarrow\; |z| \neq 1.$$
This AR(p) process is causal iff
$$\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p = 0 \;\Rightarrow\; |z| > 1.$$
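The root condition is easy to check numerically with `np.roots`. A sketch (the helper name and example coefficients are mine, not the lecture's):

```python
import numpy as np

def ar_is_causal(phis):
    """Return True iff phi(z) = 1 - phi_1 z - ... - phi_p z^p has all roots
    strictly outside the unit circle (the causality condition above)."""
    # np.roots expects coefficients from the highest degree down:
    # [-phi_p, ..., -phi_1, 1].
    coeffs = [-c for c in reversed(phis)] + [1.0]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

# phi(z) = 1 - 0.5 z + 0.6 z^2 (the example used below): complex roots
# with modulus about 1.29, so causal.
causal = ar_is_causal([0.5, -0.6])
# AR(1) with phi = 1.5: root at z = 2/3, inside the unit circle, so not causal.
noncausal = ar_is_causal([1.5])
```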
Recall: Causality
A linear process $\{X_t\}$ is causal (strictly, a causal function of $\{W_t\}$) if there is a
$$\psi(B) = \psi_0 + \psi_1 B + \psi_2 B^2 + \cdots \quad\text{with}\quad \sum_{j=0}^{\infty} |\psi_j| < \infty$$
and $X_t = \psi(B) W_t$.
AR(p): Roots outside the unit circle implies causal (Details)
$$\forall z \in \mathbb{C},\; |z| \le 1 \Rightarrow \phi(z) \neq 0$$
$$\Leftrightarrow\; \exists \{\psi_j\},\, \delta > 0 : \;\forall |z| \le 1 + \delta, \quad \frac{1}{\phi(z)} = \sum_{j=0}^{\infty} \psi_j z^j$$
$$\Rightarrow\; \text{for } |z| = 1 + \delta, \quad |\psi_j z^j| \to 0, \text{ so } |\psi_j|^{1/j}(1 + \delta) \le 1 \text{ for all large } j$$
$$\Rightarrow\; \exists j_0 : \forall j \ge j_0, \; |\psi_j|^{1/j} \le \frac{1}{1 + \delta/2} \;\Rightarrow\; \sum_{j=0}^{\infty} |\psi_j| < \infty.$$
So if $|z| \le 1 \Rightarrow \phi(z) \neq 0$, then $S_m = \sum_{j=0}^{m} \psi_j B^j W_t$ converges in mean square, so we have a stationary, causal time series $X_t = \phi^{-1}(B) W_t$.
Calculating ψ for an AR(p): matching coefficients
Example: $X_t = \psi(B) W_t \;\Leftrightarrow\; (1 - 0.5B + 0.6B^2) X_t = W_t$,
so $1 = \psi(B)(1 - 0.5B + 0.6B^2)$
$$\Leftrightarrow\; 1 = (\psi_0 + \psi_1 B + \psi_2 B^2 + \cdots)(1 - 0.5B + 0.6B^2)$$
$$\Leftrightarrow\; 1 = \psi_0, \quad 0 = \psi_1 - 0.5\psi_0, \quad 0 = \psi_2 - 0.5\psi_1 + 0.6\psi_0, \quad 0 = \psi_3 - 0.5\psi_2 + 0.6\psi_1, \;\ldots$$
Calculating ψ for an AR(p): example
$$\Leftrightarrow\; \psi_0 = 1, \quad \psi_j = 0 \;(j < 0), \quad \psi_j - 0.5\psi_{j-1} + 0.6\psi_{j-2} = 0 \;(j > 0);$$
that is, $\psi_0 = 1$, $\psi_j = 0$ for $j < 0$, and $\phi(B)\psi_j = 0$ for $j > 0$.
We can solve these linear difference equations in several ways:
• numerically, or
• by guessing the form of a solution and using an inductive proof, or
• by using the theory of linear difference equations.
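For this example, the numerical route is just a two-term recursion (a sketch; the function name is mine):

```python
def ar2_psi(n_terms, phi1=0.5, phi2=-0.6):
    """psi_j for (1 - 0.5 B + 0.6 B^2) X_t = W_t, solved from
    psi_0 = 1, psi_j = 0 for j < 0, and psi_j = phi1*psi_{j-1} + phi2*psi_{j-2}."""
    psi = [1.0]                                 # psi_0 = 1
    for j in range(1, n_terms):
        prev1 = psi[j - 1]
        prev2 = psi[j - 2] if j >= 2 else 0.0   # psi_{-1} = 0
        psi.append(phi1 * prev1 + phi2 * prev2)
    return psi

psi = ar2_psi(5)   # [psi_0, psi_1, psi_2, psi_3, psi_4]
```

The first few values are $\psi_0 = 1$, $\psi_1 = 0.5$, $\psi_2 = -0.35$, $\psi_3 = -0.475$, matching the matching-coefficients equations above.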
Calculating ψ for an AR(p): general case
$$\phi(B) X_t = W_t \;\Leftrightarrow\; X_t = \psi(B) W_t,$$
so $1 = \psi(B)\phi(B)$
$$\Leftrightarrow\; 1 = (\psi_0 + \psi_1 B + \cdots)(1 - \phi_1 B - \cdots - \phi_p B^p)$$
$$\Leftrightarrow\; 1 = \psi_0, \quad 0 = \psi_1 - \phi_1 \psi_0, \quad 0 = \psi_2 - \phi_1 \psi_1 - \phi_2 \psi_0, \;\ldots$$
$$\Leftrightarrow\; \psi_0 = 1, \quad \psi_j = 0 \;(j < 0), \quad \phi(B)\psi_j = 0 \;(j > 0).$$
ARMA(p,q): Autoregressive moving average models
An ARMA(p,q) process $\{X_t\}$ is a stationary process that satisfies
$$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = W_t + \theta_1 W_{t-1} + \cdots + \theta_q W_{t-q},$$
where $\{W_t\} \sim WN(0, \sigma^2)$.
• AR(p) = ARMA(p,0): θ(B) = 1.
• MA(q) = ARMA(0,q): φ(B) = 1.
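The defining recursion can be simulated directly. A sketch (numpy; the ARMA(1,1) parameters and burn-in length are illustrative choices, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(3)
phi, theta = 0.5, 0.4          # ARMA(1,1): X_t - phi X_{t-1} = W_t + theta W_{t-1}
n, burn = 500, 200             # burn-in lets the arbitrary start X_0 = 0 wash out

w = rng.standard_normal(n + burn)
x = np.zeros(n + burn)
for t in range(1, n + burn):
    x[t] = phi * x[t - 1] + w[t] + theta * w[t - 1]
x = x[burn:]

# For this causal ARMA(1,1) with sigma^2 = 1,
# gamma_X(0) = (1 + 2*phi*theta + theta^2) / (1 - phi^2);
# the sample variance should land near it.
theoretical_var = (1 + 2 * phi * theta + theta ** 2) / (1 - phi ** 2)
sample_var = float(np.var(x))
```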
ARMA processes
Can accurately approximate many stationary processes: for any stationary process with autocovariance $\gamma$, and any $k > 0$, there is an ARMA process $\{X_t\}$ for which
$$\gamma_X(h) = \gamma(h), \quad h = 0, 1, \ldots, k.$$
ARMA(p,q): Autoregressive moving average models
An ARMA(p,q) process $\{X_t\}$ is a stationary process that satisfies
$$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = W_t + \theta_1 W_{t-1} + \cdots + \theta_q W_{t-q},$$
where $\{W_t\} \sim WN(0, \sigma^2)$.
Usually, we insist that $\phi_p, \theta_q \neq 0$ and that the polynomials
$$\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p, \qquad \theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q$$
have no common factors. This ensures the model cannot be reduced to a lower-order ARMA model.
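A common factor can be detected by comparing the roots of the two polynomials (a sketch; the helper and the example coefficients are mine, assuming $p, q \ge 1$):

```python
import numpy as np

def has_common_factor(phis, thetas, tol=1e-8):
    """True iff phi(z) = 1 - phi_1 z - ... and theta(z) = 1 + theta_1 z + ...
    share a root, i.e. the ARMA(p,q) model reduces to a lower-order one."""
    # np.roots takes coefficients from the highest degree down.
    phi_roots = np.roots([-c for c in reversed(phis)] + [1.0])
    theta_roots = np.roots([c for c in reversed(thetas)] + [1.0])
    return bool(any(np.abs(theta_roots - r).min() < tol for r in phi_roots))

# phi(z) = 1 - 0.5 z and theta(z) = 1 - 0.5 z cancel: really just white noise.
redundant = has_common_factor([0.5], [-0.5])
# phi(z) = 1 - 0.5 z, theta(z) = 1 + 0.4 z: a genuine ARMA(1,1).
genuine = has_common_factor([0.5], [0.4])
```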