Lecture 1 Introduction

The document provides an introduction to time series analysis, including forecasting problems such as causal regression and autoregression, and emphasizes the use of R for statistical analysis. It covers concepts of covariance, autocovariance, and stationarity, detailing how to prepare data and assess relationships in time series data. Additionally, it discusses methods for transforming non-stationary data into stationary forms through differencing and logarithmic transformations.


Lecture 1 Introduction

Basic Ideas of Time Series


• Example 1
• Example 2: Global warming
Forecasting Problems
• Problem Type 1: Causal regression

• Problem: Predict X_{n+k} from Y_1, Y_2, …, Y_n

• Objective: Make forecasts based on another, related time series.

• Principle: New values of X may be related to past values of Y. The time
series methods taught in this course can be used to discover such hidden
relationships.

• Computing: Use R for the statistical analysis.
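
A minimal sketch of a causal regression in R, using simulated data (no dataset is specified on this slide, so the series and the 0.5 coefficient below are illustrative assumptions):

```r
# Hypothetical example: X_t is generated to depend on the previous value of Y
set.seed(1)
n <- 100
Y <- rnorm(n)
X <- 2 + 0.5 * c(NA, Y[-n]) + rnorm(n, sd = 0.1)

# Regress X_t on the lagged predictor Y_{t-1}
Ylag <- c(NA, Y[-n])           # Y shifted forward by one time step
fit  <- lm(X ~ Ylag)           # rows containing NA are dropped automatically
summary(fit)$coefficients      # the slope estimate should be near 0.5
```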


Forecasting Problems
• Problem Type 2: Autoregression

• Problem: Predict X_{n+k} from X_1, X_2, …, X_n

• Objective: Make forecasts based on the series' own historical values.

• Principle: New values may be related to past values of the same series. The
time series methods taught in this course can be used to discover such
hidden relationships.

• Computing: Use R for the statistical analysis.
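
A minimal autoregression sketch in R, again on simulated data (the AR(1) coefficient 0.7 is an illustrative assumption, not from the slides):

```r
# Hypothetical example: forecast a series from its own past with an AR model
set.seed(1)
x <- arima.sim(model = list(ar = 0.7), n = 200)   # simulated AR(1) series

fit <- ar(x)                     # AR order selected by AIC (base R)
predict(fit, n.ahead = 5)$pred   # forecasts of X_{n+1}, ..., X_{n+5}
```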


Preparing Data in R
• Read external data files, e.g. txt files and csv files

• Example 1: Global temperature data

• Set the working directory by the command:

setwd("C:/AMS4322 Fall 2023")


gtemp = read.csv("gtemp.csv", header = TRUE)

• gtemp
Preparing Data in R
• Read external data files, e.g. txt files and csv files

• Example 2: U.S. savings and income data

setwd("C:/AMS4322 Fall 2023")


savings = read.table("savings.txt", sep = "\t", skip = 4, header = TRUE)

• savings
Sample Covariance / Autocovariance
• Concepts of covariance/correlation

• Is X useful in predicting Y? Regression class

• Is today useful in predicting tomorrow? Time series class

• Is yesterday useful in predicting tomorrow? Time series class

• These can be answered via covariance


Sample Covariance / Autocovariance
• Sample covariance:

      Cov(X, Y) = (1/n) Σ_{t=1}^{n} (X_t − X̄)(Y_t − Ȳ)

• Sample correlation:

      Corr(X, Y) = Cov(X, Y) / √( Var(X) · Var(Y) )

  where Cov, Corr, and Var denote the sample versions, each with divisor n.
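
As a quick check of these formulas, they can be computed by hand in R on simulated data (the series below are illustrative assumptions) and compared with the built-ins:

```r
# Verify the slide's formulas by hand (divisor n throughout)
set.seed(1)
n <- 50
X <- rnorm(n)
Y <- 0.6 * X + rnorm(n)

var_hat  <- function(z) sum((z - mean(z))^2) / n
cov_hat  <- sum((X - mean(X)) * (Y - mean(Y))) / n
corr_hat <- cov_hat / sqrt(var_hat(X) * var_hat(Y))

# Note: R's built-in cov() and var() divide by n - 1 instead, so cov(X, Y)
# differs from cov_hat by the factor n / (n - 1); cor(X, Y) matches corr_hat,
# because the divisors cancel in the ratio.
all.equal(corr_hat, cor(X, Y))   # TRUE
```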
Sample Covariance / Autocovariance
• Sample autocovariance:

      γ̂_k = (1/n) Σ_{t=1}^{n−k} (X_t − X̄)(X_{t+k} − X̄)

• Sample autocorrelation:

      ρ̂_k = γ̂_k / γ̂_0

• Here, k is called the lag
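
These can likewise be computed by hand in R and compared with `acf` (the simulated series is an illustrative assumption):

```r
# Compute gamma_hat_k and rho_hat_k by hand and compare with acf()
set.seed(1)
x <- rnorm(120)
n <- length(x); xbar <- mean(x)

gamma_hat <- function(k) sum((x[1:(n - k)] - xbar) * (x[(1 + k):n] - xbar)) / n
rho_hat   <- gamma_hat(1) / gamma_hat(0)

# acf() uses the same divisor n, so its lag-1 value matches rho_hat
# (element 1 of the acf output is lag 0, element 2 is lag 1)
all.equal(rho_hat, acf(x, plot = FALSE)$acf[2])   # TRUE
```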


Sample Covariance / Autocovariance
• In R, you can view the sample autocorrelations in the output of the
function acf. You will learn the acf method in more detail later in this
course.

• acf(gtemp$Temperature,20)

• acf(savings$SAVINGS, 20)
Concepts of Stationarity
Concepts of Stationarity
• Mathematical details:

• Precise definition of weak stationarity: the mean, variance, and
autocorrelation structure do not change over time.

• E[X_t] = μ for all t

• Cov(X_t, X_{t−k}) = E[(X_t − μ)(X_{t−k} − μ)] = γ_k for all t

• γ_k is called the autocovariance function. It is meaningful only for a
weakly stationary process.
Concepts of Stationarity
• Example 1: The white noise process. Why learn it? It is the building block
of time series models.

• White noise is a sequence of uncorrelated random variables a_1, a_2, a_3, …
with mean zero and constant variance σ_a²

• In this case, E[a_t] = 0 and

      Cov(a_t, a_{t−k}) = σ_a²  if k = 0,
                          0     if k ≠ 0.

Neither depends on t, so a white noise process is weakly stationary.
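
A short R simulation illustrates these properties empirically (the Gaussian draws and sample size are illustrative assumptions):

```r
# Simulate Gaussian white noise and inspect its sample properties
set.seed(1)
a <- rnorm(500)   # mean 0, variance sigma_a^2 = 1

mean(a)           # close to 0
var(a)            # close to 1
acf(a, 20)        # sample autocorrelations near 0 at every lag k >= 1
```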
Concepts of Stationarity
• Exercise:

• Let a_1, a_2, a_3, … be a white noise process. A new process is defined as

      X_t = 3 + t·a_t

• Is X a weakly stationary process?


Concepts of Stationarity
• Exercise:

• Let a_1, a_2, a_3, … be a white noise process. A new process is defined as

      X_t = a_1 + a_2 + ⋯ + a_t

• Is X a weakly stationary process?
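
Before working this out by hand, a short simulation can hint at the behaviour of the process (the Gaussian noise and path counts are illustrative assumptions):

```r
# Simulate many paths of X_t = a_1 + ... + a_t and track Var(X_t) over time
set.seed(1)
n <- 100; reps <- 2000
paths <- replicate(reps, cumsum(rnorm(n)))   # each column is one path

# Estimate Var(X_t) across paths at a few times t
var_at_t <- apply(paths, 1, var)
var_at_t[c(10, 50, 100)]   # grows with t, roughly 10, 50, 100
```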


Concepts of Stationarity
• Descriptive statistics (the sample mean, sample variance, sample
autocovariance, etc.) of non-stationary time series data are considered
NOT meaningful.

• If the time series data are non-stationary, one usually applies a
transformation to make the series stationary.
Concepts of Stationarity
• In many cases, non-stationarity can be removed by taking logarithms and by
differencing.

• The differenced sequence can be "less" non-stationary:

      Z_t = X_t − X_{t−1}

• For stock price data, it is common to consider

      Z_t = log X_t − log X_{t−1}

• In finance, this quantity is called the (log) rate of return.


Concepts of Stationarity
• Example of differencing in R:

• X = diff(log(savings$SAVINGS))

• acf(X, 20)
