Parameter estimation: ACVF of AR processes


Parameter estimation: ACVF of AR processes

Yule-Walker's method for AR processes is a method of moments: set $\hat\mu = \bar x$ and choose the parameters so that $\gamma(h) = \hat\gamma(h)$ (for $h$ small).

Multiplying $X_t = \phi_1 X_{t-1} + \dots + \phi_p X_{t-p} + Z_t$ by $X_{t-h}$ ($h = 1, \dots, p$) and taking expectations, one has $\Gamma_p \phi = \gamma_p$ with $(\Gamma_p)_{i,j} = \gamma(i-j)$, i.e.

\[
\Gamma_p = \begin{pmatrix}
\gamma(0) & \gamma(1) & \gamma(2) & \cdots & \gamma(p-1) \\
\gamma(1) & \gamma(0) & \gamma(1) & \cdots & \gamma(p-2) \\
\gamma(2) & \gamma(1) & \gamma(0) & \cdots & \gamma(p-3) \\
\vdots & & & \ddots & \vdots \\
\gamma(p-1) & \gamma(p-2) & \cdots & \gamma(1) & \gamma(0)
\end{pmatrix},
\qquad
\gamma_p = \begin{pmatrix} \gamma(1) \\ \gamma(2) \\ \vdots \\ \gamma(p) \end{pmatrix},
\]

and multiplying by $X_t$ one obtains $\sigma^2 = \gamma(0) - \phi^t \gamma_p$.

12 novembre

Parameter estimation: Yule-Walker's method

Hence estimate
\[
\hat\phi = \begin{pmatrix} \hat\phi_1 \\ \hat\phi_2 \\ \vdots \\ \hat\phi_p \end{pmatrix}
\quad \text{by solving} \quad \hat\Gamma_p \hat\phi = \hat\gamma_p. \qquad (1)
\]
Moreover
\[
\hat\sigma^2 = \hat\gamma(0) - \hat\phi^t \hat\gamma_p. \qquad (2)
\]

Remarks

1. We proved $\hat\Gamma_p$ non-negative definite. If $\hat\gamma(0) > 0$ (i.e. the data are not all identical), it is positive definite, hence invertible, so $\hat\phi$ is well-defined.
2. One can divide (1) by $\hat\gamma(0)$ to obtain $\hat R_p \hat\phi = \hat\rho_p$ with $(R_p)_{i,j} = \rho(i-j)$, $(\rho_p)_i = \rho(i)$.
3. One can compute $\hat\phi$ iteratively, applying the Durbin-Levinson algorithm to $\hat\gamma$.
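As a concrete illustration, equations (1) and (2) can be solved directly from the sample ACVF. The following is a minimal sketch in Python: the helper names `acvf` and `yule_walker` are ours, and the sample mean is subtracted before computing autocovariances.

```python
import numpy as np

def acvf(x, h):
    """Sample autocovariance at lag h (with the 1/n normalization)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    return np.dot(xc[:n - h], xc[h:]) / n

def yule_walker(x, p):
    """Solve Gamma_hat phi_hat = gamma_hat, eq. (1), and get sigma2_hat from eq. (2)."""
    g = np.array([acvf(x, h) for h in range(p + 1)])
    Gamma = np.array([[g[abs(i - j)] for j in range(p)] for i in range(p)])
    gamma_p = g[1:p + 1]
    phi = np.linalg.solve(Gamma, gamma_p)
    sigma2 = g[0] - phi @ gamma_p
    return phi, sigma2

# Example: recover the coefficients of a simulated AR(2)
rng = np.random.default_rng(0)
n = 5000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()
phi_hat, sigma2_hat = yule_walker(x, 2)
```

With a long simulated series the estimates should land close to the true values $(0.5, -0.3)$ and $\sigma^2 = 1$.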

Asymptotic properties of Yule-Walker

Theorem. If $\{X_t\}$ is the causal AR($p$) process satisfying
\[
X_t = \phi_1 X_{t-1} + \dots + \phi_p X_{t-p} + Z_t, \qquad \{Z_t\} \sim \mathrm{IID}(0, \sigma^2),
\]
then $n^{1/2}(\hat\phi - \phi) \Rightarrow N(0, \sigma^2 \Gamma_p^{-1})$ and $\hat\sigma^2 \stackrel{P}{\to} \sigma^2$.

An approximate $\gamma$-confidence interval for $\phi_i$, $i = 1, \dots, p$, is then
\[
\left( \hat\phi_i - \hat\sigma \big[(\hat\Gamma_p)^{-1}\big]_{ii}^{1/2} \frac{z_\gamma}{\sqrt{n}},\ \ \hat\phi_i + \hat\sigma \big[(\hat\Gamma_p)^{-1}\big]_{ii}^{1/2} \frac{z_\gamma}{\sqrt{n}} \right),
\]
where $P(|N(0,1)| \le z_\gamma) = \gamma$.
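These approximate confidence intervals are straightforward to compute from the fitted quantities. A sketch (the function name is ours; $z_\gamma$ is obtained from `scipy.stats.norm.ppf`):

```python
import numpy as np
from scipy.stats import norm

def yw_confidence_intervals(x, p, gamma_level=0.95):
    """Approximate gamma-level CIs for phi_i, based on
    n^{1/2}(phi_hat - phi) ~ N(0, sigma^2 Gamma_p^{-1})."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    g = np.array([np.dot(xc[:n - h], xc[h:]) / n for h in range(p + 1)])
    Gamma = np.array([[g[abs(i - j)] for j in range(p)] for i in range(p)])
    phi = np.linalg.solve(Gamma, g[1:])
    sigma2 = g[0] - phi @ g[1:]
    z = norm.ppf((1 + gamma_level) / 2)  # P(|N(0,1)| <= z) = gamma
    half = z * np.sqrt(sigma2 * np.diag(np.linalg.inv(Gamma)) / n)
    return np.column_stack([phi - half, phi + half])

# Example on a simulated AR(1) with phi = 0.6
rng = np.random.default_rng(1)
x = np.zeros(4000)
for t in range(1, 4000):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()
ci = yw_confidence_intervals(x, 1)
```

For a series of this length the interval is narrow and centered near the true coefficient.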

Comparison with linear regression

One could write the problem as $Y = \mathcal{X} \phi + Z$, where $\mathcal{X}$ denotes the design matrix of lagged values,
\[
Y = \begin{pmatrix} X_n \\ X_{n-1} \\ \vdots \\ X_{p+1} \end{pmatrix},
\qquad
\mathcal{X} = \begin{pmatrix}
X_{n-1} & \cdots & X_{n-p} \\
X_{n-2} & \cdots & X_{n-p-1} \\
\vdots & & \vdots \\
X_p & \cdots & X_1
\end{pmatrix},
\qquad Z \sim \mathrm{IID}(0, \sigma^2).
\]
Least squares estimation asks for minimizing $\|Y - \mathcal{X}\phi\|^2$ and results in $\hat\phi = (\mathcal{X}^t \mathcal{X})^{-1} \mathcal{X}^t Y$, with
\[
\mathcal{X}^t Y = \sum_{t=p+1}^{n} \begin{pmatrix} X_t X_{t-1} \\ \vdots \\ X_t X_{t-p} \end{pmatrix} \approx n \hat\gamma_p,
\qquad
(\mathcal{X}^t \mathcal{X})_{ij} = \sum_{t=p+1}^{n} X_{t-i} X_{t-j} \approx n (\hat\Gamma_p)_{ij}.
\]
Hence least squares is approximately Yule-Walker.
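The regression form gives an alternative estimator. A minimal sketch of the least-squares fit (the helper name is ours):

```python
import numpy as np

def ar_least_squares(x, p):
    """Fit an AR(p) by ordinary least squares on the regression Y = X phi + Z,
    where the design matrix holds the p lagged values of the series."""
    x = np.asarray(x, dtype=float)
    Y = x[p:]  # responses X_{p+1}, ..., X_n
    # column j (j = 1..p) holds X_{t-j} for t = p+1, ..., n
    X = np.column_stack([x[p - j:len(x) - j] for j in range(1, p + 1)])
    phi, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return phi

# Example on a simulated AR(1) with phi = 0.7
rng = np.random.default_rng(2)
x = np.zeros(4000)
for t in range(1, 4000):
    x[t] = 0.7 * x[t - 1] + rng.standard_normal()
phi_ls = ar_least_squares(x, 1)
```

On the same data the least-squares and Yule-Walker estimates differ only by edge effects of order $1/n$, as the approximation above suggests.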

Variations on Yule-Walker: Burg's algorithm

Yule-Walker's method finds $\hat\phi$ as the coefficients of the best linear predictor of $X_{p+1}$ given $X_1, \dots, X_p$, assuming that the ACF coincides with the sample ACF for $i = 1, \dots, p$.

Burg's algorithm considers actual forward and backward linear predictors. Define $u_p(t)$ ($p \le t < n$) as the difference between $X_{n-t+p}$ and its best linear prediction from the previous $p$ values, i.e.
\[
u_p(t) = X_{n-t+p} - P_{L(X_{n-t}, \dots, X_{n-t+p-1})} X_{n-t+p},
\]
and let $v_p(t)$ ($p \le t < n$) be the difference between $X_{n-t}$ and its best linear prediction from the following $p$ values, i.e.
\[
v_p(t) = X_{n-t} - P_{L(X_{n-t+1}, \dots, X_{n-t+p})} X_{n-t}.
\]
The coefficients are found by minimizing $\sum_{t=p}^{n-1} \big( u_p^2(t) + v_p^2(t) \big)$. Details on the algorithm in an exercise.
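The slide leaves the details to an exercise; as a hedged sketch, here is one common formulation of Burg's recursion for a zero-mean series. At each order the reflection coefficient is chosen to minimize the summed squared forward and backward errors, and the coefficient vector is updated by a Levinson-type step (the variable names are ours):

```python
import numpy as np

def burg(x, p):
    """Burg's recursion (sketch): reflection coefficient per order, then a
    Levinson-style update of the AR coefficient vector."""
    x = np.asarray(x, dtype=float)
    f, b = x[1:].copy(), x[:-1].copy()   # forward / backward errors at order 0
    a = np.zeros(0)                      # AR coefficients, built order by order
    for _ in range(p):
        # reflection coefficient minimizing sum of squared forward+backward errors
        k = 2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))
        a = np.concatenate([a - k * a[::-1], [k]])
        f, b = f - k * b, b - k * f      # RHS uses the old f and b simultaneously
        f, b = f[1:], b[:-1]             # drop end points with too little history
    return a

# Example: recover AR(2) coefficients from a simulated zero-mean series
rng = np.random.default_rng(0)
n = 5000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()
a_hat = burg(x, 2)
```

Like Yule-Walker, for a long causal AR series the Burg estimates are close to the true coefficients; the two methods differ mainly in small samples.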

Order of the auto-regressive process

Assume the true model is AR($p$) and we fit a model of order $m$.

If $m < p$ (i.e. the model is wrong), we should see this from the residuals:
\[
X_t - \hat X_t = X_t - \sum_{j=1}^{p} \phi_j X_{t-j} = Z_t \sim \mathrm{WN}(0, \sigma^2) \quad \text{for } t > p.
\]
The $\hat W_t = X_t - \sum_{j=1}^{p} \hat\phi_j X_{t-j}$ are not exactly white noise, but close for $n$ large.

If $m < p$ and we compute $(\hat W_m)_t = X_t - \sum_{j=1}^{m} (\hat\phi_m)_j X_{t-j}$, we expect to find them correlated. A significant ACF of $\hat W_m$ suggests $m < p$.

If $m \ge p$, then $n^{1/2}(\hat\phi_m - \phi) \Rightarrow N(0, \sigma^2 \Gamma_m^{-1})$ (Theorem). In particular, for $m > p$, $(\hat\phi_m)_m \approx N(0, 1/n)$.

Finding $(\hat\phi_m)_m$ within $\pm 1.96\, n^{-1/2}$ suggests that the true value of $\phi_m$ may be $0$, and that a model of lower order should be fitted.
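One way to put the last remark into practice: fit AR($m$) for increasing $m$ and compare the last fitted coefficient $(\hat\phi_m)_m$ with the band $\pm 1.96\, n^{-1/2}$. A sketch using Yule-Walker fits (the helper name is ours):

```python
import numpy as np

def last_coefficients(x, max_order):
    """Fit AR(m) by Yule-Walker for m = 1..max_order and return the last
    coefficient (phi_m)_m of each fit, plus the 1.96/sqrt(n) band."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    g = np.array([np.dot(xc[:n - h], xc[h:]) / n for h in range(max_order + 1)])
    out = []
    for m in range(1, max_order + 1):
        Gamma = np.array([[g[abs(i - j)] for j in range(m)] for i in range(m)])
        phi_m = np.linalg.solve(Gamma, g[1:m + 1])
        out.append(phi_m[-1])
    return np.array(out), 1.96 / np.sqrt(n)

# For an AR(2) sample, (phi_m)_m should be significant at m = 2
# and fall inside the band for m > 2.
rng = np.random.default_rng(3)
x = np.zeros(5000)
for t in range(2, 5000):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()
coeffs, band = last_coefficients(x, 4)
```

Reading off the first $m$ whose trailing coefficients all sit inside the band reproduces the informal order-selection rule above.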

Maximum likelihood estimation

A standard method of parameter estimation is maximum likelihood estimation (MLE).

The likelihood of ARMA processes is computed assuming $\{Z_t\} \sim \mathrm{IID}\ N(0, \sigma^2)$. Then $\{X_t\}$ is also normally distributed.

Example, AR(1): $X_t = \mu + \phi(X_{t-1} - \mu) + Z_t$, $|\phi| < 1$. Then
\[
X_t \sim N\!\left(\mu, \frac{\sigma^2}{1 - \phi^2}\right).
\]

Joint distributions can be obtained through conditional distributions: conditional on the value of $X_{t-1}$,
\[
X_t \sim N\big(\mu + \phi(X_{t-1} - \mu), \sigma^2\big).
\]

Hence the likelihood of the data $(x_1, x_2, \dots, x_n)$ can be obtained as
\[
f(x_1)\, f(x_2 \mid x_1) \cdots f(x_n \mid x_{n-1}),
\]
where $f(\cdot)$ is the (unconditional) density of $X_t$, while $f(\cdot \mid x)$ is the density of $X_t$ conditional on $X_{t-1} = x$.
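The factorization $f(x_1) f(x_2 \mid x_1) \cdots f(x_n \mid x_{n-1})$ can be evaluated directly for the AR(1) example. A sketch (the function name is ours), using the unconditional and conditional normal densities above:

```python
import numpy as np
from scipy.stats import norm

def ar1_loglik(x, mu, phi, sigma2):
    """Log-likelihood of AR(1) data via f(x1) f(x2|x1) ... f(xn|x_{n-1})."""
    x = np.asarray(x, dtype=float)
    sd = np.sqrt(sigma2)
    # unconditional (stationary) density of X_1: N(mu, sigma2 / (1 - phi^2))
    ll = norm.logpdf(x[0], loc=mu, scale=sd / np.sqrt(1 - phi ** 2))
    # conditional densities: X_t | X_{t-1} ~ N(mu + phi (X_{t-1} - mu), sigma2)
    ll += norm.logpdf(x[1:], loc=mu + phi * (x[:-1] - mu), scale=sd).sum()
    return ll
```

Working on the log scale avoids the numerical underflow of multiplying $n$ small densities.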

Computing likelihood for AR(1)

\[
f(x) = \left( 2\pi \frac{\sigma^2}{1-\phi^2} \right)^{-1/2} \exp\left\{ -\frac{(x-\mu)^2}{2\sigma^2/(1-\phi^2)} \right\}
= \left( \frac{1-\phi^2}{2\pi\sigma^2} \right)^{1/2} \exp\left\{ -\frac{(1-\phi^2)(x-\mu)^2}{2\sigma^2} \right\},
\]
\[
f(y \mid x) = (2\pi\sigma^2)^{-1/2} \exp\left\{ -\frac{(y-\mu-\phi(x-\mu))^2}{2\sigma^2} \right\}.
\]

The parameters of the model are $\mu, \phi, \sigma^2$:
\[
L(\mu, \phi, \sigma^2) = \frac{(1-\phi^2)^{1/2}}{(2\pi\sigma^2)^{n/2}} \exp\left\{ -\frac{(1-\phi^2)(x_1-\mu)^2}{2\sigma^2} \right\} \prod_{j=2}^{n} \exp\left\{ -\frac{(x_j-\mu-\phi(x_{j-1}-\mu))^2}{2\sigma^2} \right\}
\]
\[
= \frac{(1-\phi^2)^{1/2}}{(2\pi\sigma^2)^{n/2}} \exp\left\{ -\frac{1}{2\sigma^2} \left[ (1-\phi^2)(x_1-\mu)^2 + \sum_{t=2}^{n} \big(x_t-\mu-\phi(x_{t-1}-\mu)\big)^2 \right] \right\}.
\]
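The closed-form likelihood can be maximized numerically. A sketch using `scipy.optimize.minimize` on $-\log L$; the $(\mu, \phi, \log\sigma^2)$ parameterization is our choice, made to keep $\sigma^2$ positive:

```python
import numpy as np
from scipy.optimize import minimize

def ar1_mle(x):
    """Maximize the exact AR(1) log-likelihood L(mu, phi, sigma^2) numerically."""
    x = np.asarray(x, dtype=float)
    n = len(x)

    def negloglik(theta):
        mu, phi, logs2 = theta
        if abs(phi) >= 0.999:          # keep the search inside causality
            return np.inf
        s2 = np.exp(logs2)
        r1 = (1 - phi ** 2) * (x[0] - mu) ** 2
        rest = np.sum((x[1:] - mu - phi * (x[:-1] - mu)) ** 2)
        # -log L from the closed form above
        return 0.5 * (n * np.log(2 * np.pi * s2)
                      - np.log(1 - phi ** 2)
                      + (r1 + rest) / s2)

    theta0 = np.array([x.mean(), 0.0, np.log(x.var())])
    res = minimize(negloglik, theta0, method="Nelder-Mead")
    mu, phi, logs2 = res.x
    return mu, phi, np.exp(logs2)

# Example: fit a simulated AR(1) with mu = 2, phi = 0.6, sigma^2 = 1
rng = np.random.default_rng(4)
x = np.empty(3000)
x[0] = 2.0
for t in range(1, 3000):
    x[t] = 2.0 + 0.6 * (x[t - 1] - 2.0) + rng.standard_normal()
mu_hat, phi_hat, s2_hat = ar1_mle(x)
```

Dropping the $(1-\phi^2)$ terms (i.e. conditioning on $x_1$) turns this into conditional least squares; for large $n$ the two fits are nearly identical.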


TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M. TIME SERIES ANALYSIS Forecasting and Control Fifth Edition GEORGE E. P. BOX GWILYM M. JENKINS GREGORY C. REINSEL GRETA M. LJUNG Wiley CONTENTS PREFACE TO THE FIFTH EDITION PREFACE TO THE FOURTH EDITION

More information

MS&E 226: Small Data. Lecture 11: Maximum likelihood (v2) Ramesh Johari

MS&E 226: Small Data. Lecture 11: Maximum likelihood (v2) Ramesh Johari MS&E 226: Small Data Lecture 11: Maximum likelihood (v2) Ramesh Johari ramesh.johari@stanford.edu 1 / 18 The likelihood function 2 / 18 Estimating the parameter This lecture develops the methodology behind

More information

Econometrics II Heij et al. Chapter 7.1

Econometrics II Heij et al. Chapter 7.1 Chapter 7.1 p. 1/2 Econometrics II Heij et al. Chapter 7.1 Linear Time Series Models for Stationary data Marius Ooms Tinbergen Institute Amsterdam Chapter 7.1 p. 2/2 Program Introduction Modelling philosophy

More information

COMPUTER SESSION 3: ESTIMATION AND FORECASTING.

COMPUTER SESSION 3: ESTIMATION AND FORECASTING. UPPSALA UNIVERSITY Department of Mathematics JR Analysis of Time Series 1MS014 Spring 2010 COMPUTER SESSION 3: ESTIMATION AND FORECASTING. 1 Introduction The purpose of this exercise is two-fold: (i) By

More information

Time Series I Time Domain Methods

Time Series I Time Domain Methods Astrostatistics Summer School Penn State University University Park, PA 16802 May 21, 2007 Overview Filtering and the Likelihood Function Time series is the study of data consisting of a sequence of DEPENDENT

More information

Introduction to Estimation Methods for Time Series models Lecture 2

Introduction to Estimation Methods for Time Series models Lecture 2 Introduction to Estimation Methods for Time Series models Lecture 2 Fulvio Corsi SNS Pisa Fulvio Corsi Introduction to Estimation () Methods for Time Series models Lecture 2 SNS Pisa 1 / 21 Estimators:

More information

Problem Set 6 Solution

Problem Set 6 Solution Problem Set 6 Solution May st, 009 by Yang. Causal Expression of AR Let φz : αz βz. Zeros of φ are α and β, both of which are greater than in absolute value by the assumption in the question. By the theorem

More information

STOR 356: Summary Course Notes Part III

STOR 356: Summary Course Notes Part III STOR 356: Summary Course Notes Part III Richard L. Smith Department of Statistics and Operations Research University of North Carolina Chapel Hill, NC 27599-3260 rls@email.unc.edu April 23, 2008 1 ESTIMATION

More information

Econometric Forecasting

Econometric Forecasting Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 1, 2014 Outline Introduction Model-free extrapolation Univariate time-series models Trend

More information

REGRESSION WITH SPATIALLY MISALIGNED DATA. Lisa Madsen Oregon State University David Ruppert Cornell University

REGRESSION WITH SPATIALLY MISALIGNED DATA. Lisa Madsen Oregon State University David Ruppert Cornell University REGRESSION ITH SPATIALL MISALIGNED DATA Lisa Madsen Oregon State University David Ruppert Cornell University SPATIALL MISALIGNED DATA 10 X X X X X X X X 5 X X X X X 0 X 0 5 10 OUTLINE 1. Introduction 2.

More information

Time Series Examples Sheet

Time Series Examples Sheet Lent Term 2001 Richard Weber Time Series Examples Sheet This is the examples sheet for the M. Phil. course in Time Series. A copy can be found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/ Throughout,

More information

Generalized Linear Models. Kurt Hornik

Generalized Linear Models. Kurt Hornik Generalized Linear Models Kurt Hornik Motivation Assuming normality, the linear model y = Xβ + e has y = β + ε, ε N(0, σ 2 ) such that y N(μ, σ 2 ), E(y ) = μ = β. Various generalizations, including general

More information

Moving Average (MA) representations

Moving Average (MA) representations Moving Average (MA) representations The moving average representation of order M has the following form v[k] = MX c n e[k n]+e[k] (16) n=1 whose transfer function operator form is MX v[k] =H(q 1 )e[k],

More information

Econometrics I: Univariate Time Series Econometrics (1)

Econometrics I: Univariate Time Series Econometrics (1) Econometrics I: Dipartimento di Economia Politica e Metodi Quantitativi University of Pavia Overview of the Lecture 1 st EViews Session VI: Some Theoretical Premises 2 Overview of the Lecture 1 st EViews

More information

Differencing Revisited: I ARIMA(p,d,q) processes predicated on notion of dth order differencing of a time series {X t }: for d = 1 and 2, have X t

Differencing Revisited: I ARIMA(p,d,q) processes predicated on notion of dth order differencing of a time series {X t }: for d = 1 and 2, have X t Differencing Revisited: I ARIMA(p,d,q) processes predicated on notion of dth order differencing of a time series {X t }: for d = 1 and 2, have X t 2 X t def in general = (1 B)X t = X t X t 1 def = ( X

More information

Time Series Analysis

Time Series Analysis Time Series Analysis Christopher Ting http://mysmu.edu.sg/faculty/christophert/ christopherting@smu.edu.sg Quantitative Finance Singapore Management University March 3, 2017 Christopher Ting Week 9 March

More information

3 Theory of stationary random processes

3 Theory of stationary random processes 3 Theory of stationary random processes 3.1 Linear filters and the General linear process A filter is a transformation of one random sequence {U t } into another, {Y t }. A linear filter is a transformation

More information

An algorithm for robust fitting of autoregressive models Dimitris N. Politis

An algorithm for robust fitting of autoregressive models Dimitris N. Politis An algorithm for robust fitting of autoregressive models Dimitris N. Politis Abstract: An algorithm for robust fitting of AR models is given, based on a linear regression idea. The new method appears to

More information

Efficient inference for periodic autoregressive coefficients with polynomial spline smoothing approach

Efficient inference for periodic autoregressive coefficients with polynomial spline smoothing approach The University of Toledo The University of Toledo Digital Repository Theses and Dissertations 2015 Efficient inference for periodic autoregressive coefficients with polynomial spline smoothing approach

More information

18.S096 Problem Set 4 Fall 2013 Time Series Due Date: 10/15/2013

18.S096 Problem Set 4 Fall 2013 Time Series Due Date: 10/15/2013 18.S096 Problem Set 4 Fall 2013 Time Series Due Date: 10/15/2013 1. Covariance Stationary AR(2) Processes Suppose the discrete-time stochastic process {X t } follows a secondorder auto-regressive process

More information

Exam 2. Jeremy Morris. March 23, 2006

Exam 2. Jeremy Morris. March 23, 2006 Exam Jeremy Morris March 3, 006 4. Consider a bivariate normal population with µ 0, µ, σ, σ and ρ.5. a Write out the bivariate normal density. The multivariate normal density is defined by the following

More information

Random vectors X 1 X 2. Recall that a random vector X = is made up of, say, k. X k. random variables.

Random vectors X 1 X 2. Recall that a random vector X = is made up of, say, k. X k. random variables. Random vectors Recall that a random vector X = X X 2 is made up of, say, k random variables X k A random vector has a joint distribution, eg a density f(x), that gives probabilities P(X A) = f(x)dx Just

More information

Stationary Time Series

Stationary Time Series 3-Jul-3 Time Series Analysis Assoc. Prof. Dr. Sevap Kesel July 03 Saionary Time Series Sricly saionary process: If he oin dis. of is he same as he oin dis. of ( X,... X n) ( X h,... X nh) Weakly Saionary

More information

Autoregressive Approximation in Nonstandard Situations: Empirical Evidence

Autoregressive Approximation in Nonstandard Situations: Empirical Evidence Autoregressive Approximation in Nonstandard Situations: Empirical Evidence S. D. Grose and D. S. Poskitt Department of Econometrics and Business Statistics, Monash University Abstract This paper investigates

More information

Ch 6. Model Specification. Time Series Analysis

Ch 6. Model Specification. Time Series Analysis We start to build ARIMA(p,d,q) models. The subjects include: 1 how to determine p, d, q for a given series (Chapter 6); 2 how to estimate the parameters (φ s and θ s) of a specific ARIMA(p,d,q) model (Chapter

More information

Module 3. Descriptive Time Series Statistics and Introduction to Time Series Models

Module 3. Descriptive Time Series Statistics and Introduction to Time Series Models Module 3 Descriptive Time Series Statistics and Introduction to Time Series Models Class notes for Statistics 451: Applied Time Series Iowa State University Copyright 2015 W Q Meeker November 11, 2015

More information

Lecture 15. Hypothesis testing in the linear model

Lecture 15. Hypothesis testing in the linear model 14. Lecture 15. Hypothesis testing in the linear model Lecture 15. Hypothesis testing in the linear model 1 (1 1) Preliminary lemma 15. Hypothesis testing in the linear model 15.1. Preliminary lemma Lemma

More information

AR, MA and ARMA models

AR, MA and ARMA models AR, MA and AR by Hedibert Lopes P Based on Tsay s Analysis of Financial Time Series (3rd edition) P 1 Stationarity 2 3 4 5 6 7 P 8 9 10 11 Outline P Linear Time Series Analysis and Its Applications For

More information
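The Yule-Walker procedure described in the slides — build the sample autocovariances, solve Γ̂_p φ̂ = γ̂_p, and set σ̂² = γ̂(0) − φ̂ᵀ γ̂_p — can be sketched in a few lines of NumPy. This is an illustrative sketch, not part of the original slides: the function name `yule_walker`, the 1/n normalization of the sample ACVF, and the AR(1) simulation below are all assumptions made for the example.

```python
import numpy as np

def yule_walker(x, p):
    """Estimate AR(p) coefficients and noise variance by Yule-Walker.

    Solves Gamma_p * phi = gamma_p, where (Gamma_p)_{i,j} = gamma_hat(|i-j|),
    then sets sigma2 = gamma_hat(0) - phi' gamma_p.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()                     # center at the sample mean (mu_hat = x_bar)
    # Sample ACVF with the 1/n normalization, which keeps Gamma_p
    # non-negative definite (as noted in the slides).
    gamma = np.array([xc[:n - h] @ xc[h:] / n for h in range(p + 1)])
    Gamma = np.array([[gamma[abs(i - j)] for j in range(p)] for i in range(p)])
    gamma_p = gamma[1:p + 1]
    phi = np.linalg.solve(Gamma, gamma_p)     # well-defined when gamma_hat(0) > 0
    sigma2 = gamma[0] - phi @ gamma_p
    return phi, sigma2

# Usage sketch: simulate an AR(1) with phi = 0.6 and recover the parameters.
rng = np.random.default_rng(0)
n = 2000
z = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + z[t]

phi_hat, sigma2_hat = yule_walker(x, p=1)
```

With n = 2000 observations, phi_hat should land close to the true value 0.6 and sigma2_hat close to the innovation variance 1. Dividing both sides of the linear system by γ̂(0), as in Remark 2, yields the equivalent correlation form R̂_p φ̂ = ρ̂_p and leaves `phi_hat` unchanged.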