MATH 5075: Time Series Analysis


NAME:

MATH 5075: Time Series Analysis. Final Exam.

For the entire test $\{Z_t\} \sim \mathrm{WN}(0, 1)$!!!

1) Let $\{Y_t,\ t \in \mathbb{Z}\}$ be a stationary time series with $EY_t = 0$ and autocovariance function $\gamma_Y(h)$. Let
$$X_t = a + bt + ct^2 + Y_t.$$
a) Show that $\{X_t\}$ is not stationary.
b) Show that $\nabla^2 X_t = (1 - B)^2 X_t$, where $B$ is the backshift operator, is stationary. (Compute $E\nabla^2 X_t$ and $\operatorname{Cov}(\nabla^2 X_t, \nabla^2 X_{t+h})$.)
c) Describe (in detail) a different procedure to eliminate the trend function $a + bt + ct^2$.

Solution:
a) $EX_t = a + bt + ct^2$ is not constant in $t$, so $\{X_t\}$ cannot be stationary.
b) $(1 - B)^2 = 1 - 2B + B^2$. Hence
$$\nabla^2 X_t = a + bt + ct^2 + Y_t - 2\big(a + b(t-1) + c(t-1)^2 + Y_{t-1}\big) + a + b(t-2) + c(t-2)^2 + Y_{t-2} = 2c + Y_t - 2Y_{t-1} + Y_{t-2}.$$
Thus $E[\nabla^2 X_t] = 2c$ and
$$\operatorname{Cov}(\nabla^2 X_t, \nabla^2 X_{t+h}) = E\big[(Y_t - 2Y_{t-1} + Y_{t-2})(Y_{t+h} - 2Y_{t+h-1} + Y_{t+h-2})\big] = \gamma_Y(h-2) - 4\gamma_Y(h-1) + 6\gamma_Y(h) - 4\gamma_Y(h+1) + \gamma_Y(h+2).$$
Neither expression depends on $t$, so $\{\nabla^2 X_t\}$ is stationary.
c) Use for example least squares estimation, i.e. minimize
$$\sum_{t=1}^{n} \big[X_t - (a + bt + ct^2)\big]^2$$
with respect to $a, b, c$. Taking partial derivatives and setting them equal to $0$ gives $\hat a, \hat b, \hat c$. Now define $W_t = X_t - (\hat a + \hat b t + \hat c t^2)$. When the estimates are good, $W_t \approx Y_t$, and the latter is stationary.
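For a quick illustration of parts (b) and (c), here is a minimal Python sketch. The trend coefficients and the AR(1) noise used for $Y_t$ are hypothetical choices, not part of the exam.

```python
import numpy as np

# Minimal sketch of parts (b) and (c): remove a quadratic trend either by
# double differencing or by least-squares detrending. The trend coefficients
# and the AR(1) noise used for Y_t are illustrative choices only.
rng = np.random.default_rng(0)
n = 500
t = np.arange(n)
a, b, c = 1.0, 0.5, 0.02          # hypothetical trend coefficients

# A simple stationary Y_t (here AR(1) with coefficient 0.6), just for illustration.
y = np.zeros(n)
z = rng.normal(size=n)
for i in range(1, n):
    y[i] = 0.6 * y[i - 1] + z[i]

x = a + b * t + c * t**2 + y       # X_t = a + bt + ct^2 + Y_t

# (b) Second difference: (1 - B)^2 X_t = 2c + Y_t - 2Y_{t-1} + Y_{t-2}
d2x = np.diff(x, n=2)
print("mean of second differences:", d2x.mean(), "(should be close to 2c =", 2 * c, ")")

# (c) Least-squares detrending: regress X_t on (1, t, t^2) and keep the residuals.
design = np.column_stack([np.ones(n), t, t**2])
coef, *_ = np.linalg.lstsq(design, x, rcond=None)
w = x - design @ coef              # W_t, approximately Y_t when the fit is good
print("estimated (a, b, c):", coef)
```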

2) Let $\{X_t,\ t \in \mathbb{Z}\}$ be an AR(3) process given by
$$X_t - \tfrac{1}{4}X_{t-1} - \tfrac{1}{8}X_{t-2} + \tfrac{1}{8}X_{t-3} = Z_t.$$
This process has a stationary and causal solution. Consider the MA($\infty$) representation of $X_t$:
$$X_t = \sum_{k=0}^{\infty} \psi_k Z_{t-k}.$$
Compute $\psi_0, \psi_1, \psi_2, \psi_3$ numerically, then give a general recursive formula for $\psi_k$.

Solution: We have $\phi(B)X_t = \theta(B)Z_t$, where $\phi(z) = 1 - \tfrac{1}{4}z - \tfrac{1}{8}z^2 + \tfrac{1}{8}z^3$ and $\theta(z) = 1$. Let $\psi(z) = \sum_{k=0}^{\infty} \psi_k z^k$. Thus we have $\psi(z) = 1/\phi(z)$, resp. $\psi(z)\phi(z) = 1$, i.e.
$$\sum_{k=0}^{\infty} \psi_k z^k - \tfrac{1}{4}\sum_{k=0}^{\infty} \psi_k z^{k+1} - \tfrac{1}{8}\sum_{k=0}^{\infty} \psi_k z^{k+2} + \tfrac{1}{8}\sum_{k=0}^{\infty} \psi_k z^{k+3} = 1.$$
Now equate coefficients:
$\psi_0 = 1$,
$\psi_1 - \tfrac{1}{4}\psi_0 = 0 \Rightarrow \psi_1 = \tfrac{1}{4}$,
$\psi_2 - \tfrac{1}{4}\psi_1 - \tfrac{1}{8}\psi_0 = 0 \Rightarrow \psi_2 = \tfrac{3}{16}$,
$\psi_3 - \tfrac{1}{4}\psi_2 - \tfrac{1}{8}\psi_1 + \tfrac{1}{8}\psi_0 = 0 \Rightarrow \psi_3 = -\tfrac{3}{64}$,
and in general, for $k \ge 3$,
$$\psi_k - \tfrac{1}{4}\psi_{k-1} - \tfrac{1}{8}\psi_{k-2} + \tfrac{1}{8}\psi_{k-3} = 0.$$
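A minimal numerical check of the coefficient-matching recursion, written against the AR(3) coefficients as given above; if different coefficients are used, only the `phi` vector changes.

```python
import numpy as np

# Sketch: compute the MA(infinity) weights psi_k of the AR(3) in Problem 2 by the
# recursion psi_k = sum_j phi_j psi_{k-j}, with phi(z) = 1 - z/4 - z^2/8 + z^3/8.
phi = np.array([0.25, 0.125, -0.125])   # X_t = 0.25 X_{t-1} + 0.125 X_{t-2} - 0.125 X_{t-3} + Z_t

def psi_weights(phi, n_terms=10):
    psi = np.zeros(n_terms)
    psi[0] = 1.0
    for k in range(1, n_terms):
        for j, p in enumerate(phi, start=1):
            if k - j >= 0:
                psi[k] += p * psi[k - j]
    return psi

print(psi_weights(phi, 6))   # expect 1, 0.25, 0.1875 (= 3/16), -0.046875 (= -3/64), ...
```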

3) Let $\{X_t,\ t \in \mathbb{Z}\}$ be the MA(2) process given by
$$X_t = Z_t - \tfrac{1}{2}Z_{t-1} + \tfrac{1}{4}Z_{t-2}.$$
a) Show that this process is invertible.
b) Consider the AR($\infty$) representation of $Z_t$: $Z_t = \sum_{k=0}^{\infty} \eta_k X_{t-k}$. Compute $\eta_0, \eta_1, \eta_2$ numerically, then give a general recursive formula for $\eta_k$.

Solution:
a) Solve $1 - \tfrac{1}{2}z + \tfrac{1}{4}z^2 = 0$, i.e. $z^2 - 2z + 4 = 0$. Hence
$$z_{1,2} = 1 \pm \sqrt{1 - 4} = 1 \pm i\sqrt{3}, \qquad |z_{1,2}| = 2 > 1.$$
Since the roots are outside the complex unit circle, the process is invertible.
b) We have $\phi(B)X_t = \theta(B)Z_t$, where $\phi(z) = 1$ and $\theta(z) = 1 - \tfrac{1}{2}z + \tfrac{1}{4}z^2$. Let $\eta(z) = \sum_{k=0}^{\infty} \eta_k z^k$. Thus we have $\eta(z) = 1/\theta(z)$, resp. $\theta(z)\eta(z) = 1$, i.e.
$$\sum_{k=0}^{\infty} \eta_k z^k - \tfrac{1}{2}\sum_{k=0}^{\infty} \eta_k z^{k+1} + \tfrac{1}{4}\sum_{k=0}^{\infty} \eta_k z^{k+2} = 1.$$
Now equate coefficients:
$\eta_0 = 1$,
$\eta_1 - \tfrac{1}{2}\eta_0 = 0 \Rightarrow \eta_1 = \tfrac{1}{2}$,
$\eta_2 - \tfrac{1}{2}\eta_1 + \tfrac{1}{4}\eta_0 = 0 \Rightarrow \eta_2 = 0$,
and in general, for $k \ge 2$,
$$\eta_k - \tfrac{1}{2}\eta_{k-1} + \tfrac{1}{4}\eta_{k-2} = 0.$$
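A short sketch that checks part (a) with `np.roots` and reproduces the $\eta_k$ recursion of part (b); the `eta_weights` helper is an illustrative implementation, not from the course.

```python
import numpy as np

# Check invertibility of theta(z) = 1 - z/2 + z^2/4 and compute the
# AR(infinity) weights eta_k from theta(z) * eta(z) = 1.
theta = [1.0, -0.5, 0.25]                     # coefficients of 1, z, z^2

roots = np.roots(theta[::-1])                 # np.roots expects highest degree first
print("roots:", roots, "moduli:", np.abs(roots))   # both moduli equal 2 > 1

def eta_weights(theta, n_terms=8):
    eta = np.zeros(n_terms)
    eta[0] = 1.0 / theta[0]
    for k in range(1, n_terms):
        s = sum(theta[j] * eta[k - j] for j in range(1, min(k, len(theta) - 1) + 1))
        eta[k] = -s / theta[0]
    return eta

print(eta_weights(theta))   # 1, 0.5, 0, -0.125, ...
```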

4) a) Show that the AR(4) process given by
$$X_t - \tfrac{5}{3}X_{t-1} + \tfrac{37}{36}X_{t-2} - \tfrac{5}{18}X_{t-3} + \tfrac{1}{36}X_{t-4} = Z_t$$
has a stationary and causal solution. Hint: $(1 - 5x/6 + x^2/6)^2 = 1 - 5x/3 + 37x^2/36 - 5x^3/18 + x^4/36$.
b) Show that the MA(1) process $X_t = Z_t - \tfrac{1}{2}Z_{t-1}$ is invertible. Determine the coefficients $(\eta_k)_{k \ge 0}$ in the AR($\infty$) representation
$$Z_t = \sum_{k=0}^{\infty} \eta_k X_{t-k}.$$

Solution:
a) We have to show that the roots of
$$\phi(z) = 1 - \tfrac{5}{3}z + \tfrac{37}{36}z^2 - \tfrac{5}{18}z^3 + \tfrac{1}{36}z^4$$
are outside the complex unit circle. By the hint we only need the roots of $1 - \tfrac{5}{6}z + \tfrac{1}{6}z^2$. It is straightforward to check that these roots are $z_1 = 2$ and $z_2 = 3$, both outside the unit circle.
b) $1 - \tfrac{1}{2}z = 0$ only if $z = 2$, which lies outside the unit circle, so the process is invertible. For the representation, iterate $Z_t = \tfrac{1}{2}Z_{t-1} + X_t$:
$$Z_t = \tfrac{1}{2}Z_{t-1} + X_t = \tfrac{1}{4}Z_{t-2} + \tfrac{1}{2}X_{t-1} + X_t = \dots = \sum_{k=0}^{\infty} 2^{-k} X_{t-k},$$
so $\eta_k = 2^{-k}$.
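A quick numerical confirmation of the hint in part (a): the quartic factors as the square of the quadratic, so its roots are 2 and 3, each with multiplicity two.

```python
import numpy as np

# Part 4a: phi(z) = (1 - 5z/6 + z^2/6)^2, so its roots are those of the quadratic,
# each repeated twice; all of them lie outside the unit circle.
quartic = [1/36, -5/18, 37/36, -5/3, 1]      # coefficients from z^4 down to z^0
print(np.roots(quartic))                     # approximately [3, 3, 2, 2]

# Part 4b: for X_t = Z_t - 0.5 Z_{t-1}, the AR(infinity) weights are eta_k = 2^{-k}.
eta = 0.5 ** np.arange(8)
print(eta)
```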

5) Assume that we have a causal and invertible AR(2) process
$$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + Z_t.$$
a) Denote by $\rho(h)$ the autocorrelation function. Determine $\rho(0)$ and $\rho(1)$ in terms of $\phi_1, \phi_2$ and then give a recursive formula for $\rho(h)$, $h \ge 2$.
b) Denote by $\alpha(h)$ the partial autocorrelation function. Determine $\alpha(h)$ for $h \ge 1$ in terms of $\phi_1$ and $\phi_2$.

Solution:
a) Of course $\rho(0) = 1$. Multiplying the equation by $X_{t-1}$ yields
$$X_t X_{t-1} = \phi_1 X_{t-1}^2 + \phi_2 X_{t-2}X_{t-1} + Z_t X_{t-1}.$$
Taking expectations (by causality $E[Z_t X_{t-1}] = 0$) gives $\gamma(1) = \phi_1\gamma(0) + \phi_2\gamma(1)$; dividing by $\gamma(0)$, $\rho(1) = \phi_1 + \phi_2\rho(1)$. Thus
$$\rho(1) = \frac{\phi_1}{1 - \phi_2}.$$
Now multiply the equation by $X_{t-h}$, $h \ge 2$:
$$X_t X_{t-h} = \phi_1 X_{t-1}X_{t-h} + \phi_2 X_{t-2}X_{t-h} + Z_t X_{t-h}.$$
Taking expectations gives $\gamma(h) = \phi_1\gamma(h-1) + \phi_2\gamma(h-2)$, and dividing by $\gamma(0)$,
$$\rho(h) = \phi_1\rho(h-1) + \phi_2\rho(h-2).$$
b) To get $\alpha(1)$ solve the prediction equation $\gamma(0)\phi_{11} = \gamma(1)$. Thus
$$\alpha(1) = \phi_{11} = \frac{\gamma(1)}{\gamma(0)} = \rho(1) = \frac{\phi_1}{1 - \phi_2}.$$
For the AR(2) process $\alpha(2) = \phi_{22} = \phi_2$ and $\alpha(k) = \phi_{kk} = 0$ for $k \ge 3$.
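A sketch that turns parts (a) and (b) into a numerical check. The AR(2) coefficients below are hypothetical, chosen only so the process is causal, and the PACF is recovered from the ACF with the Durbin-Levinson recursion.

```python
import numpy as np

# Hypothetical AR(2) coefficients; build rho(h) from the recursion in part (a)
# and recover the PACF alpha(h) with the Durbin-Levinson recursion.
phi1, phi2 = 0.5, 0.3
H = 6

rho = np.zeros(H + 1)
rho[0] = 1.0
rho[1] = phi1 / (1.0 - phi2)
for h in range(2, H + 1):
    rho[h] = phi1 * rho[h - 1] + phi2 * rho[h - 2]

def pacf_from_acf(rho, H):
    alpha = np.zeros(H + 1)
    phi_prev = np.array([rho[1]])
    alpha[1] = rho[1]
    for n in range(2, H + 1):
        num = rho[n] - phi_prev @ rho[n - 1:0:-1]
        den = 1.0 - phi_prev @ rho[1:n]
        phi_nn = num / den
        phi_prev = np.concatenate([phi_prev - phi_nn * phi_prev[::-1], [phi_nn]])
        alpha[n] = phi_nn
    return alpha

print(pacf_from_acf(rho, H)[1:])   # alpha(1) = phi1/(1-phi2), alpha(2) = phi2, then zeros
```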

6) Let $X_t$ be the MA(3) process
$$X_t = Z_t + \tfrac{1}{2}Z_{t-1} - Z_{t-2} + 2Z_{t-3}.$$
a) Compute the best linear predictor $\hat X_{n+h}$ based on $X_n$ and $X_{n-1}$ for every $h \ge 1$.
b) Compute the mean squared prediction error $E|\hat X_{n+1} - X_{n+1}|^2$.

Solution: First note that
$$\gamma(0) = 1 + \tfrac{1}{4} + 1 + 4 = \tfrac{25}{4}, \quad \gamma(1) = -2, \quad \gamma(2) = 0, \quad \gamma(3) = 2, \quad \gamma(k) = 0 \text{ for } k \ge 4.$$
a) Solve the system of prediction equations
$$\begin{pmatrix} \gamma(0) & \gamma(1) \\ \gamma(1) & \gamma(0) \end{pmatrix} \begin{pmatrix} \phi^{(h)}_{21} \\ \phi^{(h)}_{22} \end{pmatrix} = \begin{pmatrix} \gamma(h) \\ \gamma(h+1) \end{pmatrix}.$$
This is straightforward. Then
$$\hat X_{n+h} = \phi^{(h)}_{21} X_n + \phi^{(h)}_{22} X_{n-1}.$$
Obviously $(\phi^{(h)}_{21}, \phi^{(h)}_{22}) = (0, 0)$ if $h \ge 4$, since then $\gamma(h) = \gamma(h+1) = 0$; hence the best forecast is $0$.
b) Let $(\phi^{(1)}_{21}, \phi^{(1)}_{22}) = (a, b) \approx (-0.36, -0.11)$. Then
$$E[X_{n+1} - aX_n - bX_{n-1}]^2 = \gamma(0)(1 + a^2 + b^2) - 2\gamma(1)(a - ab) - 2b\gamma(2) \approx 5.54.$$
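The two-variable prediction equations and the mean squared error are easy to check numerically; a minimal sketch follows.

```python
import numpy as np

# Solve Gamma_2 * (phi21, phi22)' = (gamma(h), gamma(h+1))' for the MA(3) above,
# then evaluate the one-step mean squared prediction error.
def gamma(h, theta=(1.0, 0.5, -1.0, 2.0)):
    h = abs(h)
    if h >= len(theta):
        return 0.0
    return sum(theta[j] * theta[j + h] for j in range(len(theta) - h))

G = np.array([[gamma(0), gamma(1)],
              [gamma(1), gamma(0)]])

for h in (1, 2, 3, 4):
    a, b = np.linalg.solve(G, [gamma(h), gamma(h + 1)])
    print(f"h={h}: phi21={a:.4f}, phi22={b:.4f}")     # (0, 0) once h >= 4

a, b = np.linalg.solve(G, [gamma(1), gamma(2)])
mse = gamma(0) * (1 + a**2 + b**2) - 2 * gamma(1) * (a - a * b) - 2 * b * gamma(2)
print("one-step MSE:", mse)     # approximately 5.54
```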

7) Let $\{X_t\}$ be the AR(1) process with mean $\mu$,
$$X_t - \mu = 0.8(X_{t-1} - \mu) + Z_t.$$
Based on the sample $X_1, \dots, X_{400}$, compute the length of an approximate 95% confidence interval for $\mu$. (I want to see the numerical value.)

Solution: The CI is
$$\Big[\bar X - 1.96\,\tfrac{\tau}{\sqrt{n}},\ \bar X + 1.96\,\tfrac{\tau}{\sqrt{n}}\Big], \qquad \tau^2 = \sum_{h \in \mathbb{Z}} \gamma(h).$$
We have $X_t - \mu = \sum_{k=0}^{\infty} 0.8^k Z_{t-k}$. Now compute $\gamma(h)$, or use the result we derived in class for MA($\infty$) processes: $\tau^2 = \operatorname{Var}(Z_0)\cdot(\text{sum of coefficients})^2$. Thus
$$\tau^2 = \Big(\sum_{k=0}^{\infty} 0.8^k\Big)^2 = 5^2 = 25.$$
Hence the length is $2 \cdot 1.96 \cdot \tau/\sqrt{n} = 2 \cdot 1.96 \cdot 5/20 = 0.98 \approx 1$.
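A simulation sketch comparing the spread of the sample mean with the asymptotic value $\tau/\sqrt{n} = 0.25$; Gaussian innovations are assumed here purely for the simulation, which WN(0,1) does not require.

```python
import numpy as np

# Simulate the AR(1) many times and compare the spread of the sample mean with
# the asymptotic value tau/sqrt(n) = 5/20 = 0.25 used above.
rng = np.random.default_rng(1)
n, reps, phi, mu = 400, 2000, 0.8, 0.0
means = np.empty(reps)
for r in range(reps):
    x = np.zeros(n)
    z = rng.normal(size=n)                       # Gaussian stand-in for WN(0,1)
    for t in range(1, n):
        x[t] = mu + phi * (x[t - 1] - mu) + z[t]
    means[r] = x.mean()

print("simulated sd of the sample mean:", means.std())   # close to 0.25
print("95% CI length:", 2 * 1.96 * 5 / np.sqrt(n))        # 0.98, i.e. about 1
```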

8) Let $\{X_t\}$ be the MA(1) process $X_t = Z_t - \tfrac{1}{2}Z_{t-1}$. Assume $EZ_t^4 = 1$. Based on the sample $X_1, \dots, X_{400}$, compute the length of an approximate 95% confidence interval for $\gamma(0)$. (I want to see the numerical value.)

Solution: The CI is
$$\Big[\hat\gamma(0) - 1.96\,\tfrac{\tau}{\sqrt{n}},\ \hat\gamma(0) + 1.96\,\tfrac{\tau}{\sqrt{n}}\Big], \qquad \tau^2 = \sum_{h \in \mathbb{Z}} \operatorname{Cov}(X_0^2, X_h^2).$$
Note that $\operatorname{Cov}(X_0^2, X_h^2) = E[X_0^2 X_h^2] - \gamma(0)^2$, with $\gamma(0) = 5/4$. We have
$$E[X_0^2 X_h^2] = E\Big[\big(Z_0^2 - Z_0 Z_{-1} + \tfrac{1}{4}Z_{-1}^2\big)\big(Z_h^2 - Z_h Z_{h-1} + \tfrac{1}{4}Z_{h-1}^2\big)\Big] = \begin{cases} 41/16 & \text{if } h = 0,\\ 25/16 & \text{if } h \ge 1.\end{cases}$$
(For $h = 1$ every cross term contains $Z_1$ or $Z_{-1}$ to the first power and therefore has zero mean, so $E[X_0^2 X_1^2] = 1 + \tfrac{1}{4} + \tfrac{1}{4} + \tfrac{1}{16} = \tfrac{25}{16}$; for $h \ge 2$ the two factors are independent.) Hence $\operatorname{Cov}(X_0^2, X_h^2) = 0$ for $h \ne 0$ and
$$\tau^2 = \tfrac{41}{16} - \tfrac{25}{16} = 1.$$
Thus the length is $2 \cdot 1.96 \cdot \tau/\sqrt{n} = 2 \cdot 1.96/20 \approx 0.2$.
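Since $EZ_t^2 = EZ_t^4 = 1$ forces $Z_t = \pm 1$, a Rademacher simulation gives a quick check of $\tau^2$ and of the interval length; the sign of the MA coefficient does not affect the result.

```python
import numpy as np

# With E Z_t^2 = E Z_t^4 = 1 the noise must be +/-1 (Rademacher). Simulate
# gamma_hat(0) for the MA(1) and compare its variance with tau^2 / n = 1/400.
rng = np.random.default_rng(2)
n, reps, theta = 400, 5000, -0.5
g0_hat = np.empty(reps)
for r in range(reps):
    z = rng.choice([-1.0, 1.0], size=n + 1)
    x = z[1:] + theta * z[:-1]
    g0_hat[r] = np.mean((x - x.mean()) ** 2)

print("simulated n * Var(gamma_hat(0)):", n * g0_hat.var())   # close to tau^2 = 1
print("95% CI length:", 2 * 1.96 * 1.0 / np.sqrt(n))           # about 0.2
```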

6000 Level

9) Let $\{X_t\}$ be a stationary process with mean $\mu$. Show that
$$P_{\mathrm{sp}\{1, X_1, \dots, X_n\}} X_{n+h} = \mu + P_{\mathrm{sp}\{Y_1, \dots, Y_n\}} Y_{n+h},$$
where $\{Y_t\} = \{X_t - \mu\}$.

Solution: The projection $P_{\mathrm{sp}\{1, X_1, \dots, X_n\}} X_{n+h}$ is characterized by
$$\big\langle X_{n+h} - P_{\mathrm{sp}\{1, X_1, \dots, X_n\}} X_{n+h},\ X_i \big\rangle = 0,\ i = 1, \dots, n, \quad \text{and} \quad \big\langle X_{n+h} - P_{\mathrm{sp}\{1, X_1, \dots, X_n\}} X_{n+h},\ 1 \big\rangle = 0.$$
We have to show that
$$\big\langle X_{n+h} - \mu - P_{\mathrm{sp}\{Y_1, \dots, Y_n\}} Y_{n+h},\ X_i \big\rangle = 0,\ i = 1, \dots, n, \quad \text{and} \quad \big\langle X_{n+h} - \mu - P_{\mathrm{sp}\{Y_1, \dots, Y_n\}} Y_{n+h},\ 1 \big\rangle = 0,$$
or equivalently, writing $X_i = Y_i + \mu \cdot 1$,
$$\big\langle Y_{n+h} - P_{\mathrm{sp}\{Y_1, \dots, Y_n\}} Y_{n+h},\ Y_i + \mu \cdot 1 \big\rangle = 0,\ i = 1, \dots, n, \quad \text{and} \quad \big\langle Y_{n+h} - P_{\mathrm{sp}\{Y_1, \dots, Y_n\}} Y_{n+h},\ 1 \big\rangle = 0.$$
Using the linearity of the inner product and the defining property $\big\langle Y_{n+h} - P_{\mathrm{sp}\{Y_1, \dots, Y_n\}} Y_{n+h},\ Y_i \big\rangle = 0$, it only remains to show that
$$\big\langle Y_{n+h} - P_{\mathrm{sp}\{Y_1, \dots, Y_n\}} Y_{n+h},\ 1 \big\rangle = 0.$$
And this is clear, since $P_{\mathrm{sp}\{Y_1, \dots, Y_n\}} Y_{n+h}$ is a linear combination of $Y_1, \dots, Y_n$ and $EY_i = 0$ for all $i$.
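A small population-level check of the identity; the mean and autocovariance below ($\mu = 3$, $\gamma(h) = 2 \cdot 0.7^{|h|}$, an AR(1)-type choice) are hypothetical illustration values only.

```python
import numpy as np

# Compare the projection onto sp{1, X_1, ..., X_n} with mu plus the projection of
# Y_{n+h} onto sp{Y_1, ..., Y_n}, where Y_t = X_t - mu.
mu, n, h = 3.0, 5, 1
gamma = lambda k: 2.0 * 0.7 ** abs(k)      # hypothetical autocovariance

# Projection onto sp{1, X_1, ..., X_n}: Gram matrix uses E[X_i X_j] = gamma(i-j) + mu^2.
G = np.empty((n + 1, n + 1))
G[0, 0] = 1.0
G[0, 1:] = G[1:, 0] = mu
G[1:, 1:] = [[gamma(i - j) + mu**2 for j in range(n)] for i in range(n)]
rhs = np.concatenate([[mu], [gamma(n + h - 1 - i) + mu**2 for i in range(n)]])
coef = np.linalg.solve(G, rhs)             # coefficients of 1, X_1, ..., X_n

# Projection onto sp{Y_1, ..., Y_n}: ordinary prediction equations for the centered series.
Gy = np.array([[gamma(i - j) for j in range(n)] for i in range(n)])
a = np.linalg.solve(Gy, [gamma(n + h - 1 - i) for i in range(n)])

# The two predictors agree: same weights on the X's, intercept equal to mu*(1 - sum(a)).
print("weights on X's:", coef[1:], "vs", a)
print("intercept:", coef[0], "vs", mu * (1 - a.sum()))
```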
