Financial Time Series Analysis Week 5


Estimation in AR Models

Central Limit Theorem for $\hat\mu$ in the AR(1) Model

Recall 1: If $X \sim N(\mu, \sigma^2)$, a normally distributed random variable with mean $\mu$ and variance $\sigma^2$, then

$$\frac{X - \mu}{\sigma} \sim N(0, 1).$$

Recall 2: Let $X_1, X_2, \dots$ be a sequence of IID$(\mu, \sigma^2)$ random variables, and let $\bar X_n$ be the sample mean with sample size $n$, given by $\bar X_n = \frac{1}{n}\sum_{i=1}^{n} X_i$. Note that $E[\bar X_n] = \mu$ and $\mathrm{Var}(\bar X_n) = \sigma^2/n$. The central limit theorem (CLT) says that

$$\frac{\bar X_n - \mu}{\sigma/\sqrt{n}} \xrightarrow{d} N(0, 1),$$

where $\xrightarrow{d}$ indicates convergence in distribution:

$$P\left(\frac{\bar X_n - \mu}{\sigma/\sqrt{n}} \le x\right) \to \Phi(x) = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}}\, e^{-u^2/2}\, du.$$

In other words,

$$\sqrt{n}\,(\bar X_n - \mu) \xrightarrow{d} N(0, \sigma^2).$$

The CLT is used to establish hypothesis tests and confidence intervals.

Now we consider the stationary AR(1) model

$$X_t - \mu = \phi(X_{t-1} - \mu) + \epsilon_t,$$

where $\{\epsilon_t\} \sim \mathrm{IID}(0, \sigma_\epsilon^2)$ and $|\phi| < 1$. Recall that the AR(1) model can be expressed as a linear time series as follows:

$$X_t = \mu + \sum_{j=0}^{\infty} \phi^j \epsilon_{t-j}.$$

Let $\hat\mu = \bar X_n = \frac{1}{n}\sum_{t=1}^{n} X_t$, an estimator of the mean $\mu$. Our goal is to establish a CLT for $\hat\mu$.

Theorem 2.3. In the weakly stationary AR(1) model $X_t - \mu = \phi(X_{t-1} - \mu) + \epsilon_t$ with mean $\mu$, where $\{\epsilon_t\} \sim \mathrm{IID}(0, \sigma_\epsilon^2)$, the sample mean $\hat\mu$ satisfies the asymptotic normality: as $n \to \infty$,

$$\sqrt{n}\,(\hat\mu - \mu) \xrightarrow{d} N\!\left(0, \frac{\sigma_\epsilon^2}{(1-\phi)^2}\right).$$
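Before turning to the proof, the following is a small Monte Carlo sketch (not part of the original notes) that illustrates Theorem 2.3. It assumes NumPy is available; the function name simulate_ar1 and the parameter values are illustrative only.

```python
import numpy as np

def simulate_ar1(n, mu, phi, sigma_eps, burn=500, seed=None):
    """Simulate a stationary AR(1) path: X_t - mu = phi * (X_{t-1} - mu) + eps_t."""
    rng = np.random.default_rng(seed)
    x = mu
    path = np.empty(n)
    for t in range(n + burn):
        x = mu + phi * (x - mu) + rng.normal(0.0, sigma_eps)
        if t >= burn:
            path[t - burn] = x
    return path

# Monte Carlo check of Theorem 2.3: the variance of sqrt(n) * (mu_hat - mu)
# should be close to sigma_eps^2 / (1 - phi)^2 for large n.
n, mu, phi, sigma_eps = 2000, 0.1, 0.6, 1.0
stats = [np.sqrt(n) * (simulate_ar1(n, mu, phi, sigma_eps, seed=r).mean() - mu)
         for r in range(500)]
print("empirical variance  :", np.var(stats))
print("theoretical variance:", sigma_eps**2 / (1 - phi)**2)
```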

Proof: Let $b_j = \sum_{k=j+1}^{\infty} \phi^k$. Then $b_{j-1} - b_j = \phi^j$. Define a new linear time series $Y_t = \sum_{j=0}^{\infty} b_j \epsilon_{t-j}$.

CHECK: $\sum_{j=0}^{\infty} b_j^2 < \infty$ since $|\phi| < 1$:

$$\sum_{j=0}^{\infty} b_j^2 = \sum_{j=0}^{\infty}\left(\sum_{k=j+1}^{\infty} \phi^k\right)^2 = \sum_{j=0}^{\infty}\left(\frac{\phi^{j+1}}{1-\phi}\right)^2 = \frac{\phi^2}{(1-\phi)^2}\sum_{j=0}^{\infty} \phi^{2j} = \frac{\phi^2}{(1-\phi)^2(1-\phi^2)} < \infty.$$

Thus $\mathrm{Var}(Y_t) = \sigma_\epsilon^2 \sum_{j=0}^{\infty} b_j^2 < \infty$. Also note that $E Y_t = 0$ and $\mathrm{Cov}(Y_t, Y_{t+h}) = \sigma_\epsilon^2 \sum_{j=0}^{\infty} b_j b_{j+h}$, and thus $\{Y_t\}$ is stationary.

Now we express $X_t$ in terms of $\{Y_t\}$ (note that $\sum_{j=1}^{\infty}\phi^j = b_0$):

$$\begin{aligned}
X_t &= \mu + \sum_{j=0}^{\infty} \phi^j \epsilon_{t-j}
   = \mu + \epsilon_t + \sum_{j=1}^{\infty} \phi^j \epsilon_{t-j}
   = \mu + \epsilon_t + \sum_{j=1}^{\infty} (b_{j-1} - b_j)\epsilon_{t-j} \\
&= \mu + \Big(\sum_{j=0}^{\infty}\phi^j\Big)\epsilon_t - \Big(\sum_{j=1}^{\infty}\phi^j\Big)\epsilon_t + \sum_{j=1}^{\infty} b_{j-1}\epsilon_{t-j} - \sum_{j=1}^{\infty} b_j \epsilon_{t-j} \\
&= \mu + \Big(\sum_{j=0}^{\infty}\phi^j\Big)\epsilon_t - b_0\epsilon_t - \sum_{j=1}^{\infty} b_j\epsilon_{t-j} + \sum_{j=1}^{\infty} b_{j-1}\epsilon_{t-j} \qquad (\text{letting } k = j-1,\ j = k+1)\\
&= \mu + \Big(\sum_{j=0}^{\infty}\phi^j\Big)\epsilon_t - Y_t + \sum_{k=0}^{\infty} b_k\epsilon_{t-1-k} \\
&= \mu + \Big(\sum_{j=0}^{\infty}\phi^j\Big)\epsilon_t - Y_t + Y_{t-1}.
\end{aligned}$$

Thus

$$X_t - \mu = \Big(\sum_{j=0}^{\infty}\phi^j\Big)\epsilon_t - Y_t + Y_{t-1} = \frac{1}{1-\phi}\,\epsilon_t - Y_t + Y_{t-1}.$$

Apply $\frac{1}{n}\sum_{t=1}^{n}$ to both sides (the sum of $-Y_t + Y_{t-1}$ telescopes):

$$\hat\mu - \mu = \frac{1}{1-\phi}\cdot\frac{1}{n}\sum_{t=1}^{n}\epsilon_t - \frac{1}{n}(Y_n - Y_0),$$

$$\sqrt{n}\,(\hat\mu - \mu) = \frac{1}{1-\phi}\,\sqrt{n}\,\bar\epsilon_n - \frac{1}{\sqrt{n}}(Y_n - Y_0).$$

Since $\{\epsilon_t\} \sim \mathrm{IID}(0, \sigma_\epsilon^2)$, by the CLT for IID sequences we know

$$\sqrt{n}\,\bar\epsilon_n = \sqrt{n}\,(\bar\epsilon_n - 0) \xrightarrow{d} N(0, \sigma_\epsilon^2),$$

where $\bar\epsilon_n = \frac{1}{n}\sum_{t=1}^{n}\epsilon_t$ is the sample mean of $\{\epsilon_t\}$. Thus we have

$$\frac{1}{1-\phi}\,\sqrt{n}\,\bar\epsilon_n \xrightarrow{d} N\!\left(0, \frac{\sigma_\epsilon^2}{(1-\phi)^2}\right).$$

Note that $\frac{1}{\sqrt{n}}Y_n \xrightarrow{p} 0$ and $\frac{1}{\sqrt{n}}Y_0 \xrightarrow{p} 0$ as $n \to \infty$. Hence we have

$$\sqrt{n}\,(\hat\mu - \mu) \xrightarrow{d} N\!\left(0, \frac{\sigma_\epsilon^2}{(1-\phi)^2}\right).$$

Yule-Walker Estimator in the AR(p) Model

Consider a stationary AR(p) model with mean zero:

$$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \dots + \phi_p X_{t-p} + \epsilon_t,$$

where $\epsilon_t \sim \mathrm{WN}(0, \sigma_\epsilon^2)$. Our goal is to estimate the unknown coefficients $\phi_1, \phi_2, \dots, \phi_p$ of the AR(p) model from the data. Suppose the financial log-return data are observed: $r_1, r_2, \dots, r_T$.

Recall: the autocorrelation function $\rho(h)$ is estimated by the sample autocorrelation function

$$\hat\rho(h) = \frac{\hat\gamma(h)}{\hat\gamma(0)}, \qquad \hat\gamma(h) = \frac{1}{T}\sum_{t=1}^{T-h}(r_t - \bar r)(r_{t+h} - \bar r), \qquad \bar r = \frac{1}{T}\sum_{t=1}^{T} r_t.$$

Using the sample autocorrelation function $\hat\rho(h)$, we will estimate the coefficients $\phi_1, \phi_2, \dots, \phi_p$.

In order to construct the Yule-Walker equations, we multiply both sides of the AR(p) model by $X_{t-j}$, for $j = 1, 2, \dots, p$, and take expectations to obtain

$$E[X_t X_{t-j}] = E[\phi_1 X_{t-1}X_{t-j} + \phi_2 X_{t-2}X_{t-j} + \dots + \phi_p X_{t-p}X_{t-j} + \epsilon_t X_{t-j}],$$

$$\gamma(j) = \phi_1\gamma(j-1) + \phi_2\gamma(j-2) + \dots + \phi_p\gamma(j-p) + 0.$$

This equation is called the Yule-Walker equation of the AR(p) model. Divide by $\gamma(0)$ to obtain

$$\frac{\gamma(j)}{\gamma(0)} = \phi_1\frac{\gamma(j-1)}{\gamma(0)} + \phi_2\frac{\gamma(j-2)}{\gamma(0)} + \dots + \phi_p\frac{\gamma(j-p)}{\gamma(0)},$$

$$\rho(j) = \phi_1\rho(j-1) + \phi_2\rho(j-2) + \dots + \phi_p\rho(j-p).$$

Note that $\rho(-h) = \rho(h)$ below. Writing out the Yule-Walker equations for $j = 1, 2, \dots, p$:

$$\begin{aligned}
\rho(1) &= \phi_1\rho(0) + \phi_2\rho(1) + \phi_3\rho(2) + \dots + \phi_p\rho(p-1) && (j=1)\\
\rho(2) &= \phi_1\rho(1) + \phi_2\rho(0) + \phi_3\rho(1) + \dots + \phi_p\rho(p-2) && (j=2)\\
\rho(3) &= \phi_1\rho(2) + \phi_2\rho(1) + \phi_3\rho(0) + \dots + \phi_p\rho(p-3) && (j=3)\\
&\ \ \vdots\\
\rho(p) &= \phi_1\rho(p-1) + \phi_2\rho(p-2) + \dots + \phi_p\rho(0) && (j=p)
\end{aligned}$$

Its matrix form is given by

$$\begin{pmatrix} \rho(1)\\ \rho(2)\\ \rho(3)\\ \vdots\\ \rho(p) \end{pmatrix}
=
\begin{pmatrix}
\rho(0) & \rho(1) & \rho(2) & \cdots & \rho(p-1)\\
\rho(1) & \rho(0) & \rho(1) & \cdots & \rho(p-2)\\
\rho(2) & \rho(1) & \rho(0) & \cdots & \rho(p-3)\\
\vdots & \vdots & \vdots & & \vdots\\
\rho(p-1) & \rho(p-2) & \rho(p-3) & \cdots & \rho(0)
\end{pmatrix}
\begin{pmatrix} \phi_1\\ \phi_2\\ \phi_3\\ \vdots\\ \phi_p \end{pmatrix}.$$

Denote it by $B = AX$, where $B$ is a $p\times 1$ column vector, $A$ is a $p\times p$ matrix, and $X$ is a $p\times 1$ column vector of the unknown coefficients $\phi_1, \phi_2, \dots, \phi_p$ to be estimated. Thus $X = A^{-1}B$, if $A$ is invertible. That is,

$$\begin{pmatrix} \phi_1\\ \phi_2\\ \phi_3\\ \vdots\\ \phi_p \end{pmatrix}
=
\begin{pmatrix}
\rho(0) & \rho(1) & \rho(2) & \cdots & \rho(p-1)\\
\rho(1) & \rho(0) & \rho(1) & \cdots & \rho(p-2)\\
\rho(2) & \rho(1) & \rho(0) & \cdots & \rho(p-3)\\
\vdots & \vdots & \vdots & & \vdots\\
\rho(p-1) & \rho(p-2) & \rho(p-3) & \cdots & \rho(0)
\end{pmatrix}^{-1}
\begin{pmatrix} \rho(1)\\ \rho(2)\\ \rho(3)\\ \vdots\\ \rho(p) \end{pmatrix}.$$

Therefore, using $\hat\rho(h)$ in place of $\rho(h)$, we obtain the Yule-Walker estimator $(\hat\phi_1, \hat\phi_2, \dots, \hat\phi_p)$ of the unknown coefficients $(\phi_1, \phi_2, \dots, \phi_p)$:

$$\begin{pmatrix} \hat\phi_1\\ \hat\phi_2\\ \hat\phi_3\\ \vdots\\ \hat\phi_p \end{pmatrix}
=
\begin{pmatrix}
1 & \hat\rho(1) & \hat\rho(2) & \cdots & \hat\rho(p-1)\\
\hat\rho(1) & 1 & \hat\rho(1) & \cdots & \hat\rho(p-2)\\
\hat\rho(2) & \hat\rho(1) & 1 & \cdots & \hat\rho(p-3)\\
\vdots & \vdots & \vdots & & \vdots\\
\hat\rho(p-1) & \hat\rho(p-2) & \hat\rho(p-3) & \cdots & 1
\end{pmatrix}^{-1}
\begin{pmatrix} \hat\rho(1)\\ \hat\rho(2)\\ \hat\rho(3)\\ \vdots\\ \hat\rho(p) \end{pmatrix}.$$
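A short computational sketch (not in the original notes) of the Yule-Walker estimator above, assuming NumPy is available. The names sample_acf and yule_walker are illustrative, and the sample autocovariance uses the $1/T$ convention from this section.

```python
import numpy as np

def sample_acf(r, max_lag):
    """Sample autocorrelations rho_hat(0), ..., rho_hat(max_lag), with gamma_hat(h) = (1/T) * sum."""
    r = np.asarray(r, dtype=float)
    T, rbar = len(r), r.mean()
    gamma = [np.sum((r[:T - h] - rbar) * (r[h:] - rbar)) / T for h in range(max_lag + 1)]
    return np.array(gamma) / gamma[0]

def yule_walker(r, p):
    """Solve the p x p Yule-Walker system B = A X for (phi_1, ..., phi_p)."""
    rho = sample_acf(r, p)
    A = np.array([[rho[abs(i - j)] for j in range(p)] for i in range(p)])  # Toeplitz matrix in rho
    B = rho[1:p + 1]
    return np.linalg.solve(A, B)

# Example on a simulated AR(2) series (hypothetical coefficients 0.5 and -0.2):
rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
for t in range(2, 1000):
    x[t] += 0.5 * x[t - 1] - 0.2 * x[t - 2]
print(yule_walker(x, p=2))   # roughly [0.5, -0.2]
```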

Example 1: In a stationary AR(1) model, the Yule-Walker equation is given by

$$\gamma(1) = \phi\,\gamma(0),$$

thus $\rho(1) = \phi$, and hence the Yule-Walker estimator of $\phi$ is given by $\hat\phi = \hat\rho(1)$.

Example 2: In a stationary AR(2) model, the Yule-Walker equations are given by

$$\gamma(j) = \phi_1\gamma(j-1) + \phi_2\gamma(j-2) \quad \text{for } j = 1, 2.$$

Thus for $j = 1, 2$, $\rho(j) = \phi_1\rho(j-1) + \phi_2\rho(j-2)$. That is,

$$\rho(1) = \phi_1\rho(0) + \phi_2\rho(1), \qquad \rho(2) = \phi_1\rho(1) + \phi_2\rho(0).$$

Its matrix form is

$$\begin{pmatrix} \rho(1)\\ \rho(2) \end{pmatrix}
= \begin{pmatrix} \rho(0) & \rho(1)\\ \rho(1) & \rho(0) \end{pmatrix}
\begin{pmatrix} \phi_1\\ \phi_2 \end{pmatrix},$$

and hence the Yule-Walker estimators of $\phi_1, \phi_2$ are given by

$$\begin{pmatrix} \hat\phi_1\\ \hat\phi_2 \end{pmatrix}
= \begin{pmatrix} 1 & \hat\rho(1)\\ \hat\rho(1) & 1 \end{pmatrix}^{-1}
\begin{pmatrix} \hat\rho(1)\\ \hat\rho(2) \end{pmatrix}.$$

Recall the inverse of a $2\times 2$ matrix:

$$\begin{pmatrix} a & b\\ c & d \end{pmatrix}^{-1} = \frac{1}{ad - bc}\begin{pmatrix} d & -b\\ -c & a \end{pmatrix}.$$

Therefore, the Yule-Walker estimators of $\phi_1, \phi_2$ are given by

$$\begin{pmatrix} \hat\phi_1\\ \hat\phi_2 \end{pmatrix}
= \frac{1}{1 - \hat\rho(1)^2}\begin{pmatrix} 1 & -\hat\rho(1)\\ -\hat\rho(1) & 1 \end{pmatrix}
\begin{pmatrix} \hat\rho(1)\\ \hat\rho(2) \end{pmatrix}
= \frac{1}{1 - \hat\rho(1)^2}\begin{pmatrix} \hat\rho(1) - \hat\rho(1)\hat\rho(2)\\ \hat\rho(2) - \hat\rho(1)^2 \end{pmatrix},$$

that is,

$$\hat\phi_1 = \frac{\hat\rho(1) - \hat\rho(1)\hat\rho(2)}{1 - \hat\rho(1)^2}, \qquad \hat\phi_2 = \frac{\hat\rho(2) - \hat\rho(1)^2}{1 - \hat\rho(1)^2}.$$
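As a quick numerical illustration (not in the original notes), the closed-form AR(2) expressions above agree with the general matrix solve from the AR(p) section; the autocorrelation values used below are hypothetical.

```python
import numpy as np

rho1, rho2 = 0.5, 0.3   # hypothetical sample autocorrelations rho_hat(1), rho_hat(2)

# Closed-form estimates from Example 2
phi1_hat = (rho1 - rho1 * rho2) / (1 - rho1**2)
phi2_hat = (rho2 - rho1**2) / (1 - rho1**2)

# General p = 2 matrix solve, as in the AR(p) section
A = np.array([[1.0, rho1], [rho1, 1.0]])
B = np.array([rho1, rho2])

print(phi1_hat, phi2_hat)      # approximately 0.4667 and 0.0667
print(np.linalg.solve(A, B))   # the same values
```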

Regression Model

Before we derive the least squares estimator of AR models, we recall the regression model and its least squares estimator. Let $(X_1, Y_1), \dots, (X_n, Y_n)$ be observed. We wish to describe a linear relation between $X_t$ and $Y_t$ for each $t$:

$$Y_t = \alpha + \beta X_t + \epsilon_t, \qquad \{\epsilon_t\} \sim \mathrm{IID}(0, \sigma_\epsilon^2). \tag{1}$$

We will estimate the unknown coefficients $\alpha, \beta$. Model (1) is called a regression model. To estimate, we use the least squares method, which minimizes the sum of squared errors (SSE):

$$\sum_{t=1}^{n}\epsilon_t^2 = \sum_{t=1}^{n}(Y_t - \alpha - \beta X_t)^2.$$

We will find $\alpha$ and $\beta$ so that $\sum_{t=1}^{n}(Y_t - \alpha - \beta X_t)^2$ is minimized. Let $(\hat\alpha, \hat\beta) = \arg\min_{(\alpha,\beta)} \mathrm{SSE}$, and let $f(\alpha, \beta) = \sum_{t=1}^{n}(Y_t - \alpha - \beta X_t)^2$, which is a function of the unknown $(\alpha, \beta)$. Setting the partial derivatives to zero,

$$\frac{\partial}{\partial\beta} f(\alpha, \beta) = -2\sum_{t=1}^{n}(Y_t - \alpha - \beta X_t)X_t = 0,$$

$$\frac{\partial}{\partial\alpha} f(\alpha, \beta) = -2\sum_{t=1}^{n}(Y_t - \alpha - \beta X_t) = 0,$$

that is,

$$\sum_{t=1}^{n}(X_t Y_t - \alpha X_t - \beta X_t^2) = 0, \qquad \sum_{t=1}^{n}(Y_t - \alpha - \beta X_t) = 0.$$

From the last equation, we obtain

$$\alpha = \frac{1}{n}\sum_{t=1}^{n} Y_t - \beta\,\frac{1}{n}\sum_{t=1}^{n} X_t = \bar Y - \beta\bar X.$$

Thus,

$$\sum_{t=1}^{n} X_t Y_t - \alpha\sum_{t=1}^{n} X_t - \beta\sum_{t=1}^{n} X_t^2 = 0,$$

$$\sum_{t=1}^{n} X_t Y_t - (\bar Y - \beta\bar X)\sum_{t=1}^{n} X_t - \beta\sum_{t=1}^{n} X_t^2 = 0,$$

$$\sum_{t=1}^{n} X_t Y_t - n\bar X\bar Y + \beta n\bar X^2 - \beta\sum_{t=1}^{n} X_t^2 = 0.$$

Therefore

$$\hat\beta = \frac{\sum_{t=1}^{n} X_t Y_t - n\bar X\bar Y}{\sum_{t=1}^{n} X_t^2 - n\bar X^2}, \qquad \hat\alpha = \bar Y - \hat\beta\bar X.$$

$(\hat\alpha, \hat\beta)$ is called the ordinary least squares (OLS) estimator of $(\alpha, \beta)$.
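A minimal sketch (not part of the original notes) computing the OLS formulas just derived, assuming NumPy is available; the data below are hypothetical.

```python
import numpy as np

def ols(x, y):
    """OLS estimates (alpha_hat, beta_hat) for Y_t = alpha + beta * X_t + eps_t."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n, xbar, ybar = len(x), x.mean(), y.mean()
    beta_hat = (np.sum(x * y) - n * xbar * ybar) / (np.sum(x**2) - n * xbar**2)
    alpha_hat = ybar - beta_hat * xbar
    return alpha_hat, beta_hat

# Hypothetical data: y is approximately 1 + 2 x plus noise
rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 100)
y = 1.0 + 2.0 * x + rng.normal(0, 0.1, 100)
print(ols(x, y))   # roughly (1.0, 2.0)
```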

Least Squares Estimator of the AR(1) Model

Now we will find the OLS estimator of the coefficient $\phi$ of the AR(1) model with mean zero. (If the mean is $\mu \neq 0$, then consider the AR(1) model for $\{X_t - \mu : t = 1, 2, \dots\}$.)

$$X_t = \phi X_{t-1} + \epsilon_t, \qquad \{\epsilon_t\} \sim \mathrm{IID}(0, \sigma_\epsilon^2).$$

The sum of squared errors from the data $X_1, X_2, \dots, X_n$ is given by

$$S(\phi) := \sum_{t=2}^{n}(X_t - \phi X_{t-1})^2.$$

We will find $\phi$ so that $S(\phi)$ is minimized. Let $\hat\phi = \arg\min_\phi S(\phi)$. Then

$$\frac{d}{d\phi}S(\phi) = -2\sum_{t=2}^{n}(X_t - \phi X_{t-1})X_{t-1} = 0,$$

$$\sum_{t=2}^{n} X_t X_{t-1} - \phi\sum_{t=2}^{n} X_{t-1}^2 = 0,$$

$$\hat\phi = \frac{\sum_{t=2}^{n} X_t X_{t-1}}{\sum_{t=2}^{n} X_{t-1}^2}.$$

$\hat\phi$ is called the OLS estimator of $\phi$ in the AR(1) model.
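A brief sketch (not in the original notes) of the AR(1) OLS estimator above, assuming a mean-zero (or demeaned) series and NumPy; ar1_ols is an illustrative name.

```python
import numpy as np

def ar1_ols(x):
    """OLS estimate phi_hat = sum_{t=2}^n X_t X_{t-1} / sum_{t=2}^n X_{t-1}^2 (mean-zero series)."""
    x = np.asarray(x, dtype=float)
    return np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)

# Example usage: for a demeaned return series r, phi_hat = ar1_ols(r - r.mean())
```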

Central Limit Theorem of the OLS Estimator $\hat\phi$

Theorem 2.4. In the AR(1) model $X_t = \phi X_{t-1} + \epsilon_t$ with $\{\epsilon_t\} \sim \mathrm{IID}(0, \sigma_\epsilon^2)$, the OLS estimator $\hat\phi$ satisfies the asymptotic normality: as $n \to \infty$,

$$\sqrt{n}\,(\hat\phi - \phi) \xrightarrow{d} N\!\left(0, \frac{\sigma_\epsilon^2}{E[X_1^2]}\right).$$

Proof: Start from $X_t = \phi X_{t-1} + \epsilon_t$ and multiply both sides by $X_{t-1}$:

$$X_t X_{t-1} = \phi X_{t-1}^2 + \epsilon_t X_{t-1},$$

$$\sum_{t=2}^{n} X_t X_{t-1} = \phi\sum_{t=2}^{n} X_{t-1}^2 + \sum_{t=2}^{n}\epsilon_t X_{t-1}.$$

Divide by $\sum_{t=2}^{n} X_{t-1}^2$ on both sides to obtain

$$\hat\phi = \phi + \frac{\sum_{t=2}^{n}\epsilon_t X_{t-1}}{\sum_{t=2}^{n} X_{t-1}^2}.$$

Thus

$$\sqrt{n}\,(\hat\phi - \phi) = \frac{\sqrt{n}\sum_{t=2}^{n}\epsilon_t X_{t-1}}{\sum_{t=2}^{n} X_{t-1}^2} = \frac{\frac{1}{\sqrt{n}}\sum_{t=2}^{n}\epsilon_t X_{t-1}}{\frac{1}{n}\sum_{t=2}^{n} X_{t-1}^2}.$$

We examine the limits of the numerator and the denominator, respectively.

[1] First, we observe the numerator $\frac{1}{\sqrt{n}}\sum_{t=2}^{n}\epsilon_t X_{t-1}$. Its mean is $E[\epsilon_t X_{t-1}] = 0$. Its variance is

$$\mathrm{Var}\!\left(\frac{1}{\sqrt{n}}\sum_{t=2}^{n}\epsilon_t X_{t-1}\right) = \frac{n-1}{n}\,\mathrm{Var}(\epsilon_t X_{t-1}) \to \mathrm{Var}(\epsilon_t X_{t-1}) \quad \text{as } n \to \infty,$$

where

$$\mathrm{Var}(\epsilon_t X_{t-1}) = E[\epsilon_t^2 X_{t-1}^2] \ \ (\text{since } E[\epsilon_t X_{t-1}] = 0) \ = E[\epsilon_t^2]\,E[X_{t-1}^2] = \sigma_\epsilon^2\,E[X_1^2],$$

by the independence of $\epsilon_t$ and $X_{t-1}$ and by the stationarity of $X_t$. Thus

$$\frac{1}{\sqrt{n}}\sum_{t=2}^{n}\epsilon_t X_{t-1} \xrightarrow{d} N\!\left(0, \sigma_\epsilon^2\,E[X_1^2]\right).$$

(The detailed proof is omitted since it is at the graduate level. A key point is that the terms $\{\epsilon_t X_{t-1} : t = 2, 3, \dots, n\}$ are uncorrelated, with $\epsilon_t$ independent of $X_{t-1}$, so a suitable CLT can be applied to this sequence.)

[2] Secondly, we observe the denominator $\frac{1}{n}\sum_{t=2}^{n} X_{t-1}^2$:

$$\frac{1}{n}\sum_{t=2}^{n} X_{t-1}^2 \xrightarrow{p} E[X_1^2].$$

Therefore, by [1] and [2], we have

$$\sqrt{n}\,(\hat\phi - \phi) \xrightarrow{d} \frac{1}{E[X_1^2]}\,N\!\left(0, \sigma_\epsilon^2\,E[X_1^2]\right) = N\!\left(0, \frac{\sigma_\epsilon^2}{E[X_1^2]}\right),$$

where we use the fact that $\mathrm{Var}(aX) = a^2\,\mathrm{Var}(X)$.
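One useful consequence (a remark added here, using the stationary AR(1) variance formula $\gamma(0) = \sigma_\epsilon^2/(1-\phi^2)$): since $E[X_1^2] = \gamma(0) = \sigma_\epsilon^2/(1-\phi^2)$ for the mean-zero stationary AR(1) model, the limiting variance in Theorem 2.4 simplifies to

$$\frac{\sigma_\epsilon^2}{E[X_1^2]} = 1 - \phi^2,$$

that is, $\sqrt{n}\,(\hat\phi - \phi) \xrightarrow{d} N(0,\, 1-\phi^2)$.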

Forecasting in the AR(1) Model

Recall:
(i) The conditional expectation of $Y$ given $X$, $E[Y\mid X]$, is a function of $X$, because $E[Y\mid X = x] = \int y f(y\mid x)\,dy$ is a function of $x$.
(ii) The conditional expectation of $Y$ given $X_1, \dots, X_t$, $E[Y\mid X_1, X_2, \dots, X_t]$, is a function of $(X_1, X_2, \dots, X_t)$.
(iii) $E[g(X_1, X_2, \dots, X_n)\mid X_1, \dots, X_n] = g(X_1, X_2, \dots, X_n)$, where $g$ is a function from $\mathbb{R}^n$ to $\mathbb{R}$.

We study forecasting future values of a stationary AR(1) model with mean $\mu$:

$$X_t = \mu + \phi(X_{t-1} - \mu) + \epsilon_t, \qquad |\phi| < 1.$$

The mean $\mu$ and the coefficient $\phi$ can be estimated from the data, and so we may assume that $\mu$ and $\phi$ are known. If they are unknown, we use the estimators defined from the data $X_1, \dots, X_t$ observed at times $1, 2, \dots, t$:

$$\mu \approx \hat\mu \ (\text{sample mean}), \qquad \phi \approx \hat\phi \ (\text{Yule-Walker or OLS estimator}).$$

Now we forecast future data. Let $t$ be the present time, and suppose that we have the information $X_1, X_2, \dots, X_t$ together with the mean $\mu$ and the coefficient $\phi$.

One-step ahead forecast: Let $\hat X_{t+1} = \hat X_t(1)$ be the one-step ahead forecast of the AR(1) model:

$$\begin{aligned}
\hat X_{t+1} = \hat X_t(1) &= E[X_{t+1}\mid X_1, \dots, X_t]\\
&= E[\mu + \phi(X_t - \mu) + \epsilon_{t+1}\mid X_1, \dots, X_t]\\
&= \mu + \phi(X_t - \mu) + E[\epsilon_{t+1}\mid X_1, \dots, X_t]\\
&= \mu + \phi(X_t - \mu).
\end{aligned}$$

Two-step ahead forecast: Let $\hat X_{t+2} = \hat X_t(2)$ be the two-step ahead forecast of the AR(1) model:

$$\begin{aligned}
\hat X_{t+2} = \hat X_t(2) &= E[X_{t+2}\mid X_1, \dots, X_t]\\
&= E[\mu + \phi(X_{t+1} - \mu) + \epsilon_{t+2}\mid X_1, \dots, X_t]\\
&= \mu + \phi\big(E[X_{t+1}\mid X_1, \dots, X_t] - \mu\big) + E[\epsilon_{t+2}\mid X_1, \dots, X_t]\\
&= \mu + \phi\big(\hat X_t(1) - \mu\big) + 0\\
&= \mu + \phi\big(\mu + \phi(X_t - \mu) - \mu\big)\\
&= \mu + \phi^2(X_t - \mu).
\end{aligned}$$

By mathematical induction, we may assume that

$$\hat X_{t+l-1} = \hat X_t(l-1) = \mu + \phi\big(\hat X_t(l-2) - \mu\big), \quad \text{i.e.,} \quad \hat X_t(l-1) - \mu = \phi\big(\hat X_t(l-2) - \mu\big),$$

and, for each $k = 2, 3, \dots, l-1$, $\hat X_t(k) - \mu = \phi\big(\hat X_t(k-1) - \mu\big)$.

$l$-step ahead forecast:

Let $\hat X_{t+l} = \hat X_t(l)$ be the $l$-step ahead forecast of the AR(1) model:

$$\begin{aligned}
\hat X_{t+l} = \hat X_t(l) &= E[X_{t+l}\mid X_1, \dots, X_t]\\
&= E[\mu + \phi(X_{t+l-1} - \mu) + \epsilon_{t+l}\mid X_1, \dots, X_t]\\
&= \mu + \phi\big(\hat X_t(l-1) - \mu\big) \qquad \text{since } \hat X_t(l-1) = \mu + \phi\big(\hat X_t(l-2) - \mu\big)\\
&= \mu + \phi^2\big(\hat X_t(l-2) - \mu\big)\\
&\ \ \vdots\\
&= \mu + \phi^{l-1}\big(\hat X_t(1) - \mu\big)\\
&= \mu + \phi^l(X_t - \mu).
\end{aligned}$$

Since $|\phi| < 1$, $\phi^l \to 0$ as $l \to \infty$. Thus $\hat X_t(l) \to \mu$ as $l \to \infty$: the forecasts of a stationary AR(1) model approach the mean as the forecast horizon goes to infinity.
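A compact sketch (not part of the original notes) of the $l$-step forecast formula $\hat X_t(l) = \mu + \phi^l(X_t - \mu)$, assuming NumPy is available; the numbers in the example are hypothetical.

```python
import numpy as np

def ar1_forecast(x_t, mu, phi, horizon):
    """l-step ahead forecasts X_hat_t(l) = mu + phi**l * (x_t - mu), for l = 1, ..., horizon."""
    l = np.arange(1, horizon + 1)
    return mu + phi**l * (x_t - mu)

# Hypothetical example: last observed log return 0.02, estimated mu = 0.005, phi = 0.4
print(ar1_forecast(x_t=0.02, mu=0.005, phi=0.4, horizon=5))   # forecasts decay toward mu
```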

Homework 3

Consider the following log-return financial time series data (table of time $t$ versus log return $r_t$):

1. Compute the sample autocovariance $\hat\gamma(h)$ and the sample autocorrelation $\hat\rho(h)$ for $h = 1, 2$.

2. Assume that we fit an AR(1) model to the data and compute the Yule-Walker estimate of $\phi$.

3. Assume that we fit an AR(2) model to the data and compute the Yule-Walker estimates of $\phi_1$ and $\phi_2$.

4. Assume that we fit an AR(1) model to the data and compute the OLS estimate of $\phi$.

5. Forecast the future values of the log-returns at the next two time points by using the Yule-Walker estimate from problem 2.

6. Forecast the future values of the log-returns at the next two time points by using the OLS estimate from problem 4.

Warning: In some of the theories studied in class, the model has mean zero. Thus, when you apply the theory, you should use mean-zero data, computed from the data in the table above.
