Analyzing Aggregated AR(1) Processes

Jon Gunnip. Supervisory Committee: Professor Lajos Horváth (Committee Chair), Professor Davar Khoshnevisan, Professor Paul Roberts.

What is an AR(1) Process? Let ǫ_i be i.i.d. with Eǫ = 0 and Var ǫ = σ². For some constant ρ, -∞ < ρ < ∞, and for all i ∈ Z, let X_i = ρ X_{i-1} + ǫ_i. This is an autoregressive process of order 1.

AR(1) Process Example: ρ = .5, ǫ ∼ N(0, 1). [Figure not included in the transcription.]
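
A minimal simulation sketch of such a process (assuming Python with NumPy; the starting value of 0 and the sample size are illustrative choices, not taken from the slides):

```python
import numpy as np

def simulate_ar1(rho, n, rng, noise="normal"):
    """Simulate n observations of X_i = rho * X_{i-1} + eps_i, started at X_0 = 0."""
    if noise == "normal":
        eps = rng.standard_normal(n)      # N(0, 1)
    elif noise == "laplace":
        eps = rng.laplace(0.0, 1.0, n)    # DE(1, 0): f(x) = (1/2) exp(-|x|)
    elif noise == "cauchy":
        eps = rng.standard_cauchy(n)      # CAU(1, 0): f(x) = 1 / (pi * (1 + x^2))
    else:
        raise ValueError(f"unknown noise: {noise}")
    x = np.empty(n)
    prev = 0.0                            # illustrative starting value
    for i in range(n):
        prev = rho * prev + eps[i]
        x[i] = prev
    return x

rng = np.random.default_rng(0)
x = simulate_ar1(rho=0.5, n=200, rng=rng)  # the example shown here: rho = .5, eps ~ N(0, 1)
```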

Aggregated AR(1) Processes. What if a financial statistic for each of N companies followed an AR(1) process? X_i^{(j)} = ρ^{(j)} X_{i-1}^{(j)} + ǫ_i^{(j)}, 1 ≤ j ≤ N, 1 ≤ i ≤ n. Only summary statistics might be reported: Y_i = (1/N) Σ_{j=1}^N X_i^{(j)}, 1 ≤ i ≤ n. Is it plausible to consider Y_1, ..., Y_n as an AR(1) process, Y_i = ρ Y_{i-1} + ǫ_i, 1 ≤ i ≤ n?
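
A sketch of how the aggregated series could be formed, reusing the simulate_ar1 helper from the previous sketch (the values of N and n below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
N, n = 100, 50                       # hypothetical number of companies and time periods
rho = np.full(N, 0.5)                # here every company shares the same rho; randomized later
X = np.vstack([simulate_ar1(rho[j], n, rng) for j in range(N)])   # shape (N, n)
Y = X.mean(axis=0)                   # Y_i = (1/N) * sum_j X_i^(j)
```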

Agenda. Discuss some elementary facts about AR(1) processes. Derive an estimator ρ̂ for ρ in an AR(1) process and analyze the distribution of √n(ρ̂ - ρ). Consider aggregations of AR(1) processes where ρ is a random variable and use simulations to test two estimators for Eρ that rely on the aggregated data.

Elementary Facts about AR(1) Processes

Stationary, Predictable Solutions. A solution to an AR(1) process is weakly stationary if EX_i is independent of i and Cov(X_{i+h}, X_i) is independent of i for each integer h. A solution is predictable if X_i is a function of ǫ_i, ǫ_{i-1}, ...

Solutions to AR(1) Processes. An AR(1) process has a unique, stationary, predictable solution if and only if |ρ| < 1. Assume |ρ| < 1. Using X_{i-1} = ρ X_{i-2} + ǫ_{i-1}, recursively expand X_i = ρ X_{i-1} + ǫ_i to get Y_i = Σ_{k=0}^∞ ρ^k ǫ_{i-k} = ǫ_i + ρ ǫ_{i-1} + ρ² ǫ_{i-2} + ··· as the solution. The solution is predictable; it is also defined with probability 1 and satisfies X_i = ρ X_{i-1} + ǫ_i.

Solutions to AR(1) Processes (cont'd). The mean function is µ_Y(i) = 0 and the covariance function is γ_Y(h) = ρ^{|h|} σ² / (1 - ρ²), so the solution is stationary. For |ρ| > 1, there is a unique, stationary, non-predictable solution.
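
One way to verify the stated covariance function from the MA(∞) representation (for h ≥ 0, using the independence and common variance σ² of the ǫ_i):

```latex
\gamma_Y(h) = \operatorname{Cov}\Big(\sum_{k=0}^{\infty}\rho^{k}\epsilon_{i+h-k},\ \sum_{m=0}^{\infty}\rho^{m}\epsilon_{i-m}\Big)
            = \sigma^{2}\sum_{m=0}^{\infty}\rho^{h+m}\rho^{m}
            = \rho^{h}\,\frac{\sigma^{2}}{1-\rho^{2}}.
```

By symmetry γ_Y(h) = ρ^{|h|} σ² / (1 - ρ²) for all integers h, and neither µ_Y nor γ_Y depends on i, which is exactly weak stationarity.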

Solutions to AR(1) Processes - Conclusion. Since we have a unique, stationary, predictable solution if and only if |ρ| < 1, we assume |ρ| < 1 throughout the rest of the presentation. The next step is to have a way to estimate ρ given data X_1, ..., X_n from an AR(1) process.

Deriving ρ̂ and Analyzing √n(ρ̂ - ρ)

Estimating ρ in an AR(1) Process. Least squares estimation: using ǫ_k = X_k - ρ X_{k-1} and minimizing Σ_{k=2}^n (X_k - ρ X_{k-1})² yields ρ̂ = Σ_{k=2}^n X_k X_{k-1} / Σ_{k=2}^n X_{k-1}². This agrees with the maximum likelihood estimator for ǫ ∼ N(0, σ²).
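
A direct NumPy implementation of this estimator (a sketch; rho_hat_lse is a name chosen here, not from the slides):

```python
import numpy as np

def rho_hat_lse(x):
    """Least squares estimator: sum_{k=2}^n X_k X_{k-1} / sum_{k=2}^n X_{k-1}^2."""
    x = np.asarray(x, dtype=float)
    return np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)
```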

Properties of √n(ρ̂ - ρ). By substituting ρ X_{k-1} + ǫ_k for X_k in ρ̂ we derive ρ̂ - ρ = Σ_{k=2}^n X_{k-1} ǫ_k / Σ_{k=2}^n X_{k-1}² ≈ Σ_{k=2}^n X_{k-1} ǫ_k / (n EX_0²). Predictability of X_k implies EX_{k-1} ǫ_k = 0. Thus E√n(ρ̂ - ρ) ≈ 0.
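
The substitution step spelled out:

```latex
\hat\rho
  = \frac{\sum_{k=2}^{n} X_k X_{k-1}}{\sum_{k=2}^{n} X_{k-1}^{2}}
  = \frac{\sum_{k=2}^{n} (\rho X_{k-1} + \epsilon_k) X_{k-1}}{\sum_{k=2}^{n} X_{k-1}^{2}}
  = \rho + \frac{\sum_{k=2}^{n} X_{k-1}\epsilon_k}{\sum_{k=2}^{n} X_{k-1}^{2}}.
```

Replacing the random denominator by n·EX_0², its approximate value for large n by the law of large numbers, gives the second expression.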

Properties of √n(ρ̂ - ρ) (cont'd). Similarly we can show Var √n(ρ̂ - ρ) ≈ σ² / EX_0². If √n(ρ̂ - ρ) is normally distributed we would expect it to be approximately N(0, σ² / EX_0²). We examine this proposition through simulations using several combinations of ρ and ǫ.
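
A Monte Carlo sketch of this check, reusing simulate_ar1 and rho_hat_lse from the earlier sketches (the replication count and sample size are arbitrary). Note that σ² / EX_0² = 1 - ρ² when EX_0² = σ² / (1 - ρ²):

```python
import numpy as np

def sampling_distribution(rho, n, reps, rng, noise="normal"):
    """Simulated draws of sqrt(n) * (rho_hat - rho)."""
    draws = np.empty(reps)
    for r in range(reps):
        x = simulate_ar1(rho, n, rng, noise)
        draws[r] = np.sqrt(n) * (rho_hat_lse(x) - rho)
    return draws

rng = np.random.default_rng(2)
draws = sampling_distribution(rho=0.5, n=500, reps=2000, rng=rng)
print(draws.mean(), draws.var())   # compare with 0 and 1 - 0.5**2 = 0.75
```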

Properties of √n(ρ̂ - ρ) (cont'd). Simulation results (figures not included in the transcription) for: ρ = .1, ǫ ∼ N(0, 1); ρ = .5, ǫ ∼ N(0, 1); ρ = .9, ǫ ∼ N(0, 1); ρ = .99, ǫ ∼ N(0, 1); ρ = .5, ǫ ∼ DE(1, 0) with f(x) = (1/2) e^{-|x|}; ρ = .5, ǫ ∼ CAU(1, 0) with f(x) = 1 / (π(1 + x²)).

Properties of √n(ρ̂ - ρ) - Conclusion. When ǫ is distributed as N(0, 1) or DE(1, 0), √n(ρ̂ - ρ) is distributed approximately as N(0, σ² / EX_0²). We can create confidence intervals or do hypothesis testing for ρ under these circumstances. We will use ǫ distributed as N(0, 1) and DE(1, 0) for our simulations of aggregated AR(1) processes.
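
As an illustration of the confidence-interval remark, a sketch of an approximate 95% interval based on the N(0, σ²/EX_0²) limit, using σ²/EX_0² = 1 - ρ² and plugging in ρ̂ (this particular interval construction is an assumption consistent with the slides, not taken from them):

```python
import numpy as np

def rho_confidence_interval(x, z=1.96):
    """Approximate CI: rho_hat +/- z * sqrt((1 - rho_hat^2) / n), using sigma^2 / E X_0^2 = 1 - rho^2."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    rho_hat = np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)
    half_width = z * np.sqrt(max(1.0 - rho_hat ** 2, 0.0) / n)
    return rho_hat - half_width, rho_hat + half_width
```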

Aggregated AR(1) Processes

Aggregated AR(1) Processes. Assume a financial statistic for each of N companies follows an AR(1) process with ρ^{(j)} randomly distributed: X_i^{(j)} = ρ^{(j)} X_{i-1}^{(j)} + ǫ_i^{(j)}, 1 ≤ j ≤ N, 1 ≤ i ≤ n. We only have summary statistics and we want to estimate Eρ: Y_i = (1/N) Σ_{j=1}^N X_i^{(j)}, 1 ≤ i ≤ n.

Using ρ̂_LSE to estimate Eρ. If we consider Y_1, ..., Y_n as an AR(1) process, Y_i = ρ Y_{i-1} + ǫ_i, 1 ≤ i ≤ n, our initial estimator for Eρ would be the least squares estimator ρ̂_LSE = Σ_{i=2}^n Y_i Y_{i-1} / Σ_{i=2}^n Y_{i-1}².

Using ρ̂_LSE to est. Eρ - Granger and Morris. Important results on the aggregation of stationary time series were obtained by Granger and Morris (1976). In particular, they showed that the sum of two AR(1) processes is not in general an AR(1) process, but a more complex autoregressive moving average (ARMA) process.

Using ρ̂_LSE to est. Eρ - Horváth and Leipus. Results of Horváth and Leipus (2005): If P(0 < ρ < 1) = 1 or P(-1 < ρ < 0) = 1, ρ̂_LSE converges in probability as the number of companies or time periods increases, but the limit is not Eρ. If the distribution of ρ is symmetric around 0, ρ̂_LSE converges to Eρ in probability as the number of companies or time periods goes to infinity.

Using ρ̂_LSE to estimate Eρ (cont'd). We run simulations to test ρ̂_LSE for ρ ∼ U(0, 1) and ρ ∼ U(-1, 1); ǫ ∼ N(0, 1) and ǫ ∼ DE(1, 0); N ∈ {100, 200, 300}; n ∈ {10, 20, 30, 40, 50}.
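
One way such a simulation study might be organized (a sketch reusing simulate_ar1 and rho_hat_lse from the earlier sketches; function and variable names here are illustrative):

```python
import numpy as np

def simulate_aggregated(N, n, rho_dist, rng, noise="normal"):
    """Draw rho^(j) for each company, simulate its AR(1) series, and average across companies."""
    rho = rho_dist(rng, N)
    X = np.vstack([simulate_ar1(rho[j], n, rng, noise) for j in range(N)])
    return X.mean(axis=0)

uniform_0_1 = lambda rng, size: rng.uniform(0.0, 1.0, size)

rng = np.random.default_rng(3)
for N in (100, 200, 300):
    for n in (10, 20, 30, 40, 50):
        Y = simulate_aggregated(N, n, uniform_0_1, rng)
        print(N, n, rho_hat_lse(Y))   # compare with E(rho) = 0.5 for rho ~ U(0, 1)
```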

Simulation Results for ρ ∼ U(0, 1) using ρ̂_LSE. [Figure not included in the transcription.]

Simulation Results for ρ ∼ U(-1, 1) using ρ̂_LSE. [Figure not included in the transcription.]

Using ρ̂_MLSE to estimate Eρ. Horváth and Leipus suggest the following modification of ρ̂_LSE, which should converge to Eρ for ρ ∼ U(0, 1): ρ̂_MLSE = (Σ_{i=2}^n Y_i Y_{i-1} - Σ_{i=4}^n Y_i Y_{i-3}) / (Σ_{i=1}^n Y_i² - Σ_{i=3}^n Y_i Y_{i-2}). We repeat the previous simulations using ρ̂_MLSE instead of ρ̂_LSE.
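
A direct implementation of the modified estimator as written above (a sketch; the numerator and denominator are sample cross-products of the aggregated series at lags 1 and 3, and lags 0 and 2, respectively):

```python
import numpy as np

def rho_hat_mlse(y):
    """Modified LSE:
    (sum_{i=2}^n Y_i Y_{i-1} - sum_{i=4}^n Y_i Y_{i-3}) /
    (sum_{i=1}^n Y_i^2      - sum_{i=3}^n Y_i Y_{i-2})."""
    y = np.asarray(y, dtype=float)
    num = np.sum(y[1:] * y[:-1]) - np.sum(y[3:] * y[:-3])
    den = np.sum(y ** 2) - np.sum(y[2:] * y[:-2])
    return num / den
```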

Simulation Results for ρ ∼ U(0, 1) using ρ̂_MLSE. [Figure not included in the transcription.]

Simulation Results for ρ ∼ U(-1, 1) using ρ̂_MLSE. [Figure not included in the transcription.]

Aggregated AR(1) Processes - Final Thoughts. We did not fully replicate the results of Horváth and Leipus with ρ̂_MLSE for ρ ∼ U(0, 1). We examined a limited range of values for the number of companies and time periods. The AR(1) processes for each company were started with an initial value of 0, which may have resulted in a non-stationary process; throwing away a number of initial values in each series might have overcome this issue.

Questions?

Thank You!
