Time Series Analysis
1 Time Series Analysis
Christopher Ting, Quantitative Finance, Singapore Management University
March 3, 2017 (Week 9, 37 slides)
2 Table of Contents
1 Introduction
2 Stationary Processes
3 AR, MA, & ARMA
4 Sample ACF
5 Yule-Walker Equations
6 Partial ACF
3 Introduction: Learning Objectives
- Understand thoroughly the nature of white noise
- Describe AR, MA, and ARMA time series using white noise as the building block
- Compute the unconditional and conditional means, as well as variances, of these stationary processes
- Describe the autocorrelation function and how the function is estimated and tested for statistical significance (Box and Pierce Q-statistic, Ljung and Box statistic)
- Define the back-shift operator B^n and describe the relationship between MA and AR(∞) processes
- Thoroughly understand the Yule-Walker equations and their application to obtaining the partial autocorrelation function
4 Introduction: Components of a Time Series
In general, a time series Y_t has the following components:
(a) trend-cycle (TC) component: the combined long-term trend and growth-cycle movement of the time series
(b) seasonal (S) component: the systematic variations of the time series
(c) irregular (I) component: the random fluctuations of short-term movements of the time series
5 Introduction: Decomposition (diagram)
A time series splits into a trend-cycle component, a seasonal component, and an irregular component. Seasonality includes calendar effects, namely the moving-holiday effect and the trading-day effect.
6 Introduction: Example: Nike's EPS (figure)
7 Stationary Processes: Basic Building Block - White Noise
Properties of white noise:
- Independently and identically distributed process {u_t}
- Stationary: E(u_t) = 0, V(u_t) = σ_u^2, and C(u_t, u_{t+k}) = 0 for any k ≠ 0
- Stronger definition: u_t is independent of u_{t+k} for all k ≠ 0
What does white noise sound like? Generate white noise with Matlab:
u = randn(8192*10, 1);
Create an audioplayer object: pu = audioplayer(u, 8192);
Listen to white noise: play(pu);
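The defining moments above are easy to check numerically. A minimal sketch in NumPy (an assumption; the slide itself uses Matlab) simulates i.i.d. standard normal noise of the same length and verifies that the sample mean, variance, and lag-1 autocovariance are near their theoretical values:

```python
import numpy as np

# Simulate i.i.d. N(0, 1) white noise, same length as the Matlab vector
rng = np.random.default_rng(42)
u = rng.standard_normal(8192 * 10)

mean_u = u.mean()            # should be near 0
var_u = u.var(ddof=1)        # should be near 1
# sample autocovariance at lag 1 -- should be near 0 for white noise
d = u - mean_u
cov1_u = np.sum(d[:-1] * d[1:]) / len(u)
```

With T = 81,920 observations, all three sample moments land within a few thousandths of their population values.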
8 Stationary Processes: AR, MA, and ARMA
With a basic white noise process {u_t}, t = -∞, ..., ∞, the following processes are generated:
- Autoregressive of order one, the AR(1) process: Y_t = θ + λ Y_{t-1} + u_t
- Moving average of order one, the MA(1) process: Y_t = θ + u_t + α u_{t-1}
- Autoregressive moving average of order one, the ARMA(1,1) process: Y_t = θ + λ Y_{t-1} + u_t + α u_{t-1}
9 AR, MA, & ARMA: Autoregressive Process AR(1)
Repeated substitution for Y_t in the AR(1) process leads to
Y_t = θ + λ(θ + λ Y_{t-2} + u_{t-1}) + u_t = (1 + λ + λ^2 + ...)θ + (u_t + λ u_{t-1} + λ^2 u_{t-2} + ...)
For each t, provided |λ| < 1,
E(Y_t) = (1 + λ + λ^2 + ...)θ = θ/(1 - λ)
V(Y_t) = V(u_t + λ u_{t-1} + λ^2 u_{t-2} + ...) = σ_u^2 (1 + λ^2 + λ^4 + ...) = σ_u^2/(1 - λ^2)
C(Y_t, Y_{t-1}) = C(θ + λ Y_{t-1} + u_t, Y_{t-1}) = λ σ_u^2/(1 - λ^2)
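These closed-form moments can be sanity-checked by simulation. A sketch, borrowing θ = 2.5, λ = 0.5, σ_u^2 = 3 from the later simulation slide (the long sample size n is my choice, not from the slides):

```python
import numpy as np

theta, lam, su2 = 2.5, 0.5, 3.0
mean_theory = theta / (1 - lam)        # θ/(1−λ) = 5.0
var_theory = su2 / (1 - lam**2)        # σ_u²/(1−λ²) = 4.0
cov1_theory = lam * var_theory         # λσ_u²/(1−λ²) = 2.0

rng = np.random.default_rng(0)
n = 200_000
u = rng.normal(0.0, np.sqrt(su2), n)
Y = np.empty(n)
Y[0] = mean_theory                     # start at the unconditional mean
for t in range(1, n):
    Y[t] = theta + lam * Y[t-1] + u[t]

emp_mean = Y.mean()
emp_var = Y.var(ddof=1)
Yc = Y - emp_mean
emp_cov1 = np.mean(Yc[:-1] * Yc[1:])   # sample lag-1 autocovariance
```

With 200,000 draws, the empirical mean, variance, and lag-1 autocovariance sit within sampling error of 5.0, 4.0, and 2.0 respectively.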
10 AR, MA, & ARMA: Autocorrelation Coefficients of AR(1)
Autocovariance at lag k is a function of k only:
C(Y_t, Y_{t-k}) = C(θ Σ_{j=0}^{k-1} λ^j + λ^k Y_{t-k} + Σ_{j=0}^{k-1} λ^j u_{t-j}, Y_{t-k}) = λ^k V(Y_t) ≡ γ(k)
Accordingly, the serial correlation is
Corr(Y_t, Y_{t-k}) = C(Y_t, Y_{t-k})/V(Y_t) = λ^k
For a symmetrical time shift t → t + k,
Corr(Y_{t+k}, Y_t) = C(Y_{t+k}, Y_t)/V(Y_{t+k}) = λ^k
Hence ρ(k) ≡ Corr(Y_t, Y_{t±k}) = λ^k
11 AR, MA, & ARMA: Illustrative Example of an AR(1) Process
Suppose Y_t = Y_{t-1} + u_t, where E(u_t) = 0, V(u_t) = 3.
1 What is the mean of Y_t?
2 What is the variance of Y_t?
3 What is the first-order autocovariance of this AR(1) process?
4 What is the first-order autocorrelation of this AR(1) process?
5 What is the third-order autocorrelation of this AR(1) process?
12 AR, MA, & ARMA: Simulating an AR(1) Process
Simulation parameters: number of samples 5,000; σ_u^2 = 3; θ = 2.5, λ_1 = 0.5

import numpy as np

n = 5000
su2 = 3
theta, lambda1 = 2.5, 0.5
np.random.seed(0)     # seed value elided in the slides; 0 is a placeholder
u = np.random.normal(0, 1, n)
mu, su = np.mean(u), np.std(u, ddof=1)
u -= mu               # ensure that mean of u is really 0
u /= su               # ensure that variance of u is really 1
u *= np.sqrt(su2)     # scale the variance up to su2
Y = np.zeros(n, dtype=float)
Y[0] = theta + u[0]
for t in range(1, n):
    Y[t] = theta + lambda1*Y[t-1] + u[t]
13 AR, MA, & ARMA: Sample Autocorrelation Function of AR(1) (figure)
14 AR, MA, & ARMA: Another Example of AR(1), λ = 0.5 (figure)
15 AR, MA, & ARMA: Moving Average Process MA(1)
For the MA(1) process Y_t = θ + u_t + α u_{t-1},
E(Y_t) = θ
V(Y_t) = (1 + α^2) σ_u^2
C(Y_t, Y_{t-1}) = C(θ + u_t + α u_{t-1}, θ + u_{t-1} + α u_{t-2}) = α σ_u^2
Corr(Y_t, Y_{t-1}) = α/(1 + α^2)
Corr(Y_t, Y_{t-k}) = 0 for k > 1
Hence, the MA(1) process is covariance-stationary with constant mean θ, constant variance σ_u^2 (1 + α^2), and an autocorrelation function that is nonzero only at k = 1.
16 AR, MA, & ARMA: Illustrative Example of an MA(1) Process
Suppose Y_t = u_t + 0.5 u_{t-1}, where E(u_t) = 0, V(u_t) = 3.
1 What is the mean of Y_t?
2 What is the variance of Y_t?
3 What is the first-order autocovariance of this MA(1) process?
4 What is the first-order autocorrelation of this MA(1) process?
5 What is the third-order autocorrelation of this MA(1) process?
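Plugging this example's parameters (θ = 0, α = 0.5, σ_u^2 = 3) into the MA(1) formulas from the previous slide answers all five questions. A quick numeric sketch:

```python
alpha, su2 = 0.5, 3.0            # Y_t = u_t + 0.5 u_{t-1}, V(u_t) = 3, θ = 0

mean_Y = 0.0                     # E(Y_t) = θ = 0
var_Y = (1 + alpha**2) * su2     # (1 + 0.25) * 3 = 3.75
gamma1 = alpha * su2             # α σ_u² = 1.5
rho1 = alpha / (1 + alpha**2)    # 0.5 / 1.25 = 0.4
rho3 = 0.0                       # ACF of an MA(1) vanishes beyond lag 1
```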
17 AR, MA, & ARMA: Simulating an MA(1) Process
Simulation parameters: number of samples 5,000; σ_u^2 = 3; θ = 2.5, α = 0.5

import numpy as np

n = 5000
su2 = 3
theta, alpha = 2.5, 0.5
np.random.seed(0)     # seed value elided in the slides; 0 is a placeholder
u = np.random.normal(0, 1, n)
mu, su = np.mean(u), np.std(u, ddof=1)
u -= mu               # ensure that mean of u is really 0
u /= su               # ensure that variance of u is really 1
u *= np.sqrt(su2)     # scale the variance up to su2
Y = np.zeros(n, dtype=float)
Y[0] = theta + u[0]
for t in range(1, n):
    Y[t] = theta + alpha*u[t-1] + u[t]
18 AR, MA, & ARMA: Sample ACF of MA(1) Process (figure)
19 AR, MA, & ARMA: Autoregressive Moving Average Process ARMA(1,1)
Repeated substitution of Y_t = θ + λ Y_{t-1} + u_t + α u_{t-1} leads to
Y_t = θ Σ_{i=0}^{∞} λ^i + u_t + (λ + α) Σ_{i=0}^{∞} λ^i u_{t-1-i}
For each t, provided |λ| < 1,
E(Y_t) = θ/(1 - λ)
V(Y_t) = σ_u^2 (1 + (λ + α)^2 Σ_{i=0}^{∞} λ^{2i}) = σ_u^2 (1 + (λ + α)^2/(1 - λ^2))
C(Y_t, Y_{t-1}) = λ V(Y_{t-1}) + α σ_u^2
C(Y_t, Y_{t-k}) = λ C(Y_t, Y_{t-k+1}) = λ^{k-1} C(Y_t, Y_{t-1}), for k > 1
20 AR, MA, & ARMA: Autoregressive Moving Average Process ARMA(1,1) (cont'd)
Hence, the ARMA(1,1) process is covariance-stationary, with constant mean θ/(1 - λ), constant variance σ_u^2 (1 + (λ + α)^2/(1 - λ^2)), and autocovariance a function of k only.
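The ARMA(1,1) moment formulas can be verified by simulation in the same way as for AR(1). In this sketch θ and λ are taken from the earlier simulation slide, while α = 0.3 and the sample size are illustrative choices of mine:

```python
import numpy as np

theta, lam, alpha, su2 = 2.5, 0.5, 0.3, 3.0   # alpha is an assumed value

mean_theory = theta / (1 - lam)
var_theory = su2 * (1 + (lam + alpha)**2 / (1 - lam**2))
cov1_theory = lam * var_theory + alpha * su2
cov2_theory = lam * cov1_theory                # autocovariances decay by λ beyond lag 1

rng = np.random.default_rng(1)
n = 200_000
u = rng.normal(0.0, np.sqrt(su2), n)
Y = np.empty(n)
Y[0] = mean_theory
for t in range(1, n):
    Y[t] = theta + lam * Y[t-1] + u[t] + alpha * u[t-1]

Yc = Y - Y.mean()
emp_var = Y.var(ddof=1)
emp_cov1 = np.mean(Yc[:-1] * Yc[1:])
emp_cov2 = np.mean(Yc[:-2] * Yc[2:])
```

The lag-2 check also confirms the geometric decay of the autocovariances beyond lag 1.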
21 AR, MA, & ARMA: Sample ACF of ARMA(1,1) Process (figure)
22 AR, MA, & ARMA: Autocorrelation Function of ARMA(1,1)
Question 1: Derive the autocorrelation function of the ARMA(1,1) process.
Question 2: If λ = -α, what is the ACF of ARMA(1,1) at lag 1?
23 AR, MA, & ARMA: Changing Conditional Means
AR(1)
- At t + 1, information about Y_t is already known, so for the AR(1) process the conditional mean is a random variable:
E(Y_{t+1} | Y_t) = θ + λ Y_t + E(u_{t+1} | Y_t) = θ + λ Y_t ≠ θ/(1 - λ)
- Conditional variance is constant:
V(Y_{t+1} | Y_t) = V(u_{t+1} | Y_t) = σ_u^2 < σ_u^2/(1 - λ^2)
MA(1)
- Conditional mean is stochastic but conditional variance is constant:
E(Y_{t+1} | Y_t) = θ + α u_t
V(Y_{t+1} | Y_t) = σ_u^2 < σ_u^2 (1 + α^2)
24 Sample ACF: Sample Autocorrelation Function
Sample autocovariance at lag k, for k = 0, 1, 2, ..., p, with p < T/4:
c(k) = (1/T) Σ_{t=1}^{T-k} (Y_t - Ȳ)(Y_{t+k} - Ȳ)
The estimator c(k) is consistent: lim_{T→∞} c(k) = C(Y_t, Y_{t-k}).
The sample autocorrelation at lag k is then
r(k) = c(k)/c(0) = Σ_{t=1}^{T-k} (Y_t - Ȳ)(Y_{t+k} - Ȳ) / Σ_{t=1}^{T} (Y_t - Ȳ)^2
which is also consistent: lim_{T→∞} r(k) = ρ(k).
Variance of r(k) for an AR(1) process:
V(r(k)) ≈ (1/T) [ (1 + λ^2)(1 - λ^{2k})/(1 - λ^2) - 2k λ^{2k} ]
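The estimator c(k) and the ratio r(k) translate directly into a few lines of NumPy. A sketch (the helper name `sample_acf` and the test parameters are my own) that also checks r(k) against the AR(1) theory ρ(k) = λ^k:

```python
import numpy as np

def sample_acf(Y, max_lag):
    """Sample ACF r(k) = c(k)/c(0), with c(k) defined as on the slide
    (divisor T, summation running up to T - k)."""
    Y = np.asarray(Y, dtype=float)
    T = len(Y)
    d = Y - Y.mean()
    c0 = np.sum(d * d) / T
    return np.array([np.sum(d[:T - k] * d[k:]) / T / c0
                     for k in range(max_lag + 1)])

# Check against the AR(1) theory rho(k) = lam**k
rng = np.random.default_rng(7)
n, lam = 100_000, 0.5
u = rng.standard_normal(n)
Y = np.empty(n)
Y[0] = 0.0
for t in range(1, n):
    Y[t] = lam * Y[t-1] + u[t]

r = sample_acf(Y, 3)   # should be close to [1, 0.5, 0.25, 0.125]
```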
25 Sample ACF: Test for ρ = 0, AR(1)
Test the hypothesis H_0: ρ(j) = 0 for all j > 0, i.e., H_0: λ = 0. Then V(r(k)) ≈ 1/T.
To test that the j-th autocorrelation is zero, the test statistic is
z_j = (r(j) - 0)/√(1/T) →d N(0, 1)
Reject H_0: ρ(j) = 0 at the 5% significance level if |z_j| > 1.96.
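Since √(1/T) is the standard error under the null, the statistic reduces to √T · r(j). A sketch with a hypothetical helper name:

```python
import numpy as np

def acf_z_test(r_j, T, crit=1.96):
    """z-statistic for H0: rho(j) = 0, using Var(r(j)) ~ 1/T under the null.
    Returns (z, reject_at_5pct). Hypothetical helper, for illustration."""
    z = r_j / np.sqrt(1.0 / T)        # equals sqrt(T) * r(j)
    return z, abs(z) > crit

z_big, reject_big = acf_z_test(0.05, 5000)       # sqrt(5000)*0.05 ~ 3.54 -> reject
z_small, reject_small = acf_z_test(0.01, 5000)   # ~ 0.71 -> fail to reject
```

Note how even a small autocorrelation like 0.05 becomes significant at T = 5,000, because the standard error shrinks like 1/√T.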
26 Sample ACF: Test for ρ = 0, MA(q)
For MA(q) processes, the variance of r(k), for k > q, is
V(r(k)) ≈ (1/T) (1 + 2 Σ_{i=1}^{q} ρ(i)^2)
Thus for the MA(1) process, V(r(k)) for k > 1 is estimated by
(1/T) (1 + 2 r(1)^2) > 1/T
and rejection of the null hypothesis allows the identification of MA(1).
27 Sample ACF: Joint Test Statistic
To test whether all autocorrelations are jointly zero, the null hypothesis is
H_0: ρ(1) = ρ(2) = ... = ρ(m) = 0
The Box and Pierce Q-statistic for the asymptotic test is
Q_m = T Σ_{k=1}^{m} r(k)^2 = Σ_{k=1}^{m} (√T r(k))^2 = Σ_{k=1}^{m} z_k^2 →d χ^2_m
The Ljung and Box test statistic provides an approximate correction for finite samples:
Q_m = T(T + 2) Σ_{k=1}^{m} r(k)^2/(T - k) →d χ^2_m
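Both Q-statistics are one-line sums over the sample autocorrelations. A sketch (helper names are mine; r holds r(1), ..., r(m)):

```python
import numpy as np

def box_pierce(r, T):
    """Box-Pierce Q = T * sum of r(k)^2 over lags 1..m."""
    r = np.asarray(r, dtype=float)
    return T * np.sum(r**2)

def ljung_box(r, T):
    """Ljung-Box Q* = T(T+2) * sum of r(k)^2 / (T - k) over lags 1..m."""
    r = np.asarray(r, dtype=float)
    k = np.arange(1, len(r) + 1)
    return T * (T + 2) * np.sum(r**2 / (T - k))

bp = box_pierce([0.1, 0.05], 100)   # 100 * (0.01 + 0.0025) = 1.25
lb = ljung_box([0.1, 0.05], 100)    # slightly larger finite-sample correction
```

The Ljung-Box value always exceeds the Box-Pierce value in finite samples, since each term is inflated by (T + 2)/(T - k) > 1.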
28 Sample ACF: ARMA(p, q) Processes
An AR(p) process is ARMA(p, 0); an MA(q) process is ARMA(0, q).
Backward shift operator B: B Y_t ≡ Y_{t-1}. Applying the backward shift operator k times: B^k Y_t = Y_{t-k}.
The AR(p) process
Y_t = θ + λ_1 Y_{t-1} + λ_2 Y_{t-2} + ... + λ_p Y_{t-p} + u_t = θ + Σ_{i=1}^{p} λ_i B^i Y_t + u_t
can be written as
φ(B) Y_t ≡ (1 - Σ_{i=1}^{p} λ_i B^i) Y_t = θ + u_t
The equation φ(B) = 0 is called the characteristic equation of the AR(p) process. For AR(p) to be stationary, the roots of the characteristic equation must lie outside the unit circle.
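The root condition is mechanical to check: form the polynomial φ(z) = 1 - λ_1 z - ... - λ_p z^p and inspect the moduli of its roots. A sketch with a hypothetical helper name:

```python
import numpy as np

def ar_is_stationary(lams):
    """True if all roots of phi(z) = 1 - lam_1 z - ... - lam_p z^p
    lie strictly outside the unit circle."""
    # np.roots expects coefficients from the highest power down to the constant
    coeffs = [-l for l in reversed(lams)] + [1.0]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

ok1 = ar_is_stationary([0.5])        # AR(1), root z = 2: stationary
ok2 = ar_is_stationary([1.2])        # AR(1), root z = 1/1.2 inside circle: not stationary
ok3 = ar_is_stationary([0.5, 0.3])   # AR(2) example: stationary
```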
29 Sample ACF: Representation of MA(1)
The MA(1) process is representable as an AR(∞) process:
Y_t = θ + u_t + α u_{t-1} = θ + (1 + αB) u_t
(1 + αB)^{-1} Y_t = (1 + αB)^{-1} θ + u_t
(1 - αB + α^2 B^2 - α^3 B^3 + ...) Y_t = c + u_t
where c ≡ (1 + αB)^{-1} θ. Hence
Y_t = c + α Y_{t-1} - α^2 Y_{t-2} + α^3 Y_{t-3} - α^4 Y_{t-4} + ... + u_t
This inversion requires the root of (1 + αB) = 0 to lie outside the unit circle, which is satisfied if |α| < 1.
A stationary MA(q) process is said to be invertible if it can be represented as a stationary AR(∞) process. What's the point of inverting an MA process?
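One practical payoff of invertibility: the AR(∞) form lets the unobserved shocks u_t be recovered from the observed Y_t alone, via the rearrangement u_t = Y_t - α u_{t-1}. A sketch (θ = 0 and all parameter choices are mine) showing that even a wrong starting value washes out at the rate α^t:

```python
import numpy as np

alpha = 0.5
rng = np.random.default_rng(3)
n = 10_000
u = rng.standard_normal(n)
Y = np.empty(n)
Y[0] = u[0]
for t in range(1, n):
    Y[t] = u[t] + alpha * u[t-1]       # MA(1) with theta = 0

# Recover the shocks recursively: u_t = Y_t - alpha * u_{t-1}.
# Start from a deliberately wrong initial value; because |alpha| < 1,
# the initialization error decays like alpha**t.
u_hat = np.empty(n)
u_hat[0] = 0.0
for t in range(1, n):
    u_hat[t] = Y[t] - alpha * u_hat[t-1]

max_err = np.max(np.abs(u_hat[100:] - u[100:]))   # essentially zero past the burn-in
```

If |α| > 1 the same recursion explodes, which is exactly why invertibility matters for identifying the shocks.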
30 Yule-Walker Equations (1)
Multiply both sides of the AR(p) process by Y_{t-k}:
Y_{t-k} Y_t = Y_{t-k} θ + λ_1 Y_{t-k} Y_{t-1} + λ_2 Y_{t-k} Y_{t-2} + ... + λ_p Y_{t-k} Y_{t-p} + Y_{t-k} u_t
Taking unconditional expectations on both sides and noting that E(Y_{t-k} Y_t) = γ(k) + μ^2 for any k,
γ(k) + μ^2 = μθ + λ_1 (γ(k-1) + μ^2) + λ_2 (γ(k-2) + μ^2) + ... + λ_p (γ(k-p) + μ^2)
The unconditional mean of an AR(p) process is μ = θ + λ_1 μ + ... + λ_p μ, so μ^2 = μθ + λ_1 μ^2 + ... + λ_p μ^2. Hence for k ≥ 1,
γ(k) = λ_1 γ(k-1) + λ_2 γ(k-2) + ... + λ_p γ(k-p)
Dividing both sides by γ(0) to obtain correlations:
ρ(k) = λ_1 ρ(k-1) + λ_2 ρ(k-2) + ... + λ_p ρ(k-p)
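The recursion above generates the entire ACF of an AR(p) from its first p values. A sketch for an AR(2), where the coefficient values λ_1 = 0.5, λ_2 = 0.3 are illustrative assumptions (not from the slides), and ρ(1) comes from the k = 1 equation ρ(1) = λ_1 + λ_2 ρ(1):

```python
lam1, lam2 = 0.5, 0.3                 # assumed AR(2) coefficients

rho = [1.0, lam1 / (1 - lam2)]        # rho(0) = 1; rho(1) solved from the k = 1 equation
for k in range(2, 6):
    # Yule-Walker recursion: rho(k) = lam1*rho(k-1) + lam2*rho(k-2)
    rho.append(lam1 * rho[k-1] + lam2 * rho[k-2])
```

The resulting sequence decays geometrically, as expected for a stationary AR process.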
31 Yule-Walker Equations (2)
Note that ρ(0) = 1 and ρ(-j) = ρ(j). For each k there is a corresponding equation in the p parameters λ_1, λ_2, ..., λ_p, resulting in the Yule-Walker equations:
ρ(1) = λ_1 + λ_2 ρ(1) + ... + λ_p ρ(p-1)
ρ(2) = λ_1 ρ(1) + λ_2 + ... + λ_p ρ(p-2)
ρ(3) = λ_1 ρ(2) + λ_2 ρ(1) + ... + λ_p ρ(p-3)
...
ρ(p) = λ_1 ρ(p-1) + λ_2 ρ(p-2) + ... + λ_p
32 Yule-Walker Equations: Matrix Form (1)
Replacing the ACF ρ(k) by the sample ACF r(k), the p linear equations can be solved. Define
R ≡ [r(1), r(2), ..., r(p)]',  Λ ≡ [λ_1, λ_2, ..., λ_p]',
and the Toeplitz matrix of sample autocorrelations
Φ ≡
| 1        r(1)     r(2)     ...  r(p-1) |
| r(1)     1        r(1)     ...  r(p-2) |
| r(2)     r(1)     1        ...  r(p-3) |
| ...                                    |
| r(p-1)   r(p-2)   r(p-3)   ...  1      |
33 Yule-Walker Equations: Matrix Form (2)
Written compactly as R = ΦΛ, the parameters can be estimated as
Λ = Φ^{-1} R,
along with
μ = (1/T) Σ_{t=1}^{T} Y_t
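Solving Λ = Φ^{-1} R is a single linear solve once Φ is built from the sample ACF. A sketch (the helper name is mine) checked on the exact AR(1) ACF r(k) = 0.5^k, for which the solution must be λ_1 = 0.5, λ_2 = 0:

```python
import numpy as np

def yule_walker(r):
    """Solve the Yule-Walker system Phi @ Lambda = R for Lambda,
    given r = [r(1), ..., r(p)]. Hypothetical helper, for illustration."""
    r = np.asarray(r, dtype=float)
    p = len(r)
    acf = np.concatenate(([1.0], r))   # acf[|i - j|] fills the Toeplitz matrix Phi
    Phi = np.array([[acf[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(Phi, r)

lam_hat = yule_walker([0.5, 0.25])     # exact AR(1) ACF with lambda = 0.5
```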
34 Partial ACF: Partial Autocorrelation Function
- The sample ACF allows the identification of either an AR or an MA process, depending on whether the sample r(k) decays slowly or is clearly zero after some lag k.
- However, even if an AR(p) is identified, it is still difficult to identify the order of the lag, p, since all AR(p) processes show similar decay patterns in the ACF.
- Is it possible to identify the order of an AR process?
- Yes: use the Yule-Walker equations to generate the sequence λ_kk, k = 1, 2, ..., p, which is the PACF.
35 Partial ACF: Sample PACF Calculation (1)
Suppose the true process is AR(p). Take k = 1 and, based on AR(1),
Y_t = θ + λ_11 Y_{t-1} + u_t
The Yule-Walker equation for k = 1 gives λ_11 = r(1).
Next take k = 2, i.e., assume that it is AR(2). The Yule-Walker system is
[λ_21, λ_22]' = [[1, r(1)], [r(1), 1]]^{-1} [r(1), r(2)]'
The solution for λ_22 is
λ_22 = (r(2) - r(1)^2)/(1 - r(1)^2)
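The closed form for λ_22 is easy to cross-check against a direct solve of the 2x2 system. A sketch (helper names are mine) using two exact ACFs: an AR(1) with λ = 0.5, where λ_22 must vanish, and an MA(1) with α = 0.5, where ρ(1) = 0.4, ρ(2) = 0 give a nonzero λ_22:

```python
import numpy as np

def pacf_lag2(r1, r2):
    """Closed-form lambda_22 from the slide."""
    return (r2 - r1**2) / (1 - r1**2)

def pacf_lag2_solve(r1, r2):
    """Same quantity by solving the 2x2 Yule-Walker system directly."""
    Phi = np.array([[1.0, r1], [r1, 1.0]])
    return np.linalg.solve(Phi, np.array([r1, r2]))[1]

ar1_case = pacf_lag2(0.5, 0.25)   # exact AR(1) ACF: lambda_22 = 0
ma1_case = pacf_lag2(0.4, 0.0)    # exact MA(1) ACF: lambda_22 = -0.16/0.84
```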
36 Partial ACF: Sample PACF Calculation (2)
In general, to calculate the p-th value, take k = p, so that the AR(p) is
Y_t = θ + λ_p1 Y_{t-1} + λ_p2 Y_{t-2} + ... + λ_pp Y_{t-p} + u_t
and solve the Yule-Walker equations Λ = Φ^{-1} R.
If the true order is p, then theoretically
λ_11 ≠ 0, λ_22 ≠ 0, ..., λ_pp ≠ 0 and λ_{p+1,p+1} = λ_{p+2,p+2} = ... = 0
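Stacking these successive Yule-Walker solves gives the whole PACF sequence λ_kk. A sketch (the helper name is mine), checked on the exact ACF of an AR(1) with λ = 0.6, for which the PACF must cut off after lag 1:

```python
import numpy as np

def sample_pacf(r):
    """PACF lambda_kk for k = 1..p: the last coefficient of each successive
    k-dimensional Yule-Walker solution. Hypothetical helper, for illustration."""
    r = np.asarray(r, dtype=float)
    acf = np.concatenate(([1.0], r))
    out = []
    for k in range(1, len(r) + 1):
        Phi = np.array([[acf[abs(i - j)] for j in range(k)] for i in range(k)])
        out.append(np.linalg.solve(Phi, r[:k])[-1])
    return np.array(out)

# Exact ACF of an AR(1) with lambda = 0.6: rho(k) = 0.6**k
pacf = sample_pacf(0.6 ** np.arange(1, 6))
```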
37 Partial ACF: Duality between AR and MA, ACF and PACF
While an AR(p) process has a decaying ACF infinite in extent, its PACF cuts off after lag p.
Recall that an MA(1) process is invertible into AR(∞). In general, this property holds for invertible MA(q) processes. So while the ACF of an MA(q) process cuts off after lag q, its PACF is infinite in extent.
More informationAn introduction to volatility models with indices
Applied Mathematics Letters 20 (2007) 177 182 www.elsevier.com/locate/aml An introduction to volatility models with indices S. Peiris,A.Thavaneswaran 1 School of Mathematics and Statistics, The University
More informationCh. 14 Stationary ARMA Process
Ch. 14 Stationary ARMA Process A general linear stochastic model is described that suppose a time series to be generated by a linear aggregation of random shock. For practical representation it is desirable
More informationTopic 4 Unit Roots. Gerald P. Dwyer. February Clemson University
Topic 4 Unit Roots Gerald P. Dwyer Clemson University February 2016 Outline 1 Unit Roots Introduction Trend and Difference Stationary Autocorrelations of Series That Have Deterministic or Stochastic Trends
More informationLINEAR STOCHASTIC MODELS
LINEAR STOCHASTIC MODELS Let {x τ+1,x τ+2,...,x τ+n } denote n consecutive elements from a stochastic process. If their joint distribution does not depend on τ, regardless of the size of n, then the process
More informationNon-Stationary Time Series and Unit Root Testing
Econometrics II Non-Stationary Time Series and Unit Root Testing Morten Nyboe Tabor Course Outline: Non-Stationary Time Series and Unit Root Testing 1 Stationarity and Deviation from Stationarity Trend-Stationarity
More informationGeneralised AR and MA Models and Applications
Chapter 3 Generalised AR and MA Models and Applications 3.1 Generalised Autoregressive Processes Consider an AR1) process given by 1 αb)x t = Z t ; α < 1. In this case, the acf is, ρ k = α k for k 0 and
More informationTime Series Analysis
Time Series Analysis hm@imm.dtu.dk Informatics and Mathematical Modelling Technical University of Denmark DK-2800 Kgs. Lyngby 1 Outline of the lecture Chapter 9 Multivariate time series 2 Transfer function
More informationTime Series Analysis. Solutions to problems in Chapter 5 IMM
Time Series Analysis Solutions to problems in Chapter 5 IMM Solution 5.1 Question 1. [ ] V [X t ] = V [ǫ t + c(ǫ t 1 + ǫ t + )] = 1 + c 1 σǫ = The variance of {X t } is not limited and therefore {X t }
More informationCh 8. MODEL DIAGNOSTICS. Time Series Analysis
Model diagnostics is concerned with testing the goodness of fit of a model and, if the fit is poor, suggesting appropriate modifications. We shall present two complementary approaches: analysis of residuals
More informationEconometrics II Heij et al. Chapter 7.1
Chapter 7.1 p. 1/2 Econometrics II Heij et al. Chapter 7.1 Linear Time Series Models for Stationary data Marius Ooms Tinbergen Institute Amsterdam Chapter 7.1 p. 2/2 Program Introduction Modelling philosophy
More informationWe use the centered realization z t z in the computation. Also used in computing sample autocovariances and autocorrelations.
Stationary Time Series Models Part 1 MA Models and Their Properties Class notes for Statistics 41: Applied Time Series Ioa State University Copyright 1 W. Q. Meeker. Segment 1 ARMA Notation, Conventions,
More informationModelling using ARMA processes
Modelling using ARMA processes Step 1. ARMA model identification; Step 2. ARMA parameter estimation Step 3. ARMA model selection ; Step 4. ARMA model checking; Step 5. forecasting from ARMA models. 33
More informationRoss Bettinger, Analytical Consultant, Seattle, WA
ABSTRACT DYNAMIC REGRESSION IN ARIMA MODELING Ross Bettinger, Analytical Consultant, Seattle, WA Box-Jenkins time series models that contain exogenous predictor variables are called dynamic regression
More informationNonlinear time series
Based on the book by Fan/Yao: Nonlinear Time Series Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 27, 2009 Outline Characteristics of
More informationAR(p) + I(d) + MA(q) = ARIMA(p, d, q)
AR(p) + I(d) + MA(q) = ARIMA(p, d, q) Outline 1 4.1: Nonstationarity in the Mean 2 ARIMA Arthur Berg AR(p) + I(d)+ MA(q) = ARIMA(p, d, q) 2/ 19 Deterministic Trend Models Polynomial Trend Consider the
More informationLinear Stochastic Models. Special Types of Random Processes: AR, MA, and ARMA. Digital Signal Processing
Linear Stochastic Models Special Types of Random Processes: AR, MA, and ARMA Digital Signal Processing Department of Electrical and Electronic Engineering, Imperial College d.mandic@imperial.ac.uk c Danilo
More informationEcon 424 Time Series Concepts
Econ 424 Time Series Concepts Eric Zivot January 20 2015 Time Series Processes Stochastic (Random) Process { 1 2 +1 } = { } = sequence of random variables indexed by time Observed time series of length
More informationLecture 4a: ARMA Model
Lecture 4a: ARMA Model 1 2 Big Picture Most often our goal is to find a statistical model to describe real time series (estimation), and then predict the future (forecasting) One particularly popular model
More informationStationary Stochastic Time Series Models
Stationary Stochastic Time Series Models When modeling time series it is useful to regard an observed time series, (x 1,x,..., x n ), as the realisation of a stochastic process. In general a stochastic
More informationStochastic processes: basic notions
Stochastic processes: basic notions Jean-Marie Dufour McGill University First version: March 2002 Revised: September 2002, April 2004, September 2004, January 2005, July 2011, May 2016, July 2016 This
More informationNon-Stationary Time Series and Unit Root Testing
Econometrics II Non-Stationary Time Series and Unit Root Testing Morten Nyboe Tabor Course Outline: Non-Stationary Time Series and Unit Root Testing 1 Stationarity and Deviation from Stationarity Trend-Stationarity
More informationE 4101/5101 Lecture 6: Spectral analysis
E 4101/5101 Lecture 6: Spectral analysis Ragnar Nymoen 3 March 2011 References to this lecture Hamilton Ch 6 Lecture note (on web page) For stationary variables/processes there is a close correspondence
More informationLecture note 2 considered the statistical analysis of regression models for time
DYNAMIC MODELS FOR STATIONARY TIME SERIES Econometrics 2 LectureNote4 Heino Bohn Nielsen March 2, 2007 Lecture note 2 considered the statistical analysis of regression models for time series data, and
More informationTime Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY
Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY & Contents PREFACE xiii 1 1.1. 1.2. Difference Equations First-Order Difference Equations 1 /?th-order Difference
More informationTime Series Analysis -- An Introduction -- AMS 586
Time Series Analysis -- An Introduction -- AMS 586 1 Objectives of time series analysis Data description Data interpretation Modeling Control Prediction & Forecasting 2 Time-Series Data Numerical data
More informationFORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL
FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL B. N. MANDAL Abstract: Yearly sugarcane production data for the period of - to - of India were analyzed by time-series methods. Autocorrelation
More informationLecture 1: Stationary Time Series Analysis
Syllabus Stationarity ARMA AR MA Model Selection Estimation Lecture 1: Stationary Time Series Analysis 222061-1617: Time Series Econometrics Spring 2018 Jacek Suda Syllabus Stationarity ARMA AR MA Model
More information