Class 1: Stationary Time Series Analysis
1 Class 1: Stationary Time Series Analysis
Macroeconometrics - Fall 2009. Jacek Suda, BdF and PSE. February 28, 2011.
2 Outline
1. Covariance-Stationary Processes
2. Wold Decomposition Theorem
3. ARMA Models
4. AR(1) Model
5. Auto-Correlation Function (ACF)
3 Stochastic Process
A stochastic process is a collection of random variables {..., Y_{-1}, Y_0, Y_1, Y_2, ..., Y_T, ...} = {Y_t}. The observed series {y_1, y_2, ..., y_T} is a realization of the stochastic process. We want a model for {Y_t} to explain the observed realizations {y_t}_{t=1}^T.
4 Covariance-Stationary
Definition: {Y_t} is covariance-stationary (weakly stationary) if
(i) E[Y_t] = µ for all t,
(ii) Cov(Y_t, Y_{t-j}) = E[(Y_t - µ)(Y_{t-j} - µ)] = γ_j for all t, j.
Notes:
- The mean is time-invariant.
- The covariance depends on the lag j but not on t.
- Var(Y_t) = γ_0, so the variance is also constant.
- It is called weak stationarity because it restricts only the first two moments; higher moments can be time-varying.
Examples:
1. Y_t ~ iid(0, σ²): {Y_t} is white noise (WN).
2. Y_t ~ iid N(0, σ²): Gaussian white noise.
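The slides give white noise as the canonical covariance-stationary process. As a quick numerical sketch (the simulation setup and all names below are mine, not from the slides), one can simulate Gaussian white noise and check the two conditions empirically: constant mean, and autocovariances γ_j that vanish for j ≥ 1.

```python
import random
import statistics

# Sketch: simulate Gaussian white noise Y_t ~ iid N(0, sigma^2) and check
# the covariance-stationarity conditions empirically: mean ~ 0,
# gamma_0 ~ sigma^2, and gamma_j ~ 0 for j >= 1.
random.seed(0)
sigma = 2.0
T = 100_000
y = [random.gauss(0.0, sigma) for _ in range(T)]

mean = statistics.fmean(y)

def autocov(series, j):
    """Sample autocovariance gamma_j = E[(Y_t - mu)(Y_{t-j} - mu)]."""
    m = statistics.fmean(series)
    n = len(series)
    return sum((series[t] - m) * (series[t - j] - m) for t in range(j, n)) / n

gamma0 = autocov(y, 0)   # should be close to sigma^2 = 4
gamma1 = autocov(y, 1)   # should be close to 0
print(mean, gamma0, gamma1)
```

For Gaussian white noise, weak and strict stationarity coincide, since the normal distribution is fully determined by its first two moments.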
10 Strict (Strong) Stationarity
Definition: {Y_t} is strictly (strongly) stationary if for any values j_1, j_2, ..., j_n the joint distribution of (Y_t, Y_{t+j_1}, Y_{t+j_2}, ..., Y_{t+j_n}) depends only on the intervals separating the dates (j_1, j_2, ..., j_n) and not on the date t itself. Equivalently, for all τ, t_1, t_2, ..., t_n:
F_Y(y_{t_1}, y_{t_2}, ..., y_{t_n}) = F_Y(y_{t_1+τ}, y_{t_2+τ}, ..., y_{t_n+τ}).
If a process is strictly stationary with a finite second moment, it is also covariance-stationary. Conversely, normality plus covariance stationarity implies strong stationarity, because the whole Gaussian distribution is determined by the first two moments.
14 Nonstationary Processes
Examples:
1. Deterministic trend: Y_t = βt + ε_t, ε_t ~ WN, where t is a time trend (the deterministic part) and ε_t is the stochastic component. E[Y_t] = βt depends on t, so {Y_t} is not covariance-stationary. But the detrended series X_t = Y_t - βt is covariance-stationary.
2. Random walk: Y_t = Y_{t-1} + ε_t, ε_t ~ WN, with Y_0 constant. Solving recursively:
Y_t = Σ_{j=1}^t ε_j + Y_0.
E[Y_t] = Y_0 is time-invariant, but Var(Y_t) = tσ² depends on t. The first difference X_t = Y_t - Y_{t-1} = ε_t is covariance-stationary.
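The random walk's failure of stationarity shows up in its variance. A Monte Carlo sketch (parameters and names are mine, chosen for illustration): simulate many random-walk paths, estimate Var(Y_t) across paths at two dates, and check that it grows roughly linearly in t while the first difference has constant variance.

```python
import random
import statistics

# Sketch: for a random walk Y_t = Y_{t-1} + eps_t with Y_0 = 0 and
# eps_t ~ N(0, 1), Var(Y_t) = t * sigma^2 grows with t, while the first
# difference X_t = Y_t - Y_{t-1} = eps_t has time-invariant variance sigma^2.
random.seed(1)
sigma, T, reps = 1.0, 200, 4000

def walk(T):
    y, level = [], 0.0
    for _ in range(T):
        level += random.gauss(0.0, sigma)
        y.append(level)
    return y

paths = [walk(T) for _ in range(reps)]

def var_at(t):
    """Cross-path estimate of Var(Y_t) (t is 1-indexed)."""
    return statistics.pvariance([p[t - 1] for p in paths])

# Variance of the first difference at the end of the sample: ~ sigma^2.
dvar = statistics.pvariance([p[199] - p[198] for p in paths])

print(var_at(50), var_at(200), dvar)  # roughly 50, 200, and 1
```

The growing variance is exactly why no amount of detrending fixes a random walk; differencing does.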
21 Wold's Decomposition Theorem
Any covariance-stationary {Y_t} has an infinite-order moving-average representation:
Y_t = Σ_{j=0}^∞ ψ_j ε_{t-j} + κ_t,  ψ_0 = 1,  Σ_{j=0}^∞ ψ_j² < ∞,  ε_t ~ WN(0, σ²),
where κ_t is a deterministic term (perfectly forecastable), e.g. κ_t = µ for a constant mean. Y_t is a linear combination of the innovations ε over time; the weights ψ_j do not depend on the date t, only on j, i.e. on how long ago the shock ε occurred.
24 Wold's Decomposition Theorem - Illustration
Let X_t = Y_t - κ_t. Then
E[X_t] = Σ_{j=0}^∞ ψ_j E[ε_{t-j}] = 0,
E[X_t²] = Σ_{j=0}^∞ ψ_j² E[ε²_{t-j}] = σ² Σ_{j=0}^∞ ψ_j² < ∞,
as the ε_t are independent; the variance is constant and finite. For the autocovariance,
E[X_t X_{t-j}] = E[(ε_t + ψ_1 ε_{t-1} + ψ_2 ε_{t-2} + ...)(ε_{t-j} + ψ_1 ε_{t-j-1} + ψ_2 ε_{t-j-2} + ...)]
= σ²(ψ_j + ψ_{j+1}ψ_1 + ψ_{j+2}ψ_2 + ...) = σ² Σ_{k=0}^∞ ψ_k ψ_{k+j},
which depends on j, not on t. So we have a covariance-stationary process in mean and variance.
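The autocovariance formula γ_j = σ² Σ_k ψ_k ψ_{k+j} can be checked numerically. As an illustrative (my own) choice of Wold coefficients, take ψ_k = φ^k, for which the sum has the closed form σ² φ^j / (1 - φ²); a truncated sum should match it to high accuracy because the coefficients decay geometrically.

```python
# Sketch: verify gamma_j = sigma^2 * sum_k psi_k psi_{k+j} against its
# closed form for the illustrative choice psi_k = phi^k (an AR(1)'s Wold
# coefficients), where gamma_j = sigma^2 * phi^j / (1 - phi^2).
phi, sigma2 = 0.8, 1.5
K = 500  # truncation point; phi**500 is negligibly small

psi = [phi**k for k in range(K)]

def gamma(j):
    """Truncated version of sigma^2 * sum_{k>=0} psi_k psi_{k+j}."""
    return sigma2 * sum(psi[k] * psi[k + j] for k in range(K - j))

def closed(j):
    """Closed-form autocovariance for psi_k = phi^k."""
    return sigma2 * phi**j / (1 - phi**2)

for j in (0, 1, 5):
    print(j, gamma(j), closed(j))
```

The square-summability condition Σ ψ_j² < ∞ is precisely what makes the truncation harmless here.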
25 ARMA Models
ARMA models approximate the Wold form with a finite number of parameters.
Wold form: Y_t - µ = Σ_{j=0}^∞ ψ_j ε_{t-j}, ε_t ~ WN.
ARMA(p, q): Y_t - µ = φ_1(Y_{t-1} - µ) + ... + φ_p(Y_{t-p} - µ) + ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}.
26 Lag Operator
Define the operator L by L X_t ≡ X_{t-1}, so L² X_t = L(L X_t) = X_{t-2} and, in general, L^k X_t = X_{t-k}. If c is a constant, Lc = c. Also, L^{-1} X_t ≡ X_{t+1}, and the first difference is ΔX_t = (1 - L)X_t = X_t - X_{t-1}.
27 Lag Operator (continued)
The lag operator is linear:
L(αX_t + βY_t) = αX_{t-1} + βY_{t-1},
(aL + bL²)X_t = aX_{t-1} + bX_{t-2},
and, when |φ| < 1,
lim_{j→∞} (1 + φL + φ²L² + ... + φ^j L^j) = (1 - φL)^{-1}.
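On a finite sample the lag operator is just a shift, and the geometric expansion of (1 - φL)^{-1} can be checked mechanically. A sketch (the helper below is mine, with the convention X_t = 0 before the sample starts): apply (1 - φL) to an arbitrary series, then apply the truncated expansion Σ_j φ^j L^j to the result; the original series should be recovered.

```python
# Sketch of the lag operator on a finite series: L^k shifts back k periods,
# with zeros before the start of the sample (an assumed initial condition).
# Applying sum_j phi^j L^j to w_t = (1 - phi L) X_t recovers X_t, which is
# the finite-sample version of (1 - phi L)^{-1} = 1 + phi L + phi^2 L^2 + ...
phi = 0.6
x = [float(t * t % 7) for t in range(40)]  # arbitrary test series

def lag(series, k):
    """L^k X_t = X_{t-k}, with X_t = 0 for t < 0."""
    if k == 0:
        return series[:]
    return [0.0] * k + series[:-k]

# w_t = (1 - phi L) X_t
w = [xt - phi * lxt for xt, lxt in zip(x, lag(x, 1))]

# Truncated inverse: (sum_{j=0}^{t} phi^j L^j) w_t
x_rec = [sum(phi**j * lag(w, j)[t] for j in range(t + 1)) for t in range(len(w))]

err = max(abs(a - b) for a, b in zip(x, x_rec))
print(err)  # essentially zero (floating-point noise)
```

With zero pre-sample values the telescoping is exact at every t, which is why no |φ| < 1 condition is needed on a finite sample; that condition matters only for the infinite-sum limit.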
29 ARMA Models in Lag Notation
ARMA(p, q): Y_t - µ = φ_1(Y_{t-1} - µ) + ... + φ_p(Y_{t-p} - µ) + ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}.
With the lag operator: φ(L)(Y_t - µ) = θ(L)ε_t, where
φ(L) = 1 - φ_1 L - φ_2 L² - ... - φ_p L^p,
θ(L) = 1 + θ_1 L + θ_2 L² + ... + θ_q L^q.
30 Stochastic Difference Equation (SDE) Representation
Let X_t = Y_t - µ and w_t = θ(L)ε_t. Then φ(L)X_t = w_t, or
X_t = φ_1 X_{t-1} + ... + φ_p X_{t-p} + w_t,
which is a pth-order stochastic difference equation.
31 SDE Representation - AR(1) Example
First-order SDE (AR(1)): X_t = φX_{t-1} + ε_t, ε_t ~ WN. Solving for the Wold form by recursive substitution:
X_t = φ^{t+1} X_{-1} + φ^t ε_0 + φ^{t-1} ε_1 + ... + φε_{t-1} + ε_t = φ^{t+1} X_{-1} + Σ_{i=0}^t ψ_i ε_{t-i},
where X_{-1} is an initial condition and ψ_i = φ^i. Thus the Wold form of an AR(1) is approximated with a single parameter.
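The recursive-substitution result is easy to verify by simulation (setup and names below are mine): with initial condition X_{-1} = 0, running the AR(1) recursion and summing the moving-average form Σ_i φ^i ε_{t-i} should give identical paths.

```python
import random

# Sketch: with X_{-1} = 0, the AR(1) recursion X_t = phi X_{t-1} + eps_t and
# the Wold sum X_t = sum_{i=0}^{t} phi^i eps_{t-i} trace out the same path.
random.seed(2)
phi, T = 0.7, 50
eps = [random.gauss(0.0, 1.0) for _ in range(T)]

# Build the path by recursion.
x_rec, prev = [], 0.0
for e in eps:
    prev = phi * prev + e
    x_rec.append(prev)

# Build the same path from the MA form with psi_i = phi^i.
x_ma = [sum(phi**i * eps[t - i] for i in range(t + 1)) for t in range(T)]

max_gap = max(abs(a - b) for a, b in zip(x_rec, x_ma))
print(max_gap)  # ~ 0 (floating-point noise only)
```

The φ^{t+1} X_{-1} term in the slide's formula is exactly what the zero initial condition kills off here; with |φ| < 1 it would vanish as t grows in any case.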
32 Dynamic Multiplier
The dynamic multiplier measures the effect of ε_t on subsequent values of X_{t+j}:
∂X_{t+j}/∂ε_t = ∂X_j/∂ε_0 = ψ_j.
For X_t an AR(1) process,
∂X_{t+j}/∂ε_t = ψ_j = φ^j.
The dynamic multiplier of any linear difference equation depends only on the elapsed time j, not on the date t.
33 Impulse Response Function
The impulse-response function is the sequence of dynamic multipliers {ψ_j}, viewed as a function of the time elapsed since a one-time impulse to ε_t.
34 Cumulative Impact
Consider a permanent increase in ε from time t on, i.e. ε_t = 1, ε_{t+1} = 1, ε_{t+2} = 1, .... Its effect on X_{t+j} is
∂X_{t+j}/∂ε_t + ∂X_{t+j}/∂ε_{t+1} + ∂X_{t+j}/∂ε_{t+2} + ... + ∂X_{t+j}/∂ε_{t+j} = ψ_j + ψ_{j-1} + ... + ψ_1 + 1.
In the limit, as j → ∞,
lim_{j→∞} [∂X_{t+j}/∂ε_t + ∂X_{t+j}/∂ε_{t+1} + ... + ∂X_{t+j}/∂ε_{t+j}] = Σ_{j=0}^∞ ψ_j = ψ(1),
where ψ(1) = ψ(L)|_{L=1} = 1 + ψ_1 + ψ_2 + ....
35 AR(1) Model
Recall X_t = φX_{t-1} + ε_t, ε_t ~ WN, so X_t = Σ_{j=0}^∞ φ^j ε_{t-j} = Σ_{j=0}^∞ ψ_j ε_{t-j}. The Wold coefficients satisfy
ψ_j = φψ_{j-1},  ψ_j = ∂X_{t+j}/∂ε_t.
If |φ| < 1, X_t is the stationary solution to the first-order SDE. If φ = 1, then ψ_j = 1 for all j and X_t = X_{-1} + Σ_{j=0}^t ε_j is neither a stationary nor a stable solution, and ψ(1) is infinite.
36 AR(1) in Lag Notation
AR(1): X_t = φX_{t-1} + ε_t, ε_t ~ WN, or (1 - φL)X_t = ε_t. Multiplying both sides by (1 - φL)^{-1}:
X_t = (1 - φL)^{-1} ε_t = (1 + φL + φ²L² + φ³L³ + ...)ε_t = Σ_{j=0}^∞ φ^j ε_{t-j} = Σ_{j=0}^∞ ψ_j ε_{t-j},
so ψ(L) = (1 - φL)^{-1}.
37 AR(1): Long-Run Effects
For an AR(1) with |φ| < 1, the effect of a permanent increase in ε starting at time t equals
∂X_{t+j}/∂ε_t + ... + ∂X_{t+j}/∂ε_{t+j} = 1 + φ + φ² + φ³ + ... + φ^j,
and as j → ∞,
ψ(1) = 1 + φ + φ² + ... = 1/(1 - φ).
The cumulative consequence for X of a one-time change in ε is likewise
Σ_{j=0}^∞ ∂X_{t+j}/∂ε_t = 1/(1 - φ).
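Both claims, ψ_j = φ^j and ψ(1) = 1/(1 - φ), can be read off a simulated impulse response (the setup below is mine for illustration): feed a single unit shock through the AR(1) recursion and inspect the resulting sequence and its sum.

```python
# Sketch: impulse response of an AR(1) to a one-time unit shock.
# The response at horizon j is psi_j = phi^j, and the cumulative impact
# converges to psi(1) = 1 / (1 - phi).
phi, J = 0.9, 400

irf, x = [], 0.0
for j in range(J):
    # eps_0 = 1, eps_j = 0 for j > 0, pushed through X_t = phi X_{t-1} + eps_t
    x = phi * x + (1.0 if j == 0 else 0.0)
    irf.append(x)

print(irf[3], phi**3)            # horizon-3 response vs. phi^3
print(sum(irf), 1 / (1 - phi))   # cumulative impact vs. 1/(1 - phi) = 10
```

With φ = 0.9 close to one, the long-run multiplier 1/(1 - φ) = 10 is large and the sum converges slowly, which is why J is taken generously here.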
38 AR(1): Mean
Intercept representation for Y_t = X_t + µ:
Y_t = c + φY_{t-1} + ε_t,  where c = µ(1 - φ).
Taking expectations, E[Y_t] = c + φE[Y_{t-1}] + E[ε_t]. Since the process is covariance-stationary, E[Y_t] = E[Y_{t-1}], and so
E[Y_t] = c/(1 - φ) = µ.
39 AR(1): Variance
Var(Y_t) = E[(Y_t - µ)²] = E[(φ(Y_{t-1} - µ) + ε_t)²] = φ²E[(Y_{t-1} - µ)²] + 2φE[(Y_{t-1} - µ)ε_t] + E[ε_t²].
Since Y_t is covariance-stationary, Var(Y_t) = Var(Y_{t-1}), and since ε_t is independently distributed, E[(Y_{t-1} - µ)ε_t] = 0. Hence
Var(Y_t) = σ²/(1 - φ²) = γ_0.
40 AR(1): Covariance
Cov(Y_t, Y_{t-j}) = E[(Y_t - µ)(Y_{t-j} - µ)] = φE[(Y_{t-1} - µ)(Y_{t-j} - µ)] + E[ε_t(Y_{t-j} - µ)],
so for j ≥ 1,
Cov(Y_t, Y_{t-j}) = γ_j = φγ_{j-1}.
41 AR(1): Auto-Correlation Function (ACF)
Define the jth autocorrelation ρ_j ≡ γ_j/γ_0 = corr(Y_t, Y_{t-j}). For an AR(1), ρ_j = φρ_{j-1}, so ρ_j = φ^j. [Figure: ACF of an AR(1) with φ > 0.] For an AR(1) the ACF and the IRF are the same; in general this is not true. The ACF takes values in (-1, 1). If φ < 0, the ACF alternates in sign. [Figure: ACF of an AR(1) with φ < 0.]
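The result ρ_j = φ^j is easy to confirm on simulated data (all simulation choices below are mine): generate a long AR(1) path and compare its sample autocorrelations at a few lags with the theoretical values.

```python
import random
import statistics

# Sketch: the sample ACF of a simulated AR(1) should be close to the
# theoretical rho_j = phi^j (which, for an AR(1), also equals the IRF).
random.seed(3)
phi, T = 0.5, 100_000

x, prev = [], 0.0
for _ in range(T):
    prev = phi * prev + random.gauss(0.0, 1.0)
    x.append(prev)

m = statistics.fmean(x)
g0 = sum((xt - m) ** 2 for xt in x) / T  # sample gamma_0

def acf(j):
    """Sample autocorrelation rho_j = gamma_j / gamma_0."""
    g = sum((x[t] - m) * (x[t - j] - m) for t in range(j, T)) / T
    return g / g0

for j in (1, 2, 3):
    print(j, acf(j), phi**j)   # sample vs. theoretical phi^j
```

Repeating the experiment with a negative φ would show the alternating-sign pattern the slide's second figure refers to.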
More informationCh 6. Model Specification. Time Series Analysis
We start to build ARIMA(p,d,q) models. The subjects include: 1 how to determine p, d, q for a given series (Chapter 6); 2 how to estimate the parameters (φ s and θ s) of a specific ARIMA(p,d,q) model (Chapter
More informationVector autoregressions, VAR
1 / 45 Vector autoregressions, VAR Chapter 2 Financial Econometrics Michael Hauser WS17/18 2 / 45 Content Cross-correlations VAR model in standard/reduced form Properties of VAR(1), VAR(p) Structural VAR,
More informationB y t = γ 0 + Γ 1 y t + ε t B(L) y t = γ 0 + ε t ε t iid (0, D) D is diagonal
Structural VAR Modeling for I(1) Data that is Not Cointegrated Assume y t =(y 1t,y 2t ) 0 be I(1) and not cointegrated. That is, y 1t and y 2t are both I(1) and there is no linear combination of y 1t and
More informationTime Series 2. Robert Almgren. Sept. 21, 2009
Time Series 2 Robert Almgren Sept. 21, 2009 This week we will talk about linear time series models: AR, MA, ARMA, ARIMA, etc. First we will talk about theory and after we will talk about fitting the models
More informationTime Series Analysis -- An Introduction -- AMS 586
Time Series Analysis -- An Introduction -- AMS 586 1 Objectives of time series analysis Data description Data interpretation Modeling Control Prediction & Forecasting 2 Time-Series Data Numerical data
More informationCh 9. FORECASTING. Time Series Analysis
In this chapter, we assume the model is known exactly, and consider the calculation of forecasts and their properties for both deterministic trend models and ARIMA models. 9.1 Minimum Mean Square Error
More informationHeteroskedasticity in Time Series
Heteroskedasticity in Time Series Figure: Time Series of Daily NYSE Returns. 206 / 285 Key Fact 1: Stock Returns are Approximately Serially Uncorrelated Figure: Correlogram of Daily Stock Market Returns.
More information3 Theory of stationary random processes
3 Theory of stationary random processes 3.1 Linear filters and the General linear process A filter is a transformation of one random sequence {U t } into another, {Y t }. A linear filter is a transformation
More informationTime Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY
Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY PREFACE xiii 1 Difference Equations 1.1. First-Order Difference Equations 1 1.2. pth-order Difference Equations 7
More informationChapter 2. Some basic tools. 2.1 Time series: Theory Stochastic processes
Chapter 2 Some basic tools 2.1 Time series: Theory 2.1.1 Stochastic processes A stochastic process is a sequence of random variables..., x 0, x 1, x 2,.... In this class, the subscript always means time.
More informationSTAT 443 Final Exam Review. 1 Basic Definitions. 2 Statistical Tests. L A TEXer: W. Kong
STAT 443 Final Exam Review L A TEXer: W Kong 1 Basic Definitions Definition 11 The time series {X t } with E[X 2 t ] < is said to be weakly stationary if: 1 µ X (t) = E[X t ] is independent of t 2 γ X
More information