Lecture 2: ARMA(p,q) models (part 2)


1 Lecture 2: ARMA(p,q) models (part 2)
Florian Pelgrin, University of Lausanne, École des HEC, Department of mathematics (IMEA-Nice). Sept. - Jan.

2 Motivation
Characterize the main properties of MA(q) models. Estimation of MA(q) models.

3 Road map
1 Introduction
2 MA(1) model
3 Application of a counterfactual MA(1)
4 Moving average model of order q, MA(q)
5 Application of a MA(q) model

4 MA(1) model
2.1. Moving average model of order 1, MA(1)
Definition. A stochastic process $(X_t)_{t \in \mathbb{Z}}$ is said to be a moving average model of order 1 if it satisfies the following equation:
$$X_t = \mu + \epsilon_t - \theta \epsilon_{t-1} \quad \forall t,$$
where $\theta \neq 0$, $\mu$ is a constant term, and $(\epsilon_t)_{t \in \mathbb{Z}}$ is a weak white noise process with expectation zero and variance $\sigma^2_\epsilon$ ($\epsilon_t \sim WN(0, \sigma^2_\epsilon)$).

5 MA(1) model
Remarks:
1. In lag notation, one has $X_t = \mu + \Theta(L)\epsilon_t \equiv \mu + (1 - \theta L)\epsilon_t$.
2. The previous process can be written in mean-deviation form as $\tilde{X}_t = \epsilon_t - \theta \epsilon_{t-1}$, where $\tilde{X}_t = X_t - \mu$.

6 MA(1) model
Remarks (cont'd):
3. The properties of $(X_t)$ only depend on those of the weak white noise process $(\epsilon_t)$. To some extent, the behavior of $(X_t)$ is noisier than that of an AR(1) process.
4. Iterating on the infinite past (and under some regularity conditions), the infinite autoregressive representation writes:
$$X_t = \frac{\mu}{1-\theta} - \sum_{k=1}^{\infty} \theta^k X_{t-k} + \epsilon_t.$$
5. The infinite autoregressive representation illustrates the fact that a certain form of persistence is captured by a moving average model, especially when $\theta$ is close to one.

7 MA(1) model
Figure: simulation of a moving average process of order 1 ($\theta = 0.9$).
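Such a simulated path is easy to reproduce. The sketch below is not part of the slides: it assumes NumPy is available, the sample size, seed and parameter values are arbitrary illustrative choices, and it uses the slides' sign convention $X_t = \mu + \epsilon_t - \theta\epsilon_{t-1}$.

```python
import numpy as np

# Minimal simulation sketch (illustrative): MA(1) with X_t = mu + eps_t - theta * eps_{t-1}.
rng = np.random.default_rng(0)               # arbitrary seed
T, mu, theta, sigma = 500, 0.0, 0.9, 1.0

eps = rng.normal(0.0, sigma, size=T + 1)     # eps_0, eps_1, ..., eps_T
x = mu + eps[1:] - theta * eps[:-1]          # X_1, ..., X_T

# Sample moments should be close to mu and (1 + theta^2) * sigma^2.
print(x.mean(), x.var())
```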

8 MA(1) model
Figure: scatter plots of a moving average process of order 1; left panel ($X_{t-1}$ versus $X_t$) and right panel ($X_{t-2}$ versus $X_t$).

9 MA(1) model
Stationarity and invertibility conditions
Since $(\epsilon_t)$ is a weak white noise process, $(X_t)$ is weakly stationary (by definition).
The invertibility condition is the counterpart of the stability (stationarity) condition of an AR(1) process:
1 If $|\theta| < 1$, then $(X_t)$ is invertible.
2 If $|\theta| = 1$, then $(X_t)$ is non-invertible.
3 If $|\theta| > 1$, there exists a non-causal invertible representation of $(X_t)$ that we rule out.

10 MA(1) model
Alternatively, if $|\theta| < 1$, then:
$$(1 - \theta L)^{-1} = \sum_{k=0}^{\infty} \theta^k L^k \quad \text{and} \quad \epsilon_t = (1 - \theta L)^{-1}(X_t - \mu),$$
i.e.
$$X_t = \frac{\mu}{1-\theta} - \sum_{k=1}^{\infty} \theta^k X_{t-k} + \epsilon_t.$$
1 This is the infinite autoregressive representation of a MA(1) process.
2 The MA(1) representation is then called the fundamental or causal representation.
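To see this inversion at work numerically, the following sketch (an illustration, not part of the slides; NumPy assumed, parameter values arbitrary) checks that the truncated AR($\infty$) inversion $\epsilon_t \approx \sum_{k=0}^{K-1}\theta^k(X_{t-k}-\mu)$ recovers the innovation of a simulated MA(1) when $|\theta| < 1$.

```python
import numpy as np

# Illustration: the truncated AR(infinity) inversion recovers eps_t
# up to an error of order theta^K when |theta| < 1.
rng = np.random.default_rng(1)
T, mu, theta = 2000, 0.0, 0.5
eps = rng.normal(size=T + 1)                 # eps_0, ..., eps_T
x = mu + eps[1:] - theta * eps[:-1]          # X_1, ..., X_T

K = 50                                       # truncation order of the infinite sum
t = T - 1                                    # recover the innovation of the last observation
eps_hat = sum(theta**k * (x[t - k] - mu) for k in range(K))
print(eps_hat, eps[t + 1])                   # the two values differ by a term of order theta^K
```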

11 MA(1) model
More formally...
Definition. The representation of the moving average process of order one defined by $X_t = \mu + \epsilon_t - \theta \epsilon_{t-1}$ is said to be causal or fundamental, and $(\epsilon_t)$ is the innovation process, if the root of the characteristic equation
$$z\Theta(z^{-1}) = 0 \iff z - \theta = 0$$
lies inside the unit circle: $|z| < 1 \iff |\theta| < 1$.

12 MA(1) model
Remark: One can also use the inverse characteristic equation to find the invertibility condition:
$$\Theta(z) = 0 \iff 1 - \theta z = 0.$$
The condition writes (for a MA(1) process): $|z| > 1 \iff |\theta| < 1$.

13 MA(1) model
Moments of a MA(1)
Definition. Let $(X_t)$ be a stationary stochastic process that satisfies a (fundamental) MA(1) representation, $X_t = \mu + \epsilon_t - \theta \epsilon_{t-1}$. Then:
$$\mathbb{E}[X_t] = \mu, \quad \mathbb{V}[X_t] = (1 + \theta^2)\sigma^2_\epsilon, \quad \gamma_X(1) = -\theta \sigma^2_\epsilon, \quad \gamma_X(h) = 0 \ \text{for all} \ |h| > 1.$$

14 MA(1) model
Definition. Let $(X_t)$ be a stationary stochastic process that satisfies a (fundamental) MA(1) representation, $X_t = \mu + \epsilon_t - \theta \epsilon_{t-1}$. Then the autocorrelation function is given by:
$$\rho_X(h) = \begin{cases} 1 & \text{if } h = 0 \\ -\dfrac{\theta}{1+\theta^2} & \text{if } h = \pm 1 \\ 0 & \text{otherwise.} \end{cases}$$
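As a numerical check (not in the original slides; NumPy assumed, sample size and seed arbitrary), the sketch below compares the theoretical value $\rho_X(1) = -\theta/(1+\theta^2)$ with a simple sample autocorrelation and confirms that correlations beyond lag 1 are close to zero.

```python
import numpy as np

# Illustrative check of the MA(1) autocorrelation function.
def sample_acf(x, h):
    """Plain sample autocorrelation at lag h (full-sample variance in the denominator)."""
    xc = np.asarray(x, dtype=float) - np.mean(x)
    num = np.dot(xc[h:], xc[:-h]) if h > 0 else np.dot(xc, xc)
    return num / np.dot(xc, xc)

rng = np.random.default_rng(2)
theta, T = 0.9, 5000
eps = rng.normal(size=T + 1)
x = eps[1:] - theta * eps[:-1]                     # zero-mean MA(1), slides' sign convention

print(-theta / (1 + theta**2), sample_acf(x, 1))   # both around -0.497 for theta = 0.9
print(sample_acf(x, 2), sample_acf(x, 5))          # approximately zero beyond lag 1
```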

15 MA(1) model
The autocorrelation function of a moving average process of order 1, MA(1), is always zero for orders higher than 1 ($|h| > 1$): a MA(1) process has no memory beyond 1 period (see scatter plots and autocorrelograms). This property generalizes to MA(q) processes.
Partial autocorrelations: nothing special, with the exception that they should decrease (possibly with damped oscillations)! The partial autocorrelation function cannot help to characterize a MA(1).

16 MA(1) model
Figure: correlograms of a moving average process of order one ($\theta = 0.9$, $0.5$, and $0.2$).

17 MA(1) model
Figure: correlograms of an AR(1) with $\phi = 0.8$, a second AR(1), a MA(1) with $\theta = 0.8$, and a MA(2) with $\theta = (0.4; 0.3)$.

18 MA(1) model
Estimation
Estimation is more difficult since the $\epsilon_t$ terms are not observed! Different techniques:
1 Conditional nonlinear least squares estimator
2 Maximum likelihood estimator
3 Generalized method of moments estimator.
Without loss of generality, the constant term is omitted and the model is written as $X_t = \epsilon_t + \theta \epsilon_{t-1}$.

19 MA(1) model
Nonlinear conditional least squares estimator
The objective function of the ordinary least squares estimator is:
$$\hat{\theta}_{OLS} = \arg\min_\theta \sum_{t=2}^{T} (x_t - \theta \epsilon_{t-1})^2.$$
Conditionally on $\epsilon_0$, one has (backcasting procedure):
$$\epsilon_{t-1} = \sum_{j=0}^{t-2} (-\theta)^j x_{t-1-j} + (-\theta)^{t-1} \epsilon_0.$$
Suppose that $\epsilon_0 = 0$; the nonlinear objective function (with respect to $\theta$) writes:
$$\sum_{t=2}^{T} \left( x_t - \theta \sum_{j=0}^{t-2} (-\theta)^j x_{t-1-j} \right)^2.$$

20 MA(1) model
The conditional nonlinear least squares estimator of $\theta$ is defined by:
$$\hat{\theta}_{cnls} = \arg\min_\theta \sum_{t=2}^{T} \left( x_t - \theta \sum_{j=0}^{t-2} (-\theta)^j x_{t-1-j} \right)^2.$$
The asymptotic distribution is given by:
$$\sqrt{T}(\hat{\theta}_{cnls} - \theta) \overset{d}{\to} \mathcal{N}(0, 1 - \theta^2).$$
The effect of $\epsilon_0 = 0$ dies out if $T$ is sufficiently large. An alternative is to consider $\epsilon_0$ as an unknown parameter. An estimator of $\sigma^2_\epsilon$ is:
$$\hat{\sigma}^2_\epsilon = \frac{1}{T-1} \sum_{t=2}^{T} (x_t - \hat{\theta}_{cnls} \hat{\epsilon}_{t-1})^2.$$
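A minimal sketch of the conditional nonlinear least squares idea follows (an illustration, not the slides' code; NumPy and SciPy assumed, data-generating values arbitrary). With $\epsilon_0 = 0$, the backcast recursion $\epsilon_t = x_t - \theta\epsilon_{t-1}$ delivers exactly the residuals entering the objective, so the sum of squared recursive residuals can be minimized over $\theta$.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def cnls_objective(theta, x):
    """Sum over t >= 2 of (x_t - theta * eps_{t-1})^2, conditioning on eps_0 = 0."""
    eps_prev = x[0]                          # eps_1 = x_1 when eps_0 = 0
    ssr = 0.0
    for t in range(1, len(x)):
        resid = x[t] - theta * eps_prev      # equals eps_t under the backcast recursion
        ssr += resid**2
        eps_prev = resid
    return ssr

rng = np.random.default_rng(3)
theta_true, T = 0.5, 1000
e = rng.normal(size=T + 1)
x = e[1:] + theta_true * e[:-1]              # zero-mean MA(1): x_t = eps_t + theta * eps_{t-1}

res = minimize_scalar(cnls_objective, bounds=(-0.99, 0.99), args=(x,), method="bounded")
se = np.sqrt((1 - res.x**2) / T)             # asymptotic standard error from the slide above
print(res.x, se)                             # estimate close to 0.5
```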

21 MA(1) model
Maximum likelihood estimator
Two estimators: the conditional maximum likelihood estimator and the exact maximum likelihood estimator.
The conditional maximum likelihood estimator proceeds in the same way as the conditional nonlinear least squares estimator (backcasting procedure):
Suppose that $\epsilon_t$ is a Gaussian white noise process.
For $t = 1$: $\epsilon_1 = x_1 - \theta \epsilon_0$.
For $t > 1$: $\epsilon_t = \sum_{j=0}^{t-1} (-\theta)^j x_{t-j} + (-\theta)^t \epsilon_0$.
Write the conditional likelihood function (with $\epsilon_0 = 0$) and maximize with respect to $\theta$ and $\sigma^2_\epsilon$.
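A sketch of the conditional Gaussian maximum likelihood step (illustrative, not the slides' code; NumPy and SciPy assumed). Concentrating $\sigma^2_\epsilon$ out of the conditional likelihood leaves a one-dimensional problem in $\theta$, so in this Gaussian case the conditional MLE of $\theta$ amounts to minimizing the same sum of squared recursive residuals as the conditional nonlinear least squares estimator.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def neg_conditional_loglik(theta, x):
    """Negative conditional Gaussian log-likelihood with eps_0 = 0 and sigma^2 concentrated out."""
    eps = np.empty_like(x)
    eps[0] = x[0]                            # eps_1 = x_1 - theta * eps_0 with eps_0 = 0
    for t in range(1, len(x)):
        eps[t] = x[t] - theta * eps[t - 1]
    sigma2 = np.mean(eps**2)                 # concentrated variance estimate
    return 0.5 * len(x) * (np.log(2 * np.pi * sigma2) + 1.0)

rng = np.random.default_rng(4)
theta_true, T = -0.4, 1000
e = rng.normal(size=T + 1)
x = e[1:] + theta_true * e[:-1]

res = minimize_scalar(neg_conditional_loglik, bounds=(-0.99, 0.99), args=(x,), method="bounded")
print(res.x)                                 # close to -0.4
```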

22 MA(1) model
The exact maximum likelihood estimator can be calculated by two convenient algorithms:
1 The Kalman filter
2 The triangular factorization of the variance-covariance matrix of a MA(1) process
In contrast to the conditional maximum likelihood estimator, the exact maximum likelihood estimator does not require that the moving average representation be invertible.
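In practice the exact Gaussian likelihood is rarely coded by hand. One possibility, assuming the statsmodels package is available (the data below are illustrative), is its ARIMA class with order (0, 0, 1), which casts the MA(1) in state-space form and evaluates the Gaussian likelihood through the Kalman filter.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA   # statsmodels assumed available

# Illustrative use of a packaged (state-space / Kalman filter) ML estimator for an MA(1).
rng = np.random.default_rng(5)
theta_true, T = 0.7, 500
e = rng.normal(size=T + 1)
x = e[1:] + theta_true * e[:-1]

fit = ARIMA(x, order=(0, 0, 1), trend="c").fit()
print(fit.params)            # constant, MA(1) coefficient (about 0.7), innovation variance
```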

23 MA(1) model
The (generalized) method of moments estimator
A simple method of moments estimator... Consider the first two moments of a MA(1):
$$\mathbb{E}[X_t^2] = (1 + \theta^2)\sigma^2_\epsilon \quad \text{and} \quad \mathbb{E}[X_t X_{t-1}] = \theta \sigma^2_\epsilon.$$
Using the empirical counterpart of these two moment conditions yields:
$$g_T(x; \theta, \sigma^2_\epsilon) = \begin{pmatrix} T^{-1} \sum_{t=1}^{T} x_t^2 - \sigma^2_\epsilon (1 + \theta^2) \\ T^{-1} \sum_{t=2}^{T} x_t x_{t-1} - \sigma^2_\epsilon \theta \end{pmatrix}.$$
Solving the exactly (just-) identified system $g_T(x; \hat{\theta}, \hat{\sigma}^2_\epsilon) = 0$ for $\hat{\theta}$ and $\hat{\sigma}^2_\epsilon$ (with some regularity conditions...) gives the method of moments estimator.
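A sketch of this just-identified method of moments estimator, solving the two moment conditions in closed form (an illustration, not the slides' code; NumPy assumed). Since $\rho_X(1) = \theta/(1+\theta^2)$ under this sign convention, the invertible root of the implied quadratic is retained.

```python
import numpy as np

def mm_ma1(x):
    """Method of moments for x_t = eps_t + theta * eps_{t-1}: match E[X_t^2] and E[X_t X_{t-1}]."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    m0 = np.mean(x**2)                       # estimate of (1 + theta^2) * sigma^2
    m1 = np.sum(x[1:] * x[:-1]) / T          # estimate of theta * sigma^2
    r = m1 / m0                              # equals theta / (1 + theta^2) at the true values
    if abs(r) >= 0.5:
        raise ValueError("no real invertible solution: |rho(1)| must be below 1/2")
    theta = (1.0 - np.sqrt(1.0 - 4.0 * r**2)) / (2.0 * r)   # invertible root (|theta| < 1)
    return theta, m1 / theta                 # (theta_hat, sigma2_hat)

rng = np.random.default_rng(6)
theta_true, T = 0.6, 5000
e = rng.normal(size=T + 1)
x = e[1:] + theta_true * e[:-1]
print(mm_ma1(x))                             # roughly (0.6, 1.0)
```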

24 MA(1) model
Estimation of moving average processes of order 1.

25 Application of a counterfactual MA(1)
3. Application of a counterfactual MA(1)
Effective Fed fund rate, 1970:... to ...:01 (monthly observations).
As is to be expected from the (partial) autocorrelogram (and thus theory!), a moving average model of order 1 is probably not the most appropriate model... However, it is interesting to compare it with the AR(1) specification of the effective Fed fund rate.
The estimate of the constant term (respectively, $\theta$) is ... (respectively, 0.941). Both estimates are statistically significant.

26 Application of a counterfactual MA(1)
Figure: effective Fed fund rate, MA(1) model (residual, actual, fitted).

27 Application of a counterfactual MA(1)
Figure: effective Fed fund rate, diagnostics of a MA(1) model (autocorrelation and partial autocorrelation, actual versus theoretical).

28 Application of a counterfactual MA(1)
Figure: effective Fed fund rate, impulse response function of the estimated MA(1) model (impulse response and accumulated response, ± 2 S.E.).

29 Moving average model of order q, MA(q)
4. Moving average model of order q, MA(q)
Definition. A stochastic process $(X_t)_{t \in \mathbb{Z}}$ is said to be a moving average model of order q if it satisfies the following equation:
$$X_t = \mu + \epsilon_t + \theta_1 \epsilon_{t-1} + \cdots + \theta_q \epsilon_{t-q} = \mu + \Theta(L)\epsilon_t \quad \forall t,$$
where $\theta_q \neq 0$, $\mu$ is a constant term, $(\epsilon_t)_{t \in \mathbb{Z}}$ is a weak white noise process with expectation zero and variance $\sigma^2_\epsilon$ ($\epsilon_t \sim WN(0, \sigma^2_\epsilon)$), and $\Theta(L) = 1 + \theta_1 L + \cdots + \theta_q L^q$.

30 Moving average model of order q, MA(q)
Remark: One can also use the notation
$$X_t = \mu + \epsilon_t - \theta_1^* \epsilon_{t-1} - \cdots - \theta_q^* \epsilon_{t-q} = \mu + \Theta^*(L)\epsilon_t,$$
where $\theta_j^* = -\theta_j$ for $j = 1, \ldots, q$, $\theta_q^* \neq 0$, $\Theta^*(L) = 1 - \theta_1^* L - \cdots - \theta_q^* L^q$, and $\epsilon_t \sim WN(0, \sigma^2_\epsilon)$.

31 Moving average model of order q, MA(q)
Stationarity and invertibility conditions
A MA(q) process is always weakly stationary, irrespective of the moving average part.
A MA(q) process is invertible if all the roots of the characteristic equation $z^q \Theta(z^{-1}) = 0$, i.e.
$$z^q + \theta_1 z^{q-1} + \theta_2 z^{q-2} + \cdots + \theta_q = 0,$$
are of modulus less than one: $|z_i| < 1$ for $i = 1, \ldots, q$.
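A quick way to verify the invertibility condition numerically (illustrative sketch; NumPy assumed) is to compute the roots of the lag polynomial $\Theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q$, which must all lie outside the unit circle; this is equivalent to the roots of $z^q\Theta(z^{-1})$ lying inside it, as stated above.

```python
import numpy as np

def is_invertible(thetas):
    """Check that all roots of Theta(z) = 1 + theta_1 z + ... + theta_q z^q lie outside the unit circle."""
    coeffs = [1.0] + list(thetas)            # coefficients of Theta(z) in increasing powers of z
    roots = np.roots(coeffs[::-1])           # np.roots expects decreasing powers
    return bool(np.all(np.abs(roots) > 1.0)), roots

print(is_invertible([0.4, 0.3]))             # MA(2) with theta = (0.4, 0.3): invertible
print(is_invertible([1.5]))                  # MA(1) with theta = 1.5: not invertible
```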

32 Moving average model of order q, MA(q)
Definition. The representation of the moving average process of order q defined by $X_t = \mu + \epsilon_t + \theta_1 \epsilon_{t-1} + \cdots + \theta_q \epsilon_{t-q}$ is said to be causal or fundamental, and $(\epsilon_t)$ is the innovation process, if all the roots of the characteristic equation $z^q \Theta(z^{-1}) = 0$, i.e.
$$z^q + \theta_1 z^{q-1} + \theta_2 z^{q-2} + \cdots + \theta_q = 0,$$
are of modulus less than one: $|z_i| < 1$ for $i = 1, \ldots, q$.

33 Moving average model of order q, MA(q)
Moments of a stationary MA(q)
Mean and autocovariances. Using the insights of a MA(1) model, one gets $\mathbb{E}(X_t) = \mu$ and
$$\gamma_X(h) = \begin{cases} \sigma^2_\epsilon \left( 1 + \sum_{i=1}^{q} \theta_i^2 \right) & \text{if } h = 0 \\ \sigma^2_\epsilon \left( \theta_h + \sum_{i=h+1}^{q} \theta_i \theta_{i-h} \right) & \text{if } 1 \leq h < q \\ \theta_q \sigma^2_\epsilon & \text{if } h = q \\ 0 & \text{if } h > q. \end{cases}$$

34 Moving average model of order q, MA(q)
Autocorrelations:
$$\rho_X(h) = \begin{cases} 1 & \text{if } h = 0 \\ \dfrac{\theta_h + \sum_{i=h+1}^{q} \theta_i \theta_{i-h}}{1 + \sum_{i=1}^{q} \theta_i^2} & \text{if } 1 \leq h < q \\ \dfrac{\theta_q}{1 + \sum_{i=1}^{q} \theta_i^2} & \text{if } h = q \\ 0 & \text{if } h > q. \end{cases}$$
The autocorrelation function of a moving average process of order q, MA(q), is always zero for orders higher than q ($|h| > q$): a MA(q) process has no memory beyond q periods.
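The autocorrelation formula above translates directly into a short function (illustrative sketch, not the slides' code; NumPy assumed); writing $\psi_0 = 1$ and $\psi_i = \theta_i$ makes each numerator a simple inner product.

```python
import numpy as np

def ma_acf(thetas, max_lag):
    """Theoretical autocorrelations of X_t = mu + eps_t + theta_1 eps_{t-1} + ... + theta_q eps_{t-q}."""
    psi = np.r_[1.0, np.asarray(thetas, dtype=float)]    # psi_0 = 1, psi_i = theta_i
    q = len(psi) - 1
    gamma0 = np.sum(psi**2)                              # gamma_X(0) up to sigma^2, which cancels
    rho = [np.dot(psi[: q + 1 - h], psi[h:]) / gamma0 if h <= q else 0.0
           for h in range(1, max_lag + 1)]
    return np.array(rho)

print(ma_acf([0.4, 0.3], 4))     # nonzero at lags 1 and 2 (about 0.416 and 0.24), zero afterwards
```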

35 Moving average model of order q, MA(q)
Partial autocorrelations: nothing special! The theoretical partial autocorrelation function of a MA(q) dies out as $h \to \infty$ but, unlike that of an AR(p) process for $h > p$, it is not exactly zero.

36 Application of a MA(q) model
5. Application of a MA(q) model
Effective Fed fund rate, 1970:... to ...:01 (monthly observations).
As is to be expected from the (partial) autocorrelogram (and thus theory!), a moving average model of order q is probably not the most appropriate model... However, it is interesting for comparison purposes. In particular, increasing the number of lags increases the overall persistence at the expense of over-parameterizing the model...

37 Application of a MA(q) model
Table: ML estimation of the effective Fed fund rate, MA(4) and MA(8) models (coefficients $\mu$, $\theta_1, \ldots$; estimates, standard errors, p-values).
All parameters are statistically significant at 1%, with the exception of $\theta_8$... All the roots are outside the unit circle.

38 Application of a MA(q) model
Figure: effective Fed fund rate, MA(4) model (residual, actual, fitted).

39 Application of a MA(q) model
Figure: effective Fed fund rate, diagnostics of the MA(4) and MA(8) models (autocorrelation and partial autocorrelation, actual versus theoretical).

40 Application of a MA(q) model
Figure: effective Fed fund rate, impulse response functions of the MA(4) and MA(8) models (impulse response and accumulated response, ± 2 S.E.).
