Chapter 7 State Space Models

7.1 Introduction

State space models, developed over the years, are alternative models for time series. They include both the ARIMA models of Chapters 3-6 and the Classical Decomposition Model of Chapter 2 as special cases, but go well beyond both. They are important because (i) they provide a rich family of naturally interpretable models for data, and (ii) they lead to highly efficient estimation and forecasting algorithms through the Kalman recursions (see 7.3). They are widely used and, perhaps in consequence, are known under several different names: structural models (econometrics), dynamic linear models (statistics), Bayesian forecasting models (statistics), linear system models (engineering), Kalman filtering models (control engineering). The essential idea is that behind the observed time series X_t there is an underlying process S_t which is itself evolving through time in a way that reflects the structure of the system being observed.

7.2 The Model

Example 1: The Random Walk plus Noise Model

For a series X_t with trend, but no seasonal or cyclic variation, the Classical Decomposition of 2.3 is based on

    X_t = a_t + r_t,    (7.1)

where a_t represents deterministic trend (the underlying general level or "signal" at t) and r_t represents random variation or "noise". To make this into a more precisely specified model we might suppose that r_t is white noise, WN(0, σ²_r). Also, instead of supposing that a_t is deterministic, we could take it to be random as well, but not changing much over time. A way of representing this would be to suppose

    a_t = a_{t-1} + η_t,    (7.2)

where η_t is white noise WN(0, σ²_η), uncorrelated with the r_t. Equations (7.1) and (7.2) together define the Random Walk plus Noise model, also known as the Local Level model. A realization from this model, with a_0 = 0, σ²_r = 6 and σ²_η = 3, is given below.

[Figure: realization from the Local Level model, plotted against time.]

The model might be suitable for an industrial process, with process level a_t intended to be within design limits, but not directly observable itself. The sizes of σ²_r and σ²_η will determine local and large-scale smoothness respectively.

Review Question: How will σ²_r and σ²_η affect local and large-scale smoothness? How would the graph of a process for which σ²_r/σ²_η is large differ from that of one for which it is small?
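A realization like the one described can be generated in a few lines. The sketch below simulates (7.1)-(7.2) with Gaussian white noise (one choice consistent with the WN assumptions) and the parameter values quoted above; the function name and defaults are ours, not part of the notes.

```python
import numpy as np

def simulate_local_level(n, var_r=6.0, var_eta=3.0, a0=0.0, seed=0):
    """Simulate X_t = a_t + r_t, a_t = a_{t-1} + eta_t (equations 7.1-7.2)."""
    rng = np.random.default_rng(seed)
    eta = rng.normal(0.0, np.sqrt(var_eta), n)  # state noise, WN(0, sigma_eta^2)
    r = rng.normal(0.0, np.sqrt(var_r), n)      # observation noise, WN(0, sigma_r^2)
    a = a0 + np.cumsum(eta)                     # random-walk signal a_t
    return a + r                                # observed series X_t

x = simulate_local_level(200)
```

Plotting x for large and small values of var_r/var_eta is one way to answer the Review Question empirically.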

Example 2: A Seasonal Model with Noise

In the Classical Decomposition Method the seasonal/cyclic components s_t, with period c say, were taken to be numbers which repeated themselves every c time units and summed to 0 over the c "seasons": that is, for each t,

    s_{t+c} = s_t  and  Σ_{j=1}^{c} s_{t+j} = 0.

Given any c-1 values s_1, …, s_{c-1}, we can generate such a sequence by setting

    s_c = -s_1 - s_2 - ⋯ - s_{c-1}    (7.3)

and

    s_{t+c} = s_t for all t = 1, 2, ….    (7.4)

The result is a pattern exactly reproducing itself every c time units. Note that (7.3) and (7.4) together amount to saying that s_t can be found successively from

    s_t = -s_{t-c+1} - s_{t-c+2} - ⋯ - s_{t-1},    t = c, c+1, ….    (7.5)

We can introduce some variability into the seasonal pattern by adding a white noise perturbation to (7.5):

    s_t = -s_{t-c+1} - s_{t-c+2} - ⋯ - s_{t-1} + η_t,

where η_t is, say, WN(0, σ²_η). On average (in expectation) the s_t's will still sum to 0 over the seasons, but individual variability is now possible. If, as in the previous example, we suppose that actual observations on the process are subject to error, we get

    X_t = s_t + r_t    (7.6)

    s_t = -Σ_{j=1}^{c-1} s_{t-j} + η_t,    (7.7)

where the observation error r_t might be taken to be WN(0, σ²_r). The two equations (7.6) and (7.7) define the model. A realization with c = 12 is shown below.

[Figure: realization from the seasonal model, plotted against time in cycles.]

If we write (s_t, …, s_{t-c+2})' as a vector, S_t say, then (7.7) can be written in matrix form as

    S_t = \begin{pmatrix} s_t \\ s_{t-1} \\ \vdots \\ s_{t-c+2} \end{pmatrix}
        = \begin{pmatrix}
            -1 & -1 & \cdots & -1 & -1 \\
             1 &  0 & \cdots &  0 &  0 \\
             0 &  1 & \cdots &  0 &  0 \\
             \vdots & \vdots & \ddots & \vdots & \vdots \\
             0 &  0 & \cdots &  1 &  0
          \end{pmatrix}
          \begin{pmatrix} s_{t-1} \\ s_{t-2} \\ \vdots \\ s_{t-c+1} \end{pmatrix}
        + \begin{pmatrix} \eta_t \\ 0 \\ \vdots \\ 0 \end{pmatrix},

that is,

    S_t = F S_{t-1} + V_t,    (7.8)

say, where F is the (c-1) × (c-1) matrix above and V_t is the random vector V_t = (η_t, 0, …, 0)'. The other equation defining the model, (7.6), may be written in terms of S_t as

    X_t = (1, 0, …, 0) S_t + r_t.    (7.9)
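The matrix form (7.8) is easy to check in code. Below is a small sketch (our own helper names, Gaussian noise, illustrative variances) that builds the (c-1) × (c-1) matrix F and simulates the seasonal-plus-noise model (7.6)-(7.7) directly from the recursion for s_t, initializing the first seasonal values at zero.

```python
import numpy as np

def seasonal_F(c):
    """The (c-1)x(c-1) matrix F of (7.8): first row all -1, identity below it."""
    F = np.zeros((c - 1, c - 1))
    F[0, :] = -1.0
    F[1:, :-1] = np.eye(c - 2)
    return F

def simulate_seasonal(n, c=12, var_r=1.0, var_eta=0.1, seed=0):
    """Simulate X_t = s_t + r_t with s_t = -(s_{t-1} + ... + s_{t-c+1}) + eta_t."""
    rng = np.random.default_rng(seed)
    s = np.zeros(n)                # early seasonal values start from 0
    x = np.empty(n)
    for t in range(n):
        s[t] = -s[max(0, t - c + 1):t].sum() + rng.normal(0, np.sqrt(var_eta))
        x[t] = s[t] + rng.normal(0, np.sqrt(var_r))
    return x
```

Multiplying seasonal_F(c) into a vector (s_{t-1}, …, s_{t-c+1})' reproduces the right-hand side of (7.8) apart from the noise vector V_t.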

General Form of State Space Model

In the examples each model consists of two parts: an underlying process (a_t or S_t), called in general the state process of the system, whose evolution is governed by one equation ((7.2) and (7.8) respectively); and another process, the observed time series X_t itself, called in general the observation process, which is related to the state process by another equation ((7.1) and (7.9) respectively).

In general a state space model consists of a pair of random quantities X_t and S_t whose evolution and relationship are described by the equations

    X_t = G_t S_t + ε_t    (7.10)

    S_t = F_t S_{t-1} + V_t    (7.11)

where S_t denotes the state at time t, G_t and F_t are known matrices, possibly depending on time, ε_t is WN(0, σ²_ε), and V_t is a vector of white noise processes, each uncorrelated with the ε_t process. Equation (7.10) is called the observation equation, and equation (7.11) the state or system equation. The component white noise processes of V_t may be correlated with each other, though components of V_t and V_s for t ≠ s are taken to be uncorrelated. We use the notation V_t ~ WN(0, {Q_t}) to mean that the random vector V_t consists of univariate WN components (so that it has mean 0) and has variance-covariance matrix Q_t, that is, E(V_t V_t') = Q_t.

In Example 1, S_t can be identified directly with a_t, and ε_t with r_t. G_t and F_t are both equal to the degenerate 1-dimensional unit matrix, and V_t is the 1-dimensional vector with component η_t. The covariance matrix Q_t is therefore simply σ²_η.

In Example 2 the matrix G_t = (1, 0, …, 0) as in (7.9), ε_t is r_t, the matrix F_t and the vector V_t are as in (7.8), and Q_t is the (c-1) × (c-1) matrix with all entries zero except the top left-hand one, which is equal to σ²_η.

Example 3: AR(1) Model

The stationary AR(1) process given by

    X_t = αX_{t-1} + η_t    (7.12)

is another example of a state space model. Identify the state S_t with X_t itself, so that the state equation can be taken to be (7.12) if we set F = α and V_t = η_t; the observation equation is then just X_t = S_t, which has the form (7.10) with G = 1 and ε_t = 0.

Notes

(a) By iterating the state equation (7.11) we get

    S_t = F_t S_{t-1} + V_t
        = F_t (F_{t-1} S_{t-2} + V_{t-1}) + V_t
        = ⋯
        = (F_t F_{t-1} ⋯ F_2) S_1 + (F_t ⋯ F_3) V_2 + ⋯ + F_t V_{t-1} + V_t    (7.13)
        = f_t(S_1, V_2, …, V_t)

for a function f_t. From the observation equation, therefore,

    X_t = G_t f_t(S_1, …) + ε_t = g_t(S_1, V_2, …, V_t, ε_t)

for a function g_t. Thus the process is driven (through the G_t and F_t) by the white noise terms and the initial state S_1.

(b) It turns out to be possible to put a large number of time series models (including, for example, all ARIMA models) into a state space form. An advantage of doing so is that the state equation gives a simple way of analysing the process S_t, and from that it is easy, via the observation equation, to find out about the observation process X_t. If S_1 and V_2, …, V_t are independent (as opposed to just being uncorrelated) then S_t has the Markov property: the distribution of S_t given S_{t-1}, S_{t-2}, …, S_1 is the same as the distribution of S_t given S_{t-1} alone.
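Note (a) says the whole process is generated from S_1 and the noise terms, so the general equations (7.10)-(7.11) translate directly into a simulator. Here is a minimal sketch for the time-invariant case (constant F, G, Q, σ²_ε), applied to the AR(1) casting of Example 3; all names are ours, and Gaussian noise is assumed.

```python
import numpy as np

def simulate_state_space(F, G, Q, var_eps, S1, n, seed=0):
    """Simulate S_t = F S_{t-1} + V_t and X_t = G S_t + eps_t (time-invariant)."""
    rng = np.random.default_rng(seed)
    d = F.shape[0]
    S = np.asarray(S1, dtype=float)
    X = np.empty(n)
    for t in range(n):
        S = F @ S + rng.multivariate_normal(np.zeros(d), Q)  # state equation (7.11)
        eps = rng.normal(0.0, np.sqrt(var_eps)) if var_eps > 0 else 0.0
        X[t] = G @ S + eps                                   # observation equation (7.10)
    return X

# Example 3: AR(1) with S_t = X_t, F = alpha, G = 1, eps_t = 0, Q = sigma_eta^2.
alpha, var_eta = 0.7, 1.0
x_ar1 = simulate_state_space(np.array([[alpha]]), np.array([1.0]),
                             np.array([[var_eta]]), 0.0, np.zeros(1), 200)
```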

7.3 The Kalman Recursions

7.3.1 Filtering, Prediction and Smoothing

In state space models the state is generally the aspect of greatest interest, but it is not usually observed directly; what are observed are the X_t's. So we would like methods for estimating S_t from the observations. Three scenarios are:

Prediction Problem: estimate S_t from X_{t-1}, X_{t-2}, ….

Filtering Problem: estimate S_t from X_t, X_{t-1}, ….

Smoothing Problem: estimate S_t from X_n, X_{n-1}, …, where n > t.

A further problem, which turns out to have an answer useful for other things too, is the

X-Prediction Problem: estimate X_t from X_{t-1}, X_{t-2}, ….

7.3.2 General Approach

Note that equation (7.13) above shows that S_t and X_t are linear combinations of the initial state S_1 and the white noise processes V_t and ε_t. If these are Gaussian, then both S_t and X_t will be Gaussian too, for every t, and their distributions will be completely determined by their means and covariances. Thus the whole evolution of the model will be known if the means and covariances can be calculated. The Kalman recursions give a highly efficient way of computing these means and covariances by building them up successively from earlier values. The recursions lead to algorithms for the problems above and for fitting the models to data; they are an enormously powerful tool for handling a wide range of time series models. The basis of the Kalman recursions is the following simple result about multivariate Normal distributions.

7.3.3 Conditioning in a Multivariate Normal Distribution

Let Z and W denote random vectors with Normal distributions

    Z ~ N(μ_z, Σ_zz),    W ~ N(μ_w, Σ_ww)

and with covariance matrix

    E((Z - μ_z)(W - μ_w)') = Σ_zw,

so that the distribution of the vector obtained by stacking Z on W is

    \begin{pmatrix} Z \\ W \end{pmatrix} ~ N\left( \begin{pmatrix} μ_z \\ μ_w \end{pmatrix}, \begin{pmatrix} Σ_zz & Σ_zw \\ Σ_wz & Σ_ww \end{pmatrix} \right).

Then

    Z | W ~ N( μ_z + Σ_zw Σ_ww^{-1} (W - μ_w), Σ_zz - Σ_zw Σ_ww^{-1} Σ_wz ).    (7.14)

For a proof, write down the ratio of the probability densities of (Z, W) and of W, and complete the square in the exponent term.
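Formula (7.14) can be checked numerically. The sketch below, with arbitrary illustrative scalar values, compares the conditional mean and variance given by (7.14) with Monte Carlo estimates obtained by sampling (Z, W) jointly and keeping the draws whose W falls near the conditioning value.

```python
import numpy as np

rng = np.random.default_rng(1)
mu_z, mu_w = 1.0, -2.0
S_zz, S_zw, S_ww = 2.0, 0.8, 1.5   # illustrative scalar (co)variances

w = 0.5                            # conditioning value for W
cond_mean = mu_z + S_zw / S_ww * (w - mu_w)   # conditional mean from (7.14)
cond_var = S_zz - S_zw / S_ww * S_zw          # conditional variance from (7.14)

cov = np.array([[S_zz, S_zw], [S_zw, S_ww]])
zw = rng.multivariate_normal([mu_z, mu_w], cov, size=1_000_000)
z_near = zw[np.abs(zw[:, 1] - w) < 0.02, 0]   # Z draws with W close to w

print(cond_mean, z_near.mean())    # the pairs should agree approximately
print(cond_var, z_near.var())
```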

7.3.4 The Recursions

Suppose that, after data D_{t-1} = {X_1, …, X_{t-1}} have been observed, we know by some means that the state S_{t-1} has mean m_{t-1} and covariance matrix P_{t-1}, so that

    S_{t-1} | D_{t-1} ~ N(m_{t-1}, P_{t-1}).

The recursions are built on relating to this the distributions of

(a) S_t | D_{t-1}, and

(b) S_t | {X_t, D_{t-1}}.

For (a): because S_t is related to S_{t-1} through the state equation (7.11) (S_t = F_t S_{t-1} + V_t), it follows that, given D_{t-1}, S_t is also Normally distributed,

    S_t | D_{t-1} ~ N(m_{t|t-1}, P_{t|t-1}),    (7.15)

and its mean vector and covariance matrix are

    m_{t|t-1} = E(S_t | D_{t-1}) = E(F_t S_{t-1} | D_{t-1}) + E(V_t | D_{t-1}) = F_t m_{t-1}    (7.16)

and

    P_{t|t-1} = E( (S_t - m_{t|t-1})(S_t - m_{t|t-1})' | D_{t-1} ) = F_t P_{t-1} F_t' + Q_t.    (7.17)

For (b): from (7.15) and the observation equation (7.10) (X_t = G_t S_t + ε_t) we find

    E(X_t | D_{t-1}) = G_t m_{t|t-1},    (7.18)

so that

    X_t - E(X_t | D_{t-1}) = G_t (S_t - m_{t|t-1}) + ε_t

and hence

    Var(X_t | D_{t-1}) = G_t P_{t|t-1} G_t' + σ²_ε.    (7.19)

Similarly

    Cov(X_t, S_t | D_{t-1}) = G_t P_{t|t-1},    Cov(S_t, X_t | D_{t-1}) = P_{t|t-1} G_t'.

Thus, given the data D_{t-1}, S_t and X_t have the joint distribution

    \begin{pmatrix} S_t \\ X_t \end{pmatrix} \Big| D_{t-1} ~ N\left( \begin{pmatrix} m_{t|t-1} \\ G_t m_{t|t-1} \end{pmatrix}, \begin{pmatrix} P_{t|t-1} & P_{t|t-1} G_t' \\ G_t P_{t|t-1} & G_t P_{t|t-1} G_t' + σ²_ε \end{pmatrix} \right).

It follows from the result in 7.3.3 that the conditional distribution of S_t given the new observation X_t in addition to D_{t-1} (that is, the distribution of S_t | D_t) is

    S_t | {X_t, D_{t-1}} ~ N(m_t, P_t),

where

    m_t = m_{t|t-1} + P_{t|t-1} G_t' Var(X_t | D_{t-1})^{-1} (X_t - G_t m_{t|t-1})    (7.20)

    P_t = P_{t|t-1} - P_{t|t-1} G_t' Var(X_t | D_{t-1})^{-1} G_t P_{t|t-1}.    (7.21)

The three equations (7.19), (7.20) and (7.21), called the updating equations, together with (7.16) and (7.17), called the prediction equations, are collectively referred to as the Kalman Filter equations. Given starting values m_0 and P_0 they can be applied successively to calculate the distribution of the state vector as each new observation becomes available. At any time they give values which contain all the information needed to make optimal predictions of future values of both the state and the observations, as described in the following subsections.
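The prediction and updating equations translate almost line for line into code. The sketch below implements one pass of the filter for a time-invariant model with a scalar observation (so Var(X_t | D_{t-1}) is a number and no matrix inversion is needed); the function and variable names are ours. It returns the filtered means m_t and covariances P_t together with the innovations e_t and their variances φ_t, which are exactly the quantities used in the next subsections.

```python
import numpy as np

def kalman_filter(x, F, G, Q, var_eps, m0, P0):
    """Run (7.16)-(7.17) and (7.19)-(7.21) over the observations x[0..n-1]."""
    m, P = np.asarray(m0, dtype=float), np.asarray(P0, dtype=float)
    ms, Ps, es, phis = [], [], [], []
    for xt in x:
        # Prediction equations (7.16) and (7.17)
        m_pred = F @ m
        P_pred = F @ P @ F.T + Q
        # Innovation e_t and its variance phi_t, from (7.18), (7.19), (7.22)
        e = xt - G @ m_pred
        phi = G @ P_pred @ G + var_eps     # scalar for a univariate X_t
        # Updating equations (7.20) and (7.21)
        K = P_pred @ G / phi               # "gain" vector P_{t|t-1} G' / phi_t
        m = m_pred + K * e
        P = P_pred - np.outer(K, G @ P_pred)
        ms.append(m); Ps.append(P); es.append(e); phis.append(phi)
    return np.array(ms), np.array(Ps), np.array(es), np.array(phis)
```

Applied to the Local Level model of Example 1 (F = [[1]], G = [1], Q = [[σ²_η]]), the filtered mean m_t tracks the unobserved level a_t.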

7.3.5 The Prediction and Filtering Problems

By the general result about minimum mean square error forecasts in 6.1, the conditional mean of S_t given D_{t-1}, namely m_{t|t-1}, is the minimum mean square error estimate of the state S_t given observations up to and including time t-1, and the covariance matrix P_{t|t-1} gives the estimation error variances and covariances. Thus the prediction equations (7.16) and (7.17) give the means to solve the Prediction Problem of 7.3.1. In the same way the conditional mean of S_t given D_t, namely m_t, is the solution to the Filtering Problem of 7.3.1, and the variances and covariances of the error in estimating S_t by m_t are given by P_t.

7.3.6 The X-Prediction Problem

The minimum mean square error forecast of X_t given observations up to time t-1, that is, given D_{t-1}, is simply X̂_t = G_t m_{t|t-1}, by (7.18). The prediction error, which we will denote by e_t, is therefore

    e_t = X_t - X̂_t = X_t - G_t m_{t|t-1} = G_t (S_t - m_{t|t-1}) + ε_t.

e_t is also known as the innovation at time t, since it consists of the new information in the observation at t. From the updating equation (7.20) it can be seen that the innovations play a key part in the updating of the estimate of S_{t-1} to that of S_t: the further e_t is from the zero vector, the greater the correction. The innovations have means E(e_t) = 0 and variances, which we will denote by φ_t, given by

    φ_t = Var(e_t) = E[(X_t - G_t m_{t|t-1})²] = G_t P_{t|t-1} G_t' + σ²_ε,    (7.22)

from (7.19) and (7.21). The φ_t can be calculated straightforwardly from the Kalman filter equations.

7.3.7 Likelihood

The likelihood function, L say, for any model is the probability density (or, in the discrete case, the probability) of the observed data, taken as a function of the unknown parameters, θ say. For a state space model, therefore, if data X_1 = x_1, …, X_t = x_t have been observed, and if p is the joint probability density function of X_1, …, X_t,

    L(θ; x) = p(x; θ) = p(x_1 | θ) ∏_{s=2}^{t} p(x_s | D_{s-1}; θ),

where the density function p(x_s | D_{s-1}) is that of the Normal distribution (of X_s given D_{s-1}) with mean E(X_s | D_{s-1}) = X̂_s = G_s m_{s|s-1} and variance φ_s given by (7.22). Thus

    log L = const + log p(x_1 | θ) - (1/2) Σ_{s=2}^{t} log φ_s - (1/2) Σ_{s=2}^{t} (x_s - X̂_s)²/φ_s
          = const + log p(x_1 | θ) - (1/2) Σ_{s=2}^{t} log φ_s - (1/2) Σ_{s=2}^{t} e_s²/φ_s,

which is easily calculated from the innovations and their variances, together with p(x_1 | θ) if necessary. Standard methods of numerical maximization may then be used to estimate the unknown parameters θ. This is the approach described in 5.3.1.
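Given the innovations e_s and variances φ_s produced by a filter pass, the log-likelihood above is a single line of code, and a numerical optimizer does the rest. Below is a sketch for the Local Level model, reusing the kalman_filter function sketched in 7.3.4 and the simulated series x from Example 1; for simplicity it starts the sum at s = 1 with a vague prior on the initial level, rather than treating p(x_1 | θ) separately, which is a common practical shortcut. All parameter and function names are ours.

```python
import numpy as np
from scipy.optimize import minimize   # a standard numerical optimizer

def neg_log_lik(params, x):
    """Negative log-likelihood of the Local Level model via the innovations."""
    var_r, var_eta = np.exp(params)             # log-parametrization keeps variances > 0
    F, G, Q = np.array([[1.0]]), np.array([1.0]), np.array([[var_eta]])
    m0, P0 = np.zeros(1), np.array([[1e6]])     # vague prior on the initial level
    _, _, e, phi = kalman_filter(x, F, G, Q, var_r, m0, P0)
    return 0.5 * np.sum(np.log(phi) + e**2 / phi)

# Fit (sigma_r^2, sigma_eta^2) by minimizing the negative log-likelihood.
res = minimize(neg_log_lik, x0=np.log([1.0, 1.0]), args=(x,), method="Nelder-Mead")
var_r_hat, var_eta_hat = np.exp(res.x)
```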

Summary of Ideas in Chapter 7

State space models are specified by:
- state variables and observation variables, and by
- state equations, describing the evolution of the states, and observation equations, describing the relationship of the observations to the states.

Special cases include the ARIMA models studied in earlier chapters, and the models underlying the Decomposition Method of 2.3.

Various problems in relation to forecasting in state space models may be specified: prediction, filtering, smoothing, and X-prediction.

Solutions are based on the fact that if the variables are Gaussian, then to describe the whole evolution of the system all that is needed are the means and variances-covariances of the variables through time. These can be calculated recursively by the Kalman recursions.

The Kalman recursions immediately yield:
- forecasts and their error variances;
- efficient computation of the likelihood function, and therefore a powerful way of fitting models.
