STA 6857 Autocorrelation and Cross-Correlation & Stationary Time Series (§§1.4, 1.5)


1 STA 6857 Autocorrelation and Cross-Correlation & Stationary Time Series (§§1.4, 1.5)

2 Outline 1 Announcements 2 Autocorrelation and Cross-Correlation 3 Stationary Time Series 4 Homework 1c

3 Outline 1 Announcements 2 Autocorrelation and Cross-Correlation 3 Stationary Time Series 4 Homework 1c

4 Homework
Our TA, Aixin Tan, will have office hours on Thursdays from 1–2pm in 218 Griffin-Floyd. Homework 1c will be assigned today, and the last part of homework 1, homework 1d, will be assigned on Friday. Homework 1 will be collected on Friday, September 7. Don't wait until the last minute to do the homework, because more homework will follow next week.

5 Questions from Last Time
We've seen the AR(p) and MA(q) models. Which one do we use? In scientific modeling we wish to choose the model that best describes or explains the data. Later we will develop many techniques to help us choose and fit the best model for our data.
Is the white noise process in these models unique? We will see later that, by the Wold decomposition, any stationary time series can be written as an MA(∞) plus a deterministic part, where the white noise process in the MA(∞) part is unique. So in short, the answer is yes. We will also see later how to estimate the white noise process, which will aid in forecasting.

6 Notational Disclaimer
For convenience, we will follow the textbook's style of not distinguishing the random sequence (typically denoted with capital letters in statistics) from an observation of the random sequence (typically denoted with lower-case letters in statistics). We shall use {x_t} in both situations, and the distinction between the random and non-random cases will be clear from the context.

7 Outline 1 Announcements 2 Autocorrelation and Cross-Correlation 3 Stationary Time Series 4 Homework 1c

8 Joint Distribution
The joint distribution contains all of the information about the time series. There is no feasible way to estimate the joint distribution without some strict assumptions.
Definition (Joint Distribution Function (Joint CDF)). Given time points t_1, t_2, ..., t_n, the joint CDF of x_{t_1}, x_{t_2}, ..., x_{t_n}, evaluated at constants c_1, ..., c_n, is given by F(c_1, c_2, ..., c_n) = P(x_{t_1} ≤ c_1, x_{t_2} ≤ c_2, ..., x_{t_n} ≤ c_n).
Example (CDF of w_t iid N(0, 1)). F(c_1, c_2, ..., c_n) = ∏_{t=1}^{n} Φ(c_t), where Φ(x) = (1/√(2π)) ∫_{−∞}^{x} e^{−z²/2} dz.
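
As a quick numerical illustration (not from the slides; the constants below are hypothetical), the joint CDF of iid standard normal white noise factors into a product of standard normal CDFs:

```python
# Sketch: joint CDF of iid N(0,1) white noise evaluated at constants c_1, ..., c_n.
import numpy as np
from scipy.stats import norm

c = np.array([-0.5, 0.0, 1.2])        # hypothetical constants c_1, ..., c_n
joint_cdf = np.prod(norm.cdf(c))      # F(c_1,...,c_n) = prod_t Phi(c_t)
print(joint_cdf)
```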

9 Univariate Distribution and Density
We may also be interested in the marginal distribution function at a particular time t. Knowing all marginal distributions cannot give you the full joint distribution.
One-dimensional distribution function: F_t(x) = P(x_t ≤ x), with corresponding density function (if it exists) f_t(x) = ∂F_t(x)/∂x.

10 Mean Function
Definition (Mean Function). The mean function of a time series {x_t} is given by (if it exists) µ_t = E(x_t) = ∫ x f_t(x) dx.
Example (Mean of an MA(q)). Let w_t ~ WN(0, σ²) and x_t = w_t + θ_1 w_{t−1} + θ_2 w_{t−2} + ... + θ_q w_{t−q}. Then µ_t = E(x_t) = E(w_t) + θ_1 E(w_{t−1}) + θ_2 E(w_{t−2}) + ... + θ_q E(w_{t−q}) = 0 (free of the time variable t).
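
A minimal simulation sketch (the parameter values are assumed for illustration): the sample mean of a simulated MA(2) series is close to the theoretical mean function µ_t = 0, regardless of t.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0
theta1, theta2 = 0.7, -0.3                     # hypothetical MA(2) coefficients
w = rng.normal(0.0, sigma, size=100_000)       # white noise w_t

# x_t = w_t + theta_1 w_{t-1} + theta_2 w_{t-2}
x = w[2:] + theta1 * w[1:-1] + theta2 * w[:-2]
print(x.mean())                                # approximately 0
```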

11 Mean Function Examples
Example (Mean of a Random Walk with Drift). We saw before that if x_t = δ + x_{t−1} + w_t where x_0 = 0, then x_t has the representation x_t = δt + Σ_{j=1}^{t} w_j, and the mean function of x_t is µ_t = E(x_t) = δt.
Example (Mean of Signal + Noise Model). If x_t = s_t + w_t where w_t is a mean-zero time series, then µ_t = E(x_t) = s_t.
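
A small Monte Carlo sketch (the drift, horizon, and replication count are assumed values) that averages many simulated random-walk-with-drift paths; the average at time t tracks µ_t = δt.

```python
import numpy as np

rng = np.random.default_rng(0)
delta, sigma, n, reps = 0.2, 1.0, 50, 5_000    # hypothetical settings
w = rng.normal(0.0, sigma, size=(reps, n))
x = delta * np.arange(1, n + 1) + np.cumsum(w, axis=1)  # x_t = delta*t + sum_{j<=t} w_j

t = 30
print(x[:, t - 1].mean(), delta * t)           # Monte Carlo mean vs. delta*t
```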

12 The Autocovariance Function
Definition (Autocovariance Function). The autocovariance function of a general random sequence {x_t} is γ(s, t) = cov(x_s, x_t) = E[(x_s − µ_s)(x_t − µ_t)].
So in particular, we have var(x_t) = cov(x_t, x_t) = γ(t, t) = E[(x_t − µ_t)²]. Also note that γ(s, t) = γ(t, s), since cov(x_s, x_t) = cov(x_t, x_s).
Example (Autocovariance of White Noise). Let w_t ~ WN(0, σ²). By the definition of white noise, we have γ(s, t) = E(w_s w_t) = σ² if s = t, and 0 if s ≠ t.
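
A quick empirical check, a sketch only (σ is an assumed value): sample moments of simulated white noise are close to σ² at equal times and near zero otherwise.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0                                    # hypothetical noise standard deviation
w = rng.normal(0.0, sigma, size=200_000)
print(w.var())                                 # ~ sigma^2 = 4   (s = t)
print(np.mean(w[1:] * w[:-1]))                 # ~ 0             (s != t)
```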

13 Autocovariance Function of MA(1)
Example (Autocovariance of MA(1)). Let w_t ~ WN(0, σ²) and x_t = w_t + θ_1 w_{t−1}. Then
γ(s, t) = cov(w_t + θ_1 w_{t−1}, w_s + θ_1 w_{s−1}) = cov(w_t, w_s) + θ_1 cov(w_t, w_{s−1}) + θ_1 cov(w_{t−1}, w_s) + θ_1² cov(w_{t−1}, w_{s−1}).
If s = t, then γ(s, t) = γ(t, t) = σ² + θ_1² σ² = (1 + θ_1²) σ².
If s = t − 1 or s = t + 1, then γ(s, t) = γ(t, t + 1) = γ(t, t − 1) = θ_1 σ².
So all together we have γ(s, t) = (1 + θ_1²) σ² if s = t; θ_1 σ² if |s − t| = 1; and 0 otherwise.
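
A Monte Carlo sketch (θ_1 and σ are assumed values) checking the MA(1) autocovariances γ(t, t) = (1 + θ_1²)σ² and γ(t, t+1) = θ_1σ²:

```python
import numpy as np

rng = np.random.default_rng(0)
theta1, sigma = 0.5, 1.0                       # hypothetical parameters
w = rng.normal(0.0, sigma, size=1_000_000)
x = w[1:] + theta1 * w[:-1]                    # x_t = w_t + theta_1 w_{t-1}

gamma0_hat = x.var()                                            # estimate of gamma(t, t)
gamma1_hat = np.mean((x[1:] - x.mean()) * (x[:-1] - x.mean()))  # estimate of gamma(t, t+1)
print(gamma0_hat, (1 + theta1**2) * sigma**2)  # ~ 1.25
print(gamma1_hat, theta1 * sigma**2)           # ~ 0.50
```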

14 Autocovariance Function of a Random Walk
Example (Autocovariance of a Random Walk). If x_t = Σ_{j=1}^{t} w_j, then
γ(s, t) = cov(x_s, x_t) = cov(Σ_{j=1}^{s} w_j, Σ_{k=1}^{t} w_k) = Σ_{j=1}^{s} Σ_{k=1}^{t} cov(w_j, w_k) = min(s, t) σ².
Note the dependence on s and t is not just a function of s − t! (The random walk is not stationary.)
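
A sketch (s, t, σ, and the replication count are chosen arbitrarily) estimating cov(x_s, x_t) for a random walk across independent replications; the estimate is close to min(s, t)σ²:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, n, reps = 1.0, 40, 20_000               # hypothetical settings
x = np.cumsum(rng.normal(0.0, sigma, size=(reps, n)), axis=1)  # x_t = sum_{j<=t} w_j

s, t = 10, 25
cov_st = np.cov(x[:, s - 1], x[:, t - 1])[0, 1]
print(cov_st, min(s, t) * sigma**2)            # ~ 10
```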

15 The Autocorrelation Function
Definition (Autocorrelation Function). The autocorrelation function (ACF) of a general random sequence {x_t} is ρ(s, t) = γ(s, t) / √(γ(s, s) γ(t, t)).
As is well known from the Cauchy–Schwarz inequality, the correlation of two random variables is bounded between −1 and 1. Hence |ρ(s, t)| ≤ 1, and |ρ(s, t)| = 1 implies an exact linear relationship between x_s and x_t.

16 The Cross-Covariance and Cross-Correlation Functions
The predictability of one series from another is measured with the cross-covariance and cross-correlation functions.
Definition (Cross-Covariance Function). The cross-covariance function of two general time series {x_t} and {y_t} is defined as γ_xy(s, t) = cov(x_s, y_t) = E[(x_s − µ_{x,s})(y_t − µ_{y,t})].
Definition (Cross-Correlation Function (CCF)). The cross-correlation function (CCF) of two general time series {x_t} and {y_t} is defined as ρ_xy(s, t) = γ_xy(s, t) / √(γ_x(s, s) γ_y(t, t)).

17 Outline 1 Announcements 2 Autocorrelation and Cross-Correlation 3 Stationary Time Series 4 Homework 1c

18 Strict Stationarity
Definition (Strict Stationarity). A time series {x_t} is strictly stationary if any collection {x_{t_1}, x_{t_2}, ..., x_{t_n}} has the same joint distribution as the time-shifted set {x_{t_1+h}, x_{t_2+h}, ..., x_{t_n+h}}.
Strict stationarity implies the following:
All marginal distributions are equal, i.e. P(x_s ≤ c) = P(x_t ≤ c) for all s, t, and c.
The autocovariance function is shift-invariant, i.e. γ(s, t) = γ(s + h, t + h).
Strict stationarity typically assumes too much. This leads us to the weaker assumption of weak stationarity.

19 Weak Stationarity
Definition (Weak Stationarity). A weakly stationary time series is a finite-variance process that (i) has a mean function, µ_t, that is constant (so it doesn't depend on t); (ii) has a covariance function, γ(s, t), that depends on s and t only through their difference s − t.
From now on, when we say stationary, we mean weakly stationary.
Since the mean function is free of t, we will simply drop the t and write µ (= µ_t).
Since γ(s, t) = γ(s + h, t + h), we have γ(s, t) = γ(s − t, 0). So the autocovariance function can be thought of as a function of only one variable, h = s − t.

20 Autocovariance and ACF of a Stationary Time Series
Definition (Autocovariance Function of a Stationary Time Series). The autocovariance function of a stationary time series is γ(h) = E[(x_{t+h} − µ)(x_t − µ)] (for any value of t).
Definition (Autocorrelation Function of a Stationary Time Series). The autocorrelation function of a stationary time series is ρ(h) = γ(t + h, t) / √(γ(t + h, t + h) γ(t, t)) = γ(h) / γ(0).
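
A sketch of a sample version of this ACF (a hypothetical helper, not the textbook's acf function): estimate γ̂(h) from one realization and divide by γ̂(0).

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelations rho_hat(h) = gamma_hat(h) / gamma_hat(0)."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    gamma0 = np.mean(xc * xc)
    return np.array([np.mean(xc[h:] * xc[:len(x) - h]) / gamma0
                     for h in range(max_lag + 1)])

rng = np.random.default_rng(0)
w = rng.normal(size=10_000)
x = w[1:] + 0.5 * w[:-1]                       # MA(1) with theta_1 = 0.5
print(sample_acf(x, 3))                        # roughly [1, 0.4, 0, 0], since 0.5/(1 + 0.25) = 0.4
```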

21 Jointly Stationary Time Series
Definition (Jointly Stationary). Two time series, {x_t} and {y_t}, are jointly stationary if they are each stationary, and the cross-covariance function γ_xy(h) = E[(x_{t+h} − µ_x)(y_t − µ_y)] is only a function of the lag h, i.e. it is the same for all t.
Definition (CCF of Jointly Stationary Time Series). The cross-correlation function of jointly stationary time series {x_t} and {y_t} is ρ_xy(h) = γ_xy(h) / √(γ_x(0) γ_y(0)).

22 Example Using Cross-Correlation
Example (Prediction Using Cross-Correlation). Consider the model y_t = A x_{t−l} + w_t, where l is a positive integer. Then clearly one can use {x_t} to predict y_t. To be able to detect such a model, we consider the cross-correlation function. Assuming µ_x = µ_y = 0, we have
γ_yx(h) = E(y_{t+h} x_t) = E[(A x_{t+h−l} + w_{t+h}) x_t] = E(A x_{t+h−l} x_t) + E(w_{t+h} x_t) = A γ_x(h − l).
So the cross-covariance function is a shifted and scaled version of the autocovariance function of the {x_t} series.
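
A simulation sketch (A, l, and the series length are assumed values) showing how the sample CCF reveals the lag: for y_t = A x_{t−l} + w_t with iid x_t, the sample cross-correlation between y_{t+h} and x_t peaks near h = l.

```python
import numpy as np

rng = np.random.default_rng(0)
A, lag, n = 2.0, 5, 5_000                      # hypothetical values (l = 5)
x_full = rng.normal(size=n + lag)
w = rng.normal(size=n)
y = A * x_full[:n] + w                         # y_t = A * x_{t-l} + w_t
x = x_full[lag:]                               # align so x[t] corresponds to x_t

def sample_ccf(y, x, h):
    """Sample correlation between y_{t+h} and x_t."""
    yc, xc = y - y.mean(), x - x.mean()
    num = np.mean(yc[h:] * xc[:len(x) - h])
    return num / np.sqrt(np.mean(yc**2) * np.mean(xc**2))

print([round(sample_ccf(y, x, h), 2) for h in range(9)])   # largest entry near h = 5
```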

23 Outline 1 Announcements 2 Autocorrelation and Cross-Correlation 3 Stationary Time Series 4 Homework 1c

24 Textbook Reading
Read the following sections from the textbook:
1.6 (Estimation of Correlation)
1.7 (Vector-Valued and Multidimensional Series)

25 Textbook Problems
Do the following exercises from the textbook. (Note: cos(A − B) = cos A cos B + sin A sin B.)
