1. Fundamental concepts


A time series is a sequence of data points, typically measured at successive times spaced at uniform intervals. Time series are used in such fields as statistics, signal processing and financial mathematics. Examples of time series: the daily closing value of the Dow Jones index, the annual flow volume of the River Nile at Aswan, daily air temperature or monthly precipitation in a specific location, the annual yield of corn in Iowa, the size of an organism measured daily,

annual U.S. population data, daily closing stock prices, weekly interest rates, national income, sales figures, and innumerable other sequences based on industrial, economic, and social phenomena, and on studies in medicine, geophysics, and engineering.

Time series analysis is carried out for several major objectives: (a) to describe and explain general characteristics of the series, for example to separate the series into components representing trend (long-term movements), seasonality (periodic movements due to seasonal variation), and random fluctuations; (b) to forecast future values using some mathematical model; (c) to control the series, making sure that it remains within certain bounds. Carrying out these tasks requires a model for the series, which in turn requires estimating model parameters.

Time series methods may be divided into two main classes: (a) frequency domain methods (spectral analysis, to examine cyclic behavior which need not be related to seasonality); (b) time domain methods (autocorrelation and cross-correlation analysis, to examine dependence over time). Although the two approaches are mathematically equivalent, in the sense that the autocorrelation function and the spectrum form a Fourier transform pair, there are occasions when one approach is more advantageous than the other.
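This Fourier-pair relationship can be illustrated numerically: for a finite series, the periodogram (a raw spectrum estimate) is exactly the discrete Fourier transform of the circular sample autocovariances. A minimal sketch using NumPy; the construction is my own illustration, not from the notes:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256
x = rng.standard_normal(n)
x -= x.mean()  # work with a mean-zero series

# Periodogram: squared modulus of the DFT, scaled by 1/n
periodogram = np.abs(np.fft.fft(x)) ** 2 / n

# Circular sample autocovariances c(h) = (1/n) * sum_t x[t] * x[(t+h) mod n]
c = np.array([np.dot(x, np.roll(x, -h)) / n for h in range(n)])

# The DFT of the circular autocovariances recovers the periodogram exactly
recovered = np.fft.fft(c).real
print(np.allclose(recovered, periodogram))  # True
```

This is the finite-sample analogue of the autocovariance/spectrum Fourier pair mentioned above; the identity is exact for the circular definitions used here.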

Time series data have a natural temporal ordering. This makes time series analysis distinct from common data problems, where there is no natural ordering of the observations, and from spatial data analysis, where the observations typically relate to geographic locations. A time series model will generally reflect the fact that observations close together in time are more closely related than observations further apart. In addition, time series models often make use of the natural one-way ordering of time, so that values for a given period are expressed as being derived in some way from past values rather than future values.

Models for time series data can have many forms. When modeling variation in the level of the process, three broad classes of practical importance are autoregressive (AR) models, integrated (I) models, and moving average (MA) models. These three classes depend linearly on previous data points. Combinations of these ideas produce autoregressive moving average (ARMA) and autoregressive integrated moving average (ARIMA) models. Extensions of these classes to deal with vector-valued data are available.
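As a sketch of how these linear classes generate data from a white noise input; the parameter values, seed, and integration example are my own illustrative choices, not from the notes:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
e = rng.standard_normal(n)  # white noise input

# MA(1): X_t = e_t + theta * e_{t-1}  (theta chosen arbitrarily)
theta = 0.5
ma1 = e + theta * np.concatenate(([0.0], e[:-1]))

# AR(1): X_t = phi * X_{t-1} + e_t  (|phi| < 1 for stationarity)
phi = 0.7
ar1 = np.empty(n)
ar1[0] = e[0]
for t in range(1, n):
    ar1[t] = phi * ar1[t - 1] + e[t]

# Integrated series: cumulative sums of a stationary series are nonstationary
integrated = np.cumsum(e)
```

Differencing the integrated series recovers the stationary input, which is the idea behind the "I" in ARIMA.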

Among non-linear models are models that represent changes of variance (heteroskedasticity) through time. These models are called autoregressive conditional heteroskedasticity (ARCH) models, and the collection comprises a wide variety of representations. Here, changes in variability are related to, or predicted by, recent past values of the observed series. After reviewing probability spaces, we give a more formal definition of time series and describe basic tools for their analysis.

1.1 Probability spaces

Let Ω be the set of all possible outcomes of an experiment, and ω ∈ Ω the elementary events. Let ϒ be a collection of subsets of Ω, and A ∈ ϒ. If ω ∈ A occurs, then we say that A occurs. We define a function P(A) that gives the long-run relative frequency with which A will occur. P satisfies the axioms:

AXIOM 1: P(A) ≥ 0 for every A ∈ ϒ.
AXIOM 2: P(Ω) = 1.
AXIOM 3: If A_1, A_2, … is a countable sequence from ϒ and A_i ∩ A_j = ∅ for all i ≠ j, then P(∪_{i=1}^∞ A_i) = Σ_{i=1}^∞ P(A_i).

Example 1.1.1: We roll a fair die and record the number of dots on the side that shows. Then Ω = {1, 2, 3, 4, 5, 6}. We take P(A) = (1/6) η(A), where η(A) is the number of elements ω in A. Let ϒ be the collection of all possible subsets of Ω. Since there is a nonnegative number of elementary events in each A ∈ ϒ, P(A) ≥ 0, and P(Ω) = (1/6) · 6 = 1. Also, a disjoint union ∪_{i=1}^k A_i has exactly Σ_{i=1}^k η(A_i) elements, and thus P(∪_{i=1}^k A_i) = (1/6) Σ_{i=1}^k η(A_i) = Σ_{i=1}^k P(A_i).
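Because Ω here has only 2⁶ = 64 subsets, the (finite versions of the) axioms in Example 1.1.1 can be checked exhaustively; a small sketch, with the enumeration code being my own:

```python
from itertools import combinations

omega = {1, 2, 3, 4, 5, 6}

def P(A):
    """P(A) = eta(A)/6, where eta(A) counts the elementary events in A."""
    return len(A) / 6

# Enumerate all 64 subsets of omega
subsets = [frozenset(c) for r in range(7) for c in combinations(omega, r)]

# Axiom 1: P(A) >= 0 for every A
assert all(P(A) >= 0 for A in subsets)

# Axiom 2: P(omega) = 1
assert P(omega) == 1

# Axiom 3 (finite case): additivity over disjoint events
A, B = frozenset({1, 2}), frozenset({5})
assert P(A | B) == P(A) + P(B)
```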

So that we can define P(A) for all A, we restrict the collection ϒ of subsets of Ω so that it satisfies the following:

1. If A ∈ ϒ, then A^c ∈ ϒ.
2. If A_1, A_2, … is a countable sequence from ϒ, then ∪_{i=1}^∞ A_i is in ϒ.
3. Ω is in ϒ.

ϒ is called a σ-field or σ-algebra, and (Ω, ϒ, P) is called a probability space. A random variable X is a real-valued function on Ω (X : Ω → ℝ) such that the set {ω : X(ω) ≤ x} is in ϒ for every real number x. F_X(x) = P{ω : X(ω) ≤ x} is called the distribution function of the random variable X.

1.2 Defining Time Series

Let (Ω, ϒ, P) be a probability space, and T an index set. A real-valued stochastic process is a real-valued function X_t(ω) such that for each fixed t ∈ T, X_t(ω) is a random variable on (Ω, ϒ, P). When the index set T corresponds to time indices (discrete or continuous), we will call X_t(ω) a time series. For fixed ω, X_t(ω) is a real-valued function of t that is called a realization of the time series.

When we look at the plot of a recorded time series, we are looking at one realization of the collection (ensemble) of all such realizations. We typically suppress the ω and write X(t). A time series is strictly stationary if the joint distribution function satisfies

F_{X_{t_1},…,X_{t_n}}(x_1, …, x_n) = F_{X_{t_1+τ},…,X_{t_n+τ}}(x_1, …, x_n)

for all possible (nonempty, finite, distinct) sets of indices t_1, …, t_n and t_1+τ, …, t_n+τ in T, and all (x_1, …, x_n) in the range of X_t.

If {X_t} is strictly stationary with E|X_t| < ∞, then the expectation of X_t is constant for all t, and if in addition E{X_t²} < ∞, then the variance of X_t is constant for all t as well. A time series {X_t} is second-order stationary, or weakly stationary, or simply stationary if:

1. The expectation of X_t exists and is constant for all t.
2. The covariance matrix of (X_{t_1}, …, X_{t_n}) exists and is the same as the covariance matrix of (X_{t_1+τ}, …, X_{t_n+τ}) for all nonempty finite sets of indices (t_1, …, t_n) and all τ such that t_1, …, t_n and t_1+τ, …, t_n+τ are in T.

Since E(X_t) = µ is constant, without loss of generality it may be taken to be zero in analyses. The autocovariance function γ(h) = E(X_t X_{t+h}) of a weakly stationary (mean-zero) process depends only on the lag h. Not all time series are stationary. For example, an economic time series may show a trend, i.e. it may be increasing through time in the mean, the variance, or both.

This series appears to be stationary in the mean as it varies about a fixed level.

This series does not vary about a fixed level and instead exhibits an overall upward trend. Moreover, the variance of the series increases as the level of the series increases. The series is nonstationary.

This quarterly series is repetitive in nature due to seasonal variations. This is an example of a seasonal time series. This and the last example may be reduced to stationary series by a proper transformation.

We will consider basic properties of a general class of models, called ARIMA (autoregressive integrated moving average) models, that are useful for describing stationary, nonstationary, seasonal, and nonseasonal time series such as those shown in the last three figures.

This plot displays another form of nonstationarity, due to a change in the structure of the series caused by some external disturbance. Such external disturbances are termed interventions or outliers. This type of nonstationarity cannot be removed by a standard transformation.

1.3 More examples of time series

Example 1.3.1: An important time series is a white noise sequence of uncorrelated random variables {e_t : t = 0, ±1, ±2, …}, where each e_t has mean zero and finite variance σ² > 0. Its autocovariance function is

γ(h) = σ² if h = 0, and γ(h) = 0 otherwise.
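A quick simulation check of this autocovariance function; the sample size, seed, and σ below are my own arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 2.0
n = 100_000
e = rng.normal(0.0, sigma, size=n)  # simulated white noise

def gamma_hat(x, h):
    """Sample autocovariance c(h) = (1/n) sum_t (x_t - xbar)(x_{t+h} - xbar)."""
    xbar = x.mean()
    return np.sum((x[: len(x) - h] - xbar) * (x[h:] - xbar)) / len(x)

print(round(gamma_hat(e, 0), 2))  # near sigma^2 = 4
print(round(gamma_hat(e, 5), 2))  # near 0
```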

Example 1.3.2: Let X_t = A cos(λt) + B sin(λt) + e_t, with A, B, e_t i.i.d. N(0, σ²). Then E(X_t) = 0 for all t, and

E(X_t X_{t+h}) = E(A²) cos(λt) cos(λ(t+h)) + E(B²) sin(λt) sin(λ(t+h)) + E(e_t e_{t+h})
             = σ² cos(λh) if h > 0, and 2σ² if h = 0.

(Use the trigonometric identity cos(A − B) = cos(A) cos(B) + sin(A) sin(B).) Since E(X_t X_{t+h}) does not depend on t, the process is wide-sense stationary.
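Since each draw of (A, B, e_t) corresponds to one ω, the expectation E(X_t X_{t+h}) can be approximated by an ensemble average over many simulated realizations. A sketch with my own illustrative values of σ, λ, t, and h (for h > 0 the two noise terms are independent, so they are drawn separately):

```python
import numpy as np

rng = np.random.default_rng(3)
sigma, lam, m = 1.0, 0.8, 200_000  # illustrative values; m realizations

# One draw of (A, B) per realization (per omega)
A = rng.normal(0, sigma, m)
B = rng.normal(0, sigma, m)

def X(t, e):
    """X_t = A cos(lambda t) + B sin(lambda t) + e_t, vectorized over realizations."""
    return A * np.cos(lam * t) + B * np.sin(lam * t) + e

t, h = 7, 3
e_t = rng.normal(0, sigma, m)
e_th = rng.normal(0, sigma, m)  # independent of e_t since h > 0

# Ensemble estimate of E(X_t X_{t+h}) vs. the theoretical sigma^2 cos(lambda h)
estimate = np.mean(X(t, e_t) * X(t + h, e_th))
theory = sigma**2 * np.cos(lam * h)
print(round(estimate, 3), round(theory, 3))  # the two should nearly coincide
```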

Example 1.3.3: Let the covariance of the process {X_t : t ∈ T}, with time measured in hours, be given by

γ(h) = E{(X_t − µ)(X_{t+h} − µ)} = A e^{−αh²}, A, α > 0.

The covariance between daily reports (lag h = 24) is thus A e^{−576α}. Also, the covariance between the change from 7:00 a.m. to 8:00 a.m. and the change from 8:00 a.m. to 9:00 a.m. is given by

E{(X_{t+1} − X_t)(X_{t+2} − X_{t+1})}
 = E{[(X_{t+1} − µ) − (X_t − µ)][(X_{t+2} − µ) − (X_{t+1} − µ)]}
 = E[(X_{t+1} − µ)(X_{t+2} − µ)] − E[(X_{t+1} − µ)²] − E[(X_t − µ)(X_{t+2} − µ)] + E[(X_t − µ)(X_{t+1} − µ)]
 = 2γ(1) − γ(0) − γ(2)
 = A[2e^{−α} − e^{−4α} − 1].

The variance of the change from t to t + h is

Var(X_{t+h} − X_t) = 2γ(0) − 2γ(h) = 2A(1 − e^{−αh²}).

Thus lim_{h→0} Var(X_{t+h} − X_t) = 0. Such processes are called mean square continuous. The rate of change per unit of time, defined by

R_t(h) = (X_{t+h} − X_t)/h,

has variance

Var{R_t(h)} = 2A(1 − e^{−αh²})/h².

By L'Hôpital's rule,

lim_{h→0} Var{R_t(h)} = lim_{h→0} 4Aαh e^{−αh²}/(2h) = 2Aα.

Stochastic processes for which this limit exists are called mean square differentiable.
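A numerical check of this limit, reading the covariance in Example 1.3.3 as γ(h) = A e^{−αh²}; the constants below are arbitrary:

```python
import math

A, alpha = 2.0, 0.3  # arbitrary illustrative constants

def var_rate(h):
    """Var{R_t(h)} = 2A(1 - e^(-alpha h^2)) / h^2, the variance of the
    difference quotient (X_{t+h} - X_t)/h under this covariance function."""
    return 2 * A * (1 - math.exp(-alpha * h * h)) / (h * h)

# As h -> 0 the variance approaches the finite limit 2*A*alpha = 1.2,
# so the process is mean square differentiable.
for h in (1.0, 0.1, 0.001):
    print(h, var_rate(h))
print(abs(var_rate(1e-4) - 2 * A * alpha) < 1e-6)  # True
```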

1.4 Properties of the autocovariance and autocorrelation functions

It is often useful to have a measure of association that doesn't depend on the units of measurement (as is the case with correlation as opposed to covariance). We define the autocorrelation function of a stationary time series by

ρ(h) = γ(h)/γ(0).

The autocovariance and autocorrelation functions of a stationary time series have some distinguishing characteristics.

Theorem: The autocovariance function of a real stationary time series {X_t, t ∈ T} is an even function of the lag, i.e. γ(h) = γ(−h).

Proof. Without loss of generality, let E{X_t} = 0. By stationarity, E{X_t X_{t+h}} = γ(h) for all t and t + h contained in the index set. Now set t = t′ − h. Then

γ(h) = E{X_{t′−h} X_{t′}} = E{X_{t′} X_{t′−h}} = γ(−h).

Definition: A function f(x), x ∈ χ, is said to be positive semi-definite (or nonnegative definite) if it satisfies

Σ_{j=1}^n Σ_{k=1}^n a_j a_k f(t_k − t_j) ≥ 0

for any set of real numbers (a_1, …, a_n) and for any (t_1, …, t_n) such that t_k − t_j ∈ χ for all (j, k).

Theorem: The autocovariance function of a stationary time series {X_t, t ∈ T} is positive semi-definite.

Proof. Without loss of generality, we assume that E{X_t} = 0. Let (t_1, …, t_n) ∈ T and (a_1, …, a_n) be real numbers, and let γ(t_k − t_j) be the autocovariance between the observations at times t_j and t_k. Since the variance of any random variable is nonnegative, we have

0 ≤ Var(Σ_{j=1}^n a_j X_{t_j}) = E(Σ_{j=1}^n Σ_{k=1}^n a_j a_k X_{t_j} X_{t_k}) = Σ_{j=1}^n Σ_{k=1}^n a_j a_k γ(t_k − t_j).

Now set n = 2. Then

a_1² γ(0) + a_2² γ(0) + 2 a_1 a_2 γ(t_2 − t_1) ≥ 0.

If t_1 − t_2 = h and a_1 = a_2 = 1, this gives 2γ(0) + 2γ(h) ≥ 0, i.e. ρ(h) ≥ −1. If t_1 − t_2 = h and a_1 = −a_2 = 1, then 2γ(0) − 2γ(h) ≥ 0, i.e. ρ(h) ≤ 1. Thus |ρ(h)| ≤ 1.

1.5 The partial autocorrelation function

In addition to the autocorrelation between X_t and X_{t+k}, we may want to investigate the correlation between X_t and X_{t+k} after their mutual linear dependency on the intervening variables X_{t+1}, X_{t+2}, …, X_{t+k−1} has been removed. For the stationary time series {X_t}, without loss of generality, assume that E(X_t) = 0. Let the linear dependence of X_{t+k} on X_{t+1}, X_{t+2}, …, X_{t+k−1} be defined as the best linear estimate, in the mean square sense, of X_{t+k} as a linear function of X_{t+1}, X_{t+2}, …, X_{t+k−1}.

That is, if X̂_{t+k} denotes the best linear estimate of X_{t+k}, then

X̂_{t+k} = α_1 X_{t+k−1} + α_2 X_{t+k−2} + ⋯ + α_{k−1} X_{t+1},

where the α_i (1 ≤ i ≤ k−1) are the mean square linear regression coefficients obtained by minimizing

E(X_{t+k} − X̂_{t+k})² = E(X_{t+k} − α_1 X_{t+k−1} − ⋯ − α_{k−1} X_{t+1})².

Differentiating with respect to the coefficients and equating to zero, we obtain

γ(i) = α_1 γ(i−1) + α_2 γ(i−2) + ⋯ + α_{k−1} γ(i−k+1), 1 ≤ i ≤ k−1.  (*)

Hence,

ρ(i) = α_1 ρ(i−1) + α_2 ρ(i−2) + ⋯ + α_{k−1} ρ(i−k+1), 1 ≤ i ≤ k−1.

In matrix notation, the above system becomes

[ 1       ρ_1     …  ρ_{k−2} ] [ α_1     ]   [ ρ_1     ]
[ ρ_1     1       …  ρ_{k−3} ] [ α_2     ] = [ ρ_2     ]
[ ⋮                          ] [ ⋮       ]   [ ⋮       ]
[ ρ_{k−2} ρ_{k−3} …  1       ] [ α_{k−1} ]   [ ρ_{k−1} ]   (**)

Similarly,

X̂_t = β_1 X_{t+1} + β_2 X_{t+2} + ⋯ + β_{k−1} X_{t+k−1},

where the β_i (1 ≤ i ≤ k−1) are the mean square linear regression coefficients obtained by minimizing

E(X_t − X̂_t)² = E(X_t − β_1 X_{t+1} − ⋯ − β_{k−1} X_{t+k−1})².

A similar calculation shows that the β_i satisfy the same system (**) as the α_i.

This, along with the linear independence of the rows (columns) of the matrix, implies that α_i = β_i, 1 ≤ i ≤ k−1. The partial autocorrelation between X_t and X_{t+k} is the ordinary correlation between (X_t − X̂_t) and (X_{t+k} − X̂_{t+k}). If we let P_k denote the partial autocorrelation between X_t and X_{t+k}, then

P_k = Cov(X_t − X̂_t, X_{t+k} − X̂_{t+k}) / √[Var(X_t − X̂_t) Var(X_{t+k} − X̂_{t+k})].

Now,

Var(X_{t+k} − X̂_{t+k}) = E(X_{t+k} − α_1 X_{t+k−1} − ⋯ − α_{k−1} X_{t+1})²
 = E[(X_{t+k} − α_1 X_{t+k−1} − ⋯ − α_{k−1} X_{t+1}) X_{t+k}]
 − α_1 E[(X_{t+k} − α_1 X_{t+k−1} − ⋯ − α_{k−1} X_{t+1}) X_{t+k−1}]
 − ⋯ − α_{k−1} E[(X_{t+k} − α_1 X_{t+k−1} − ⋯ − α_{k−1} X_{t+1}) X_{t+1}]
 = E[(X_{t+k} − α_1 X_{t+k−1} − ⋯ − α_{k−1} X_{t+1}) X_{t+k}],

since all the remaining terms reduce to zero by virtue of (*). Hence,

Var(X_{t+k} − X̂_{t+k}) = Var(X_t − X̂_t) = γ(0) − α_1 γ(1) − ⋯ − α_{k−1} γ(k−1).

Next, using the fact that α_i = β_i, 1 ≤ i ≤ k−1,

Cov(X_t − X̂_t, X_{t+k} − X̂_{t+k})
 = E[(X_t − α_1 X_{t+1} − ⋯ − α_{k−1} X_{t+k−1})(X_{t+k} − α_1 X_{t+k−1} − ⋯ − α_{k−1} X_{t+1})]
 = E[(X_t − α_1 X_{t+1} − ⋯ − α_{k−1} X_{t+k−1}) X_{t+k}]
 = γ(k) − α_1 γ(k−1) − ⋯ − α_{k−1} γ(1).

Therefore,

P_k = [γ(k) − α_1 γ(k−1) − ⋯ − α_{k−1} γ(1)] / [γ(0) − α_1 γ(1) − ⋯ − α_{k−1} γ(k−1)]
    = [ρ_k − α_1 ρ_{k−1} − ⋯ − α_{k−1} ρ_1] / [1 − α_1 ρ_1 − ⋯ − α_{k−1} ρ_{k−1}].

Solving the system (**) by Cramer's rule gives each α_i as a ratio of two (k−1) × (k−1) determinants: the denominator is the determinant of the correlation matrix in (**), and the numerator is the determinant of the same matrix with its ith column replaced by (ρ_1, ρ_2, …, ρ_{k−1})′.

Substituting this expression for α_i into the formula for P_k, and multiplying both the numerator and the denominator of the resulting expression by the determinant of the correlation matrix in (**), we obtain P_k as a ratio of two k × k determinants:

P_k = det(Q_k) / det(R_k),

where R_k is the k × k correlation matrix with (i, j) entry ρ_{|i−j|} (and ρ_0 = 1), and Q_k is R_k with its last column replaced by (ρ_1, ρ_2, …, ρ_k)′.
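The expression of P_k as a ratio of two k × k determinants built from the autocorrelations can be checked numerically. A sketch with AR(1)-type autocorrelations ρ_h = 0.6^h as my own test case, for which the partial autocorrelation is known to vanish beyond lag 1:

```python
import numpy as np

def P_det(k, rho):
    """P_k as a ratio of determinants: the denominator is the k x k
    correlation matrix [rho(|i-j|)]; the numerator replaces its last
    column by (rho(1), ..., rho(k))."""
    D = np.array([[rho(abs(i - j)) for j in range(k)] for i in range(k)])
    N = D.copy()
    N[:, -1] = [rho(i + 1) for i in range(k)]
    return np.linalg.det(N) / np.linalg.det(D)

rho = lambda h: 0.6 ** h  # AR(1)-type autocorrelations, my own test case
print(round(P_det(1, rho), 6))  # equals rho(1) = 0.6
print(round(P_det(3, rho), 6))  # vanishes beyond lag 1 for this rho
```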

As an example, for k = 3 we have

P_3 = (ρ_3 − α_1 ρ_2 − α_2 ρ_1) / (1 − α_1 ρ_1 − α_2 ρ_2),

where

α_1 = ρ_1(1 − ρ_2)/(1 − ρ_1²) and α_2 = (ρ_2 − ρ_1²)/(1 − ρ_1²).

Substituting these values of α_1 and α_2 gives

       | 1    ρ_1  ρ_1 |       | 1    ρ_1  ρ_2 |
P_3 =  | ρ_1  1    ρ_2 |   /   | ρ_1  1    ρ_1 |
       | ρ_2  ρ_1  ρ_3 |       | ρ_2  ρ_1  1   |

The partial autocorrelation can also be derived as follows. Consider the regression model in which the dependent variable X_{t+k}, from a mean-zero stationary process, is regressed on the k lagged variables X_{t+k−1}, X_{t+k−2}, …, X_t, i.e.

X_{t+k} = φ_{k1} X_{t+k−1} + φ_{k2} X_{t+k−2} + ⋯ + φ_{kk} X_t + e_{t+k},

where φ_{ki} denotes the ith regression parameter and e_{t+k} is a normal error term uncorrelated with X_{t+k−j} for j ≥ 1. Multiplying both sides of the regression equation by X_{t+k−j} and taking expectations, we obtain

γ(j) = φ_{k1} γ(j−1) + φ_{k2} γ(j−2) + ⋯ + φ_{kk} γ(j−k),

and hence

ρ(j) = φ_{k1} ρ(j−1) + φ_{k2} ρ(j−2) + ⋯ + φ_{kk} ρ(j−k).

For j = 1, …, k, we have the following system of equations:

ρ_1 = φ_{k1} + φ_{k2} ρ_1 + ⋯ + φ_{kk} ρ_{k−1}
ρ_2 = φ_{k1} ρ_1 + φ_{k2} + ⋯ + φ_{kk} ρ_{k−2}
⋮
ρ_k = φ_{k1} ρ_{k−1} + φ_{k2} ρ_{k−2} + ⋯ + φ_{kk}

Using Cramer's rule successively for k = 1, 2, …, we obtain

φ_11 = ρ_1, φ_22 = (ρ_2 − ρ_1²)/(1 − ρ_1²).

Continuing,

        | 1    ρ_1  ρ_1 |       | 1    ρ_1  ρ_2 |
φ_33 =  | ρ_1  1    ρ_2 |   /   | ρ_1  1    ρ_1 |
        | ρ_2  ρ_1  ρ_3 |       | ρ_2  ρ_1  1   |

and in general φ_kk = det(Q_k)/det(R_k), where R_k is the k × k correlation matrix with (i, j) entry ρ_{|i−j|} (and ρ_0 = 1), and Q_k is R_k with its last column replaced by (ρ_1, ρ_2, …, ρ_k)′. Thus φ_kk = P_k.
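The identity φ_kk = P_k can be exercised numerically by solving the k × k linear system for the regression coefficients directly. For autocorrelations of AR(1) form, ρ_h = φ^h, the partial autocorrelation is known to cut off after lag 1; a sketch with my own parameter values:

```python
import numpy as np

def pacf_at(k, rho):
    """phi_kk as the last coefficient of the k-th order system R phi = r,
    where R[i, j] = rho(|i - j|) and r[i] = rho(i + 1)."""
    R = np.array([[rho(abs(i - j)) for j in range(k)] for i in range(k)])
    r = np.array([rho(i + 1) for i in range(k)])
    return np.linalg.solve(R, r)[-1]

phi = 0.6
rho = lambda h: phi ** h  # autocorrelation function of a stationary AR(1)

print(pacf_at(1, rho))             # phi_11 = rho(1) = phi
print(round(pacf_at(2, rho), 10))  # phi_22 vanishes: PACF cuts off after lag 1
print(round(pacf_at(5, rho), 10))  # vanishes at higher lags as well
```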

Example 1.5.1: For a white noise sequence of uncorrelated random variables {e_t : t = 0, ±1, ±2, …},

γ(h) = σ² if h = 0, and 0 otherwise;
ρ(h) = 1 if h = 0, and 0 otherwise.

Also,

φ_kk = 1 if k = 0 (by definition), and 0 otherwise.

1.6 Time series models

We now consider the modeling of time series as stochastic processes. The model defines the mechanism through which the observations are considered to have been generated. A simple example is the model

X_t = e_t + θ e_{t−1}, t = 1, …, n,

where e_t, t = 0, 1, …, n, is a sequence of uncorrelated random variables drawn from a distribution with mean zero and variance σ², and θ is a parameter.
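A brief simulation of this model, with parameter values and sample size of my own choosing. For X_t = e_t + θe_{t−1}, a standard calculation gives ρ(1) = θ/(1 + θ²) and ρ(h) = 0 for h > 1, which the sample autocorrelations should approximate:

```python
import numpy as np

rng = np.random.default_rng(4)
theta, n = 0.5, 200_000  # illustrative parameter and sample size
e = rng.standard_normal(n + 1)
x = e[1:] + theta * e[:-1]  # X_t = e_t + theta * e_{t-1}

# Sample autocorrelations at lags 1 and 2
x0 = x - x.mean()
r1 = np.dot(x0[:-1], x0[1:]) / np.dot(x0, x0)
r2 = np.dot(x0[:-2], x0[2:]) / np.dot(x0, x0)

print(round(r1, 3))  # near theta / (1 + theta^2) = 0.4
print(round(r2, 3))  # near 0, since rho(h) = 0 for h > 1
```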

A particular set of values of e_0, e_1, …, e_n results in a corresponding sequence of observations X_1, …, X_n. By drawing different sets of values e_0, e_1, …, e_n we can generate an infinite set of realizations over t = 1, …, n. Thus, the model effectively defines a joint distribution for the random variables X_1, …, X_n. For a general model, the mean of the process at time t is µ_t = E(X_t), t = 1, …, n, the average of X_t taken over all possible realizations. Also,

Cov(X_t, X_{t+τ}) = E[(X_t − µ_t)(X_{t+τ} − µ_{t+τ})], t = 1, …, n − τ.

If many realizations are available, the above quantities may be estimated by ensemble averages, for example

µ̂_t = (1/m) Σ_{j=1}^m x_t^{(j)}, t = 1, …, n,

where x_t^{(j)} denotes the jth observation on X_t and m is the number of realizations. However, in most time series problems, only a single series of observations is available.

In these situations, to carry out meaningful inference on the mean and covariance functions, one needs to place restrictions on the process generating the observations, so that instead of averaging observations across realizations at a particular point in time, we can average over time within the single realization. This is where stationarity becomes necessary. Under stationarity, we can use the estimates

µ̂ = X̄ = (1/n) Σ_{t=1}^n X_t,
γ̂(h) = c(h) = (1/n) Σ_{t=1}^{n−h} (X_t − X̄)(X_{t+h} − X̄), h = 0, 1, …  (1)

Under certain conditions, these statistics give consistent estimates of the mean and autocovariance functions. The conditions basically require that observations sufficiently far apart be almost uncorrelated. The quantities in (1) may be normalized to obtain the sample autocorrelations

ρ̂(h) = r(h) = c(h)/c(0), h = 1, 2, …

A plot of r(h) against non-negative values of h is known as the sample autocorrelation function or correlogram.
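The estimates in (1) and their normalization can be sketched directly; the comparison of white noise with a random walk is my own addition, illustrating how the correlogram mirrors the stationarity properties of a series:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 2_000
e = rng.standard_normal(n)  # a stationary series
walk = np.cumsum(e)         # a nonstationary comparison series

def correlogram(x, K):
    """r(h) = c(h)/c(0), with c(h) = (1/n) sum_t (x_t - xbar)(x_{t+h} - xbar)."""
    m = len(x)
    xbar = x.mean()
    c = [np.sum((x[: m - h] - xbar) * (x[h:] - xbar)) / m for h in range(K + 1)]
    return np.array(c) / c[0]

r_noise = correlogram(e, 10)
r_walk = correlogram(walk, 10)
print(np.round(r_noise[1:4], 2))  # near zero for white noise
print(np.round(r_walk[1:4], 2))   # near one and slowly decaying for the walk
```

The slowly decaying correlogram of the random walk is the typical time-domain signature of a series that does not vary about a fixed level.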

The sample autocorrelations are estimates of the corresponding theoretical autocorrelations for the stochastic process that is assumed to have generated the observations. Although the correlogram will tend to mirror the properties of the theoretical autocorrelation function, it will not reproduce it exactly.

The correlogram is the main tool for analyzing the properties of a time series in the time domain. Partial autocorrelations can be helpful as well. We must, however, know the autocorrelation and partial autocorrelation functions of different stochastic processes. We then fit a model whose time domain properties are similar to those of the data. Also important is knowledge of the sampling variability of the estimated autocorrelations and partial autocorrelations.

We next study the nature of the autocorrelation and partial autocorrelation functions for models that may be represented as a linear combination of past and present values of {e_t}.


More information

Time Series 2. Robert Almgren. Sept. 21, 2009

Time Series 2. Robert Almgren. Sept. 21, 2009 Time Series 2 Robert Almgren Sept. 21, 2009 This week we will talk about linear time series models: AR, MA, ARMA, ARIMA, etc. First we will talk about theory and after we will talk about fitting the models

More information

6 NONSEASONAL BOX-JENKINS MODELS

6 NONSEASONAL BOX-JENKINS MODELS 6 NONSEASONAL BOX-JENKINS MODELS In this section, we will discuss a class of models for describing time series commonly referred to as Box-Jenkins models. There are two types of Box-Jenkins models, seasonal

More information

MGR-815. Notes for the MGR-815 course. 12 June School of Superior Technology. Professor Zbigniew Dziong

MGR-815. Notes for the MGR-815 course. 12 June School of Superior Technology. Professor Zbigniew Dziong Modeling, Estimation and Control, for Telecommunication Networks Notes for the MGR-815 course 12 June 2010 School of Superior Technology Professor Zbigniew Dziong 1 Table of Contents Preface 5 1. Example

More information

1. Stochastic Processes and Stationarity

1. Stochastic Processes and Stationarity Massachusetts Institute of Technology Department of Economics Time Series 14.384 Guido Kuersteiner Lecture Note 1 - Introduction This course provides the basic tools needed to analyze data that is observed

More information

The ARIMA Procedure: The ARIMA Procedure

The ARIMA Procedure: The ARIMA Procedure Page 1 of 120 Overview: ARIMA Procedure Getting Started: ARIMA Procedure The Three Stages of ARIMA Modeling Identification Stage Estimation and Diagnostic Checking Stage Forecasting Stage Using ARIMA Procedure

More information

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M.

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M. TIME SERIES ANALYSIS Forecasting and Control Fifth Edition GEORGE E. P. BOX GWILYM M. JENKINS GREGORY C. REINSEL GRETA M. LJUNG Wiley CONTENTS PREFACE TO THE FIFTH EDITION PREFACE TO THE FOURTH EDITION

More information

Empirical Macroeconomics

Empirical Macroeconomics Empirical Macroeconomics Francesco Franco Nova SBE April 5, 2016 Francesco Franco Empirical Macroeconomics 1/39 Growth and Fluctuations Supply and Demand Figure : US dynamics Francesco Franco Empirical

More information

If we want to analyze experimental or simulated data we might encounter the following tasks:

If we want to analyze experimental or simulated data we might encounter the following tasks: Chapter 1 Introduction If we want to analyze experimental or simulated data we might encounter the following tasks: Characterization of the source of the signal and diagnosis Studying dependencies Prediction

More information

Stochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno

Stochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno Stochastic Processes M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno 1 Outline Stochastic (random) processes. Autocorrelation. Crosscorrelation. Spectral density function.

More information

Stat 248 Lab 2: Stationarity, More EDA, Basic TS Models

Stat 248 Lab 2: Stationarity, More EDA, Basic TS Models Stat 248 Lab 2: Stationarity, More EDA, Basic TS Models Tessa L. Childers-Day February 8, 2013 1 Introduction Today s section will deal with topics such as: the mean function, the auto- and cross-covariance

More information

Stochastic Processes. A stochastic process is a function of two variables:

Stochastic Processes. A stochastic process is a function of two variables: Stochastic Processes Stochastic: from Greek stochastikos, proceeding by guesswork, literally, skillful in aiming. A stochastic process is simply a collection of random variables labelled by some parameter:

More information

11.1 Gujarati(2003): Chapter 12

11.1 Gujarati(2003): Chapter 12 11.1 Gujarati(2003): Chapter 12 Time Series Data 11.2 Time series process of economic variables e.g., GDP, M1, interest rate, echange rate, imports, eports, inflation rate, etc. Realization An observed

More information

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY PREFACE xiii 1 Difference Equations 1.1. First-Order Difference Equations 1 1.2. pth-order Difference Equations 7

More information

This note introduces some key concepts in time series econometrics. First, we

This note introduces some key concepts in time series econometrics. First, we INTRODUCTION TO TIME SERIES Econometrics 2 Heino Bohn Nielsen September, 2005 This note introduces some key concepts in time series econometrics. First, we present by means of examples some characteristic

More information

{ } Stochastic processes. Models for time series. Specification of a process. Specification of a process. , X t3. ,...X tn }

{ } Stochastic processes. Models for time series. Specification of a process. Specification of a process. , X t3. ,...X tn } Stochastic processes Time series are an example of a stochastic or random process Models for time series A stochastic process is 'a statistical phenomenon that evolves in time according to probabilistic

More information

Ch3. TRENDS. Time Series Analysis

Ch3. TRENDS. Time Series Analysis 3.1 Deterministic Versus Stochastic Trends The simulated random walk in Exhibit 2.1 shows a upward trend. However, it is caused by a strong correlation between the series at nearby time points. The true

More information

Classic Time Series Analysis

Classic Time Series Analysis Classic Time Series Analysis Concepts and Definitions Let Y be a random number with PDF f Y t ~f,t Define t =E[Y t ] m(t) is known as the trend Define the autocovariance t, s =COV [Y t,y s ] =E[ Y t t

More information

STAT 248: EDA & Stationarity Handout 3

STAT 248: EDA & Stationarity Handout 3 STAT 248: EDA & Stationarity Handout 3 GSI: Gido van de Ven September 17th, 2010 1 Introduction Today s section we will deal with the following topics: the mean function, the auto- and crosscovariance

More information

On 1.9, you will need to use the facts that, for any x and y, sin(x+y) = sin(x) cos(y) + cos(x) sin(y). cos(x+y) = cos(x) cos(y) - sin(x) sin(y).

On 1.9, you will need to use the facts that, for any x and y, sin(x+y) = sin(x) cos(y) + cos(x) sin(y). cos(x+y) = cos(x) cos(y) - sin(x) sin(y). On 1.9, you will need to use the facts that, for any x and y, sin(x+y) = sin(x) cos(y) + cos(x) sin(y). cos(x+y) = cos(x) cos(y) - sin(x) sin(y). (sin(x)) 2 + (cos(x)) 2 = 1. 28 1 Characteristics of Time

More information

Time Series and Forecasting

Time Series and Forecasting Time Series and Forecasting Introduction to Forecasting n What is forecasting? n Primary Function is to Predict the Future using (time series related or other) data we have in hand n Why are we interested?

More information

Lecture 2: Univariate Time Series

Lecture 2: Univariate Time Series Lecture 2: Univariate Time Series Analysis: Conditional and Unconditional Densities, Stationarity, ARMA Processes Prof. Massimo Guidolin 20192 Financial Econometrics Spring/Winter 2017 Overview Motivation:

More information

Discrete time processes

Discrete time processes Discrete time processes Predictions are difficult. Especially about the future Mark Twain. Florian Herzog 2013 Modeling observed data When we model observed (realized) data, we encounter usually the following

More information

5 Transfer function modelling

5 Transfer function modelling MSc Further Time Series Analysis 5 Transfer function modelling 5.1 The model Consider the construction of a model for a time series (Y t ) whose values are influenced by the earlier values of a series

More information

1 Introduction to Generalized Least Squares

1 Introduction to Generalized Least Squares ECONOMICS 7344, Spring 2017 Bent E. Sørensen April 12, 2017 1 Introduction to Generalized Least Squares Consider the model Y = Xβ + ɛ, where the N K matrix of regressors X is fixed, independent of the

More information

Permanent Income Hypothesis (PIH) Instructor: Dmytro Hryshko

Permanent Income Hypothesis (PIH) Instructor: Dmytro Hryshko Permanent Income Hypothesis (PIH) Instructor: Dmytro Hryshko 1 / 36 The PIH Utility function is quadratic, u(c t ) = 1 2 (c t c) 2 ; borrowing/saving is allowed using only the risk-free bond; β(1 + r)

More information

7 Introduction to Time Series Time Series vs. Cross-Sectional Data Detrending Time Series... 15

7 Introduction to Time Series Time Series vs. Cross-Sectional Data Detrending Time Series... 15 Econ 495 - Econometric Review 1 Contents 7 Introduction to Time Series 3 7.1 Time Series vs. Cross-Sectional Data............ 3 7.2 Detrending Time Series................... 15 7.3 Types of Stochastic

More information

Econometrics Summary Algebraic and Statistical Preliminaries

Econometrics Summary Algebraic and Statistical Preliminaries Econometrics Summary Algebraic and Statistical Preliminaries Elasticity: The point elasticity of Y with respect to L is given by α = ( Y/ L)/(Y/L). The arc elasticity is given by ( Y/ L)/(Y/L), when L

More information

Lecture 1: Fundamental concepts in Time Series Analysis (part 2)

Lecture 1: Fundamental concepts in Time Series Analysis (part 2) Lecture 1: Fundamental concepts in Time Series Analysis (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC)

More information

Communication Theory II

Communication Theory II Communication Theory II Lecture 8: Stochastic Processes Ahmed Elnakib, PhD Assistant Professor, Mansoura University, Egypt March 5 th, 2015 1 o Stochastic processes What is a stochastic process? Types:

More information

Part II. Time Series

Part II. Time Series Part II Time Series 12 Introduction This Part is mainly a summary of the book of Brockwell and Davis (2002). Additionally the textbook Shumway and Stoffer (2010) can be recommended. 1 Our purpose is to

More information

Econ 300/QAC 201: Quantitative Methods in Economics/Applied Data Analysis. 17th Class 7/1/10

Econ 300/QAC 201: Quantitative Methods in Economics/Applied Data Analysis. 17th Class 7/1/10 Econ 300/QAC 201: Quantitative Methods in Economics/Applied Data Analysis 17th Class 7/1/10 The only function of economic forecasting is to make astrology look respectable. --John Kenneth Galbraith show

More information

GARCH Models. Eduardo Rossi University of Pavia. December Rossi GARCH Financial Econometrics / 50

GARCH Models. Eduardo Rossi University of Pavia. December Rossi GARCH Financial Econometrics / 50 GARCH Models Eduardo Rossi University of Pavia December 013 Rossi GARCH Financial Econometrics - 013 1 / 50 Outline 1 Stylized Facts ARCH model: definition 3 GARCH model 4 EGARCH 5 Asymmetric Models 6

More information

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY & Contents PREFACE xiii 1 1.1. 1.2. Difference Equations First-Order Difference Equations 1 /?th-order Difference

More information

Chapter 3 - Temporal processes

Chapter 3 - Temporal processes STK4150 - Intro 1 Chapter 3 - Temporal processes Odd Kolbjørnsen and Geir Storvik January 23 2017 STK4150 - Intro 2 Temporal processes Data collected over time Past, present, future, change Temporal aspect

More information

INDIAN INSTITUTE OF SCIENCE STOCHASTIC HYDROLOGY. Lecture -18 Course Instructor : Prof. P. P. MUJUMDAR Department of Civil Engg., IISc.

INDIAN INSTITUTE OF SCIENCE STOCHASTIC HYDROLOGY. Lecture -18 Course Instructor : Prof. P. P. MUJUMDAR Department of Civil Engg., IISc. INDIAN INSTITUTE OF SCIENCE STOCHASTIC HYDROLOGY Lecture -18 Course Instructor : Prof. P. P. MUJUMDAR Department of Civil Engg., IISc. Summary of the previous lecture Model selection Mean square error

More information

Ch. 14 Stationary ARMA Process

Ch. 14 Stationary ARMA Process Ch. 14 Stationary ARMA Process A general linear stochastic model is described that suppose a time series to be generated by a linear aggregation of random shock. For practical representation it is desirable

More information

Cointegration, Stationarity and Error Correction Models.

Cointegration, Stationarity and Error Correction Models. Cointegration, Stationarity and Error Correction Models. STATIONARITY Wold s decomposition theorem states that a stationary time series process with no deterministic components has an infinite moving average

More information

3 Theory of stationary random processes

3 Theory of stationary random processes 3 Theory of stationary random processes 3.1 Linear filters and the General linear process A filter is a transformation of one random sequence {U t } into another, {Y t }. A linear filter is a transformation

More information

3 Time Series Regression

3 Time Series Regression 3 Time Series Regression 3.1 Modelling Trend Using Regression Random Walk 2 0 2 4 6 8 Random Walk 0 2 4 6 8 0 10 20 30 40 50 60 (a) Time 0 10 20 30 40 50 60 (b) Time Random Walk 8 6 4 2 0 Random Walk 0

More information

Multivariate Time Series: VAR(p) Processes and Models

Multivariate Time Series: VAR(p) Processes and Models Multivariate Time Series: VAR(p) Processes and Models A VAR(p) model, for p > 0 is X t = φ 0 + Φ 1 X t 1 + + Φ p X t p + A t, where X t, φ 0, and X t i are k-vectors, Φ 1,..., Φ p are k k matrices, with

More information

AR, MA and ARMA models

AR, MA and ARMA models AR, MA and AR by Hedibert Lopes P Based on Tsay s Analysis of Financial Time Series (3rd edition) P 1 Stationarity 2 3 4 5 6 7 P 8 9 10 11 Outline P Linear Time Series Analysis and Its Applications For

More information

covariance function, 174 probability structure of; Yule-Walker equations, 174 Moving average process, fluctuations, 5-6, 175 probability structure of

covariance function, 174 probability structure of; Yule-Walker equations, 174 Moving average process, fluctuations, 5-6, 175 probability structure of Index* The Statistical Analysis of Time Series by T. W. Anderson Copyright 1971 John Wiley & Sons, Inc. Aliasing, 387-388 Autoregressive {continued) Amplitude, 4, 94 case of first-order, 174 Associated

More information

An estimate of the long-run covariance matrix, Ω, is necessary to calculate asymptotic

An estimate of the long-run covariance matrix, Ω, is necessary to calculate asymptotic Chapter 6 ESTIMATION OF THE LONG-RUN COVARIANCE MATRIX An estimate of the long-run covariance matrix, Ω, is necessary to calculate asymptotic standard errors for the OLS and linear IV estimators presented

More information

7. MULTIVARATE STATIONARY PROCESSES

7. MULTIVARATE STATIONARY PROCESSES 7. MULTIVARATE STATIONARY PROCESSES 1 1 Some Preliminary Definitions and Concepts Random Vector: A vector X = (X 1,..., X n ) whose components are scalar-valued random variables on the same probability

More information

Chapter 2. Some basic tools. 2.1 Time series: Theory Stochastic processes

Chapter 2. Some basic tools. 2.1 Time series: Theory Stochastic processes Chapter 2 Some basic tools 2.1 Time series: Theory 2.1.1 Stochastic processes A stochastic process is a sequence of random variables..., x 0, x 1, x 2,.... In this class, the subscript always means time.

More information

Notes on Time Series Modeling

Notes on Time Series Modeling Notes on Time Series Modeling Garey Ramey University of California, San Diego January 17 1 Stationary processes De nition A stochastic process is any set of random variables y t indexed by t T : fy t g

More information

Volatility. Gerald P. Dwyer. February Clemson University

Volatility. Gerald P. Dwyer. February Clemson University Volatility Gerald P. Dwyer Clemson University February 2016 Outline 1 Volatility Characteristics of Time Series Heteroskedasticity Simpler Estimation Strategies Exponentially Weighted Moving Average Use

More information

Introduction to Signal Processing

Introduction to Signal Processing to Signal Processing Davide Bacciu Dipartimento di Informatica Università di Pisa bacciu@di.unipi.it Intelligent Systems for Pattern Recognition Signals = Time series Definitions Motivations A sequence

More information

Multivariate Time Series Analysis and Its Applications [Tsay (2005), chapter 8]

Multivariate Time Series Analysis and Its Applications [Tsay (2005), chapter 8] 1 Multivariate Time Series Analysis and Its Applications [Tsay (2005), chapter 8] Insights: Price movements in one market can spread easily and instantly to another market [economic globalization and internet

More information

Time Series Methods. Sanjaya Desilva

Time Series Methods. Sanjaya Desilva Time Series Methods Sanjaya Desilva 1 Dynamic Models In estimating time series models, sometimes we need to explicitly model the temporal relationships between variables, i.e. does X affect Y in the same

More information

Random Processes. DS GA 1002 Probability and Statistics for Data Science.

Random Processes. DS GA 1002 Probability and Statistics for Data Science. Random Processes DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Aim Modeling quantities that evolve in time (or space)

More information

Introduction to Econometrics

Introduction to Econometrics Introduction to Econometrics T H I R D E D I T I O N Global Edition James H. Stock Harvard University Mark W. Watson Princeton University Boston Columbus Indianapolis New York San Francisco Upper Saddle

More information

Stochastic process for macro

Stochastic process for macro Stochastic process for macro Tianxiao Zheng SAIF 1. Stochastic process The state of a system {X t } evolves probabilistically in time. The joint probability distribution is given by Pr(X t1, t 1 ; X t2,

More information

TAKEHOME FINAL EXAM e iω e 2iω e iω e 2iω

TAKEHOME FINAL EXAM e iω e 2iω e iω e 2iω ECO 513 Spring 2015 TAKEHOME FINAL EXAM (1) Suppose the univariate stochastic process y is ARMA(2,2) of the following form: y t = 1.6974y t 1.9604y t 2 + ε t 1.6628ε t 1 +.9216ε t 2, (1) where ε is i.i.d.

More information