Generalised AR and MA Models and Applications


Chapter 3: Generalised AR and MA Models and Applications

3.1 Generalised Autoregressive Processes

Consider an AR(1) process given by (1 − αB)X_t = Z_t, |α| < 1. In this case the acf is ρ_k = α^k for k ≥ 0, and the pacf is

  φ_kk = 1 for k = 0; φ_11 = α; φ_kk = 0 for k ≥ 2,

with spectral density function

  f_X(ω) = σ² / (2π(1 − 2α cos ω + α²)), −π ≤ ω ≤ π.

If you consider some simulations of time series generated by (1 − αB)^δ X_t = Z_t, |α| < 1, δ > 0, you notice that the ACF, PACF and spectrum are similar for various values of δ. It is therefore difficult to distinguish these cases from the standard AR(1) case with δ = 1. We call the class of time series generated by (1 − αB)^δ X_t = Z_t a generalised first-order autoregression, or GAR(1). The parameter δ may increase the accuracy of forecast values. We now discuss some properties of this new class of GAR(1) models.

3.1.1 Some basic results

1. Gamma function:

  Γ(x) = ∫₀^∞ t^(x−1) e^(−t) dt for x > 0; Γ(x) is undefined for x = 0, −1, −2, ...; for other x < 0 we use Γ(x) = x^(−1) Γ(x + 1).

It can be seen that Γ(x) = (x − 1)Γ(x − 1), so when x is a positive integer, Γ(x) = (x − 1)!. Furthermore, Γ(1/2) = √π.

2. Beta function:

  B(x, y) = ∫₀¹ t^(x−1) (1 − t)^(y−1) dt.

It can be shown that B(x, y) = Γ(x)Γ(y) / Γ(x + y).

3. Binomial coefficients, for any n ∈ ℝ:

  C(n, k) = n(n − 1)···(n − k + 1) / (k(k − 1)···1) = Γ(n + 1) / (Γ(k + 1)Γ(n − k + 1)).

When n is a positive integer, C(n, k) = n! / ((n − k)! k!). Note that you can't define factorials when n is not a non-negative integer.

4. Hypergeometric function:

  F(a, b; c; x) = 1 + (ab / 1!c) x + (a(a+1)b(b+1) / 2!c(c+1)) x² + (a(a+1)(a+2)b(b+1)(b+2) / 3!c(c+1)(c+2)) x³ + ···

We can re-express this using Pochhammer notation: (a)_j = a(a+1)···(a+j−1) with (a)_0 = 1; then

  F(a, b; c; x) = Σ_{j≥0} (a)_j (b)_j x^j / (j! (c)_j).

Further, since (a)_j = Γ(a+j)/Γ(a), we have

  F(a, b; c; x) = Σ_{j≥0} Γ(a+j)Γ(b+j)Γ(c) x^j / (Γ(j+1)Γ(a)Γ(b)Γ(c+j)).

Now, our model is (1 − αB)^δ X_t = Z_t, or equivalently X_t = (1 − αB)^(−δ) Z_t. So we can look at the general binomial expansion:

  (1 − x)^(−1) = 1 + x + x² + x³ + ···

and

  (1 − x)^n = Σ_{r≥0} C(n, r) (−x)^r;

when n is a positive integer, the above expression cuts off at a particular point. For our purposes we have, for δ ∈ ℝ,

  (1 − αB)^δ = Σ_{j≥0} C(δ, j) (−αB)^j = Σ_{j≥0} (−1)^j C(δ, j) (αB)^j.

Let

  π_j = (−1)^j C(δ, j) = (−1)^j δ(δ−1)···(δ−j+1) / j! = (−δ)(−δ+1)···(−δ+j−1) / j! = Γ(j − δ) / (Γ(−δ)Γ(j + 1)).

Thus (1 − αB)^δ X_t = Z_t is equivalent to the AR(∞) process

  Σ_{j≥0} π_j α^j X_{t−j} = Z_t,

which can be expressed in the equivalent MA(∞) form X_t = (1 − αB)^(−δ) Z_t. We can derive the form of the weights as above, just changing −δ to +δ:

  X_t = Σ_{j≥0} ψ_j α^j Z_{t−j}, where ψ_j = Γ(j + δ) / (Γ(δ)Γ(j + 1))

(i.e. π_j with −δ changed to +δ).
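
As a quick numerical check, the AR(∞) and MA(∞) weights can be computed directly from the gamma-function forms above. A minimal sketch in Python (function names are illustrative); since (1 − x)^δ (1 − x)^(−δ) = 1, the two sets of weights should convolve to zero at every lag k ≥ 1:

```python
import math

def psi(j, delta):
    # MA(infinity) weight: psi_j = Gamma(j + delta) / (Gamma(delta) Gamma(j + 1))
    return math.gamma(j + delta) / (math.gamma(delta) * math.gamma(j + 1))

def pi_w(j, delta):
    # AR(infinity) weight: pi_j = Gamma(j - delta) / (Gamma(-delta) Gamma(j + 1))
    return math.gamma(j - delta) / (math.gamma(-delta) * math.gamma(j + 1))

delta = 0.3
print(psi(1, delta))   # psi_1 = delta
print(pi_w(1, delta))  # pi_1 = -delta
# coefficient of (alpha B)^3 in (1 - alpha B)^delta (1 - alpha B)^(-delta): zero
print(sum(pi_w(j, delta) * psi(3 - j, delta) for j in range(4)))
```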

3.1.2 Convergence

Theorem. X_t = Σ_{j≥0} ψ_j α^j Z_{t−j} converges in the mean square sense.

Proof. E(X_t²) = σ² Σ_{j≥0} ψ_j² α^(2j). Recall the ratio test of convergence: for a series with terms u_1, u_2, ..., let

  L = lim_{j→∞} |u_{j+1}/u_j|;

if L < 1 then Σ u_j converges; if L = 1 there is no decision; and if L > 1 then Σ u_j diverges. Letting u_j = ψ_j² α^(2j), then

  u_{j+1}/u_j = (ψ_{j+1}/ψ_j)² α² = (Γ(j+1+δ)Γ(j+1) / (Γ(j+δ)Γ(j+2)))² α² = ((j+δ)/(j+1))² α²,

so

  lim_{j→∞} u_{j+1}/u_j = lim_{j→∞} ((1 + δ/j)/(1 + 1/j))² α² = α².

Thus, as |α| < 1, the result follows. ∎

Autocovariance function:

  γ_k = cov(X_t, X_{t−k}) = E(X_t X_{t−k})
      = E[(Σ_{j≥0} ψ_j α^j Z_{t−j})(Σ_{l≥0} ψ_l α^l Z_{t−k−l})]
      = σ² (ψ_k α^k + ψ_{k+1} ψ_1 α^(k+2) + ψ_{k+2} ψ_2 α^(k+4) + ···)
      = σ² α^k Σ_{j≥0} ψ_{j+k} ψ_j α^(2j).
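
The limiting ratio in the convergence proof above is easy to verify numerically. A small sketch (log-gamma is used so the weights can be evaluated at large j without overflow):

```python
import math

def psi(j, delta):
    # psi_j = Gamma(j + delta) / (Gamma(delta) Gamma(j + 1)), via log-gamma
    return math.exp(math.lgamma(j + delta) - math.lgamma(delta) - math.lgamma(j + 1))

alpha, delta = 0.6, 0.4
# u_j = psi_j^2 alpha^(2j); the ratio u_{j+1}/u_j should approach alpha^2 = 0.36
for j in (10, 100, 1000):
    ratio = (psi(j + 1, delta) / psi(j, delta)) ** 2 * alpha ** 2
    print(j, ratio)
```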

When k = 0, γ_0 = var(X_t) = σ² Σ_{j≥0} ψ_j² α^(2j). Note that when δ = 1, ψ_j = Γ(j+1)/(Γ(j+1)Γ(1)) = 1 and we collapse back to our standard case:

  γ_0 = var(X_t) = σ² Σ_{j≥0} α^(2j) = σ² / (1 − α²).

Recall the hypergeometric function

  F(θ_1, θ_2; θ_3; θ) = Σ_{j≥0} Γ(θ_1+j)Γ(θ_2+j)Γ(θ_3) θ^j / (Γ(θ_1)Γ(θ_2)Γ(θ_3+j)Γ(j+1)).

Now we have

  γ_0 = σ² Σ_{j≥0} ψ_j² α^(2j) = σ² Σ_{j≥0} [Γ(j+δ)/(Γ(j+1)Γ(δ))]² α^(2j) = σ² F(δ, δ; 1; α²),

i.e. a hypergeometric function with θ_1 = θ_2 = δ, θ_3 = 1 and θ = α². Similarly, we can re-express the autocovariance function as

  γ_k = σ² α^k Σ_{j≥0} ψ_{k+j} ψ_j α^(2j) = σ² α^k (Γ(k+δ)/(Γ(δ)Γ(k+1))) F(δ, k+δ; k+1; α²).

Combining these results we have the autocorrelation function:

  ρ_k = γ_k/γ_0 = α^k Γ(k+δ) F(δ, k+δ; k+1; α²) / (Γ(δ)Γ(k+1) F(δ, δ; 1; α²)).

Spectral Density Function

Recall that when X_t = G(B)Z_t,

  f_X(ω) = |G(e^(−iω))|² f_Z(ω).

AR(1) case: (1 − αB)X_t = Z_t, or X_t = (1 − αB)^(−1) Z_t:

  f_X(ω) = |1 − αe^(−iω)|^(−2) f_Z(ω) = (1 − 2α cos ω + α²)^(−1) f_Z(ω).

If {Z_t} ~ WN(0, σ²) then f_Z(ω) = σ²/(2π), −π < ω < π.
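
The acf above is easy to evaluate by truncating the hypergeometric series; a sketch (the `hyp2f1` here is a naive truncated sum for |x| < 1, not a library routine). For δ = 1 the acf should collapse to the AR(1) value α^k:

```python
import math

def hyp2f1(a, b, c, x, terms=500):
    # truncated series F(a,b;c;x) = sum_j (a)_j (b)_j x^j / (j! (c)_j), |x| < 1
    total, term = 1.0, 1.0
    for j in range(terms):
        term *= (a + j) * (b + j) * x / ((c + j) * (j + 1))
        total += term
    return total

def rho(k, alpha, delta):
    # rho_k = alpha^k Gamma(k+delta) F(delta, k+delta; k+1; alpha^2)
    #         / (Gamma(delta) Gamma(k+1) F(delta, delta; 1; alpha^2))
    num = alpha ** k * math.gamma(k + delta) * hyp2f1(delta, k + delta, k + 1, alpha ** 2)
    den = math.gamma(delta) * math.gamma(k + 1) * hyp2f1(delta, delta, 1, alpha ** 2)
    return num / den

print(rho(2, 0.5, 1.0))  # delta = 1 gives the AR(1) value alpha^2 = 0.25
```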

GAR(1) case: (1 − αB)^δ X_t = Z_t, and we have

  f_X(ω) = (1 − 2α cos ω + α²)^(−δ) σ²/(2π), −π < ω < π.

Thus the basic properties (turning points, etc.) don't change. Therefore it's hard to identify a δ class from an integer class, as they have very similar properties.

3.2 Parameter Estimation

3.2.1 Maximum Likelihood Estimation

We need to give a distribution to {Z_t} rather than just white noise, so we assume {Z_t} ~ NID(0, σ²). Then

  L = (2π)^(−n/2) |Σ|^(−1/2) exp{−(1/2) X^T Σ^(−1) X},

where X = (X_1, ..., X_n)^T and Σ = E(XX^T). We can do MLE by numerical maximisation of L, which gives the ML estimates.

3.2.2 Periodogram Estimation

An alternative method of estimating α, δ and σ² is to take

  log f_X(ω) = −δ log(1 − 2α cos ω + α²) + C

and let I_X(ω) be an estimate (based on the periodogram) of f_X(ω). Then, writing log I_X(ω) = log f_X(ω) + log(I_X(ω)/f_X(ω)), we have

  log I_X(ω) = −δ log(1 − 2α cos ω + α²) + log(I_X(ω)/f_X(ω)) + C.

Now, because the periodogram ordinates I_X(ω_j) are (approximately) uncorrelated random variables, we can write this as a regression-type equation by letting

  y_j = log I_X(ω_j),  x_j = log(1 − 2α cos ω_j + α²),  b = −δ,  e_j = log(I_X(ω_j)/f_X(ω_j)),  a = C.

Putting the above together we have y_j = a + b x_j + e_j.

We can then estimate the parameters by minimising the sum of squared errors,

  Σ_j e_j² = Σ_j (y_j − a − b x_j)²,

but we cannot simply differentiate, as our parameters are embedded (α appears inside x_j); therefore we use numerical minimisation. In other words, since there is no explicit solution, we need to use a numerical method.

3.2.3 Whittle Estimation Procedure

See Shitan and Peiris (2008).

3.3 Generalised Moving Average Models

Standard MA(1) models are defined as X_t = Z_t − θZ_{t−1} = (1 − θB)Z_t. We won't go into much detail, but we can write the Generalised Moving Average process of order 1, GMA(1), as

  X_t = (1 − θB)^δ Z_t.

We can get all of our results, as with the GAR(1), by changing the sign of δ and letting θ = α.
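
Returning to the periodogram estimation of §3.2.2: since α enters the regressor x_j = log(1 − 2α cos ω_j + α²) nonlinearly, the simplest numerical minimisation to illustrate is a grid search. In this noiseless sketch (all values synthetic, a = 0, α = 0.5, δ = 0.8; the optimal intercept is profiled out in closed form) the search recovers the generating parameters:

```python
import math

def x_of(w, alpha):
    # regressor: log(1 - 2 alpha cos w + alpha^2)
    return math.log(1.0 - 2.0 * alpha * math.cos(w) + alpha ** 2)

# synthetic noiseless log-spectrum with a = 0, alpha = 0.5, delta = 0.8
omegas = [2.0 * math.pi * j / 256 for j in range(1, 100)]
y = [-0.8 * x_of(w, 0.5) for w in omegas]

def sse(alpha, delta):
    xs = [x_of(w, alpha) for w in omegas]
    a = sum(yj + delta * xj for yj, xj in zip(y, xs)) / len(y)  # optimal intercept
    return sum((yj - a + delta * xj) ** 2 for yj, xj in zip(y, xs))

grid = [(al / 100.0, de / 100.0) for al in range(30, 71, 2) for de in range(50, 111, 2)]
best = min(grid, key=lambda p: sse(*p))
print(best)  # -> (0.5, 0.8)
```

With a real periodogram the e_j are noisy, so the minimiser only approximates the truth, and a proper optimiser would replace the grid.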

Chapter 4: Analysis and Applications of Long Memory Time Series

Let {X_t} be a stationary time series and let ρ_k be its ACF at lag k. {X_t} is said to have:

1. No memory if ρ_k = 0 for all k ≠ 0.

2. Short memory if ρ_k ≈ φ^k, |φ| < 1. That is, ρ_k decays to zero at an exponential rate. In this case, it is clear that there exists a constant A > 1 such that A^k ρ_k → c as k → ∞. Equivalently, ρ_k ≈ cφ^k and hence Σ|ρ_k| < ∞.

3. Long memory if ρ_k ≈ k^(−δ), 0 < δ < 1. That is, ρ_k decays to zero slowly, at a hyperbolic rate. In this case, k^δ ρ_k → c as k → ∞.

4.1 Properties of Long Memory Time Series

With an acf ρ_k ≈ k^(−δ), 0 < δ < 1, we can check whether Σ ρ_k is convergent by recalling that Σ_r r^(−p) is convergent for p > 1 and divergent for p ≤ 1. Thus Σ ρ_k ≈ Σ_k k^(−δ) is divergent, as 0 < δ < 1. So Σ ρ_k = ∞; however, ρ_k → 0 as k → ∞, so we still have a stationary process. Note that the acf gives an indication of the memory type of the underlying process.
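
The contrast between the two decay rates is easy to see numerically: partial sums of an exponentially decaying acf bound settle at a finite limit, while those of a hyperbolically decaying one keep growing (like n^(1−δ)). A small sketch with illustrative values φ = 0.8 and δ = 0.4:

```python
# short memory: sum of 0.8^k over k >= 1 converges to 0.8/0.2 = 4
short_sum = sum(0.8 ** k for k in range(1, 200))
print(short_sum)

# long memory: rho_k ~ k^(-0.4); partial sums grow like n^0.6 without bound
partials = [sum(k ** -0.4 for k in range(1, n + 1)) for n in (10**3, 10**4, 10**5)]
print(partials)
```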

We can also look at the spectral properties. The sdf of a stationary process {X_t} is

  f_X(ω) = (1/2π) Σ_{k=−∞}^{∞} ρ_k e^(−iωk) = (1/2π)[1 + 2 Σ_{k≥1} ρ_k cos ωk], −π < ω < π.

1. No memory processes have ρ_k = 0 for k ≠ 0, which implies that

  f_X(ω) = 1/(2π), −π < ω < π.

Thus the spectrum is uniformly distributed over (−π, π).

2. Short memory processes, such as ARMA-type processes, have an sdf for all ω. If we look at the sdf at ω = 0,

  f_X(0) = (1/2π)[1 + 2 Σ_{k≥1} ρ_k] = c < ∞.

So for short memory processes we get a bounded point on the f_X(ω) axis.

3. Long memory processes have a discontinuity at ω = 0:

  f_X(0) = (1/2π)[1 + 2 Σ_{k≥1} ρ_k] = ∞.

Thus the sdf is unbounded at ω = 0. Note that typically we plot with ω on the x-axis and f_X(ω) on the y-axis, so a short memory process will be continuous when crossing the y-axis, but a long memory process will only approach the y-axis asymptotically.

4.2 Identification

1. Calculate the sample acf, r_k.

2. Calculate the sample sdf, or periodogram, using

  I_n(ω_j) = (1/n) |Σ_{t=1}^{n} x_t e^(−itω_j)|², −π < ω_j = 2πj/n < π.

3. Hurst coefficient. The famous hydrologist Hurst (1951) noticed that there is a relationship between the rescaled adjusted range, R, and long memory, or persistent, time series.
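
A direct O(n²) implementation of the periodogram at the Fourier frequencies, for illustration only (real work would use an FFT). For the alternating sequence 1, −1, 1, −1 all the power sits at frequency π:

```python
import cmath, math

def periodogram(x):
    # I_n(w_j) = (1/n) |sum_{t=1}^n x_t e^{-i t w_j}|^2 at w_j = 2 pi j / n
    n = len(x)
    out = {}
    for j in range(1, n // 2 + 1):
        w = 2.0 * math.pi * j / n
        s = sum(xt * cmath.exp(-1j * t * w) for t, xt in enumerate(x, start=1))
        out[w] = abs(s) ** 2 / n
    return out

p = periodogram([1.0, -1.0, 1.0, -1.0])
print(p)  # power concentrates at w = pi
```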

The rescaled adjusted range is defined as

  R = max_{1≤i≤n} [Y_i − Y_1 − (i/n)(Y_n − Y_1)] − min_{1≤i≤n} [Y_i − Y_1 − (i/n)(Y_n − Y_1)],

where Y_i = Σ_{t=1}^{i} X_t. We further define the adjusted standard deviation as

  S = sqrt((1/n) Σ_{i=1}^{n} (x_i − x̄)²).

The plots of log(R/S) against log n were observed for many time series, and it was noticed that for persistent time series (a) the points were scattered around a straight line, and (b) the slope was always greater than a half. Therefore we can write a regression line

  log(R/S) = a + H log(n),

where the slope coefficient H is called the Hurst coefficient. An estimate is Ĥ = log(R/S)/log n.

4.3 Fractional Integration and Long Memory Time Series Modelling

In many time series problems we usually consider the order of differencing, d, as either 0, 1 or 2; that is, the dth difference of {X_t} is stationary. Let Y_t = (1 − B)^d X_t. A notation: X_t ~ I(d). If {X_t} ~ I(0) then no differencing is required to get a stationary series and the acf decays exponentially. If X_t ~ I(1) then X_t is non-stationary, e.g. X_t = a + bt + Z_t, so E(X_t − X_{t−1}) = b. Further, X_t ~ I(2) corresponds to X_t = a + bt + ct² + Z_t: letting u_t = X_t − X_{t−1}, then v_t = u_t − u_{t−1} has constant mean. Thus we cannot accommodate the hyperbolic decay of the acf by integer differencing. It has been noticed that when −1/2 < d < 1/2 and d ≠ 0, or equivalently 0 < |d| < 1/2, one can accommodate this hyperbolic decay of the acf. In particular, we will show that when 0 < d < 1/2, the solution of (1 − B)^d X_t = Z_t represents a stationary, long memory time series. Note that in general, for −1/2 < d < 1/2, applying (1 − B)^d is called fractional differencing (or integration).
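
The R/S calculation above, together with the one-sample estimate Ĥ = log(R/S)/log n, can be sketched as follows (a crude estimator; in practice one regresses log(R/S) on log n over several sample sizes). For white noise the estimate should be close to 1/2:

```python
import math, random

def hurst_rs(x):
    # rescaled adjusted range: R = max_i [Y_i - Y_1 - (i/n)(Y_n - Y_1)] - min_i [same]
    n = len(x)
    y, c = [], 0.0
    for v in x:
        c += v
        y.append(c)  # partial sums Y_i
    adj = [y[i] - y[0] - ((i + 1) / n) * (y[-1] - y[0]) for i in range(n)]
    r = max(adj) - min(adj)
    xbar = sum(x) / n
    s = math.sqrt(sum((v - xbar) ** 2 for v in x) / n)  # adjusted standard deviation
    return math.log(r / s) / math.log(n)  # one-sample Hurst estimate

random.seed(0)
wn = [random.gauss(0.0, 1.0) for _ in range(2000)]
print(hurst_rs(wn))  # white noise: close to 1/2
```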

4.3.1 FARIMA(p, d, q)

If we fit an ARMA(p, q) model to {Y_t}, where Y_t = (1 − B)^d X_t, we get

  Φ(B)Y_t = Θ(B)Z_t,

where {Z_t} ~ WN(0, σ²) and −1/2 < d < 1/2; we then have what's called a fractionally integrated ARIMA, or FARIMA(p, d, q), model. To develop the theory, we first look at FARIMA(0, d, 0), or FDWN (fractionally differenced white noise), given by

  (1 − B)^d X_t = Z_t, or ∇^d X_t = Z_t.

Note that from here on we take d as a fractional real number, with −1/2 < d < 1/2.

Theorem. When d < 1/2, {X_t} is stationary and equivalent to

  X_t = ∇^(−d) Z_t = Σ_{j≥0} ψ_j Z_{t−j}, where ψ_j = Γ(j + d) / (Γ(j + 1)Γ(d)).

When d > −1/2, {X_t} is invertible and equivalent to

  Σ_{j≥0} π_j X_{t−j} = Z_t, where π_j = Γ(j − d) / (Γ(j + 1)Γ(−d)).

Thus you need −1/2 < d < 1/2; otherwise you still have fractional integration, but with stationarity and no invertibility, or invertibility without stationarity.

Proof. 1. We have var(X_t) = σ² Σ_{j≥0} ψ_j². Recall that Γ(x + a)/Γ(x + b) ≈ x^(a−b) for large x, so ψ_j ≈ j^(d−1)/Γ(d) for large j, and therefore

  var(X_t) ≈ (σ²/Γ(d)²) Σ j^(2(d−1)).

Now, recalling that Σ_r r^(−p) is convergent for p > 1 and divergent for p ≤ 1, we require var(X_t) < ∞, which will only occur if 2(1 − d) > 1, i.e. d < 1/2.

2. Similarly, for invertibility we need Σ_j π_j² < ∞, which only occurs when d > −1/2. ∎

Exercise. Show that the sdf of fractionally differenced white noise is given by

  f_X(ω) = (σ²/2π) [2 sin(ω/2)]^(−2d).

Recall that if X_t and Y_t are stationary with Y_t = G(B)X_t, then f_Y(ω) = |G(e^(−iω))|² f_X(ω). So with X_t = (1 − B)^(−d) Z_t = G(B)Z_t we have

  f_X(ω) = |1 − e^(−iω)|^(−2d) f_Z(ω)
         = |(1 − cos ω) + i sin ω|^(−2d) f_Z(ω)
         = [(1 − cos ω)² + sin² ω]^(−d) f_Z(ω)
         = [2 − 2 cos ω]^(−d) f_Z(ω)
         = [4 sin²(ω/2)]^(−d) f_Z(ω)
         = [2 sin(ω/2)]^(−2d) f_Z(ω).

Note that we've used the trigonometric identity 1 − cos A = 2 sin²(A/2).

Note that for 0 < d < 1/2, f_X(ω) → ∞ as ω → 0. We can also see this by noting that for small ω, sin(ω/2) ≈ ω/2, so near zero f_X(ω) ≈ ω^(−2d) f_Z(ω), and as ω → 0 we have a discontinuity. When −1/2 < d < 0, f_X(ω) exists near ω = 0, and the series has short memory behaviour. It is clear that a FARIMA process has long memory properties when 0 < d < 1/2.

Exercise. Let γ_k be the autocovariance at lag k of an FDWN. Show that

1. γ_k = σ² (−1)^k Γ(1 − 2d) / (Γ(k − d + 1)Γ(1 − k − d));
2. ρ_k = Γ(k + d)Γ(1 − d) / (Γ(k − d + 1)Γ(d)).

Exercise. Find the pacf at lags 1 and 2 of an FDWN. (Ans: d/(1 − d) and d/(2 − d).)
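
Both the weight asymptotics used in the proof and the acf formula from the exercise can be checked numerically; a sketch with d = 0.3 (log-gamma avoids overflow at large j and k). In particular ρ_1 = d/(1 − d), and k^(1−2d) ρ_k approaches Γ(1 − d)/Γ(d), the hyperbolic decay rate:

```python
import math

d = 0.3

def psi(j):
    # psi_j = Gamma(j + d) / (Gamma(j + 1) Gamma(d)), via log-gamma
    return math.exp(math.lgamma(j + d) - math.lgamma(j + 1) - math.lgamma(d))

def rho(k):
    # rho_k = Gamma(k + d) Gamma(1 - d) / (Gamma(k - d + 1) Gamma(d))
    return math.exp(math.lgamma(k + d) + math.lgamma(1.0 - d)
                    - math.lgamma(k - d + 1.0) - math.lgamma(d))

# psi_j ~ j^(d-1) / Gamma(d) for large j (the rate used in the stationarity proof)
print(psi(5000) / (5000 ** (d - 1.0) / math.gamma(d)))  # ratio near 1

# rho_1 = d / (1 - d), and k^(1-2d) rho_k tends to Gamma(1-d)/Gamma(d)
print(rho(1), d / (1.0 - d))
print(1000 ** (1.0 - 2.0 * d) * rho(1000), math.gamma(1.0 - d) / math.gamma(d))
```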

4.3.2 Estimation of d

First note that f_X(ω) = [4 sin²(ω/2)]^(−d) f_Z(ω); therefore, taking logs, we have

  log f_X(ω) = −d log(4 sin²(ω/2)) + log f_Z(ω).

Let I_X(ω) be the periodogram based on n observations. Now,

  log I_X(ω) = −d log(4 sin²(ω/2)) + log(I_X(ω)/f_X(ω)) + log f_Z(ω)
             = log f_Z(0) − d log(4 sin²(ω/2)) + log(I_X(ω)/f_X(ω)) + log(f_Z(ω)/f_Z(0)).

Since Z_t ~ WN(0, σ²), f_Z(ω) = σ²/(2π) for −π < ω < π, so f_Z(ω)/f_Z(0) = 1 and log 1 = 0, and we can drop this term. (In general this will hold for ω near 0.) So, letting ω_j = 2πj/n and

  y_j = log I_X(ω_j),  x_j = log(4 sin²(ω_j/2)),  b = −d,  ε_j = log(I_X(ω_j)/f_X(ω_j)),  a = log f_Z(0),

we can write y_j = a + b x_j + ε_j over the Fourier frequencies, where the distribution of ε_j is unknown. In practice we set j = 1, 2, ..., m < n/2.

4.3.3 Distribution of ε_j

We've seen that for short memory time series, asymptotically

  I_X(ω_j) ~ (1/2) f_X(ω_j) χ²_2, j ≠ 0, n/2.

Theorem. Let X_t be a process generated by (1 − B)^d X_t = Z_t with −1/2 < d < 0. Then for large n,

  log(I_X(ω_j)/f_X(ω_j)), j ≠ 0, n/2,

follows an independent Gumbel-type distribution with mean −γ_E (γ_E = Euler's constant) and variance π²/6. Note that Y is said to have a Gumbel distribution with parameters α and β if the cdf of Y is

  F_Y(y) = exp{−e^(−(y−α)/β)};

in this case E(Y) = α + βγ_E and var(Y) = π²β²/6.

Proof. When d < 0,

  I_X(ω_j)/f_X(ω_j) → (1/2)χ²_2,

which is a standard exponential with pdf L_X(x) = e^(−x), E(X) = 1 and var(X) = 1. Let

  U = log(I_X(ω_j)/f_X(ω_j)).

Then

  F(u) = P(U ≤ u) = P(log(I_X(ω_j)/f_X(ω_j)) ≤ u) = P(I_X(ω_j)/f_X(ω_j) ≤ e^u) = 1 − exp{−e^u},

noting that if X ~ exp(λ) then P(X > x) = e^(−λx). Thus −U has a Gumbel distribution with α = 0 and β = 1, and therefore E(U) = −γ_E and var(U) = π²/6, neither of which depends on j; we therefore have an iid sequence and can use standard regression techniques. ∎

So our model is y_j = a + b x_j + ε_j, and our least squares estimate of b is

  b̂ = Σ_{j=1}^{m} (x_j − x̄) y_j / Σ_{j=1}^{m} (x_j − x̄)², m < n/2, with â = ȳ − b̂ x̄.

It is easy to see that

  var(b̂) = σ_ε² / Σ (x_j − x̄)² = (π²/6) / Σ (x_j − x̄)².

Let d̂_p be the corresponding estimate of d based on the periodogram approach: d̂_p = −b̂.
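
Putting §4.3.2–§4.3.3 together gives the log-periodogram regression estimate d̂_p = −b̂. The sketch below simulates an FDWN by truncating its MA(∞) representation (the truncation length and the choice m = 64 are ad hoc choices for this illustration), then regresses log I_X(ω_j) on log(4 sin²(ω_j/2)); the estimate should land near the true d = 0.3:

```python
import cmath, math, random

random.seed(42)
d, n, trunc = 0.3, 1024, 2000

# MA(infinity) weights psi_j = Gamma(j+d)/(Gamma(j+1)Gamma(d)), via log-gamma
psi = [math.exp(math.lgamma(j + d) - math.lgamma(j + 1) - math.lgamma(d))
       for j in range(trunc)]
z = [random.gauss(0.0, 1.0) for _ in range(n + trunc)]
x = [sum(psi[j] * z[trunc + t - j] for j in range(trunc)) for t in range(n)]

# log-periodogram regression on the first m Fourier frequencies
m = 64
xs, ys = [], []
for j in range(1, m + 1):
    w = 2.0 * math.pi * j / n
    s = sum(xt * cmath.exp(-1j * t * w) for t, xt in enumerate(x, start=1))
    ys.append(math.log(abs(s) ** 2 / n))               # y_j = log I_X(w_j)
    xs.append(math.log(4.0 * math.sin(w / 2.0) ** 2))  # x_j = log(4 sin^2(w_j/2))

xbar, ybar = sum(xs) / m, sum(ys) / m
b_hat = sum((u - xbar) * (v - ybar) for u, v in zip(xs, ys)) / \
        sum((u - xbar) ** 2 for u in xs)
d_hat = -b_hat
print(d_hat)  # near 0.3, up to sampling error
```

The theorem above covers −1/2 < d < 0; using the same regression for d > 0, as here, is the usual practice but the asymptotics are heuristic in this sketch.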

Under certain conditions and large n, we have the usual result:

  (d̂_p − d)/SE(d̂_p) ~ N(0, 1) approximately.

Now in the general case we have Y_t = (1 − B)^d X_t with {Y_t} an ARMA process, so

  f_X(ω) = [4 sin²(ω/2)]^(−d) f_Y(ω)

and

  log f_X(ω) = −d log(4 sin²(ω/2)) + log f_Y(ω);

thus

  log I_X(ω) = −d log(4 sin²(ω/2)) + log(I_X(ω)/f_X(ω)) + log f_Y(ω).

Note that at this stage we don't have a regression, as log f_Y(ω) is not a constant; therefore we add and subtract log f_Y(0):

  log I_X(ω) = log f_Y(0) − d log(4 sin²(ω/2)) + log(I_X(ω)/f_X(ω)) + log(f_Y(ω)/f_Y(0)).

Now log(f_Y(ω)/f_Y(0)) is not a constant for all ω_j (as it was previously), but it is approximately zero when ω_j ≈ 0. Thus this can be considered as a simple linear regression equation if we consider only the values of ω_j near the origin:

  y_j = a + b x_j + ε_j;  ω_j = 2πj/n;  j = 1, 2, ..., m ≪ n.

In practice we usually take m = n^a with 0 < a < 1; a popular choice is a = 1/2.

4.3.4 Estimation of d using a smoothed periodogram

We can also estimate d using a smoothed, or weighted, periodogram. Let

  f̂_s(ω) = (1/2π) Σ_{|s|<m} λ(s) R̂(s) e^(−iωs)

be a smoothed periodogram based on a suitable lag window λ(s). In this case,

  f̂_s(ω)/f(ω) ≈ a χ²_ν, with a = 1/ν and ν = 2n / Σ_{|s|<n} λ²(s).

References:
Reisen, V. (1994), J. Time Series Analysis 15(3).
Chen, Abraham and Peiris (1994), J. Time Series Analysis 15(5).

Generalized Fractional Processes



More information

Stochastic Processes: I. consider bowl of worms model for oscilloscope experiment:

Stochastic Processes: I. consider bowl of worms model for oscilloscope experiment: Stochastic Processes: I consider bowl of worms model for oscilloscope experiment: SAPAscope 2.0 / 0 1 RESET SAPA2e 22, 23 II 1 stochastic process is: Stochastic Processes: II informally: bowl + drawing

More information

3. ARMA Modeling. Now: Important class of stationary processes

3. ARMA Modeling. Now: Important class of stationary processes 3. ARMA Modeling Now: Important class of stationary processes Definition 3.1: (ARMA(p, q) process) Let {ɛ t } t Z WN(0, σ 2 ) be a white noise process. The process {X t } t Z is called AutoRegressive-Moving-Average

More information

Forecasting using R. Rob J Hyndman. 2.4 Non-seasonal ARIMA models. Forecasting using R 1

Forecasting using R. Rob J Hyndman. 2.4 Non-seasonal ARIMA models. Forecasting using R 1 Forecasting using R Rob J Hyndman 2.4 Non-seasonal ARIMA models Forecasting using R 1 Outline 1 Autoregressive models 2 Moving average models 3 Non-seasonal ARIMA models 4 Partial autocorrelations 5 Estimation

More information

Chapter 6: Model Specification for Time Series

Chapter 6: Model Specification for Time Series Chapter 6: Model Specification for Time Series The ARIMA(p, d, q) class of models as a broad class can describe many real time series. Model specification for ARIMA(p, d, q) models involves 1. Choosing

More information

2. An Introduction to Moving Average Models and ARMA Models

2. An Introduction to Moving Average Models and ARMA Models . An Introduction to Moving Average Models and ARMA Models.1 White Noise. The MA(1) model.3 The MA(q) model..4 Estimation and forecasting of MA models..5 ARMA(p,q) models. The Moving Average (MA) models

More information

Time Series Analysis -- An Introduction -- AMS 586

Time Series Analysis -- An Introduction -- AMS 586 Time Series Analysis -- An Introduction -- AMS 586 1 Objectives of time series analysis Data description Data interpretation Modeling Control Prediction & Forecasting 2 Time-Series Data Numerical data

More information

Statistics of Stochastic Processes

Statistics of Stochastic Processes Prof. Dr. J. Franke All of Statistics 4.1 Statistics of Stochastic Processes discrete time: sequence of r.v...., X 1, X 0, X 1, X 2,... X t R d in general. Here: d = 1. continuous time: random function

More information

Time Series Econometrics 4 Vijayamohanan Pillai N

Time Series Econometrics 4 Vijayamohanan Pillai N Time Series Econometrics 4 Vijayamohanan Pillai N Vijayamohan: CDS MPhil: Time Series 5 1 Autoregressive Moving Average Process: ARMA(p, q) Vijayamohan: CDS MPhil: Time Series 5 2 1 Autoregressive Moving

More information

1 Linear Difference Equations

1 Linear Difference Equations ARMA Handout Jialin Yu 1 Linear Difference Equations First order systems Let {ε t } t=1 denote an input sequence and {y t} t=1 sequence generated by denote an output y t = φy t 1 + ε t t = 1, 2,... with

More information

AN ILLUSTRATION OF GENERALISED ARMA (GARMA) TIME SERIES MODELING OF FOREST AREA IN MALAYSIA

AN ILLUSTRATION OF GENERALISED ARMA (GARMA) TIME SERIES MODELING OF FOREST AREA IN MALAYSIA International Conference Mathematical and Computational Biology 2011 International Journal of Modern Physics: Conference Series Vol. 9 (2012) 390 397 c World Scientific Publishing Company DOI: 10.1142/S2010194512005466

More information

Lecture 2: Univariate Time Series

Lecture 2: Univariate Time Series Lecture 2: Univariate Time Series Analysis: Conditional and Unconditional Densities, Stationarity, ARMA Processes Prof. Massimo Guidolin 20192 Financial Econometrics Spring/Winter 2017 Overview Motivation:

More information

THE ROYAL STATISTICAL SOCIETY 2009 EXAMINATIONS SOLUTIONS GRADUATE DIPLOMA MODULAR FORMAT MODULE 3 STOCHASTIC PROCESSES AND TIME SERIES

THE ROYAL STATISTICAL SOCIETY 2009 EXAMINATIONS SOLUTIONS GRADUATE DIPLOMA MODULAR FORMAT MODULE 3 STOCHASTIC PROCESSES AND TIME SERIES THE ROYAL STATISTICAL SOCIETY 9 EXAMINATIONS SOLUTIONS GRADUATE DIPLOMA MODULAR FORMAT MODULE 3 STOCHASTIC PROCESSES AND TIME SERIES The Society provides these solutions to assist candidates preparing

More information

Basics: Definitions and Notation. Stationarity. A More Formal Definition

Basics: Definitions and Notation. Stationarity. A More Formal Definition Basics: Definitions and Notation A Univariate is a sequence of measurements of the same variable collected over (usually regular intervals of) time. Usual assumption in many time series techniques is that

More information

Multivariate Time Series: VAR(p) Processes and Models

Multivariate Time Series: VAR(p) Processes and Models Multivariate Time Series: VAR(p) Processes and Models A VAR(p) model, for p > 0 is X t = φ 0 + Φ 1 X t 1 + + Φ p X t p + A t, where X t, φ 0, and X t i are k-vectors, Φ 1,..., Φ p are k k matrices, with

More information

Chapter 12: An introduction to Time Series Analysis. Chapter 12: An introduction to Time Series Analysis

Chapter 12: An introduction to Time Series Analysis. Chapter 12: An introduction to Time Series Analysis Chapter 12: An introduction to Time Series Analysis Introduction In this chapter, we will discuss forecasting with single-series (univariate) Box-Jenkins models. The common name of the models is Auto-Regressive

More information

Statistics of stochastic processes

Statistics of stochastic processes Introduction Statistics of stochastic processes Generally statistics is performed on observations y 1,..., y n assumed to be realizations of independent random variables Y 1,..., Y n. 14 settembre 2014

More information

Forecasting. Simon Shaw 2005/06 Semester II

Forecasting. Simon Shaw 2005/06 Semester II Forecasting Simon Shaw s.c.shaw@maths.bath.ac.uk 2005/06 Semester II 1 Introduction A critical aspect of managing any business is planning for the future. events is called forecasting. Predicting future

More information

Advanced Econometrics

Advanced Econometrics Advanced Econometrics Marco Sunder Nov 04 2010 Marco Sunder Advanced Econometrics 1/ 25 Contents 1 2 3 Marco Sunder Advanced Econometrics 2/ 25 Music Marco Sunder Advanced Econometrics 3/ 25 Music Marco

More information

Discrete time processes

Discrete time processes Discrete time processes Predictions are difficult. Especially about the future Mark Twain. Florian Herzog 2013 Modeling observed data When we model observed (realized) data, we encounter usually the following

More information

Part IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015

Part IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015 Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.

More information

Econ 424 Time Series Concepts

Econ 424 Time Series Concepts Econ 424 Time Series Concepts Eric Zivot January 20 2015 Time Series Processes Stochastic (Random) Process { 1 2 +1 } = { } = sequence of random variables indexed by time Observed time series of length

More information

1. Fundamental concepts

1. Fundamental concepts . Fundamental concepts A time series is a sequence of data points, measured typically at successive times spaced at uniform intervals. Time series are used in such fields as statistics, signal processing

More information

Time Series: Theory and Methods

Time Series: Theory and Methods Peter J. Brockwell Richard A. Davis Time Series: Theory and Methods Second Edition With 124 Illustrations Springer Contents Preface to the Second Edition Preface to the First Edition vn ix CHAPTER 1 Stationary

More information

Ch 5. Models for Nonstationary Time Series. Time Series Analysis

Ch 5. Models for Nonstationary Time Series. Time Series Analysis We have studied some deterministic and some stationary trend models. However, many time series data cannot be modeled in either way. Ex. The data set oil.price displays an increasing variation from the

More information

Time Series Examples Sheet

Time Series Examples Sheet Lent Term 2001 Richard Weber Time Series Examples Sheet This is the examples sheet for the M. Phil. course in Time Series. A copy can be found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/ Throughout,

More information

APPLIED ECONOMETRIC TIME SERIES 4TH EDITION

APPLIED ECONOMETRIC TIME SERIES 4TH EDITION APPLIED ECONOMETRIC TIME SERIES 4TH EDITION Chapter 2: STATIONARY TIME-SERIES MODELS WALTER ENDERS, UNIVERSITY OF ALABAMA Copyright 2015 John Wiley & Sons, Inc. Section 1 STOCHASTIC DIFFERENCE EQUATION

More information

at least 50 and preferably 100 observations should be available to build a proper model

at least 50 and preferably 100 observations should be available to build a proper model III Box-Jenkins Methods 1. Pros and Cons of ARIMA Forecasting a) need for data at least 50 and preferably 100 observations should be available to build a proper model used most frequently for hourly or

More information

MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH. I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo

MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH. I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo Vol.4, No.2, pp.2-27, April 216 MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo ABSTRACT: This study

More information

ON THE CONVERGENCE OF FARIMA SEQUENCE TO FRACTIONAL GAUSSIAN NOISE. Joo-Mok Kim* 1. Introduction

ON THE CONVERGENCE OF FARIMA SEQUENCE TO FRACTIONAL GAUSSIAN NOISE. Joo-Mok Kim* 1. Introduction JOURNAL OF THE CHUNGCHEONG MATHEMATICAL SOCIETY Volume 26, No. 2, May 2013 ON THE CONVERGENCE OF FARIMA SEQUENCE TO FRACTIONAL GAUSSIAN NOISE Joo-Mok Kim* Abstract. We consider fractional Gussian noise

More information

Non-parametric Inference and Resampling

Non-parametric Inference and Resampling Non-parametric Inference and Resampling Exercises by David Wozabal (Last update. Juni 010) 1 Basic Facts about Rank and Order Statistics 1.1 10 students were asked about the amount of time they spend surfing

More information

Time Series Outlier Detection

Time Series Outlier Detection Time Series Outlier Detection Tingyi Zhu July 28, 2016 Tingyi Zhu Time Series Outlier Detection July 28, 2016 1 / 42 Outline Time Series Basics Outliers Detection in Single Time Series Outlier Series Detection

More information

Econometrics I: Univariate Time Series Econometrics (1)

Econometrics I: Univariate Time Series Econometrics (1) Econometrics I: Dipartimento di Economia Politica e Metodi Quantitativi University of Pavia Overview of the Lecture 1 st EViews Session VI: Some Theoretical Premises 2 Overview of the Lecture 1 st EViews

More information

Applied time-series analysis

Applied time-series analysis Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 18, 2011 Outline Introduction and overview Econometric Time-Series Analysis In principle,

More information

Contents. 1 Time Series Analysis Introduction Stationary Processes State Space Modesl Stationary Processes 8

Contents. 1 Time Series Analysis Introduction Stationary Processes State Space Modesl Stationary Processes 8 A N D R E W T U L L O C H T I M E S E R I E S A N D M O N T E C A R L O I N F E R E N C E T R I N I T Y C O L L E G E T H E U N I V E R S I T Y O F C A M B R I D G E Contents 1 Time Series Analysis 5

More information

In this chapter we study several functions that are useful in calculus and other areas of mathematics.

In this chapter we study several functions that are useful in calculus and other areas of mathematics. Calculus 5 7 Special functions In this chapter we study several functions that are useful in calculus and other areas of mathematics. 7. Hyperbolic trigonometric functions The functions we study in this

More information

Calculation of ACVF for ARMA Process: I consider causal ARMA(p, q) defined by

Calculation of ACVF for ARMA Process: I consider causal ARMA(p, q) defined by Calculation of ACVF for ARMA Process: I consider causal ARMA(p, q) defined by φ(b)x t = θ(b)z t, {Z t } WN(0, σ 2 ) want to determine ACVF {γ(h)} for this process, which can be done using four complementary

More information

Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting)

Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting) Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting) (overshort example) White noise H 0 : Let Z t be the stationary

More information

{ } Stochastic processes. Models for time series. Specification of a process. Specification of a process. , X t3. ,...X tn }

{ } Stochastic processes. Models for time series. Specification of a process. Specification of a process. , X t3. ,...X tn } Stochastic processes Time series are an example of a stochastic or random process Models for time series A stochastic process is 'a statistical phenomenon that evolves in time according to probabilistic

More information

ESSE Mid-Term Test 2017 Tuesday 17 October :30-09:45

ESSE Mid-Term Test 2017 Tuesday 17 October :30-09:45 ESSE 4020 3.0 - Mid-Term Test 207 Tuesday 7 October 207. 08:30-09:45 Symbols have their usual meanings. All questions are worth 0 marks, although some are more difficult than others. Answer as many questions

More information

t x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3.

t x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3. Mathematical Statistics: Homewor problems General guideline. While woring outside the classroom, use any help you want, including people, computer algebra systems, Internet, and solution manuals, but mae

More information

Time Series Analysis

Time Series Analysis Time Series Analysis hm@imm.dtu.dk Informatics and Mathematical Modelling Technical University of Denmark DK-2800 Kgs. Lyngby 1 Outline of the lecture Input-Output systems The z-transform important issues

More information

Econ 623 Econometrics II Topic 2: Stationary Time Series

Econ 623 Econometrics II Topic 2: Stationary Time Series 1 Introduction Econ 623 Econometrics II Topic 2: Stationary Time Series In the regression model we can model the error term as an autoregression AR(1) process. That is, we can use the past value of the

More information

ECON/FIN 250: Forecasting in Finance and Economics: Section 6: Standard Univariate Models

ECON/FIN 250: Forecasting in Finance and Economics: Section 6: Standard Univariate Models ECON/FIN 250: Forecasting in Finance and Economics: Section 6: Standard Univariate Models Patrick Herb Brandeis University Spring 2016 Patrick Herb (Brandeis University) Standard Univariate Models ECON/FIN

More information

The Identification of ARIMA Models

The Identification of ARIMA Models APPENDIX 4 The Identification of ARIMA Models As we have established in a previous lecture, there is a one-to-one correspondence between the parameters of an ARMA(p, q) model, including the variance of

More information

Introduction to Time Series Analysis. Lecture 7.

Introduction to Time Series Analysis. Lecture 7. Last lecture: Introduction to Time Series Analysis. Lecture 7. Peter Bartlett 1. ARMA(p,q) models: stationarity, causality, invertibility 2. The linear process representation of ARMA processes: ψ. 3. Autocovariance

More information

STAD57 Time Series Analysis. Lecture 23

STAD57 Time Series Analysis. Lecture 23 STAD57 Time Series Analysis Lecture 23 1 Spectral Representation Spectral representation of stationary {X t } is: 12 i2t Xt e du 12 1/2 1/2 for U( ) a stochastic process with independent increments du(ω)=

More information