STAT 443 Final Exam Review: Forecasting (Winter 2014)
Prof. R. Ramezan, University of Waterloo
LaTeXer: W. Kong (http://wwkong.github.io)
Last Revision: September 3, 2014
1 Basic Definitions

Definition 1.1. The time series {X_t} with E[X_t²] < ∞ is said to be weakly stationary if:

1. µ_X(t) = E[X_t] is independent of t;
2. γ_X(t, t + h) = Cov(X_t, X_{t+h}) is independent of t for all h; the covariance depends only on the lag h instead of t.

Note that E[X_t²] < ∞ is also one of the conditions for weak stationarity.

Definition 1.2. Let x_1, ..., x_n be observations of a time series. The sample mean of x_1, ..., x_n is

    x̄ = (1/n) Σ_{i=1}^{n} x_i.

The sample autocovariance function is

    γ̂(h) = (1/n) Σ_{t=1}^{n−|h|} (x_{t+|h|} − x̄)(x_t − x̄),    h ∈ (−n, n).

The sample autocorrelation function is

    ρ̂(h) = γ̂(h)/γ̂(0),    h ∈ (−n, n).

2 Statistical Tests

The Shapiro-Wilk Test is as follows: H_0: Y_1, ..., Y_n come from a Gaussian distribution. Reject H_0 if the p-value of this test is small. In R, if the data is stored in the vector y, use the command shapiro.test(y).

The Difference Sign Test is as follows: Count the number S of values such that y_i − y_{i−1} > 0. For large iid sequences,

    µ_S = E[S] = (n − 1)/2,    σ_S² = (n + 1)/12.

For large n, S is approximately N(µ_S, σ_S²); therefore,

    W = (S − µ_S)/σ_S ≈ N(0, 1).

A large positive (negative) value of S − µ_S indicates the presence of an increasing (decreasing) trend. We reject (H_0: the data is random) if |W| > z_{1−α/2}, but this may not work for seasonal data.
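As an illustration, here is a minimal R sketch of both tests on simulated data; the series y, its length, and the mild trend added to it are assumptions for the example, not values from the notes.

    set.seed(443)
    y <- rnorm(100) + 0.02 * (1:100)   # iid noise plus a mild upward trend

    # Shapiro-Wilk test for Gaussianity
    shapiro.test(y)

    # Difference sign test, computed by hand
    n  <- length(y)
    S  <- sum(diff(y) > 0)             # number of increases y_i - y_{i-1} > 0
    mu <- (n - 1) / 2
    s2 <- (n + 1) / 12
    W  <- (S - mu) / sqrt(s2)          # approximately N(0,1) under randomness
    2 * (1 - pnorm(abs(W)))            # two-sided p-value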
The Runs Test is as follows: Estimate the median and call it m. Let n_1 be the number of observations > m and n_2 the number of observations < m. Let R be the number of runs, i.e. of maximal blocks of consecutive observations which are all smaller (larger) than m. For large iid sequences,

    µ_R = E[R] = 1 + 2n_1 n_2/(n_1 + n_2),    σ_R² = (µ_R − 1)(µ_R − 2)/(n_1 + n_2 − 1).

For a large number of observations,

    (R − µ_R)/σ_R ≈ N(0, 1).

3 Filters and Smoothing

1. (Finite Moving Average Filter) Let q be a non-negative integer and consider the two-sided moving average of the series X_t = m_t + Y_t. We have

    (1/(2q+1)) Σ_{j=−q}^{q} X_{t−j} = (1/(2q+1)) Σ_{j=−q}^{q} m_{t−j} + (1/(2q+1)) Σ_{j=−q}^{q} Y_{t−j} ≈ m_t,

since the trend is approximately linear over the window and the average of the noise terms is close to 0.

2. (Exponential Smoothing) For fixed α ∈ [0, 1], define the recursion m̂_t = αX_t + (1 − α)m̂_{t−1} with initial condition m̂_1 = X_1. This gives an exponentially decreasing weighted moving average, where in the general t ≥ 2 case,

    m̂_t = Σ_{j=0}^{t−2} α(1 − α)^j X_{t−j} + (1 − α)^{t−1} X_1.

Note that a smaller α creates a smoother plot compared to a larger α.

3. (Polynomial Regression) This is just developing a parametric polynomial form of m_t,

    m_t = Σ_{i=0}^{k} β_i t^i,

where k is chosen arbitrarily.

4. We can also eliminate the trend through differencing, where

    ∇X_t = X_t − X_{t−1} = (1 − B)X_t

and ∇, B are known as the differencing and backshift operators respectively. Exponentiating these operators is equivalent to function composition. In this case, we are applying differencing to get a stationary process (by eliminating the trend). An R sketch of all four filters follows below.
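A minimal R sketch of the four filters; the simulated series, the window half-width q = 5, and α = 0.3 are assumptions for the example.

    set.seed(1)
    x <- ts(0.05 * (1:120) + arima.sim(list(ar = 0.5), n = 120))

    # 1. Two-sided moving average with q = 5 (window of 2q+1 = 11 points)
    ma <- stats::filter(x, rep(1/11, 11), sides = 2)

    # 2. Exponential smoothing with alpha = 0.3, via the recursion
    alpha <- 0.3
    m <- numeric(length(x)); m[1] <- x[1]
    for (t in 2:length(x)) m[t] <- alpha * x[t] + (1 - alpha) * m[t - 1]

    # 3. Polynomial (here linear) trend by regression
    fit <- lm(x ~ time(x))

    # 4. Differencing to remove the trend
    dx <- diff(x)   # (1 - B) X_t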
Holt-Winters (Special Cases). In the case that β = γ = 0, we have no trend or seasonal updates in the H-W algorithm. Here, we have

    L_t = αX_t + (1 − α)L_{t−1},

which is exactly (simple) exponential smoothing with parameter α. In the case that γ = 0, we have no seasonal component and there are two H-W equations, updating L_t and T_t. We call this case double exponential smoothing.

4 Linear Processes

Definition 4.1. A process {X_t} is called a moving average process of order q, denoted MA(q), if

    X_t = Z_t + θ_1 Z_{t−1} + ... + θ_q Z_{t−q},

where {Z_t} ~ WN(0, σ²) and θ_1, ..., θ_q are constants. Sometimes Z_t is referred to as the innovation. Notice that these innovations are uncorrelated, have constant variance, and have zero mean. Deriving the mean and autocovariance function of MA(q), it is easy to see that this process is stationary.

Definition 4.2. We say that a process {X_t} is q-dependent if X_t and X_s are independent whenever |t − s| > q. That is, they can be dependent only if they are within q steps of each other. Similarly, we say that a stationary time series is q-correlated if γ(h) = 0 whenever |h| > q.

Example 4.1. It is easy to show that the MA(q) process is q-correlated. The converse of this statement is also true:

Proposition 4.1. If {X_t} is a stationary q-correlated time series with mean 0, then it can be represented as an MA(q) process.

Definition 4.3. A process {X_t} is called an autoregressive process of order p, denoted AR(p), if

    X_t = φ_1 X_{t−1} + ... + φ_p X_{t−p} + Z_t,

where {Z_t} ~ WN(0, σ²) and φ_1, ..., φ_p are constants.

Definition 4.4. {X_t} is called a Gaussian time series if all its joint distributions are multivariate normal. That is, for any set of indices i_1, ..., i_m with m ∈ N, the random vector (X_{i_1}, ..., X_{i_m}) follows a multivariate normal distribution.

Example 4.2. Consider the stationary Gaussian time series {X_t}. Suppose X_n has been observed and we want to forecast X_{n+h} using m(X_n), a function of X_n. Let us measure the quality of the forecast by

    MSE = E([X_{n+h} − m(X_n)]² | X_n).

It can be shown that the m(·) which minimizes the MSE in the general case is m(X_n) = E(X_{n+h} | X_n).
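Before turning to prediction, a small R sketch simulating the processes just defined and inspecting their sample ACFs; the parameter values and series length are assumptions for the example.

    set.seed(443)
    # MA(2): X_t = Z_t + 0.6 Z_{t-1} - 0.3 Z_{t-2}; should be 2-correlated
    x.ma <- arima.sim(model = list(ma = c(0.6, -0.3)), n = 500)
    acf(x.ma, lag.max = 10, plot = FALSE)   # near 0 beyond lag 2

    # AR(1): X_t = 0.7 X_{t-1} + Z_t; ACF decays roughly like 0.7^h
    x.ar <- arima.sim(model = list(ar = 0.7), n = 500)
    acf(x.ar, lag.max = 10, plot = FALSE)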
Example 4.3. We now consider the problem of predicting X_{n+h}, h > 0, for a stationary time series with known mean µ and ACVF γ(·), based on the previous values {X_n, ..., X_1}, denoting the best linear predictor of X_{n+h} by P_n X_{n+h}. We are interested in

    P_n X_{n+h} = a_0 + a_1 X_n + a_2 X_{n−1} + ... + a_n X_1,

which minimizes S(a_0, ..., a_n) = E[(X_{n+h} − P_n X_{n+h})²]. To get a_0, a_1, ..., a_n, we need to solve the system ∂S/∂a_j = 0 for j = 0, 1, ..., n. Doing so, we get

    a_0 = µ(1 − Σ_{i=1}^{n} a_i),    Γ_n a_n = γ_n(h),

where a_n = (a_1, a_2, ..., a_n)^T,

    Γ_n = [ γ(0)      γ(1)      ...  γ(n−1)
            γ(1)      γ(0)      ...  γ(n−2)
            ...       ...       ...  ...
            γ(n−1)    γ(n−2)    ...  γ(0)   ],

    γ_n(h) = (γ(h), γ(h+1), ..., γ(n+h−1))^T.

Here, a_n = Γ_n^{−1} γ_n(h) = Γ_n^{−1} γ(0) ρ_n(h), where ρ_n(h) = γ_n(h)/γ(0).

Note 1. Here are some properties from the above:

- P_n X_{n+h} is determined by µ and γ(·).
- P_n X_{n+h} = µ + Σ_{i=1}^{n} a_i (X_{n+1−i} − µ).
- It can be shown that E[(X_{n+h} − P_n X_{n+h})²] = γ(0) − a_n^T γ_n(h).
- E(X_{n+h} − P_n X_{n+h}) = 0.
- E[(X_{n+h} − P_n X_{n+h}) X_j] = 0 for j = 1, 2, ..., n.

Example 4.4. Derive the one-step prediction for the AR(1) model (here, h = 1, and µ = 0). To find the linear predictor, we need to solve Γ_n a_n = γ_n(h), or equivalently (Γ_n/γ(0)) a_n = γ_n(h)/γ(0):

    [ 1        φ        ...  φ^{n−1}
      φ        1        ...  φ^{n−2}
      ...      ...      ...  ...
      φ^{n−1}  φ^{n−2}  ...  1       ] (a_1, a_2, ..., a_n)^T = (φ, φ², ..., φ^n)^T.

An obvious solution is a_n = (φ, 0, ..., 0)^T, so that

    P_n X_{n+1} = µ + Σ_{i=1}^{n} a_i (X_{n+1−i} − µ) = Σ_{i=1}^{n} a_i X_{n+1−i} = a_1 X_n + 0 = φX_n.

You can use the formula for the MSE to get

    MSE = γ(0) − a_n^T γ_n(h) = γ(0) − φγ(1) = γ(0) − φ²γ(0) = γ(0)[1 − φ²] = σ².
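The following R sketch checks Example 4.4 numerically by solving Γ_n a_n = γ_n(1) directly; the values φ = 0.6, σ² = 1 and n = 5 are assumptions for the example.

    phi <- 0.6; n <- 5
    acvf <- function(h) phi^abs(h) / (1 - phi^2)     # AR(1) ACVF with sigma^2 = 1
    Gamma.n  <- outer(1:n, 1:n, function(i, j) acvf(i - j))
    gamma.n1 <- acvf(1:n)                            # gamma_n(h) for h = 1
    solve(Gamma.n, gamma.n1)                         # returns (phi, 0, 0, 0, 0)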
5 Causal and Invertible Processes

Definition 5.1. The time series {X_t} is a linear process if X_t = Σ_{j=−∞}^{∞} ψ_j Z_{t−j} for all t, where {Z_t} ~ WN(0, σ²) and {ψ_j} is a sequence of constants such that Σ_{j=−∞}^{∞} |ψ_j| < ∞.

Example 5.1. Show that AR(1) with |φ| < 1 is a linear process. We know that X_t = φX_{t−1} + Z_t with {Z_t} ~ WN(0, σ²), and we showed before that X_t = Σ_{j=0}^{∞} φ^j Z_{t−j}. Since |φ| < 1, setting ψ_j = φ^j for j ≥ 0 (and ψ_j = 0 for j < 0) gives Σ_{j=−∞}^{∞} |ψ_j| < ∞, and therefore all assumptions in the definition above are satisfied. So AR(1) is a linear process.

Definition 5.2. A linear process Σ_{j=−∞}^{∞} ψ_j Z_{t−j} is causal or future-independent if ψ_j = 0 for any j < 0.

{X_t, t ∈ T} is an ARMA(p, q) process if:

1) {X_t, t ∈ T} is stationary;
2) X_t − φ_1 X_{t−1} − φ_2 X_{t−2} − ... − φ_p X_{t−p} = Z_t + θ_1 Z_{t−1} + ... + θ_q Z_{t−q}, where {Z_t} ~ WN(0, σ²);
3) the polynomials (1 − φ_1 z − ... − φ_p z^p) and (1 + θ_1 z + ... + θ_q z^q) have no common factors/roots (IMPORTANT FOR THE FINAL!).

Definition 5.3. An ARMA(p, q) process φ(B)X_t = θ(B)Z_t, where Z_t ~ WN(0, σ²), is causal if there exist constants {ψ_j} such that Σ_{j=0}^{∞} |ψ_j| < ∞ and X_t = Σ_{j=0}^{∞} ψ_j Z_{t−j} for any t. This condition is equivalent to

    φ(z) = 1 − φ_1 z − φ_2 z² − ... − φ_p z^p ≠ 0 for any z ∈ C such that |z| ≤ 1.

Remark 5.1. If the condition above holds true, then ψ(z) = θ(z)/φ(z), i.e. θ(z) = φ(z)ψ(z), and matching coefficients in

    1 + θ_1 z + ... + θ_q z^q = (1 − φ_1 z − ... − φ_p z^p)(ψ_0 + ψ_1 z + ...)

gives 1 = ψ_0, θ_1 = ψ_1 − φ_1 ψ_0, and so on.

Definition 5.4. An ARMA(p, q) process {X_t} is invertible if there exist constants {Π_j} such that Σ_{j=0}^{∞} |Π_j| < ∞ and Z_t = Σ_{j=0}^{∞} Π_j X_{t−j} for all t. Invertibility is equivalent to the condition

    θ(z) = 1 + θ_1 z + ... + θ_q z^q ≠ 0 for any z ∈ C such that |z| ≤ 1.

Using the same method as above (matching coefficients in φ(z) = θ(z)Π(z)), one can get Π_0 = 1, −φ_1 = Π_0 θ_1 + Π_1, and so on.
Example 5.2. Consider {X_t, t ∈ T} satisfying X_t − 0.5X_{t−1} = Z_t + 0.4Z_{t−1}, where {Z_t} ~ WN(0, σ²). Investigate the causality and invertibility of X_t. If the series is causal (invertible), then provide the causal (invertible) solution. These are called the MA(∞) and AR(∞) representations.

[Causality] We have φ(z) = 1 − 0.5z = 0 ⟹ z = 2 ⟹ |z| > 1. Since this root is outside the unit circle, X_t is causal. We then have

    1 + 0.4z = (1 − 0.5z)(ψ_0 + ψ_1 z + ...) ⟹ ψ_0 = 1, ψ_1 − 0.5ψ_0 = 0.4, ψ_2 − 0.5ψ_1 = 0, ...
    ⟹ ψ_0 = 1, ψ_1 = 0.9, ψ_2 = 0.9(0.5), ψ_3 = 0.9(0.5)², ...

We can see the pattern (and prove it by induction):

    ψ_0 = 1,    ψ_j = 0.9(0.5)^{j−1} for j ≥ 1    ⟹    X_t = Z_t + 0.9 Σ_{j=1}^{∞} (0.5)^{j−1} Z_{t−j}.

[Invertibility] We have θ(z) = 1 + 0.4z = 0 ⟹ z = −10/4 ⟹ |z| > 1. Since this root is outside the unit circle, X_t is invertible. We then have, as above,

    1 − 0.5z = (1 + 0.4z)(Π_0 + Π_1 z + ...) ⟹ Π_0 = 1, 0.4Π_0 + Π_1 = −0.5, 0.4Π_1 + Π_2 = 0, ...
    ⟹ Π_0 = 1, Π_1 = −0.9, Π_2 = −0.9(−0.4), Π_3 = −0.9(−0.4)², ...

We can see the pattern (and prove it by induction):

    Π_0 = 1,    Π_j = −0.9(−0.4)^{j−1} for j ≥ 1    ⟹    Z_t = X_t − 0.9 Σ_{j=1}^{∞} (−0.4)^{j−1} X_{t−j}.
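In R, the ψ-weights of a causal ARMA model are available through the base function ARMAtoMA, which can be used to check the patterns derived above; feeding it the model with the roles of φ and θ swapped (and negated) yields the Π-weights, since Π(z) = φ(z)/θ(z).

    # psi-weights of (1 - 0.5B) X_t = (1 + 0.4B) Z_t
    ARMAtoMA(ar = 0.5, ma = 0.4, lag.max = 5)
    # 0.9 0.45 0.225 0.1125 0.05625, i.e. psi_j = 0.9 (0.5)^(j-1)

    # Pi-weights of the AR(infinity) representation
    ARMAtoMA(ar = -0.4, ma = -0.5, lag.max = 5)
    # -0.9 0.36 -0.144 0.0576 -0.02304, i.e. Pi_j = -0.9 (-0.4)^(j-1)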
Remark 5.2 (ACVF of ARMA processes). Consider a causal, stationary process φ(B)X_t = θ(B)Z_t with Z_t ~ WN(0, σ²). The MA(∞) representation of X_t is X_t = Σ_{j=0}^{∞} ψ_j Z_{t−j}, where E[X_t] = 0. We have

    γ(h) = E[X_t X_{t+h}] − E[X_t]E[X_{t+h}] = E[(Σ_{j=0}^{∞} ψ_j Z_{t−j})(Σ_{k=0}^{∞} ψ_k Z_{t+h−k})],

since E[X_t]E[X_{t+h}] = 0. Noticing that E[Z_t Z_s] = 0 when t ≠ s, only the terms with matching innovation indices survive, giving Σ_{j=0}^{∞} ψ_j ψ_{j+h} E[Z²] for h ≥ 0 and symmetrically for h < 0; that is,

    γ(h) = σ² Σ_{j=0}^{∞} ψ_j ψ_{j+|h|}.

Example 5.3. Derive the ACVF for the following ARMA(1, 1) process:

    X_t − φX_{t−1} = Z_t + θZ_{t−1},    Z_t ~ WN(0, σ²),  |φ| < 1.

Note that the process is causal because 1 − φz = 0 ⟹ |z| = 1/|φ| > 1. It can be shown, with methods similar to the above, that

    ψ_0 = 1,    ψ_j = φ^{j−1}(φ + θ) for j ≥ 1.

Now if h = 0, then

    γ(0) = σ² Σ_{j=0}^{∞} ψ_j² = σ²[1 + (θ + φ)² Σ_{i=0}^{∞} φ^{2i}] = σ²[1 + (θ + φ)²/(1 − φ²)].

If h ≠ 0, then

    γ(h) = σ² Σ_{j=0}^{∞} ψ_j ψ_{j+|h|} = σ²[ψ_0 ψ_{|h|} + Σ_{j=1}^{∞} ψ_j ψ_{j+|h|}]
         = σ²[φ^{|h|−1}(θ + φ) + (θ + φ)² φ^{|h|} Σ_{j=1}^{∞} φ^{2(j−1)}]
         = σ²[φ^{|h|−1}(θ + φ) + (θ + φ)² φ^{|h|}/(1 − φ²)].

Summary 1. For the ACF and PACF, we have the following summary:

            ACF                   PACF
    MA(q)   Zero after lag q      Decays exponentially
    AR(p)   Decays exponentially  Zero after lag p

In the general case of ARMA processes, the PACF is defined as α(0) = 1 and α(h) = Φ_hh for h ≥ 1, where Φ_hh is the last component of the vector Φ_h = Γ_h^{−1} γ_h, in which

    Γ_h = [ γ(0)      γ(1)      ...  γ(h−1)
            γ(1)      γ(0)      ...  γ(h−2)
            ...       ...       ...  ...
            γ(h−1)    γ(h−2)    ...  γ(0)   ],    γ_h = (γ(1), γ(2), ..., γ(h))^T.

Example 5.4. Calculate α(2) for an MA(1) process X_t = Z_t + θZ_{t−1}, {Z_t} ~ WN(0, σ²). We have shown before that

    γ(h) = (1 + θ²)σ² for h = 0,    θσ² for |h| = 1,    0 for |h| ≥ 2.

We have Φ_h = Γ_h^{−1} γ_h, and α(h) is the last element of Φ_h. For h = 1,

    Φ_11 = γ(0)^{−1} γ(1) = γ(1)/γ(0) = θ/(1 + θ²).

For h = 2,

    Φ_2 = [ (1 + θ²)σ²   θσ²        ]^{−1} [ θσ² ]   =   ( θ(1 + θ²)σ⁴/[(1 + θ²)²σ⁴ − θ²σ⁴] )
          [ θσ²          (1 + θ²)σ² ]      [ 0   ]       ( −θ²σ⁴/[(1 + θ²)²σ⁴ − θ²σ⁴]      ),

where the last element, in reduced form, is

    α(2) = Φ_22 = −θ²/(1 + θ² + θ⁴).

It can be shown, in general, that

    α(h) = Φ_hh = −(−θ)^h / Σ_{i=0}^{h} θ^{2i}.
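In R, the theoretical ACF and PACF of an ARMA model are available through the base function ARMAacf; a quick check of α(2) for an MA(1), where θ = 0.5 is an assumption for the example.

    theta <- 0.5
    ARMAacf(ma = theta, lag.max = 3)                # theoretical ACF: 1, 0.4, 0, 0
    ARMAacf(ma = theta, lag.max = 3, pacf = TRUE)   # theoretical PACF
    -(-theta)^2 / sum(theta^(2 * (0:2)))            # alpha(2) by the formula: -0.1905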
6 ARIMA/SARIMA Models

Definition 6.1. Let d be a non-negative integer. {X_t, t ∈ T} is an ARIMA(p, d, q) process if Y_t = (1 − B)^d X_t is a causal ARMA(p, q) process.

The definition above means that {X_t, t ∈ T} satisfies an equation of the form

    φ*(B)X_t ≡ φ(B)(1 − B)^d X_t = θ(B)Z_t,    {Z_t} ~ WN(0, σ²).

Note that φ*(1) = 0 when d ≥ 1, so X_t is not stationary unless d = 0. Therefore, {X_t} is stationary iff d = 0, in which case the model reduces to the ARMA(p, q) process of the previous section.

Recall that if {X_t} exhibits a polynomial trend of the form m(t) = α_0 + α_1 t + ... + α_d t^d, then (1 − B)^d X_t will not have that trend any more. Therefore, ARIMA models (when d ≠ 0) are appropriate when the trend in the data is well approximated by a polynomial of degree d.

Recall the operator B, where B^k X_t = X_{t−k}. Clearly (1 − B^k) and (1 − B)^k are different filters: the latter performs differencing k times, while the former differences once at lag k. In R (see the sketch below), we write

    diff(x, differences = k)  for  (1 − B)^k X_t,
    diff(x, lag = k)          for  (1 − B^k) X_t.

Definition 6.2. If d, D are non-negative integers, then {X_t, t ∈ T} is a seasonal ARIMA(p, d, q) × (P, D, Q)_S process with period S if the differenced series

    Y_t = ∇^d ∇_S^D X_t = (1 − B)^d (1 − B^S)^D X_t

is a causal ARMA process defined by

    φ(B)Φ(B^S)Y_t = θ(B)Θ(B^S)Z_t,    Z_t ~ WN(0, σ²).

Remark 6.1. Notice that the process {X_t, t ∈ T} is causal iff φ(z) ≠ 0 and Φ(z) ≠ 0 for all z such that |z| ≤ 1.

Example 6.1. Derive the ACF of SARIMA(0, 0, 1)_{12} = SARIMA(0, 0, 0) × (0, 0, 1)_{12}. This gives us the general form

    X_t = Z_t + Θ_1 Z_{t−12},    Z_t ~ WN(0, σ²).

Show, as an exercise, that

    γ(h) = Cov(X_t, X_{t+h}) = (1 + Θ_1²)σ² for h = 0,    Θ_1 σ² for |h| = 12,    0 otherwise;
    ρ(h) = γ(h)/γ(0) = 1 for h = 0,    Θ_1/(1 + Θ_1²) for |h| = 12,    0 otherwise.
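A short R sketch contrasting the two differencing filters; the simulated monthly series is an assumption for the example.

    set.seed(2)
    x <- ts(rnorm(48), frequency = 12)   # pretend monthly data

    d1 <- diff(x, differences = 2)   # (1 - B)^2 X_t : differencing twice
    d2 <- diff(x, lag = 12)          # (1 - B^12) X_t : one difference at lag 12
    length(d1); length(d2)           # 46 and 36: each filter loses the lags it uses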
Definition 6.3. Consider a causal AR(p) model

    (1)    X_t − φ_1 X_{t−1} − ... − φ_p X_{t−p} = Z_t

with causal solution X_t = Σ_{j=0}^{∞} ψ_j Z_{t−j}, where {Z_t} ~ WN(0, σ²). Multiplying both sides of (1) by X_{t−j}, with j = 0, 1, 2, ..., p, and taking expectations gives us

    E[X_t X_{t−j}] − φ_1 E[X_{t−1} X_{t−j}] − ... − φ_p E[X_{t−p} X_{t−j}] = E[Z_t X_{t−j}]
    ⟹ γ(j) − φ_1 γ(j − 1) − ... − φ_p γ(j − p) = E[Z_t X_{t−j}].

We then have

    E[Z_t X_{t−j}] = E[Z_t X_t] = E[Z_t Σ_{k=0}^{∞} ψ_k Z_{t−k}] = ψ_0 E[Z_t²] = σ²    for j = 0,
    E[Z_t X_{t−j}] = 0    for j > 0.

So the original equation reduces to

    γ(0) − φ_1 γ(1) − ... − φ_p γ(p) = σ²    for j = 0,
    γ(j) − φ_1 γ(|j − 1|) − ... − φ_p γ(|j − p|) = 0    for j ≠ 0.

These are called the Yule-Walker equations. They are easily generalized to the matrix form Γ_p φ = γ_p. Based on a sample {x_1, x_2, ..., x_n}, the parameters φ and σ² can be estimated by φ̂ = Γ̂_p^{−1} γ̂_p, where the matrices are defined in a similar fashion to the best linear predictor section. This system is called the sample Yule-Walker equations. We can write the Yule-Walker equations in terms of the ACF too. Explicitly, dividing γ̂_p and Γ̂_p by γ̂(0), so that R̂_p = Γ̂_p/γ̂(0) and ρ̂_p = γ̂_p/γ̂(0), we get

    φ̂ = R̂_p^{−1} ρ̂_p,    σ̂² = γ̂(0)[1 − φ̂^T ρ̂_p].

Notice that γ̂(0) is the sample variance of {x_1, ..., x_n}. Based on a sample {x_1, ..., x_n}, the above equations provide the parameter estimates. Using advanced probability theory, it can be shown that

    φ̂ = (φ̂_1, ..., φ̂_p)^T ≈ MVN(φ, (σ²/n) Γ_p^{−1})

for large n. If we replace σ² and Γ_p by their sample estimates σ̂² and Γ̂_p, we can use this result for large-sample confidence intervals for the parameters φ_1, ..., φ_p.
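Sample Yule-Walker estimation is built into R via ar; a minimal sketch on simulated data, where the true parameters and sample size are assumptions for the example.

    set.seed(443)
    x <- arima.sim(model = list(ar = c(0.6, 0.28)), n = 200)

    fit <- ar(x, order.max = 2, aic = FALSE, method = "yule-walker")
    fit$ar            # sample Yule-Walker estimates of phi_1, phi_2
    fit$var.pred      # estimate of sigma^2
    fit$asy.var.coef  # asymptotic covariance matrix, for confidence intervals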
Example 6.2. Based on the following sample ACF and PACF, an AR(2) has been proposed for the data. Provide the Yule-Walker estimates of the parameters as well as 95% confidence intervals for the parameters in φ. The data was collected over a window of 200 points, with sample variance 3.69. (The table of sample ACF/PACF values ρ̂(h), α̂(h) did not survive transcription; the values consistent with the estimates reported below are ρ̂(1) ≈ 0.82 and ρ̂(2) ≈ 0.76.) We want to estimate φ_1 and φ_2 in

    X_t = φ_1 X_{t−1} + φ_2 X_{t−2} + Z_t,    {Z_t} ~ WN(0, σ²).

Solving the system φ̂ = R̂_2^{−1} ρ̂_2 gives φ̂_1 = 0.594 and φ̂_2 = 0.276. Similarly,

    σ̂² = γ̂(0)[1 − φ̂^T ρ̂_2] = 3.69[1 − (0.594 ρ̂(1) + 0.276 ρ̂(2))] = 1.112.

Therefore the estimated model is

    X_t = 0.594X_{t−1} + 0.276X_{t−2} + Z_t,    {Z_t} ~ WN(0, 1.112).

Now φ̂ ≈ N(φ, (σ̂²/n) Γ̂_2^{−1}). By the symmetry of Γ̂_2, the two diagonal entries of (σ̂²/n) Γ̂_2^{−1} are equal, so both estimates share the same standard error, here about 0.071. So the 95% CIs for φ_1, φ_2 are

    φ̂_1 ± 1.96 √Vâr(φ̂_1) = 0.594 ± 0.139 = (0.455, 0.733),
    φ̂_2 ± 1.96 √Vâr(φ̂_2) = 0.276 ± 0.139 = (0.137, 0.415).
7 Forecasting

We discuss how forecasting works under our studied processes.

7.1 Forecasting AR(p)

Let X_t = Σ_{j=1}^{p} φ_j X_{t−j} + Z_t, {Z_t} ~ WN(0, σ²), be a causal AR(p) process. We have, for h > 0,

    X̂_{n+h} = E[X_{n+h} | X_1, ..., X_n]
             = E[Σ_{j=1}^{h−1} φ_j X_{n+h−j} + Σ_{j=h}^{p} φ_j X_{n+h−j} | X_1, ..., X_n] + E[Z_{n+h} | X_1, ..., X_n]
             = E[Σ_{j=1}^{h−1} φ_j X_{n+h−j} | X_1, ..., X_n] + E[Σ_{j=h}^{p} φ_j X_{n+h−j} | X_1, ..., X_n],

due to the uncorrelatedness of Z_{n+h} with respect to the X_k. If h = 1, then the above equation becomes

    X̂_{n+1} = Σ_{j=1}^{p} φ_j X_{n+1−j}.

If h = 2, 3, ..., p, then remark that

    j < h ⟹ n + h − j > n,    j ≥ h ⟹ n + h − j ≤ n,

and so

    X̂_{n+h} = Σ_{j=1}^{h−1} φ_j E(X_{n+h−j} | X_1, ..., X_n) + Σ_{j=h}^{p} φ_j X_{n+h−j}
             = Σ_{j=1}^{h−1} φ_j X̂_{n+h−j} + Σ_{j=h}^{p} φ_j X_{n+h−j}.

If h > p, then n + h − j > n for all j = 1, ..., p, and

    X̂_{n+h} = Σ_{j=1}^{p} φ_j E(X_{n+h−j} | X_1, ..., X_n) = Σ_{j=1}^{p} φ_j X̂_{n+h−j}.

In summary, for a causal AR(p), the h-step predictor is

    X̂_{n+h} = Σ_{j=1}^{p} φ_j X_{n+1−j}                                  for h = 1,
    X̂_{n+h} = Σ_{j=1}^{h−1} φ_j X̂_{n+h−j} + Σ_{j=h}^{p} φ_j X_{n+h−j}   for h = 2, 3, ..., p,
    X̂_{n+h} = Σ_{j=1}^{p} φ_j X̂_{n+h−j}                                 for h > p.

In AR(p), the h-step prediction is a linear combination of the previous steps. We either have the previous p steps among X_1, ..., X_n, in which case we substitute the observed values (as in the h = 1 case), or we are missing some or all of them, in which case we recursively predict. Given a dataset, the φ_j can be estimated and X̂_{n+h} computed (a numeric sketch follows the example below).

Example 7.1. Based on the annual sales data of a chain store, an AR(2) model with parameters φ̂_1 = 1 and φ̂_2 = −0.21 has been fitted. If the total sales of the last 3 years have been 9, 11 and 10 million dollars (so X_2012 = 9, X_2011 = 11, X_2010 = 10), forecast this year's total sales (2013) as well as those of 2015. We have

    X_t = X_{t−1} − 0.21X_{t−2} + Z_t,    {Z_t} ~ WN(0, σ²).

Now

    X̂_2013 = X_2012 − 0.21X_2011 = 9 − 0.21(11) = 6.69,
    X̂_2015 = X̂_2014 − 0.21X̂_2013 = X̂_2014 − 0.21(6.69),

and since

    X̂_2014 = X̂_2013 − 0.21X_2012 = 6.69 − 0.21(9) = 4.8,

then

    X̂_2015 = 4.8 − 0.21(6.69) ≈ 3.395.
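A minimal R sketch of this recursion, reproducing the forecasts of Example 7.1 (the fitted coefficients are taken as given):

    # h-step AR(2) forecasts by the recursion above, for Example 7.1
    phi <- c(1, -0.21)
    x   <- c(10, 11, 9)          # X_2010, X_2011, X_2012
    for (h in 1:3) x <- c(x, sum(phi * rev(tail(x, 2))))
    tail(x, 3)                   # 6.690 4.800 3.395: forecasts for 2013-2015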
7.2 Forecasting MA(q)

MA processes are linear combinations of white noise. To do forecasting in MA(q), we need to estimate θ_1, ..., θ_q as well as approximate the innovations Z_t, Z_{t+1}, .... First, consider the very simple case of MA(1), where X_t = Z_t + θZ_{t−1}, {Z_t} ~ WN(0, σ²). We have

    X̂_{n+h} = E[X_{n+h} | X_1, ..., X_n] = E[Z_{n+h} | X_1, ..., X_n] + θE[Z_{n+h−1} | X_1, ..., X_n].

If h = 1, then the above equation is

    X̂_{n+1} = E[Z_{n+1} | X_1, ..., X_n] + θE[Z_n | X_1, ..., X_n] = 0 + θE[Z_n | X_1, ..., X_n] = θZ_n,

and if h > 1, then the equation becomes

    X̂_{n+h} = E[Z_{n+h}] + θE[Z_{n+h−1} | X_1, ..., X_n] = 0,    since n + h − 1 > n.

Now we need to plug in a value for Z_n. We approximate the Z_i by U_i as follows. Let U_0 = 0 and estimate

    Ẑ_t = U_t = X_t − θU_{t−1},    U_0 = 0,

from the fact that Z_t = X_t − θZ_{t−1}. We can then get

    U_0 = 0,
    U_1 = X_1,
    U_2 = X_2 − θX_1,
    U_3 = X_3 − θX_2 + θ²X_1,
    ...

Notice that as i → ∞, U_i will need a convergence condition, for which |θ| < 1 is sufficient. This was the invertibility condition for MA(1). We see that the U_i are recursively calculable, and for an invertible MA(1) process we have

    X̂_{n+h} = θU_n for h = 1,    X̂_{n+h} = 0 for h > 1,    where U_t = X_t − θU_{t−1}, U_0 = 0.

Now consider an MA(q) process X_t = Z_t + θ_1 Z_{t−1} + ... + θ_q Z_{t−q}. We have

    X̂_{n+h} = E[X_{n+h} | X_1, ..., X_n]
             = E[Z_{n+h} | X_1, ..., X_n] + θ_1 E[Z_{n+h−1} | X_1, ..., X_n] + ... + θ_q E[Z_{n+h−q} | X_1, ..., X_n].

If h > q, then the above equation's value is zero, since n + h − q > n. If 0 < h ≤ q, then at least some of the terms above are non-zero. In particular,

    X̂_{n+h} = Σ_{j=1}^{q} θ_j E[Z_{n+h−j} | X_1, ..., X_n] = Σ_{j=h}^{q} θ_j E[Z_{n+h−j} | X_1, ..., X_n],

and for j = h, h + 1, ..., q we know E[Z_{n+h−j} | X_1, ..., X_n] = Z_{n+h−j}; hence

    X̂_{n+h} = Σ_{j=h}^{q} θ_j Z_{n+h−j}.

Similar to MA(1), we approximate the Z_i by U_i, provided the MA(q) process is invertible, that is, θ(z) = 1 + θ_1 z + ... + θ_q z^q ≠ 0 for all |z| ≤ 1. Therefore, assuming that U_0 = U_{−1} = U_{−2} = ... = 0, we set U_t = X_t − Σ_{j=1}^{q} θ_j U_{t−j}, giving

    U_0 = 0,
    U_1 = X_1,
    U_2 = X_2 − θ_1 X_1,
    U_3 = X_3 − θ_1 U_2 − θ_2 U_1 = X_3 − θ_1 X_2 + (θ_1² − θ_2)X_1,
    ...

In summary, for an invertible MA(q) process, we have

    X̂_{n+h} = Σ_{j=h}^{q} θ_j U_{n+h−j} for 1 ≤ h ≤ q,    X̂_{n+h} = 0 for h > q,

where U_i = 0 for i ≤ 0 and U_t = X_t − Σ_{j=1}^{q} θ_j U_{t−j} for t = 1, 2, 3, ....
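A minimal R sketch of the U-recursion for an invertible MA(1), using the numbers of Example 7.2 below:

    theta <- 0.5
    x <- c(0.3, -0.1, 0.1)                  # observed X_1, X_2, X_3
    U <- 0                                  # U_0 = 0
    for (t in seq_along(x)) U <- c(U, x[t] - theta * tail(U, 1))
    U                                       # 0, 0.300, -0.250, 0.225
    theta * tail(U, 1)                      # one-step forecast: 0.1125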
Example 7.2. Consider the MA(1) process X_t = Z_t + 0.5Z_{t−1}, where {Z_t} ~ WN(0, σ²). If X_1 = 0.3, X_2 = −0.1, X_3 = 0.1, predict X_4 and X_5. Notice that X̂_5 = X̂_{3+2}, which is a 2-step prediction based on the history X_1, X_2, X_3. Since this is an MA(1) model, hence 1-correlated, X̂_5 = 0. For X_4 we have

    X̂_4 = Σ_{j=1}^{1} θ_j U_{3+1−j} = θ_1 U_3 = 0.5U_3,

where, with U_0 = 0,

    U_1 = X_1 − 0.5U_0 = X_1 = 0.3,
    U_2 = X_2 − 0.5U_1 = −0.1 − (0.5)(0.3) = −0.25,
    U_3 = X_3 − 0.5U_2 = 0.1 − (0.5)(−0.25) = 0.225,

and hence X̂_4 = 0.5(0.225) = 0.1125.

Example 7.3. Consider the MA(1) process X_t = Z_t + θZ_{t−1} with {Z_t} ~ WN(0, σ²) and |θ| < 1. Show that the one-step predictor X̂_{n+1} = θU_n is equal to the predictor

    X̃_{n+1} = −Σ_{j=1}^{n} (−θ)^j X_{n−j+1}.

This follows from the definition of U_n, for which we can write the closed form

    U_n = X_n + Σ_{i=1}^{n−1} (−θ)^i X_{n−i},    n ≥ 2,

and hence

    X̂_{n+1} = θU_n = θX_n + Σ_{i=1}^{n−1} θ(−θ)^i X_{n−i} = −Σ_{i=0}^{n−1} (−θ)^{i+1} X_{n−i} = −Σ_{j=1}^{n} (−θ)^j X_{n−j+1} = X̃_{n+1}.

Clearly for n = 0, 1 we have X̂_{n+1} = X̃_{n+1} as well. This shows that even in the MA process, the predictor may be written as a linear function of the history.
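A quick R check of this equivalence on simulated data; θ, the sample size, and the seed are assumptions for the example.

    set.seed(7)
    theta <- 0.5
    x <- as.numeric(arima.sim(model = list(ma = theta), n = 20))
    n <- length(x)

    # One-step predictor via the U-recursion
    U <- 0
    for (t in 1:n) U <- c(U, x[t] - theta * tail(U, 1))
    pred1 <- theta * tail(U, 1)

    # The same predictor as an explicit linear function of the history
    pred2 <- -sum((-theta)^(1:n) * x[n:1])

    c(pred1, pred2)   # identical up to floating point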