Time series models in the Frequency domain. The power spectrum, Spectral analysis
2 Relationship between the periodogram and the autocorrelations

$$I(\lambda_j) = \frac{T}{2}\big(\hat\alpha_j^2 + \hat\beta_j^2\big), \qquad
\hat\alpha_j = \frac{2}{T}\sum_{t=1}^{T} Y_t\cos\lambda_j t, \quad
\hat\beta_j = \frac{2}{T}\sum_{t=1}^{T} Y_t\sin\lambda_j t, \quad
\lambda_j = 2\pi j/T.$$

Since $\sum_{t=1}^{T}\cos\lambda_j t = \sum_{t=1}^{T}\sin\lambda_j t = 0$, the mean can be subtracted, and using $\cos\lambda(s-t) = \cos\lambda t\cos\lambda s + \sin\lambda t\sin\lambda s$,

$$I(\lambda_j) = \frac{2}{T}\sum_{t,s=1}^{T}(Y_t-\bar Y)(Y_s-\bar Y)\cos\lambda_j(s-t)
= \frac{2}{T}\Big[\sum_{t=1}^{T}(Y_t-\bar Y)^2 + 2\sum_{t<s}(Y_t-\bar Y)(Y_s-\bar Y)\cos\lambda_j(s-t)\Big].$$

Writing $p = s - t$ and $C_p = \frac{1}{T}\sum_{t=1}^{T-p}(Y_t-\bar Y)(Y_{t+p}-\bar Y)$,

$$I(\lambda_j) = 2\Big[C_0 + 2\sum_{p=1}^{T-1} C_p\cos\lambda_j p\Big]
= 2C_0\Big[1 + 2\sum_{p=1}^{T-1} r_p\cos\lambda_j p\Big], \qquad r_p = C_p/C_0.$$
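The identity above can be checked numerically: the periodogram ordinate computed from the Fourier coefficients equals the weighted sum of the sample autocovariances exactly at any Fourier frequency. This is a minimal sketch (the function and variable names are mine, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 64
y = rng.normal(size=T)
ybar = y.mean()
j = 5
lam = 2 * np.pi * j / T          # a Fourier frequency lambda_j = 2*pi*j/T
t = np.arange(1, T + 1)

# Periodogram via the Fourier coefficients alpha_j, beta_j
a = (2 / T) * np.sum(y * np.cos(lam * t))
b = (2 / T) * np.sum(y * np.sin(lam * t))
I_direct = (T / 2) * (a**2 + b**2)

# The same quantity from the sample autocovariances C_p (divisor T)
C = np.array([np.sum((y[:T - p] - ybar) * (y[p:] - ybar)) / T for p in range(T)])
I_acov = 2 * (C[0] + 2 * np.sum(C[1:] * np.cos(lam * np.arange(1, T))))

print(abs(I_direct - I_acov))    # agreement up to floating-point rounding
```

The agreement is exact (not asymptotic) because $\sum_t \cos\lambda_j t = \sum_t \sin\lambda_j t = 0$ at Fourier frequencies.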
3 Periodogram and autocorrelations (cont.)

$I(\lambda_j) = 2\big[C_0 + 2\sum_{p=1}^{T-1} C_p\cos\lambda_j p\big]$ is a weighted average of the autocorrelations, but the two statistics play different roles. The periodogram represents cyclical effects, which are deterministic. Thus, if we have high correlations at lags 12, 24, we should see them in the periodogram. However, if we analyse, for example, a seasonal MA(1) process, it is no longer deterministic: the seasonal effects change over time, and we only have $r_{12} \neq 0$ with the other empirical correlations close to zero (unlike in a deterministic process). The periodogram may no longer indicate this phenomenon, because the correlation is not the outcome of a single cycle or a few cycles with discrete frequencies. This suggests considering all the frequencies in the range $[0, \pi]$, which is what spectral analysis does. By considering the spectrum we should observe a hump around the seasonal frequencies.
4 The power spectrum

The power spectrum of a stationary process is defined by the continuous function

$$f(\lambda) = \frac{1}{\pi}\Big[\gamma_0 + 2\sum_{\tau=1}^{\infty}\gamma_\tau\cos\lambda\tau\Big], \qquad 0 \le \lambda \le \pi.$$

Compare with the periodogram, $I(\lambda) = 2\big[C_0 + 2\sum_{p=1}^{T-1}C_p\cos\lambda p\big]$: the function $f(\lambda)$ is defined for all $\lambda$ and not just for $\lambda_j = 2\pi j/T$, and the summation runs to infinity. If $Y_t$ is white noise, $\gamma_\tau = 0$ for all $\tau \neq 0$ and $f(\lambda) = \gamma_0/\pi = \text{const}$. Thus the spectrum is flat: the process may be regarded as consisting of an infinite number of cyclical components, all of which have equal weight. (This is what defines white noise.)
5 The power spectrum (cont.)

Notice that $\int_0^\pi f(\lambda)\,d\lambda = \gamma_0$ (the area under the spectrum). In fact, this is true in general for stationary processes, because $\int_0^\pi \cos(\lambda\tau)\,d\lambda = 0$ for every $\tau \neq 0$. The function $f(\lambda)/\gamma_0$ is called the spectral density (it uses correlations instead of covariances).

Example, MA(1) process: $Y_t = \varepsilon_t + \theta\varepsilon_{t-1}$.

$$\gamma_0 = \sigma_\varepsilon^2(1+\theta^2), \qquad \gamma_1 = \theta\sigma_\varepsilon^2, \qquad \gamma_\tau = 0 \ (\tau \ge 2);$$

$$f(\lambda) = \frac{\sigma_\varepsilon^2}{\pi}\big[(1+\theta^2) + 2\theta\cos\lambda\big].$$

For $\theta = 0.5$, $f(\lambda) = \frac{\sigma_\varepsilon^2}{4\pi}\,[5 + 4\cos\lambda]$; for $\theta = -0.5$, $f(\lambda) = \frac{\sigma_\varepsilon^2}{4\pi}\,[5 - 4\cos\lambda]$.
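The MA(1) spectrum and the area identity can be verified numerically. A minimal sketch (with $\theta = 0.5$, $\sigma_\varepsilon^2 = 1$, so $\gamma_0 = 1.25$; names are mine):

```python
import numpy as np

# MA(1): Y_t = e_t + theta*e_{t-1}, Var(e) = 1
theta = 0.5
gamma0 = 1 + theta**2            # = 1.25
gamma1 = theta

lam = np.linspace(0, np.pi, 100001)
f = (gamma0 + 2 * gamma1 * np.cos(lam)) / np.pi   # general MA(1) spectrum

# closed form quoted in the notes for theta = 0.5
f_closed = (5 + 4 * np.cos(lam)) / (4 * np.pi)

# area under the spectrum via the trapezoid rule: should equal gamma_0
area = np.sum((f[1:] + f[:-1]) / 2 * np.diff(lam))
print(np.max(np.abs(f - f_closed)), area)
```

The two expressions for $f(\lambda)$ agree to machine precision, and the numerical area reproduces $\gamma_0 = 1.25$.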
6 [Figure: plot of $f(\lambda) = \frac{\sigma_\varepsilon^2}{4\pi}[5 + 4\cos\lambda]$ over $[0, \pi]$, the MA(1) spectrum with $\theta = 0.5$.]
7 [Figure: plot of $f(\lambda) = \frac{\sigma_\varepsilon^2}{4\pi}[5 - 4\cos\lambda]$ over $[0, \pi]$, the MA(1) spectrum with $\theta = -0.5$.]
8 Exercises

Exercise 5: Compute the power spectrum of the MA(2) model $Y_t = \varepsilon_t - 0.5\varepsilon_{t-1} - 0.6\varepsilon_{t-2}$, $\text{Var}(\varepsilon_t) = 1$.

Exercise 6: Let $X_t, Y_t$ be two independent stationary series.
6.1. Show that the series $Z_t = X_t + Y_t$ is also stationary.
6.2. Let $f_X(\lambda), f_Y(\lambda)$ denote the power spectra of the series $X, Y$. Show that $f_Z(\lambda) = f_X(\lambda) + f_Y(\lambda)$.
6.3. Compute $f_Z(\lambda)$ for the case where $X_t = \varepsilon_t - 0.5\varepsilon_{t-1}$, $\text{Var}(\varepsilon_t) = 1$; $Y_t = e_t - e_{t-1}$, $\text{Var}(e_t) = 1$; $\text{Cov}(\varepsilon_t, e_s) = 0$ for all $t, s$.
9 Properties of the periodogram as an estimator of the spectrum

Power spectrum: $f(\lambda) = \frac{1}{\pi}\big[\gamma_0 + 2\sum_{\tau=1}^{\infty}\gamma_\tau\cos\lambda\tau\big]$.
Periodogram: $I(\lambda) = 2\big[C_0 + 2\sum_{p=1}^{T-1}C_p\cos\lambda p\big]$.

It is natural to think of $I(\lambda)/2\pi$ evaluated at any $\lambda$ as an estimate of the spectrum. This is so because we would usually expect the covariances $\gamma(\tau)$, and hence $C(\tau)$, to decay to zero as $\tau \to \infty$. However, things are not so simple, and to illustrate the problem consider the case where $Y_1, \dots, Y_T$ are normal white noise.
10 Properties of the periodogram as an estimator of the power spectrum (cont.)

For the white noise case,

$$\frac{I(\lambda_K)}{2\pi} = \frac{T\big(\hat\alpha_K^2 + \hat\beta_K^2\big)}{4\pi} \sim \frac{\sigma^2}{2\pi}\,\chi^2(2) = f(\lambda_K)\,\frac{\chi^2(2)}{2}, \qquad E\Big[\frac{I(\lambda_K)}{2\pi}\Big] = \frac{\sigma^2}{\pi} = f(\lambda_K)$$

for $\lambda_K = 2\pi K/T$. However,

$$\text{Var}\Big[\frac{I(\lambda_K)}{2\pi}\Big] = \frac{4\sigma^4}{4\pi^2} = \frac{\sigma^4}{\pi^2} = f^2(\lambda_K).$$

Thus $I(\lambda_K)/2\pi$ is unbiased for $f(\lambda)$, but the variance does not decay to zero with $T$, so the estimator is not consistent. This happens because we actually estimate $T$ parameters (the covariances appearing in the expression for the periodogram) from $T$ observations. Each covariance estimate is $O(1/T)$, but the cumulative effect is $O(1)$: if $U = \sum_{i=1}^{T} V_i$ with uncorrelated $V_i$ and $\text{Var}(V_i) = c/T$, then $\text{Var}(U) = c \not\to 0$.
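The inconsistency is easy to see by simulation: for white noise with $\sigma^2 = 1$ the periodogram ordinate is distributed as $\sigma^2\chi^2(2)$ whatever the sample size, so its mean stays near 2 and its variance near 4 as $T$ grows. A minimal sketch (function and variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(1)

def periodogram_ordinate(y, j):
    """I(lambda_j) = (T/2)*(alpha_j^2 + beta_j^2) at lambda_j = 2*pi*j/T."""
    T = len(y)
    t = np.arange(1, T + 1)
    lam = 2 * np.pi * j / T
    a = (2 / T) * np.sum(y * np.cos(lam * t))
    b = (2 / T) * np.sum(y * np.sin(lam * t))
    return (T / 2) * (a**2 + b**2)

# White noise, sigma^2 = 1: mean ~ 2 and variance ~ 4 for every T,
# i.e. the variance of the ordinate does not shrink as T increases.
results = {}
for T in (100, 1600):
    vals = np.array([periodogram_ordinate(rng.normal(size=T), T // 4)
                     for _ in range(2000)])
    results[T] = (vals.mean(), vals.var())
    print(T, results[T])
```

Increasing $T$ by a factor of 16 leaves the sampling variability of a single ordinate essentially unchanged, which is exactly the inconsistency described above.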
11 Properties of the periodogram as an estimator of the power spectrum (cont.)

Notice also that the periodogram ordinates are uncorrelated, $\text{Cov}[\hat f(\lambda_j), \hat f(\lambda_K)] = 0$ for $j \neq K$, so that the periodogram has a very irregular appearance. It can be shown that for any stationary process,

$$\frac{2\hat f(\lambda)}{f(\lambda)} \xrightarrow{d} \chi^2(2) \quad \text{for all } 0 < \lambda < \pi; \qquad E[\hat f(\lambda)] \to f(\lambda), \quad \text{Var}[\hat f(\lambda)] \to f^2(\lambda).$$

Also, the ordinates at different frequencies are asymptotically independent, giving rise to the erratic form of the estimates. In the white noise case,

$$\frac{\hat f(\lambda_K)}{f(\lambda_K)} = \frac{T\big(\hat\alpha_K^2 + \hat\beta_K^2\big)/4\pi}{\sigma^2/\pi} = \frac{T\big(\hat\alpha_K^2 + \hat\beta_K^2\big)}{4\sigma^2}.$$
12 Estimation of the power spectrum

We are interested in estimating a continuous function, and hence we would like our estimate to be a smooth function.

Spectral analysis: how to estimate the power spectrum consistently.

Assumption: the series is stationary, i.e., no trend or deterministic seasonality. This is also needed from a practical point of view: otherwise we may not be able to detect frequencies other than those corresponding to the deterministic trends and seasonals.
13 More on the stationarity requirement

The stationarity requirement seems to rule out the use of the model

$$Y_t = \alpha_0 + \sum_{k=1}^{[n/2]}\big(u_k\cos\lambda_k t + v_k\sin\lambda_k t\big) + \varepsilon_t.$$

However, if we assume that the $u_k$ and $v_k$ are random variables satisfying

$$E(u_k) = E(v_k) = 0; \quad \text{Var}(u_k) = \text{Var}(v_k) = \sigma_k^2; \quad E(u_iu_j) = E(v_iv_j) = 0,\ i \neq j; \quad E(u_iv_j) = 0 \text{ for all } i, j,$$

then $E(Y_t) = \alpha_0$, $\text{Var}(Y_t) = \sum_{k=1}^{[n/2]}\sigma_k^2 + \sigma_\varepsilon^2$, and for $\tau \neq 0$,

$$\gamma(\tau) = E\big[(Y_t - \alpha_0)(Y_{t-\tau} - \alpha_0)\big] = \sum_{k=1}^{[n/2]}\sigma_k^2\big\{\cos(\lambda_k t)\cos[\lambda_k(t-\tau)] + \sin(\lambda_k t)\sin[\lambda_k(t-\tau)]\big\} = \sum_{k=1}^{[n/2]}\sigma_k^2\cos(\lambda_k\tau),$$

so the process is stationary.
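The covariance calculation above can be checked by Monte Carlo for a single harmonic: with random $u, v$, the product moment $E[Y_tY_{t+\tau}]$ depends only on $\tau$, not on $t$, and equals $\sigma^2\cos(\lambda\tau)$. A minimal sketch (the parameter values are arbitrary illustrations, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(2)
lam, sig2, sig_eps2 = 0.7, 1.5, 0.5    # illustrative values
R = 200_000                            # Monte Carlo replications
t, tau = 10, 3

# Random amplitudes u, v and independent noise for the two time points
u = rng.normal(scale=np.sqrt(sig2), size=R)
v = rng.normal(scale=np.sqrt(sig2), size=R)
e1 = rng.normal(scale=np.sqrt(sig_eps2), size=R)
e2 = rng.normal(scale=np.sqrt(sig_eps2), size=R)

y_t = u * np.cos(lam * t) + v * np.sin(lam * t) + e1
y_s = u * np.cos(lam * (t + tau)) + v * np.sin(lam * (t + tau)) + e2

mc_cov = np.mean(y_t * y_s)            # Monte Carlo estimate of gamma(tau)
theory = sig2 * np.cos(lam * tau)      # sigma^2 * cos(lambda * tau)
print(mc_cov, theory)
```

Repeating the experiment with a different $t$ gives the same answer, which is the stationarity claim.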
14 Stationarity requirement (cont.)

The signal is stationary, and yet deterministic (fixed) for any given realization, so it is more a matter of interpretation.
15 Estimation of the power spectrum

Method 1: use estimators of the form

$$\hat f(\lambda) = \frac{1}{\pi}\Big[w_0C_0 + 2\sum_{k=1}^{M}w_kC_k\cos(\lambda k)\Big],$$

where the $\{w_k\}$ are decreasing weights called the lag window and $M \ll T$ is the truncation point.

Basic idea: we only consider some of the covariances, and since the precision of the estimates $C_k$ decreases as $k$ increases, the weights decrease in $k$.

Under normality assumptions, the periodogram estimates $\hat I(\lambda_K)$ are sufficient statistics for the periodograms $I(\lambda_K)$ (considered as parameters). This means that if we are looking for smoothed estimates of the power spectrum, all that we need to do is smooth the periodogram estimators. Indeed, the estimator $\hat f(\lambda)$ defined above can be shown to be a smoothed function of the periodograms.
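Method 1 is straightforward to implement. A minimal sketch (the function name and the simple linearly decaying weights are my illustration; the standard windows follow on the next slide):

```python
import numpy as np

def lag_window_spectrum(y, lam, weights):
    """Method 1: f_hat(lam) = (1/pi)*[w_0*C_0 + 2*sum_{k=1}^M w_k*C_k*cos(lam*k)].

    `weights` holds (w_0, ..., w_M); C_k are sample autocovariances (divisor T)."""
    T = len(y)
    yc = y - y.mean()
    M = len(weights) - 1
    C = np.array([np.sum(yc[:T - k] * yc[k:]) / T for k in range(M + 1)])
    k = np.arange(1, M + 1)
    lam = np.atleast_1d(lam)
    return (weights[0] * C[0]
            + 2 * np.sum(weights[1:] * C[1:] * np.cos(np.outer(lam, k)), axis=1)) / np.pi

# White noise check: the estimate should hover around f(lam) = sigma^2/pi
rng = np.random.default_rng(3)
y = rng.normal(size=2000)
M = 20
w = 1 - np.arange(M + 1) / (M + 1)     # simple Bartlett-type decaying weights
lams = np.linspace(0.1, 3.0, 30)
fhat = lag_window_spectrum(y, lams, w)
print(fhat.mean(), 1 / np.pi)
```

With $M = 20 \ll T = 2000$ the estimate is smooth and close to the flat white-noise spectrum $1/\pi \approx 0.318$, unlike the raw periodogram.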
16 Lag windows in common use

1) Parzen window:
$$w_k = \begin{cases} 1 - 6(k/M)^2 + 6(k/M)^3, & 0 \le k \le M/2, \\ 2(1 - k/M)^3, & M/2 < k \le M, \\ 0, & k > M. \end{cases}$$

2) Tukey window:
$$w_k = \begin{cases} \tfrac12\big[1 + \cos(\pi k/M)\big], & k = 0, \dots, M, \\ 0, & k > M. \end{cases}$$

The two windows are similar, but Tukey's weights are somewhat larger.

3) Daniell (rectangular) window:
$$w_k = \frac{\sin(\pi k/M)}{\pi k/M}, \quad k = 1, 2, \dots; \qquad w_0 = 1.$$
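The three windows can be coded directly from the definitions; comparing them at a common lag confirms that Tukey's weights dominate Parzen's, as noted above. A minimal sketch (function names are mine):

```python
import numpy as np

def parzen(k, M):
    r = np.abs(k) / M
    w = np.where(r <= 0.5, 1 - 6 * r**2 + 6 * r**3, 2 * (1 - r)**3)
    return np.where(r <= 1, w, 0.0)

def tukey(k, M):
    r = np.abs(k) / M
    return np.where(r <= 1, 0.5 * (1 + np.cos(np.pi * r)), 0.0)

def daniell(k, M):
    # np.sinc(x) = sin(pi*x)/(pi*x), so sinc(k/M) = sin(pi*k/M)/(pi*k/M), with w_0 = 1
    return np.sinc(np.abs(k) / M)

M = 20
k = np.arange(0, M + 1)
for name, fn in [("Parzen", parzen), ("Tukey", tukey), ("Daniell", daniell)]:
    w = fn(k, M)
    print(name, w[0], round(float(w[5]), 3))
```

All three start at $w_0 = 1$; at $k = 5$, $M = 20$ the Tukey weight ($\approx 0.854$) exceeds the Parzen weight ($\approx 0.719$).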
17 Sampling properties of spectral estimates

Suppose first that $w_k = 1$ for $k = 0, 1, \dots, M$, so that

$$\hat f(\lambda) = \frac{1}{\pi}\Big[C_0 + 2\sum_{k=1}^{M}C_k\cos(\lambda k)\Big].$$

Since $\gamma_k = E\big[(Y_t - \mu_y)(Y_{t+k} - \mu_y)\big]$, the estimator $C_k$, obtained by substituting $\bar Y$ for $\mu_y$ and dividing by $T$ instead of $T - k$, is consistent for $\gamma_k$. We find that for $M, T \to \infty$ with $M/T \to 0$,

$$E[\hat f(\lambda)] \approx \frac{1}{\pi}\Big[\gamma_0 + 2\sum_{k=1}^{M}\Big(1 - \frac{k}{T}\Big)\gamma_k\cos(\lambda k)\Big] \to f(\lambda)$$

and $\text{Var}[\hat f(\lambda)] \to 0$, since we only estimate $M$ covariances and $M \ll T$.
18 Properties of spectral estimates (cont.)

$$\hat f(\lambda) = \frac{1}{\pi}\Big[w_0C_0 + 2\sum_{k=1}^{M}w_kC_k\cos(\lambda k)\Big].$$

The following results can be shown to hold:

$$\lim_{T\to\infty} E[\hat f(\lambda)] = f(\lambda) \text{ for all } \lambda; \qquad \text{Var}[\hat f(\lambda)] \approx (1 + \delta_{\lambda,0,\pi})\,\frac{f^2(\lambda)}{T}\sum_{k=-M}^{M}w_k^2,$$

where $w_{-k} = w_k$ and $\delta_{\lambda,0,\pi} = 1$ if $\lambda = 0, \pi$ and $0$ otherwise. Also, $\hat f(\lambda_1)$ and $\hat f(\lambda_2)$ are positively correlated when $\lambda_1 \neq \lambda_2$, and the correlation increases as $|\lambda_1 - \lambda_2|$ decreases, so $\hat f(\lambda)$ is a smooth function. The asymptotic results assume certain behaviour of the weights $w_k$. For consistency it is sufficient that $M \to \infty$ with $M/T \to 0$ as $T \to \infty$.
19 Bias and variance of window estimates

Window      Bias                    Var[f̂(λ)] / [(M/T) f²(λ)]
Parzen      −(6/M²) f″(λ)           0.54
Tukey       −(π²/4M²) f″(λ)         0.75
Daniell     −(π²/6M²) f″(λ)         1.00

The smaller the bias, the larger the variance. The values in the table assume λ ≠ 0, π; for λ = 0, π, the variances should be multiplied by 2.
20 Example: MA(2)

$Y_t = \varepsilon_t - 0.5\varepsilon_{t-1} - 0.6\varepsilon_{t-2}$, $\text{Var}(\varepsilon_t) = 1$.

$$\gamma(0) = 1 + 0.25 + 0.36 = 1.61; \qquad \gamma(1) = -0.5 + 0.3 = -0.2; \qquad \gamma(2) = -0.6.$$

$$f(\lambda) = \frac{1}{\pi}\Big[\gamma(0) + 2\sum_{\tau=1}^{2}\gamma(\tau)\cos(\lambda\tau)\Big] = \frac{1}{\pi}\big[1.61 - 0.4\cos\lambda - 1.2\cos 2\lambda\big].$$

$$f'(\lambda) = \frac{1}{\pi}\big[0.4\sin\lambda + 2.4\sin 2\lambda\big], \qquad f''(\lambda) = \frac{1}{\pi}\big[0.4\cos\lambda + 4.8\cos 2\lambda\big].$$
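The MA(2) autocovariances and the derivative formula can be verified numerically: the $\gamma(\tau)$ follow from the moving-average coefficients, and a central-difference quotient of $f$ should reproduce $f'$. A minimal sketch (names are mine):

```python
import numpy as np

# Y_t = e_t - 0.5*e_{t-1} - 0.6*e_{t-2}, Var(e) = 1
th = np.array([1.0, -0.5, -0.6])
# gamma(k) = sum_j th_j * th_{j+k} for an MA process with unit innovation variance
gamma = np.array([np.sum(th[:len(th) - k] * th[k:]) for k in range(3)])
print(gamma)                      # expect [1.61, -0.2, -0.6]

def f(lam):
    return (gamma[0] + 2 * gamma[1] * np.cos(lam) + 2 * gamma[2] * np.cos(2 * lam)) / np.pi

# numerical derivative vs the closed form f'(lam) = (0.4 sin lam + 2.4 sin 2lam)/pi
lam0, h = 1.1, 1e-5
fp_num = (f(lam0 + h) - f(lam0 - h)) / (2 * h)
fp_formula = (0.4 * np.sin(lam0) + 2.4 * np.sin(2 * lam0)) / np.pi
print(fp_num, fp_formula)
```

Both checks confirm the algebra on this slide.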
21 [Figure: plot of $f(\lambda) = \frac{1}{\pi}\big[1.61 - 0.4\cos\lambda - 1.2\cos 2\lambda\big]$ over $[0, \pi]$, $\sigma_\varepsilon^2 = 1$.]
22 Example (cont.)

Suppose that $T = 100$, $M = 20$. For the MA(2) model above,

$$f'(\pi/2) \approx 0.13, \quad f'(\pi) = 0; \qquad f''(\pi/2) \approx -1.53, \quad f''(\pi) \approx 1.40;$$

$$(M/T)\,f^2(\pi/2) \approx 0.16, \qquad (M/T)\,f^2(\pi) \approx 0.013.$$

Substituting into the bias and variance formulas of the three windows (doubling the variance at $\lambda = \pi$) gives approximately:

                λ = π/2                     λ = π
Window      Bias     Var     MSE        Bias     Var     MSE
Parzen      0.023    0.086   0.087     −0.021   0.014   0.015
Tukey       0.009    0.120   0.120     −0.009   0.020   0.020
Daniell     0.006    0.160   0.160     −0.006   0.027   0.027
23 Exercise 7: Compute the bias, variance and MSE of the Parzen, Tukey and Daniell windows at $\lambda = 0.1, 0.5, 0.9$ for the model $Y_t = \varepsilon_t - \varepsilon_{t-1}$, $\sigma_\varepsilon^2 = 1$. Assume $T = 100$, $M = 20$.

Choice of M (truncation point)

M can be chosen in an optimal way if the underlying process is known. It is a compromise between bias and variance: the smaller M, the smaller the variance but the larger the bias. Small values of M give a possibly too smooth estimate, which may give an idea of the large peaks (but may hide the small peaks). Large values of M give possibly too many peaks (as with the periodogram).
24 Choice of M (cont.)

A possible procedure for determining M: use the estimated autocorrelations and set M such that $\hat\rho_j \approx 0$ for $j > M$. This is not very efficient, because the $\hat\rho_j$ are themselves autocorrelated, so they tend to decay more slowly than the theoretical autocorrelations.

Another procedure is window closing, where we try several values of M, say $M_1 < M_2 < M_3$ with $M_3/M_1 \approx 4$, and study the results. Low values will highlight the large peaks; high values will show other important peaks but also spurious peaks; a compromise is achieved by studying the results for the intermediate values of M.
25 Choice of M (cont.)

It is sometimes recommended in the literature to choose M as a fixed proportion of T, but then $M/T \not\to 0$ as $T \to \infty$, and there is no theoretical justification for this strategy. For example, in the white noise case the best choice is $M = 0$.

In principle we can estimate the spectrum at any value of $\lambda$, but often it is evaluated at the frequencies $j\pi/Q$, where Q is chosen to be sufficiently large to show all the important cycles (a common choice is $Q = M$).
26 Estimation of the power spectrum, an alternative method

So far we considered estimators of the form

$$\hat f(\lambda) = \frac{1}{\pi}\Big[w_0C_0 + 2\sum_{k=1}^{M}w_kC_k\cos(\lambda k)\Big].$$

Method 2: an alternative is to use weighted averages of the periodogram ordinates, i.e.,

$$\hat f(\lambda_j) = \sum_{l=-m}^{m}w_l^{*}\,\hat I(\lambda_{j+l})\big/2\pi, \qquad \lambda_j = 2\pi j/T,$$

where $l$ varies over $2m+1$ successive integers so that the $\lambda_{j+l}$ are symmetric around the $\lambda_j$ for which the estimator is computed.

Example (equal weights):

$$\hat f(\lambda_j) = \frac{1}{2m+1}\sum_{l=-m}^{m}\hat I(\lambda_{j+l})\big/2\pi.$$

See later for other examples of weights.
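The equal-weights version of Method 2 amounts to a moving average of the periodogram, which is conveniently computed via the FFT. A minimal sketch (function name and the $m = 15$ choice are mine; the $2/T$ normalization matches the periodogram definition used earlier):

```python
import numpy as np

def smoothed_periodogram(y, m):
    """Method 2 with equal weights: average 2m+1 neighbouring ordinates."""
    T = len(y)
    yc = y - y.mean()
    J = np.fft.rfft(yc)
    I = (2.0 / T) * np.abs(J)**2         # periodogram ordinates I(lambda_j)
    f_raw = I / (2 * np.pi)              # raw spectrum estimate I/(2*pi)
    kern = np.ones(2 * m + 1) / (2 * m + 1)
    f_sm = np.convolve(f_raw, kern, mode="same")
    return f_raw, f_sm

rng = np.random.default_rng(4)
y = rng.normal(size=4096)                # white noise, true f = 1/pi
f_raw, f_sm = smoothed_periodogram(y, m=15)
inner = slice(50, -50)                   # ignore edge ordinates
print(np.std(f_raw[inner]), np.std(f_sm[inner]))
```

Averaging $2m+1 = 31$ ordinates cuts the scatter by roughly $\sqrt{31}$ while the level stays near the true flat spectrum $1/\pi$.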
27 Alternative method (cont.)

General expression of the estimators:

$$\hat f(\lambda_j) = \sum_{l=-m}^{m}w_l^{*}\,\hat I(\lambda_{j+l})\big/2\pi; \qquad w_{-l}^{*} = w_l^{*}, \quad w_0^{*} \ge w_1^{*} \ge \dots \ge w_m^{*} > 0, \quad \sum_{l=-m}^{m}w_l^{*} = 1.$$

At the end points,

$$\hat f(0) = \Big[w_0^{*}\hat I(0) + 2\sum_{l=1}^{m}w_l^{*}\hat I(\lambda_l)\Big]\Big/2\pi, \qquad \hat f(\pi) = \Big[w_0^{*}\hat I(\pi) + 2\sum_{l=1}^{m}w_l^{*}\hat I(\pi - \lambda_l)\Big]\Big/2\pi,$$

using the symmetry of the periodogram around 0 and $\pi$, with $\hat I(0) = 0$.
28 Rationale of Method 2

The periodogram estimators are asymptotically unbiased, and since neighbouring estimators are asymptotically uncorrelated, the variance of $\hat f(\lambda)$ obtained this way is of order $1/(2m+1)$. However, $\hat f(\lambda)$ is generally biased:

$$E[\hat f(\lambda_j)] = \sum_{l=-m}^{m}w_l^{*}\,E\big[\hat I(\lambda_{j+l})/2\pi\big] \approx \sum_{l=-m}^{m}w_l^{*}\,f(\lambda_{j+l}) \neq f(\lambda_j).$$

However, if $f(\lambda)$ is approximately linear around $\lambda_j$, the bias will be small, since with symmetric weights the linear terms cancel.
29 Method 2 (cont.)

$$\hat f(\lambda_j) = \sum_{l=-m}^{m}w_l^{*}\,\hat I(\lambda_{j+l})\big/2\pi.$$

In general, the bias is small if $m$ is small and $f(\lambda)$ is a reasonably smooth function of $\lambda$.

Choice of m: the same problem as with M under the previous procedure. The larger m, the smaller the variance but the larger the bias.

Advantage of Method 2 over Method 1: Method 2 is more efficient computationally, particularly with the use of the Fast Fourier Transform, and this is now the standard method used in computer software.
30 Confidence intervals under Method 1

It can be shown that for the estimator

$$\hat f(\lambda) = \frac{1}{\pi}\Big[w_0C_0 + 2\sum_{k=1}^{M}w_kC_k\cos(\lambda k)\Big] \quad \text{(Method 1)},$$

asymptotically

$$\nu\,\hat f(\lambda)/f(\lambda) \sim \chi^2_\nu, \qquad \nu = 2T\Big/\Big[w_0^2 + 2\sum_{k=1}^{M}w_k^2\Big].$$

A $100(1-\alpha)\%$ confidence interval for $f(\lambda)$ is

$$\Big[\frac{\nu\,\hat f(\lambda)}{\chi^2_{\nu,\,\alpha/2}},\ \frac{\nu\,\hat f(\lambda)}{\chi^2_{\nu,\,1-\alpha/2}}\Big],$$

where $\chi^2_{\nu,\,p}$ denotes the upper-$p$ quantile of $\chi^2_\nu$. For the Parzen window, $\nu = 3.71\,T/M$; for the Tukey window, $\nu = 2.67\,T/M$.
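Computing such an interval requires chi-square quantiles; the sketch below approximates them with the Wilson–Hilferty transformation so that only the standard library's `statistics.NormalDist` is needed (the function names, the $\hat f = 0.4$ input, and the approximation choice are mine, not from the notes):

```python
from statistics import NormalDist

def chi2_ppf(p, nu):
    """Wilson-Hilferty approximation to the chi-square p-quantile (lower tail)."""
    z = NormalDist().inv_cdf(p)
    return nu * (1 - 2 / (9 * nu) + z * (2 / (9 * nu))**0.5)**3

def spectrum_ci(f_hat, T, M, window="parzen", alpha=0.05):
    """CI for f(lam) from a Method-1 estimate, using nu = 3.71*T/M (Parzen)
    or nu = 2.67*T/M (Tukey) as quoted in the notes."""
    nu = {"parzen": 3.71, "tukey": 2.67}[window] * T / M
    lo = nu * f_hat / chi2_ppf(1 - alpha / 2, nu)   # divide by the large quantile
    hi = nu * f_hat / chi2_ppf(alpha / 2, nu)       # divide by the small quantile
    return lo, hi

# Example: f_hat = 0.4 at some frequency, T = 100, M = 20 -> nu ~ 18.6
lo, hi = spectrum_ci(f_hat=0.4, T=100, M=20, window="parzen")
print(round(lo, 3), round(hi, 3))
```

Note the inversion: the lower confidence limit divides by the upper quantile and vice versa, so the interval is asymmetric around $\hat f$, as expected for a chi-square pivot.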
IDENTIFICATION OF ARMA MODELS A stationary stochastic process can be characterised, equivalently, by its autocovariance function or its partial autocovariance function. It can also be characterised by
More informationTIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M.
TIME SERIES ANALYSIS Forecasting and Control Fifth Edition GEORGE E. P. BOX GWILYM M. JENKINS GREGORY C. REINSEL GRETA M. LJUNG Wiley CONTENTS PREFACE TO THE FIFTH EDITION PREFACE TO THE FOURTH EDITION
More informationSome Time-Series Models
Some Time-Series Models Outline 1. Stochastic processes and their properties 2. Stationary processes 3. Some properties of the autocorrelation function 4. Some useful models Purely random processes, random
More informationChapter 12: An introduction to Time Series Analysis. Chapter 12: An introduction to Time Series Analysis
Chapter 12: An introduction to Time Series Analysis Introduction In this chapter, we will discuss forecasting with single-series (univariate) Box-Jenkins models. The common name of the models is Auto-Regressive
More informationPart II. Time Series
Part II Time Series 12 Introduction This Part is mainly a summary of the book of Brockwell and Davis (2002). Additionally the textbook Shumway and Stoffer (2010) can be recommended. 1 Our purpose is to
More informationStatistics: Learning models from data
DS-GA 1002 Lecture notes 5 October 19, 2015 Statistics: Learning models from data Learning models from data that are assumed to be generated probabilistically from a certain unknown distribution is a crucial
More informationMA Advanced Econometrics: Applying Least Squares to Time Series
MA Advanced Econometrics: Applying Least Squares to Time Series Karl Whelan School of Economics, UCD February 15, 2011 Karl Whelan (UCD) Time Series February 15, 2011 1 / 24 Part I Time Series: Standard
More informationGeneralised AR and MA Models and Applications
Chapter 3 Generalised AR and MA Models and Applications 3.1 Generalised Autoregressive Processes Consider an AR1) process given by 1 αb)x t = Z t ; α < 1. In this case, the acf is, ρ k = α k for k 0 and
More informationLong-range dependence
Long-range dependence Kechagias Stefanos University of North Carolina at Chapel Hill May 23, 2013 Kechagias Stefanos (UNC) Long-range dependence May 23, 2013 1 / 45 Outline 1 Introduction to time series
More informationEconomic modelling and forecasting
Economic modelling and forecasting 2-6 February 2015 Bank of England he generalised method of moments Ole Rummel Adviser, CCBS at the Bank of England ole.rummel@bankofengland.co.uk Outline Classical estimation
More informationEASTERN MEDITERRANEAN UNIVERSITY ECON 604, FALL 2007 DEPARTMENT OF ECONOMICS MEHMET BALCILAR ARIMA MODELS: IDENTIFICATION
ARIMA MODELS: IDENTIFICATION A. Autocorrelations and Partial Autocorrelations 1. Summary of What We Know So Far: a) Series y t is to be modeled by Box-Jenkins methods. The first step was to convert y t
More information11. Further Issues in Using OLS with TS Data
11. Further Issues in Using OLS with TS Data With TS, including lags of the dependent variable often allow us to fit much better the variation in y Exact distribution theory is rarely available in TS applications,
More informationProbability and Statistics
Probability and Statistics 1 Contents some stochastic processes Stationary Stochastic Processes 2 4. Some Stochastic Processes 4.1 Bernoulli process 4.2 Binomial process 4.3 Sine wave process 4.4 Random-telegraph
More informationII. Nonparametric Spectrum Estimation for Stationary Random Signals - Non-parametric Methods -
II. onparametric Spectrum Estimation for Stationary Random Signals - on-parametric Methods - - [p. 3] Periodogram - [p. 12] Periodogram properties - [p. 23] Modified periodogram - [p. 25] Bartlett s method
More informationEconomics Department LSE. Econometrics: Timeseries EXERCISE 1: SERIAL CORRELATION (ANALYTICAL)
Economics Department LSE EC402 Lent 2015 Danny Quah TW1.10.01A x7535 : Timeseries EXERCISE 1: SERIAL CORRELATION (ANALYTICAL) 1. Suppose ɛ is w.n. (0, σ 2 ), ρ < 1, and W t = ρw t 1 + ɛ t, for t = 1, 2,....
More informationChapter 2: The Fourier Transform
EEE, EEE Part A : Digital Signal Processing Chapter Chapter : he Fourier ransform he Fourier ransform. Introduction he sampled Fourier transform of a periodic, discrete-time signal is nown as the discrete
More informationE 4160 Autumn term Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test
E 4160 Autumn term 2016. Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test Ragnar Nymoen Department of Economics, University of Oslo 24 October
More informationSignals and Spectra (1A) Young Won Lim 11/26/12
Signals and Spectra (A) Copyright (c) 202 Young W. Lim. Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version.2 or any later
More informationRandom processes and probability distributions. Phys 420/580 Lecture 20
Random processes and probability distributions Phys 420/580 Lecture 20 Random processes Many physical processes are random in character: e.g., nuclear decay (Poisson distributed event count) P (k, τ) =
More informationChapter 9: Forecasting
Chapter 9: Forecasting One of the critical goals of time series analysis is to forecast (predict) the values of the time series at times in the future. When forecasting, we ideally should evaluate the
More informationUse of the Autocorrelation Function for Frequency Stability Analysis
Use of the Autocorrelation Function for Frequency Stability Analysis W.J. Riley, Hamilton Technical Services Introduction This paper describes the use of the autocorrelation function (ACF) as a complement
More informationTime Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY
Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY & Contents PREFACE xiii 1 1.1. 1.2. Difference Equations First-Order Difference Equations 1 /?th-order Difference
More informationCross-Sectional Vs. Time Series Benchmarking in Small Area Estimation; Which Approach Should We Use? Danny Pfeffermann
Cross-Sectional Vs. Time Series Benchmarking in Small Area Estimation; Which Approach Should We Use? Danny Pfeffermann Joint work with Anna Sikov and Richard Tiller Graybill Conference on Modern Survey
More information