Linear Stochastic Models. Special Types of Random Processes: AR, MA, and ARMA. Digital Signal Processing


1 Linear Stochastic Models. Special Types of Random Processes: AR, MA, and ARMA. Digital Signal Processing, Department of Electrical and Electronic Engineering, Imperial College. © Danilo P. Mandic, Digital Signal Processing

2 Motivation: Wold Decomposition Theorem. The most fundamental justification for time series analysis is Wold's decomposition theorem, which proves that any (stationary) time series can be decomposed into two different parts. A general random process can therefore be written as a sum of two processes

x[n] = x_p[n] + x_r[n]

where x_r[n] is a regular random process and x_p[n] is a predictable process, with x_r[n] ⊥ x_p[n], that is, E{x_r[m] x_p[n]} = 0. We can therefore treat the predictable process (i.e. a deterministic signal) and the random signal separately.
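The orthogonality of the two parts can be illustrated numerically. A minimal sketch in Python (the sinusoid, its frequency, and the sample size are arbitrary choices for illustration, not taken from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
n = np.arange(10_000)

# Predictable part: a deterministic sinusoid
x_p = np.sin(2 * np.pi * 0.05 * n)
# Regular part: white noise, generated independently of x_p
x_r = rng.standard_normal(n.size)

# The Wold sum
x = x_p + x_r

# Sample estimate of E{x_r[n] x_p[n]}: close to zero by construction
cross = float(np.mean(x_p * x_r))
```

The sample cross-moment shrinks towards zero as the record length grows, mirroring the condition E{x_r[m] x_p[n]} = 0 above.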

3 What do we actually mean? a) Periodic oscillations b) Small nonlinearity c) Route to chaos d) Route to chaos e) Small noise f) HMM and others

4 Example from brain science. (Figure: electrode positions, raw EEG, and the useful signal.)

5 Linear Stochastic Processes. It therefore follows that the general form of the power spectrum of a WSS process is

P_x(e^{jω}) = P_{x_r}(e^{jω}) + Σ_{k=1}^{N} α_k u(ω − ω_k)

We look at processes generated by filtering white noise with a linear shift invariant filter that has a rational system function. These include the:
Autoregressive (AR): all-pole system
Moving Average (MA): all-zero system
Autoregressive Moving Average (ARMA): poles and zeros
Notice the difference between shift invariance and time invariance.

6 ACF and Spectrum of ARMA models. Of much interest are the autocorrelation function and power spectrum of these processes. (Recall that the ACF and the PSD are equivalent in terms of the available information.) Suppose that we filter white noise w[n] with a causal linear shift invariant filter having a rational system function with p poles and q zeros

H(z) = B_q(z) / A_p(z) = [ Σ_{k=0}^{q} b_q(k) z^{-k} ] / [ 1 + Σ_{k=1}^{p} a_p(k) z^{-k} ]

Assuming that the filter is stable, the output process x[n] will be wide sense stationary and, with P_w = σ_w^2, the power spectrum of x[n] will be

P_x(z) = σ_w^2 B_q(z) B_q(z^{-1}) / [ A_p(z) A_p(z^{-1}) ]

Recall that negation in analogue frequency corresponds to z^{-1} in digital frequency.
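The power spectrum relation above can be evaluated on the unit circle with a standard filter frequency response routine. A short Python/SciPy sketch (the ARMA(2,2) coefficients below are made up for illustration, not from the slides):

```python
import numpy as np
from scipy import signal

sigma_w2 = 1.0                 # white noise power sigma_w^2
b = [1.0, 0.5, 0.2]            # B_q(z) coefficients (hypothetical)
a = [1.0, -0.6, 0.3]           # A_p(z) coefficients (hypothetical)

# H(e^{j omega}) evaluated on a grid of digital frequencies
w, H = signal.freqz(b, a, worN=512)

# P_x(e^{j omega}) = sigma_w^2 |B_q|^2 / |A_p|^2 = sigma_w^2 |H|^2
Px = sigma_w2 * np.abs(H) ** 2
```

At ω = 0 this reduces to σ_w^2 (Σ b_k)^2 / (Σ a_k)^2, a quick sanity check on any implementation.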

7 Frequency Domain. In terms of the digital frequency θ (unit circle, e^{jθ} = e^{jωT}):
B_q(z) B_q(z^{-1}) is a quadratic form and real valued
A_p(z) A_p(z^{-1}) is a quadratic form and real valued

P_x(e^{jθ}) = σ_w^2 |B_q(e^{jθ})|^2 / |A_p(e^{jθ})|^2

We are therefore using H(z) to shape the spectrum of white noise. A process having a power spectrum of this form is known as an autoregressive moving average process of order (p,q) and is referred to as an ARMA(p,q) process.

8 Example. Plot the power spectrum of an ARMA(2,2) process for which the zeros of H(z) are at z = 0.95 e^{±jπ/2} and the poles are at z = 0.9 e^{±j2π/5}. Solution: the system function is (poles act as a resonance, zeros as a sink)

H(z) = (1 + 0.9025 z^{-2}) / (1 − 0.5562 z^{-1} + 0.81 z^{-2})

(Figure: power spectrum versus frequency.)
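The numerator and denominator above can be checked by expanding the pole and zero pairs into polynomial coefficients, e.g. with NumPy's np.poly, which returns the monic polynomial having the given roots:

```python
import numpy as np

# Zeros at 0.95 e^{+-j pi/2}, poles at 0.9 e^{+-j 2 pi/5}
zeros = 0.95 * np.exp(np.array([1j, -1j]) * np.pi / 2)
poles = 0.9 * np.exp(np.array([1j, -1j]) * 2 * np.pi / 5)

# Coefficients of H(z) in powers of z^{-1}
b = np.real(np.poly(zeros))   # numerator:   1 + 0.9025 z^-2
a = np.real(np.poly(poles))   # denominator: 1 - 0.5562 z^-1 + 0.81 z^-2
```

The conjugate-pair structure guarantees real coefficients: the middle denominator term is −2(0.9)cos(2π/5) ≈ −0.5562 and the last is 0.9² = 0.81.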

9 Difference Equation Representation. The random processes x[n] and w[n] are related by the linear constant coefficient equation

x[n] − Σ_{l=1}^{p} a_p(l) x[n−l] = Σ_{l=0}^{q} b_q(l) w[n−l]

Notice that the autocorrelation function of x[n] and the crosscorrelation between x[n] and w[n] follow the same difference equation, i.e. if we multiply both sides of the above equation by x[n−k] and take the expected value, we have

r_xx(k) − Σ_{l=1}^{p} a_p(l) r_xx(k−l) = Σ_{l=0}^{q} b_q(l) r_xw(k−l)

Since x[n] is WSS, it follows that x[n] and w[n] are jointly WSS.

10 General Linear Processes: Stationarity and Invertibility. Consider a linear stochastic process at the output of a linear filter driven by WGN w[n]

x[n] = w[n] + b_1 w[n−1] + b_2 w[n−2] + ⋯ = w[n] + Σ_{j=1}^{∞} b_j w[n−j]

that is, a weighted sum of past inputs w[n−j]. For this to be a valid stationary process, the coefficients must be absolutely summable, that is Σ_{j=1}^{∞} |b_j| < ∞. The model implies that, under suitable conditions, x[n] is also a weighted sum of its past values plus an added shock w[n], that is

x[n] = a_1 x[n−1] + a_2 x[n−2] + ⋯ + w[n]

A linear process is stationary if Σ_{j=1}^{∞} |b_j| < ∞
A linear process is invertible if Σ_{j=1}^{∞} |a_j| < ∞

11 Are these ARMA(p,q) processes? Unit step

u[n] = { 0, n < 0;  1, n ≥ 0 }

If w[n] = δ[n], then u[n] = u[n−1] + w[n], n ≥ 0.

Ramp function

r[n] = { 0, n < 0;  n, n ≥ 0 }

If w[n] = u[n], then r[n] = r[n−1] + w[n], n ≥ 0.

12 Autoregressive Processes. A general AR(p) process (autoregressive of order p) is given by

x[n] = a_1 x[n−1] + ⋯ + a_p x[n−p] + w[n] = Σ_{i=1}^{p} a_i x[n−i] + w[n]

Observe the auto regression above. Duality between AR and MA processes: for instance, the first order autoregressive process

x[n] = a_1 x[n−1] + w[n] = Σ_{j=0}^{∞} a_1^j w[n−j]

Due to its all-pole nature, this follows the duality between IIR and FIR filters.

13 ACF and Spectrum of AR Processes. To obtain the autocorrelation function of an AR process, multiply the above equation by x[n−k] to obtain

x[n−k] x[n] = a_1 x[n−k] x[n−1] + a_2 x[n−k] x[n−2] + ⋯ + a_p x[n−k] x[n−p] + x[n−k] w[n]

Notice that E{x[n−k] w[n]} vanishes for k > 0. Therefore we have

r_xx(k) = a_1 r_xx(k−1) + a_2 r_xx(k−2) + ⋯ + a_p r_xx(k−p),  k > 0

On dividing throughout by r_xx(0) we obtain

ρ(k) = a_1 ρ(k−1) + a_2 ρ(k−2) + ⋯ + a_p ρ(k−p),  k > 0

The parameters ρ(k) are the correlation coefficients.

14 Variance and Spectrum of AR Processes. Variance: for k = 0, the contribution of the term E{x[n−k] w[n]} is σ_w^2, and

r_xx(0) = a_1 r_xx(−1) + a_2 r_xx(−2) + ⋯ + a_p r_xx(−p) + σ_w^2

Divide by r_xx(0) = σ_x^2 to obtain

σ_x^2 = σ_w^2 / (1 − ρ_1 a_1 − ρ_2 a_2 − ⋯ − ρ_p a_p)

Spectrum:

P_xx(f) = 2σ_w^2 / |1 − a_1 e^{−j2πf} − ⋯ − a_p e^{−j2πpf}|^2,  0 ≤ f ≤ 1/2

Recall the spectrum of linear systems from the Course Introduction.

15 Yule Walker Equations. For k = 1, 2, ..., p, the general autocorrelation recursion yields a set of equations:

r_xx(1) = a_1 r_xx(0) + a_2 r_xx(1) + ⋯ + a_p r_xx(p−1)
r_xx(2) = a_1 r_xx(1) + a_2 r_xx(0) + ⋯ + a_p r_xx(p−2)
⋮
r_xx(p) = a_1 r_xx(p−1) + a_2 r_xx(p−2) + ⋯ + a_p r_xx(0)

These equations are called the Yule Walker or normal equations. Their solution gives us the set of autoregressive parameters a = [a_1, ..., a_p]^T. This can be expressed in vector matrix form as

a = R_xx^{-1} r_xx

Due to the Toeplitz structure of R_xx, its positive definiteness enables the matrix inversion.
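The Toeplitz system can be solved efficiently with SciPy's Levinson-based solver. A sketch using the exact normalised ACF of a hypothetical AR(2) process with a_1 = 1.2, a_2 = −0.8 (these example values are ours):

```python
import numpy as np
from scipy.linalg import solve_toeplitz

a1, a2 = 1.2, -0.8                 # hypothetical AR(2) parameters

# Exact correlation coefficients from the Yule-Walker relations
rho1 = a1 / (1 - a2)
rho2 = a1 * rho1 + a2
rho = np.array([1.0, rho1, rho2])  # rho_0, rho_1, rho_2

# Solve R a = r, with R the Toeplitz correlation matrix
a_hat = solve_toeplitz(rho[:2], rho[1:])
```

Because the ACF values here are exact rather than estimated, the solver recovers the parameters [1.2, −0.8] to machine precision.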

16 ACF Coefficients. For the autocorrelation coefficients ρ_k = r_xx(k)/r_xx(0) we have

ρ_1 = a_1 + a_2 ρ_1 + ⋯ + a_p ρ_{p−1}
ρ_2 = a_1 ρ_1 + a_2 + ⋯ + a_p ρ_{p−2}
⋮
ρ_p = a_1 ρ_{p−1} + a_2 ρ_{p−2} + ⋯ + a_p

When does the sequence {ρ_0, ρ_1, ρ_2, ...} vanish? Homework: try the command xcorr in Matlab.

17 Example: Yule Walker modelling in Matlab. In Matlab, the power spectral density using the Y-W method is obtained with pyulear:

Pxx = pyulear(x,p)
[Pxx,w] = pyulear(x,p,nfft)
[Pxx,f] = pyulear(x,p,nfft,fs)
[Pxx,f] = pyulear(x,p,nfft,fs,'range')
[Pxx,w] = pyulear(x,p,nfft,'range')

Description: Pxx = pyulear(x,p) implements the Yule-Walker algorithm and returns Pxx, an estimate of the power spectral density (PSD) of the vector x. To remember for later: this estimate is also a maximum entropy estimate. See also aryule, lpc, pburg, pcov, peig, periodogram.

18 Example: AR(p) signal generation. Generate the input signal x by filtering white noise through the AR filter, then estimate the PSD of x based on a fourth-order AR model. Solution:

randn('state',1);
x = filter(1,a,randn(256,1));   % AR system output
pyulear(x,4)                    % Fourth-order estimate
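A rough Python/SciPy analogue of the MATLAB snippet above, using the AR(4) coefficients [1, −2.2137, 2.9403, −2.1697, 0.9606] (the vector from MATLAB's pyulear documentation example, which this slide appears to follow) and a longer record so that the Yule-Walker estimate settles:

```python
import numpy as np
from scipy import signal
from scipy.linalg import solve_toeplitz

rng = np.random.default_rng(1)

# AR(4) filter denominator, as in MATLAB's pyulear example
a = [1.0, -2.2137, 2.9403, -2.1697, 0.9606]

# x = filter(1, a, w): white noise through the all-pole filter
x = signal.lfilter([1.0], a, rng.standard_normal(8192))
x = x[1024:]                       # discard the start-up transient

# Biased sample ACF at lags 0..4, then Yule-Walker for the AR parameters
r = np.array([np.mean(x[: x.size - k] * x[k:]) for k in range(5)])
a_hat = solve_toeplitz(r[:4], r[1:])
```

Here a_hat should land close to [2.2137, −2.9403, 2.1697, −0.9606]; the signs flip relative to a because the difference equation moves the past terms to the right-hand side.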

19 Alternatively: Yule Walker modelling. The AR(4) system is given by

y[n] = 2.2137 y[n−1] − 2.9403 y[n−2] + 2.1697 y[n−3] − 0.9606 y[n−4] + w[n]

a = [1 -2.2137 2.9403 -2.1697 0.9606];  % AR filter coefficients
freqz(1,a)                              % AR filter frequency response
title('AR System Frequency Response')

20 From Data to AR(p) Model. So far, we assumed the model (AR, MA, or ARMA) and analysed the ACF and PSD based on known model coefficients. In practice: DATA → MODEL. The procedure is as follows:

* record data x(k)
* find the autocorrelation of the data, ACF(x)
* divide by r_xx(0) to obtain the correlation coefficients ρ(k)
* write down the Yule-Walker equations
* solve for the vector of AR parameters

The problem is that we do not know the model order p beforehand; we will deal with this problem later in the lecture.

21 Example: Finding the parameters of x[n] = 1.2 x[n−1] − 0.8 x[n−2] + w[n]. Generate the AR(2) signal as x = filter(1, [1, -1.2, 0.8], w). (Figure: the AR(2) signal and its ACF.) Apply:

for i=1:6; [a,e]=aryule(x,i); display(a); end

For order p = 1 the single estimated coefficient is close to ρ_1 ≈ 0.67, while for every order p ≥ 2 the first two coefficients are close to the true values 1.2 and −0.8 and the remaining, higher order, coefficients are close to zero.

22 Special case: the AR(1) Process (Markov). (Recall p(x[n] | x[n−1], ..., x[0]) = p(x[n] | x[n−1]).)

x[n] = a_1 x[n−1] + w[n] = w[n] + a_1 w[n−1] + a_1^2 w[n−2] + ⋯

i) For the process to be stationary, −1 < a_1 < 1.
ii) Autocorrelation function: from the Yule-Walker equations

r_xx(k) = a_1 r_xx(k−1),  k > 0

or, for the correlation coefficients, with ρ_0 = 1,

ρ_k = a_1^k,  k ≥ 0

Notice the difference in the behaviour of the ACF for positive and negative a_1.

23 Variance and Spectrum of the AR(1) process. These can be calculated directly from the general expressions for the variance and spectrum of AR(p) processes. Variance (also from the general expression for the variance of linear processes from Lecture 1):

σ_x^2 = σ_w^2 / (1 − ρ_1 a_1) = σ_w^2 / (1 − a_1^2)

Spectrum: notice how the flat PSD of WGN is shaped according to the position of the pole of the AR(1) model (lowpass or highpass)

P_xx(f) = 2σ_w^2 / |1 − a_1 e^{−j2πf}|^2 = 2σ_w^2 / (1 + a_1^2 − 2a_1 cos(2πf))

24 Example: ACF and Spectrum of AR(1) for a_1 = ±0.8. (Figure: signal, ACF, and Burg power spectral density estimate for x[n] = −0.8 x[n−1] + w[n] and for x[n] = 0.8 x[n−1] + w[n].) For a_1 < 0 the process is high pass; for a_1 > 0 it is low pass.

25 Special Case: Second Order Autoregressive Processes, AR(2). The input-output relationship is given by

x[n] = a_1 x[n−1] + a_2 x[n−2] + w[n]

For stationarity (to be proven later):

a_1 + a_2 < 1
a_2 − a_1 < 1
−1 < a_2 < 1

This will be shown within the so-called stability triangle.
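The three conditions define a triangle in the (a_1, a_2) plane and can be checked directly. A small sketch (helper names are ours), cross-checked against the equivalent condition that the characteristic roots lie inside the unit circle:

```python
import numpy as np

def ar2_is_stationary(a1: float, a2: float) -> bool:
    """Stability-triangle test for x[n] = a1 x[n-1] + a2 x[n-2] + w[n]."""
    return (a1 + a2 < 1) and (a2 - a1 < 1) and (-1 < a2 < 1)

def roots_inside_unit_circle(a1: float, a2: float) -> bool:
    """Equivalent test: roots of z^2 - a1 z - a2 must satisfy |z| < 1."""
    return bool(np.max(np.abs(np.roots([1.0, -a1, -a2]))) < 1)
```

For example, (a_1, a_2) = (0.75, −0.5) lies inside the triangle, while (1.2, 0.5) violates a_1 + a_2 < 1; both tests agree.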

26 Work by Yule: Modelling of sunspot numbers. Sunspot numbers have been recorded for more than 300 years. In 1927, Yule modelled them and in doing so invented the AR(2) model. (Figure: the sunspot series and its autocorrelation function.)

27 Autocorrelation function of AR(2) processes. The ACF satisfies

ρ_k = a_1 ρ_{k−1} + a_2 ρ_{k−2},  k > 0

Real roots (a_1^2 + 4a_2 > 0): the ACF is a mixture of damped exponentials.
Complex roots (a_1^2 + 4a_2 < 0): the ACF exhibits a pseudo-periodic behaviour

ρ_k = D^k sin(2πf_0 k + F) / sin F

with D the damping factor of a sine wave with frequency f_0 and phase F, where

D = √(−a_2),  cos(2πf_0) = a_1 / (2√(−a_2)),  tan F = [(1 + D^2)/(1 − D^2)] tan(2πf_0)

28 Stability Triangle. (Figure: the (a_1, a_2) plane divided into four regions, with real roots in Regions 1 and 2 and complex roots in Regions 3 and 4.)
i) Real roots, Region 1: monotonically decaying ACF
ii) Real roots, Region 2: decaying oscillating ACF
iii) Complex roots, Region 3: oscillating pseudo-periodic ACF
iv) Complex roots, Region 4: pseudo-periodic ACF

29 Yule Walker Equations. Substituting p = 2 into the Y-W equations, we have

ρ_1 = a_1 + a_2 ρ_1
ρ_2 = a_1 ρ_1 + a_2

which, when solved for a_1 and a_2, gives

a_1 = ρ_1 (1 − ρ_2) / (1 − ρ_1^2),  a_2 = (ρ_2 − ρ_1^2) / (1 − ρ_1^2)

or, substituting back into the equations for ρ,

ρ_1 = a_1 / (1 − a_2),  ρ_2 = a_2 + a_1^2 / (1 − a_2)
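The two maps are inverses of each other, which is easy to verify numerically; here with a_1 = 0.75, a_2 = −0.5, the values of the AR(2) example that follows:

```python
a1, a2 = 0.75, -0.5

# Forward map: correlation coefficients from the AR(2) parameters
rho1 = a1 / (1 - a2)
rho2 = a2 + a1**2 / (1 - a2)

# Inverse map: the closed-form Yule-Walker solution
a1_hat = rho1 * (1 - rho2) / (1 - rho1**2)
a2_hat = (rho2 - rho1**2) / (1 - rho1**2)
```

For these values ρ_1 = 0.5 and ρ_2 = −0.125, and the inverse map returns exactly the parameters we started from.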

30 Variance and Spectrum. More specifically, for the AR(2) process we have:
Variance

σ_x^2 = σ_w^2 / (1 − ρ_1 a_1 − ρ_2 a_2) = [(1 − a_2)/(1 + a_2)] × σ_w^2 / [(1 − a_2)^2 − a_1^2]

Spectrum

P_xx(f) = 2σ_w^2 / |1 − a_1 e^{−j2πf} − a_2 e^{−j4πf}|^2
        = 2σ_w^2 / (1 + a_1^2 + a_2^2 − 2a_1(1 − a_2) cos(2πf) − 2a_2 cos(4πf)),  0 ≤ f ≤ 1/2

31 Example AR(2): x[n] = 0.75 x[n−1] − 0.5 x[n−2] + w[n]. (Figure: the signal, its ACF, and the Burg power spectral density estimate.) The damping factor is D = √0.5 ≈ 0.707, the frequency f_0 = (1/2π) cos^{-1}(0.5303), and the fundamental period of the autocorrelation function is 2π/cos^{-1}(0.5303) ≈ 6.2 samples.

32 Partial Autocorrelation Function: Motivation. Let us revisit the example from slide 21. (Figure: the AR(2) signal x = filter(1, [1, -1.2, 0.8], w) and its ACF.) We do not know p, so let us re-label the aryule coefficient estimates for each trial order as [a_{p1}, ..., a_{pp}]:

p = 1: [a_{11}]
p = 2: [a_{21}, a_{22}]
p = 3: [a_{31}, a_{32}, a_{33}]
p = 4: [a_{41}, a_{42}, a_{43}, a_{44}]
p = 5: [a_{51}, ..., a_{55}]
p = 6: [a_{61}, ..., a_{66}]

33 Partial Autocorrelation Function. Notice: the ACF of an AR(p) process is infinite in duration, but it can be described in terms of p nonzero ACF values. Denote by a_{kj} the jth coefficient in an autoregressive representation of order k, so that a_{kk} is the last coefficient. Then

ρ_j = a_{k1} ρ_{j−1} + ⋯ + a_{k(k−1)} ρ_{j−k+1} + a_{kk} ρ_{j−k},  j = 1, 2, ..., k

leading to the Yule Walker equations, which can be written as

[ 1        ρ_1      ⋯  ρ_{k−1} ] [ a_{k1} ]   [ ρ_1 ]
[ ρ_1      1        ⋯  ρ_{k−2} ] [ a_{k2} ] = [ ρ_2 ]
[ ⋮        ⋮            ⋮      ] [ ⋮      ]   [ ⋮   ]
[ ρ_{k−1}  ρ_{k−2}  ⋯  1       ] [ a_{kk} ]   [ ρ_k ]

34 Partial ACF Coefficients. Solving these equations for k = 1, 2, ... successively, we obtain

a_{11} = ρ_1,  a_{22} = (ρ_2 − ρ_1^2) / (1 − ρ_1^2),

a_{33} = det[ 1 ρ_1 ρ_1 ; ρ_1 1 ρ_2 ; ρ_2 ρ_1 ρ_3 ] / det[ 1 ρ_1 ρ_2 ; ρ_1 1 ρ_1 ; ρ_2 ρ_1 1 ],  etc.

The quantity a_{kk}, regarded as a function of the lag k, is called the partial autocorrelation function. For an AR(p) process, the PAC a_{kk} will be nonzero for k ≤ p and zero for k > p, which tells us the order of an AR(p) process.
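In practice the a_{kk} sequence is computed with the Levinson-Durbin recursion rather than determinant ratios. A compact sketch (the function name is ours), taking ρ_1, ρ_2, ... as input:

```python
import numpy as np

def pacf_from_rho(rho):
    """Partial ACF a_kk via Levinson-Durbin; rho = [rho_1, rho_2, ...]."""
    rho = np.asarray(rho, dtype=float)
    pacf = []
    a = np.array([])                        # AR coefficients of current order
    for k in range(1, rho.size + 1):
        if k == 1:
            akk = rho[0]
            a = np.array([akk])
        else:
            # Reflection coefficient: matches the determinant-ratio formulas
            num = rho[k - 1] - a @ rho[k - 2::-1]
            den = 1.0 - a @ rho[:k - 1]
            akk = num / den
            a = np.concatenate([a - akk * a[::-1], [akk]])
        pacf.append(float(akk))
    return np.array(pacf)
```

For the ACF ρ_k = 0.8^k of an AR(1) process, the recursion returns 0.8 at lag 1 and zero thereafter, exactly the cutoff behaviour described above; at k = 2 it reproduces the a_{22} formula on this slide.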

35 Importance of the Partial ACF. For a zero mean process x[n], the best linear predictor, in the mean square error sense, of x[n] based on x[n−1], x[n−2], ... is

x̂[n] = a_{k1} x[n−1] + a_{k2} x[n−2] + ⋯ + a_{kk} x[n−k]

whether the process is an AR process or not. (Apply the E{·} operator to the general AR(p) model expression and recall that E{w[n]} = 0.) In MATLAB, check the function ARYULE, as well as PYULEAR, ARMCOV, ARBURG, ARCOV, LPC, PRONY.

36 Model order for Sunspot numbers. (Figure: the sunspot numbers, their ACF, partial autocorrelation (PAC), and Burg power spectral density estimate.) After lag k = 2, the PAC becomes very small.

37 Model order for an AR(2) generated process. (Figure: the AR(2) signal, its ACF, partial autocorrelation (PAC), and Burg power spectral density estimate.) After lag k = 2, the PAC becomes very small.

38 Model order for an AR(3) generated process. (Figure: the AR(3) signal, its ACF, partial autocorrelation (PAC), and Burg power spectral density estimate.) After lag k = 3, the PAC becomes very small.

39 Model order for a financial time series. (Figure: the ascending Nasdaq composite, June 2003 to February 2007, and the descending Nasdaq composite thereafter, together with the corresponding ACFs.)

40 Partial ACF for the financial time series. (Figure: the estimated AR coefficient vectors a for increasing model order, as on slide 21.)

41 Model Order Selection: Practical issues. In practice, the greater the model order, the higher the accuracy, so when do we stop? To save on computational complexity, we introduce a penalty for a high model order. Criteria for model order selection include, for instance, MDL (minimum description length, Rissanen) and AIC (Akaike information criterion), given by

MDL = log(E) + p log(N)/N
AIC = log(E) + 2p/N

where E is the loss function (typically the cumulative squared error), p is the number of estimated parameters, and N is the number of data points.
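Both criteria are one-liners; a small sketch (helper names are ours) that also shows the MDL penalty growing faster with p than the AIC penalty whenever log(N) > 2:

```python
import numpy as np

def mdl(E: float, p: int, N: int) -> float:
    """Minimum description length: log(E) + p log(N) / N."""
    return float(np.log(E) + p * np.log(N) / N)

def aic(E: float, p: int, N: int) -> float:
    """Akaike information criterion: log(E) + 2 p / N."""
    return float(np.log(E) + 2 * p / N)
```

With N = 100 data points and the loss E held fixed, each extra parameter costs log(100)/100 ≈ 0.046 under MDL but only 0.02 under AIC, so MDL penalises overfitting more heavily.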

42 Example: Model order selection, MDL vs AIC. Let us look at the cumulative squared error and the MDL and AIC criteria for an AR(2) model with a_1 = 0.5, a_2 = 0.3. (Figure: the squared model error, MDL, and AIC versus the model order p.)

43 Moving Average Processes. A general MA(q) process is given by

x[n] = w[n] + b_1 w[n−1] + ⋯ + b_q w[n−q]

Autocorrelation function: the autocovariance function of the MA(q) process is

c_k = E[(w[n] + b_1 w[n−1] + ⋯ + b_q w[n−q])(w[n−k] + b_1 w[n−k−1] + ⋯ + b_q w[n−k−q])]

Hence the variance of the process is

c_0 = (1 + b_1^2 + ⋯ + b_q^2) σ_w^2

The ACF of an MA process has a cutoff after lag q.
Spectrum: being all zero, the MA model struggles to model a PSD with peaks

P(f) = 2σ_w^2 |1 + b_1 e^{−j2πf} + b_2 e^{−j4πf} + ⋯ + b_q e^{−j2πqf}|^2
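The lag-q cutoff follows directly from the theoretical MA ACF, r(k) = σ_w^2 Σ_j c_j c_{j+k} with c = [1, b_1, ..., b_q]. A small sketch (the function name is ours):

```python
import numpy as np

def ma_acf(b, max_lag):
    """Normalised theoretical ACF of x[n] = w[n] + b1 w[n-1] + ... + bq w[n-q]."""
    c = np.concatenate([[1.0], np.asarray(b, dtype=float)])
    # r(k) is the lag-k correlation of the impulse response; zero beyond lag q
    r = np.array([c[: c.size - k] @ c[k:] if k < c.size else 0.0
                  for k in range(max_lag + 1)])
    return r / r[0]
```

For an MA(1) process with b_1 = 0.5, this gives ρ_1 = 0.5/1.25 = 0.4 and exactly zero at every later lag, in contrast to the infinitely decaying AR ACF.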

44 Example: MA(3) process. (Figure: an MA(3) signal, its ACF, partial autocorrelation (PAC), and Burg power spectral density estimate.) After lag k = 3, the ACF becomes very small.

45 Analysis of Nonstationary Signals. (Figure: a speech signal split into segments W1, W2, and W3, with the partial ACF and the MDL criterion computed for each segment; the calculated model order differs per segment, e.g. 3 for W1 and greater than 5 for W2.) Different AR models fit different segments of speech. To deal with nonstationarity, we need short sliding windows.

46 Duality Between AR and MA Processes.
i) A stationary finite AR(p) process can be represented as an infinite order MA process. A finite MA process can be represented as an infinite order AR process.
ii) The finite MA(q) process has an ACF that is zero beyond lag q. For an AR process, the ACF is infinite in extent and consists of a mixture of damped exponentials and/or damped sine waves.
iii) The parameters of a finite MA process are not required to satisfy any condition for stationarity. However, for invertibility, the roots of the characteristic equation must lie inside the unit circle.
ARMA modelling is a classic technique which has found a tremendous number of applications.


Biomedical Signal Processing and Signal Modeling

Biomedical Signal Processing and Signal Modeling Biomedical Signal Processing and Signal Modeling Eugene N. Bruce University of Kentucky A Wiley-lnterscience Publication JOHN WILEY & SONS, INC. New York Chichester Weinheim Brisbane Singapore Toronto

More information

Module 4. Stationary Time Series Models Part 1 MA Models and Their Properties

Module 4. Stationary Time Series Models Part 1 MA Models and Their Properties Module 4 Stationary Time Series Models Part 1 MA Models and Their Properties Class notes for Statistics 451: Applied Time Series Iowa State University Copyright 2015 W. Q. Meeker. February 14, 2016 20h

More information

Advanced Econometrics

Advanced Econometrics Advanced Econometrics Marco Sunder Nov 04 2010 Marco Sunder Advanced Econometrics 1/ 25 Contents 1 2 3 Marco Sunder Advanced Econometrics 2/ 25 Music Marco Sunder Advanced Econometrics 3/ 25 Music Marco

More information

( ) John A. Quinn Lecture. ESE 531: Digital Signal Processing. Lecture Outline. Frequency Response of LTI System. Example: Zero on Real Axis

( ) John A. Quinn Lecture. ESE 531: Digital Signal Processing. Lecture Outline. Frequency Response of LTI System. Example: Zero on Real Axis John A. Quinn Lecture ESE 531: Digital Signal Processing Lec 15: March 21, 2017 Review, Generalized Linear Phase Systems Penn ESE 531 Spring 2017 Khanna Lecture Outline!!! 2 Frequency Response of LTI System

More information

Applied time-series analysis

Applied time-series analysis Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 18, 2011 Outline Introduction and overview Econometric Time-Series Analysis In principle,

More information

STAD57 Time Series Analysis. Lecture 8

STAD57 Time Series Analysis. Lecture 8 STAD57 Time Series Analysis Lecture 8 1 ARMA Model Will be using ARMA models to describe times series dynamics: ( B) X ( B) W X X X W W W t 1 t1 p t p t 1 t1 q tq Model must be causal (i.e. stationary)

More information

Time Series 2. Robert Almgren. Sept. 21, 2009

Time Series 2. Robert Almgren. Sept. 21, 2009 Time Series 2 Robert Almgren Sept. 21, 2009 This week we will talk about linear time series models: AR, MA, ARMA, ARIMA, etc. First we will talk about theory and after we will talk about fitting the models

More information

University of Oxford. Statistical Methods Autocorrelation. Identification and Estimation

University of Oxford. Statistical Methods Autocorrelation. Identification and Estimation University of Oxford Statistical Methods Autocorrelation Identification and Estimation Dr. Órlaith Burke Michaelmas Term, 2011 Department of Statistics, 1 South Parks Road, Oxford OX1 3TG Contents 1 Model

More information

Time Series 3. Robert Almgren. Sept. 28, 2009

Time Series 3. Robert Almgren. Sept. 28, 2009 Time Series 3 Robert Almgren Sept. 28, 2009 Last time we discussed two main categories of linear models, and their combination. Here w t denotes a white noise: a stationary process with E w t ) = 0, E

More information

Statistical and Adaptive Signal Processing

Statistical and Adaptive Signal Processing r Statistical and Adaptive Signal Processing Spectral Estimation, Signal Modeling, Adaptive Filtering and Array Processing Dimitris G. Manolakis Massachusetts Institute of Technology Lincoln Laboratory

More information

Ch 4. Models For Stationary Time Series. Time Series Analysis

Ch 4. Models For Stationary Time Series. Time Series Analysis This chapter discusses the basic concept of a broad class of stationary parametric time series models the autoregressive moving average (ARMA) models. Let {Y t } denote the observed time series, and {e

More information

Lecture 4 - Spectral Estimation

Lecture 4 - Spectral Estimation Lecture 4 - Spectral Estimation The Discrete Fourier Transform The Discrete Fourier Transform (DFT) is the equivalent of the continuous Fourier Transform for signals known only at N instants separated

More information

Problem Set 2: Box-Jenkins methodology

Problem Set 2: Box-Jenkins methodology Problem Set : Box-Jenkins methodology 1) For an AR1) process we have: γ0) = σ ε 1 φ σ ε γ0) = 1 φ Hence, For a MA1) process, p lim R = φ γ0) = 1 + θ )σ ε σ ε 1 = γ0) 1 + θ Therefore, p lim R = 1 1 1 +

More information

Laboratory Project 2: Spectral Analysis and Optimal Filtering

Laboratory Project 2: Spectral Analysis and Optimal Filtering Laboratory Project 2: Spectral Analysis and Optimal Filtering Random signals analysis (MVE136) Mats Viberg and Lennart Svensson Department of Signals and Systems Chalmers University of Technology 412 96

More information

Moving Average (MA) representations

Moving Average (MA) representations Moving Average (MA) representations The moving average representation of order M has the following form v[k] = MX c n e[k n]+e[k] (16) n=1 whose transfer function operator form is MX v[k] =H(q 1 )e[k],

More information

A time series is called strictly stationary if the joint distribution of every collection (Y t

A time series is called strictly stationary if the joint distribution of every collection (Y t 5 Time series A time series is a set of observations recorded over time. You can think for example at the GDP of a country over the years (or quarters) or the hourly measurements of temperature over a

More information

Class 1: Stationary Time Series Analysis

Class 1: Stationary Time Series Analysis Class 1: Stationary Time Series Analysis Macroeconometrics - Fall 2009 Jacek Suda, BdF and PSE February 28, 2011 Outline Outline: 1 Covariance-Stationary Processes 2 Wold Decomposition Theorem 3 ARMA Models

More information

LECTURE 10 LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA. In this lecture, we continue to discuss covariance stationary processes.

LECTURE 10 LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA. In this lecture, we continue to discuss covariance stationary processes. MAY, 0 LECTURE 0 LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA In this lecture, we continue to discuss covariance stationary processes. Spectral density Gourieroux and Monfort 990), Ch. 5;

More information

Lesson 9: Autoregressive-Moving Average (ARMA) models

Lesson 9: Autoregressive-Moving Average (ARMA) models Lesson 9: Autoregressive-Moving Average (ARMA) models Dipartimento di Ingegneria e Scienze dell Informazione e Matematica Università dell Aquila, umberto.triacca@ec.univaq.it Introduction We have seen

More information

Univariate Time Series Analysis; ARIMA Models

Univariate Time Series Analysis; ARIMA Models Econometrics 2 Fall 24 Univariate Time Series Analysis; ARIMA Models Heino Bohn Nielsen of4 Outline of the Lecture () Introduction to univariate time series analysis. (2) Stationarity. (3) Characterizing

More information

Ross Bettinger, Analytical Consultant, Seattle, WA

Ross Bettinger, Analytical Consultant, Seattle, WA ABSTRACT DYNAMIC REGRESSION IN ARIMA MODELING Ross Bettinger, Analytical Consultant, Seattle, WA Box-Jenkins time series models that contain exogenous predictor variables are called dynamic regression

More information

EEM 409. Random Signals. Problem Set-2: (Power Spectral Density, LTI Systems with Random Inputs) Problem 1: Problem 2:

EEM 409. Random Signals. Problem Set-2: (Power Spectral Density, LTI Systems with Random Inputs) Problem 1: Problem 2: EEM 409 Random Signals Problem Set-2: (Power Spectral Density, LTI Systems with Random Inputs) Problem 1: Consider a random process of the form = + Problem 2: X(t) = b cos(2π t + ), where b is a constant,

More information

Lecture 4a: ARMA Model

Lecture 4a: ARMA Model Lecture 4a: ARMA Model 1 2 Big Picture Most often our goal is to find a statistical model to describe real time series (estimation), and then predict the future (forecasting) One particularly popular model

More information

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M.

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M. TIME SERIES ANALYSIS Forecasting and Control Fifth Edition GEORGE E. P. BOX GWILYM M. JENKINS GREGORY C. REINSEL GRETA M. LJUNG Wiley CONTENTS PREFACE TO THE FIFTH EDITION PREFACE TO THE FOURTH EDITION

More information

Lecture 2: ARMA(p,q) models (part 2)

Lecture 2: ARMA(p,q) models (part 2) Lecture 2: ARMA(p,q) models (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC) Univariate time series Sept.

More information

Basics on 2-D 2 D Random Signal

Basics on 2-D 2 D Random Signal Basics on -D D Random Signal Spring 06 Instructor: K. J. Ray Liu ECE Department, Univ. of Maryland, College Park Overview Last Time: Fourier Analysis for -D signals Image enhancement via spatial filtering

More information

Time Series: Theory and Methods

Time Series: Theory and Methods Peter J. Brockwell Richard A. Davis Time Series: Theory and Methods Second Edition With 124 Illustrations Springer Contents Preface to the Second Edition Preface to the First Edition vn ix CHAPTER 1 Stationary

More information

Linear models. Chapter Overview. Linear process: A process {X n } is a linear process if it has the representation.

Linear models. Chapter Overview. Linear process: A process {X n } is a linear process if it has the representation. Chapter 2 Linear models 2.1 Overview Linear process: A process {X n } is a linear process if it has the representation X n = b j ɛ n j j=0 for all n, where ɛ n N(0, σ 2 ) (Gaussian distributed with zero

More information

Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications

Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications Prof. Massimo Guidolin 20192 Financial Econometrics Winter/Spring 2018 Overview Moving average processes Autoregressive

More information

Complement on Digital Spectral Analysis and Optimal Filtering: Theory and Exercises

Complement on Digital Spectral Analysis and Optimal Filtering: Theory and Exercises Complement on Digital Spectral Analysis and Optimal Filtering: Theory and Exercises Random Signals Analysis (MVE136) Mats Viberg Department of Signals and Systems Chalmers University of Technology 412

More information

EE482: Digital Signal Processing Applications

EE482: Digital Signal Processing Applications Professor Brendan Morris, SEB 3216, brendan.morris@unlv.edu EE482: Digital Signal Processing Applications Spring 2014 TTh 14:30-15:45 CBC C222 Lecture 11 Adaptive Filtering 14/03/04 http://www.ee.unlv.edu/~b1morris/ee482/

More information

STAT 443 Final Exam Review. 1 Basic Definitions. 2 Statistical Tests. L A TEXer: W. Kong

STAT 443 Final Exam Review. 1 Basic Definitions. 2 Statistical Tests. L A TEXer: W. Kong STAT 443 Final Exam Review L A TEXer: W Kong 1 Basic Definitions Definition 11 The time series {X t } with E[X 2 t ] < is said to be weakly stationary if: 1 µ X (t) = E[X t ] is independent of t 2 γ X

More information

Lecture 4: FT Pairs, Random Signals and z-transform

Lecture 4: FT Pairs, Random Signals and z-transform EE518 Digital Signal Processing University of Washington Autumn 2001 Dept. of Electrical Engineering Lecture 4: T Pairs, Rom Signals z-transform Wed., Oct. 10, 2001 Prof: J. Bilmes

More information

A SARIMAX coupled modelling applied to individual load curves intraday forecasting

A SARIMAX coupled modelling applied to individual load curves intraday forecasting A SARIMAX coupled modelling applied to individual load curves intraday forecasting Frédéric Proïa Workshop EDF Institut Henri Poincaré - Paris 05 avril 2012 INRIA Bordeaux Sud-Ouest Institut de Mathématiques

More information

Empirical Market Microstructure Analysis (EMMA)

Empirical Market Microstructure Analysis (EMMA) Empirical Market Microstructure Analysis (EMMA) Lecture 3: Statistical Building Blocks and Econometric Basics Prof. Dr. Michael Stein michael.stein@vwl.uni-freiburg.de Albert-Ludwigs-University of Freiburg

More information

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY PREFACE xiii 1 Difference Equations 1.1. First-Order Difference Equations 1 1.2. pth-order Difference Equations 7

More information

Forecasting using R. Rob J Hyndman. 2.4 Non-seasonal ARIMA models. Forecasting using R 1

Forecasting using R. Rob J Hyndman. 2.4 Non-seasonal ARIMA models. Forecasting using R 1 Forecasting using R Rob J Hyndman 2.4 Non-seasonal ARIMA models Forecasting using R 1 Outline 1 Autoregressive models 2 Moving average models 3 Non-seasonal ARIMA models 4 Partial autocorrelations 5 Estimation

More information

CONTENTS NOTATIONAL CONVENTIONS GLOSSARY OF KEY SYMBOLS 1 INTRODUCTION 1

CONTENTS NOTATIONAL CONVENTIONS GLOSSARY OF KEY SYMBOLS 1 INTRODUCTION 1 DIGITAL SPECTRAL ANALYSIS WITH APPLICATIONS S.LAWRENCE MARPLE, JR. SUMMARY This new book provides a broad perspective of spectral estimation techniques and their implementation. It concerned with spectral

More information

Parametric Signal Modeling and Linear Prediction Theory 1. Discrete-time Stochastic Processes

Parametric Signal Modeling and Linear Prediction Theory 1. Discrete-time Stochastic Processes 1 Discrete-time Stochastic Processes Parametric Signal Modeling and Linear Prediction Theory 1. Discrete-time Stochastic Processes Electrical & Computer Engineering University of Maryland, College Park

More information

{ } Stochastic processes. Models for time series. Specification of a process. Specification of a process. , X t3. ,...X tn }

{ } Stochastic processes. Models for time series. Specification of a process. Specification of a process. , X t3. ,...X tn } Stochastic processes Time series are an example of a stochastic or random process Models for time series A stochastic process is 'a statistical phenomenon that evolves in time according to probabilistic

More information

Automatic Autocorrelation and Spectral Analysis

Automatic Autocorrelation and Spectral Analysis Piet M.T. Broersen Automatic Autocorrelation and Spectral Analysis With 104 Figures Sprin ger 1 Introduction 1 1.1 Time Series Problems 1 2 Basic Concepts 11 2.1 Random Variables 11 2.2 Normal Distribution

More information

STAT Financial Time Series

STAT Financial Time Series STAT 6104 - Financial Time Series Chapter 4 - Estimation in the time Domain Chun Yip Yau (CUHK) STAT 6104:Financial Time Series 1 / 46 Agenda 1 Introduction 2 Moment Estimates 3 Autoregressive Models (AR

More information

We use the centered realization z t z in the computation. Also used in computing sample autocovariances and autocorrelations.

We use the centered realization z t z in the computation. Also used in computing sample autocovariances and autocorrelations. Stationary Time Series Models Part 1 MA Models and Their Properties Class notes for Statistics 41: Applied Time Series Ioa State University Copyright 1 W. Q. Meeker. Segment 1 ARMA Notation, Conventions,

More information

Autoregressive Moving Average (ARMA) Models and their Practical Applications

Autoregressive Moving Average (ARMA) Models and their Practical Applications Autoregressive Moving Average (ARMA) Models and their Practical Applications Massimo Guidolin February 2018 1 Essential Concepts in Time Series Analysis 1.1 Time Series and Their Properties Time series:

More information

Notes on the Book: Time Series Analysis: Forecasting and Control by George E. P. Box and Gwilym M. Jenkins

Notes on the Book: Time Series Analysis: Forecasting and Control by George E. P. Box and Gwilym M. Jenkins Notes on the Book: Time Series Analysis: Forecasting and Control by George E. P. Box and Gwilym M. Jenkins John L. Weatherwax June 20, 2008 Introduction Here you ll find some notes that I wrote up as I

More information

Computer Engineering 4TL4: Digital Signal Processing

Computer Engineering 4TL4: Digital Signal Processing Computer Engineering 4TL4: Digital Signal Processing Day Class Instructor: Dr. I. C. BRUCE Duration of Examination: 3 Hours McMaster University Final Examination December, 2003 This examination paper includes

More information

Complement on Digital Spectral Analysis and Optimal Filtering: Theory and Exercises

Complement on Digital Spectral Analysis and Optimal Filtering: Theory and Exercises Complement on Digital Spectral Analysis and Optimal Filtering: Theory and Exercises Random Processes With Applications (MVE 135) Mats Viberg Department of Signals and Systems Chalmers University of Technology

More information

Chapter 3 - Temporal processes

Chapter 3 - Temporal processes STK4150 - Intro 1 Chapter 3 - Temporal processes Odd Kolbjørnsen and Geir Storvik January 23 2017 STK4150 - Intro 2 Temporal processes Data collected over time Past, present, future, change Temporal aspect

More information

Classic Time Series Analysis

Classic Time Series Analysis Classic Time Series Analysis Concepts and Definitions Let Y be a random number with PDF f Y t ~f,t Define t =E[Y t ] m(t) is known as the trend Define the autocovariance t, s =COV [Y t,y s ] =E[ Y t t

More information

Applied Time. Series Analysis. Wayne A. Woodward. Henry L. Gray. Alan C. Elliott. Dallas, Texas, USA

Applied Time. Series Analysis. Wayne A. Woodward. Henry L. Gray. Alan C. Elliott. Dallas, Texas, USA Applied Time Series Analysis Wayne A. Woodward Southern Methodist University Dallas, Texas, USA Henry L. Gray Southern Methodist University Dallas, Texas, USA Alan C. Elliott University of Texas Southwestern

More information

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY & Contents PREFACE xiii 1 1.1. 1.2. Difference Equations First-Order Difference Equations 1 /?th-order Difference

More information

Time Series Analysis -- An Introduction -- AMS 586

Time Series Analysis -- An Introduction -- AMS 586 Time Series Analysis -- An Introduction -- AMS 586 1 Objectives of time series analysis Data description Data interpretation Modeling Control Prediction & Forecasting 2 Time-Series Data Numerical data

More information

Autoregressive Models Fourier Analysis Wavelets

Autoregressive Models Fourier Analysis Wavelets Autoregressive Models Fourier Analysis Wavelets BFR Flood w/10yr smooth Spectrum Annual Max: Day of Water year Flood magnitude vs. timing Jain & Lall, 2000 Blacksmith Fork, Hyrum, UT Analyses of Flood

More information

Chapter 12: An introduction to Time Series Analysis. Chapter 12: An introduction to Time Series Analysis

Chapter 12: An introduction to Time Series Analysis. Chapter 12: An introduction to Time Series Analysis Chapter 12: An introduction to Time Series Analysis Introduction In this chapter, we will discuss forecasting with single-series (univariate) Box-Jenkins models. The common name of the models is Auto-Regressive

More information

EE 602 TERM PAPER PRESENTATION Richa Tripathi Mounika Boppudi FOURIER SERIES BASED MODEL FOR STATISTICAL SIGNAL PROCESSING - CHONG YUNG CHI

EE 602 TERM PAPER PRESENTATION Richa Tripathi Mounika Boppudi FOURIER SERIES BASED MODEL FOR STATISTICAL SIGNAL PROCESSING - CHONG YUNG CHI EE 602 TERM PAPER PRESENTATION Richa Tripathi Mounika Boppudi FOURIER SERIES BASED MODEL FOR STATISTICAL SIGNAL PROCESSING - CHONG YUNG CHI ABSTRACT The goal of the paper is to present a parametric Fourier

More information

Midterm Suggested Solutions

Midterm Suggested Solutions CUHK Dept. of Economics Spring 2011 ECON 4120 Sung Y. Park Midterm Suggested Solutions Q1 (a) In time series, autocorrelation measures the correlation between y t and its lag y t τ. It is defined as. ρ(τ)

More information

E 4101/5101 Lecture 6: Spectral analysis

E 4101/5101 Lecture 6: Spectral analysis E 4101/5101 Lecture 6: Spectral analysis Ragnar Nymoen 3 March 2011 References to this lecture Hamilton Ch 6 Lecture note (on web page) For stationary variables/processes there is a close correspondence

More information

1. Determine if each of the following are valid autocorrelation matrices of WSS processes. (Correlation Matrix),R c =

1. Determine if each of the following are valid autocorrelation matrices of WSS processes. (Correlation Matrix),R c = ENEE630 ADSP Part II w/ solution. Determine if each of the following are valid autocorrelation matrices of WSS processes. (Correlation Matrix) R a = 4 4 4,R b = 0 0,R c = j 0 j 0 j 0 j 0 j,r d = 0 0 0

More information

Lecture 19 IIR Filters

Lecture 19 IIR Filters Lecture 19 IIR Filters Fundamentals of Digital Signal Processing Spring, 2012 Wei-Ta Chu 2012/5/10 1 General IIR Difference Equation IIR system: infinite-impulse response system The most general class

More information

Introduction to Biomedical Engineering

Introduction to Biomedical Engineering Introduction to Biomedical Engineering Biosignal processing Kung-Bin Sung 6/11/2007 1 Outline Chapter 10: Biosignal processing Characteristics of biosignals Frequency domain representation and analysis

More information

Time Series Solutions HT 2009

Time Series Solutions HT 2009 Time Series Solutions HT 2009 1. Let {X t } be the ARMA(1, 1) process, X t φx t 1 = ɛ t + θɛ t 1, {ɛ t } WN(0, σ 2 ), where φ < 1 and θ < 1. Show that the autocorrelation function of {X t } is given by

More information

Lecture 9 Infinite Impulse Response Filters

Lecture 9 Infinite Impulse Response Filters Lecture 9 Infinite Impulse Response Filters Outline 9 Infinite Impulse Response Filters 9 First-Order Low-Pass Filter 93 IIR Filter Design 5 93 CT Butterworth filter design 5 93 Bilinear transform 7 9

More information

STAD57 Time Series Analysis. Lecture 23

STAD57 Time Series Analysis. Lecture 23 STAD57 Time Series Analysis Lecture 23 1 Spectral Representation Spectral representation of stationary {X t } is: 12 i2t Xt e du 12 1/2 1/2 for U( ) a stochastic process with independent increments du(ω)=

More information