A Tutorial on Stochastic Models and Statistical Analysis for Frequency Stability Measurements
A Tutorial on Stochastic Models and Statistical Analysis for Frequency Stability Measurements

Don Percival
Applied Physics Lab, University of Washington, Seattle

overheads for talk available at http://staff.washington.edu/dbp/talks.html
Introduction

- time scales limited by clock noise
- can model clock noise as stochastic process {X_t}
  - set of random variables (RVs) indexed by t
  - X_t represents clock noise at time t
- will concentrate on sampled data, for which will take t ∈ Z ≡ {..., −1, 0, 1, ...} (but sometimes use t ∈ {0, 1, 2, ...})
- Q: which stochastic processes are useful models?
- Q: how can we deduce model parameters & other characteristics from observed data?
- will cover the following in this tutorial:
  - stationary processes & closely related processes
  - fractionally differenced & related processes
  - two analysis of variance ("power") techniques: spectral analysis & wavelet analysis
  - parameter estimation via analysis techniques
Stationary Processes: I

- stochastic process {X_t} called stationary if
  - E{X_t} = μ_X for all t; i.e., a constant that does not depend on t
  - cov{X_t, X_{t+τ}} = s_{X,τ} for all possible t & t+τ; i.e., depends on lag τ, but not t
- {s_{X,τ} : τ ∈ Z} is the autocovariance sequence (ACVS)
- s_{X,0} = cov{X_t, X_t} = var{X_t}; i.e., process variance is constant for all t
- spectral density function (SDF) given by
    S_X(f) = Σ_{τ=−∞}^{∞} s_{X,τ} e^{−i2πfτ}, |f| ≤ 1/2
- note: S_X(−f) = S_X(f) for real-valued processes
Stationary Processes: II

- if {X_t} has SDF S_X(·), then
    ∫_{−1/2}^{1/2} S_X(f) e^{i2πfτ} df = s_{X,τ}, τ ∈ Z
- setting τ = 0 yields fundamental result:
    ∫_{−1/2}^{1/2} S_X(f) df = s_{X,0} = var{X_t};
  i.e., SDF decomposes var{X_t} across frequencies f
- if {a_u} is a filter, then (with "matching condition")
    Y_t ≡ Σ_{u=−∞}^{∞} a_u X_{t−u}
  is stationary with SDF given by
    S_Y(f) = A(f) S_X(f), where A(f) ≡ |Σ_{u=−∞}^{∞} a_u e^{−i2πfu}|²
- if {a_u} narrow-band of bandwidth Δf about f, i.e.,
    A(f′) = 1/(2Δf) for f − Δf/2 ≤ |f′| ≤ f + Δf/2, and 0 otherwise,
  then have following interpretation for S_X(f):
    var{Y_t} = ∫_{−1/2}^{1/2} S_Y(f′) df′ = ∫_{−1/2}^{1/2} A(f′) S_X(f′) df′ ≈ S_X(f)
White Noise Process

- simplest stationary process is white noise
- {ε_t} is a white noise process if
  - E{ε_t} = μ_ε for all t (usually take μ_ε = 0)
  - var{ε_t} = σ_ε² for all t
  - cov{ε_t, ε_{t′}} = 0 for all t ≠ t′
- white noise thus stationary with ACVS and SDF
    s_{ε,τ} = cov{ε_t, ε_{t+τ}} = σ_ε² for τ = 0, and 0 otherwise;
    S_ε(f) = Σ_{τ=−∞}^{∞} s_{ε,τ} e^{−i2πfτ} = σ_ε²
Backward Differences of White Noise

- consider first order backward difference of white noise:
    X_t = ε_t − ε_{t−1} = Σ_{u=−∞}^{∞} a_u ε_{t−u}, with a_0 = 1, a_1 = −1, and a_u = 0 otherwise
- have
    S_X(f) = A(f) S_ε(f) = |2 sin(πf)|² σ_ε² ≈ (2πf)² σ_ε²
  at low frequencies (using sin(x) ≈ x for small x)
- let B be the backward shift operator: Bε_t = ε_{t−1}, B²ε_t = ε_{t−2}, (1 − B)ε_t = ε_t − ε_{t−1}, etc.
- consider dth order backward difference of white noise:
    X_t = (1 − B)^d ε_t = Σ_{k=0}^{d} (d choose k)(−1)^k ε_{t−k}
        = Σ_{k=0}^{d} [d!/(k!(d−k)!)](−1)^k ε_{t−k}
        = Σ_{k=0}^{∞} [Γ(1−δ)/(Γ(k+1)Γ(1−δ−k))](−1)^k ε_{t−k}
  with δ ≡ −d, i.e., δ = −1, −2, ...
- SDF given by
    S_X(f) = A(f) S_ε(f) = σ_ε² |2 sin(πf)|^{−2δ} ≈ σ_ε²/(2πf)^{2δ}
Fractional Differences of White Noise

- for δ not necessarily an integer,
    X_t = Σ_{k=0}^{∞} [Γ(1−δ)/(Γ(k+1)Γ(1−δ−k))](−1)^k ε_{t−k} ≡ Σ_{k=0}^{∞} a_k(δ) ε_{t−k}
  makes sense as long as δ < 1/2
- {X_t} is a stationary fractionally differenced (FD) process
- SDF is as before:
    S_X(f) = σ_ε² |2 sin(πf)|^{−2δ} ≈ σ_ε²/(2πf)^{2δ}
- {X_t} said to obey a power law at low frequencies if
    lim_{f→0} S_X(f)/(C|f|^α) = 1 for some C > 0;
  i.e., S_X(f) ≈ C|f|^α at low frequencies
- FD processes obey above with α = −2δ
- note: FD process reduces to white noise when δ = 0
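As an illustrative aside (not part of the original overheads), the coefficients a_k(δ) can be computed without gamma-function overflow via a simple recursion; the function names below are my own:

```python
import math

def fd_coeffs(delta, K):
    """First K coefficients a_k(delta) in X_t = sum_k a_k(delta) eps_{t-k},
    via the stable recursion a_0 = 1, a_k = a_{k-1} (k - 1 + delta) / k."""
    a = [1.0]
    for k in range(1, K):
        a.append(a[-1] * (k - 1 + delta) / k)
    return a

def fd_coeff_gamma(delta, k):
    """Same coefficient from the closed form on this slide:
    (-1)^k Gamma(1-delta) / (Gamma(k+1) Gamma(1-delta-k))."""
    return ((-1) ** k * math.gamma(1 - delta)
            / (math.gamma(k + 1) * math.gamma(1 - delta - k)))
```

The recursion follows from the ratio of successive gamma terms; for large k it avoids evaluating Γ(1−δ−k) near its poles.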
ACVS & PACS for FD Processes

- for δ < 1/2 & δ ≠ 0, −1, ..., ACVS given by
    s_{X,τ} = σ_ε² sin(πδ) Γ(1−2δ) Γ(τ+δ)/[π Γ(1+τ−δ)];
  when δ = 0, −1, ..., have s_{X,τ} = 0 for |τ| > |δ| &
    s_{X,τ} = σ_ε² (−1)^τ Γ(1−2δ)/[Γ(1+τ−δ) Γ(1−τ−δ)], 0 ≤ τ ≤ |δ|
- for all δ < 1/2, have
    s_{X,0} = var{X_t} = σ_ε² Γ(1−2δ)/Γ²(1−δ),
  and rest of ACVS can be computed easily via
    s_{X,τ} = s_{X,τ−1} (τ+δ−1)/(τ−δ), τ ∈ Z+ ≡ {1, 2, ...}
  (for negative lags τ, recall that s_{X,−τ} = s_{X,τ})
- for all δ < 1/2, partial autocorrelation sequence (PACS) given by
    φ_{t,t} = δ/(t−δ), t ∈ Z+
  (useful for constructing best linear predictors)
- FD processes thus have simple and easily computed expressions for SDF, ACVS and PACS
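The variance-plus-recursion route to the ACVS can be sketched in a few lines of Python (an illustrative sketch, with my own function name):

```python
import math

def fd_acvs(delta, sigma2, max_lag):
    """ACVS s_0, ..., s_max_lag of a stationary FD process (delta < 1/2):
    s_0 = sigma2 Gamma(1-2 delta)/Gamma(1-delta)^2, then the recursion
    s_tau = s_{tau-1} (tau + delta - 1)/(tau - delta)."""
    s = [sigma2 * math.gamma(1 - 2 * delta) / math.gamma(1 - delta) ** 2]
    for tau in range(1, max_lag + 1):
        s.append(s[-1] * (tau + delta - 1) / (tau - delta))
    return s
```

Setting δ = 0 recovers the white noise ACVS (s_0 = σ_ε², all other lags zero), a useful sanity check.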
Simulating Stationary FD Processes

- for −1 < δ < 1/2, can obtain exact simulations via circulant embedding (Davies-Harte algorithm)
- given s_{X,0}, ..., s_{X,N}, use discrete Fourier transform (DFT) to compute
    S_k ≡ Σ_{τ=0}^{N} s_{X,τ} e^{−i2πf_kτ} + Σ_{τ=N+1}^{2N−1} s_{X,2N−τ} e^{−i2πf_kτ}, k = 0, ..., N
- given 2N independent Gaussian deviates ε_t with mean zero and variance σ_ε², compute
    Y_k = ε_0 √(2N S_0), k = 0;
        = (ε_{2k−1} + i ε_{2k}) √(N S_k), 1 ≤ k < N;
        = ε_{2N−1} √(2N S_N), k = N;
        = Y*_{2N−k}, N < k ≤ 2N−1
  (asterisk denotes complex conjugate)
- use inverse DFT to construct the real-valued sequence
    Y_t = (1/2N) Σ_{k=0}^{2N−1} Y_k e^{i2πf_k t}, t = 0, ..., 2N−1
- Y_0, Y_1, ..., Y_{N−1} is exact simulation of FD process
- implication: can represent X_0, X_1, ..., X_{N−1} as
    X_t = Σ_{k=0}^{2N−1} c_{t,k}(δ) ε_k rather than X_t = Σ_{k=0}^{∞} a_k(δ) ε_{t−k}
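The recipe above can be sketched in Python (an illustrative numpy implementation, not Percival's own code; it uses standard-normal deviates and is demonstrated on white noise, whose ACVS is trivial):

```python
import numpy as np

def davies_harte(s, rng):
    """Exact Gaussian simulation by circulant embedding.
    s holds the ACVS values s_0, ..., s_N; returns N simulated values."""
    N = len(s) - 1
    M = 2 * N
    # S_k: DFT of the circular extension s_0, ..., s_N, s_{N-1}, ..., s_1
    Sk = np.fft.fft(np.concatenate([s, s[-2:0:-1]])).real
    if Sk.min() < -1e-8 * Sk.max():
        raise ValueError("circulant embedding is not nonnegative definite")
    Sk = np.clip(Sk, 0.0, None)
    eps = rng.standard_normal(M)           # 2N standard-normal deviates
    Y = np.empty(M, dtype=complex)
    Y[0] = eps[0] * np.sqrt(M * Sk[0])
    Y[N] = eps[M - 1] * np.sqrt(M * Sk[N])
    k = np.arange(1, N)
    Y[k] = (eps[2 * k - 1] + 1j * eps[2 * k]) * np.sqrt(M * Sk[k] / 2.0)
    Y[M - k] = np.conj(Y[k])               # Hermitian symmetry => real output
    # inverse DFT (numpy's ifft includes the 1/2N factor); keep Y_0..Y_{N-1}
    return np.fft.ifft(Y).real[:N]

# demo on white noise: ACVS is (1, 0, ..., 0)
rng = np.random.default_rng(0)
s = np.zeros(65); s[0] = 1.0
X = davies_harte(s, rng)
assert X.shape == (64,)
```

For an FD process one would feed in the ACVS from the previous slide; the nonnegativity check guards against embeddings that are not valid covariance structures.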
Nonstationary FD Processes: I

- suppose X_t^{(1)} is FD process with parameter δ^{(s)} such that −1/2 ≤ δ^{(s)} < 1/2
- define X_t, t ∈ Z, as cumulative sum of X_t^{(1)}, t ∈ Z:
    X_t ≡ Σ_{l=0}^{t} X_l^{(1)}   (for t < 0, let X_t ≡ 0)
- since, for t ∈ Z,
    X_t^{(1)} = X_t − X_{t−1} & S_{X^{(1)}}(f) = σ_ε² |2 sin(πf)|^{−2δ^{(s)}},
  filtering theory suggests using the relationship
    S_{X^{(1)}}(f) = |2 sin(πf)|² S_X(f)
  to define SDF for X_t, i.e.,
    S_X(f) ≡ S_{X^{(1)}}(f)/|2 sin(πf)|² = σ_ε² |2 sin(πf)|^{−2δ}
  with δ ≡ δ^{(s)} + 1 (Yaglom, 1958)
Nonstationary FD Processes: II

- X_t has stationary 1st order backward differences
- 1 sum defines FD processes for 1/2 ≤ δ < 3/2
- 2 sums define FD processes for 3/2 ≤ δ < 5/2, etc.; such X_t has stationary 2nd order backward differences, etc.
- if X_t^{(1)} is white noise (δ^{(s)} = 0) so S_{X^{(1)}}(f) = σ_ε², then X_t is a random walk (δ = 1) with
    S_X(f) = σ_ε²/|2 sin(πf)|² ≈ σ_ε²/(2πf)²
- if X_t^{(2)} is white noise and if
    X_t^{(1)} ≡ Σ_{l=0}^{t} X_l^{(2)} & X_t ≡ Σ_{l=0}^{t} X_l^{(1)}, t ∈ Z,
  then X_t is a random run (δ = 2), and S_X(f) ≈ σ_ε²/(2πf)^4
Summary of FD Processes

- X_t said to be FD process if its SDF is given by
    S_X(f) = σ_ε² |2 sin(πf)|^{−2δ} ≈ σ_ε²/(2πf)^{2δ} at low frequencies
- well-defined for any real-valued δ
- FD process obeys power law at low frequencies with exponent α = −2δ
- if δ < 1/2, FD process stationary with ACVS given by
    s_{X,0} = σ_ε² Γ(1−2δ)/Γ²(1−δ) & s_{X,τ} = s_{X,τ−1} (τ+δ−1)/(τ−δ), τ ∈ Z+,
  & PACS given by φ_{t,t} = δ/(t−δ), t ∈ Z+
- if δ ≥ 1/2, FD process nonstationary, but its dth order backward difference is stationary FD process with parameter δ^{(s)}, where d ≡ ⌊δ + 1/2⌋ and δ^{(s)} ≡ δ − d (here ⌊x⌋ is largest integer ≤ x)
Alternatives to FD Processes: I

- fractional Brownian motion (FBM) B_H(t), 0 ≤ t < ∞, has SDF given by
    S_{B_H}(f) = σ_X² C_H/|f|^{2H+1}, −∞ < f < ∞,
  where σ_X² > 0, C_H > 0 & 0 < H < 1 (H called Hurst parameter; C_H depends on H)
- power law with −3 < α < −1
- discrete fractional Brownian motion (DFBM): B_t, t ∈ Z+, is DFBM if B_t = B_H(t)
- B_t has SDF given by
    S_B(f) = σ_X² C_H Σ_{j=−∞}^{∞} 1/|f+j|^{2H+1}, |f| ≤ 1/2
- power law at low frequencies with −3 < α < −1
- reduces to random walk if H = 1/2
Alternatives to FD Processes: II

- fractional Gaussian noise (FGN): X_t, t ∈ Z+, is FGN if X_t = B_{t+1} − B_t
- X_t has SDF given by
    S_X(f) = 4 σ_X² C_H sin²(πf) Σ_{j=−∞}^{∞} 1/|f+j|^{2H+1}, |f| ≤ 1/2
- power law at low frequencies with −1 < α < 1
- X_t is stationary, with ACVS given by
    s_{X,τ} = (σ_X²/2)(|τ+1|^{2H} − 2|τ|^{2H} + |τ−1|^{2H}), τ ∈ Z,
  where σ_X² = var{X_t}
- reduces to white noise if H = 1/2
- discrete pure power law (PPL) process: SDF given by
    S_X(f) = C_S |f|^α, |f| ≤ 1/2
  - if α > −1, stationary, but ACVS takes some effort to compute
  - if α = 0, reduces to white noise
  - if α ≤ −1, nonstationary, but backward differences of certain order are stationary
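The FGN autocovariance is simple enough to compute directly; a minimal sketch (my own function name), with the H = 1/2 white noise reduction as a check:

```python
def fgn_acvs(H, sigma2, max_lag):
    """ACVS of fractional Gaussian noise:
    s_tau = (sigma2/2)(|tau+1|^{2H} - 2|tau|^{2H} + |tau-1|^{2H})."""
    return [sigma2 / 2.0 * (abs(t + 1) ** (2 * H)
                            - 2 * abs(t) ** (2 * H)
                            + abs(t - 1) ** (2 * H))
            for t in range(max_lag + 1)]
```

At τ = 0 the formula collapses to σ_X² for every H, and for H = 1/2 every nonzero lag cancels exactly, recovering white noise.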
FD Processes vs. Alternatives

- FD processes cover full range of power laws
  - FBMs, DFBMs and FGNs cover limited range
  - PPL processes also cover full range
- differencing FD process yields another FD process; differencing alternatives yields new type of process
- FD process has simple SDF; if stationary, has simple ACVS & PACS
  - FBM has simple SDF
  - DFBM has complicated SDF
  - FGN has simple ACVS, complicated SDF & PACS
  - PPL has simple SDF, complicated ACVS & PACS
- FD, DFBM, FGN and PPL model sampled noise; might be problematic to change sampling rate
- FBM models unsampled noise
- Fig. 1: comparison of SDFs for FGN, PPL & FD
- Fig. 2: comparison of realizations
Extensions to FD Processes: I

- composite FD processes:
    S_X(f) = Σ_{m=1}^{M} σ_m² |2 sin(πf)|^{−2δ_m};
  i.e., linear combinations of independent FD processes
- autoregressive, fractionally integrated, moving average (ARFIMA) processes
  - idea is to replace ε_t in X_t = Σ_{k=0}^{∞} a_k(δ) ε_{t−k} with ARMA process, say,
      U_t = Σ_{k=1}^{p} φ_k U_{t−k} + ε_t − Σ_{k=1}^{q} θ_k ε_{t−k}
  - yields process with SDF
      S_X(f) = σ_ε² |2 sin(πf)|^{−2δ} · |1 − Σ_{k=1}^{q} θ_k e^{−i2πfk}|² / |1 − Σ_{k=1}^{p} φ_k e^{−i2πfk}|²
- ARMA part can model, e.g., high-frequency structure in noise
Extensions to FD Processes: II

- can define time-varying FD (TVFD) process via
    X_t = Σ_{k=0}^{∞} a_k(δ_t) ε_{t−k}
  as long as δ_t < 1/2 for all t
- can use representation
    X_t = Σ_{k=0}^{2N−1} c_{t,k}(δ_t) ε_k, t = 0, 1, ..., N−1,
  to extend definition to handle arbitrary δ_t
- Fig. 3: realizations from 4 TVFD processes
- can also make σ_ε² time-varying
FD Process Parameter Estimation

- Q: given realization (clock noise) of X_0, ..., X_{N−1} from FD process, how can we estimate δ & σ_ε²?
- many different estimators have been proposed! (area of active research)
- will concentrate on estimators based on
  - spectral analysis (frequency-based)
  - wavelet analysis (scale-based)
- advantages of spectral and wavelet analysis
  - physically interpretable
  - both are analysis of variance techniques (useful for more than just estimating δ & σ_ε²)
  - can assess need for models more complex than simple FD process (e.g., composite FD process)
  - provide preliminary estimates for more complicated schemes (maximum likelihood estimation)
Estimation via Spectral Analysis

- recall that SDF for FD process given by
    S_X(f) = σ_ε² |2 sin(πf)|^{−2δ},
  and thus
    log(S_X(f)) = log(σ_ε²) − 2δ log(|2 sin(πf)|);
  i.e., plot of log(S_X(f)) vs. log(|2 sin(πf)|) linear with slope of −2δ
- for 0 < f < 1/8, have sin(πf) ≈ πf, so
    log(S_X(f)) ≈ log(σ_ε²) − 2δ log(2πf);
  i.e., plot of log(S_X(f)) vs. log(2πf) approximately linear at low frequencies with slope of −2δ = α
- basic scheme:
  - estimate S_X(f) via Ŝ_X(f)
  - fit linear model to log(Ŝ_X(f)) vs. log(2πf) over low frequencies
  - use estimated slope α̂ to estimate δ via −α̂/2
  - use estimated intercept to estimate σ_ε²
The Periodogram: I

- basic estimator of S(f) is the periodogram:
    Ŝ^(p)(f) ≡ (1/N) |Σ_{t=0}^{N−1} X_t e^{−i2πft}|², |f| ≤ 1/2
- represents decomposition of sample variance:
    ∫_{−1/2}^{1/2} Ŝ^(p)(f) df = (1/N) Σ_{t=0}^{N−1} X_t²
- for stationary processes & large N, theory says
    Ŝ^(p)(f) ≐ S(f) χ²₂/2, 0 < f < 1/2,
  approximately, implying that
    E{Ŝ^(p)(f)} ≈ E{S(f) χ²₂/2} = S(f) & var{Ŝ^(p)(f)} ≈ var{S(f) χ²₂/2} = S²(f)
  (in above, "≐" means "equal in distribution," and χ²₂ is a chi-square RV with 2 degrees of freedom)
- additionally, cov{Ŝ^(p)(f_j), Ŝ^(p)(f_k)} ≈ 0 for f_j ≡ j/N & 0 < f_j < f_k < 1/2
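The periodogram and its variance decomposition are a one-liner with an FFT; a small sketch (illustrative, my own function name) whose discrete analogue of the integral identity is checked directly:

```python
import numpy as np

def periodogram(X):
    """Periodogram S_hat(f_k) = |sum_t X_t e^{-i 2 pi f_k t}|^2 / N
    evaluated at the Fourier frequencies f_k = k/N."""
    N = len(X)
    return np.abs(np.fft.fft(X)) ** 2 / N

# discrete analogue of  integral of S_hat df = (1/N) sum_t X_t^2:
# averaging the periodogram over all N Fourier frequencies gives the
# sample mean square (Parseval's relation)
X = np.arange(8.0)
S = periodogram(X)
assert abs(S.mean() - np.mean(X ** 2)) < 1e-9
```

Averaging over the Fourier frequencies stands in for the integral over |f| ≤ 1/2, which is why the identity holds exactly in finite samples.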
The Periodogram: II

- taking log transform yields
    log(Ŝ^(p)(f)) ≐ log(S(f) χ²₂/2) = log(S(f)) + log(χ²₂/2)
- Bartlett & Kendall (1946):
    E{log(χ²_η/η)} = ψ(η/2) − log(η/2) & var{log(χ²_η/η)} = ψ′(η/2),
  where ψ(·) & ψ′(·) are the di- & trigamma functions
- yields
    E{log(Ŝ^(p)(f))} = log(S(f)) + ψ(1) = log(S(f)) − γ
    var{log(Ŝ^(p)(f))} = ψ′(1) = π²/6
  (γ ≈ 0.5772 is Euler's constant)
The Periodogram: III

- define Y^(p)(f_j) ≡ log(Ŝ^(p)(f_j)) + γ
- can model Y^(p)(f_j) as
    Y^(p)(f_j) ≈ log(S(f_j)) + ε(f_j) ≈ log(σ_ε²) − 2δ log(2πf_j) + ε(f_j)
  over low frequencies indexed by 0 < j < J
- error ε(f_j) in linear regression model such that
  - E{ε(f_j)} = 0 & var{ε(f_j)} = π²/6 (known!)
  - if {X_t} Gaussian & Ŝ^(p)(f_j)'s uncorrelated, then ε(f_j)'s pairwise uncorrelated
  - ε(f_j) distributed as a (shifted) log(χ²₂) RV: markedly non-Gaussian
- least squares procedure yields
  - estimates δ̂ and σ̂_ε² for δ and σ_ε²
  - estimates of variability in δ̂ and σ̂_ε²
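The least squares step can be sketched as follows (an illustrative sketch with my own names; it regresses on log|2 sin(πf)|, which is exact for an FD SDF, rather than the low-frequency approximation log(2πf)). Since the additive Euler's-constant correction only shifts the intercept, feeding in a noise-free FD SDF recovers δ exactly, which the check exploits:

```python
import numpy as np

EULER_GAMMA = 0.5772156649015329

def lp_estimate_delta(S_hat, freqs):
    """Log-periodogram regression: Y(f_j) = log S_hat(f_j) + gamma
    regressed on log|2 sin(pi f_j)|; the fitted slope is -2 delta."""
    Y = np.log(S_hat) + EULER_GAMMA
    x = np.log(2.0 * np.sin(np.pi * np.asarray(freqs)))
    slope, intercept = np.polyfit(x, Y, 1)   # ordinary least squares
    return -slope / 2.0, intercept

# sanity check on a noise-free FD SDF: the slope recovers delta exactly
delta = 0.3
f = np.arange(1, 11) / 128.0
S = np.abs(2.0 * np.sin(np.pi * f)) ** (-2.0 * delta)
d_hat, _ = lp_estimate_delta(S, f)
assert abs(d_hat - delta) < 1e-8
```

In practice S_hat would be the periodogram at the low Fourier frequencies, and the π²/6 error variance feeds into standard errors for δ̂.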
Multitaper Spectral Estimation: I

- warnings about periodogram:
  - approximations might require N to be very large!
  - approximations of questionable validity for nonstationary FD processes
  - Fig. 4: periodogram can suffer from leakage
- tapering is a technique for alleviating leakage:
    Ŝ^(d)(f) ≡ |Σ_{t=0}^{N−1} a_t X_t e^{−i2πft}|²
  - {a_t} called data taper (typically bell-shaped curve)
  - Ŝ^(d)(·) called direct spectral estimator
- critique: loses information at end of series (sample size N effectively shortened)
- Thomson (1982): multitapering recovers lost info
- use set of K orthonormal data tapers {a_{n,t}}:
    Σ_{t=0}^{N−1} a_{n,t} a_{l,t} = 1 if n = l, and 0 if n ≠ l, 0 ≤ n, l ≤ K−1
Multitaper Spectral Estimation: II

- use {a_{n,t}} to form nth direct spectral estimator:
    Ŝ_n^(mt)(f) ≡ |Σ_{t=0}^{N−1} a_{n,t} X_t e^{−i2πft}|², n = 0, ..., K−1
- simplest form of multitaper SDF estimator:
    Ŝ^(mt)(f) ≡ (1/K) Σ_{n=0}^{K−1} Ŝ_n^(mt)(f)
- sinusoidal tapers are one family of multitapers:
    a_{n,t} = [2/(N+1)]^{1/2} sin((n+1)π(t+1)/(N+1)), t = 0, ..., N−1
  (Riedel & Sidorenko, 1995)
- Figs. 5 and 6: example of multitapering
- if S(·) slowly varying around S(f) & if N large,
    Ŝ^(mt)(f) ≐ S(f) χ²_{2K}/2K
  approximately for 0 < f < 1/2, implying
    var{Ŝ^(mt)(f)} ≈ [S²(f)/4K²] var{χ²_{2K}} = S²(f)/K
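A minimal sketch of sinusoidal multitapering (illustrative code, my own function names; the orthonormality of the tapers can be verified numerically):

```python
import numpy as np

def sine_tapers(N, K):
    """K Riedel-Sidorenko sinusoidal tapers of length N:
    a_{n,t} = sqrt(2/(N+1)) sin((n+1) pi (t+1)/(N+1))."""
    t = np.arange(N)
    return np.array([np.sqrt(2.0 / (N + 1))
                     * np.sin((n + 1) * np.pi * (t + 1) / (N + 1))
                     for n in range(K)])

def multitaper_sdf(X, K, freqs):
    """Average of K sine-tapered direct spectral estimates at freqs."""
    N = len(X)
    tapers = sine_tapers(N, K)
    t = np.arange(N)
    out = []
    for f in freqs:
        e = np.exp(-1j * 2.0 * np.pi * f * t)
        out.append(np.mean([abs(np.sum(a * X * e)) ** 2 for a in tapers]))
    return np.array(out)

A = sine_tapers(64, 5)
assert np.allclose(A @ A.T, np.eye(5), atol=1e-10)   # orthonormal tapers
```

The Gram-matrix check confirms the orthonormality condition on the previous slide; averaging the K estimates trades a little bandwidth for the 1/K variance reduction.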
Multitaper Spectral Estimation: III

- define Y^(mt)(f_j) ≡ log(Ŝ^(mt)(f_j)) − ψ(K) + log(K)
- can model Y^(mt)(f_j) as
    Y^(mt)(f_j) ≈ log(S(f_j)) + η(f_j) ≈ log(σ_ε²) − 2δ log(2πf_j) + η(f_j)
  over low frequencies indexed by 0 < j < J
- error η(f_j) in linear regression model such that
  - E{η(f_j)} = 0
  - var{η(f_j)} = ψ′(K), a known constant!
  - approximately Gaussian if K ≥ 5
  - correlated, but with simple structure:
      cov{η(f_j), η(f_{j+ν})} ≈ ψ′(K)(1 − |ν|/(K+1)) if |ν| ≤ K+1, and 0 otherwise
- generalized least squares procedure yields
  - estimates δ̂ and σ̂_ε² for δ and σ_ε²
  - estimates of variability in δ̂ and σ̂_ε²
- multitaper approach superior to periodogram approach
Discrete Wavelet Transform (DWT)

- let X = [X_0, X_1, ..., X_{N−1}]^T be observed time series (for convenience, assume N integer multiple of 2^{J_0})
- let 𝒲 be N × N orthonormal DWT matrix
- W = 𝒲X is vector of DWT coefficients
- orthonormality says X = 𝒲^T W, so X ⟺ W
- can partition W as follows: W = [W_1, ..., W_{J_0}, V_{J_0}]^T (stacked subvectors)
- W_j contains N_j = N/2^j wavelet coefficients
  - related to changes of averages at scale τ_j = 2^{j−1} (τ_j is jth dyadic scale)
  - related to times spaced 2^j units apart
- V_{J_0} contains N_{J_0} = N/2^{J_0} scaling coefficients
  - related to averages at scale λ_{J_0} = 2^{J_0}
  - related to times spaced 2^{J_0} units apart
Example: Haar DWT

- Fig. 7: 𝒲 for Haar DWT with N = 16
- first 8 rows yield W_1: changes on scale 1
- next 4 rows yield W_2: changes on scale 2
- next 2 rows yield W_3: changes on scale 4
- next to last row yields W_4: change on scale 8
- last row yields V_4: average on scale 16
- Fig. 8: Haar DWT coefficients for clock 571
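The Haar matrix just described is easy to build row by row; a sketch (illustrative code, my own function name; the sign convention within each row is one of two equivalent choices):

```python
import numpy as np

def haar_dwt_matrix(N):
    """Orthonormal Haar DWT matrix for N = 2^J: for each scale the rows
    are normalized differences of adjacent block averages, followed by
    one overall scaled average (the scaling row)."""
    J = int(np.log2(N))
    assert 2 ** J == N, "N must be a power of 2"
    rows = []
    for j in range(1, J + 1):
        width = 2 ** j                      # support of a scale-tau_j row
        for t in range(N // width):
            row = np.zeros(N)
            row[t * width: t * width + width // 2] = -1.0 / np.sqrt(width)
            row[t * width + width // 2: (t + 1) * width] = 1.0 / np.sqrt(width)
            rows.append(row)
    rows.append(np.ones(N) / np.sqrt(N))    # V_J: average on scale N
    return np.array(rows)

W = haar_dwt_matrix(16)
assert np.allclose(W @ W.T, np.eye(16))     # orthonormality
```

For N = 16 this reproduces the row counts on this slide: 8 + 4 + 2 + 1 wavelet rows plus the final scaling row.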
DWT in Terms of Filters

- filter X_0, X_1, ..., X_{N−1} to obtain
    2^{j/2} W̃_{j,t} ≡ Σ_{l=0}^{L_j−1} h_{j,l} X_{t−l mod N}, t = 0, 1, ..., N−1;
  h_{j,l} is jth level wavelet filter (note: circular filtering)
- subsample to obtain wavelet coefficients:
    W_{j,t} = 2^{j/2} W̃_{j,2^j(t+1)−1}, t = 0, 1, ..., N_j−1,
  where W_{j,t} is tth element of W_j
- Figs. 9 & 10: four sets of wavelet filters
- jth wavelet filter is band-pass with pass-band [1/2^{j+1}, 1/2^j] (i.e., scale related to interval of frequencies)
- similarly, scaling filters yield V_{J_0}
- Figs. 11 & 12: four sets of scaling filters
- J_0th scaling filter is low-pass with pass-band [0, 1/2^{J_0+1}]
- as width L of 1st level filters increases,
  - band-pass & low-pass approximations improve
  - # of embedded differencing operations increases (related to # of "vanishing moments")
DWT-Based Analysis of Variance

- consider "energy" in time series: ‖X‖² = X^T X = Σ_{t=0}^{N−1} X_t²
- energy preserved in DWT coefficients:
    ‖W‖² = ‖𝒲X‖² = X^T 𝒲^T 𝒲X = X^T X = ‖X‖²
- since W_1, ..., W_{J_0}, V_{J_0} partitions W, have
    ‖W‖² = Σ_{j=1}^{J_0} ‖W_j‖² + ‖V_{J_0}‖²,
  leading to analysis of sample variance:
    σ̂² ≡ (1/N) Σ_{t=0}^{N−1} X_t² = (1/N) [Σ_{j=1}^{J_0} ‖W_j‖² + ‖V_{J_0}‖²]
- scale-based decomposition (cf. frequency-based)
Variation: Maximal Overlap DWT

- can eliminate downsampling and use
    W̃_{j,t} ≡ (1/2^{j/2}) Σ_{l=0}^{L_j−1} h_{j,l} X_{t−l mod N}, t = 0, 1, ..., N−1,
  to define MODWT coefficients W̃_j (& also Ṽ_j)
- unlike DWT, MODWT is not orthonormal (in fact MODWT is highly redundant)
- like DWT, can do analysis of variance because
    ‖X‖² = Σ_{j=1}^{J_0} ‖W̃_j‖² + ‖Ṽ_{J_0}‖²
- unlike DWT, MODWT works for all sample sizes N (i.e., power of 2 assumption is not required)
- Fig. 13: Haar MODWT coefficients for clock 571 (cf. Fig. 8 with DWT coefficients)
- can use to track time-varying FD process
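The level-1 Haar case makes the energy identity concrete; a sketch (illustrative code, my own function name), run on a series whose length is deliberately not a power of 2:

```python
import numpy as np

def haar_modwt_level1(X):
    """Level-1 Haar MODWT: wavelet coefficients (X_t - X_{t-1 mod N})/2
    and scaling coefficients (X_t + X_{t-1 mod N})/2
    (circular filtering, no downsampling)."""
    Xm1 = np.roll(X, 1)                     # X_{t-1 mod N}
    return (X - Xm1) / 2.0, (X + Xm1) / 2.0

# N = 7: MODWT does not require N to be a power of 2
X = np.array([1.0, 3.0, -2.0, 0.5, 4.0, 4.0, -1.0])
W1, V1 = haar_modwt_level1(X)
# energy is preserved even though the MODWT is not orthonormal
assert np.isclose(np.sum(W1 ** 2) + np.sum(V1 ** 2), np.sum(X ** 2))
```

The identity follows pointwise from (x−y)²/4 + (x+y)²/4 = (x² + y²)/2, summed circularly over t.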
Definition of Wavelet Variance

- let X_t, t ∈ Z, be a stochastic process
- run X_t through jth level wavelet filter:
    W̄_{j,t} ≡ (1/2^{j/2}) Σ_{l=0}^{L_j−1} h_{j,l} X_{t−l}, t ∈ Z
- definition of time dependent wavelet variance (also called wavelet spectrum):
    ν²_{X,t}(τ_j) ≡ var{W̄_{j,t}},
  assuming var{W̄_{j,t}} exists and is finite
- ν²_{X,t}(τ_j) depends on τ_j and t
- will consider time independent wavelet variance:
    ν²_X(τ_j) ≡ var{W̄_{j,t}}
  (can be easily adapted to time varying situation)
- rationale for wavelet variance:
  - decomposes variance on scale by scale basis
  - useful substitute/complement for SDF
Variance Decomposition

- suppose X_t has SDF S_X(f):
    ∫_{−1/2}^{1/2} S_X(f) df = var{X_t};
  i.e., decomposes var{X_t} across frequencies f
  - involves uncountably infinite number of f's
  - S_X(f) Δf ≈ contribution to var{X_t} due to f's in interval of length Δf centered at f
  - note: var{X_t} taken to be ∞ for nonstationary processes with stationary backward differences
- wavelet variance analog to fundamental result:
    Σ_{j=1}^{∞} ν²_X(τ_j) = var{X_t};
  i.e., decomposes var{X_t} across scales τ_j (recall DWT/MODWT and sample variance)
  - involves countably infinite number of τ_j's
  - ν²_X(τ_j) = contribution to var{X_t} due to scale τ_j
  - ν_X(τ_j) has same units as X_t (easier to interpret)
Spectrum Substitute/Complement

- because h_{j,l} approximately band-pass over [1/2^{j+1}, 1/2^j],
    ν²_X(τ_j) ≈ 2 ∫_{1/2^{j+1}}^{1/2^j} S_X(f) df   (*)
- if S_X(f) featureless, info in ν²_X(τ_j) ≈ info in S_X(f)
  - ν²_X(τ_j) more succinct: only 1 value per octave band
- recall SDF for FD process:
    S_X(f) = σ_ε² |2 sin(πf)|^{−2δ} ≈ σ_ε²/(2πf)^{2δ}
- (*) implies ν²_X(τ_j) ∝ τ_j^{2δ−1} approximately
- can deduce δ from slope of log(ν²_X(τ_j)) vs. log(τ_j)
- can estimate δ & σ_ε² by applying regression analysis to log of estimates of ν²_X(τ_j)
Estimation of Wavelet Variance: I

- can base estimator on MODWT of X_0, X_1, ..., X_{N−1}:
    W̃_{j,t} ≡ (1/2^{j/2}) Σ_{l=0}^{L_j−1} h_{j,l} X_{t−l mod N}, t = 0, 1, ..., N−1
  (DWT-based estimator possible, but less efficient)
- recall that
    W̄_{j,t} ≡ (1/2^{j/2}) Σ_{l=0}^{L_j−1} h_{j,l} X_{t−l}, t = 0, ±1, ±2, ...,
  so W̃_{j,t} = W̄_{j,t} if "mod" not needed: L_j − 1 ≤ t < N
- if N − L_j ≥ 0, unbiased estimator of ν²_X(τ_j) is
    ν̂²_X(τ_j) ≡ (1/M_j) Σ_{t=L_j−1}^{N−1} W̃²_{j,t}, where M_j ≡ N − L_j + 1
- can also construct biased estimator of ν²_X(τ_j):
    ν̃²_X(τ_j) ≡ (1/N) Σ_{t=0}^{N−1} W̃²_{j,t} = (1/N) [Σ_{t=0}^{L_j−2} W̃²_{j,t} + Σ_{t=L_j−1}^{N−1} W̃²_{j,t}]
  - 1st sum in parentheses influenced by circularity
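For the Haar filter at unit scale (L_1 = 2), the unbiased estimator reduces to averaging squared half-differences after dropping the one boundary coefficient; an illustrative sketch (my own function name), checked against the known value ν²_X(τ_1) = σ²/2 for white noise:

```python
import numpy as np

def haar_wvar_scale1(X):
    """Unbiased MODWT-based estimate of the scale-1 Haar wavelet variance:
    average of W_{1,t}^2 with W_{1,t} = (X_t - X_{t-1})/2 over
    t = 1, ..., N-1, i.e. dropping the single coefficient affected
    by circular filtering."""
    W = (X[1:] - X[:-1]) / 2.0
    return np.mean(W ** 2)

rng = np.random.default_rng(7)
X = rng.standard_normal(4096)
# for unit-variance white noise, var{(X_t - X_{t-1})/2} = 1/2
nu2 = haar_wvar_scale1(X)
assert abs(nu2 - 0.5) < 0.05
```

The same pattern extends to longer filters and higher levels: filter, discard the L_j − 1 boundary terms, and average the squares over the remaining M_j coefficients.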
Estimation of Wavelet Variance: II

- biased estimator unbiased if {X_t} white noise
- biased estimator offers exact analysis of σ̂²; unbiased estimator need not
- biased estimator can have better mean square error (Greenhall et al., 1999; need to reflect X_t)
Statistical Properties of ν̂²_X(τ_j)

- suppose {W̄_{j,t}} Gaussian with mean 0 & SDF S_j(f)
- suppose square integrability condition holds:
    A_j ≡ ∫_{−1/2}^{1/2} S²_j(f) df < ∞ & S_j(f) > 0
  (holds for FD process if L large enough)
- can show ν̂²_X(τ_j) asymptotically normal with mean ν²_X(τ_j) & large sample variance 2A_j/M_j
- can estimate A_j and use with ν̂²_X(τ_j) to construct confidence interval for ν²_X(τ_j)
- example:
  - Fig. 14: clock errors X_t ≡ X_t^{(0)} along with differences X_t^{(i)} ≡ X_t^{(i−1)} − X_{t−1}^{(i−1)} for i = 1, 2
  - Fig. 15: ν̂²_X(τ_j) for clock errors
  - Fig. 16: ν̂²_Y(τ_j) for Y_t ≡ X_t^{(1)}
- Haar ν̂²_Y(τ_j) related to Allan variance σ²_Y(2, τ_j):
    ν²_Y(τ_j) = (1/2) σ²_Y(2, τ_j)
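The Allan variance connection can be made concrete: a scale-τ Haar MODWT coefficient is half the difference of two adjacent length-τ averages, so the two sample quantities satisfy the factor-of-two identity exactly when computed from the same averages. An illustrative sketch (my own function names, fully overlapped estimators):

```python
import numpy as np

def adjacent_avg_diffs(X, tau):
    """Fully overlapped differences of adjacent length-tau sample
    averages: xbar_{t+tau}(tau) - xbar_t(tau)."""
    c = np.concatenate([[0.0], np.cumsum(X)])
    avg = (c[tau:] - c[:-tau]) / tau        # every length-tau average
    return avg[tau:] - avg[:-tau]

def allan_variance(X, tau):
    """Overlapped sample Allan variance: half the mean squared
    difference of adjacent length-tau averages."""
    d = adjacent_avg_diffs(X, tau)
    return 0.5 * np.mean(d ** 2)

def haar_wavelet_variance(X, tau):
    """Sample Haar wavelet variance at scale tau: the Haar MODWT
    coefficient at scale tau is (xbar_2 - xbar_1)/2."""
    d = adjacent_avg_diffs(X, tau)
    return np.mean((d / 2.0) ** 2)

rng = np.random.default_rng(3)
X = rng.standard_normal(512)
for tau in (1, 2, 4, 8):
    assert np.isclose(haar_wavelet_variance(X, tau),
                      0.5 * allan_variance(X, tau))
```

Here τ plays the role of the dyadic scale τ_j = 2^{j−1}; the identity ν²_Y(τ) = σ²_Y(2, τ)/2 holds term by term, not just in expectation.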
Summary

- fractionally differenced processes
  - are able to cover all power laws
  - easy to work with (SDF, ACVS & PACS simply expressed)
  - extensible to composite, ARFIMA & time-varying processes
- spectral and wavelet analysis
  - can provide estimates of parameters of FD processes
  - decomposition of sample variance across frequencies (in case of spectral analysis) & scales (in case of wavelet analysis)
  - complementary analyses
- wavelet analysis has some advantages for clock noise
  - estimates of δ & σ_ε² somewhat better
  - useful with time-varying noise process
  - can deal with polynomial trends (not covered here)
  - results expressed in same units as X_t²
- a big thank you to conference organizers!
More information1. Fundamental concepts
. Fundamental concepts A time series is a sequence of data points, measured typically at successive times spaced at uniform intervals. Time series are used in such fields as statistics, signal processing
More informationPart III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be ed to
TIME SERIES Part III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be emailed to Y.Chen@statslab.cam.ac.uk. 1. Let {X t } be a weakly stationary process with mean zero and let
More information8.2 Harmonic Regression and the Periodogram
Chapter 8 Spectral Methods 8.1 Introduction Spectral methods are based on thining of a time series as a superposition of sinusoidal fluctuations of various frequencies the analogue for a random process
More informationSpectral Analysis. Jesús Fernández-Villaverde University of Pennsylvania
Spectral Analysis Jesús Fernández-Villaverde University of Pennsylvania 1 Why Spectral Analysis? We want to develop a theory to obtain the business cycle properties of the data. Burns and Mitchell (1946).
More informationWeek 5 Quantitative Analysis of Financial Markets Characterizing Cycles
Week 5 Quantitative Analysis of Financial Markets Characterizing Cycles Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036
More informationChapter 9: Forecasting
Chapter 9: Forecasting One of the critical goals of time series analysis is to forecast (predict) the values of the time series at times in the future. When forecasting, we ideally should evaluate the
More informationStatistics of stochastic processes
Introduction Statistics of stochastic processes Generally statistics is performed on observations y 1,..., y n assumed to be realizations of independent random variables Y 1,..., Y n. 14 settembre 2014
More information7. MULTIVARATE STATIONARY PROCESSES
7. MULTIVARATE STATIONARY PROCESSES 1 1 Some Preliminary Definitions and Concepts Random Vector: A vector X = (X 1,..., X n ) whose components are scalar-valued random variables on the same probability
More informationComputational Data Analysis!
12.714 Computational Data Analysis! Alan Chave (alan@whoi.edu)! Thomas Herring (tah@mit.edu),! http://geoweb.mit.edu/~tah/12.714! Introduction to Spectral Analysis! Topics Today! Aspects of Time series
More informationChapter 2. Some basic tools. 2.1 Time series: Theory Stochastic processes
Chapter 2 Some basic tools 2.1 Time series: Theory 2.1.1 Stochastic processes A stochastic process is a sequence of random variables..., x 0, x 1, x 2,.... In this class, the subscript always means time.
More informationLong-Range Dependence and Self-Similarity. c Vladas Pipiras and Murad S. Taqqu
Long-Range Dependence and Self-Similarity c Vladas Pipiras and Murad S. Taqqu January 24, 2016 Contents Contents 2 Preface 8 List of abbreviations 10 Notation 11 1 A brief overview of times series and
More informationUtility of Correlation Functions
Utility of Correlation Functions 1. As a means for estimating power spectra (e.g. a correlator + WK theorem). 2. For establishing characteristic time scales in time series (width of the ACF or ACV). 3.
More informationDiscrete time processes
Discrete time processes Predictions are difficult. Especially about the future Mark Twain. Florian Herzog 2013 Modeling observed data When we model observed (realized) data, we encounter usually the following
More informationLong-range dependence
Long-range dependence Kechagias Stefanos University of North Carolina at Chapel Hill May 23, 2013 Kechagias Stefanos (UNC) Long-range dependence May 23, 2013 1 / 45 Outline 1 Introduction to time series
More informationStatistics of Stochastic Processes
Prof. Dr. J. Franke All of Statistics 4.1 Statistics of Stochastic Processes discrete time: sequence of r.v...., X 1, X 0, X 1, X 2,... X t R d in general. Here: d = 1. continuous time: random function
More informationStatistical signal processing
Statistical signal processing Short overview of the fundamentals Outline Random variables Random processes Stationarity Ergodicity Spectral analysis Random variable and processes Intuition: A random variable
More informationELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process
Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Definition of stochastic process (random
More informationLecture 1: Fundamental concepts in Time Series Analysis (part 2)
Lecture 1: Fundamental concepts in Time Series Analysis (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC)
More informationA test for improved forecasting performance at higher lead times
A test for improved forecasting performance at higher lead times John Haywood and Granville Tunnicliffe Wilson September 3 Abstract Tiao and Xu (1993) proposed a test of whether a time series model, estimated
More informationTopic 4 Unit Roots. Gerald P. Dwyer. February Clemson University
Topic 4 Unit Roots Gerald P. Dwyer Clemson University February 2016 Outline 1 Unit Roots Introduction Trend and Difference Stationary Autocorrelations of Series That Have Deterministic or Stochastic Trends
More informationThe Slow Convergence of OLS Estimators of α, β and Portfolio. β and Portfolio Weights under Long Memory Stochastic Volatility
The Slow Convergence of OLS Estimators of α, β and Portfolio Weights under Long Memory Stochastic Volatility New York University Stern School of Business June 21, 2018 Introduction Bivariate long memory
More informationStochastic Processes. A stochastic process is a function of two variables:
Stochastic Processes Stochastic: from Greek stochastikos, proceeding by guesswork, literally, skillful in aiming. A stochastic process is simply a collection of random variables labelled by some parameter:
More informationLinear models and their mathematical foundations: Simple linear regression
Linear models and their mathematical foundations: Simple linear regression Steffen Unkel Department of Medical Statistics University Medical Center Göttingen, Germany Winter term 2018/19 1/21 Introduction
More informationClassic Time Series Analysis
Classic Time Series Analysis Concepts and Definitions Let Y be a random number with PDF f Y t ~f,t Define t =E[Y t ] m(t) is known as the trend Define the autocovariance t, s =COV [Y t,y s ] =E[ Y t t
More information5: MULTIVARATE STATIONARY PROCESSES
5: MULTIVARATE STATIONARY PROCESSES 1 1 Some Preliminary Definitions and Concepts Random Vector: A vector X = (X 1,..., X n ) whose components are scalarvalued random variables on the same probability
More informationDifference equations. Definitions: A difference equation takes the general form. x t f x t 1,,x t m.
Difference equations Definitions: A difference equation takes the general form x t fx t 1,x t 2, defining the current value of a variable x as a function of previously generated values. A finite order
More informationReliability and Risk Analysis. Time Series, Types of Trend Functions and Estimates of Trends
Reliability and Risk Analysis Stochastic process The sequence of random variables {Y t, t = 0, ±1, ±2 } is called the stochastic process The mean function of a stochastic process {Y t} is the function
More informationLecture Notes 7 Stationary Random Processes. Strict-Sense and Wide-Sense Stationarity. Autocorrelation Function of a Stationary Process
Lecture Notes 7 Stationary Random Processes Strict-Sense and Wide-Sense Stationarity Autocorrelation Function of a Stationary Process Power Spectral Density Continuity and Integration of Random Processes
More informationComments on New Approaches in Period Analysis of Astronomical Time Series by Pavlos Protopapas (Or: A Pavlosian Response ) Don Percival
Comments on New Approaches in Period Analysis of Astronomical Time Series by Pavlos Protopapas (Or: A Pavlosian Response ) Don Percival Applied Physics Laboratory Department of Statistics University of
More informationSTAT 512 sp 2018 Summary Sheet
STAT 5 sp 08 Summary Sheet Karl B. Gregory Spring 08. Transformations of a random variable Let X be a rv with support X and let g be a function mapping X to Y with inverse mapping g (A = {x X : g(x A}
More informationStatistics 349(02) Review Questions
Statistics 349(0) Review Questions I. Suppose that for N = 80 observations on the time series { : t T} the following statistics were calculated: _ x = 10.54 C(0) = 4.99 In addition the sample autocorrelation
More informationSTAT STOCHASTIC PROCESSES. Contents
STAT 3911 - STOCHASTIC PROCESSES ANDREW TULLOCH Contents 1. Stochastic Processes 2 2. Classification of states 2 3. Limit theorems for Markov chains 4 4. First step analysis 5 5. Branching processes 5
More informationFigure 18: Top row: example of a purely continuous spectrum (left) and one realization
1..5 S(). -.2 -.5 -.25..25.5 64 128 64 128 16 32 requency time time Lag 1..5 S(). -.5-1. -.5 -.1.1.5 64 128 64 128 16 32 requency time time Lag Figure 18: Top row: example o a purely continuous spectrum
More informationTime series models in the Frequency domain. The power spectrum, Spectral analysis
ime series models in the Frequency domain he power spectrum, Spectral analysis Relationship between the periodogram and the autocorrelations = + = ( ) ( ˆ α ˆ ) β I Yt cos t + Yt sin t t= t= ( ( ) ) cosλ
More informationEstimation of the long Memory parameter using an Infinite Source Poisson model applied to transmission rate measurements
of the long Memory parameter using an Infinite Source Poisson model applied to transmission rate measurements François Roueff Ecole Nat. Sup. des Télécommunications 46 rue Barrault, 75634 Paris cedex 13,
More informationA=randn(500,100); mu=mean(a); sigma_a=std(a); std_a=sigma_a/sqrt(500); [std(mu) mean(std_a)] % compare standard deviation of means % vs standard error
UCSD SIOC 221A: (Gille) 1 Reading: Bendat and Piersol, Ch. 5.2.1 Lecture 10: Recap Last time we looked at the sinc function, windowing, and detrending with an eye to reducing edge effects in our spectra.
More informationStatistics 910, #15 1. Kalman Filter
Statistics 910, #15 1 Overview 1. Summary of Kalman filter 2. Derivations 3. ARMA likelihoods 4. Recursions for the variance Kalman Filter Summary of Kalman filter Simplifications To make the derivations
More information{ } Stochastic processes. Models for time series. Specification of a process. Specification of a process. , X t3. ,...X tn }
Stochastic processes Time series are an example of a stochastic or random process Models for time series A stochastic process is 'a statistical phenomenon that evolves in time according to probabilistic
More informationWavelet domain test for long range dependence in the presence of a trend
Wavelet domain test for long range dependence in the presence of a trend Agnieszka Jach and Piotr Kokoszka Utah State University July 24, 27 Abstract We propose a test to distinguish a weakly dependent
More informationCh.10 Autocorrelated Disturbances (June 15, 2016)
Ch10 Autocorrelated Disturbances (June 15, 2016) In a time-series linear regression model setting, Y t = x tβ + u t, t = 1, 2,, T, (10-1) a common problem is autocorrelation, or serial correlation of the
More informationCovariance Stationary Time Series. Example: Independent White Noise (IWN(0,σ 2 )) Y t = ε t, ε t iid N(0,σ 2 )
Covariance Stationary Time Series Stochastic Process: sequence of rv s ordered by time {Y t } {...,Y 1,Y 0,Y 1,...} Defn: {Y t } is covariance stationary if E[Y t ]μ for all t cov(y t,y t j )E[(Y t μ)(y
More information10. Linear Models and Maximum Likelihood Estimation
10. Linear Models and Maximum Likelihood Estimation ECE 830, Spring 2017 Rebecca Willett 1 / 34 Primary Goal General problem statement: We observe y i iid pθ, θ Θ and the goal is to determine the θ that
More information5 Transfer function modelling
MSc Further Time Series Analysis 5 Transfer function modelling 5.1 The model Consider the construction of a model for a time series (Y t ) whose values are influenced by the earlier values of a series
More informationTime Series Models and Inference. James L. Powell Department of Economics University of California, Berkeley
Time Series Models and Inference James L. Powell Department of Economics University of California, Berkeley Overview In contrast to the classical linear regression model, in which the components of the
More informationCh3. TRENDS. Time Series Analysis
3.1 Deterministic Versus Stochastic Trends The simulated random walk in Exhibit 2.1 shows a upward trend. However, it is caused by a strong correlation between the series at nearby time points. The true
More informationcovariance function, 174 probability structure of; Yule-Walker equations, 174 Moving average process, fluctuations, 5-6, 175 probability structure of
Index* The Statistical Analysis of Time Series by T. W. Anderson Copyright 1971 John Wiley & Sons, Inc. Aliasing, 387-388 Autoregressive {continued) Amplitude, 4, 94 case of first-order, 174 Associated
More informationX t = a t + r t, (7.1)
Chapter 7 State Space Models 71 Introduction State Space models, developed over the past 10 20 years, are alternative models for time series They include both the ARIMA models of Chapters 3 6 and the Classical
More information18.S096 Problem Set 4 Fall 2013 Time Series Due Date: 10/15/2013
18.S096 Problem Set 4 Fall 2013 Time Series Due Date: 10/15/2013 1. Covariance Stationary AR(2) Processes Suppose the discrete-time stochastic process {X t } follows a secondorder auto-regressive process
More informationClass 1: Stationary Time Series Analysis
Class 1: Stationary Time Series Analysis Macroeconometrics - Fall 2009 Jacek Suda, BdF and PSE February 28, 2011 Outline Outline: 1 Covariance-Stationary Processes 2 Wold Decomposition Theorem 3 ARMA Models
More informationECE 636: Systems identification
ECE 636: Systems identification Lectures 3 4 Random variables/signals (continued) Random/stochastic vectors Random signals and linear systems Random signals in the frequency domain υ ε x S z + y Experimental
More informationECON 616: Lecture 1: Time Series Basics
ECON 616: Lecture 1: Time Series Basics ED HERBST August 30, 2017 References Overview: Chapters 1-3 from Hamilton (1994). Technical Details: Chapters 2-3 from Brockwell and Davis (1987). Intuition: Chapters
More informationELEMENTS OF PROBABILITY THEORY
ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable
More informationIntroduction to Time Series Analysis. Lecture 19.
Introduction to Time Series Analysis. Lecture 19. 1. Review: Spectral density estimation, sample autocovariance. 2. The periodogram and sample autocovariance. 3. Asymptotics of the periodogram. 1 Estimating
More information