Simulating Gaussian Random Processes with Specified Spectra*

Donald B. Percival
Applied Physics Laboratory, HN-10
University of Washington
Seattle, WA

Abstract

We discuss the problem of generating realizations of length N from a Gaussian stationary process {Y_t} with a specified spectral density function S_Y(·). We review three methods for generating the required realizations and consider their relative merits. In particular, we discuss an approximate frequency domain technique that is evidently used frequently in practice, but that has some potential pitfalls. We discuss extensions to this technique that allow it to be used to generate realizations from a power-law process with spectral density function similar to S(f) = |f|^α for α < 0.

I. Introduction

Let {Y_t} be a real-valued Gaussian stationary process with spectral density function (sdf) S_Y(·), autocorrelation sequence {s_{τ,Y}} and zero mean. If we define the sampling time between observations Y_t and Y_{t+1} to be unity so that the Nyquist frequency is 1/2, then the autocorrelation sequence is related to the sdf via the usual relationship

    s_{τ,Y} = ∫_{−1/2}^{1/2} S_Y(f) e^{i2πfτ} df,   where i ≡ √(−1).    (1)

A problem of considerable practical interest is to generate a sample of length N of this process (i.e., a realization of Y_0, ..., Y_{N−1}) on a digital computer by suitably transforming samples of a zero mean, unit variance Gaussian white noise process {W_t}. In this paper we discuss three methods for doing this: an exact time domain method that is valid for all sdfs (Section II), an exact frequency domain method that is valid only for some sdfs (Section III), and an approximate frequency domain method that can be used for all sdfs (Section IV). We place particular emphasis on generating time series from stationary and nonstationary power-law (long memory) processes (Sections V and VI).
(The three methods we discuss here are certainly not the only ones that have been advocated in the literature; one important omission is an approximate time domain method based upon the class of autoregressive-moving average models.)

* Except for a few minor corrections made here, this paper appeared in Computing Science and Statistics, 24 (1992).

II. An Exact Time Domain Method

If the autocorrelation sequence {s_{τ,Y}} is readily known out to lag N−1, there are well-known time domain techniques for generating samples of {Y_t} (see, for example, Franklin, 1965). Typically these involve a lower-upper Cholesky factorization of the inverse of the N-th order Toeplitz covariance matrix for Y_0, ..., Y_{N−1} (see Demeure and Scharf, 1987, for a good review). This factorization can be accomplished using the Levinson-Durbin recursions and then used to generate the desired samples, as follows. Let W_0, ..., W_{N−1} be a set of N independent and identically distributed Gaussian random variables (rvs) with zero mean and unit variance. With Y_0 = σ_0 W_0, we generate the N−1 remaining samples recursively via

    Y_t = Σ_{j=1}^{t} φ_{j,t} Y_{t−j} + W_t σ_t,   t = 1, ..., N−1.

The σ_t's and φ_{j,t}'s are obtained by first setting σ_0² = s_{0,Y} and then recursively computing, for t = 1, ..., N−1,

    φ_{t,t} = [ s_{t,Y} − Σ_{j=1}^{t−1} φ_{j,t−1} s_{t−j,Y} ] / σ_{t−1}²
    φ_{j,t} = φ_{j,t−1} − φ_{t,t} φ_{t−j,t−1},   1 ≤ j ≤ t−1
    σ_t² = σ_{t−1}² (1 − φ_{t,t}²)

(for t = 1 the summation in the first equation is taken to be 0, and the second equation is skipped). There are two potential drawbacks to this exact method. First, once we have computed the σ_t's and φ_{j,t}'s, the number of floating point operations needed to generate a sample of length N is O(N²). There are ways, however, of reducing this number to O(N) if in fact the Toeplitz matrix possesses enough special structure (this is the case if, for example, {Y_t} is an autoregressive-moving average process; see Kay, 1981, for details). Second, if we are given the sdf S_Y(·) instead of the autocorrelation sequence, we must first obtain the required s_{τ,Y}'s.
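As an illustration, the recursions above can be sketched in Python with numpy. This is not code from the paper; the function name and interface are hypothetical, and numpy's random generator stands in for the Gaussian white noise {W_t}.

```python
import numpy as np

def simulate_exact_time_domain(acvs, rng=None):
    """Generate one realization of length N = len(acvs) from a zero mean
    Gaussian stationary process with autocorrelation sequence
    s_{0,Y}, ..., s_{N-1,Y}, via the Levinson-Durbin recursions."""
    rng = np.random.default_rng() if rng is None else rng
    s = np.asarray(acvs, dtype=float)
    N = len(s)
    W = rng.standard_normal(N)       # zero mean, unit variance white noise
    sig2 = s[0]                      # sigma_0^2 = s_{0,Y}
    Y = np.empty(N)
    Y[0] = np.sqrt(sig2) * W[0]
    phi = np.zeros(N)                # phi[j-1] holds phi_{j,t} at stage t
    for t in range(1, N):
        # phi_{t,t} = (s_{t,Y} - sum_{j=1}^{t-1} phi_{j,t-1} s_{t-j,Y}) / sigma_{t-1}^2
        k = (s[t] - np.dot(phi[:t-1], s[t-1:0:-1])) / sig2
        # phi_{j,t} = phi_{j,t-1} - phi_{t,t} phi_{t-j,t-1},  1 <= j <= t-1
        phi[:t-1] = phi[:t-1] - k * phi[:t-1][::-1]
        phi[t-1] = k
        # sigma_t^2 = sigma_{t-1}^2 (1 - phi_{t,t}^2)
        sig2 = sig2 * (1.0 - k**2)
        # Y_t = sum_{j=1}^{t} phi_{j,t} Y_{t-j} + sigma_t W_t
        Y[t] = np.dot(phi[:t], Y[t-1::-1]) + np.sqrt(sig2) * W[t]
    return Y
```

For a white noise autocorrelation sequence (s_{0,Y} = 1, all other lags 0), every φ_{j,t} is zero and the sketch simply returns the white noise draws, which is a quick sanity check on the indexing.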
In principle this can always be done via numerical integration, but in practice this approach can be error-prone and time consuming.

III. An Exact Frequency Domain Method

Davies and Harte (1987) recently outlined a frequency domain technique for simulating {Y_t} that makes use of a fast Fourier transform (fft) algorithm and hence requires only O(N log(N)) operations. As these authors noted, their method is not completely general in that it can fail to work for some processes. The situations for which their method is applicable are easily described by a nonnegativity constraint. Their method has in fact appeared previously in the literature in the context of
simulating Gaussian moving average processes of order q, for which the nonnegativity constraint holds as long as N is greater than q (see Davis, Hagan, and Borgman, 1981, and the discussion of their work in Ripley, 1987).

Let M be any even positive integer (typically a power of 2), and define f_j ≡ j/M. Let

    S_j ≡ Σ_{τ=−(M/2−1)}^{M/2} s_{τ,Y} e^{−i2πf_jτ},   0 ≤ j ≤ M/2.    (2)

Note that we can rewrite the above as

    S_j = Σ_{τ=0}^{M/2} s_{τ,Y} e^{−i2πf_jτ} + Σ_{τ=M/2+1}^{M−1} s_{M−τ,Y} e^{−i2πf_jτ},

so we can obtain the S_j's via the discrete Fourier transform of the following sequence of length M:

    s_{0,Y}, s_{1,Y}, ..., s_{M/2−1,Y}, s_{M/2,Y}, s_{M/2−1,Y}, s_{M/2−2,Y}, ..., s_{1,Y}.

We can also reexpress Equation (2) as

    S_j = s_{M/2,Y} (−1)^j + ∫_{−1/2}^{1/2} W_M(f_j − f) S_Y(f) df,

where

    W_M(f) ≡ sin((M−1)πf) / sin(πf).

Because W_M(·) oscillates between positive and negative values and because s_{M/2,Y}(−1)^j can be negative, it is possible that some of the S_j's are negative. Note, however, that, if {Y_t} were a moving average process of order q ≤ M/2 − 1 so that s_{τ,Y} = 0 for |τ| ≥ M/2, then we would have S_j = S_Y(f_j), so that S_j ≥ 0. In order for the simulation method to work, we must impose the nonnegativity constraint that S_j ≥ 0 for 0 ≤ j ≤ M/2.

Let W_0, ..., W_{M−1} be a set of M independent and identically distributed Gaussian rvs with zero mean and unit variance. Define

    V_j ≡  √(S_0) W_0,                        j = 0;
           √(S_j / 2) (W_{2j−1} + i W_{2j}),  1 ≤ j < M/2;
           √(S_{M/2}) W_{M−1},                j = M/2;
           V*_{M−j},                          M/2 < j ≤ M−1

(the asterisk denotes complex conjugation). Note that

    cov{V_j, V_k} = E{V_j V*_k} = S_j if j = k, and 0 otherwise.

We next define the process {V_t} via

    V_t ≡ (1/√M) Σ_{j=0}^{M−1} V_j e^{i2πf_jt},   t = 0, ..., M−1.    (3)

By construction, the process {V_t} is real-valued. Because V_t is a linear combination of Gaussian rvs, the process is Gaussian. A straightforward exercise shows that {V_t} is a stationary process with zero mean and autocorrelation sequence {s_{τ,V}} given by

    s_{τ,V} = (1/M) Σ_{j=0}^{M−1} S_j e^{i2πf_jτ}.    (4)

In contrast to {Y_t}, however, the stationary process {V_t} is a harmonic process; i.e., it does not possess an sdf, but its spectral properties are given by an integrated spectrum that is a step function with steps at the ±f_j's.
This fact implies that realizations of {V_t} are periodic with period M, and hence so is its {s_{τ,V}}. Note that, if M is a power of 2, we can readily compute both {V_t} and {s_{τ,V}} using a conventional fft algorithm.

By substituting the definition for S_j in Equation (2) into Equation (4) and interchanging the order of the two summations, we obtain

    s_{τ,V} = (1/M) Σ_{ρ=−(M/2−1)}^{M/2} s_{ρ,Y} Σ_{j=0}^{M−1} e^{i2πf_j(τ−ρ)}.

From the result

    Σ_{j=0}^{M−1} e^{i2πf_jη} = M for η = 0, ±M, ±2M, ..., and 0 otherwise,

we obtain s_{τ,V} = s_{τ,Y} for all |τ| ≤ M/2. Hence the statistical properties of V_0, ..., V_{N−1} are identical to those of Y_0, ..., Y_{N−1} if we set M ≥ 2N. As is true for the exact time domain method, we might need to obtain the required s_{τ,Y}'s via numerical integration if we are given the sdf S_Y(·) instead of the autocorrelation sequence. Also, because of the nonnegativity constraints on the S_j's, this method cannot be used for all stationary processes. As we noted previously, it does work for moving average processes of order q ≤ M/2 − 1. Since every nonnegative lag window spectral estimate has an sdf corresponding to that of a high order moving average process, this exact frequency domain method is useful for simulating time series from such spectral estimates.
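The exact frequency domain method can be sketched as follows. This is a hypothetical numpy implementation, not code from the paper; `acvs` is assumed to hold s_{0,Y} through s_{M/2,Y}, and a small tolerance is used when testing the nonnegativity constraint to absorb floating point noise.

```python
import numpy as np

def simulate_davies_harte(acvs, rng=None):
    """Exact frequency domain simulation of a zero mean Gaussian stationary
    process. acvs = [s_0, s_1, ..., s_{M/2}]; returns a realization of
    length M = 2*(len(acvs) - 1). Raises ValueError if the nonnegativity
    constraint on the S_j's fails, i.e., the method is inapplicable."""
    rng = np.random.default_rng() if rng is None else rng
    s = np.asarray(acvs, dtype=float)
    M = 2 * (len(s) - 1)
    # circularized sequence s_0, ..., s_{M/2}, s_{M/2-1}, ..., s_1 of length M
    s_circ = np.concatenate([s, s[-2:0:-1]])
    S = np.fft.fft(s_circ).real            # the S_j of Equation (2)
    if np.any(S < -1e-10 * S.max()):
        raise ValueError("some S_j < 0: nonnegativity constraint violated")
    S = np.clip(S, 0.0, None)
    W = rng.standard_normal(M)
    V = np.empty(M, dtype=complex)
    half = M // 2
    j = np.arange(1, half)
    V[0] = np.sqrt(S[0]) * W[0]
    V[j] = np.sqrt(S[j] / 2.0) * (W[2*j - 1] + 1j * W[2*j])
    V[half] = np.sqrt(S[half]) * W[M - 1]
    V[half+1:] = np.conj(V[j][::-1])       # V_j = V*_{M-j}
    # Equation (3): V_t = (1/sqrt(M)) sum_j V_j exp(i 2 pi f_j t);
    # numpy's ifft includes a 1/M factor, so multiply by sqrt(M)
    return (np.fft.ifft(V) * np.sqrt(M)).real
```

The Hermitian symmetry imposed on the V_j's makes the inverse transform real-valued up to roundoff, matching the construction in the text.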
IV. An Approximate Frequency Domain Method

We consider here the construction of a zero mean Gaussian process {U_t} whose autocorrelation sequence {s_{τ,U}} agrees to a good approximation with {s_{τ,Y}} out to lag N−1. To begin with, we make the assumption that the sdf S_Y(·) is continuous over [−1/2, 1/2] (we relax this restriction in the next section). Let M be any even integer greater than or equal to the desired sample size N. Let W_j, j = 0, ..., M−1, be a set of M independent and identically distributed Gaussian rvs with zero mean and unit variance. Let f_j ≡ j/M as before. We define the process {U_t} via

    U_t ≡ (1/√M) Σ_{j=0}^{M−1} U_j e^{i2πf_jt},   t = 0, ..., M−1,    (5)

where

    U_j ≡  √(S_Y(0)) W_0,                         j = 0;
           √(S_Y(f_j) / 2) (W_{2j−1} + i W_{2j}), 1 ≤ j < M/2;
           √(S_Y(1/2)) W_{M−1},                   j = M/2;
           U*_{M−j},                              M/2 < j ≤ M−1.

By construction, {U_t} is a real-valued Gaussian process. A straightforward exercise shows that {U_t} is a stationary process with zero mean and {s_{τ,U}} given by

    s_{τ,U} = (1/M) Σ_{j=0}^{M−1} S_Y(f_j) e^{i2πf_jτ}    (6)

(note that this can readily be computed using an fft). In contrast to {Y_t}, however, the stationary process {U_t} is a harmonic process (as was the case with the V_t's in Equation (3)). Because the right-hand side of Equation (6) can be regarded as a Riemann sum approximation to the integral of Equation (1), we have s_{τ,U} ≈ s_{τ,Y} for possibly some values of τ, but certainly not all: whereas {s_{τ,U}} is periodic, {s_{τ,Y}} must damp down to 0. Note that, if we let M be a power of 2, Equation (5) can be quickly computed using a conventional fft algorithm; realizations of {U_t} of length M can thus be readily generated. By making M large enough, we can make s_{τ,U} arbitrarily close to s_{τ,Y} for τ = 0, ..., N−1, and hence the simulations U_0, ..., U_{N−1} should have statistical properties that closely match those of Y_0, ..., Y_{N−1}. This scheme was evidently first proposed by Thompson (1973), who advocated just letting M = N. This formulation is in common use in the physical sciences, perhaps due to the following result. Consider the periodogram of U_0, ..., U_{M−1}:

    Ŝ^(p)(f) ≡ (1/M) | Σ_{t=0}^{M−1} U_t e^{−i2πft} |².
(7)

If M is a power of 2, we can evaluate the periodogram quickly over the grid of Fourier frequencies f_j using an fft algorithm. When M = N, we have E{Ŝ^(p)(f_j)} = S_Y(f_j) by construction; hence realizations of {U_t} have a periodogram in good apparent agreement with the target sdf S_Y(·) at the Fourier frequencies. Unfortunately, the periodogram for Y_0, ..., Y_{N−1} can be a badly biased estimator of S_Y(·), so this intuitively pleasing agreement is misleading. Figure 1 illustrates this important point.

Mitchell and McPherson (1981) also discussed this approximate scheme. To avoid the M = N problem discussed above, they advocated that M should be made larger than N commensurate with the correlation length of {Y_t}, but beyond this brief statement they did not provide explicit guidelines for selecting M relative to N. We can do so by noting the following useful measure of how well the s_{τ,U}'s approximate the s_{τ,Y}'s for |τ| ≤ N−1. Let s^(M)_{τ,U} denote the value of s_{τ,U} generated using Equation (6) for a particular value of M. Define S^(M)(·) as the function whose Fourier coefficients are equal to s^(M)_{τ,U} for |τ| ≤ N−1 and equal to s_{τ,Y} for |τ| ≥ N. Parseval's theorem then tells us that

    SS(M) ≡ Σ_{τ=−(N−1)}^{N−1} | s^(M)_{τ,U} − s_{τ,Y} |² = ∫_{−1/2}^{1/2} | S^(M)(f) − S_Y(f) |² df.

SS(M) can be interpreted as the average squared difference between the sdf S_Y(·) and the function S^(M)(·). As M gets large, SS(M) will decrease to zero. If we know the s_{τ,Y}'s, we can readily compute SS(M) and find a value of M such that S^(M)(·) is sufficiently close to S_Y(·) in terms of average squared difference. If the s_{τ,Y}'s cannot be readily computed, we can increase M by, say, factors of 2 until

    Σ_{τ=−(N−1)}^{N−1} | s^(2M)_{τ,U} − s^(M)_{τ,U} |²

is small, indicating that the effect of doubling M is small in that the average squared difference between S^(2M)(·) and S^(M)(·) is small. Figure 2 illustrates how increasing M yields a better approximation to {s_{τ,Y}}.
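A sketch of the approximate method, together with the doubling rule for choosing M, might look as follows in numpy. The function names, the convergence tolerance, and the cap on the number of doublings are all hypothetical choices, not part of the paper.

```python
import numpy as np

def simulate_approx_freq_domain(sdf, N, M=None, rng=None):
    """Approximate frequency domain simulation (Equation (5)). `sdf` is a
    function returning S_Y(f) for 0 <= f <= 1/2, assumed finite at f = 0;
    returns U_0, ..., U_{N-1}, with even M >= N controlling accuracy."""
    rng = np.random.default_rng() if rng is None else rng
    M = 2 * N if M is None else M
    half = M // 2
    S = np.array([sdf(j / M) for j in range(half + 1)])   # S_Y(f_j), f_j = j/M
    W = rng.standard_normal(M)
    U = np.empty(M, dtype=complex)
    j = np.arange(1, half)
    U[0] = np.sqrt(S[0]) * W[0]
    U[j] = np.sqrt(S[j] / 2.0) * (W[2*j - 1] + 1j * W[2*j])
    U[half] = np.sqrt(S[half]) * W[M - 1]
    U[half+1:] = np.conj(U[j][::-1])                      # U_j = U*_{M-j}
    return (np.fft.ifft(U) * np.sqrt(M)).real[:N]

def acvs_approx(sdf, M):
    """The s_{tau,U} of Equation (6) for lags 0, ..., M-1 via an fft,
    using the evenness of the sdf to fold f_j into [0, 1/2]."""
    f = np.minimum(np.arange(M), M - np.arange(M)) / M
    S = np.array([sdf(fj) for fj in f])
    return np.fft.ifft(S).real

def choose_M(sdf, N, M0, tol=1e-3):
    """Double M (from even M0 >= N) until s_{tau,U} stabilizes over
    lags |tau| <= N-1, per the criterion in the text."""
    M = M0
    for _ in range(30):                                   # cap the doublings
        s1 = acvs_approx(sdf, M)[:N]
        s2 = acvs_approx(sdf, 2 * M)[:N]
        # sum over tau = -(N-1), ..., N-1; lag 0 enters once
        if 2.0 * np.sum((s1 - s2) ** 2) - (s1[0] - s2[0]) ** 2 < tol:
            return 2 * M
        M *= 2
    raise RuntimeError("no convergence; check that the target sdf is integrable")
```

For a flat sdf (white noise), Equation (6) gives s_{0,U} = 1 and zero at all other lags for any M, so the doubling criterion is met immediately.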
[Figure 1 appears here.]

Figure 1. The thick curves in plots (a) and (b) show the true sdf S_Y(·) (on a decibel scale) for a particular stationary process {Y_t}. The thin bumpy curve in plot (a) shows the expected value of the periodogram for a sample of size N = 64 from this process. The thin bumpy curve in plot (b) shows the expected value of the periodogram Ŝ^(p)(·) of Equation (7) for M = N = 64. If the statistical properties of {U_t} closely matched those of {Y_t}, there would be good agreement between the two thin bumpy curves, but in fact they are substantially different. In particular, note that in plot (b) E{Ŝ^(p)(f)} = S_Y(f) for f = f_j = j/64 and j = 0, ..., 32, whereas E{Ŝ^(p)_Y(·)} and S_Y(·) in plot (a) differ at some of these frequencies by more than 2 orders of magnitude (20 dB).

[Figure 2 appears here.]

Figure 2. The thick curves in all three plots show the true {s_{τ,Y}} for lags 0 to 63 for a particular stationary process {Y_t} (in fact, the same process as was used in Figure 1). The thin curves show {s_{τ,U}} of Equation (6) for, from top to bottom, M = 64, 128 and 256; the thin curve in plot (c) agrees so well with the thick curve that they are visibly indistinguishable.

V. Stationary Power-Law Processes

Suppose now that {Y_t} is a stationary power-law process, which by definition has an sdf given by

    S_Y(f) = |f|^α S_0(f),   |f| ≤ 1/2,    (8)

where −1 < α < 0, and S_0(·) is strictly positive, continuous and has bounded variation. A specific example of such a process is a fractional difference process with sdf given by

    S_Y(f) = |2 sin(πf)|^α,   |f| ≤ 1/2.

This process also has an autocorrelation sequence {s_{τ,Y}} that can be computed recursively by a simple formula. Using these easily computed s_{τ,Y}'s, Davies and Harte (1987) found that the exact frequency domain method can be used to simulate fractional difference processes.
Since a similar simple formula for the autocorrelation sequence is not readily available for other stationary power-law processes, it is of some interest to see if we can modify the approximate frequency domain method for use here. The main difficulty is that this method requires that S_Y(0) be finite, whereas Equation (8) tells us that S_Y(0) = ∞. We thus need to find a suitable replacement for S_Y(0).
To do so, let us return momentarily to the setup of the previous section, namely, a process with an sdf that is continuous over [−1/2, 1/2]. An easy exercise tells us that

    Ū ≡ (1/M) Σ_{t=0}^{M−1} U_t = √(S_Y(0)/M) W_0,

from which we obtain

    var{Ū} = S_Y(0)/M ≈ var{Ȳ},

where Ȳ ≡ (1/M) Σ_{t=0}^{M−1} Y_t (Priestley, 1981, p. 320). We can thus regard S_Y(0)/M as an approximation to var{Ȳ}. Künsch (1991) shows that, for a stationary power-law process {Y_t},

    var{Ȳ} ≈ 4 S_0(0) Γ(1+α) sin(−πα/2) / [ (2π)^{1+α} α(α−1) M^{1+α} ] ≡ C.

This result suggests that we let U_0 = √(MC) W_0 for stationary power-law processes. Limited tests to date indicate that this substitution is effective. More work needs to be done, however, to find the best U_0 so that the statistical properties of U_0, ..., U_{N−1} are as close as possible to those of Y_0, ..., Y_{N−1}.

VI. Nonstationary Power-Law Processes

Suppose now that {Y_t} is a nonstationary power-law process with an sdf given by Equation (8) with α ≤ −1 (because S_Y(·) integrates to ∞, it is not a proper sdf). Processes such as these are common models in the physical sciences; some typical values for α are −1 ("flicker" noise), −5/3 (certain types of turbulence) and −2 (random walk noise). Yaglom (1958) developed a rigorous interpretation for the sdf S_Y(·) in terms of finite differences of {Y_t}. For example, if −3 < α ≤ −1, the first difference process X_t ≡ Y_t − Y_{t−1} is a stationary process with sdf

    S_X(f) = 4 sin²(πf) S_Y(f),   |f| ≤ 1/2,

an equation which is used to define S_Y(·) (for α ≤ −3, we define S_Y(·) using an appropriate higher order difference). Because we define {Y_t} for −3 < α ≤ −1 in terms of its first difference process {X_t}, we can simulate {X_t} and form cumulative sums to simulate {Y_t}. Here we merely note that we can use S_Y(·) directly in the approximate frequency domain method because

    U_t − U_{t−1} = (1/√M) Σ_{j=0}^{M−1} 2i sin(πf_j) U_j e^{i2πf_j(t−1/2)}

approximates Y_t − Y_{t−1} properly (again, care must be taken at f = 0).
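The simulate-the-difference-and-sum route of Section VI can be sketched as follows, restricted for simplicity to a pure power-law sdf S_Y(f) = S0·|f|^α with −2 ≤ α ≤ −1, so that the first difference sdf S_X(f) = 4 sin²(πf) S_Y(f) stays finite on [0, 1/2]. The function name, the default M, and the convention Y_0 = 0 are hypothetical choices for this sketch, not prescriptions from the paper.

```python
import numpy as np

def simulate_nonstationary_power_law(alpha, N, M=None, rng=None, S0=1.0):
    """Simulate a nonstationary power-law process with nominal sdf
    S_Y(f) = S0 * |f|**alpha, -2 <= alpha <= -1, by synthesizing its
    stationary first difference X_t (sdf 4 sin^2(pi f) S_Y(f)) with the
    approximate frequency domain method and cumulatively summing."""
    rng = np.random.default_rng() if rng is None else rng
    M = 4 * N if M is None else M
    half = M // 2
    f = np.arange(half + 1) / M
    SX = np.empty(half + 1)
    # limit of 4 sin^2(pi f) S0 f^alpha as f -> 0 (finite on this alpha range)
    SX[0] = 4.0 * np.pi**2 * S0 if alpha == -2 else 0.0
    SX[1:] = 4.0 * np.sin(np.pi * f[1:])**2 * S0 * f[1:]**alpha
    # Hermitian-symmetric Fourier coefficients, as in Equation (5)
    W = rng.standard_normal(M)
    C = np.empty(M, dtype=complex)
    j = np.arange(1, half)
    C[0] = np.sqrt(SX[0]) * W[0]
    C[j] = np.sqrt(SX[j] / 2.0) * (W[2*j - 1] + 1j * W[2*j])
    C[half] = np.sqrt(SX[half]) * W[M - 1]
    C[half+1:] = np.conj(C[j][::-1])
    X = (np.fft.ifft(C) * np.sqrt(M)).real         # first difference process
    # cumulative sums give the nonstationary process; Y_0 = 0 by convention
    return np.concatenate([[0.0], np.cumsum(X[:N-1])])
```

With α = −2 this produces a random-walk-like series, since the first differences then have an approximately flat sdf at low frequencies.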
VII. Concluding Comments

We have described three methods for simulating a stationary Gaussian process {Y_t} with a specified sdf S_Y(·). If the s_{τ,Y}'s for the process are readily available, then the best choice is the exact frequency domain method, provided the S_j's of Equation (2) are in fact nonnegative. If the s_{τ,Y}'s are not readily available, or if one or more of the S_j's are negative, then with care the approximate frequency domain method can be useful. However, the M = N formulation of this method that is sometimes used should be avoided.

Acknowledgement

The author thanks the Office of Naval Research for support under contract number N K.

References

Davies, R. B. and Harte, D. S. (1987) Tests for Hurst Effect. Biometrika, 74.

Davis, B. M., Hagan, R. and Borgman, L. E. (1981) A Program for the Finite Fourier Transform Simulation of Realizations from a One-Dimensional Random Function with Known Covariance. Comp. and Geosci., 7.

Demeure, C. J. and Scharf, L. L. (1987) Linear Statistical Models for Stationary Sequences and Related Algorithms for Cholesky Factorization of Toeplitz Matrices. IEEE Trans. Acoust., Speech, Signal Processing, 35, 29-42.

Franklin, J. N. (1965) Numerical Simulation of Stationary and Nonstationary Gaussian Random Processes. SIAM Review, 7.

Kay, S. M. (1981) Efficient Generation of Colored Noise. Proc. IEEE, 69.

Künsch, H. R. (1991) Dependence Among Observations: Consequences and Methods to Deal with It. In Directions in Robust Statistics and Diagnostics, Part I, edited by W. Stahel and S. Weisberg, New York: Springer-Verlag.

Mitchell, R. L. and McPherson, D. A. (1981) Generating Nonstationary Random Sequences. IEEE Trans. Aero., Elec. Sys., 17.

Priestley, M. B. (1981) Spectral Analysis and Time Series. London: Academic Press.

Ripley, B. D. (1987) Stochastic Simulation. New York: John Wiley & Sons.

Thompson, R. (1973) Generation of Stochastic Processes with Given Spectrum. Util. Math., 3.

Yaglom, A. M. (1958) Correlation Theory of Processes with Random Stationary nth Increments. Translations Amer. Math. Soc., 8.
More informationTinySR. Peter Schmidt-Nielsen. August 27, 2014
TinySR Peter Schmidt-Nielsen August 27, 2014 Abstract TinySR is a light weight real-time small vocabulary speech recognizer written entirely in portable C. The library fits in a single file (plus header),
More informationExpressions for the covariance matrix of covariance data
Expressions for the covariance matrix of covariance data Torsten Söderström Division of Systems and Control, Department of Information Technology, Uppsala University, P O Box 337, SE-7505 Uppsala, Sweden
More information13. Power Spectrum. For a deterministic signal x(t), the spectrum is well defined: If represents its Fourier transform, i.e., if.
For a deterministic signal x(t), the spectrum is well defined: If represents its Fourier transform, i.e., if jt X ( ) = xte ( ) dt, (3-) then X ( ) represents its energy spectrum. his follows from Parseval
More informationNotes on Random Processes
otes on Random Processes Brian Borchers and Rick Aster October 27, 2008 A Brief Review of Probability In this section of the course, we will work with random variables which are denoted by capital letters,
More informationLECTURES 2-3 : Stochastic Processes, Autocorrelation function. Stationarity.
LECTURES 2-3 : Stochastic Processes, Autocorrelation function. Stationarity. Important points of Lecture 1: A time series {X t } is a series of observations taken sequentially over time: x t is an observation
More information12 The Heat equation in one spatial dimension: Simple explicit method and Stability analysis
ATH 337, by T. Lakoba, University of Vermont 113 12 The Heat equation in one spatial dimension: Simple explicit method and Stability analysis 12.1 Formulation of the IBVP and the minimax property of its
More informationWavelet-based estimation for Gaussian time series and spatio-temporal processes
Wavelet-based estimation for Gaussian time series and spatio-temporal processes Dissertation Presented in Partial Fulfillment of the Requirements for the Degree Doctor of Philosophy in the Graduate School
More informationTHE PROBLEMS OF ROBUST LPC PARAMETRIZATION FOR. Petr Pollak & Pavel Sovka. Czech Technical University of Prague
THE PROBLEMS OF ROBUST LPC PARAMETRIZATION FOR SPEECH CODING Petr Polla & Pavel Sova Czech Technical University of Prague CVUT FEL K, 66 7 Praha 6, Czech Republic E-mail: polla@noel.feld.cvut.cz Abstract
More informationTesting for Homogeneity of Variance in Time Series: Long Memory, Wavelets and the Nile River
Testing for Homogeneity of Variance in Time Series: Long Memory, Wavelets and the Nile River Brandon Whitcher Geophysical Statistics Project National Center for Atmospheric Research Boulder, Colorado Peter
More informationParametric Inference on Strong Dependence
Parametric Inference on Strong Dependence Peter M. Robinson London School of Economics Based on joint work with Javier Hualde: Javier Hualde and Peter M. Robinson: Gaussian Pseudo-Maximum Likelihood Estimation
More informationTime Series Analysis. Nonstationary Data Analysis by Time Deformation
Communications in Statistics Theory and Methods, 34: 163 192, 2005 Copyright Taylor & Francis, Inc. ISSN: 0361-0926 print/1532-415x online DOI: 10.1081/STA-200045869 Time Series Analysis Nonstationary
More informationECE 541 Stochastic Signals and Systems Problem Set 11 Solution
ECE 54 Stochastic Signals and Systems Problem Set Solution Problem Solutions : Yates and Goodman,..4..7.3.3.4.3.8.3 and.8.0 Problem..4 Solution Since E[Y (t] R Y (0, we use Theorem.(a to evaluate R Y (τ
More informationIntroduction to Spectral and Time-Spectral Analysis with some Applications
Introduction to Spectral and Time-Spectral Analysis with some Applications A.INTRODUCTION Consider the following process X t = U cos[(=3)t] + V sin[(=3)t] Where *E[U] = E[V ] = 0 *E[UV ] = 0 *V ar(u) =
More information2. SPECTRAL ANALYSIS APPLIED TO STOCHASTIC PROCESSES
2. SPECTRAL ANALYSIS APPLIED TO STOCHASTIC PROCESSES 2.0 THEOREM OF WIENER- KHINTCHINE An important technique in the study of deterministic signals consists in using harmonic functions to gain the spectral
More informationAsymptotic Decorrelation of Between-Scale Wavelet Coefficients
Asymptotic Decorrelation of Between-Scale Wavelet Coefficients Peter F. Craigmile 1 and Donald B. Percival 2 1 Department of Statistics, The Ohio State University, 44 Cockins Hall, 1958 Neil Avenue, Columbus.
More informationAn algorithm for robust fitting of autoregressive models Dimitris N. Politis
An algorithm for robust fitting of autoregressive models Dimitris N. Politis Abstract: An algorithm for robust fitting of AR models is given, based on a linear regression idea. The new method appears to
More information{ } Stochastic processes. Models for time series. Specification of a process. Specification of a process. , X t3. ,...X tn }
Stochastic processes Time series are an example of a stochastic or random process Models for time series A stochastic process is 'a statistical phenomenon that evolves in time according to probabilistic
More informationEcon 424 Time Series Concepts
Econ 424 Time Series Concepts Eric Zivot January 20 2015 Time Series Processes Stochastic (Random) Process { 1 2 +1 } = { } = sequence of random variables indexed by time Observed time series of length
More informationTHE PROCESSING of random signals became a useful
IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, VOL. 58, NO. 11, NOVEMBER 009 3867 The Quality of Lagged Products and Autoregressive Yule Walker Models as Autocorrelation Estimates Piet M. T. Broersen
More informationA test for improved forecasting performance at higher lead times
A test for improved forecasting performance at higher lead times John Haywood and Granville Tunnicliffe Wilson September 3 Abstract Tiao and Xu (1993) proposed a test of whether a time series model, estimated
More informationLesson 1. Optimal signalbehandling LTH. September Statistical Digital Signal Processing and Modeling, Hayes, M:
Lesson 1 Optimal Signal Processing Optimal signalbehandling LTH September 2013 Statistical Digital Signal Processing and Modeling, Hayes, M: John Wiley & Sons, 1996. ISBN 0471594318 Nedelko Grbic Mtrl
More informationCourse content (will be adapted to the background knowledge of the class):
Biomedical Signal Processing and Signal Modeling Lucas C Parra, parra@ccny.cuny.edu Departamento the Fisica, UBA Synopsis This course introduces two fundamental concepts of signal processing: linear systems
More informationEconometrics I. Professor William Greene Stern School of Business Department of Economics 25-1/25. Part 25: Time Series
Econometrics I Professor William Greene Stern School of Business Department of Economics 25-1/25 Econometrics I Part 25 Time Series 25-2/25 Modeling an Economic Time Series Observed y 0, y 1,, y t, What
More informationSignal Modeling Techniques in Speech Recognition. Hassan A. Kingravi
Signal Modeling Techniques in Speech Recognition Hassan A. Kingravi Outline Introduction Spectral Shaping Spectral Analysis Parameter Transforms Statistical Modeling Discussion Conclusions 1: Introduction
More informationFourier Analysis and Power Spectral Density
Chapter 4 Fourier Analysis and Power Spectral Density 4. Fourier Series and ransforms Recall Fourier series for periodic functions for x(t + ) = x(t), where x(t) = 2 a + a = 2 a n = 2 b n = 2 n= a n cos
More informationMinitab Project Report Assignment 3
3.1.1 Simulation of Gaussian White Noise Minitab Project Report Assignment 3 Time Series Plot of zt Function zt 1 0. 0. zt 0-1 0. 0. -0. -0. - -3 1 0 30 0 50 Index 0 70 0 90 0 1 1 1 1 0 marks The series
More informationAn Introduction to Wavelets with Applications in Environmental Science
An Introduction to Wavelets with Applications in Environmental Science Don Percival Applied Physics Lab, University of Washington Data Analysis Products Division, MathSoft overheads for talk available
More informationStochastic Structural Dynamics Prof. Dr. C. S. Manohar Department of Civil Engineering Indian Institute of Science, Bangalore
Stochastic Structural Dynamics Prof. Dr. C. S. Manohar Department of Civil Engineering Indian Institute of Science, Bangalore Lecture No. # 33 Probabilistic methods in earthquake engineering-2 So, we have
More informationSome basic properties of cross-correlation functions of n-dimensional vector time series
Journal of Statistical and Econometric Methods, vol.4, no.1, 2015, 63-71 ISSN: 2241-0384 (print), 2241-0376 (online) Scienpress Ltd, 2015 Some basic properties of cross-correlation functions of n-dimensional
More informationContents of this Document [ntc5]
Contents of this Document [ntc5] 5. Random Variables: Applications Reconstructing probability distributions [nex14] Probability distribution with no mean value [nex95] Variances and covariances [nex20]
More informationUTILIZING PRIOR KNOWLEDGE IN ROBUST OPTIMAL EXPERIMENT DESIGN. EE & CS, The University of Newcastle, Australia EE, Technion, Israel.
UTILIZING PRIOR KNOWLEDGE IN ROBUST OPTIMAL EXPERIMENT DESIGN Graham C. Goodwin James S. Welsh Arie Feuer Milan Depich EE & CS, The University of Newcastle, Australia 38. EE, Technion, Israel. Abstract:
More informationBiomedical Signal Processing and Signal Modeling
Biomedical Signal Processing and Signal Modeling Eugene N. Bruce University of Kentucky A Wiley-lnterscience Publication JOHN WILEY & SONS, INC. New York Chichester Weinheim Brisbane Singapore Toronto
More informationAcceleration, Velocity and Displacement Spectra Omega Arithmetic
Acceleration, Velocity and Displacement Spectra Omega Arithmetic 1 Acceleration, Velocity and Displacement Spectra Omega Arithmetic By Dr Colin Mercer, Technical Director, Prosig A ccelerometers are robust,
More informationEstimation of the Optimum Rotational Parameter for the Fractional Fourier Transform Using Domain Decomposition
Estimation of the Optimum Rotational Parameter for the Fractional Fourier Transform Using Domain Decomposition Seema Sud 1 1 The Aerospace Corporation, 4851 Stonecroft Blvd. Chantilly, VA 20151 Abstract
More informationCommunication Theory II
Communication Theory II Lecture 8: Stochastic Processes Ahmed Elnakib, PhD Assistant Professor, Mansoura University, Egypt March 5 th, 2015 1 o Stochastic processes What is a stochastic process? Types:
More informationLECTURE NOTES IN AUDIO ANALYSIS: PITCH ESTIMATION FOR DUMMIES
LECTURE NOTES IN AUDIO ANALYSIS: PITCH ESTIMATION FOR DUMMIES Abstract March, 3 Mads Græsbøll Christensen Audio Analysis Lab, AD:MT Aalborg University This document contains a brief introduction to pitch
More informationCCNY. BME I5100: Biomedical Signal Processing. Stochastic Processes. Lucas C. Parra Biomedical Engineering Department City College of New York
BME I5100: Biomedical Signal Processing Stochastic Processes Lucas C. Parra Biomedical Engineering Department CCNY 1 Schedule Week 1: Introduction Linear, stationary, normal - the stuff biology is not
More informationFull terms and conditions of use:
This article was downloaded by:[smu Cul Sci] [Smu Cul Sci] On: 28 March 2007 Access Details: [subscription number 768506175] Publisher: Taylor & Francis Informa Ltd Registered in England and Wales Registered
More informationProbability Space. J. McNames Portland State University ECE 538/638 Stochastic Signals Ver
Stochastic Signals Overview Definitions Second order statistics Stationarity and ergodicity Random signal variability Power spectral density Linear systems with stationary inputs Random signal memory Correlation
More informationIntroduction to Stochastic processes
Università di Pavia Introduction to Stochastic processes Eduardo Rossi Stochastic Process Stochastic Process: A stochastic process is an ordered sequence of random variables defined on a probability space
More informationMagnitude F y. F x. Magnitude
Design of Optimum Multi-Dimensional Energy Compaction Filters N. Damera-Venkata J. Tuqan B. L. Evans Imaging Technology Dept. Dept. of ECE Dept. of ECE Hewlett-Packard Labs Univ. of California Univ. of
More informationNOISE ROBUST RELATIVE TRANSFER FUNCTION ESTIMATION. M. Schwab, P. Noll, and T. Sikora. Technical University Berlin, Germany Communication System Group
NOISE ROBUST RELATIVE TRANSFER FUNCTION ESTIMATION M. Schwab, P. Noll, and T. Sikora Technical University Berlin, Germany Communication System Group Einsteinufer 17, 1557 Berlin (Germany) {schwab noll
More informationResearch Article Least Squares Estimators for Unit Root Processes with Locally Stationary Disturbance
Advances in Decision Sciences Volume, Article ID 893497, 6 pages doi:.55//893497 Research Article Least Squares Estimators for Unit Root Processes with Locally Stationary Disturbance Junichi Hirukawa and
More informationThe Kalman Filter. Data Assimilation & Inverse Problems from Weather Forecasting to Neuroscience. Sarah Dance
The Kalman Filter Data Assimilation & Inverse Problems from Weather Forecasting to Neuroscience Sarah Dance School of Mathematical and Physical Sciences, University of Reading s.l.dance@reading.ac.uk July
More information