Part III Spectrum Estimation
1 ECE792-41 Part III Spectrum Estimation
3.2 Parametric Methods for Spectral Estimation
Electrical & Computer Engineering, North Carolina State University
Acknowledgment: ECE792-41 slides were adapted from ENEE630 slides developed by Profs. K.J. Ray Liu and Min Wu at the University of Maryland, College Park. Contact:
2 Summary of Related Readings on Part-III
Overview: Haykin .6, .0
3.1 Non-parametric method: Hayes 8.1; 8.2 (8.2.3, 8.2.5)
3.2 Parametric method: Hayes 8.5, 4.7
3.3 Frequency estimation: Hayes 8.6
Review:
On DSP and linear algebra: Hayes 2.2, 2.3
On probability and parameter estimation: Hayes
NCSU ECE792-41 Statistical Methods for Signal Analytics, Parametric spectral estimation
3 Motivation
Implicit assumption by classical methods: classical methods apply the Fourier transform to either windowed data or a windowed autocorrelation function (ACF). They implicitly assume the unobserved data or ACF values outside the window are zero, which is not true in reality. Consequence of windowing: a smeared spectral estimate (leading to low resolution).
If prior knowledge about the process is available: we can use that knowledge to select a good model to approximate the process. We usually need to estimate fewer model parameters (than nonparametric approaches) using the limited data points we have, and the model may allow us to better describe the process outside the window (instead of assuming zeros).
4 General Procedure of Parametric Methods
1. Select a model (based on prior knowledge).
2. Estimate the parameters of the assumed model.
3. Obtain the spectral estimate implied by the model (with the estimated parameters).
5 Spectral Estimation Using AR, MA, ARMA Models
Physical insight: the process is generated/approximated by filtering white noise with an LTI filter of rational transfer function H(z).
Use observed data to estimate a few lags of r(k); larger lags of r(k) can be implicitly extrapolated by the model.
Relation between r(k) and the filter parameters {a_k} and {b_k}: the parameter equations from Section..(6).
Solve the parameter equations to obtain the filter parameters, then use the p.s.d. implied by the model as our spectral estimate.
Dealing with nonlinear parameter equations: try to convert/relate them to the AR models, which have linear equations.
6 Review: Parameter Equations
Yule-Walker equations (for AR process)
ARMA model
MA model
7 3.2.1 AR Spectral Estimation (1)
Review of AR process: the time series {x[n], x[n-1], ..., x[n-M]} is a realization of an AR process of order M if it satisfies the difference equation
x[n] + a_1 x[n-1] + ... + a_M x[n-M] = v[n]
where {v[n]} is a white noise process with variance $\sigma_v^2$.
Generating an AR process with parameters {a_i}: pass white noise v[n] through the all-pole filter 1/A(z).
8 ) (/ ) ( ) ( ˆ AR z A z A z P f j j e e z AR ) ( ˆ M f j a e f P NCSU ECE79-4 Statistical Methods for Signal Analytics Parametric spectral estimation [] P.S.D. of An AR Process Recall: the p.s.d. of an AR process {x[n]} is given by
9 Procedure of AR Spectral Estimation
1. Observe the available data points x[0], ..., x[N-1], and determine the AR process order p.
2. Estimate the autocorrelation function (ACF) for k = 0, ..., p:
Biased (low variance): $\hat{r}(k)=\frac{1}{N}\sum_{n=0}^{N-1-k}x[n+k]\,x^*[n]$
Unbiased (may not be nonnegative definite): $\hat{r}(k)=\frac{1}{N-k}\sum_{n=0}^{N-1-k}x[n+k]\,x^*[n]$
3. Solve for {a_i} from the Yule-Walker equations or the normal equations of forward linear prediction. Recall that for an AR process, the normal equations of FLP are equivalent to the Yule-Walker equations.
4. Obtain the power spectrum estimate:
$\hat{P}_{\mathrm{AR}}(f)=\dfrac{\hat{\sigma}^2}{\left|1+\sum_{k=1}^{p}\hat{a}_k e^{-j2\pi fk}\right|^2}$
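The procedure above can be sketched in Python (a minimal illustration, not the course's code; it assumes real-valued data and solves the Yule-Walker equations with a direct Toeplitz solve rather than the Levinson-Durbin recursion):

```python
import numpy as np

def biased_acf(x, p):
    """Biased ACF estimate r_hat(k) = (1/N) sum_n x[n+k] x[n], k = 0..p."""
    N = len(x)
    return np.array([np.dot(x[k:], x[:N - k]) / N for k in range(p + 1)])

def yule_walker(r):
    """Solve the Yule-Walker equations R a = -r for the AR coefficients
    a_1..a_p and the white-noise variance, given ACF values r[0..p]."""
    p = len(r) - 1
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, -r[1:])
    sigma2 = r[0] + np.dot(a, r[1:])      # prediction-error variance
    return np.concatenate([[1.0], a]), sigma2

def ar_psd(a, sigma2, nfft=512):
    """AR spectrum sigma2 / |A(e^{j 2 pi f})|^2 on an nfft-point grid."""
    A = np.fft.fft(a, nfft)               # A(e^{j 2 pi f}) at f = m/nfft
    f = np.fft.fftfreq(nfft)
    return f, sigma2 / np.abs(A) ** 2

# Example: AR(2) process with poles at 0.9 exp(+-j pi/4)
rng = np.random.default_rng(0)
a_true = np.array([1.0, -1.2728, 0.81])
N = 4096
v = rng.standard_normal(N)
x = np.zeros(N)
for n in range(N):
    x[n] = v[n] - a_true[1] * x[n - 1] - a_true[2] * x[n - 2]
a_hat, s2 = yule_walker(biased_acf(x, 2))
```

With a few thousand samples the estimated coefficients land close to the true ones, and `ar_psd(a_hat, s2)` shows a sharp peak near f = 0.125 (the pole angle).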
10 3.2.2 Maximum Entropy Spectral Estimation (MESE)
Viewpoint: extrapolation of the ACF. {r[0], r[1], ..., r[p]} is known; there are generally an infinite number of possible extrapolations of r(k) at larger lags. As long as {r[p+1], r[p+2], ...} guarantee that the correlation matrix is nonnegative definite, they all form valid ACFs for a w.s.s. process.
Maximum entropy principle: perform the extrapolation such that the time series characterized by the extrapolated ACF has maximum entropy, i.e., the time series will be the least constrained, thus the most random, among all series having the same first (p+1) ACF values. Maximizing the entropy leads the estimated p.s.d. to be the smoothest one; recall that a white noise process has a flat p.s.d.
11 MESE for Gaussian Process: Formulation
For a Gaussian random process, the entropy per sample is proportional to
$\int_{-1/2}^{1/2}\ln P(f)\,df$
The maximum entropy spectral estimate is
$\max_{P(f)}\int_{-1/2}^{1/2}\ln P(f)\,df$
subject to
$\int_{-1/2}^{1/2}P(f)\,e^{j2\pi fk}\,df=r(k),\quad k=0,1,\ldots,p$
12 MESE for Gaussian Process: Solution
Using the Lagrange multiplier technique, the solution can be found as
$\hat{P}_{\mathrm{ME}}(f)=\dfrac{\sigma^2}{\left|1+\sum_{k=1}^{p}a_k e^{-j2\pi fk}\right|^2}$
where {a_k} are found by solving the Yule-Walker equations given the ACF values r(0), ..., r(p).
For Gaussian processes, the MESE is equivalent to the AR spectral estimator, and $\hat{P}_{\mathrm{ME}}(f)$ is an all-pole spectrum. The two estimators differ in their assumptions on the process: Gaussian vs. AR.
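A sketch of the Lagrangian argument behind this solution (a standard derivation filled in here, not reproduced from the slides): stationarity of the Lagrangian forces 1/P(f) to be a finite trigonometric polynomial of order p, which the p+1 correlation constraints then pin down.

```latex
L[P] = \int_{-1/2}^{1/2}\ln P(f)\,df
     + \sum_{k=-p}^{p}\lambda_k\!\left(\int_{-1/2}^{1/2}P(f)\,e^{j2\pi fk}\,df - r(k)\right),
\qquad
\frac{\delta L}{\delta P(f)} = \frac{1}{P(f)} + \sum_{k=-p}^{p}\lambda_k e^{j2\pi fk} = 0
\;\Longrightarrow\;
\hat{P}_{\mathrm{ME}}(f) = \frac{1}{-\sum_{k=-p}^{p}\lambda_k e^{j2\pi fk}}.
```

Since the denominator is a real, nonnegative trigonometric polynomial of order p, spectral factorization writes it as $|A(e^{j2\pi f})|^2/\sigma^2$, giving the all-pole form with {a_k} fixed by the correlation constraints, i.e., the Yule-Walker equations.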
13 3.2.3 MA Spectral Estimation
Recall two important results on MA processes: (1) solving for {b_k} given {r(k)} requires solving a set of nonlinear equations; (2) an MA process can be approximated by an AR process of sufficiently high order.
An MA(q) model
$x[n]=\sum_{k=0}^{q}b_k v[n-k],\qquad B(z)=\sum_{k=0}^{q}b_k z^{-k}$
can be used to define an MA spectral estimator
$\hat{P}_{\mathrm{MA}}(f)=\sigma_v^2\left|\sum_{k=0}^{q}\hat{b}_k e^{-j2\pi fk}\right|^2$
14 Basic Idea to Avoid Solving Nonlinear Equations
Consider two processes.
Process #1: we observed N samples and need to perform the spectral estimate. We first model it as a high-order AR process, generated by a 1/A(z) filter.
Process #2: an MA process generated by an A(z) filter. Since we know A(z), we can know process #2's autocorrelation function. We model process #2 as an AR(q) process, so its filter would be 1/B(z).
15 Use AR Model to Help Finding MA Parameters
For simplicity, we consider real coefficients for the MA model. Note
$P_{\mathrm{MA}}(z)=\sigma_v^2\,B(z)B(z^{-1})$
To approximate it with an AR(L) model, i.e.
$P(z)=\dfrac{\sigma_v^2}{A(z)A(z^{-1})}$, where $A(z)=1+\sum_{k=1}^{L}a_k z^{-k}$,
we need
$A(z)A(z^{-1})\approx\dfrac{1}{B(z)B(z^{-1})}$ (LHS of order L, RHS of order q).
The RHS represents the power spectrum of an AR(q) process, and the inverse z-transform of the LHS is the ACF of that AR(q) process.
16 Use AR to Help Finding MA Parameters (cont'd)
A random process with power spectrum A(z)A(z^{-1}) can be viewed as filtering a white process by a filter A(z), and it has autocorrelation proportional to
$\sum_{n=0}^{L-k}a_n a_{n+k}$ for lag k
Knowing this autocorrelation function, we can use the Levinson-Durbin recursion to find the optimal linear prediction parameters for the process (or equivalently its AR approximation parameters). Thus we get {b_k} from
$A(z)A(z^{-1})\approx\dfrac{1}{B(z)B(z^{-1})}$
17 Durbin's Method
1. Use the Levinson-Durbin recursion to solve for the AR(L) coefficients from the biased ACF estimate of the data. That is, we first approximate the observed data sequence {x[0], ..., x[N-1]} with an AR model of high order (often pick L > 4q).
We use the biased ACF estimate here to ensure nonnegative definiteness and smaller variance than the unbiased estimate (which divides by N-k).
18 Durbin's Method (cont'd)
2. Fit the coefficient sequence {1, â_1, â_2, ..., â_L} to an AR(q) model; the result {b_i} gives the estimated MA parameters for the original {x[n]}.
Note we add the 1/(L+1) factor to allow the interpretation of r_a(k) as an autocorrelation function estimator.
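The two steps can be put together as follows (a minimal sketch, assuming real-valued data; the helper names and the direct Toeplitz solve in place of the Levinson-Durbin recursion are my own choices). Note that the biased ACF of the coefficient sequence automatically carries the 1/(L+1) factor mentioned above:

```python
import numpy as np

def biased_acf(x, p):
    """Biased ACF: r_hat(k) = (1/N) sum_n x[n+k] x[n], k = 0..p."""
    N = len(x)
    return np.array([np.dot(x[k:], x[:N - k]) / N for k in range(p + 1)])

def ar_fit(r):
    """Yule-Walker fit: returns [1, a_1..a_p] given ACF values r[0..p]."""
    p = len(r) - 1
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.concatenate([[1.0], np.linalg.solve(R, -r[1:])])

def durbin_ma(x, q, L):
    """Durbin's method for MA(q) estimation (pick L > 4q)."""
    a_hat = ar_fit(biased_acf(x, L))      # step 1: high-order AR(L) fit of x
    b_hat = ar_fit(biased_acf(a_hat, q))  # step 2: AR(q) fit of {1, a_1..a_L}
    return b_hat                          # [1, b_1..b_q]

# Example: MA(1) process x[n] = v[n] + 0.5 v[n-1]
rng = np.random.default_rng(1)
v = rng.standard_normal(8192)
x = v + 0.5 * np.concatenate([[0.0], v[:-1]])
b = durbin_ma(x, q=1, L=8)
```

For this example the recovered b_1 should be close to 0.5: the AR(L) fit of the MA(1) process yields coefficients close to (-0.5)^k, and the AR(q) fit of that geometric sequence inverts the approximation back to B(z).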
19 3.2.4 ARMA Spectral Estimation
Recall the ARMA(p,q) model
$x[n]+\sum_{k=1}^{p}a_k x[n-k]=\sum_{k=0}^{q}b_k v[n-k]$
We define an ARMA(p,q) spectral estimator
$\hat{P}_{\mathrm{ARMA}}(f)=\hat{\sigma}_v^2\,\dfrac{\left|\sum_{k=0}^{q}\hat{b}_k e^{-j2\pi fk}\right|^2}{\left|1+\sum_{k=1}^{p}\hat{a}_k e^{-j2\pi fk}\right|^2}$
20 Modified Yule-Walker Equations
Recall the Yule-Walker equations for an ARMA(p,q) process. We can use the equations for lags k ≥ q+1, which do not involve the MA parameters, to solve for {a_l}: these are the modified Yule-Walker equations.
21 Estimating ARMA Parameters
1. By solving the modified Yule-Walker equations, we get {â_1, ..., â_p}.
2. To estimate {b_k}, we first filter {x[n]} with $\hat{A}(z)=1+\sum_{k=1}^{p}\hat{a}_k z^{-k}$, and model the output with an MA(q) model using Durbin's method.
22 Extension: LSMYWE Estimator
Performance: solving p modified Yule-Walker equations followed by Durbin's method may yield highly variable spectral estimates (especially when the matrix involving the ACF is nearly singular due to poor ACF estimates).
Improvement: use more than p equations to solve for {a_1, ..., a_p} in a least-squares sense. Using the Yule-Walker equations for k = q+1, ..., M:
$\min_{\mathbf{a}}\;\|\mathbf{t}-\mathbf{S}\mathbf{a}\|^2$, with least-squares solution $\mathbf{a}=(\mathbf{S}^H\mathbf{S})^{-1}\mathbf{S}^H\mathbf{t}$
Then obtain {b_i} by Durbin's method. This is the least-squares modified Yule-Walker equation (LSMYWE) estimator.
Ref: review Hayes book Sec. 2.3.6 on least-squares solutions.
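The least-squares step can be sketched as below (my own construction of S and t from the relation r(k) + Σ_l a_l r(k-l) = 0 for k > q, assuming real data; the demo ACF values are synthetic, chosen to be consistent with an AR pole at 0.8):

```python
import numpy as np

def lsmywe_ar(r, p, q, M):
    """Least-squares modified Yule-Walker estimate of the AR part of an
    ARMA(p,q) model from ACF values r[0..M], using lags k = q+1..M.
    Each row enforces r(k) + sum_l a_l r(k-l) = 0 (real data assumed)."""
    S = np.array([[r[abs(k - l)] for l in range(1, p + 1)]
                  for k in range(q + 1, M + 1)])
    t = -np.array([r[k] for k in range(q + 1, M + 1)])
    a, *_ = np.linalg.lstsq(S, t, rcond=None)  # a = (S^T S)^{-1} S^T t
    return np.concatenate([[1.0], a])

# Demo: ACF consistent with an ARMA(1,1) whose AR part has a pole at 0.8,
# i.e. r(k) = 0.8 r(k-1) for k >= 2 (r(0), r(1) chosen arbitrarily)
r = np.array([2.0, 1.0, 0.8, 0.64, 0.512, 0.4096])
a = lsmywe_ar(r, p=1, q=1, M=5)
```

Since every lag beyond q exactly satisfies the AR recursion here, the least-squares solve recovers a_1 = -0.8; with noisy ACF estimates, the extra equations average out the lag-to-lag errors that make the square (M = p+q) system fragile.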
23 Comparison of Different Methods: Revisit
Test case: a process consisting of narrowband components (sinusoids) and a broadband component (AR):
x[n] = cos(ω₁n) + cos(ω₂n) + cos(ω₃n) + z[n], where z[n] = a₁ z[n-1] + v[n], a₁ = 0.85
ω₁/2π = 0.05, ω₂/2π = 0.40, ω₃/2π = 0.42
N = 32 data points are available; the periodogram resolution is Δf = 1/32.
Examine the typical characteristics of various non-parametric and parametric spectral estimators (Fig..7 from the Lim/Oppenheim book).
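The test process can be generated as follows (a sketch; the AR noise variance is an assumption here, taken as 0.1, and the random seed is arbitrary). Note that ω₂/2π = 0.40 and ω₃/2π = 0.42 are separated by less than 1/N, which is why the short-record periodogram cannot resolve them:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 32
n = np.arange(N)
f1, f2, f3 = 0.05, 0.40, 0.42      # f3 - f2 = 0.02 < 1/N ~ 0.031

# Broadband AR(1) component z[n] = 0.85 z[n-1] + v[n] (variance assumed 0.1)
v = np.sqrt(0.1) * rng.standard_normal(N)
z = np.zeros(N)
for k in range(N):
    z[k] = 0.85 * (z[k - 1] if k > 0 else 0.0) + v[k]

# Narrowband components plus broadband component
x = (np.cos(2 * np.pi * f1 * n) + np.cos(2 * np.pi * f2 * n)
     + np.cos(2 * np.pi * f3 * n) + z)

# Periodogram of the 32-point record: the two closely spaced sinusoids
# merge into a single broadened peak
P = np.abs(np.fft.rfft(x)) ** 2 / N
```

Feeding `x` to the AR, MA, and ARMA estimators from the previous slides reproduces the qualitative behavior the figure illustrates: parametric estimators can separate the close sinusoids that the 1/32-resolution periodogram smears together.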
26 3.2.5 Model Order Selection
The best way to determine the model order is to base it on the physics of the data-generation process.
Example: speech processing. Studies show the vocal tract can be modeled as an all-pole filter having 4 resonances in a 4 kHz band, thus at least 4 pairs of complex-conjugate poles are necessary. Typically 10-12 poles are used in AR modeling for speech.
When no such knowledge is available, we can use a statistical test to estimate the order.
Ref. for in-depth exploration: "Model-order selection," by P. Stoica and Y. Selen, IEEE Signal Processing Magazine, July 2004.
27 Considerations for Order Selection
Modeling error: measures the (statistical) difference between the true data value and the approximation by the model, e.g., the linear prediction MSE in AR modeling. Usually, for a given type of model (e.g., AR, ARMA), the modeling error decreases as we increase the model order.
Balance between the modeling error and the number of model parameters to be estimated: the number of parameters that need to be estimated and represented increases as we use a higher model order (the cost of overmodeling). We can balance the modeling error against the cost of going to a higher order by imposing a penalty term that increases with the model order.
28 A Few Commonly Used Criteria
Akaike Information Criterion (AIC): a general estimate of the Kullback-Leibler divergence between the assumed and true p.d.f., with an order penalty term that increases linearly. Choose the model order that minimizes
$\mathrm{AIC}(i)=N\ln\hat{\rho}_i+2i$
where N is the size of the available data, $\hat{\rho}_i$ is the model error, and the model order is i = p for AR(p), i = p+q for ARMA(p,q).
Minimum Description Length (MDL) criterion: imposes a bigger penalty term to overcome AIC's overestimation; the estimated order converges to the true order as N goes to infinity:
$\mathrm{MDL}(i)=N\ln\hat{\rho}_i+i\log N$
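The two criteria can be sketched for AR order selection (a minimal illustration using Yule-Walker fits on the biased ACF; the function and variable names are my own, and the prediction-error variance of the AR(i) fit plays the role of the model error ρ_i):

```python
import numpy as np

def select_ar_order(x, max_order):
    """Fit AR(i) for i = 1..max_order and score each fit with
    AIC(i) = N ln(rho_i) + 2i and MDL(i) = N ln(rho_i) + i ln(N),
    where rho_i is the prediction-error (model-error) variance."""
    N = len(x)
    r = np.array([np.dot(x[k:], x[:N - k]) / N
                  for k in range(max_order + 1)])     # biased ACF
    aic, mdl = [], []
    for i in range(1, max_order + 1):
        R = np.array([[r[abs(m - l)] for l in range(i)] for m in range(i)])
        a = np.linalg.solve(R, -r[1:i + 1])           # Yule-Walker AR(i) fit
        rho = r[0] + np.dot(a, r[1:i + 1])            # prediction-error variance
        aic.append(N * np.log(rho) + 2 * i)
        mdl.append(N * np.log(rho) + i * np.log(N))
    return 1 + int(np.argmin(aic)), 1 + int(np.argmin(mdl))

# Example: AR(2) data; MDL's heavier penalty resists overestimating the order
rng = np.random.default_rng(0)
N = 4000
v = rng.standard_normal(N)
x = np.zeros(N)
for n in range(N):
    x[n] = v[n] + 1.2728 * x[n - 1] - 0.81 * x[n - 2]
p_aic, p_mdl = select_ar_order(x, 10)
```

On this strongly resonant AR(2) example both criteria typically pick order 2; as the slide notes, AIC is the one prone to drifting above the true order on other realizations.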
More informationLevinson Durbin Recursions: I
Levinson Durbin Recursions: I note: B&D and S&S say Durbin Levinson but Levinson Durbin is more commonly used (Levinson, 1947, and Durbin, 1960, are source articles sometimes just Levinson is used) recursions
More informationTime series models in the Frequency domain. The power spectrum, Spectral analysis
ime series models in the Frequency domain he power spectrum, Spectral analysis Relationship between the periodogram and the autocorrelations = + = ( ) ( ˆ α ˆ ) β I Yt cos t + Yt sin t t= t= ( ( ) ) cosλ
More informationLecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications
Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications Prof. Massimo Guidolin 20192 Financial Econometrics Winter/Spring 2018 Overview Moving average processes Autoregressive
More informationHeteroskedasticity and Autocorrelation Consistent Standard Errors
NBER Summer Institute Minicourse What s New in Econometrics: ime Series Lecture 9 July 6, 008 Heteroskedasticity and Autocorrelation Consistent Standard Errors Lecture 9, July, 008 Outline. What are HAC
More informationECE 450 Homework #3. 1. Given the joint density function f XY (x,y) = 0.5 1<x<2, 2<y< <x<4, 2<y<3 0 else
ECE 450 Homework #3 0. Consider the random variables X and Y, whose values are a function of the number showing when a single die is tossed, as show below: Exp. Outcome 1 3 4 5 6 X 3 3 4 4 Y 0 1 3 4 5
More informationTime Series Examples Sheet
Lent Term 2001 Richard Weber Time Series Examples Sheet This is the examples sheet for the M. Phil. course in Time Series. A copy can be found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/ Throughout,
More informationFeature extraction 2
Centre for Vision Speech & Signal Processing University of Surrey, Guildford GU2 7XH. Feature extraction 2 Dr Philip Jackson Linear prediction Perceptual linear prediction Comparison of feature methods
More informationLinear models. Chapter Overview. Linear process: A process {X n } is a linear process if it has the representation.
Chapter 2 Linear models 2.1 Overview Linear process: A process {X n } is a linear process if it has the representation X n = b j ɛ n j j=0 for all n, where ɛ n N(0, σ 2 ) (Gaussian distributed with zero
More informationCHAPTER 8 FORECASTING PRACTICE I
CHAPTER 8 FORECASTING PRACTICE I Sometimes we find time series with mixed AR and MA properties (ACF and PACF) We then can use mixed models: ARMA(p,q) These slides are based on: González-Rivera: Forecasting
More informationModelling using ARMA processes
Modelling using ARMA processes Step 1. ARMA model identification; Step 2. ARMA parameter estimation Step 3. ARMA model selection ; Step 4. ARMA model checking; Step 5. forecasting from ARMA models. 33
More informationA Subspace Approach to Estimation of. Measurements 1. Carlos E. Davila. Electrical Engineering Department, Southern Methodist University
EDICS category SP 1 A Subspace Approach to Estimation of Autoregressive Parameters From Noisy Measurements 1 Carlos E Davila Electrical Engineering Department, Southern Methodist University Dallas, Texas
More informationCalculation of ACVF for ARMA Process: I consider causal ARMA(p, q) defined by
Calculation of ACVF for ARMA Process: I consider causal ARMA(p, q) defined by φ(b)x t = θ(b)z t, {Z t } WN(0, σ 2 ) want to determine ACVF {γ(h)} for this process, which can be done using four complementary
More informationClassical Decomposition Model Revisited: I
Classical Decomposition Model Revisited: I recall classical decomposition model for time series Y t, namely, Y t = m t + s t + W t, where m t is trend; s t is periodic with known period s (i.e., s t s
More informationEEG- Signal Processing
Fatemeh Hadaeghi EEG- Signal Processing Lecture Notes for BSP, Chapter 5 Master Program Data Engineering 1 5 Introduction The complex patterns of neural activity, both in presence and absence of external
More informationrepresentation of speech
Digital Speech Processing Lectures 7-8 Time Domain Methods in Speech Processing 1 General Synthesis Model voiced sound amplitude Log Areas, Reflection Coefficients, Formants, Vocal Tract Polynomial, l
More informationComplement on Digital Spectral Analysis and Optimal Filtering: Theory and Exercises
Complement on Digital Spectral Analysis and Optimal Filtering: Theory and Exercises Random Signals Analysis (MVE136) Mats Viberg Department of Signals and Systems Chalmers University of Technology 412
More informationE : Lecture 1 Introduction
E85.2607: Lecture 1 Introduction 1 Administrivia 2 DSP review 3 Fun with Matlab E85.2607: Lecture 1 Introduction 2010-01-21 1 / 24 Course overview Advanced Digital Signal Theory Design, analysis, and implementation
More informationLecture 19 IIR Filters
Lecture 19 IIR Filters Fundamentals of Digital Signal Processing Spring, 2012 Wei-Ta Chu 2012/5/10 1 General IIR Difference Equation IIR system: infinite-impulse response system The most general class
More informationDISCRETE-TIME SIGNAL PROCESSING
THIRD EDITION DISCRETE-TIME SIGNAL PROCESSING ALAN V. OPPENHEIM MASSACHUSETTS INSTITUTE OF TECHNOLOGY RONALD W. SCHÄFER HEWLETT-PACKARD LABORATORIES Upper Saddle River Boston Columbus San Francisco New
More information6. Methods for Rational Spectra It is assumed that signals have rational spectra m k= m
6. Methods for Rational Spectra It is assumed that signals have rational spectra m k= m φ(ω) = γ ke jωk n k= n ρ, (23) jωk ke where γ k = γ k and ρ k = ρ k. Any continuous PSD can be approximated arbitrary
More informationAutoregressive Models Fourier Analysis Wavelets
Autoregressive Models Fourier Analysis Wavelets BFR Flood w/10yr smooth Spectrum Annual Max: Day of Water year Flood magnitude vs. timing Jain & Lall, 2000 Blacksmith Fork, Hyrum, UT Analyses of Flood
More informationSummary notes for EQ2300 Digital Signal Processing
Summary notes for EQ3 Digital Signal Processing allowed aid for final exams during 6 Joakim Jaldén, 6-- Prerequisites The DFT and the FFT. The discrete Fourier transform The discrete Fourier transform
More informationCircle a single answer for each multiple choice question. Your choice should be made clearly.
TEST #1 STA 4853 March 4, 215 Name: Please read the following directions. DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO Directions This exam is closed book and closed notes. There are 31 questions. Circle
More informationL7: Linear prediction of speech
L7: Linear prediction of speech Introduction Linear prediction Finding the linear prediction coefficients Alternative representations This lecture is based on [Dutoit and Marques, 2009, ch1; Taylor, 2009,
More informationAutomatic Spectral Analysis With Time Series Models
IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, VOL. 51, NO. 2, APRIL 2002 211 Automatic Spectral Analysis With Time Series Models Piet M. T. Broersen Abstract The increased computational speed and
More information