Part III Spectrum Estimation

ECE79-4 Part III Spectrum Estimation. 3. Parametric Methods for Spectral Estimation. Electrical & Computer Engineering, North Carolina State University. Acknowledgment: ECE79-4 slides were adapted from ENEE630 slides developed by Profs. K.J. Ray Liu and Min Wu at the University of Maryland, College Park. Contact: chauwai.wong@ncsu.edu.

Summary of Related Readings on Part III
Overview: Haykin 1.6, 1.10
3.1 Non-parametric method: Hayes 8.1; 8.2 (8.2.3, 8.2.5); 8.3
3.2 Parametric method: Hayes 8.5, 4.7; 8.4
3.3 Frequency estimation: Hayes 8.6
Review: on DSP and linear algebra, Hayes 2.2, 2.3; on probability and parameter estimation, Hayes 3.1, 3.2.
NCSU ECE79-4 Statistical Methods for Signal Analytics. Parametric spectral estimation [3]

Motivation
Implicit assumption by classical methods: classical methods apply the Fourier transform to either windowed data or a windowed autocorrelation function (ACF). They implicitly assume the unobserved data or ACF values outside the window are zero, which is not true in reality. Consequence of windowing: a smeared spectral estimate (leading to low resolution).
If prior knowledge about the process is available: we can use that prior knowledge to select a good model to approximate the process. We usually need to estimate fewer model parameters (than with nonparametric approaches) using the limited data points we have, and the model may allow us to better describe the process outside the window (instead of assuming zeros).

General Procedure of Parametric Methods
1. Select a model (based on prior knowledge).
2. Estimate the parameters of the assumed model.
3. Obtain the spectral estimate implied by the model (with the estimated parameters).

Spectral Estimation Using AR, MA, ARMA Models
Physical insight: the process is generated/approximated by filtering white noise with an LTI filter that has a rational transfer function H(z).
Use the observed data to estimate a few lags of r(k); larger lags of r(k) can be implicitly extrapolated by the model.
The relation between r(k) and the filter parameters {a_k} and {b_k} gives the parameter equations (from Section ..(6)). Solve the parameter equations to obtain the filter parameters, then use the p.s.d. implied by the model as our spectral estimate.
To deal with nonlinear parameter equations, try to convert/relate them to AR models, which have linear equations.

Review: Parameter Equations
Yule-Walker equations (for the AR process); parameter equations for the ARMA model; parameter equations for the MA model. [Equations shown on the slide.]

3.2.1 AR Spectral Estimation (1)
Review of AR process: the time series {x[n], x[n-1], …, x[n-M]} is a realization of an AR process of order M if it satisfies the difference equation
x[n] + a_1 x[n-1] + … + a_M x[n-M] = v[n]
where {v[n]} is a white noise process with variance \sigma^2.
Generating an AR process with parameters {a_i}: filter white noise {v[n]} with the all-pole filter 1/A(z).
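As a concrete sketch, the difference equation above can be simulated by filtering white noise with the all-pole filter 1/A(z); the AR(2) coefficients below are illustrative assumptions, not values from the slides:

```python
import numpy as np
from scipy.signal import lfilter

# Hypothetical AR(2) example: x[n] - 1.5 x[n-1] + 0.9 x[n-2] = v[n].
# In the slide's convention x[n] + a1 x[n-1] + a2 x[n-2] = v[n],
# this means a1 = -1.5, a2 = 0.9.
rng = np.random.default_rng(0)
a = np.array([1.0, -1.5, 0.9])       # denominator A(z) = 1 + a1 z^-1 + a2 z^-2
N = 4096
v = rng.standard_normal(N)           # unit-variance white noise input
x = lfilter([1.0], a, v)             # AR process = white noise through 1/A(z)
```

The poles of 1/A(z) here sit at radius sqrt(0.9), so the output has a strong resonance and much larger variance than the driving noise.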

P.S.D. of an AR Process
Recall: the p.s.d. of an AR process {x[n]} is given by
\hat{P}_{AR}(z) = \sigma^2 / [A(z) A^*(1/z^*)], evaluated at z = e^{j 2\pi f}:
\hat{P}_{AR}(f) = \sigma^2 / |1 + \sum_{k=1}^{M} a_k e^{-j 2\pi f k}|^2

Procedure of AR Spectral Estimation
1. Observe the available data points x[0], …, x[N-1], and determine the AR process order p.
2. Estimate the autocorrelation function (ACF) for k = 0, …, p:
Biased (low variance): \hat{r}(k) = (1/N) \sum_{n=0}^{N-1-k} x[n+k] x^*[n]
Unbiased (may not be non-negative definite): \hat{r}(k) = (1/(N-k)) \sum_{n=0}^{N-1-k} x[n+k] x^*[n]
3. Solve for {a_k} from the Yule-Walker equations or the normal equations of forward linear prediction. Recall that for an AR process, the normal equations of the FLP are equivalent to the Yule-Walker equations.
4. Obtain the power spectrum:
\hat{P}_{AR}(f) = \sigma^2 / |1 + \sum_{k=1}^{p} a_k e^{-j 2\pi f k}|^2
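The four steps above can be sketched as follows; the helper names (`biased_acf`, `ar_psd`) and the AR(1) example data are assumptions for illustration, not from the course:

```python
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.signal import lfilter

def biased_acf(x, maxlag):
    """Biased ACF estimate r_hat(k) = (1/N) sum_n x[n+k] x*[n], k = 0..maxlag."""
    N = len(x)
    return np.array([np.dot(x[k:], np.conj(x[:N - k])).real / N
                     for k in range(maxlag + 1)])

def ar_psd(x, p, nfft=512):
    """AR(p) spectral estimate via Yule-Walker: sigma2 / |1 + sum a_k e^{-j2pi f k}|^2."""
    r = biased_acf(x, p)
    # Yule-Walker system: Toeplitz(r[0..p-1]) a = -r[1..p]
    a = solve_toeplitz(r[:p], -r[1:p + 1])
    sigma2 = r[0] + np.dot(a, r[1:p + 1])          # white-noise variance estimate
    f = np.arange(nfft) / nfft - 0.5               # frequencies in [-1/2, 1/2)
    A = 1 + np.exp(-2j * np.pi * np.outer(f, np.arange(1, p + 1))) @ a
    return f, sigma2 / np.abs(A) ** 2

# Example: AR(1) data x[n] = 0.8 x[n-1] + v[n]; spectrum should peak at f = 0
rng = np.random.default_rng(1)
x = lfilter([1.0], [1.0, -0.8], rng.standard_normal(8192))
f, P = ar_psd(x, p=1)
```

Because the biased ACF estimate is used, the Toeplitz system is guaranteed non-negative definite and the resulting spectrum is positive everywhere.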

3.2.2 Maximum Entropy Spectral Estimation (MESE)
Viewpoint: extrapolation of the ACF. {r[0], r[1], …, r[p]} is known; there are generally an infinite number of possible extrapolations for r(k) at larger lags. As long as {r[p+1], r[p+2], …} guarantee that the correlation matrix is non-negative definite, they all form valid ACFs for a w.s.s. process.
Maximum entropy principle: perform the extrapolation such that the time series characterized by the extrapolated ACF has maximum entropy, i.e., the time series will be the least constrained, and thus the most random, among all series sharing the same first (p+1) ACF values. Maximizing entropy leads the estimated p.s.d. to be the smoothest one; recall that a white noise process has a flat p.s.d.

MESE for a Gaussian Process: Formulation
For a Gaussian random process, the entropy per sample is proportional to
\int_{-1/2}^{1/2} \ln P(f) \, df
The maximum entropy spectral estimate is therefore
\max_{P(f)} \int_{-1/2}^{1/2} \ln P(f) \, df
subject to
\int_{-1/2}^{1/2} P(f) e^{j 2\pi f k} \, df = r(k), for k = 0, 1, …, p

MESE for a Gaussian Process: Solution
Using the Lagrange multiplier technique, the solution can be found as
\hat{P}_{ME}(f) = \sigma^2 / |1 + \sum_{k=1}^{p} a_k e^{-j 2\pi f k}|^2
where {a_k} are found by solving the Yule-Walker equations given the ACF values r(0), …, r(p).
For Gaussian processes, the MESE is thus equivalent to the AR spectral estimator, and \hat{P}_{ME}(f) is an all-pole spectrum. The two estimators differ only in their assumptions on the process: Gaussian vs. AR.

3.2.3 MA Spectral Estimation
Recall two important results on MA processes: (1) solving for {b_k} given {r(k)} requires solving a set of nonlinear equations; (2) an MA process can be approximated by an AR process of sufficiently high order.
An MA(q) model
x[n] = \sum_{k=0}^{q} b_k v[n-k], with B(z) = \sum_{k=0}^{q} b_k z^{-k}
can be used to define an MA spectral estimator
\hat{P}_{MA}(f) = \sigma^2 |\sum_{k=0}^{q} b_k e^{-j 2\pi f k}|^2

Basic Idea to Avoid Solving Nonlinear Equations
Consider two processes:
Process #1: we observe N samples and need to perform the spectral estimate. We first model it as a high-order AR process, generated by a 1/A(z) filter.
Process #2: an MA process generated by an A(z) filter. Since we know A(z), we can know process #2's autocorrelation function. We then model process #2 as an AR(q) process, so the corresponding filter would be 1/B(z).

Use an AR Model to Help Find the MA Parameters
For simplicity, consider real coefficients for the MA model. Note that
P_{MA}(z) = \sigma^2 B(z) B(z^{-1})
To approximate it with an AR(L) model, i.e.,
P_{MA}(z) \approx \sigma^2 / [A(z) A(z^{-1})], where A(z) = 1 + \sum_{k=1}^{L} a_k z^{-k},
we need
A(z) A(z^{-1}) \approx 1 / [B(z) B(z^{-1})]   (LHS of order L, RHS of order q)
The RHS represents the power spectrum of an AR(q) process, and the inverse z-transform of the LHS is the ACF of that AR(q) process.

Use AR to Help Find MA Parameters (cont'd)
A random process with power spectrum A(z) A(z^{-1}) can be viewed as white noise filtered by A(z), and has an autocorrelation proportional to
\sum_{n=0}^{L-k} a_n a_{n+k}  for lag k
Knowing this autocorrelation function, we can use the Levinson-Durbin recursion to find the optimal linear prediction parameters for the process (or equivalently, the parameters of its AR approximation). Thus we get {b_k} from A(z) A(z^{-1}) \approx 1 / [B(z) B(z^{-1})].

Durbin's Method
1. Use the Levinson-Durbin recursion to fit a high-order AR(L) model to the observed data sequence {x[0], …, x[N-1]} (often pick L > 4q). We use the biased ACF estimate here to ensure non-negative definiteness and smaller variance than the unbiased estimate (which divides by N-k).

Durbin's Method (cont'd)
2. Fit the coefficient sequence {1, \hat{a}_1, \hat{a}_2, …, \hat{a}_L} to an AR(q) model. The result {b_k} is the set of estimated MA parameters for the original {x[n]}.
Note: we add a 1/(L+1) factor to allow the interpretation of r_a(k) as an autocorrelation function estimate.
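The two steps of Durbin's method can be sketched as below, a minimal implementation under the assumptions stated in the comments (`yule_walker` is my own helper, and the MA(1) example data are illustrative):

```python
import numpy as np
from scipy.signal import lfilter
from scipy.linalg import solve_toeplitz

def yule_walker(x, p):
    """Fit AR(p) to a sequence x via biased ACF + Yule-Walker; returns a_1..a_p."""
    N = len(x)
    r = np.array([np.dot(x[k:], x[:N - k]) / N for k in range(p + 1)])
    return solve_toeplitz(r[:p], -r[1:p + 1])

def durbin_ma(x, q, L=None):
    """Estimate MA(q) parameters b_1..b_q (with b_0 = 1) from data x."""
    L = L or 4 * q + 1                     # step 1: high-order AR fit (L > 4q)
    a_hat = yule_walker(x, L)
    seq = np.concatenate(([1.0], a_hat))   # step 2: treat {1, a_1..a_L} as data...
    return yule_walker(seq, q)             # ...and fit an AR(q) model to it

# Example: MA(1) data x[n] = v[n] + 0.5 v[n-1]; durbin_ma should recover b1 = 0.5
rng = np.random.default_rng(2)
x = lfilter([1.0, 0.5], [1.0], rng.standard_normal(20000))
b_hat = durbin_ma(x, q=1)
```

The 1/(L+1) normalization from the slide cancels in the Yule-Walker solve, so it is omitted here; only the ratio of ACF values matters for the coefficients.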

3.2.4 ARMA Spectral Estimation
Recall the ARMA(p,q) model
x[n] + \sum_{l=1}^{p} a_l x[n-l] = \sum_{k=0}^{q} b_k v[n-k]
We define an ARMA(p,q) spectral estimator
\hat{P}_{ARMA}(f) = \hat{\sigma}^2 |\sum_{k=0}^{q} \hat{b}_k e^{-j 2\pi f k}|^2 / |1 + \sum_{l=1}^{p} \hat{a}_l e^{-j 2\pi f l}|^2

Modified Yule-Walker Equations
Recall the Yule-Walker equations for an ARMA(p,q) process. For lags k ≥ q+1, the MA part drops out, so we can use those equations to solve for {a_l}; these are the modified Yule-Walker equations.

Estimating ARMA Parameters
1. By solving the modified Yule-Walker equations, we get {\hat{a}_l}.
2. To estimate {b_k}: first filter {x[n]} with \hat{A}(z) = 1 + \sum_{l=1}^{p} \hat{a}_l z^{-l}, then model the output with an MA(q) model using Durbin's method.

Extension: LSMYWE Estimator
Performance of solving p modified Yule-Walker equations followed by Durbin's method: it may yield highly variable spectral estimates (especially when the matrix of ACF values is nearly singular due to poor ACF estimates).
Improvement: use more than p equations to solve for {a_1, …, a_p} in a least-squares sense. Use the Yule-Walker equations for k = q+1, …, M:
\min_{a} \| t - S a \|^2, with least-squares solution a = (S^H S)^{-1} S^H t
Then obtain {b_k} by Durbin's method. This is the least-squares modified Yule-Walker equation (LSMYWE) estimator.
Ref: review of the least-squares solution in Hayes book Sec. 2.3.6.
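A sketch of the least-squares step above: the rows of S and t follow the modified Yule-Walker relation r(k) + \sum_l a_l r(k-l) = 0 for k > q. The function name and the ARMA(1,1) example are my own assumptions:

```python
import numpy as np
from scipy.signal import lfilter

def ls_myw(r, p, q, M):
    """AR part a_1..a_p of an ARMA(p,q) model from ACF values r[0..M], least squares."""
    # Row for lag k: r(k) + sum_l a_l r(k-l) = 0  =>  S a = t with t = -r(k)
    rows = range(q + 1, M + 1)
    S = np.array([[r[abs(k - l)] for l in range(1, p + 1)] for k in rows])
    t = -np.array([r[k] for k in rows])
    a, *_ = np.linalg.lstsq(S, t, rcond=None)   # a = (S^H S)^{-1} S^H t
    return a

# Example: ARMA(1,1) data with AR coefficient a1 = -0.7 (pole at z = 0.7)
rng = np.random.default_rng(3)
x = lfilter([1.0, 0.4], [1.0, -0.7], rng.standard_normal(50000))
N = len(x)
r = np.array([np.dot(x[k:], x[:N - k]) / N for k in range(12)])  # biased ACF
a_hat = ls_myw(r, p=1, q=1, M=10)
```

Using M > p+q overdetermines the system, which is exactly what stabilizes the estimate when individual ACF values are noisy.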

Comparison of Different Methods: Revisited
Test case: a process consisting of narrowband components (sinusoids) and a broadband component (AR):
x[n] = cos(\omega_1 n) + cos(\omega_2 n) + cos(\omega_3 n) + z[n]
where z[n] = a z[n-1] + v[n], a = 0.85, \sigma_v^2 = 0.1, and \omega_1/2\pi = 0.05, \omega_2/2\pi = 0.40, \omega_3/2\pi = 0.42.
N = 32 data points are available, so the periodogram resolution is \Delta f = 1/32.
Examine the typical characteristics of various non-parametric and parametric spectral estimators (Fig. .7 from the Lim/Oppenheim book).
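The test process can be generated approximately as below; note that several digits were lost in transcription, so N = 32, \sigma_v^2 = 0.1, and \omega_3/2\pi = 0.42 are reconstructions, not certainties:

```python
import numpy as np
from scipy.signal import lfilter

# Reconstructed test case (assumed values: N = 32, var(v) = 0.1, f3 = 0.42):
# three sinusoids plus an AR(1) broadband component z[n] = 0.85 z[n-1] + v[n].
rng = np.random.default_rng(5)
N = 32
n = np.arange(N)
z = lfilter([1.0], [1.0, -0.85], np.sqrt(0.1) * rng.standard_normal(N))
x = (np.cos(2 * np.pi * 0.05 * n) + np.cos(2 * np.pi * 0.40 * n)
     + np.cos(2 * np.pi * 0.42 * n) + z)
# Periodogram resolution is 1/N = 1/32, so the components at f = 0.40 and
# f = 0.42 (spacing 0.02 < 1/32) cannot be resolved by the periodogram.
P_per = np.abs(np.fft.rfft(x)) ** 2 / N
```

This is the regime where parametric estimators pay off: with only 32 samples, a model-based spectrum can separate the two closely spaced sinusoids that the periodogram smears together.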

3.2.5 Model Order Selection
The best way to determine the model order is to base it on the physics of the data-generation process.
Example: speech processing. Studies show the vocal tract can be modeled as an all-pole filter having 4 resonances in a 4 kHz band, so at least 4 pairs of complex-conjugate poles are necessary. Typically 10-12 poles are used in AR modeling of speech.
When no such knowledge is available, we can use a statistical test to estimate the order.
Ref. for in-depth exploration: "Model-order selection," by P. Stoica and Y. Selen, IEEE Signal Processing Magazine, July 2004.

Considerations for Order Selection
Modeling error: measures the (statistical) difference between the true data value and the model's approximation, e.g., the linear-prediction MSE in AR modeling. Usually, for a given type of model (e.g., AR, ARMA), the modeling error decreases as we increase the model order.
Balance between the modeling error and the number of model parameters to be estimated: the number of parameters that must be estimated and represented increases as we use a higher model order, and overmodeling has its own cost. We can balance the modeling error against the cost of going to a higher order by imposing a penalty term that increases with the model order.

A Few Commonly Used Criteria
Akaike Information Criterion (AIC): a general estimate of the Kullback-Leibler divergence between the assumed and true p.d.f., with an order-penalty term that increases linearly. Choose the model order that minimizes
AIC(i) = N \ln \hat{\rho}_i + 2 i
where N is the size of the available data, \hat{\rho}_i is the model error, and the model order is i = p for AR(p) or i = p+q for ARMA(p,q).
Minimum Description Length (MDL) criterion: imposes a bigger penalty term to overcome AIC's overestimation; the estimated order converges to the true order as N goes to infinity:
MDL(i) = N \ln \hat{\rho}_i + (\log N) \, i
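The two criteria can be sketched for AR order selection as follows (a hypothetical example: the AR(2) test data and helper names are not from the slides, and the prediction-error variance from a Yule-Walker fit serves as the model error):

```python
import numpy as np
from scipy.signal import lfilter
from scipy.linalg import solve_toeplitz

def ar_error_variance(x, p):
    """Prediction-error variance of an AR(p) fit (Yule-Walker, biased ACF)."""
    N = len(x)
    r = np.array([np.dot(x[k:], x[:N - k]) / N for k in range(p + 1)])
    a = solve_toeplitz(r[:p], -r[1:p + 1])
    return r[0] + np.dot(a, r[1:p + 1])

def select_order(x, pmax):
    """Return the AR orders minimizing AIC and MDL over 1..pmax."""
    N = len(x)
    orders = np.arange(1, pmax + 1)
    aic = np.array([N * np.log(ar_error_variance(x, p)) + 2 * p for p in orders])
    mdl = np.array([N * np.log(ar_error_variance(x, p)) + p * np.log(N) for p in orders])
    return orders[np.argmin(aic)], orders[np.argmin(mdl)]

# Example: true process is AR(2); both criteria should select an order >= 2,
# with MDL's heavier penalty keeping its choice at or below AIC's.
rng = np.random.default_rng(4)
x = lfilter([1.0], [1.0, -1.5, 0.9], rng.standard_normal(5000))
p_aic, p_mdl = select_order(x, pmax=10)
```

Since the error variance is non-increasing in p while MDL's penalty log N per coefficient exceeds AIC's 2 for any reasonable N, MDL never chooses a higher order than AIC, matching the overestimation remark above.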