ESE 524 Detection and Estimation Theory


ESE 524 Detection and Estimation Theory
Joseph A. O'Sullivan
Samuel C. Sachs Professor
Electronic Systems and Signals Research Laboratory
Electrical and Systems Engineering, Washington University
2 Urbauer Hall, 34-935-473 (Lynda answers), jao@wustl.edu
J. A. O'S. ESE 524, Lecture 4, 3/3/9

Linear Estimation

Problem 1: x and y are jointly Gaussian. Find the expected value of x given y.

Since x and y are jointly Gaussian, the posterior is Gaussian and the MMSE estimate equals the MAP estimate:

\[
\begin{bmatrix} x \\ y \end{bmatrix} \sim N\!\left( \begin{bmatrix} \mu_x \\ \mu_y \end{bmatrix}, \begin{bmatrix} K_{xx} & K_{xy} \\ K_{yx} & K_{yy} \end{bmatrix} \right)
\]

\[
\ln p(x,y) = -\frac{n+m}{2}\ln(2\pi) - \frac{1}{2}\ln\det K - \frac{1}{2} \begin{bmatrix} x-\mu_x \\ y-\mu_y \end{bmatrix}^T \begin{bmatrix} K_{xx} & K_{xy} \\ K_{yx} & K_{yy} \end{bmatrix}^{-1} \begin{bmatrix} x-\mu_x \\ y-\mu_y \end{bmatrix}
\]

The MAP estimate sets the gradient with respect to x to zero:

\[
\nabla_x \ln p(x,y) = -\begin{bmatrix} I & 0 \end{bmatrix} \begin{bmatrix} K_{xx} & K_{xy} \\ K_{yx} & K_{yy} \end{bmatrix}^{-1} \begin{bmatrix} x-\mu_x \\ y-\mu_y \end{bmatrix} = 0
\]

Linear Estimation

The solution for the MMSE estimate uses the block matrix inversion expression. The result is simple, easily interpretable, and fundamental. With the Schur complement \( S = K_{xx} - K_{xy} K_{yy}^{-1} K_{yx} \),

\[
\begin{bmatrix} K_{xx} & K_{xy} \\ K_{yx} & K_{yy} \end{bmatrix}^{-1}
= \begin{bmatrix} S^{-1} & -S^{-1} K_{xy} K_{yy}^{-1} \\ -K_{yy}^{-1} K_{yx} S^{-1} & K_{yy}^{-1} + K_{yy}^{-1} K_{yx} S^{-1} K_{xy} K_{yy}^{-1} \end{bmatrix}
\]

Substituting into the gradient condition and solving for x yields

\[
\hat{x}_{MMSE} = \mu_x + K_{xy} K_{yy}^{-1} (y - \mu_y)
\]
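The closed-form conditional mean above can be checked numerically. This is a minimal sketch, not from the slides, assuming NumPy and made-up scalar covariance values:

```python
import numpy as np

# Illustrative jointly Gaussian model; all numbers below are assumptions.
mu_x, mu_y = np.array([1.0]), np.array([0.0])
Kxx = np.array([[2.0]])
Kxy = np.array([[0.8]])
Kyy = np.array([[1.0]])

# Monte Carlo check: x_hat = mu_x + Kxy Kyy^{-1} (y - mu_y) should achieve
# MSE close to the error covariance Kxx - Kxy Kyy^{-1} Kyx = 1.36,
# and should beat the prior mean used alone.
rng = np.random.default_rng(0)
K = np.block([[Kxx, Kxy], [Kxy.T, Kyy]])
samples = rng.multivariate_normal(np.concatenate([mu_x, mu_y]), K, size=20000)
x_s, y_s = samples[:, :1], samples[:, 1:]

est = mu_x + (y_s - mu_y) @ np.linalg.solve(Kyy, Kxy.T)
mse_est = np.mean((x_s - est) ** 2)
mse_prior = np.mean((x_s - mu_x) ** 2)
```

The sample MSE of the linear estimate lands near the theoretical error covariance 1.36, well below the prior variance of 2.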

Linear Estimation, Updated

\[
\hat{x}_{MMSE} = \mu_x + K_{xy} K_{yy}^{-1} (y - \mu_y)
\]

The posterior mean equals the prior mean plus a correction: the new information \( y - \mu_y \) is attenuated by the uncertainty \( K_{yy}^{-1} \) and amplified by the correlation \( K_{xy} \).

Linear Estimation

Problem 2: Assume that the mean vectors and joint covariance matrix for x and y are known. Among all linear estimates of x as a function of y, find the one that minimizes the MSE. Assume x and y are zero-mean random variables with known joint covariance matrix, and find the linear estimator A that minimizes the trace of the error covariance matrix:

\[
\min_A \operatorname{tr} E\left[ (x - Ay)(x - Ay)^T \right]
= \min_A \operatorname{tr}\left( K_{xx} - K_{xy} A^T - A K_{yx} + A K_{yy} A^T \right)
\]

Completing the square in A,

\[
= \min_A \operatorname{tr}\left( K_{xx} - K_{xy} K_{yy}^{-1} K_{yx} + (A - K_{xy} K_{yy}^{-1}) K_{yy} (A - K_{xy} K_{yy}^{-1})^T \right)
\]

\[
= \operatorname{tr}\left( K_{xx} - K_{xy} K_{yy}^{-1} K_{yx} \right), \quad \text{with minimum at } A = K_{xy} K_{yy}^{-1}
\]
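The completing-the-square argument can be sanity-checked numerically: the error trace is quadratic in A with its minimum at A = Kxy Kyy^{-1}. A small sketch with assumed 2×2 covariance blocks (NumPy):

```python
import numpy as np

# Assumed covariance blocks, for illustration only.
Kxx = np.array([[2.0, 0.3], [0.3, 1.5]])
Kxy = np.array([[0.7, 0.2], [0.1, 0.5]])
Kyy = np.array([[1.0, 0.2], [0.2, 1.2]])

def error_trace(A):
    """tr E[(x - A y)(x - A y)^T] for zero-mean x, y with the blocks above."""
    return np.trace(Kxx - Kxy @ A.T - A @ Kxy.T + A @ Kyy @ A.T)

A_opt = Kxy @ np.linalg.inv(Kyy)
t_opt = error_trace(A_opt)

# Random perturbations of the optimal A should never reduce the error trace.
rng = np.random.default_rng(1)
t_perturbed = min(error_trace(A_opt + 0.1 * rng.standard_normal((2, 2)))
                  for _ in range(50))
```

Because the quadratic term \( (A - K_{xy}K_{yy}^{-1}) K_{yy} (\cdot)^T \) is positive semidefinite, every perturbed estimator has an error trace at least as large as the optimum.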

Linear Estimation

Problem 1: x and y are jointly Gaussian. Find the expected value of x given y.
Problem 2: x and y have known second-order statistics. Among all linear estimates of x as a function of y, find the one that minimizes the MSE.

Answer 2 = Answer 1.

Fundamental (orthogonality) property: the error in the estimate is orthogonal to the variables used in the estimate.

Error covariance matrix:

\[
E\left[ \left( x - E[x \mid y] \right)\left( x - E[x \mid y] \right)^T \right] = K_{xx} - K_{xy} K_{yy}^{-1} K_{yx}
\]

Orthogonality:

\[
E\left[ \left( x - E[x \mid y] \right)\left( y - E[y] \right)^T \right] = 0
\]
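The orthogonality property has a one-line algebraic check: for A = Kxy Kyy^{-1}, the error cross-covariance E[(x − Ay) y^T] = Kxy − A Kyy vanishes identically. A sketch with assumed covariance blocks:

```python
import numpy as np

# Assumed covariance blocks (zero-mean x and y), for illustration only.
Kxy = np.array([[0.7, 0.2], [0.1, 0.5]])
Kyy = np.array([[1.0, 0.2], [0.2, 1.2]])

A = Kxy @ np.linalg.inv(Kyy)
# E[(x - A y) y^T] = Kxy - A Kyy: should be the zero matrix.
cross_cov = Kxy - A @ Kyy
```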

Recursive Linear Estimation: Data Model and Problem Statements

Suppose a zero-mean, stationary Gaussian random process (GRP) with known covariance function is given.

Problem 1: Find the minimum mean square error estimate of the present value of the GRP given the p previous values.
a: Derive the result as a transversal filter and derive the order-recursive updates (from p to p+1).
b: Derive the result as a lattice filter and derive the order-recursive updates for the coefficients (reflection parameters).

Problem 2: Assume that the GRP satisfies an autoregressive (AR) model of order p.
2a: Find the maximum likelihood estimates of the AR parameters, including the time-recursive and order-recursive updates.
2b: Find the time- and order-recursive updates for the lattice filter coefficients (reflection parameters).

Linear Prediction Theory

GRP: jointly Gaussian, with \( E[r_n] = 0 \) and \( E[r_n r_{n-l}] = c_l \). Distributions for any subset of the random variables are jointly Gaussian, so the linear estimation results apply; the estimate of the current value is a linear combination of the p previous values:

\[
E[r_n \mid r_{n-1}, r_{n-2}, \ldots, r_{n-p}] = w_1 r_{n-1} + w_2 r_{n-2} + \cdots + w_p r_{n-p} = w^T r(n-1)
\]

where \( r(n-1) = [r_{n-1} \; r_{n-2} \; \cdots \; r_{n-p}]^T \) and \( w = [w_1 \; w_2 \; \cdots \; w_p]^T \). The linear combination defines a transversal filter, implemented through a tapped delay line.

The stationary covariance matrix is Toeplitz,

\[
\Sigma_p = \left( c_{|i-j|} \right)_{i,j=1}^{p}, \qquad c = [c_1 \; c_2 \; \cdots \; c_p]^T,
\]

so the coefficients of the transversal filter are independent of time. The prediction error variance is

\[
\alpha_p = E\left[ \left( r_n - w^T r(n-1) \right)^2 \right]
\]

Linear Prediction Theory

The covariance matrix is Toeplitz (constant diagonals). The order recursion derives from partitioning the covariance matrix by order; there are two standard partitions. The second uses the exchange matrix J, which has ones along the antidiagonal and zeros elsewhere. (The notation is somewhat loose on subscripts.)

\[
\Sigma_{p+1} = \begin{bmatrix} c_0 & c^T \\ c & \Sigma_p \end{bmatrix}
= \begin{bmatrix} \Sigma_p & Jc \\ (Jc)^T & c_0 \end{bmatrix},
\qquad c = [c_1 \; c_2 \; \cdots \; c_p]^T
\]

Linear Prediction Theory

The equations resulting from the orthogonality property are the normal equations. Terminology: the forward prediction error of order p is \( r_n - w^T r(n-1) \).

\[
E\left[ \left( r_n - w^T r(n-1) \right) r(n-1)^T \right] = 0 \quad \Rightarrow \quad \Sigma_p w = c
\]

The forward prediction error filter is \( a = [1 \;\; -w^T]^T \), and

\[
\Sigma_{p+1} a = \begin{bmatrix} \alpha_p \\ 0 \end{bmatrix}, \qquad \alpha_p = c_0 - c^T w
\]
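The normal equations can be solved directly for a concrete covariance sequence. A sketch assuming an AR(1)-style covariance c_l = ρ^|l| (an assumption for illustration), for which the order-p predictor should reduce to w = [ρ, 0, …, 0]:

```python
import numpy as np

rho, p = 0.9, 4
c = rho ** np.arange(p + 1)  # covariance sequence c_0 ... c_p (assumed AR(1) form)

# Toeplitz covariance matrix Sigma_p with entries c_{|i-j|}.
Sigma = np.array([[c[abs(i - j)] for j in range(p)] for i in range(p)])

w = np.linalg.solve(Sigma, c[1:])  # normal equations: Sigma_p w = [c_1 ... c_p]
alpha = c[0] - w @ c[1:]           # error variance alpha_p = c_0 - c^T w
```

For this covariance the recursion terminates early in effect: only the first weight is nonzero and the error variance is 1 − ρ² = 0.19.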

Order Update on Inverse: Rank-One Update

Partitioning \( \Sigma_{p+1} \) with the new row and column on top and applying the block matrix inversion formula gives a rank-one update of the inverse:

\[
\Sigma_{p+1} = \begin{bmatrix} c_0 & c^T \\ c & \Sigma_p \end{bmatrix},
\qquad
\Sigma_{p+1}^{-1} = \begin{bmatrix} 0 & 0 \\ 0 & \Sigma_p^{-1} \end{bmatrix} + \frac{1}{\alpha_p} \, a a^T,
\qquad
a = \begin{bmatrix} 1 \\ -\Sigma_p^{-1} c \end{bmatrix}
\]

where \( \alpha_p = c_0 - c^T \Sigma_p^{-1} c \) is the Schur complement. The exchange matrix satisfies \( J^T = J \), \( JJ = I \), and \( J \Sigma_p J = \Sigma_p \) (the Toeplitz matrix is persymmetric), so \( J \Sigma_p^{-1} J = \Sigma_p^{-1} \).

Order Update on Inverse: Rank-One Update

Partitioning with the new row and column at the bottom gives the companion rank-one update in terms of the backward filter b:

\[
\Sigma_{p+1} = \begin{bmatrix} \Sigma_p & Jc \\ (Jc)^T & c_0 \end{bmatrix},
\qquad
\Sigma_{p+1}^{-1} = \begin{bmatrix} \Sigma_p^{-1} & 0 \\ 0 & 0 \end{bmatrix} + \frac{1}{\alpha_p} \, b b^T,
\qquad
b = \begin{bmatrix} -Jw \\ 1 \end{bmatrix} = Ja
\]

The persymmetry relation \( \Sigma_p^{-1} J = J \Sigma_p^{-1} \) ensures that the same error variance \( \alpha_p \) appears in both partitions.

Backward Prediction

Predict \( r_{n-p-1} \) from the p following values. This defines the backward prediction error of order p and the backward prediction error filter; the exchange matrix comes in again. The error variance is the same as in forward prediction. This is the basis for the order update: update the backward and forward prediction error coefficients together.

\[
E\left[ \left( r_{n-p-1} - (Jw)^T r(n-1) \right) r(n-1)^T \right] = 0,
\qquad
b = \begin{bmatrix} -Jw \\ 1 \end{bmatrix} = Ja
\]

Applying \( \Sigma_{p+2} \) to the zero-padded filters exposes the cross term \( \Delta_p \):

\[
\Sigma_{p+2} \begin{bmatrix} a \\ 0 \end{bmatrix} = \begin{bmatrix} \alpha_p \\ 0 \\ \Delta_p \end{bmatrix},
\qquad
\Sigma_{p+2} \begin{bmatrix} 0 \\ b \end{bmatrix} = \begin{bmatrix} \Delta_p \\ 0 \\ \alpha_p \end{bmatrix},
\qquad
\Delta_p = c_{p+1} - c^T J w
\]

Order Update

For the order update, combine the equations to cancel the top and bottom terms. The terms in parentheses must be the order-updates of the forward and backward prediction error filters:

\[
\Sigma_{p+2} \left( \begin{bmatrix} a \\ 0 \end{bmatrix} - \frac{\Delta_p}{\alpha_p} \begin{bmatrix} 0 \\ b \end{bmatrix} \right)
= \begin{bmatrix} \alpha_p - \Delta_p^2/\alpha_p \\ 0 \\ 0 \end{bmatrix}
\quad \Rightarrow \quad
a_{p+1} = \begin{bmatrix} a \\ 0 \end{bmatrix} - \frac{\Delta_p}{\alpha_p} \begin{bmatrix} 0 \\ b \end{bmatrix}
\]

\[
\Sigma_{p+2} \left( \begin{bmatrix} 0 \\ b \end{bmatrix} - \frac{\Delta_p}{\alpha_p} \begin{bmatrix} a \\ 0 \end{bmatrix} \right)
= \begin{bmatrix} 0 \\ 0 \\ \alpha_p - \Delta_p^2/\alpha_p \end{bmatrix}
\quad \Rightarrow \quad
b_{p+1} = \begin{bmatrix} 0 \\ b \end{bmatrix} - \frac{\Delta_p}{\alpha_p} \begin{bmatrix} a \\ 0 \end{bmatrix}
\]

\[
\alpha_{p+1} = \alpha_p - \frac{\Delta_p^2}{\alpha_p}
\]

Order Update

The order update requires p multiplies to find \( \Delta_p \), one division plus p multiplies to get the filter update, and two multiplies to get the next error variance, for a total complexity of 2p + 3 per order. Summing from order 1 to p gives p(p+1) + 3p. This is the transversal filter version of linear prediction.

1. Initialization: p = 0, \( a_0 = b_0 = 1 \), \( \alpha_0 = c_0 \).
2. Update the reflection coefficient and error variance:
\[
\Delta_p = \sum_{j=0}^{p} a_{p,j} \, c_{p+1-j}, \qquad \alpha_{p+1} = \alpha_p - \frac{\Delta_p^2}{\alpha_p}
\]
3. Update the prediction error filters:
\[
a_{p+1} = \begin{bmatrix} a_p \\ 0 \end{bmatrix} - \frac{\Delta_p}{\alpha_p} \begin{bmatrix} 0 \\ b_p \end{bmatrix},
\qquad
b_{p+1} = \begin{bmatrix} 0 \\ b_p \end{bmatrix} - \frac{\Delta_p}{\alpha_p} \begin{bmatrix} a_p \\ 0 \end{bmatrix}
\]
4. Recursion step: p → p + 1; return to step 2.
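The four steps above can be sketched directly in code. This is a minimal Levinson-style recursion under the same assumed AR(1) covariance c_l = ρ^|l|; variable names are mine, not from the slides:

```python
import numpy as np

def levinson(c, p):
    """Order-recursive solve of the normal equations.

    c[0..p] is the covariance sequence; a is the prediction error filter
    [1, -w_1, ..., -w_k] and alpha the error variance, updated order by order.
    """
    a = np.array([1.0])  # order-0 prediction error filter
    alpha = c[0]         # order-0 error variance
    for k in range(p):
        # Step 2: Delta_k = sum_j a_j c_{k+1-j}
        delta = sum(a[j] * c[k + 1 - j] for j in range(k + 1))
        # Step 3: joint update of forward filter a and backward filter Ja
        a = (np.concatenate([a, [0.0]])
             - (delta / alpha) * np.concatenate([[0.0], a[::-1]]))
        # Step 2 (cont.): error variance update
        alpha -= delta ** 2 / alpha
    return -a[1:], alpha  # predictor weights w and error variance alpha_p

rho, p = 0.9, 4
c = rho ** np.arange(p + 1)
w, alpha = levinson(c, p)
```

For the real symmetric Toeplitz case the backward filter is simply the reversal Ja, which is why a single array suffices in the loop.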

Lattice Filter

The lattice filter structure is different from the transversal one. Each block has a delay on the backward prediction error and a cross-multiplication. The multipliers are reflection coefficients, and there is an efficient update for them (just use the previous update rule for transversal filters): \( k_p = \Delta_p / \alpha_p \).

Define the forward and backward prediction errors in terms of the filters:

\[
F_p(n) = a_p^T r_{p+1}(n), \qquad G_p(n) = b_p^T r_{p+1}(n)
\]

The order update equations give

\[
F_{p+1}(n) = a_{p+1}^T r_{p+2}(n) = F_p(n) - \frac{\Delta_p}{\alpha_p} G_p(n-1)
\]
\[
G_{p+1}(n) = b_{p+1}^T r_{p+2}(n) = G_p(n-1) - \frac{\Delta_p}{\alpha_p} F_p(n)
\]

or in matrix form

\[
\begin{bmatrix} F_{p+1}(n) \\ G_{p+1}(n) \end{bmatrix}
= \begin{bmatrix} 1 & -\Delta_p/\alpha_p \\ -\Delta_p/\alpha_p & 1 \end{bmatrix}
\begin{bmatrix} F_p(n) \\ G_p(n-1) \end{bmatrix}
\]
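The lattice recursion above maps directly to a per-stage signal-flow implementation. A sketch (the helper name is mine), checked against the equivalent transversal error for a single stage with reflection coefficient k = 0.9:

```python
import numpy as np

def lattice_errors(r, ks):
    """Propagate forward/backward prediction errors through lattice stages.

    Each stage applies F_{p+1}(n) = F_p(n) - k G_p(n-1) and
    G_{p+1}(n) = G_p(n-1) - k F_p(n), with k = Delta_p / alpha_p.
    """
    F = np.asarray(r, dtype=float).copy()  # order-0 forward error F_0(n) = r_n
    G = F.copy()                           # order-0 backward error G_0(n) = r_n
    for k in ks:
        G_prev = np.concatenate([[0.0], G[:-1]])  # unit delay on backward error
        F, G = F - k * G_prev, G_prev - k * F     # cross-multiplication by k
    return F

rng = np.random.default_rng(2)
r = rng.standard_normal(200)
F = lattice_errors(r, [0.9])
# A single lattice stage must equal the transversal error r_n - 0.9 r_{n-1}.
direct = r - 0.9 * np.concatenate([[0.0], r[:-1]])
```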

Estimation Approaches

In this approach to linear prediction, the second-order statistics are assumed known and the optimal estimators are derived, including order-recursive updates for both transversal and lattice filters. If the covariance function is not known, then it must be estimated, or the filter coefficients must be estimated directly. A maximum likelihood approach is reasonable. The data-update version of the equations takes the form of an RLS (recursive least squares) solution. The RLS algorithm is usually implemented with an arbitrary, but small, initial covariance matrix.

Data-Driven Estimation Approaches

Both data updates and order updates are available. Data updates rely on the fact that a rank-one update to a matrix induces a rank-one update to its inverse. From the data, form the sample statistics

\[
\Sigma_p(N) = \sum_{n=p+1}^{N} r(n-1) r(n-1)^T, \qquad c(N) = \sum_{n=p+1}^{N} r(n-1) \, r_n,
\]

\[
w(N) = \Sigma_p(N)^{-1} c(N), \qquad \alpha_p(N) = \sum_{n=p+1}^{N} \left( r_n - w(N)^T r(n-1) \right)^2.
\]

A new sample produces the rank-one updates

\[
\Sigma_p(N+1) = \Sigma_p(N) + r(N) r(N)^T, \qquad c(N+1) = c(N) + r(N) \, r_{N+1},
\]

\[
\Sigma_p(N+1)^{-1} = \Sigma_p(N)^{-1} - \frac{\Sigma_p(N)^{-1} r(N) r(N)^T \Sigma_p(N)^{-1}}{1 + r(N)^T \Sigma_p(N)^{-1} r(N)}.
\]
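The "rank-one update to the inverse" step is the Sherman-Morrison identity, which is what keeps the RLS data update at O(p²) per sample instead of a full O(p³) re-inversion. A sketch with an assumed positive-definite matrix standing in for the sample covariance:

```python
import numpy as np

rng = np.random.default_rng(3)
p = 5
Sigma = np.eye(p) + 0.1 * np.ones((p, p))  # assumed positive-definite example
Sigma_inv = np.linalg.inv(Sigma)
r = rng.standard_normal(p)  # new data vector

# Sherman-Morrison: inverse of Sigma + r r^T as a rank-one correction.
Sr = Sigma_inv @ r
Sigma_new_inv = Sigma_inv - np.outer(Sr, Sr) / (1.0 + r @ Sr)
```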