ESE 524 Detection and Estimation Theory, Lecture 4
Joseph A. O'Sullivan
Samuel C. Sachs Professor
Electronic Systems and Signals Research Laboratory
Electrical and Systems Engineering, Washington University
2 Urbauer Hall
34-935-473 (Lynda answers)
jao@wustl.edu
Linear Estimation

x and y are jointly Gaussian. Problem 1: Find the expected value of x given y. Since x and y are jointly Gaussian, the posterior is Gaussian and the MMSE estimate equals the MAP estimate.

$$
\begin{bmatrix} x \\ y \end{bmatrix} \sim N\!\left( \begin{bmatrix} \mu_x \\ \mu_y \end{bmatrix},\; \begin{bmatrix} K_{xx} & K_{xy} \\ K_{yx} & K_{yy} \end{bmatrix} \right)
$$

$$
\ln p(x,y) = -\tfrac{1}{2}\ln\!\left((2\pi)^{n+m}\det K\right) - \tfrac{1}{2} \begin{bmatrix} x-\mu_x \\ y-\mu_y \end{bmatrix}^T \begin{bmatrix} K_{xx} & K_{xy} \\ K_{yx} & K_{yy} \end{bmatrix}^{-1} \begin{bmatrix} x-\mu_x \\ y-\mu_y \end{bmatrix}
$$

Setting the gradient with respect to x to zero locates the posterior mode, and hence $E[x\,|\,y]$:

$$
\nabla_x \ln p(x,y) = -\begin{bmatrix} I & 0 \end{bmatrix} \begin{bmatrix} K_{xx} & K_{xy} \\ K_{yx} & K_{yy} \end{bmatrix}^{-1} \begin{bmatrix} x-\mu_x \\ y-\mu_y \end{bmatrix} = 0
$$
Linear Estimation

The solution for the MMSE estimate uses the block matrix inversion expression. The result is simple, easily interpretable, and fundamental.

$$
\begin{bmatrix} K_{xx} & K_{xy} \\ K_{yx} & K_{yy} \end{bmatrix}^{-1} =
\begin{bmatrix}
\left(K_{xx}-K_{xy}K_{yy}^{-1}K_{yx}\right)^{-1} & -\left(K_{xx}-K_{xy}K_{yy}^{-1}K_{yx}\right)^{-1}K_{xy}K_{yy}^{-1} \\
-K_{yy}^{-1}K_{yx}\left(K_{xx}-K_{xy}K_{yy}^{-1}K_{yx}\right)^{-1} & K_{yy}^{-1}+K_{yy}^{-1}K_{yx}\left(K_{xx}-K_{xy}K_{yy}^{-1}K_{yx}\right)^{-1}K_{xy}K_{yy}^{-1}
\end{bmatrix}
$$

Substituting into the gradient condition gives

$$
\left(K_{xx}-K_{xy}K_{yy}^{-1}K_{yx}\right)^{-1}\left[(x-\mu_x) - K_{xy}K_{yy}^{-1}(y-\mu_y)\right] = 0,
$$

so the maximizing x, and hence the MMSE estimate, is

$$
\hat{x}_{MMSE} = \mu_x + K_{xy}K_{yy}^{-1}\left(y - \mu_y\right).
$$
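As a quick sanity check, the block inverse above can be verified numerically. The sketch below (Python/NumPy, with small made-up covariance blocks) builds the joint covariance, forms the blockwise inverse from the Schur complement, and compares it with a direct inverse.

```python
import numpy as np

# Assumed toy covariance blocks, chosen only to make K positive definite.
K_xx = np.array([[2.0, 0.3], [0.3, 1.0]])
K_xy = np.array([[0.4, 0.1], [0.2, 0.5]])
K_yy = np.array([[1.5, 0.2], [0.2, 1.2]])
K = np.block([[K_xx, K_xy], [K_xy.T, K_yy]])

S = K_xx - K_xy @ np.linalg.solve(K_yy, K_xy.T)      # Schur complement
Si, Kyyi = np.linalg.inv(S), np.linalg.inv(K_yy)
K_inv_blocks = np.block([
    [Si,                  -Si @ K_xy @ Kyyi],
    [-Kyyi @ K_xy.T @ Si,  Kyyi + Kyyi @ K_xy.T @ Si @ K_xy @ Kyyi],
])
print(np.allclose(K_inv_blocks, np.linalg.inv(K)))    # expect True
```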
Linear Estimation, Updated

$$
\hat{x}_{MMSE} = \mu_x + K_{xy}K_{yy}^{-1}\left(y - \mu_y\right)
$$

The posterior mean is the prior mean plus a correction driven by the new information $y-\mu_y$, amplified by the correlation $K_{xy}$ and attenuated by the uncertainty in y through $K_{yy}^{-1}$.
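A minimal sketch of this update, assuming small made-up means and covariances purely for illustration: the gain $K_{xy}K_{yy}^{-1}$ is applied to the innovation $y - \mu_y$.

```python
import numpy as np

# Assumed 2-D x, 3-D y with illustrative means and cross/observation covariances.
mu_x = np.array([1.0, 0.0])
mu_y = np.array([0.5, -0.5, 2.0])
K_xy = np.array([[0.6, 0.2, 0.1],
                 [0.1, 0.4, 0.0]])
K_yy = np.array([[2.0, 0.3, 0.1],
                 [0.3, 1.5, 0.2],
                 [0.1, 0.2, 1.0]])

y = np.array([0.8, -0.1, 1.7])                     # observed vector

# Posterior mean: prior mean plus gain times the innovation (y - mu_y).
x_hat = mu_x + K_xy @ np.linalg.solve(K_yy, y - mu_y)
print(x_hat)
```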
Linear Estimation

Problem 2: Assume that the mean vectors and joint covariance matrix for x and y are known. Among all linear estimates of x as a function of y, find the one that minimizes the MSE. Specifically, assume x and y are zero-mean random vectors with known joint covariance matrix, and find the linear estimator A that minimizes the trace of the error covariance matrix:

$$
\min_A \operatorname{tr} E\!\left[(x - Ay)(x - Ay)^T\right]
= \min_A \operatorname{tr}\!\left(K_{xx} - K_{xy}A^T - AK_{yx} + AK_{yy}A^T\right)
$$
$$
= \min_A \operatorname{tr}\!\left(K_{xx} - K_{xy}K_{yy}^{-1}K_{yx} + \left(A - K_{xy}K_{yy}^{-1}\right)K_{yy}\left(A - K_{xy}K_{yy}^{-1}\right)^T\right)
$$
$$
= \operatorname{tr}\!\left(K_{xx} - K_{xy}K_{yy}^{-1}K_{yx}\right), \quad \text{with minimum at } A = K_{xy}K_{yy}^{-1}.
$$
Linear Estimation

Problem 1: x and y are jointly Gaussian. Find the expected value of x given y.
Problem 2: x and y have known second-order statistics. Among all linear estimates of x as a function of y, find the one that minimizes the MSE.
Answer 2 = Answer 1.

Fundamental (orthogonality) property: the error in the estimate is orthogonal to the variables used in the estimate.

Error covariance matrix:

$$
E\!\left[\left(x - E[x\,|\,y]\right)\left(x - E[x\,|\,y]\right)^T\right] = K_{xx} - K_{xy}K_{yy}^{-1}K_{yx}
$$

Orthogonality:

$$
E\!\left[\left(x - E[x\,|\,y]\right)\left(y - E[y]\right)^T\right] = K_{xy} - K_{xy}K_{yy}^{-1}K_{yy} = 0
$$
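The equivalence and the orthogonality property can be checked by simulation. The sketch below assumes a small toy joint covariance; the sample error covariance should approach $K_{xx} - K_{xy}K_{yy}^{-1}K_{yx}$ and the error should be nearly uncorrelated with y.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy joint covariance for 1-D x and 2-D y (zero means).
K_xx = np.array([[1.0]])
K_xy = np.array([[0.5, 0.2]])
K_yy = np.array([[1.0, 0.3],
                 [0.3, 2.0]])
K = np.block([[K_xx, K_xy], [K_xy.T, K_yy]])

# Draw samples of (x, y) and form the linear MMSE estimate A y.
z = rng.multivariate_normal(np.zeros(3), K, size=200_000)
x, y = z[:, :1], z[:, 1:]
A = K_xy @ np.linalg.inv(K_yy)
err = x - y @ A.T

# Sample error covariance vs. K_xx - K_xy K_yy^{-1} K_yx, and error-vs-y correlation.
print(err.T @ err / len(err), K_xx - A @ K_xy.T)
print(err.T @ y / len(err))        # should be close to zero (orthogonality)
```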
Recursive Linear Estimation: Data Model and Problem Statements

Suppose a zero-mean, stationary Gaussian random process (GRP) with known covariance function is given.

Problem 1: Find the minimum mean square error estimate of the present value of the GRP given the p previous values.
1a: Derive the result as a transversal filter and derive the order-recursive updates (from p to p+1).
1b: Derive the result as a lattice filter and derive the order-recursive updates for the coefficients (reflection parameters).

Problem 2: Assume that the GRP satisfies an autoregressive (AR) model of order p.
2a: Find the maximum likelihood estimates of the AR parameters, including the time-recursive and order-recursive updates.
2b: Find the time- and order-recursive updates for the lattice filter coefficients (reflection parameters).
Linear Prediction Theory

The GRP is jointly Gaussian with $E[r_n]=0$ and $E[r_n r_{n-l}] = c_l$; the distribution of any subset of the random variables is jointly Gaussian, so the linear estimation results apply and the estimate of the current value is a linear combination of the previous values:

$$
E[r_n \,|\, r_{n-1}, r_{n-2}, \ldots, r_{n-p}] = w_1 r_{n-1} + w_2 r_{n-2} + \cdots + w_p r_{n-p} = \mathbf{w}^T \mathbf{r}(n-1),
$$

where $\mathbf{r}(n-1) = [\,r_{n-1}\; r_{n-2}\; \ldots\; r_{n-p}\,]^T$ and $\mathbf{w} = [\,w_1\; w_2\; \ldots\; w_p\,]^T$. This linear combination defines a transversal filter, implemented through a tapped delay line. With $T_p$ the $p \times p$ covariance matrix of $\mathbf{r}(n-1)$,

$$
E\!\left[\mathbf{r}(n-1)\, r_n\right] = \mathbf{c}_p, \qquad T_p(i,j) = c_{|i-j|}, \qquad \mathbf{c}_p = [\,c_1\; c_2\; \ldots\; c_p\,]^T,
$$
$$
T_p \mathbf{w} = \mathbf{c}_p, \qquad \mathbf{w} = T_p^{-1}\mathbf{c}_p, \qquad E\!\left[\left(r_n - \mathbf{w}^T\mathbf{r}(n-1)\right)^2\right] = c_0 - \mathbf{c}_p^T T_p^{-1}\mathbf{c}_p.
$$

The stationary covariance matrix is Toeplitz, and the coefficients of the transversal filter are independent of time.
Linear Prediction Theory

The covariance matrix is Toeplitz (constant along diagonals). The order recursion derives from partitioning the covariance matrix by order; there are two standard partitions. The second uses an exchange matrix J that has ones along the antidiagonal. (The notation below is somewhat loose on subscripts.)

$$
T_p(i,j) = c_{|i-j|}, \qquad \mathbf{c}_p = [\,c_1\; c_2\; \ldots\; c_p\,]^T, \qquad
T_4 = \begin{bmatrix} c_0 & c_1 & c_2 & c_3 \\ c_1 & c_0 & c_1 & c_2 \\ c_2 & c_1 & c_0 & c_1 \\ c_3 & c_2 & c_1 & c_0 \end{bmatrix}
$$

$$
T_{p+1} = \begin{bmatrix} c_0 & \mathbf{c}_p^T \\ \mathbf{c}_p & T_p \end{bmatrix}
= \begin{bmatrix} T_p & J\mathbf{c}_p \\ \mathbf{c}_p^T J & c_0 \end{bmatrix}, \qquad
J_4 = \begin{bmatrix} 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 1 & 0 & 0 & 0 \end{bmatrix}
$$
Linear Prediction Theory

The equations resulting from the orthogonality property are the normal equations. Terminology: the forward prediction error of order p is $r_n - \mathbf{w}^T\mathbf{r}(n-1)$.

$$
E\!\left[\left(r_n - \mathbf{w}^T\mathbf{r}(n-1)\right)\mathbf{r}(n-1)^T\right] = 0
\quad\Longrightarrow\quad T_p\mathbf{w} = \mathbf{c}_p, \qquad \mathbf{w} = T_p^{-1}\mathbf{c}_p.
$$

The forward prediction error filter is

$$
\mathbf{a}_p = \begin{bmatrix} 1 \\ -\mathbf{w} \end{bmatrix}, \qquad
T_{p+1}\mathbf{a}_p = \begin{bmatrix} \sigma_p^2 \\ \mathbf{0} \end{bmatrix}, \qquad
\sigma_p^2 = c_0 - \mathbf{c}_p^T\mathbf{w}.
$$
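A direct (non-recursive) solution of the normal equations is a useful reference point before the order recursion. The sketch below assumes a toy covariance sequence $c_l = 0.8^l$ and checks that $T_{p+1}\mathbf{a}_p = [\sigma_p^2, 0, \ldots, 0]^T$.

```python
import numpy as np
from scipy.linalg import toeplitz

# Assumed covariance sequence c_0 ... c_p (an AR(1)-like example).
c = np.array([1.0, 0.8, 0.64, 0.512])      # c[0]=c_0, ..., c[3]=c_3
p = 3

T_p = toeplitz(c[:p])                       # p x p Toeplitz covariance matrix
c_p = c[1:p + 1]                            # right-hand side [c_1 ... c_p]

w = np.linalg.solve(T_p, c_p)               # normal equations T_p w = c_p
sigma2 = c[0] - c_p @ w                     # prediction error variance

# Augmented check: T_{p+1} [1, -w]^T = [sigma2, 0, ..., 0]^T
a = np.concatenate(([1.0], -w))
print(toeplitz(c[:p + 1]) @ a, sigma2)
```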
Order Update on Inverse: Rank-One Update

Using the partition that places $T_p$ in the upper-left block ($p$ rows over 1 row),

$$
T_{p+1} = \begin{bmatrix} T_p & J\mathbf{c}_p \\ \mathbf{c}_p^T J & c_0 \end{bmatrix},
$$

the block matrix inversion expression gives a rank-one update of the inverse:

$$
T_{p+1}^{-1} = \begin{bmatrix} T_p^{-1} & 0 \\ 0 & 0 \end{bmatrix}
+ \frac{1}{c_0 - \mathbf{c}_p^T J\,T_p^{-1}J\,\mathbf{c}_p}
\begin{bmatrix} -T_p^{-1}J\mathbf{c}_p \\ 1 \end{bmatrix}
\begin{bmatrix} -\mathbf{c}_p^T J\,T_p^{-1} & 1 \end{bmatrix}.
$$

The exchange matrix satisfies $J^T = J$, $JJ = I$, and $JT_pJ = T_p$ (persymmetry), so $JT_p^{-1}J = T_p^{-1}$.
Order Update on Inverse: Rank-One Update

The other partition gives the same structure in terms of the forward prediction error filter:

$$
T_{p+1} = \begin{bmatrix} c_0 & \mathbf{c}_p^T \\ \mathbf{c}_p & T_p \end{bmatrix}, \qquad
T_{p+1}^{-1} = \begin{bmatrix} 0 & 0 \\ 0 & T_p^{-1} \end{bmatrix} + \frac{1}{\sigma_p^2}\,\mathbf{a}_p\mathbf{a}_p^T, \qquad
\mathbf{a}_p = \begin{bmatrix} 1 \\ -T_p^{-1}\mathbf{c}_p \end{bmatrix} = \begin{bmatrix} 1 \\ -\mathbf{w} \end{bmatrix}.
$$

Comparing with the first partition identifies the backward prediction error filter:

$$
T_{p+1}^{-1} = \begin{bmatrix} T_p^{-1} & 0 \\ 0 & 0 \end{bmatrix} + \frac{1}{\sigma_p^2}\,\mathbf{b}_p\mathbf{b}_p^T, \qquad
\mathbf{b}_p = \begin{bmatrix} -J\mathbf{w} \\ 1 \end{bmatrix} = J\mathbf{a}_p.
$$
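The rank-one structure of the inverse can be checked numerically; the sketch below uses the same assumed covariance sequence as above.

```python
import numpy as np
from scipy.linalg import toeplitz

# Check T_{p+1}^{-1} = [[0, 0], [0, T_p^{-1}]] + a_p a_p^T / sigma_p^2
# on an assumed covariance sequence.
c = np.array([1.0, 0.8, 0.64, 0.512])
p = 3
T_big = toeplitz(c[:p + 1])                      # (p+1) x (p+1)
T_small = toeplitz(c[:p])                        # p x p

w = np.linalg.solve(T_small, c[1:p + 1])
a = np.concatenate(([1.0], -w))                  # forward prediction error filter
sigma2 = c[0] - c[1:p + 1] @ w

padded = np.zeros((p + 1, p + 1))
padded[1:, 1:] = np.linalg.inv(T_small)
print(np.allclose(np.linalg.inv(T_big), padded + np.outer(a, a) / sigma2))   # expect True
```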
Backward Prediction

Backward prediction predicts $r_{n-p}$ from the p values that follow it. With $\mathbf{r}_{p+1}(n) = [\,r_n\; r_{n-1}\; \ldots\; r_{n-p}\,]^T$, the backward prediction error of order p is $\mathbf{b}_p^T\mathbf{r}_{p+1}(n)$, and the exchange matrix comes in again: the backward prediction error filter is the reversal of the forward one, with the same error variance as in forward prediction. This is the basis for the order update of the backward and forward prediction error coefficients.

$$
\mathbf{b}_p = J\mathbf{a}_p = \begin{bmatrix} -J\mathbf{w} \\ 1 \end{bmatrix}, \qquad
T_{p+1}\mathbf{b}_p = T_{p+1}J\mathbf{a}_p = J\,T_{p+1}\mathbf{a}_p = \begin{bmatrix} \mathbf{0} \\ \sigma_p^2 \end{bmatrix}.
$$

Extending both filters by one order introduces the cross term $\Delta_p$:

$$
T_{p+2}\begin{bmatrix} \mathbf{a}_p \\ 0 \end{bmatrix} = \begin{bmatrix} \sigma_p^2 \\ \mathbf{0} \\ \Delta_p \end{bmatrix}, \qquad
T_{p+2}\begin{bmatrix} 0 \\ \mathbf{b}_p \end{bmatrix} = \begin{bmatrix} \Delta_p \\ \mathbf{0} \\ \sigma_p^2 \end{bmatrix}, \qquad
\Delta_p = c_{p+1} - \mathbf{c}_p^T J\mathbf{w}.
$$
Order Update

For the order update, combine the equations so that the top and bottom terms cancel:

$$
T_{p+2}\!\left(\begin{bmatrix} \mathbf{a}_p \\ 0 \end{bmatrix} - \frac{\Delta_p}{\sigma_p^2}\begin{bmatrix} 0 \\ \mathbf{b}_p \end{bmatrix}\right)
= \begin{bmatrix} \sigma_p^2 - \Delta_p^2/\sigma_p^2 \\ \mathbf{0} \\ 0 \end{bmatrix}, \qquad
T_{p+2}\!\left(\begin{bmatrix} 0 \\ \mathbf{b}_p \end{bmatrix} - \frac{\Delta_p}{\sigma_p^2}\begin{bmatrix} \mathbf{a}_p \\ 0 \end{bmatrix}\right)
= \begin{bmatrix} 0 \\ \mathbf{0} \\ \sigma_p^2 - \Delta_p^2/\sigma_p^2 \end{bmatrix}.
$$

The terms in parentheses must be the order updates of the forward and backward prediction error filters:

$$
\mathbf{a}_{p+1} = \begin{bmatrix} \mathbf{a}_p \\ 0 \end{bmatrix} - \frac{\Delta_p}{\sigma_p^2}\begin{bmatrix} 0 \\ \mathbf{b}_p \end{bmatrix}, \qquad
\mathbf{b}_{p+1} = \begin{bmatrix} 0 \\ \mathbf{b}_p \end{bmatrix} - \frac{\Delta_p}{\sigma_p^2}\begin{bmatrix} \mathbf{a}_p \\ 0 \end{bmatrix}, \qquad
\sigma_{p+1}^2 = \sigma_p^2 - \frac{\Delta_p^2}{\sigma_p^2}.
$$
Order Update

The order update requires p multiplies to find $\Delta_p$, one division plus p multiplies to apply the update, and two multiplies to get the next error variance, for a total of about 2p + 3 operations per order; summing from order 1 to order p gives roughly p(p+1) + 3p operations. This is the transversal-filter version of linear prediction (the Levinson recursion):

1. Initialization: $\sigma_0^2 = c_0$, $\mathbf{a}_0 = \mathbf{b}_0 = 1$, $p = 0$.
2. Update the reflection coefficient and error variance:
$$
\Delta_p = \mathbf{c}_{p+1}^T J \mathbf{a}_p \ \ \text{with}\ \ \mathbf{c}_{p+1} = [\,c_1\; \ldots\; c_{p+1}\,]^T, \qquad
\sigma_{p+1}^2 = \sigma_p^2 - \frac{\Delta_p^2}{\sigma_p^2}.
$$
3. Update the prediction error filters:
$$
\mathbf{a}_{p+1} = \begin{bmatrix} \mathbf{a}_p \\ 0 \end{bmatrix} - \frac{\Delta_p}{\sigma_p^2}\begin{bmatrix} 0 \\ \mathbf{b}_p \end{bmatrix}, \qquad
\mathbf{b}_{p+1} = \begin{bmatrix} 0 \\ \mathbf{b}_p \end{bmatrix} - \frac{\Delta_p}{\sigma_p^2}\begin{bmatrix} \mathbf{a}_p \\ 0 \end{bmatrix}.
$$
4. Recursion step: $p \leftarrow p + 1$; return to step 2.
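A compact implementation of these four steps (a Levinson-style order recursion) is sketched below; the covariance sequence is an assumed AR(1)-like example, for which the recursion should return $\mathbf{a}_3 = [1, -0.8, 0, 0]$.

```python
import numpy as np

def levinson(c, p):
    """Order-recursive solution of the normal equations.

    c : covariance sequence c_0, c_1, ..., c_p (numpy array)
    Returns the prediction-error filter a_p = [1, -w_1, ..., -w_p]
    and the error variances sigma_0^2, ..., sigma_p^2.
    """
    a = np.array([1.0])                 # order-0 forward prediction error filter
    sigma2 = [c[0]]                     # sigma_0^2 = c_0
    for k in range(p):
        # Delta_k = sum_j a_j c_{k+1-j}   (k+1 multiplies)
        delta = a @ c[k + 1::-1][:k + 1]
        refl = delta / sigma2[-1]                       # reflection coefficient
        b = a[::-1]                                     # backward filter b = J a
        a = np.concatenate((a, [0.0])) - refl * np.concatenate(([0.0], b))
        sigma2.append(sigma2[-1] - delta ** 2 / sigma2[-1])
    return a, np.array(sigma2)

# Same assumed covariance sequence as before, for illustration.
a, sig = levinson(np.array([1.0, 0.8, 0.64, 0.512]), 3)
print(a, sig)
```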
Lattice Filter

The lattice filter structure is different from the transversal one. Each block has a delay on the backward prediction error and a cross-multiplication; the multipliers are the reflection coefficients $\Delta_p/\sigma_p^2$, and there is an efficient update for them (just use the previous update rule from the transversal filter). Define the forward and backward prediction errors

$$
F_p(n) = \mathbf{a}_p^T\,\mathbf{r}_{p+1}(n), \qquad G_p(n) = \mathbf{b}_p^T\,\mathbf{r}_{p+1}(n).
$$

The order update equations give

$$
F_{p+1}(n) = \mathbf{a}_{p+1}^T\,\mathbf{r}_{p+2}(n)
= \left(\begin{bmatrix} \mathbf{a}_p \\ 0 \end{bmatrix} - \frac{\Delta_p}{\sigma_p^2}\begin{bmatrix} 0 \\ \mathbf{b}_p \end{bmatrix}\right)^T\mathbf{r}_{p+2}(n)
= F_p(n) - \frac{\Delta_p}{\sigma_p^2}\,G_p(n-1),
$$
$$
G_{p+1}(n) = \mathbf{b}_{p+1}^T\,\mathbf{r}_{p+2}(n) = G_p(n-1) - \frac{\Delta_p}{\sigma_p^2}\,F_p(n),
$$

or, in matrix form,

$$
\begin{bmatrix} F_{p+1}(n) \\ G_{p+1}(n) \end{bmatrix}
= \begin{bmatrix} 1 & -\Delta_p/\sigma_p^2 \\ -\Delta_p/\sigma_p^2 & 1 \end{bmatrix}
\begin{bmatrix} F_p(n) \\ G_p(n-1) \end{bmatrix}.
$$
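The same order update, applied sample by sample to data rather than to filter coefficients, gives the lattice structure. The sketch below assumes the reflection coefficients $\Delta_p/\sigma_p^2$ are already known and simply runs the forward/backward error recursions on a short toy sequence; the function name and data are illustrative only.

```python
import numpy as np

def lattice_errors(r, refl):
    """Run the lattice filter: order-recursive forward/backward prediction errors.

    r    : observed sequence r_0, ..., r_{N-1}
    refl : reflection coefficients Delta_p / sigma_p^2 for p = 0, ..., P-1
    Returns the final-order forward prediction error sequence.
    """
    F = np.asarray(r, dtype=float).copy()      # order-0 forward error F_0(n) = r_n
    G = F.copy()                               # order-0 backward error G_0(n) = r_n
    for k in refl:
        G_delayed = np.concatenate(([0.0], G[:-1]))    # one-sample delay on backward error
        # Tuple assignment: both right-hand sides use the pre-update F and G_delayed.
        F, G = F - k * G_delayed, G_delayed - k * F
    return F

# Toy usage with a single assumed reflection coefficient of 0.8.
r = np.array([1.0, 0.9, 0.7, 0.6, 0.4])
print(lattice_errors(r, [0.8]))
```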
Estimation Approaches

In this approach to linear prediction, the second-order statistics are assumed known and the optimal estimators are derived, including order-recursive updates for both transversal and lattice filters. If the covariance function is not known, then it must be estimated, or the filter coefficients must be estimated directly. A maximum likelihood approach is reasonable. The data-update version of the equations takes the form of an RLS (recursive least squares) solution. The RLS algorithm is usually implemented with an arbitrary, but small, initial covariance matrix.
Data Driven Estimation Approaches

Both data updates and order updates are available. Data updates: a rank-one update to a matrix is a rank-one update to its inverse. Replacing the true covariances with sample averages gives

$$
\hat{T}_p(N) = \sum_{n=p+1}^{N}\mathbf{r}(n-1)\,\mathbf{r}(n-1)^T, \qquad
\hat{\mathbf{c}}_p(N) = \sum_{n=p+1}^{N}\mathbf{r}(n-1)\,r_n, \qquad
\hat{\mathbf{w}}(N) = \hat{T}_p(N)^{-1}\hat{\mathbf{c}}_p(N),
$$
$$
\hat{\sigma}_p^2(N) = \sum_{n=p+1}^{N}\left(r_n - \hat{\mathbf{w}}(N)^T\mathbf{r}(n-1)\right)^2.
$$

Adding one more sample is a rank-one data update:

$$
\hat{T}_{p+1}(N+1) = \hat{T}_{p+1}(N) + \mathbf{r}_{p+1}(N+1)\,\mathbf{r}_{p+1}(N+1)^T.
$$
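A minimal sketch of the data update, assuming the standard RLS form: the rank-one update of the sample covariance is propagated to its inverse with the Sherman-Morrison formula, with the usual arbitrary but small initial covariance (large initial P). The AR(1) test data and all variable names are illustrative only.

```python
import numpy as np

def rls_update(P, w, r_vec, r_new):
    """One RLS data update.

    P     : current inverse of the sample covariance matrix
    w     : current weight (prediction coefficient) vector
    r_vec : previous samples [r_{n-1}, ..., r_{n-p}]
    r_new : current sample r_n
    """
    Pr = P @ r_vec
    gain = Pr / (1.0 + r_vec @ Pr)           # gain vector
    err = r_new - w @ r_vec                  # a priori prediction error
    w = w + gain * err                       # updated weights
    P = P - np.outer(gain, Pr)               # Sherman-Morrison update of the inverse
    return P, w

# Usual RLS initialization: small initial covariance, i.e. large initial P.
p = 3
P = 1e3 * np.eye(p)
w = np.zeros(p)

# Assumed AR(1) test data: r_n = 0.8 r_{n-1} + noise.
rng = np.random.default_rng(1)
r = np.zeros(200)
for n in range(1, len(r)):
    r[n] = 0.8 * r[n - 1] + rng.standard_normal()

for n in range(p, len(r)):
    P, w = rls_update(P, w, r[n - p:n][::-1], r[n])
print(w)     # first coefficient should approach 0.8
```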