UCSD ECE153 Handout #30
Prof. Young-Han Kim Thursday, May 15, 2014

Homework Set #6
Due: Thursday, May 22, 2014

1. Linear estimator. Consider a channel with the observation $Y = XZ$, where the signal $X$ and the noise $Z$ are uncorrelated Gaussian random variables. Let $E[X] = 1$, $E[Z] = 2$, $\sigma_X^2 = 5$, and $\sigma_Z^2 = 8$.

(a) Find the best MSE linear estimate of $X$ given $Y$.

(b) Suppose your friend from Caltech tells you that he was able to derive an estimator with a lower MSE. Your friend from UCLA disagrees, saying that this is not possible because the signal and the noise are Gaussian, and hence the best linear MSE estimator is also the best MSE estimator. Could your UCLA friend be wrong?

2. Additive-noise channel with path gain. Consider the additive-noise channel shown in the figure below, where $X$ and $Z$ are zero mean and uncorrelated, and $a$ and $b$ are constants.

[Figure: $X$ is scaled by $a$, the noise $Z$ is added, and the sum is scaled by $b$, giving $Y = b(aX + Z)$.]

Find the MMSE linear estimate of $X$ given $Y$ and its MSE in terms only of $\sigma_X$, $\sigma_Z$, $a$, and $b$.

3. Image processing. A pixel signal $X \sim \mathrm{U}[-k, k]$ is digitized to obtain
$$\tilde{X} = i + \tfrac{1}{2}, \quad \text{if } i < X \le i+1, \quad i = -k, -k+1, \ldots, k-2, k-1.$$
To improve the visual appearance, the digitized value $\tilde{X}$ is dithered by adding an independent noise $Z$ with mean $E(Z) = 0$ and variance $\mathrm{Var}(Z) = N$ to obtain $Y = \tilde{X} + Z$.

(a) Find the correlation of $\tilde{X}$ and $Y$.

(b) Find the best linear MSE estimate of $\tilde{X}$ given $Y$. Your answer should be in terms only of $k$, $N$, and $Y$.
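For Problem 1, the snippet below is a quick Monte Carlo sanity check of the linear-estimate formula for the multiplicative channel $Y = XZ$. It assumes $X$ and $Z$ are independent (the natural simulation model for uncorrelated Gaussians); it is a numerical check, not a substitute for the derivation the problem asks for.

```python
import numpy as np

# Monte Carlo sanity check for Problem 1: multiplicative channel Y = X*Z.
# Assumes X and Z independent; this matches "uncorrelated Gaussian" under
# the usual joint-Gaussian reading, but is stated here as an assumption.
rng = np.random.default_rng(0)
n = 1_000_000
X = rng.normal(1.0, np.sqrt(5.0), n)   # E[X] = 1, Var(X) = 5
Z = rng.normal(2.0, np.sqrt(8.0), n)   # E[Z] = 2, Var(Z) = 8
Y = X * Z

# LMMSE estimate: Xhat = E[X] + Cov(X,Y)/Var(Y) * (Y - E[Y]),
# with the moments estimated empirically here.
cov_XY = np.cov(X, Y)[0, 1]
Xhat = X.mean() + cov_XY / Y.var() * (Y - Y.mean())
print("empirical Cov(X,Y):", cov_XY)    # analytically Var(X) E[Z] = 10
print("empirical Var(Y):  ", Y.var())   # analytically E[X^2]E[Z^2] - 4 = 68
print("empirical MSE:     ", np.mean((X - Xhat) ** 2))
```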
4. Covariance matrices. Which of the following matrices can be a covariance matrix? Justify your answer either by constructing a random vector $X$, as a function of the i.i.d. zero-mean unit-variance random variables $Z_1$, $Z_2$, and $Z_3$, with the given covariance matrix, or by establishing a contradiction.

(a) $\begin{bmatrix} 1 & 2 \\ 0 & 2 \end{bmatrix}$

(b) $\begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}$

(c) $\begin{bmatrix} 1 & 1 & 1 \\ 1 & 2 & 2 \\ 1 & 2 & 3 \end{bmatrix}$

(d) $\begin{bmatrix} 1 & 1 & 2 \\ 1 & 2 & 3 \\ 2 & 3 & 3 \end{bmatrix}$

5. Gaussian random vector. Given a Gaussian random vector $X \sim N(\mu, \Sigma)$, where $\mu = (1\; 5\; 2)^T$ and
$$\Sigma = \begin{bmatrix} 1 & 1 & 0 \\ 1 & 4 & 0 \\ 0 & 0 & 9 \end{bmatrix}.$$

(a) Find the pdfs of
i. $X_1$,
ii. $X_2 + X_3$,
iii. $2X_1 + X_2 + X_3$,
iv. $X_3$ given $(X_1, X_2)$, and
v. $(X_2, X_3)$ given $X_1$.

(b) What is $P\{2X_1 + X_2 - X_3 < 0\}$? Express your answer using the Q function.

(c) Find the joint pdf of $Y = AX$, where
$$A = \begin{bmatrix} 2 & 1 & 1 \\ 1 & -1 & 1 \end{bmatrix}.$$

6. Gaussian Markov chain. Let $X$, $Y$, and $Z$ be jointly Gaussian random variables with zero mean and unit variance, i.e., $E(X) = E(Y) = E(Z) = 0$ and $E(X^2) = E(Y^2) = E(Z^2) = 1$. Let $\rho_{X,Y}$ denote the correlation coefficient between $X$ and $Y$, and let $\rho_{Y,Z}$ denote the correlation coefficient between $Y$ and $Z$. Suppose that $X$ and $Z$ are conditionally independent given $Y$.

(a) Find $\rho_{X,Z}$ in terms of $\rho_{X,Y}$ and $\rho_{Y,Z}$.

(b) Find the MMSE estimate of $Z$ given $(X, Y)$ and the corresponding MSE.
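As a quick numerical screen for Problem 4 above, the sketch below checks the two defining properties of a covariance matrix, symmetry and positive semidefiniteness. It only flags violations; constructing an explicit $X = AZ$ (with $\Sigma = AA^T$) or arguing a contradiction is still the written part of the problem.

```python
import numpy as np

# Screen the four candidate matrices from Problem 4: a covariance matrix
# must be symmetric and have no negative eigenvalues.
candidates = {
    "a": np.array([[1, 2], [0, 2]]),
    "b": np.array([[2, 1], [1, 2]]),
    "c": np.array([[1, 1, 1], [1, 2, 2], [1, 2, 3]]),
    "d": np.array([[1, 1, 2], [1, 2, 3], [2, 3, 3]]),
}
for name, S in candidates.items():
    symmetric = np.allclose(S, S.T)
    # eigvalsh expects a symmetric input, so symmetrize first; for a
    # genuinely asymmetric candidate the symmetry flag already fails it.
    eigmin = np.linalg.eigvalsh((S + S.T) / 2).min()
    print(name, "symmetric:", symmetric, "min eigenvalue: %.3f" % eigmin)
```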
7. Prediction of an autoregressive process. Let $X$ be a random vector with zero mean and covariance matrix
$$\Sigma_X = \begin{bmatrix}
1 & \alpha & \alpha^2 & \cdots & \alpha^{n-1} \\
\alpha & 1 & \alpha & \cdots & \alpha^{n-2} \\
\alpha^2 & \alpha & 1 & \cdots & \alpha^{n-3} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
\alpha^{n-1} & \alpha^{n-2} & \alpha^{n-3} & \cdots & 1
\end{bmatrix}$$
for $|\alpha| < 1$. Suppose $X_1, X_2, \ldots, X_{n-1}$ are observed. Find the best linear MSE estimate (predictor) of $X_n$, and compute its MSE.

8. Noise cancellation. A classical problem in statistical signal processing involves estimating a weak signal (e.g., the heartbeat of a fetus) in the presence of a strong interference (the heartbeat of its mother) by making two observations, one with the weak signal present and one without (by placing one microphone on the mother's belly and another close to her heart). The observations can then be combined to estimate the weak signal by cancelling out the interference. The following is a simple version of this application.

Let the weak signal $X$ be a random variable with mean $\mu$ and variance $P$, and let the observations be $Y_1 = X + Z_1$ ($Z_1$ being the strong interference) and $Y_2 = Z_1 + Z_2$ ($Z_2$ being a measurement noise), where $Z_1$ and $Z_2$ are zero mean with variances $N_1$ and $N_2$, respectively. Assume that $X$, $Z_1$, and $Z_2$ are uncorrelated. Find the best linear MSE estimate of $X$ given $Y_1$ and $Y_2$ and its MSE. Interpret the results.
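A numerical companion to Problem 8: the sketch below evaluates the vector LMMSE formula $\hat{X} = \mu + \Sigma_{XY}\Sigma_Y^{-1}(Y - E[Y])$ for illustrative parameter values (the numbers assigned to $\mu$, $P$, $N_1$, $N_2$ are arbitrary choices, not part of the problem) and cross-checks the resulting MSE by simulation. The problem itself asks for the answer in symbols.

```python
import numpy as np

# Vector LMMSE sketch for Problem 8 with hypothetical parameter values.
mu, P, N1, N2 = 1.0, 4.0, 2.0, 0.5   # arbitrary illustrative numbers

# Second-order statistics implied by Y1 = X + Z1, Y2 = Z1 + Z2 with
# X, Z1, Z2 uncorrelated:
Sigma_Y  = np.array([[P + N1, N1],
                     [N1, N1 + N2]])   # covariance of (Y1, Y2)
Sigma_XY = np.array([P, 0.0])          # (Cov(X,Y1), Cov(X,Y2))
mean_Y   = np.array([mu, 0.0])

h = np.linalg.solve(Sigma_Y, Sigma_XY)   # optimal coefficients
mse = P - Sigma_XY @ h                   # P - Sigma_XY Sigma_Y^{-1} Sigma_YX
print("coefficients:", h, "MSE:", mse)

# Monte Carlo check with Gaussian noises; any distributions with these
# first and second moments give the same linear-MMSE performance.
rng = np.random.default_rng(1)
n = 500_000
X  = mu + np.sqrt(P) * rng.standard_normal(n)
Z1 = np.sqrt(N1) * rng.standard_normal(n)
Z2 = np.sqrt(N2) * rng.standard_normal(n)
Y  = np.stack([X + Z1, Z1 + Z2])
Xhat = mu + h @ (Y - mean_Y[:, None])
print("empirical MSE:", np.mean((X - Xhat) ** 2))
```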
Additional Exercises

Do not turn in solutions to these problems.

1. Worst noise distribution. Consider an additive-noise channel $Y = X + Z$, where the signal $X \sim N(0, P)$ and the noise $Z$ has zero mean and variance $N$. Assume $X$ and $Z$ are independent. Find a distribution of $Z$ that maximizes the minimum MSE of estimating $X$ given $Y$, i.e., the distribution of the worst noise $Z$ with the given mean and variance. You need to justify your answer.

2. Jointly Gaussian random variables. Let $X$ and $Y$ be jointly Gaussian random variables with pdf
$$f_{X,Y}(x,y) = \frac{1}{\pi\sqrt{3/4}}\, e^{-\frac{1}{2}\left(4x^2/3 + 16y^2/3 + 8xy/3 - 8x - 16y + 16\right)}.$$

(a) Find $E(X)$, $E(Y)$, $\mathrm{Var}(X)$, $\mathrm{Var}(Y)$, and $\mathrm{Cov}(X,Y)$.

(b) Find the minimum MSE estimate of $X$ given $Y$ and its MSE.

3. Markov chain. Suppose $X_1$ and $X_3$ are independent given $X_2$. Show that
$$f(x_1, x_2, x_3) = f(x_1)f(x_2 \mid x_1)f(x_3 \mid x_2) = f(x_3)f(x_2 \mid x_3)f(x_1 \mid x_2).$$
In other words, if $X_1 \to X_2 \to X_3$ forms a Markov chain, then so does $X_3 \to X_2 \to X_1$.

4. Proof of Property 4. In Lecture Notes #6 it was stated that conditionals of a Gaussian random vector are Gaussian. In this problem you will prove that fact: if $\begin{bmatrix} X \\ Y \end{bmatrix}$ is a zero-mean GRV, then
$$X \mid \{Y = y\} \sim N\left(\Sigma_{XY}\Sigma_Y^{-1} y,\; \sigma_X^2 - \Sigma_{XY}\Sigma_Y^{-1}\Sigma_{YX}\right).$$
Justify each of the following steps of the proof.

(a) Let $\hat{X}$ be the best MSE linear estimate of $X$ given $Y$. Then $\hat{X}$ and $X - \hat{X}$ are individually zero-mean Gaussians. Find their variances.

(b) $\hat{X}$ and $X - \hat{X}$ are independent.

(c) Now write $X = \hat{X} + (X - \hat{X})$. If $Y = y$, then $X = \Sigma_{XY}\Sigma_Y^{-1} y + (X - \hat{X})$.

(d) Now complete the proof.

Remark: This proof can be extended to vector $X$.
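To build intuition for step (b) of Additional Exercise 4, the sketch below checks numerically that the LMMSE estimate and its error are uncorrelated for a jointly Gaussian pair (the covariance entries are arbitrary illustrative choices, not from the exercise). Uncorrelatedness plus joint Gaussianity is what yields independence; proving this in general is the exercise.

```python
import numpy as np

# Orthogonality check: for jointly Gaussian (X, Y) with zero mean, the
# LMMSE estimate Xhat = Sigma_XY Sigma_Y^{-1} Y and the error X - Xhat
# are uncorrelated, and Var(X - Xhat) = sigma_X^2 - Sigma_XY^2 / Sigma_Y.
rng = np.random.default_rng(2)
cov = np.array([[2.0, 1.2],
                [1.2, 1.5]])   # hypothetical [sigma_X^2, Sigma_XY; Sigma_YX, Sigma_Y]
X, Y = rng.multivariate_normal([0, 0], cov, size=1_000_000).T

Xhat = cov[0, 1] / cov[1, 1] * Y   # Sigma_XY Sigma_Y^{-1} Y
err = X - Xhat
print("Cov(Xhat, err):", np.cov(Xhat, err)[0, 1])   # approximately 0
print("Var(err):", err.var())   # analytically 2 - 1.44/1.5 = 1.04
```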
5. Additive nonwhite Gaussian noise channel. Let $Y_i = X + Z_i$, $i = 1, 2, \ldots, n$, be $n$ observations of a signal $X \sim N(0, P)$. The additive noise random variables $Z_1, Z_2, \ldots, Z_n$ are zero-mean jointly Gaussian random variables that are independent of $X$ and have correlation $E(Z_i Z_j) = N\, 2^{-|i-j|}$ for $1 \le i, j \le n$.

(a) Find the best MSE estimate of $X$ given $Y_1, Y_2, \ldots, Y_n$.

(b) Find the MSE of the estimate in part (a).

Hint: The coefficients of the best estimate are of the form $h^T = [a\; b\; b\; \cdots\; b\; b\; a]$.
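A numerical sketch for Additional Exercise 5, with arbitrary illustrative values of $P$, $N$, and $n$ (not part of the exercise). Since $X$ and $(Y_1, \ldots, Y_n)$ are jointly Gaussian, the best MSE estimate is linear, $\hat{X} = \Sigma_{XY}\Sigma_Y^{-1} Y$, so evaluating that formula also lets you see the $[a\; b\; \cdots\; b\; a]$ coefficient pattern claimed in the hint.

```python
import numpy as np

# Evaluate the MMSE estimator coefficients and MSE for hypothetical
# parameter values; the exercise asks for the answer in symbols.
P, N, n = 1.0, 0.5, 6                    # arbitrary illustrative parameters
i = np.arange(n)
Sigma_Z = N * 2.0 ** (-np.abs(i[:, None] - i[None, :]))  # E[Z_i Z_j] = N 2^{-|i-j|}
Sigma_Y = P * np.ones((n, n)) + Sigma_Z  # Y_i = X + Z_i, X independent of Z
Sigma_XY = P * np.ones(n)                # Cov(X, Y_i) = P for every i

h = np.linalg.solve(Sigma_Y, Sigma_XY)   # estimator coefficients
mse = P - Sigma_XY @ h
print("h   =", np.round(h, 4))           # exhibits the [a b ... b a] pattern
print("MSE =", mse)
```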