UCSD ECE250 Handout #27
Prof. Young-Han Kim
Friday, June 8, 2018

Practice Final Examination (Winter 2017)

There are 6 problems, each problem with multiple parts. Your answer should be as clear and readable as possible. Please justify any claim that you make.

1. Conditional expectations (40 pts). Let X and Y be i.i.d. Exp(1) random variables. Let Z = X + Y and W = X − Y.

(a) Find the joint pdf f_{X,Z}(x,z) of X and Z.
(b) Find the joint pdf f_{Z,W}(z,w) of Z and W.
(c) Find E[Z | X].
(d) Find E[X | Z].

2. MMSE estimation (30 pts). Let X ~ Exp(1) and Y = min{X, 1}.

(a) Find E[Y].
(b) Find the estimate X̂ = g(Y) of X given Y that minimizes the mean square error E[(X − X̂)²] = E[(X − g(Y))²], and plot g(y) as a function of y.
(c) Find the mean square error of the estimate found in part (b).

3. Is the grass always greener on the other side? (30 pts). Let X and Y be two i.i.d. continuous nonnegative random variables with invertible common cdf F, i.e., P{X ≤ x} = P{Y ≤ x} = F(x).

(a) Find P{X > Y} and P{X < Y}.

Suppose now that we observe the value of X and make a decision on whether X is larger or smaller than Y.
(b) Find the optimal decision rule d(x) that minimizes the error probability. Your answer should be in terms of the common cdf F.
(c) Find the probability of error for the decision rule found in part (b).

4. Sampled Wiener process (60 pts). Let {W(t), t ≥ 0} be the standard Brownian motion. For n = 1, 2, ..., let

    X_n = √n · W(1/n).

(a) Find the mean and autocorrelation functions of {X_n}.
(b) Is {X_n} WSS? Justify your answer.
(c) Is {X_n} independent-increment? Justify your answer.
(d) Is {X_n} Markov? Justify your answer.
(e) Is {X_n} Gaussian? Justify your answer.
(f) For n = 1, 2, ..., let S_n = X_n/n. Find the limit in probability

    lim_{n→∞} S_n.

5. Poisson process (40 pts). Let {N(t), t ≥ 0} be a Poisson process with arrival rate λ > 0. Let s ≤ t.

(a) Find the conditional pmf of N(t) given N(s).
(b) Find E[N(t) | N(s)] and its pmf.
(c) Find the conditional pmf of N(s) given N(t).
(d) Find E[N(s) | N(t)] and its pmf.

6. Hidden Markov process (60 pts). Let X_0 ~ N(0, σ²) and

    X_n = (1/2) X_{n−1} + Z_n for n ≥ 1,

where Z_1, Z_2, ... are i.i.d. N(0,1), independent of X_0. Let Y_n = X_n + V_n, where the V_n are i.i.d. N(0,1), independent of {X_n}.
(a) Find the variance σ² such that {X_n} and {Y_n} are jointly WSS.

Under the value of σ² found in part (a), answer the following.

(b) Find R_Y(n).
(c) Find R_{XY}(n).
(d) Find the MMSE estimate of X_n given Y_n.
(e) Find the MMSE estimate of X_n given (Y_{n−1}, Y_n).
(f) Find the MMSE estimate of X_n given (Y_n, Y_{n+1}).
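(Not part of the exam.) Answers to decision problems like Problem 3 can be sanity-checked by simulation. The sketch below takes F to be the Exp(1) cdf as a concrete instance (an assumption; any continuous invertible cdf works) and estimates the error probability of the threshold rule that declares "X > Y" whenever F(X) ≥ 1/2, which the analysis predicts to be E[min(F(X), 1 − F(X))] = 1/4.

```python
import math
import random

random.seed(0)

trials = 200_000
errors = 0
for _ in range(trials):
    x = random.expovariate(1.0)      # X ~ Exp(1)
    y = random.expovariate(1.0)      # Y ~ Exp(1), independent of X
    F_x = 1.0 - math.exp(-x)         # common cdf F evaluated at the observed X
    decide_x_larger = F_x >= 0.5     # rule: declare "X > Y" iff F(X) >= 1/2
    if decide_x_larger != (x > y):
        errors += 1

err = errors / trials
print(err)   # should be close to 1/4
```

Since F(X) is Uniform(0,1) for any continuous F, the simulated error rate does not actually depend on the choice Exp(1).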
Practice Final Examination (Winter 2016)

There are 4 problems, each problem with multiple parts, each part worth 10 points. Your answer should be as clear and readable as possible. Justify any claim that you make.

1. Additive exponential noise channel (60 pts). A device has two equally likely states S = 0 and S = 1. When it is inactive (S = 0), it transmits X = 0. When it is active (S = 1), it transmits X ~ Exp(1). Now suppose the signal is observed through the additive exponential noise channel with output Y = X + Z, where Z ~ Exp(2) is independent of (X, S). One wishes to decide whether the device is active or not.

(a) Find f_{Y|S}(y|0).
(b) Find f_{Y|S}(y|1).
(c) Find f_Y(y).
(d) Find p_{S|Y}(0|y) and p_{S|Y}(1|y).
(e) Find the decision rule d(y) that minimizes the probability of error P(S ≠ d(Y)).
(f) Find the corresponding probability of error.

(Hint: Recall that Z ~ Exp(λ) means that its pdf is f_Z(z) = λe^{−λz}, z ≥ 0.)

2. Brownian bridge (40 pts). Let {W(t)}_{t≥0} be the standard Brownian motion (Wiener process). Recall that the process is independent-increment with W(0) = 0 and W(t) − W(s) ~ N(0, t − s) for 0 ≤ s < t. In the following, we investigate several properties of the process conditioned on {W(1) = 0}.

(a) Find the conditional distribution of W(1/2) given W(1) = 0.
(b) Find E[W(t) | W(1) = 0] for t ∈ [0,1].
(c) Find E[(W(t))² | W(1) = 0] for t ∈ [0,1].
(d) Find E[W(t_1)W(t_2) | W(1) = 0] for t_1, t_2 ∈ [0,1].

3. Convergence of random processes (30 pts). Let {N(t)}_{t≥0} be a Poisson process with rate λ. Recall that the process is independent-increment and N(t) − N(s), 0 ≤ s < t, has the pmf

    p_{N(t)−N(s)}(n) = e^{−λ(t−s)} (λ(t−s))^n / n!,  n = 0, 1, ....

Define

    M(t) = N(t)/t,  t > 0.

(a) Find the mean and autocorrelation function of {M(t)}_{t>0}.
(b) Does {M(t)}_{t>0} converge in mean square as t → ∞, that is,

    lim_{t→∞} E[(M(t) − M)²] = 0

for some random variable (or constant) M? If so, what is the limit?

Now consider

    L(t) = (1/t) ∫_0^t (N(s)/s) ds,  t > 0.

(c) Does {L(t)}_{t>0} converge in mean square as t → ∞? If so, what is the limit?

(Hint: ∫(1/x) dx = ln x + C, ∫ ln x dx = x ln x − x + C, and lim_{x→0} x ln x = 0.)

4. Random binary waveform (40 pts). Let {N(t)}_{t≥0} be a Poisson process with rate λ, and let Z be independent of {N(t)} with P(Z = 1) = P(Z = −1) = 1/2. Define

    X(t) = Z · (−1)^{N(t)},  t ≥ 0.

(a) Find the mean and autocorrelation function of {X(t)}_{t≥0}.
(b) Is {X(t)}_{t≥0} wide-sense stationary?
(c) Find the first-order pmf p_{X(t)}(x) = P(X(t) = x).
(d) Find the second-order pmf p_{X(t_1),X(t_2)}(x_1, x_2) = P(X(t_1) = x_1, X(t_2) = x_2).

(Hint: Σ_{k even} x^k/k! = (e^x + e^{−x})/2 and Σ_{k odd} x^k/k! = (e^x − e^{−x})/2.)
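(Not part of the exam.) The autocorrelation of the random binary waveform in Problem 4 can be checked by simulation. The sketch below samples Poisson counts by counting Exp(1) interarrivals (to stay within the standard library), with λ, t_1, t_2 chosen arbitrarily, and estimates E[X(t_1)X(t_2)], which the binomial-sum hint predicts to be e^{−2λ|t_2 − t_1|}.

```python
import math
import random

random.seed(1)

def poisson(mean):
    # Count Exp(1) interarrival times whose running sum stays within [0, mean];
    # the count is Poisson(mean).
    k, s = 0, random.expovariate(1.0)
    while s <= mean:
        k += 1
        s += random.expovariate(1.0)
    return k

lam, t1, t2 = 1.0, 0.5, 1.0          # rate and two sample times (arbitrary choices)
trials = 200_000
acc = 0.0
for _ in range(trials):
    z = random.choice([-1, 1])                  # Z = +/-1 with probability 1/2 each
    n1 = poisson(lam * t1)                      # N(t1)
    n2 = n1 + poisson(lam * (t2 - t1))          # N(t2), via independent increments
    acc += (z * (-1) ** n1) * (z * (-1) ** n2)  # X(t1) * X(t2)

r_hat = acc / trials
print(r_hat)   # should be close to exp(-2*lam*(t2 - t1))
```

Note that Z² = 1 cancels, so only the parity of the increment N(t_2) − N(t_1) matters, which is why the answer depends only on |t_2 − t_1|.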
Practice Final Examination (Winter 2018)

There are 6 problems, each problem with multiple parts. Your answer should be as clear and readable as possible. Please justify any claim that you make.

1. Nonlinear and linear MMSE estimation (30 pts). Let X and Y be two random variables with joint pdf

    f_{X,Y}(x,y) = x + y for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.

(a) (10 points) Find the linear MMSE estimator of X given Y.
(b) (10 points) Find the corresponding MSE.
(c) (10 points) Find the MMSE estimator of X given Y. Is it the same as the linear MMSE estimator?

2. Convergence (30 pts). Consider the sequence of i.i.d. random variables X_1, X_2, ... with

    X_i = 0 w.p. 1/2 and X_i = 2 w.p. 1/2, for all i.

Define the sequence

    Y_n = 2X_n for all n, w.p. 1/3,
    Y_n = X_n² for all n, w.p. 1/3,
    Y_n = 0 for all n, w.p. 1/3.

Let

    M_n = (1/n) Σ_{i=1}^n Y_i.

(a) (10 points) Determine the probability mass function (pmf) of Y_n.
(b) (10 points) Determine the random variable (or constant) that M_n converges to (in probability) as n approaches infinity. Justify your answer.
(c) (10 points) Use the central limit theorem to estimate the probability that the random variable M_84 exceeds 2/3.

3. Poisson Process (40 pts). Let {N(t), t ≥ 0} be a Poisson process with arrival rate λ > 0.

(a) (10 points) Let T_M be the time of the M-th arrival. Find E[T_M] and Var(T_M).
(b) (10 points) Let s ≤ t. Assume k arrivals occur in t seconds, that is, N(t) = k. Show that the conditional distribution of N(s) given N(t) = k satisfies

    N(s) | {N(t) = k} ~ Binom(k, s/t).

(c) (10 points) Let s ≤ t. Determine the conditional expectation E[N(s) | N(t)] and give its probability mass function (pmf).
(d) (10 points) Assume N(t) = k. Determine the probability that all k arrivals occur in the first t/2 seconds.

4. Moving Average Process (40 pts). Let Z_0, Z_1, Z_2, ... be i.i.d. N(0,1). Let Y_n = Z_n + Z_{n−1} for n ≥ 1.

(a) (10 points) Find the mean function and autocorrelation function of {Y_n}.
(b) (5 points) Is {Y_n} wide-sense stationary? Justify your answer.
(c) (10 points) Is {Y_n} Gaussian? Justify your answer.
(d) (5 points) Is {Y_n} strict-sense stationary? Justify your answer.
(e) (10 points) Is {Y_n} Markov? Justify your answer. [Hint: Compare E(Y_3 | Y_1, Y_2) to E(Y_3 | Y_2).]

5. WSS process through linear filter (40 pts). Let Y(t) be a short-term integration of a WSS process X(t):

    Y(t) = (1/T) ∫_{t−T}^{t} X(u) du.
The frequency response H(f) of this linear integration system is

    H(f) = e^{−jπfT} sin(πfT) / (πfT).

Suppose the input X(t) has mean E[X(t)] and autocorrelation function

    R_X(τ) = 1 − |τ|/T for |τ| ≤ T, and 0 otherwise.

(a) (10 points) Determine the constant a such that E[Y(t)] = a E[X(t)].
(b) (10 points) Find S_Y(f).
(c) (10 points) Find R_Y(τ). (You can leave your answer in the form of a convolution.)
(d) (10 points) Determine explicitly the average power of the output E[Y²(t)].

(Hint: You may use the transform pair R_X(τ) ↔ T (sin(πfT)/(πfT))² and Fourier transform relationships from the tables provided.)

6. Optimal linear estimation (20 pts). Let X(t) be a zero-mean WSS process with autocorrelation function R_X(τ) = e^{−|τ|}.

(a) (10 points) Find the MMSE estimator for X(t) of the form

    X̂(t) = a X(t − t_1) + b X(t − t_2),

where t_1 = t_0 and t_2 = 2t_0 with t_0 > 0.
(b) (10 points) Find the MSE of this estimator.
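(Not part of the exam.) The coefficients in Problem 6 come from the orthogonality principle: the estimation error must be uncorrelated with both observations, which yields a 2×2 system of normal equations. The sketch below solves that system numerically for an arbitrarily chosen lag t_0 = 0.7; the exponential autocorrelation should force all weight onto the nearer sample.

```python
import math

# Orthogonality principle: the coefficients of xhat(t) = a*X(t - t0) + b*X(t - 2*t0)
# solve the normal equations
#   [ R(0)   R(t0) ] [a]   [ R(t0)  ]
#   [ R(t0)  R(0)  ] [b] = [ R(2t0) ]
# for R(tau) = exp(-|tau|).

t0 = 0.7                                  # arbitrary positive lag for the check
R = lambda tau: math.exp(-abs(tau))

a11, a12, b1 = R(0), R(t0), R(t0)
a21, a22, b2 = R(t0), R(0), R(2 * t0)
det = a11 * a22 - a12 * a21               # 2x2 solve by Cramer's rule
a = (b1 * a22 - a12 * b2) / det
b = (a11 * b2 - a21 * b1) / det
mse = R(0) - a * R(t0) - b * R(2 * t0)
print(a, b, mse)   # a = e^{-t0}, b = 0, mse = 1 - e^{-2*t0}
```

The result b = 0 reflects the Markov-like structure of R_X(τ) = e^{−|τ|}: given X(t − t_0), the older sample X(t − 2t_0) carries no additional linear information about X(t).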