UCSD ECE 153 Handout #46
Prof. Young-Han Kim Thursday, June 5, 2014

Solutions to Homework Set #8
(Prepared by TA Fatemeh Arbabjolfaei)

1. Discrete-time Wiener process. Let Z_n, n ≥ 1, be a discrete-time white Gaussian noise (WGN) process, i.e., Z_1, Z_2, ... are i.i.d. N(0,1). Define the process X_n, n ≥ 0, by X_0 = 0 and X_n = X_{n-1} + Z_n for n ≥ 1.

(a) Is X_n an independent increment process? Justify your answer.
(b) Is X_n a Markov process? Justify your answer.
(c) Is X_n a Gaussian process? Justify your answer.
(d) Find the mean and autocorrelation functions of X_n.
(e) Specify the first and second order pdfs of X_n.
(f) Specify the joint pdf of X_1, X_2, and X_3.
(g) Find E(X_20 | X_1, X_2, ..., X_10).

Solution:

(a) Yes. For n_1 < n_2 < ... < n_k, the increments X_{n_1}, X_{n_2} - X_{n_1}, ..., X_{n_k} - X_{n_{k-1}} are sums over disjoint sets of the Z_i random variables, and the Z_i are i.i.d.

(b) Yes. Since the process has independent increments, it is Markov.

(c) Yes. Any set of samples of X_n, n ≥ 1, is obtained by a linear transformation of i.i.d. N(0,1) random variables, and therefore all nth-order distributions of X_n are jointly Gaussian (it is not sufficient to show that the random variable X_n is Gaussian for each n).

(d) We have

E[X_n] = E[ sum_{i=1}^n Z_i ] = sum_{i=1}^n E[Z_i] = 0

and, since the Z_i are i.i.d. with E[Z_i Z_j] = 0 for i ≠ j and E[Z_i^2] = 1,

R_X(n_1, n_2) = E[X_{n_1} X_{n_2}]
= E[ (sum_{i=1}^{n_1} Z_i)(sum_{j=1}^{n_2} Z_j) ]
= sum_{i=1}^{n_1} sum_{j=1}^{n_2} E[Z_i Z_j]
= sum_{i=1}^{min(n_1, n_2)} E[Z_i^2]
= min(n_1, n_2).
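As a sanity check on part (d), the mean and autocorrelation can be estimated by Monte Carlo simulation. This is a sketch that is not part of the original handout; it assumes numpy is available.

```python
import numpy as np

# Monte Carlo check (not part of the handout) that the discrete-time Wiener
# process X_n = Z_1 + ... + Z_n with Z_i i.i.d. N(0, 1) has E[X_n] = 0 and
# R_X(n1, n2) = min(n1, n2).
rng = np.random.default_rng(0)
trials, n_max = 200_000, 10
Z = rng.standard_normal((trials, n_max))
X = np.cumsum(Z, axis=1)                 # X[:, n-1] holds X_n

n1, n2 = 4, 7
mean_est = X[:, n2 - 1].mean()
corr_est = (X[:, n1 - 1] * X[:, n2 - 1]).mean()
print(mean_est, corr_est)                # should be near 0 and min(4, 7) = 4
```

The choice n1 = 4, n2 = 7 is arbitrary; any pair gives an estimate near min(n1, n2).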
(e) As shown above, X_n is Gaussian with mean zero and variance Var(X_n) = E[X_n^2] - E^2[X_n] = R_X(n,n) - 0 = n. Thus the first-order pdf is that of X_n ~ N(0, n). Similarly,

Cov(X_{n_1}, X_{n_2}) = E(X_{n_1} X_{n_2}) - E(X_{n_1})E(X_{n_2}) = min(n_1, n_2),

so X_{n_1} and X_{n_2} are jointly Gaussian random variables with mean µ = [0 0]^T and covariance matrix

Σ = [ n_1            min(n_1, n_2)
      min(n_1, n_2)  n_2           ].

(f) X_n, n ≥ 1, is a zero-mean Gaussian random process. Thus

[X_1]       ( [E[X_1]]   [R_X(1,1) R_X(1,2) R_X(1,3)] )
[X_2]  ~  N ( [E[X_2]] , [R_X(2,1) R_X(2,2) R_X(2,3)] )
[X_3]       ( [E[X_3]]   [R_X(3,1) R_X(3,2) R_X(3,3)] )

Substituting, we get

[X_1]       ( [0]   [1 1 1] )
[X_2]  ~  N ( [0] , [1 2 2] )
[X_3]       ( [0]   [1 2 3] )

(g) Since X_n is an independent increment process,

E(X_20 | X_1, X_2, ..., X_10)
= E(X_20 - X_10 + X_10 | X_1, X_2, ..., X_10)
= E(X_20 - X_10 | X_1, X_2, ..., X_10) + E(X_10 | X_1, X_2, ..., X_10)
= E(X_20 - X_10) + X_10
= 0 + X_10 = X_10.

2. Arrow of time. Let X_0 be a Gaussian random variable with zero mean and unit variance, and let X_n = αX_{n-1} + Z_n for n ≥ 1, where α is a fixed constant with |α| < 1 and Z_1, Z_2, ... are i.i.d. N(0, 1 - α^2), independent of X_0.

(a) Is the process {X_n} Gaussian?
(b) Is {X_n} Markov?
(c) Find R_X(n, m).
(d) Find the (nonlinear) MMSE estimate of X_100 given (X_1, X_2, ..., X_99).
(e) Find the MMSE estimate of X_100 given (X_101, X_102, ..., X_199).
(f) Find the MMSE estimate of X_100 given (X_1, ..., X_99, X_101, ..., X_199).
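The MMSE estimates requested in (d)-(f) are worked out in the solution that follows; they can also be checked numerically. The sketch below (not part of the handout; numpy assumed) solves the normal equations for the two-sided case (f) and confirms that both weights equal α/(1 + α²).

```python
import numpy as np

# Check (not from the handout): for R_X(n, m) = alpha**|n - m|, the linear MMSE
# weights for estimating X_100 from (X_99, X_101) are both alpha/(1 + alpha**2).
alpha = 0.7                              # any |alpha| < 1 works
Sigma = np.array([[1.0, alpha**2],       # Cov of (X_99, X_101)
                  [alpha**2, 1.0]])
c = np.array([alpha, alpha])             # Cov of X_100 with (X_99, X_101)
w = np.linalg.solve(Sigma, c)            # normal equations
print(w, alpha / (1 + alpha**2))
```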
Solution:

(a) Yes, the process is Gaussian, since it is a linear transform of the white Gaussian process {Z_n} and the independent Gaussian initial condition X_0.

(b) Yes, the process is Markov, since X_n | {X_1 = x_1, ..., X_{n-1} = x_{n-1}} ~ N(αx_{n-1}, 1 - α^2), which depends only on x_{n-1}.

(c) First note that R_X(n,n) = α^2 R_X(n-1, n-1) + (1 - α^2) = 1 for all n. Since we can express

X_n = α^k X_{n-k} + α^{k-1} Z_{n-k+1} + ... + Z_n,

and X_{n-k} is independent of (Z_{n-k+1}, ..., Z_n), we have R_X(n, n-k) = E(X_n X_{n-k}) = α^k. Thus R_X(n, m) = α^{|n-m|}.

(d) Because the process is Gaussian, the MMSE estimate is linear. From Markovity,

X̂_100 = E(X_100 | X_1, ..., X_99) = E(X_100 | X_99) = (R_X(100, 99)/R_X(99, 99)) X_99 = αX_99.

(e) Again from Markovity and symmetry, (X_1, ..., X_n) has the same distribution as (X_n, ..., X_1); hence

X̂_100 = E(X_100 | X_101, ..., X_199) = E(X_100 | X_101) = αX_101.

(f) First note that X_100 is conditionally independent of (X_1, ..., X_98, X_102, ..., X_199) given (X_99, X_101). To see this, consider

f(x_100 | x_1, ..., x_99, x_101, ..., x_199)
= f(x_1, ..., x_199) / f(x_1, ..., x_99, x_101, ..., x_199)
= [ f(x_1) f(x_2|x_1) ... f(x_99|x_98) f(x_100|x_99) f(x_101|x_100) f(x_102|x_101) ... f(x_199|x_198) ]
  / [ f(x_1) f(x_2|x_1) ... f(x_99|x_98) f(x_101|x_99) f(x_102|x_101) ... f(x_199|x_198) ]
= f(x_100|x_99) f(x_101|x_100) / f(x_101|x_99)
= f(x_100, x_101 | x_99) / f(x_101|x_99)
= f(x_100 | x_99, x_101).

Thus

X̂_100 = E(X_100 | X_99, X_101)
= [R_X(100,99)  R_X(100,101)] [ R_X(99,99)  R_X(99,101) ; R_X(101,99)  R_X(101,101) ]^{-1} [X_99 ; X_101]
= [α  α] [ 1  α^2 ; α^2  1 ]^{-1} [X_99 ; X_101]
= (α/(1 + α^2)) (X_99 + X_101).

3. AM modulation. Consider the AM modulated random process

X(t) = A(t) cos(2πt + Θ),
where the amplitude A(t) is a zero-mean WSS process with autocorrelation function R_A(τ) = e^{-|τ|}, the phase Θ is a Unif[0, 2π) random variable, and A(t) and Θ are independent. Is X(t) a WSS process? Justify your answer.

Solution: X(t) is wide-sense stationary if E[X(t)] is independent of t and R_X(t_1, t_2) depends only on t_1 - t_2. Writing ω = 2π, consider

E[X(t)] = E[A(t) cos(ωt + Θ)]
= E[A(t)] E[cos(ωt + Θ)]   (by independence)
= 0,

and

R_X(t_1, t_2) = E[X(t_1)X(t_2)]
= E[A(t_1) cos(ωt_1 + Θ) A(t_2) cos(ωt_2 + Θ)]
= E[A(t_1)A(t_2)] E[cos(ωt_1 + Θ) cos(ωt_2 + Θ)]   (by independence)
= R_A(t_1 - t_2) E[ (1/2)( cos(ω(t_1 + t_2) + 2Θ) + cos(ω(t_1 - t_2)) ) ]
= (1/2) R_A(t_1 - t_2) ( cos(ω(t_1 + t_2)) E[cos(2Θ)] - sin(ω(t_1 + t_2)) E[sin(2Θ)] + cos(ω(t_1 - t_2)) )
= (1/2) R_A(t_1 - t_2) cos(ω(t_1 - t_2)),

since E[cos(2Θ)] = E[sin(2Θ)] = 0 for Θ ~ Unif[0, 2π). This is a function of t_1 - t_2 only. Hence X(t) is wide-sense stationary.
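The key step above is the phase average over Θ. It can be checked directly by averaging over a uniform grid of phases; a sketch (not part of the handout; numpy assumed), with the times t1, t2 chosen arbitrarily:

```python
import numpy as np

# Check (not from the handout) of the averaging step used above: for Theta
# uniform on [0, 2*pi),
#   E[cos(w*t1 + Theta) * cos(w*t2 + Theta)] = cos(w*(t1 - t2)) / 2.
w, t1, t2 = 2 * np.pi, 0.3, 1.1
theta = np.linspace(0, 2 * np.pi, 100_000, endpoint=False)
avg = np.mean(np.cos(w * t1 + theta) * np.cos(w * t2 + theta))
print(avg, 0.5 * np.cos(w * (t1 - t2)))
```

The uniform grid average is essentially exact here because the leftover term cos(ω(t1+t2) + 2Θ) averages to zero over full periods.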
4. LTI system with WSS process input. Let Y(t) = h(t) * X(t) and Z(t) = X(t) - Y(t), as shown in the figure.

(a) Find S_Z(f).
(b) Find E(Z^2(t)).

Your answers should be in terms of S_X(f) and the transfer function H(f) = F[h(t)].

[Figure 1: LTI system. X(t) is passed through the filter h(t) to produce Y(t), and Z(t) = X(t) - Y(t).]

Solution:

(a) To find S_Z(f), we first find the autocorrelation function

R_Z(τ) = E(Z(t)Z(t + τ))
= E((X(t) - Y(t))(X(t + τ) - Y(t + τ)))
= R_X(τ) + R_Y(τ) - R_YX(τ) - R_XY(τ)
= R_X(τ) + R_Y(τ) - R_XY(τ) - R_XY(-τ).

Now, taking the Fourier transform, we get

S_Z(f) = S_X(f) + S_Y(f) - S_XY(f) - S_XY(-f)
= S_X(f) + |H(f)|^2 S_X(f) - H(-f) S_X(f) - H(f) S_X(f)
= S_X(f) ( 1 + |H(f)|^2 - 2 Re[H(f)] )
= S_X(f) |1 - H(f)|^2.

(b) To find the average power of Z(t), we find the area under S_Z(f):

E(Z^2(t)) = ∫_{-∞}^{∞} |1 - H(f)|^2 S_X(f) df.

5. Echo filtering. A signal X(t) and its echo arrive at the receiver as Y(t) = X(t) + X(t - Δ) + Z(t). Here the signal X(t) is a zero-mean WSS process with power spectral density S_X(f), and the noise Z(t) is a zero-mean WSS process with power spectral density S_Z(f) = N_0/2, uncorrelated with X(t).

(a) Find S_Y(f) in terms of S_X(f), Δ, and N_0.
(b) Find the best linear filter to estimate X(t) from {Y(s), -∞ < s < ∞}.

Solution:
(a) We can write Y(t) = g(t) * X(t) + Z(t), where g(t) = δ(t) + δ(t - Δ). Thus

S_Y(f) = |G(f)|^2 S_X(f) + S_Z(f) = |1 + e^{-j2πfΔ}|^2 S_X(f) + N_0/2.

(b) Since S_YX(f) = (1 + e^{-j2πfΔ}) S_X(f), the best linear filter is X̂(t) = h(t) * Y(t), where h(t) has the transfer function

H(f) = S_XY(f)/S_Y(f) = S_YX(-f)/S_Y(f) = (1 + e^{j2πfΔ}) S_X(f) / ( |1 + e^{-j2πfΔ}|^2 S_X(f) + N_0/2 ).

6. Discrete-time LTI system with white noise input. Let {X_n : -∞ < n < ∞} be a discrete-time white noise process, i.e., E(X_n) = 0 for -∞ < n < ∞ and

R_X(n) = { 1, n = 0,
           0, otherwise.

The process is filtered using a linear time-invariant system with impulse response

h(n) = { α, n = 0,
         β, n = 1,
         0, otherwise.

Find α and β such that the output process Y_n has

R_Y(n) = { 2, n = 0,
           1, |n| = 1,
           0, otherwise.

Solution: We are given that R_X(n) is a discrete-time unit impulse. Therefore

R_Y(n) = h(n) * R_X(n) * h(-n) = h(n) * h(-n).

The impulse response h(n) is the sequence (α, β, 0, 0, ...), so the convolution with h(-n) has only finitely many nonzero terms:

R_Y(0) = 2 = α^2 + β^2,
R_Y(+1) = 1 = αβ,
R_Y(-1) = 1 = R_Y(+1).

This pair of equations has two solutions: α = 1 and β = 1, or α = -1 and β = -1.

7. Finding time of flight. Finding the distance to an object is often done by sending a signal and measuring the time of flight, i.e., the time it takes for the signal to return (assuming the speed of the signal, e.g., light, is known). Let X(t) be the signal sent and Y(t) = X(t - δ) + Z(t) be the
[Figure 2: Estimated crosscorrelation function R_YX(t), a pulse peaking at t = 15.]

signal received, where δ is the unknown time of flight. Assume that X(t) and Z(t) (the sensor noise) are uncorrelated zero-mean WSS processes. The estimated crosscorrelation function of Y(t) and X(t), R_YX(t), is shown in Figure 2. Find the time of flight δ.

Solution: Consider

R_YX(τ) = E(Y(t + τ)X(t)) = E((X(t - δ + τ) + Z(t + τ))X(t)) = R_X(τ - δ).

Now since the maximum of R_X(α) is achieved at α = 0, R_YX(τ) is maximized at τ = δ. By inspection of the given R_YX, the peak is at τ = 15. Thus δ = 15.
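The same peak-picking idea works in discrete time. This sketch (not part of the handout; numpy assumed, and the delay of 15 samples is an illustrative choice) estimates a delay from the empirical crosscorrelation of received and transmitted signals:

```python
import numpy as np

# Sketch (not from the handout): estimate a time of flight by locating the peak
# of the empirical crosscorrelation r_YX, as in the solution above.
rng = np.random.default_rng(2)
n, delay = 10_000, 15
x = rng.standard_normal(n)                            # transmitted signal
y = np.roll(x, delay) + 0.5 * rng.standard_normal(n)  # delayed echo + noise
lags = np.arange(0, 50)
r_yx = np.array([np.mean(y[lag:] * x[:n - lag]) for lag in lags])
print(lags[np.argmax(r_yx)])                          # estimated delay
```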
Additional Exercises

Do not turn in solutions to these problems.

1. Moving average process. Let Z_0, Z_1, Z_2, ... be i.i.d. N(0,1).

(a) Let X_n = Z_n + (1/2)Z_{n-1} for n ≥ 1. Find the mean and autocorrelation function of X_n.
(b) Is {X_n} wide-sense stationary? Justify your answer.
(c) Is {X_n} Gaussian? Justify your answer.
(d) Is {X_n} strict-sense stationary? Justify your answer.
(e) Find E(X_3 | X_1, X_2).
(f) Find E(X_3 | X_2).
(g) Is {X_n} Markov? Justify your answer.
(h) Is {X_n} independent increment? Justify your answer.
(i) Let Y_n = Z_n + Z_{n-1} for n ≥ 1. Find the mean and autocorrelation function of Y_n.
(j) Is {Y_n} wide-sense stationary? Justify your answer.
(k) Is {Y_n} Gaussian? Justify your answer.
(l) Is {Y_n} strict-sense stationary? Justify your answer.
(m) Find E(Y_3 | Y_1, Y_2).
(n) Find E(Y_3 | Y_2).
(o) Is {Y_n} Markov? Justify your answer.
(p) Is {Y_n} independent increment? Justify your answer.

Solution:

(a) E(X_n) = E(Z_n) + (1/2)E(Z_{n-1}) = 0, and

R_X(m,n) = E(X_m X_n)
= E[ (Z_m + (1/2)Z_{m-1})(Z_n + (1/2)Z_{n-1}) ]
= { E[Z_n^2] + (1/4)E[Z_{n-1}^2], n = m,
    (1/2)E[Z_{min(m,n)}^2],       |n - m| = 1,
    0,                            otherwise }
= { 5/4, n = m,
    1/2, |n - m| = 1,
    0,   otherwise }.
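The values R_X(n, n) = 5/4 and R_X(n, n+1) = 1/2 derived in (a) are easy to confirm by simulation; a sketch (not part of the handout; numpy assumed):

```python
import numpy as np

# Monte Carlo check (not from the handout) that X_n = Z_n + Z_{n-1}/2 has
# R_X(n, n) = 5/4 and R_X(n, n+1) = 1/2.
rng = np.random.default_rng(6)
trials = 200_000
Z = rng.standard_normal((trials, 3))     # columns: Z_0, Z_1, Z_2
X1 = Z[:, 1] + 0.5 * Z[:, 0]             # X_1
X2 = Z[:, 2] + 0.5 * Z[:, 1]             # X_2
r0 = (X1 * X1).mean()
r1 = (X1 * X2).mean()
print(r0, r1)                            # near 5/4 and 1/2
```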
(b) Since the mean and autocorrelation functions are time-invariant, the process is WSS.

(c) Since (X_1, ..., X_n) is a linear transform of the Gaussian random vector (Z_0, Z_1, ..., Z_n), the process is Gaussian.

(d) Since the process is WSS and Gaussian, it is SSS.

(e) Since the process is Gaussian, the conditional expectation (MMSE estimate) is linear. Hence

E(X_3 | X_1, X_2) = [1/2  0] [ 5/4  1/2 ; 1/2  5/4 ]^{-1} [X_2 ; X_1] = (1/21)(10X_2 - 4X_1).

(f) Similarly, E(X_3 | X_2) = (2/5)X_2.

(g) Since E(X_3 | X_1, X_2) ≠ E(X_3 | X_2), the process is not Markov.

(h) Since the process is not Markov, it is not independent increment.

(i) The mean function is µ_n = E(Y_n) = 0, and the autocorrelation function is

R_Y(n,m) = { 2, n = m,
             1, |n - m| = 1,
             0, otherwise }.

(j) Since the mean and autocorrelation functions are time-invariant, the process is WSS.

(k) Since (Y_1, ..., Y_n) is a linear transform of the Gaussian random vector (Z_0, Z_1, ..., Z_n), the process is Gaussian.

(l) Since the process is WSS and Gaussian, it is SSS.

(m) Since the process is Gaussian, the conditional expectation (MMSE estimate) is linear. Hence

E(Y_3 | Y_1, Y_2) = [1  0] [ 2  1 ; 1  2 ]^{-1} [Y_2 ; Y_1] = (1/3)(2Y_2 - Y_1).

(n) Similarly, E(Y_3 | Y_2) = (1/2)Y_2.

(o) Since E(Y_3 | Y_1, Y_2) ≠ E(Y_3 | Y_2), the process is not Markov.

(p) Since the process is not Markov, it is not independent increment.

2. Gauss-Markov process. Let X_0 = 0 and X_n = (1/2)X_{n-1} + Z_n for n ≥ 1, where Z_1, Z_2, ... are i.i.d. N(0,1). Find the mean and autocorrelation function of X_n.

Solution: Using the method of Lecture Notes 8, we can easily verify that E(X_n) = 0 for every n and that

R_X(n,m) = E(X_n X_m) = (1/2)^{|n-m|} (4/3) [ 1 - (1/4)^{min{n,m}} ].

3. Random binary waveform. In a digital communication channel the symbol 1 is represented by the fixed-duration rectangular pulse

g(t) = { 1, 0 ≤ t < 1,
         0, otherwise },
and the symbol 0 is represented by -g(t). The data transmitted over the channel is represented by the random process

X(t) = sum_{k=0}^{∞} A_k g(t - k), for t ≥ 0,

where A_0, A_1, ... are i.i.d. random variables with

A_i = { +1 w.p. 1/2,
        -1 w.p. 1/2 }.

(a) Find its first and second order pmfs.
(b) Find the mean and the autocorrelation function of the process X(t).

Solution:

(a) The first-order pmf is

p_{X(t)}(x) = P(X(t) = x)
= P( sum_{k=0}^{∞} A_k g(t - k) = x )
= P( A_{⌊t⌋} = x )
= { 1/2, x = ±1,
    0,   otherwise }.

Now note that X(t_1) and X(t_2) are dependent only if t_1 and t_2 fall within the same unit interval, i.e., ⌊t_1⌋ = ⌊t_2⌋; otherwise they are independent. Thus the second-order pmf is

p_{X(t_1)X(t_2)}(x,y) = P(X(t_1) = x, X(t_2) = y)
= P( A_{⌊t_1⌋} = x, A_{⌊t_2⌋} = y )
= { 1/2, ⌊t_1⌋ = ⌊t_2⌋ and (x,y) = (1,1) or (-1,-1),
    1/4, ⌊t_1⌋ ≠ ⌊t_2⌋ and (x,y) ∈ {(1,1), (1,-1), (-1,1), (-1,-1)},
    0,   otherwise }.

(b) For t ≥ 0,

E[X(t)] = E[ sum_{k=0}^{∞} A_k g(t - k) ] = sum_{k=0}^{∞} g(t - k) E[A_k] = 0.
For the autocorrelation R_X(t_1, t_2), we note once again that X(t_1) and X(t_2) are dependent only if t_1 and t_2 fall within the same unit interval; if they do not, they are independent. Then

R_X(t_1, t_2) = E[X(t_1)X(t_2)] = sum_{k=0}^{∞} g(t_1 - k) g(t_2 - k) E[A_k^2]
= { 1, ⌊t_1⌋ = ⌊t_2⌋,
    0, otherwise }.

4. Absolute-value random walk. Let X_n be a random walk defined by

X_0 = 0,
X_n = sum_{i=1}^{n} Z_i, n ≥ 1,

where {Z_i} is an i.i.d. process with P(Z_i = -1) = P(Z_i = +1) = 1/2. Define the absolute-value random process Y_n = |X_n|.

(a) Find P{Y_n = k}.
(b) Find P{max_{1 ≤ i < 20} Y_i = 10 | Y_20 = 0}.
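Before the solution, part (a) can be previewed by simulation. The answer derived below is P{Y_n = k} = 2 C(n, (n+k)/2) 2^{-n} for k > 0 with n - k even; this sketch (not part of the handout; numpy assumed) spot-checks one case:

```python
import numpy as np
from math import comb

# Monte Carlo spot-check (not from the handout) of P{Y_n = k} for the
# absolute-value random walk, at n = 6, k = 2: the formula gives
# 2 * C(6, 4) / 2**6 = 30/64.
rng = np.random.default_rng(8)
trials, n, k = 200_000, 6, 2
Z = rng.choice([-1, 1], size=(trials, n))
Y = np.abs(Z.cumsum(axis=1))             # Y_1, ..., Y_n per trial
est = np.mean(Y[:, -1] == k)
print(est, 2 * comb(n, (n + k) // 2) / 2**n)
```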
Solution:

(a) This is a straightforward calculation using results from the lecture notes. If k > 0 then P{Y_n = k} = P{X_n = +k or X_n = -k} = 2P{X_n = k}, while P{Y_n = 0} = P{X_n = 0}. Thus

P{Y_n = k} = { 2 C(n, (n+k)/2) 2^{-n}, k > 0, n - k even, n ≥ k ≥ 0,
               C(n, n/2) 2^{-n},       k = 0, n even, n ≥ 0,
               0,                      otherwise },

where C(n, m) denotes the binomial coefficient.

(b) If Y_20 = |X_20| = 0, then there are only two sample paths with max_{1 ≤ i < 20} Y_i = 10, namely

Z_1 = Z_2 = ... = Z_10 = +1, Z_11 = ... = Z_20 = -1, or
Z_1 = Z_2 = ... = Z_10 = -1, Z_11 = ... = Z_20 = +1.

Since the total number of sample paths with X_20 = 0 is C(20, 10) and all such paths are equally likely,

P{ max_{1 ≤ i < 20} Y_i = 10 | Y_20 = 0 } = 2 / C(20, 10) = 2/184756 = 1/92378.

5. Mixture of two WSS processes. Let X(t) and Y(t) be two zero-mean WSS processes with autocorrelation functions R_X(τ) and R_Y(τ), respectively. Define the process

Z(t) = { X(t), with probability 1/2,
         Y(t), with probability 1/2 }.

Find the mean and autocorrelation functions of Z(t). Is Z(t) a WSS process? Justify your answer.

Solution: To show that Z(t) is WSS, we show that its mean and autocorrelation functions are time-invariant. Consider

µ_Z(t) = E[Z(t)] = E(Z | Z = X) P{Z = X} + E(Z | Z = Y) P{Z = Y} = (1/2)(µ_X + µ_Y) = 0,

and similarly

R_Z(t + τ, t) = E[Z(t + τ)Z(t)] = (1/2)(R_X(τ) + R_Y(τ)).

Since µ_Z(t) is independent of time and R_Z(t + τ, t) depends only on τ, Z(t) is WSS.
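The two-path count in problem 4(b) above can also be verified exactly by dynamic programming, counting 20-step walks that end at 0 and never reach level ±10 before step 20; a sketch (not part of the handout):

```python
from math import comb

# Exact check (not from the handout) of problem 4(b): among the C(20, 10) walks
# ending at 0, exactly 2 have max_{1 <= i < 20} |X_i| = 10.
n, barrier = 20, 10
counts = {0: 1}                          # walks per current position
for step in range(1, n + 1):
    nxt = {}
    for pos, c in counts.items():
        for move in (-1, 1):
            q = pos + move
            if step < n and abs(q) >= barrier:
                continue                 # exclude walks hitting +-10 early
            nxt[q] = nxt.get(q, 0) + c
    counts = nxt
stay = counts.get(0, 0)                  # end at 0, never hit +-10 before step 20
total = comb(n, n // 2)                  # all walks ending at 0
print(total, total - stay)               # 184756 and the 2 extremal paths
```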
6. Stationary Gauss-Markov process. Consider the following variation on the Gauss-Markov process discussed in Lecture Notes 8:

X_0 ~ N(0, a),
X_n = (1/2)X_{n-1} + Z_n, n ≥ 1,

where Z_1, Z_2, Z_3, ... are i.i.d. N(0,1), independent of X_0.

(a) Find a such that X_n is stationary. Find the mean and autocorrelation functions of X_n.
(b) (Optional.) Consider the sample mean S_n = (1/n) sum_{i=1}^n X_i, n ≥ 1. Show that S_n converges to the process mean in probability even though the sequence X_n is not i.i.d. (A stationary process for which the sample mean converges to the process mean is called mean ergodic.)

Solution:

(a) We are asked to find a such that E(X_n) is independent of n and R_X(n_1, n_2) depends only on n_1 - n_2. For X_n to be stationary, E(X_n^2) must be independent of n. Thus

E(X_n^2) = (1/4)E(X_{n-1}^2) + E(Z_n^2) + E(X_{n-1})E(Z_n) = (1/4)E(X_{n-1}^2) + 1.

Setting E(X_n^2) = E(X_{n-1}^2) yields a = E(X_0^2) = E(X_n^2) = 4/3. Using the method of Lecture Notes 8, we can easily verify that E(X_n) = 0 for every n and that

R_X(n_1, n_2) = E(X_{n_1} X_{n_2}) = (4/3) (1/2)^{|n_1 - n_2|}.

(b) To prove convergence in probability, we first prove convergence in mean square and then use the fact that mean square convergence implies convergence in probability. First,

E(S_n) = E( (1/n) sum_{i=1}^n X_i ) = (1/n) sum_{i=1}^n E(X_i) = (1/n) · 0 = 0.

To show convergence in mean square we show that Var(S_n) → 0 as n → ∞. Since E(X_i) = 0,

Var(S_n) = E( ( (1/n) sum_{i=1}^n X_i )^2 )
= (1/n^2) sum_{i=1}^n sum_{j=1}^n R_X(i,j)
= (4/(3n^2)) [ n + 2 sum_{i=1}^{n-1} (n - i)(1/2)^i ]
≤ (4/(3n^2)) ( n + 2n sum_{i=1}^{∞} (1/2)^i )
= (4/(3n^2)) (3n) = 4/n.

Thus S_n converges to the process mean in mean square, and hence in probability, even though the sequence is not i.i.d.
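Both claims in this problem, the stationary variance a = 4/3 and the Var(S_n) ≤ 4/n bound, can be checked by simulation; a sketch (not part of the handout; numpy assumed, n = 64 chosen arbitrarily):

```python
import numpy as np

# Simulation check (not from the handout) of the stationary Gauss-Markov
# process: Var(X_n) should stay at 4/3, and Var(S_n) should satisfy the
# 4/n bound derived above.
rng = np.random.default_rng(3)
trials, n = 100_000, 64
X = np.empty((trials, n))
X[:, 0] = rng.normal(0.0, np.sqrt(4 / 3), size=trials)   # X_0 ~ N(0, 4/3)
for k in range(1, n):
    X[:, k] = 0.5 * X[:, k - 1] + rng.standard_normal(trials)
S = X.mean(axis=1)                       # sample mean over one path
print(X[:, -1].var(), S.var(), 4 / n)    # Var(X_n), Var(S_n), the bound
```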
7. QAM random process. Consider the random process

X(t) = Z_1 cos ωt + Z_2 sin ωt, -∞ < t < ∞,

where Z_1 and Z_2 are i.i.d. discrete random variables such that p_{Z_i}(+1) = p_{Z_i}(-1) = 1/2.

(a) Is X(t) wide-sense stationary? Justify your answer.
(b) Is X(t) strict-sense stationary? Justify your answer.

Solution:

(a) We first check the mean:

E(X(t)) = E(Z_1) cos ωt + E(Z_2) sin ωt = 0 · cos(ωt) + 0 · sin(ωt) = 0.

The mean is independent of t. Next we consider the autocorrelation function. Since E(Z_1 Z_2) = E(Z_1)E(Z_2) = 0,

E(X(t + τ)X(t)) = E( (Z_1 cos(ω(t + τ)) + Z_2 sin(ω(t + τ))) (Z_1 cos(ωt) + Z_2 sin(ωt)) )
= E(Z_1^2) cos(ω(t + τ)) cos(ωt) + E(Z_2^2) sin(ω(t + τ)) sin(ωt)
= cos(ω(t + τ)) cos(ωt) + sin(ω(t + τ)) sin(ωt)
= cos(ω(t + τ) - ωt) = cos ωτ.

The autocorrelation function is also time-invariant. Therefore X(t) is WSS.

(b) Note that X(0) = Z_1 cos 0 + Z_2 sin 0 = Z_1, so X(0) has the same pmf as Z_1. On the other hand,

X(π/(4ω)) = Z_1 cos(π/4) + Z_2 sin(π/4) = (Z_1 + Z_2)/√2,

so

p_{X(π/(4ω))}(x) = { 1/4, x = ±√2,
                     1/2, x = 0,
                     0,   otherwise }.

This shows that X(π/(4ω)) does not have the same pmf, or even the same range, as X(0). Therefore X(t) is not first-order stationary and as a result is not SSS.

8. Finding impulse response of LTI system. To find the impulse response h(t) of an LTI system (e.g., a concert hall), i.e., to identify the system, white noise X(t), -∞ < t < ∞, is applied to its input and the output Y(t) is measured. Given the input and output sample functions, the crosscorrelation R_YX(τ) is estimated. Show how R_YX(τ) can be used to find h(t).

Solution: Since white noise has a flat psd, the cross-power spectral density of the input X(t) and the output Y(t) is just the transfer function of the system scaled by the psd of the white noise:

S_YX(f) = H(f) S_X(f) = H(f) N_0/2,

so

R_YX(τ) = F^{-1}(S_YX(f)) = (N_0/2) h(τ).
Thus, to estimate the impulse response of a linear time-invariant system, we apply white noise to its input, estimate the crosscorrelation function of its input and output, and scale it by 2/N_0.
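A discrete-time analogue of this identification procedure is easy to demonstrate; the sketch below (not part of the handout; numpy assumed, and the four-tap filter is an arbitrary stand-in for the unknown system) recovers the impulse response from the input-output crosscorrelation of a unit-variance white-noise input:

```python
import numpy as np

# Sketch (not from the handout): with unit-variance white-noise input x, the
# crosscorrelation r_YX(k) = E[y_{t+k} x_t] recovers the impulse response h(k).
rng = np.random.default_rng(5)
h = np.array([1.0, 0.5, 0.25, 0.125])    # "unknown" system (illustrative choice)
n = 200_000
x = rng.standard_normal(n)               # white-noise probe
y = np.convolve(x, h)[:n]                # measured output
r_yx = np.array([np.mean(y[k:] * x[:n - k]) for k in range(len(h))])
print(r_yx)                              # estimates of h(0), ..., h(3)
```

In the continuous-time problem the estimate would additionally be scaled by 2/N_0; here the probe already has unit power, so no scaling is needed.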