UCSD ECE 250                                                   Handout #4
Prof. Young-Han Kim                           Wednesday, June 6, 2018

Solutions to Exercise Set #7

1. Polya's urn. An urn initially has one red ball and one white ball. Let X_1 denote the name of the first ball drawn from the urn. Replace that ball and one like it. Let X_2 denote the name of the next ball drawn. Replace it and one like it. Continue, drawing and replacing.

(a) Argue that the probability of drawing k reds followed by n-k whites is

    (1/2)(2/3)···(k/(k+1)) · (1/(k+2))(2/(k+3))···((n-k)/(n+1)) = k!(n-k)!/(n+1)! = 1/((n+1) C(n,k)),

where C(n,k) denotes the binomial coefficient.

(b) Let P_n be the proportion of red balls in the urn after the nth drawing. Argue that

    Pr{P_n = k/(n+2)} = 1/(n+1) for k = 1, 2, ..., n+1.

Thus all proportions are equally probable. This shows that P_n tends to a uniformly distributed random variable in distribution, i.e.,

    lim_{n→∞} Pr{P_n ≤ t} = t,  0 ≤ t ≤ 1.

(c) What can you say about the behavior of the proportion P_n if you started initially with one red ball and two white balls in the urn? Specifically, what is the limiting distribution of P_n? Can you show

    Pr{P_n = k/(n+3)} = 2(n+2-k)/((n+1)(n+2)) for k = 1, 2, ..., n+1?

Solution:

(a) Let r be the number of red balls, w the number of white balls, and t = r + w the total number of balls in the urn. The composition of the urn before each drawing is given below; the ratio r/t or w/t gives the probability for the next drawing.
    1st drawing       r/t = 1/2
    2nd drawing       r/t = 2/3
    3rd drawing       r/t = 3/4
    ...
    kth drawing       r/t = k/(k+1)
    (k+1)th drawing   w/t = 1/(k+2)
    (k+2)th drawing   w/t = 2/(k+3)
    (k+3)th drawing   w/t = 3/(k+4)
    ...
    nth drawing       w/t = (n-k)/(n+1)

The probability of drawing k reds followed by n-k whites is therefore

    (1/2)(2/3)···(k/(k+1)) · (1/(k+2))(2/(k+3))···((n-k)/(n+1)) = k!(n-k)!/(n+1)! = 1/((n+1) C(n,k)).

In fact, regardless of the ordering, the probability of any particular sequence with k reds in n draws is given by this same expression. Note that after the nth drawing there are k+1 red balls and n-k+1 white balls, for a total of n+2 balls in the urn.

(b) After the nth drawing, where k red balls have been drawn, the proportion P_n of red balls is

    P_n = (number of red balls in the urn)/(total number of balls in the urn) = (k+1)/(n+2).

There are C(n,k) orderings of outcomes with k red balls and n-k white balls, and each ordering has the same probability: in the expression for the probability of drawing k reds followed by n-k whites, a different sequence of drawings merely permutes the numerators. Therefore

    Pr{P_n = (k+1)/(n+2)} = C(n,k) · 1/((n+1) C(n,k)) = 1/(n+1)

for k = 0, 1, ..., n. All the proportions are equally probable (the probability does not depend on k), and P_n tends to a uniformly distributed random variable Unif[0,1] in distribution.
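A quick Monte Carlo check of part (b), not part of the original handout (the function name and parameters below are illustrative):

```python
import random
from collections import Counter

def polya_proportion(n, reds=1, whites=1):
    """Run n Polya-urn drawings and return the final proportion of red balls."""
    r, w = reds, whites
    for _ in range(n):
        if random.random() < r / (r + w):
            r += 1   # drew red: replace it and add one more red
        else:
            w += 1   # drew white: replace it and add one more white
    return r / (r + w)

random.seed(1)
n, trials = 10, 100_000
counts = Counter(polya_proportion(n) for _ in range(trials))

# After n = 10 drawings the urn holds n + 2 = 12 balls, so P_n takes the
# values k/12, k = 1, ..., 11, each with probability 1/(n+1) = 1/11 = 0.0909...
for k in range(1, n + 2):
    print(f"P_n = {k:2d}/12: {counts[k / 12] / trials:.4f}")
```

Each of the 11 possible proportions shows up about 9% of the time, matching the 1/(n+1) claim.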
(c) If there are one red ball and two white balls to start with:

    1st drawing       r/t = 1/3
    2nd drawing       r/t = 2/4
    3rd drawing       r/t = 3/5
    ...
    kth drawing       r/t = k/(k+2)
    (k+1)th drawing   w/t = 2/(k+3)
    (k+2)th drawing   w/t = 3/(k+4)
    (k+3)th drawing   w/t = 4/(k+5)
    ...
    nth drawing       w/t = (n-k+1)/(n+2)

The probability of drawing k reds followed by n-k whites is

    (1/3)(2/4)···(k/(k+2)) · (2/(k+3))(3/(k+4))···((n-k+1)/(n+2)) = 2 k!(n-k+1)!/(n+2)!.

After the nth drawing,

    P_n = (number of red balls in the urn)/(total number of balls in the urn) = (k+1)/(n+3).

Similarly as in part (b),

    Pr{P_n = (k+1)/(n+3)} = C(n,k) · 2 k!(n-k+1)!/(n+2)!
                          = 2 n! k!(n-k+1)! / (k!(n-k)!(n+2)!)
                          = 2(n-k+1)/((n+1)(n+2)).

The distribution is linearly decreasing in p = (k+1)/(n+3):

    Pr{P_n = p} = -(2(n+3)/((n+1)(n+2))) p + 2/(n+1).

Therefore the limiting distribution of P_n as n goes to infinity is a triangular distribution with density

    f_{P_n}(p) = 2(1-p),  0 ≤ p ≤ 1.
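Since all the factorials involved are small integers, the pmf in part (c) can be verified exactly rather than by simulation. A short sketch (the helper name `pmf_red_proportion` is made up for illustration):

```python
from math import comb, factorial

def pmf_red_proportion(n):
    """Exact pmf of the number k of red draws in n Polya drawings,
    starting from one red and two white balls."""
    probs = []
    for k in range(n + 1):
        # probability of one particular order with k reds and n-k whites,
        # times the C(n, k) equally likely orders
        one_order = 2 * factorial(k) * factorial(n - k + 1) / factorial(n + 2)
        probs.append(comb(n, k) * one_order)
    return probs

n = 10
probs = pmf_red_proportion(n)
closed_form = [2 * (n - k + 1) / ((n + 1) * (n + 2)) for k in range(n + 1)]
print(max(abs(a - b) for a, b in zip(probs, closed_form)))  # ~0 (float noise)
print(sum(probs))  # 1.0 up to rounding
```

The pmf agrees with the closed form 2(n-k+1)/((n+1)(n+2)) and is strictly decreasing in k, as the triangular limit suggests.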
Note: the Polya urn for this problem generates a process that is identical to the following mixture of Bernoullis. Let Θ have density f_Θ(θ) = 2(1-θ), 0 ≤ θ < 1, and let X_1, X_2, ... be iid Bernoulli(Θ) given Θ. Then P_n converges to Θ as n tends to infinity. Considering the general case of λ_1 red balls and λ_2 white balls to start with, the limiting distribution is

    f_Θ(θ) = C(λ_1, λ_2) θ^{λ_1 - 1} (1-θ)^{λ_2 - 1},

where C(λ_1, λ_2) is a normalizing constant depending on λ_1 and λ_2 (this is the Beta(λ_1, λ_2) density).

2. Symmetric random walk. Let X_n be a random walk defined by

    X_0 = 0,  X_n = Σ_{i=1}^n Z_i,

where Z_1, Z_2, ... are iid with P{Z_i = 1} = P{Z_i = -1} = 1/2.

(a) Find P{X_10 = 10}.
(b) Approximate P{-10 ≤ X_100 ≤ 10} using the central limit theorem.
(c) Find P{X_n = k}.

Solution:

(a) Since the event {X_10 = 10} is equivalent to {Z_1 = Z_2 = ··· = Z_10 = 1}, we have P{X_10 = 10} = 2^{-10}.

(b) Since E(Z_j) = 0 and E(Z_j^2) = 1, by the central limit theorem,

    P{-10 ≤ X_100 ≤ 10} = P{ |(1/10) Σ_{i=1}^{100} Z_i| ≤ 1 } ≈ 1 - 2Q(1) = 2Φ(1) - 1 ≈ 0.68.

(c) P{X_n = k} = P{(n+k)/2 heads in n independent fair coin tosses}
             = C(n, (n+k)/2) 2^{-n}  for -n ≤ k ≤ n with n + k even.
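The closed forms above are easy to check numerically. The sketch below implements the pmf from part (c) (the function name is illustrative) and compares the exact probability in part (b) with the CLT approximation:

```python
from math import comb

def pmf_X(n, k):
    """P{X_n = k} for the symmetric random walk: C(n, (n+k)/2) * 2**-n."""
    if abs(k) > n or (n + k) % 2 != 0:
        return 0.0
    return comb(n, (n + k) // 2) * 2.0 ** (-n)

# part (a): P{X_10 = 10} = 2**-10
print(pmf_X(10, 10))   # 0.0009765625

# part (b): exact P{-10 <= X_100 <= 10} versus the CLT value 2*Phi(1) - 1 = 0.68
exact = sum(pmf_X(100, k) for k in range(-10, 11))
print(exact)   # about 0.73
```

The plain CLT value 0.68 is a slight underestimate here; a continuity correction (using ±11 instead of ±10) brings the normal approximation much closer to the exact sum.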
3. Absolute-value random walk. Consider the symmetric random walk X_n in the previous problem. Define the absolute-value random process Y_n = |X_n|.

(a) Find P{Y_n = k}.
(b) Find P{max_{i<20} Y_i = 10 | Y_20 = 0}.

Solution:

(a) If k ≠ 0 then P{Y_n = k} = P{X_n = +k or X_n = -k}. If k > 0 then P{Y_n = k} = 2 P{X_n = k}, while P{Y_n = 0} = P{X_n = 0}. Thus

    P{Y_n = k} = { 2 C(n, (n+k)/2) 2^{-n},  k > 0, n + k even, n ≥ k,
                   C(n, n/2) 2^{-n},        k = 0, n even,
                   0,                       otherwise. }

(b) If Y_20 = |X_20| = 0, then there are only two sample paths with max_{i<20} |X_i| = 10, namely

    Z_1 = ··· = Z_10 = +1, Z_11 = ··· = Z_20 = -1,  or
    Z_1 = ··· = Z_10 = -1, Z_11 = ··· = Z_20 = +1.

Since the total number of sample paths with X_20 = 0 is C(20,10) and all paths are equally likely,

    P{max_{i<20} Y_i = 10 | Y_20 = 0} = 2/C(20,10) = 2/184756 = 1/92378.

4. Discrete-time Wiener process. Let Z_n, n ≥ 1, be a discrete-time white Gaussian noise (WGN) process, i.e., Z_1, Z_2, ... are iid N(0,1). Define the process X_n, n ≥ 0, as X_0 = 0 and X_n = X_{n-1} + Z_n for n ≥ 1.

(a) Is X_n an independent increment process? Justify your answer.
(b) Is X_n a Gaussian process? Justify your answer.
(c) Find the mean and autocorrelation functions of X_n.
(d) Specify the first-order pdf of X_n.
(e) Specify the joint pdf of X_3, X_5, and X_8.
(f) Find E(X_20 | X_1, X_2, ..., X_10).
(g) Given X_1 = 4, X_2 = 2, and 0 ≤ X_3 ≤ 4, find the minimum MSE estimate of X_4.

Solution:
(a) Yes. The increments X_{n_1}, X_{n_2} - X_{n_1}, ..., X_{n_k} - X_{n_{k-1}} are sums over disjoint sets of the Z_i random variables, and the Z_i are iid.

(b) Yes. Any set of samples of X_n, n ≥ 1, is obtained by a linear transformation of iid N(0,1) random variables, and therefore all nth-order distributions of X_n are jointly Gaussian. (It is not sufficient to show that the random variable X_n is Gaussian for each n.)

(c) We have

    E[X_n] = E[Σ_{i=1}^n Z_i] = Σ_{i=1}^n E[Z_i] = 0

and

    R_X(n_1, n_2) = E[X_{n_1} X_{n_2}]
                  = E[ (Σ_{i=1}^{n_1} Z_i)(Σ_{j=1}^{n_2} Z_j) ]
                  = Σ_{i=1}^{n_1} Σ_{j=1}^{n_2} E[Z_i Z_j]
                  = Σ_{i=1}^{min(n_1, n_2)} E[Z_i^2]     (iid, so E[Z_i Z_j] = 0 for i ≠ j)
                  = min(n_1, n_2).

(d) As shown above, X_n is Gaussian with mean zero and variance

    Var(X_n) = E[X_n^2] - E^2[X_n] = R_X(n, n) - 0 = n.

Thus X_n ~ N(0, n).

(e) Since

    Cov(X_{n_1}, X_{n_2}) = E(X_{n_1} X_{n_2}) - E(X_{n_1}) E(X_{n_2}) = min(n_1, n_2),

X_{n_1} and X_{n_2} are jointly Gaussian random variables with mean μ = [0 0]^T and covariance matrix

    Σ = [ n_1            min(n_1, n_2)
          min(n_1, n_2)  n_2           ].

More generally, X_n, n ≥ 1, is a zero-mean Gaussian random process. Thus

    [X_3, X_5, X_8]^T ~ N( [E[X_3], E[X_5], E[X_8]]^T,
                           [ R_X(3,3)  R_X(3,5)  R_X(3,8)
                             R_X(5,3)  R_X(5,5)  R_X(5,8)
                             R_X(8,3)  R_X(8,5)  R_X(8,8) ] ).
Substituting, we get

    [X_3, X_5, X_8]^T ~ N( [0, 0, 0]^T,
                           [ 3  3  3
                             3  5  5
                             3  5  8 ] ).

(f) Since X_n is an independent increment process,

    E(X_20 | X_1, X_2, ..., X_10) = E(X_20 - X_10 + X_10 | X_1, X_2, ..., X_10)
                                  = E(X_20 - X_10 | X_1, X_2, ..., X_10) + E(X_10 | X_1, X_2, ..., X_10)
                                  = E(X_20 - X_10) + X_10
                                  = 0 + X_10 = X_10.

(g) The MMSE estimate of X_4 given X_1 = x_1, X_2 = x_2, and a ≤ X_3 ≤ b equals E[X_4 | X_1 = x_1, X_2 = x_2, a ≤ X_3 ≤ b]. Thus the MMSE estimate is given by

    E[X_4 | X_1 = x_1, X_2 = x_2, a ≤ X_3 ≤ b]
      = E[X_2 + Z_3 + Z_4 | X_1 = x_1, X_2 = x_2, a ≤ X_2 + Z_3 ≤ b]
      = x_2 + E[Z_3 | a - x_2 ≤ Z_3 ≤ b - x_2] + E[Z_4]
        (since Z_3 is independent of (X_1, X_2), and Z_4 is independent of (X_1, X_2, Z_3))
      = x_2 + E[Z_3 | Z_3 ∈ [a - x_2, b - x_2]].

Plugging in the values x_2 = 2, a = 0, b = 4, the required MMSE estimate is

    X̂_4 = 2 + E[Z_3 | Z_3 ∈ [-2, 2]].

We have

    f_{Z_3 | Z_3 ∈ [-2,2]}(z_3) = f_{Z_3}(z_3) / P{Z_3 ∈ [-2, 2]} for z_3 ∈ [-2, 2], and 0 otherwise,

which is symmetric about z_3 = 0. Thus E[Z_3 | Z_3 ∈ [-2, 2]] = 0, and we have X̂_4 = 2.

5. A random process. Let X_n = Z_n + Z_{n-1} for n ≥ 1, where Z_0, Z_1, Z_2, ... are iid N(0,1).

(a) Find the mean and autocorrelation function of {X_n}.
(b) Is {X_n} Gaussian? Justify your answer.
(c) Find E(X_3 | X_1, X_2).
(d) Find E(X_3 | X_2).
(e) Is {X_n} Markov? Justify your answer.
(f) Is {X_n} independent increment? Justify your answer.

Solution:

(a) E(X_n) = E(Z_n) + E(Z_{n-1}) = 0 and

    R_X(m, n) = E(X_m X_n) = E[(Z_m + Z_{m-1})(Z_n + Z_{n-1})]
              = { E[Z_n^2] + E[Z_{n-1}^2] = 2,  n = m,
                  E[Z_{min(m,n)}^2] = 1,        |n - m| = 1,
                  0,                            otherwise. }

(b) Since (X_1, ..., X_n) is a linear transform of the Gaussian random vector (Z_0, Z_1, ..., Z_n), the process is Gaussian.

(c) Since the process is Gaussian, the conditional expectation (MMSE estimate) is linear. Hence

    E(X_3 | X_1, X_2) = [0 1] [2 1; 1 2]^{-1} [X_1; X_2] = (1/3)(2X_2 - X_1).

(d) Similarly, E(X_3 | X_2) = (1/2) X_2.

(e) Since E(X_3 | X_1, X_2) ≠ E(X_3 | X_2), the process is not Markov.

(f) Since the process is not Markov, it is not independent increment.

6. Autoregressive process. Let X_0 = 0 and X_n = (1/2)X_{n-1} + Z_n for n ≥ 1, where Z_1, Z_2, ... are iid N(0,1). Find the mean and autocorrelation function of X_n.
Solution:

    E[X_n] = (1/2)E[X_{n-1}] + E[Z_n] = (1/2)E[X_{n-1}] = ··· = (1/2)^{n-1} E[X_1] = (1/2)^{n-1} E[Z_1] = 0.

For n > m we can write

    X_n = (1/2)^{n-m} X_m + (1/2)^{n-m-1} Z_{m+1} + ··· + (1/2)Z_{n-1} + Z_n
        = (1/2)^{n-m} X_m + Σ_{i=0}^{n-m-1} (1/2)^i Z_{n-i}.

Therefore

    R_X(n, m) = E(X_n X_m) = (1/2)^{n-m} E[X_m^2],

since X_m and Z_{n-i}, i = 0, ..., n-m-1, are independent and E[X_m] = E[Z_{n-i}] = 0. To find E[X_m^2], consider

    E[X_1^2] = 1,  E[X_2^2] = (1/4)E[X_1^2] + E[Z_2^2] = 1/4 + 1,  ...

Thus in general

    E[X_n^2] = (1/4)^{n-1} + ··· + 1/4 + 1 = (4/3)(1 - (1/4)^n),

and

    R_X(n, m) = E(X_n X_m) = 2^{-|n-m|} (4/3)(1 - (1/4)^{min{n,m}}).

7. Moving average process. Let Z_0, Z_1, Z_2, ... be iid N(0,1).

(a) Let X_n = Z_n + (1/2)Z_{n-1} for n ≥ 1. Find the mean and autocorrelation function of X_n.
(b) Is {X_n} wide-sense stationary? Justify your answer.
(c) Is {X_n} Gaussian? Justify your answer.
(d) Is {X_n} strict-sense stationary? Justify your answer.
(e) Find E(X_3 | X_1, X_2).
(f) Find E(X_3 | X_2).
(g) Is {X_n} Markov? Justify your answer.
(h) Is {X_n} independent increment? Justify your answer.
(i) Let Y_n = Z_n + Z_{n-1} for n ≥ 1. Find the mean and autocorrelation function of Y_n.
(j) Is {Y_n} wide-sense stationary? Justify your answer.
(k) Is {Y_n} Gaussian? Justify your answer.
(l) Is {Y_n} strict-sense stationary? Justify your answer.
(m) Find E(Y_3 | Y_1, Y_2).
(n) Find E(Y_3 | Y_2).
(o) Is {Y_n} Markov? Justify your answer.
(p) Is {Y_n} independent increment? Justify your answer.

Solution:

(a) E(X_n) = E(Z_n) + (1/2)E(Z_{n-1}) = 0 and

    R_X(m, n) = E(X_m X_n) = E[(Z_m + (1/2)Z_{m-1})(Z_n + (1/2)Z_{n-1})]
              = { E[Z_n^2] + (1/4)E[Z_{n-1}^2] = 5/4,  n = m,
                  (1/2)E[Z_{min(m,n)}^2] = 1/2,        |n - m| = 1,
                  0,                                   otherwise. }

(b) Since the mean and autocorrelation functions are time invariant, the process is WSS.

(c) Since (X_1, ..., X_n) is a linear transform of the Gaussian random vector (Z_0, Z_1, ..., Z_n), the process is Gaussian.

(d) Since the process is WSS and Gaussian, it is SSS.

(e) Since the process is Gaussian, the conditional expectation (MMSE estimate) is linear. Hence

    E(X_3 | X_1, X_2) = [0 1/2] [5/4 1/2; 1/2 5/4]^{-1} [X_1; X_2] = (1/21)(10X_2 - 4X_1).

(f) Similarly, E(X_3 | X_2) = ((1/2)/(5/4)) X_2 = (2/5) X_2.
(g) Since E(X_3 | X_1, X_2) ≠ E(X_3 | X_2), the process is not Markov.

(h) Since the process is not Markov, it is not independent increment.

(i) The mean function is μ_n = E(Y_n) = 0. The autocorrelation function is

    R_Y(n, m) = { 2,  n = m,
                  1,  n = m - 1 or n = m + 1,
                  0,  otherwise. }

(j) Since the mean and autocorrelation functions are time invariant, the process is WSS.

(k) Since (Y_1, ..., Y_n) is a linear transform of the Gaussian random vector (Z_0, Z_1, ..., Z_n), the process is Gaussian.

(l) Since the process is WSS and Gaussian, it is SSS.

(m) Since the process is Gaussian, the conditional expectation (MMSE estimate) is linear. Hence

    E(Y_3 | Y_1, Y_2) = [0 1] [2 1; 1 2]^{-1} [Y_1; Y_2] = (1/3)(2Y_2 - Y_1).

(n) Similarly, E(Y_3 | Y_2) = (1/2) Y_2.

(o) Since E(Y_3 | Y_1, Y_2) ≠ E(Y_3 | Y_2), the process is not Markov.

(p) Since the process is not Markov, it is not independent increment.

8. Gauss-Markov process. Let X_0 = 0 and X_n = (1/2)X_{n-1} + Z_n for n ≥ 1, where Z_1, Z_2, ... are iid N(0,1). Find the mean and autocorrelation function of X_n.

Solution: Using the method of Lecture Notes #8, we can easily verify that E(X_n) = 0 for every n and that

    R_X(n, m) = E(X_n X_m) = 2^{-|n-m|} (4/3)(1 - (1/4)^{min{n,m}}).

9. QAM random process. Consider the random process

    X(t) = Z_1 cos ωt + Z_2 sin ωt,  -∞ < t < ∞,

where Z_1 and Z_2 are iid discrete random variables such that p_{Z_i}(+1) = p_{Z_i}(-1) = 1/2.

(a) Is X(t) wide-sense stationary? Justify your answer.
(b) Is X(t) strict-sense stationary? Justify your answer.
Solution:

(a) We first check the mean:

    E(X(t)) = E(Z_1) cos ωt + E(Z_2) sin ωt = 0 · cos ωt + 0 · sin ωt = 0.

The mean is independent of t. Next we consider the autocorrelation function. Since E(Z_1 Z_2) = E(Z_1)E(Z_2) = 0 kills the cross terms,

    E(X(t+τ)X(t)) = E[(Z_1 cos(ω(t+τ)) + Z_2 sin(ω(t+τ)))(Z_1 cos ωt + Z_2 sin ωt)]
                  = E(Z_1^2) cos(ω(t+τ)) cos ωt + E(Z_2^2) sin(ω(t+τ)) sin ωt
                  = cos(ω(t+τ)) cos ωt + sin(ω(t+τ)) sin ωt
                  = cos(ω(t+τ) - ωt)
                  = cos ωτ.

The autocorrelation function is also time invariant. Therefore X(t) is WSS.

(b) Note that X(0) = Z_1 cos 0 + Z_2 sin 0 = Z_1, so X(0) has the same pmf as Z_1, taking the values ±1. On the other hand,

    X(π/(4ω)) = Z_1 cos(π/4) + Z_2 sin(π/4) = (1/√2)(Z_1 + Z_2)
              = {  √2  w.p. 1/4,
                   0   w.p. 1/2,
                  -√2  w.p. 1/4. }

This shows that X(π/(4ω)) does not have the same pmf (or even the same range) as X(0). Therefore X(t) is not first-order stationary and as a result is not SSS.

10. Mixture of two WSS processes. Let X(t) and Y(t) be two zero-mean WSS processes with autocorrelation functions R_X(τ) and R_Y(τ), respectively. Define the process

    Z(t) = { X(t)  with probability 1/2,
             Y(t)  with probability 1/2. }

Find the mean and autocorrelation functions for Z(t). Is Z(t) a WSS process? Justify your answer.

Solution: To show that Z(t) is WSS, we show that its mean and autocorrelation
functions are time invariant. Consider

    μ_Z(t) = E[Z(t)] = E[Z(t) | Z = X] P{Z = X} + E[Z(t) | Z = Y] P{Z = Y} = (1/2)(μ_X + μ_Y) = 0,

and similarly

    R_Z(t+τ, t) = E[Z(t+τ)Z(t)] = (1/2)(R_X(τ) + R_Y(τ)).

Since μ_Z(t) is independent of time and R_Z(t+τ, t) depends only on τ, Z(t) is WSS.
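A simulation sketch of this result, with hypothetical concrete choices of the component processes (an iid ±1 process for X and a unit-variance Gaussian MA(1) process for Y; these choices are illustrative, not from the problem):

```python
import random

random.seed(2)
N, trials = 64, 20_000
t = N // 2        # a fixed time index; any choice works if Z is WSS
r0 = r1 = 0.0     # running sums for the R_Z(0) and R_Z(1) estimates

for _ in range(trials):
    # the coin flip between X and Y is made once per realization
    if random.random() < 0.5:
        # Z = X: iid +/-1 signs, so R_X(0) = 1 and R_X(1) = 0
        z = [random.choice((-1.0, 1.0)) for _ in range(N + 1)]
    else:
        # Z = Y: Gaussian MA(1), Y_n = (W_n + W_{n+1})/sqrt(2),
        # so R_Y(0) = 1 and R_Y(1) = 1/2
        w = [random.gauss(0.0, 1.0) for _ in range(N + 2)]
        z = [(w[i] + w[i + 1]) / 2 ** 0.5 for i in range(N + 1)]
    r0 += z[t] * z[t]
    r1 += z[t + 1] * z[t]

print(r0 / trials)   # close to (R_X(0) + R_Y(0))/2 = 1
print(r1 / trials)   # close to (R_X(1) + R_Y(1))/2 = 1/4
```

Note that the estimate averages over realizations of Z rather than over time within one realization: the choice between X and Y is made once per realization, which is why a time average along a single sample path would converge to R_X or R_Y rather than to their average.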