UCSD ECE 250 Handout #20
Prof. Young-Han Kim Monday, February 26, 2007

Solutions to Exercise Set #7

1. Minimum waiting time. Let X_1, X_2, ... be i.i.d. exponentially distributed random variables with parameter λ, i.e., f_{X_i}(x) = λe^{−λx} for x ≥ 0.

(a) Does Y_n = min{X_1, X_2, ..., X_n} converge in probability as n approaches infinity?
(b) If it converges, what is the limit?
(c) What about Z_n = nY_n?

(a) For any realization of the X_i's, the sequence Y_n is monotonically decreasing in n. Since the random variables are nonnegative, it is reasonable to guess that Y_n converges to 0. Now Y_n converges in probability to 0 if and only if for every ε > 0,

    lim_{n→∞} P{|Y_n| > ε} = 0.

We have

    P{|Y_n| > ε} = P{Y_n > ε}
                 = P{X_1 > ε, X_2 > ε, ..., X_n > ε}
                 = P{X_1 > ε} P{X_2 > ε} ··· P{X_n > ε}
                 = (1 − F_X(ε))^n
                 = (1 − (1 − e^{−λε}))^n
                 = e^{−nλε}.

As n goes to infinity (for any fixed ε > 0) this converges to zero. Therefore Y_n converges to 0 in probability.

(b) The limit to which Y_n converges in probability is 0.

(c) Does Z_n = nY_n converge to 0 in probability? No, it does not. In fact,

    P{|Z_n| > ε} = P{nY_n > ε} = P{Y_n > ε/n} = e^{−nλ(ε/n)} = e^{−λε},

which does not depend on n. So Z_n does not converge to 0 in probability. Note that the distribution of Z_n is exponential with parameter λ, the same as the distribution of each X_i:

    F_{Z_n}(z) = P{Z_n ≤ z} = 1 − e^{−λz}.
In conclusion, if the X_i are i.i.d. Exp(λ), then

    Y_n = min_{1≤i≤n} X_i ~ Exp(nλ)  and  Z_n = nY_n ~ Exp(λ).

Thus

    P{Y_n > ε} = e^{−nλε} → 0, so Y_n → 0 in probability,

but

    P{Z_n > ε} = e^{−λε}, which does not tend to 0, so Z_n does not converge to 0 in probability.

2. Roundoff errors. The sum of a list of 100 real numbers is to be computed. Suppose that these numbers are rounded off to the nearest integer so that each number has an error that is uniformly distributed in the interval (−0.5, 0.5). Use the central limit theorem to estimate the probability that the total error in the sum of the 100 numbers exceeds 6.

The errors are independent and uniformly distributed in (−0.5, 0.5), i.e., e_i ~ Unif(−0.5, 0.5). Using the central limit theorem, the total error e = Σ_{i=1}^{100} e_i can be approximated as a Gaussian random variable with mean and variance

    E(e) = Σ_{i=1}^{100} E(e_i) = 0,
    σ_e^2 = Σ_{i=1}^{100} Var(e_i) = 100/12 ≈ 8.33.

Therefore the probability of the given event is

    P{e > 6} = P{ (e − E(e))/σ_e > 6/√8.33 } = Q(6/√8.33) ≈ 0.0188.

If the error is interpreted as an absolute difference, then P{|e| > 6} = 2 P{e > 6} ≈ 0.0376.
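Both conclusions are easy to sanity-check by Monte Carlo. The sketch below is illustrative only: the rate λ = 2, the sample sizes, and the seed are arbitrary choices, not part of the handout.

```python
import random

rng = random.Random(7)

# Problem 1: Z_n = n * min(X_1, ..., X_n) should be Exp(lam), like each X_i.
lam, n, trials = 2.0, 50, 10_000
z = [n * min(rng.expovariate(lam) for _ in range(n)) for _ in range(trials)]
z_mean = sum(z) / trials                     # Exp(lam) has mean 1/lam = 0.5
z_tail = sum(v > 1.0 for v in z) / trials    # P{Z_n > 1} = e^{-lam} ~ 0.135

# Problem 2: P{sum of 100 Unif(-0.5, 0.5) errors > 6} ~ Q(2.08) ~ 0.0188.
trials2 = 30_000
hits = sum(
    sum(rng.uniform(-0.5, 0.5) for _ in range(100)) > 6.0
    for _ in range(trials2)
)
p_err = hits / trials2
```

With these sample sizes the empirical values should land close to 0.5, e^{−2} ≈ 0.135, and 0.0188 respectively.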
3. The signal received over a wireless communication channel can be represented by the two sums

    X_n = (1/√n) Σ_{j=1}^n Z_j cos Θ_j  and  X̃_n = (1/√n) Σ_{j=1}^n Z_j sin Θ_j,

where Z_1, Z_2, ... are i.i.d. with mean µ and variance σ^2, and Θ_1, Θ_2, ... are i.i.d. Unif[0, 2π], independent of Z_1, Z_2, .... Find the distribution of [X_n, X̃_n]^T as n approaches infinity.

Since Z_1, Z_2, ... are i.i.d., and Θ_1, Θ_2, ... are i.i.d. and independent of Z_1, Z_2, ..., the random vectors

    Y_j = [Z_j cos Θ_j, Z_j sin Θ_j]^T,  j = 1, 2, ...,

are i.i.d. We have

    E[Z_j cos Θ_j] = E[Z_j] E[cos Θ_j] = µ · (1/2π) ∫_0^{2π} cos θ dθ = 0.

Similarly, E[Z_j sin Θ_j] = 0. Thus E[Y_j] = 0. Moreover, we have

    E[(Z_j cos Θ_j)^2] = E[Z_j^2] E[cos^2 Θ_j] = (µ^2 + σ^2) · (1/2π) ∫_0^{2π} cos^2 θ dθ = (µ^2 + σ^2)/2.

Similarly, E[(Z_j sin Θ_j)^2] = (µ^2 + σ^2)/2. Also,

    E[(Z_j cos Θ_j)(Z_j sin Θ_j)] = E[Z_j^2] E[cos Θ_j sin Θ_j] = (µ^2 + σ^2) · (1/2π) ∫_0^{2π} cos θ sin θ dθ = 0.
Thus,

    Cov(Y_j) = E[Y_j Y_j^T] − E[Y_j] E[Y_j]^T = E[Y_j Y_j^T]
             = [ E[(Z_j cos Θ_j)^2]              E[(Z_j cos Θ_j)(Z_j sin Θ_j)] ]
               [ E[(Z_j cos Θ_j)(Z_j sin Θ_j)]   E[(Z_j sin Θ_j)^2]            ]
             = ((µ^2 + σ^2)/2) I.

Now [X_n, X̃_n]^T = (1/√n) Σ_{j=1}^n Y_j, and thus, by the central limit theorem, as n → ∞,

    [X_n, X̃_n]^T → N(0, ((µ^2 + σ^2)/2) I)

in distribution.

4. Polya urn. An urn initially has one red ball and one white ball. Let X_1 denote the name of the first ball drawn from the urn. Replace that ball and one like it. Let X_2 denote the name of the next ball drawn. Replace it and one like it. Continue, drawing and replacing.

(a) Argue that the probability of drawing k reds followed by n − k whites is

    (1/2)(2/3)···(k/(k+1)) · (1/(k+2))(2/(k+3))···((n−k)/(n+1)) = k!(n−k)!/(n+1)! = 1/((n+1) C(n,k)).

(b) Let P_n be the proportion of red balls in the urn after the nth drawing. Argue that

    Pr{P_n = k/(n+2)} = 1/(n+1),  for k = 1, 2, ..., n+1.

Thus all proportions are equally probable. This shows that P_n tends to a uniformly distributed random variable in distribution, i.e.,

    lim_{n→∞} Pr{P_n ≤ t} = t,  0 ≤ t ≤ 1.

(c) What can you say about the behavior of the proportion P_n if you started initially with one red ball and two white balls in the urn? Specifically, what is the limiting distribution of P_n? Can you show

    Pr{P_n = (k+1)/(n+3)} = 2(n−k+1)/((n+1)(n+2)),  for k = 0, 1, ..., n?

(a) Let r be the number of red balls, w the number of white balls, and t = r + w the total number of balls in the urn. The state of the urn before each drawing is given below; the ratio r/t or w/t gives the probability for the next drawing.
    1st drawing:      r/t = 1/2
    2nd drawing:      r/t = 2/3
    3rd drawing:      r/t = 3/4
      ...
    kth drawing:      r/t = k/(k+1)
    (k+1)th drawing:  w/t = 1/(k+2)
    (k+2)th drawing:  w/t = 2/(k+3)
    (k+3)th drawing:  w/t = 3/(k+4)
      ...
    nth drawing:      w/t = (n−k)/(n+1)

The probability of drawing k reds followed by n − k whites is

    (1/2)(2/3)···(k/(k+1)) · (1/(k+2))(2/(k+3))···((n−k)/(n+1)) = k!(n−k)!/(n+1)! = 1/((n+1) C(n,k)).

In fact, regardless of the ordering, the probability of drawing k reds in n draws is given by this expression as well. Note that after the nth drawing, there are k + 1 red balls and n − k + 1 white balls, for a total of n + 2 balls in the urn.

(b) After the nth drawing, where k red balls have been drawn, the proportion P_n of red balls is

    P_n = (number of red balls in the urn)/(total number of balls in the urn) = (k+1)/(n+2).

There are C(n,k) orderings of outcomes with k red balls and n − k white balls, and each ordering has the same probability: in the expression for the probability of drawing k reds followed by n − k whites, a different sequence of drawings merely permutes the numerators. Therefore

    Pr{P_n = (k+1)/(n+2)} = C(n,k) · 1/((n+1) C(n,k)) = 1/(n+1)

for k = 0, 1, ..., n. All the proportions are equally probable (the probability does not depend on k), and P_n tends to a uniformly distributed random variable Unif[0,1] in distribution.

(c) If there are one red ball and two white balls to start with:
    1st drawing:      r/t = 1/3
    2nd drawing:      r/t = 2/4
    3rd drawing:      r/t = 3/5
      ...
    kth drawing:      r/t = k/(k+2)
    (k+1)th drawing:  w/t = 2/(k+3)
    (k+2)th drawing:  w/t = 3/(k+4)
    (k+3)th drawing:  w/t = 4/(k+5)
      ...
    nth drawing:      w/t = (n−k+1)/(n+2)

The probability of drawing k reds followed by n − k whites is

    (1/3)(2/4)···(k/(k+2)) · (2/(k+3))(3/(k+4))···((n−k+1)/(n+2)) = 2 k!(n−k+1)!/(n+2)!.

After the nth drawing,

    P_n = (number of red balls in the urn)/(total number of balls in the urn) = (k+1)/(n+3).

Similarly as in part (b),

    Pr{P_n = (k+1)/(n+3)} = C(n,k) · 2 k!(n−k+1)!/(n+2)!
                          = 2 (n!/(k!(n−k)!)) · k!(n−k+1)!/(n+2)!
                          = 2(n−k+1)/((n+1)(n+2)).

The distribution is linearly decreasing in p = (k+1)/(n+3):

    Pr{P_n = p} = (2(n+3)/((n+1)(n+2))) (−p + (n+2)/(n+3)).

Therefore the limiting distribution of P_n as n goes to infinity is a triangular distribution with density

    f_{P_n}(p) = 2(1 − p),  0 ≤ p ≤ 1.

Note: the Polya urn for this problem generates a process that is identical to the following mixture of Bernoullis:

    Θ ~ f_Θ(θ) = 2(1 − θ),  0 ≤ θ < 1.

Let X_i ~ Bernoulli(Θ); then P_n converges to Θ as n tends to infinity. Considering the general case of λ_1 red balls and λ_2 white balls to start with, the limiting distribution is

    f_Θ(θ) = C(λ_1, λ_2) θ^{λ_1 − 1} (1 − θ)^{λ_2 − 1},

where C(λ_1, λ_2) is a function of λ_1 and λ_2.
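Both limiting distributions can be checked with a small simulation. This is a sketch; the number of draws, trial count, and seed are arbitrary illustration choices.

```python
import random

def polya_proportion(n_draws, reds, whites, rng):
    """Run one Polya urn: draw a ball, replace it together with one more
    of the same color, and return the proportion of red balls at the end."""
    r, w = reds, whites
    for _ in range(n_draws):
        if rng.random() < r / (r + w):
            r += 1
        else:
            w += 1
    return r / (r + w)

rng = random.Random(0)
trials, n_draws = 5_000, 200

# Start (1 red, 1 white): P_n -> Unif[0,1], so the mean should be near 1/2.
u = [polya_proportion(n_draws, 1, 1, rng) for _ in range(trials)]
u_mean = sum(u) / trials

# Start (1 red, 2 white): limiting density 2(1 - p), so the mean should be
# near 1/3 and P{P_n < 1/2} near 3/4.
t = [polya_proportion(n_draws, 1, 2, rng) for _ in range(trials)]
t_mean = sum(t) / trials
t_below_half = sum(p < 0.5 for p in t) / trials
```

The means match the theory exactly even for finite n, since P_n is a martingale in both cases.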
5. Symmetric random walk. Let X_n be a random walk defined by

    X_0 = 0,  X_n = Σ_{i=1}^n Z_i,

where Z_1, Z_2, ... are i.i.d. with P{Z = 1} = P{Z = −1} = 1/2.

(a) Find P{X_10 = 10}.
(b) Approximate P{−10 ≤ X_100 ≤ 10} using the central limit theorem.
(c) Find P{X_n = k}.

(a) Since the event {X_10 = 10} is equivalent to {Z_1 = Z_2 = ··· = Z_10 = 1}, we have P{X_10 = 10} = 2^{−10}.

(b) Since E(Z_i) = 0 and E(Z_i^2) = 1, by the central limit theorem,

    P{−10 ≤ X_100 ≤ 10} = P{ −10 ≤ Σ_{i=1}^{100} Z_i ≤ 10 } ≈ 1 − 2Q(10/√100) = 2Φ(1) − 1 ≈ 0.68.

(c) P{X_n = k} = P{(n+k)/2 heads in n independent fair coin tosses} = C(n, (n+k)/2) 2^{−n}

for −n ≤ k ≤ n with n + k even.

6. Absolute-value random walk. Consider the symmetric random walk X_n in the previous problem. Define the absolute-value random process Y_n = |X_n|.

(a) Find P{Y_n = k}.
(b) Find P{max_{1≤i<20} Y_i = 10 | Y_20 = 0}.

(a) If k ≥ 0 then P{Y_n = k} = P{X_n = +k or X_n = −k}. If k > 0 then P{Y_n = k} = 2 P{X_n = k}, while P{Y_n = 0} = P{X_n = 0}. Thus

    P{Y_n = k} = { C(n, (n+k)/2) 2^{−n+1},  k > 0, n + k even, n ≥ k
                 { C(n, n/2) 2^{−n},        k = 0, n even
                 { 0,                       otherwise.
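The closed-form pmfs in 5(c) and 6(a) are easy to tabulate exactly with binomial coefficients. The function names below are mine, not from the handout; the exact sum behind 5(b) comes out a bit larger than the plain CLT figure 0.68 because of the discreteness of X_100 (a continuity correction narrows the gap).

```python
from math import comb

def walk_pmf(n, k):
    """P{X_n = k} for the symmetric random walk of Problem 5(c)."""
    if abs(k) > n or (n + k) % 2 != 0:
        return 0.0
    return comb(n, (n + k) // 2) / 2**n

def abs_walk_pmf(n, k):
    """P{Y_n = k} = P{|X_n| = k} from Problem 6(a)."""
    if k < 0:
        return 0.0
    return walk_pmf(n, k) if k == 0 else 2 * walk_pmf(n, k)

p_a = walk_pmf(10, 10)                               # Problem 5(a): 2**-10
p_b = sum(walk_pmf(100, k) for k in range(-10, 11))  # exact value behind 5(b)
```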
(b) If Y_20 = |X_20| = 0, then there are only two sample paths with max_{1≤i<20} Y_i = 10, namely

    Z_1 = ··· = Z_10 = +1, Z_11 = ··· = Z_20 = −1  or  Z_1 = ··· = Z_10 = −1, Z_11 = ··· = Z_20 = +1.

Since the total number of sample paths with X_20 = 0 is C(20,10) and all such paths are equally likely,

    P{ max_{1≤i<20} Y_i = 10 | Y_20 = 0 } = 2/C(20,10) = 2/184756 = 1/92378.

7. Discrete-time Wiener process. Let Z_n, n ≥ 1, be a discrete-time white Gaussian noise (WGN) process, i.e., Z_1, Z_2, ... are i.i.d. N(0,1). Define the process X_n, n ≥ 0, as X_0 = 0 and X_n = X_{n−1} + Z_n for n ≥ 1.

(a) Is X_n an independent increment process? Justify your answer.
(b) Is X_n a Gaussian process? Justify your answer.
(c) Find the mean and autocorrelation functions of X_n.
(d) Specify the first-order pdf of X_n.
(e) Specify the joint pdf of X_3, X_5, and X_8.
(f) Find E(X_20 | X_1, X_2, ..., X_10).
(g) Given X_1 = 4, X_2 = 2, and 0 ≤ X_3 ≤ 4, find the minimum MSE estimate of X_4.

(a) Yes. The increments X_{n_1}, X_{n_2} − X_{n_1}, ..., X_{n_k} − X_{n_{k−1}} are sums over disjoint sets of the Z_i random variables, and the Z_i are i.i.d.

(b) Yes. Any set of samples of X_n, n ≥ 1, is obtained by a linear transformation of i.i.d. N(0,1) random variables, and therefore all nth-order distributions of X_n are jointly Gaussian (it is not sufficient to show that the random variable X_n is Gaussian for each n).

(c) We have

    E[X_n] = E[ Σ_{i=1}^n Z_i ] = Σ_{i=1}^n E[Z_i] = 0,

and

    R_X(n_1, n_2) = E[X_{n_1} X_{n_2}] = E[ Σ_{i=1}^{n_1} Z_i Σ_{j=1}^{n_2} Z_j ]
                  = Σ_{i=1}^{n_1} Σ_{j=1}^{n_2} E[Z_i Z_j]    (E[Z_i Z_j] = 1 if i = j and 0 otherwise, by independence)
                  = Σ_{i=1}^{min(n_1,n_2)} 1
                  = min(n_1, n_2).
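The autocorrelation R_X(n_1, n_2) = min(n_1, n_2) can be spot-checked by simulation. A minimal sketch, with path length, trial count, and seed chosen arbitrarily:

```python
import random

def wiener_paths(n_steps, trials, seed=0):
    """Simulate sample paths of X_n = Z_1 + ... + Z_n, Z_i i.i.d. N(0,1)."""
    rng = random.Random(seed)
    paths = []
    for _ in range(trials):
        x, path = 0.0, []
        for _ in range(n_steps):
            x += rng.gauss(0.0, 1.0)  # one increment Z_n
            path.append(x)
        paths.append(path)
    return paths

paths = wiener_paths(8, 20_000)
# Empirical R_X(3, 5) = E[X_3 X_5]; theory says min(3, 5) = 3.
r35 = sum(p[2] * p[4] for p in paths) / len(paths)
# Empirical Var(X_8); theory says R_X(8, 8) = 8.
v8 = sum(p[7] ** 2 for p in paths) / len(paths)
```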
(d) As shown above, X_n is Gaussian with mean zero and variance

    Var(X_n) = E[X_n^2] − (E[X_n])^2 = R_X(n, n) − 0 = n.

Thus X_n ~ N(0, n). Moreover,

    Cov(X_{n_1}, X_{n_2}) = E(X_{n_1} X_{n_2}) − E(X_{n_1}) E(X_{n_2}) = min(n_1, n_2).

Therefore, X_{n_1} and X_{n_2} are jointly Gaussian random variables with mean µ = [0 0]^T and covariance matrix

    Σ = [ n_1             min(n_1, n_2) ]
        [ min(n_1, n_2)   n_2           ].

(e) X_n, n ≥ 1, is a zero-mean Gaussian random process. Thus

    [X_3, X_5, X_8]^T ~ N( [E[X_3], E[X_5], E[X_8]]^T,
                           [ R_X(3,3) R_X(3,5) R_X(3,8) ]
                           [ R_X(5,3) R_X(5,5) R_X(5,8) ]
                           [ R_X(8,3) R_X(8,5) R_X(8,8) ] ).

Substituting, we get

    [X_3]       ( [0]   [3 3 3] )
    [X_5]  ~  N ( [0] , [3 5 5] ).
    [X_8]       ( [0]   [3 5 8] )

(f) Since X_n is an independent increment process,

    E(X_20 | X_1, X_2, ..., X_10) = E(X_20 − X_10 + X_10 | X_1, X_2, ..., X_10)
                                  = E(X_20 − X_10 | X_1, X_2, ..., X_10) + E(X_10 | X_1, X_2, ..., X_10)
                                  = E(X_20 − X_10) + X_10
                                  = 0 + X_10
                                  = X_10.

(g) The MMSE estimate of X_4 given X_1 = x_1, X_2 = x_2, and a ≤ X_3 ≤ b equals E[X_4 | X_1 = x_1, X_2 = x_2, a ≤ X_3 ≤ b]. Thus the MMSE estimate is given by

    E[X_4 | X_1 = x_1, X_2 = x_2, a ≤ X_3 ≤ b]
      = E[X_2 + Z_3 + Z_4 | X_1 = x_1, X_2 = x_2, a ≤ X_2 + Z_3 ≤ b]
      = x_2 + E[Z_3 | a − x_2 ≤ Z_3 ≤ b − x_2] + E[Z_4]
            (since Z_3 is independent of (X_1, X_2), and Z_4 is independent of (X_1, X_2, Z_3))
      = x_2 + E[Z_3 | a − x_2 ≤ Z_3 ≤ b − x_2].
Plugging in the values, the required MMSE estimate is

    X̂_4 = 2 + E[Z_3 | Z_3 ∈ [−2, 2]].

We have

    f_{Z_3 | Z_3 ∈ [−2,2]}(z_3) = f_{Z_3}(z_3)/P(Z_3 ∈ [−2, 2])  for z_3 ∈ [−2, 2],  and 0 otherwise,

which is symmetric about z_3 = 0. Thus E[Z_3 | Z_3 ∈ [−2, 2]] = 0, and we have X̂_4 = 2.

8. A random process. Let X_n = Z_n + Z_{n−1} for n ≥ 1, where Z_0, Z_1, Z_2, ... are i.i.d. N(0,1).

(a) Find the mean and autocorrelation function of {X_n}.
(b) Is {X_n} Gaussian? Justify your answer.
(c) Find E(X_3 | X_1, X_2).
(d) Find E(X_3 | X_2).
(e) Is {X_n} Markov? Justify your answer.
(f) Is {X_n} independent increment? Justify your answer.

(a) E(X_n) = E(Z_n) + E(Z_{n−1}) = 0, and

    R_X(m, n) = E(X_m X_n) = E[(Z_m + Z_{m−1})(Z_n + Z_{n−1})]
              = { E[Z_n^2] + E[Z_{n−1}^2] = 2,   n = m
                { E[Z_{min(m,n)}^2] = 1,         |n − m| = 1
                { 0,                             otherwise.

(b) Since (X_1, ..., X_n) is a linear transformation of the Gaussian random vector (Z_0, Z_1, ..., Z_n), the process is Gaussian.

(c) Since the process is Gaussian, the conditional expectation (MMSE estimate) is linear in (X_1, X_2). Hence, using Cov((X_1, X_2)) = [2 1; 1 2] and (Cov(X_3, X_1), Cov(X_3, X_2)) = (0, 1),

    E(X_3 | X_1, X_2) = [0 1] [2 1; 1 2]^{−1} [X_1 X_2]^T = (1/3)(2X_2 − X_1).

(d) Similarly, E(X_3 | X_2) = (1/2) X_2.

(e) Since E(X_3 | X_1, X_2) ≠ E(X_3 | X_2), the process is not Markov.

(f) Since the process is not Markov, it is not independent increment (every independent increment process is Markov).
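The linear MMSE coefficients in part (c) can be verified by solving the 2×2 normal equations K a = c directly, where K is the covariance of (X_1, X_2) and c the cross-covariance with X_3. A minimal sketch; the helper `solve2` is mine, not from the handout.

```python
def solve2(K, c):
    """Solve the 2x2 linear system K a = c by Cramer's rule."""
    (k11, k12), (k21, k22) = K
    det = k11 * k22 - k12 * k21
    return ((c[0] * k22 - k12 * c[1]) / det,
            (k11 * c[1] - k21 * c[0]) / det)

# For X_n = Z_n + Z_{n-1}: Cov((X_1, X_2)) and Cov(X_3, (X_1, X_2)).
K = [[2.0, 1.0], [1.0, 2.0]]
c = (0.0, 1.0)
a1, a2 = solve2(K, c)  # E(X_3 | X_1, X_2) = a1 X_1 + a2 X_2 = (2 X_2 - X_1)/3
```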
9. Moving average process. Let X_n = Z_n + (1/2) Z_{n−1} for n ≥ 1, where Z_0, Z_1, Z_2, ... are i.i.d. N(0,1). Find the mean and autocorrelation function of X_n.

We have E(X_n) = E(Z_n) + (1/2) E(Z_{n−1}) = 0 and

    R_X(m, n) = E(X_m X_n) = E[(Z_m + (1/2) Z_{m−1})(Z_n + (1/2) Z_{n−1})]
              = { E[Z_n^2] + (1/4) E[Z_{n−1}^2] = 5/4,   n = m
                { (1/2) E[Z_{min(m,n)}^2] = 1/2,         |n − m| = 1
                { 0,                                     otherwise.

10. Autoregressive process. Let X_0 = 0 and X_n = (1/2) X_{n−1} + Z_n for n ≥ 1, where Z_1, Z_2, ... are i.i.d. N(0,1). Find the mean and autocorrelation function of X_n.

First,

    E[X_n] = (1/2) E[X_{n−1}] + E[Z_n] = (1/4) E[X_{n−2}] + (1/2) E[Z_{n−1}] + E[Z_n] = ··· = (1/2)^n E[X_0] = 0.

For n > m we can write

    X_n = (1/2)^{n−m} X_m + (1/2)^{n−m−1} Z_{m+1} + ··· + (1/2) Z_{n−1} + Z_n
        = (1/2)^{n−m} X_m + Σ_{i=0}^{n−m−1} (1/2)^i Z_{n−i}.

Therefore,

    R_X(n, m) = E(X_n X_m) = (1/2)^{n−m} E[X_m^2],

since X_m and Z_{n−i}, i = 0, ..., n − m − 1, are independent and E[X_m] = E[Z_{n−i}] = 0. To find E[X_m^2], consider

    E[X_1^2] = 1,
    E[X_2^2] = (1/4) E[X_1^2] + E[Z_2^2] = (1/4) + 1,
    ...

Thus in general,

    E[X_n^2] = (1/4)^{n−1} + ··· + (1/4) + 1 = (4/3)(1 − (1/4)^n),

so that

    R_X(n, m) = E(X_n X_m) = (1/2)^{|n−m|} (4/3)(1 − (1/4)^{min{n,m}}).
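The closed form E[X_n^2] = (4/3)(1 − (1/4)^n) can be cross-checked against the one-step recursion E[X_n^2] = (1/4) E[X_{n−1}^2] + E[Z_n^2]. A small deterministic check, no randomness needed:

```python
def second_moment(n):
    """E[X_n^2] for X_n = X_{n-1}/2 + Z_n, by propagating the recursion
    E[X_n^2] = (1/4) E[X_{n-1}^2] + 1, starting from E[X_0^2] = 0."""
    v = 0.0
    for _ in range(n):
        v = v / 4 + 1.0
    return v

def second_moment_closed(n):
    """Closed form (4/3)(1 - (1/4)^n) derived above."""
    return (4.0 / 3.0) * (1.0 - 0.25 ** n)
```

For large n both expressions approach the stationary variance 4/3.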
11. Random binary waveform. In a digital communication channel the symbol 1 is represented by the fixed-duration rectangular pulse

    g(t) = { 1,  0 ≤ t < 1
           { 0,  otherwise,

and the symbol 0 is represented by −g(t). The data transmitted over the channel are represented by the random process

    X(t) = Σ_{k=0}^∞ A_k g(t − k),  t ≥ 0,

where A_0, A_1, ... are i.i.d. random variables with

    A_i = { +1  w.p. 1/2
          { −1  w.p. 1/2.

(a) Find its first- and second-order pmfs.
(b) Find the mean and the autocorrelation function of the process X(t).

(a) The first-order pmf is

    p_{X(t)}(x) = P(X(t) = x) = P( Σ_{k=0}^∞ A_k g(t − k) = x ) = P(A_{⌊t⌋} = x)
                = { 1/2,  x = ±1
                  { 0,    otherwise,

since the A_i are i.i.d. Now note that X(t_1) and X(t_2) are dependent only if t_1 and t_2 fall within the same unit time interval, i.e., ⌊t_1⌋ = ⌊t_2⌋; otherwise they are independent. Thus the second-order pmf is

    p_{X(t_1) X(t_2)}(x, y) = P(X(t_1) = x, X(t_2) = y) = P(A_{⌊t_1⌋} = x, A_{⌊t_2⌋} = y)
        = { 1/2,  ⌊t_1⌋ = ⌊t_2⌋ and (x, y) ∈ {(1,1), (−1,−1)}
          { 1/4,  ⌊t_1⌋ ≠ ⌊t_2⌋ and (x, y) ∈ {(1,1), (1,−1), (−1,1), (−1,−1)}
          { 0,    otherwise.
(b) For t ≥ 0,

    E[X(t)] = E[ Σ_{k=0}^∞ A_k g(t − k) ] = Σ_{k=0}^∞ g(t − k) E[A_k] = 0.

For the autocorrelation R_X(t_1, t_2), we note once again that X(t_1) and X(t_2) are dependent only if t_1 and t_2 fall within the same unit interval; if they do not, they are independent of one another (and zero mean). Then

    R_X(t_1, t_2) = E[X(t_1) X(t_2)] = Σ_{k=0}^∞ g(t_1 − k) g(t_2 − k) E[A_k^2]
                  = { 1,  ⌊t_1⌋ = ⌊t_2⌋
                    { 0,  otherwise.
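The structure of R_X — perfect correlation inside a unit interval, independence across intervals — can be illustrated with a short simulation. The sample times 0.2, 0.7, and 1.7 below are arbitrary choices for illustration.

```python
import math
import random

def sample_X(A, t):
    """X(t) = sum_k A_k g(t - k) = A_{floor(t)} for the unit rectangular pulse g."""
    return A[math.floor(t)]

rng = random.Random(0)
trials = 20_000
same = 0.0  # estimate of R_X(0.2, 0.7): both times in the interval [0, 1)
diff = 0.0  # estimate of R_X(0.2, 1.7): times in different unit intervals
for _ in range(trials):
    A = [rng.choice((-1, 1)) for _ in range(2)]  # A_0, A_1
    same += sample_X(A, 0.2) * sample_X(A, 0.7)
    diff += sample_X(A, 0.2) * sample_X(A, 1.7)
same /= trials
diff /= trials
```

Within one interval the product is A_0 · A_0 = 1 on every trial, so the first estimate is exact; across intervals the product averages to zero.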