EE178: Probabilistic Systems Analysis, Spring 2018
Final Solutions — Fri, June 8

1. Small problems (62 points)

(a) (8 points) Let $X_1, X_2, \ldots, X_n$ be independent random variables uniformly distributed on $[0, 1]$. Let $Z = \max(X_1, X_2, \ldots, X_n)$. Find the pdf of $Z$.

Consider the cdf of $Z$:
$$F_Z(z) = \Pr(Z \le z) = \Pr(X_1 \le z, X_2 \le z, \ldots, X_n \le z) = \prod_{i=1}^{n} \Pr(X_i \le z) = (F_{X_1}(z))^n = \begin{cases} 0 & \text{if } z < 0 \\ z^n & \text{if } z \in [0, 1] \\ 1 & \text{if } z > 1 \end{cases}$$

The pdf is the derivative of the cdf:
$$f_Z(z) = \frac{d}{dz} F_Z(z) = \begin{cases} n z^{n-1} & \text{if } z \in [0, 1] \\ 0 & \text{otherwise} \end{cases}$$

(b) (8 points) I have 10 hats in a box. Each time I go out with my friends, I pick one of the hats at random and wear it. When I come back home, I put the hat back in the box. If I went out 20 times during this quarter, what is the expected number of hats that I never used?

Introduce indicator random variables $X_i$, $i = 1, \ldots, 10$:
$$X_i = \begin{cases} 1 & \text{if hat } i \text{ was never used} \\ 0 & \text{otherwise} \end{cases}$$

A hat is never used if it is not picked in any of the 20 independent outings. The probability of picking a specific hat on a specific day is $p = 1/10$, thus
$$\Pr(X_i = 1) = (1 - p)^{20} = (0.9)^{20} \approx 0.1216$$

The expected number of unused hats is
$$E\left[\sum_{i=1}^{10} X_i\right] = 10\, E[X_1] = 10 \Pr(X_1 = 1) = 10 \cdot (0.9)^{20} \approx 1.216$$

(c) Let $X$ and $Y$ denote the $x$ and $y$ coordinates of a point that is uniformly chosen over the circumference of a unit circle.

(i) (2 points) Are $X$ and $Y$ independent?

No. $X^2 + Y^2 = 1$; thus, by symmetry, $\Pr(X = 1 \mid Y = 0) = \Pr(X = -1 \mid Y = 0) = 0.5$, while $\Pr(X = 1) = 0$ since $X$ is continuous on $[-1, 1]$. Thus $\Pr(X = 1 \mid Y = 0) \ne \Pr(X = 1)$, which implies dependence.
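As a quick numerical sanity check of part (b), the expectation $10 \cdot (0.9)^{20} \approx 1.216$ can be confirmed by simulation. The sketch below is not part of the exam; the function name is ours.

```python
import random

def simulate_unused_hats(num_hats=10, num_outings=20, trials=100_000, seed=0):
    """Monte Carlo estimate of the expected number of never-used hats."""
    rng = random.Random(seed)
    total_unused = 0
    for _ in range(trials):
        # Set of distinct hats picked over all outings (picks are with replacement)
        used = {rng.randrange(num_hats) for _ in range(num_outings)}
        total_unused += num_hats - len(used)
    return total_unused / trials

exact = 10 * 0.9 ** 20        # linearity of expectation over the 10 indicators
estimate = simulate_unused_hats()
print(exact, estimate)        # both close to 1.216
```

The simulation agrees with the closed form to within Monte Carlo noise, illustrating that linearity of expectation holds despite the indicators being dependent.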
(ii) (2 points) Are $X$ and $Y$ identically distributed?

Yes. By symmetry, any rotation doesn't change the distribution, and $X$ turns into $Y$ under a rotation by $\pi/2$; thus they are identically distributed.

(iii) (6 points) What is the variance of $X$? Hint: no need for lengthy calculations, try to build on your answer for part (ii).

$X$ is symmetric around the origin, thus $E[X] = 0$. According to part (ii), $X$ and $Y$ have the same distribution, thus $E[X^2] = E[Y^2]$. As the points belong to the unit circumference:
$$E[X^2 + Y^2] = 1 \implies E[X^2] + E[Y^2] = 1 \implies E[X^2] = 0.5$$

The variance:
$$\operatorname{Var}(X) = E[X^2] - (E[X])^2 = 0.5 - 0 = 0.5$$

(d) You are trying to call your best friend. She answers the phone with probability $p$ and does not answer with probability $(1 - p)$. If she answers the phone you engage in a conversation whose duration is an exponential random variable with rate $\lambda$. If she does not answer you wait for 5 min and call again. The probability that she answers is again $p$. If she does not answer you give up; otherwise you engage in a conversation whose duration is again an exponential random variable with rate $\lambda$. Let $X$ (in minutes) be the total duration of the experiment, i.e., the total time you spend until you either give up calling your friend or hang up the phone. Recall that the pdf of an exponential random variable with rate $\lambda$ is given by $f_Y(y) = \lambda e^{-\lambda y}$, $y \ge 0$.

(i) (8 points) What are the probabilities of the events $\{X < 5\}$, $\{X \le 5\}$, $\{X = 5\}$ and $\{X = 6\}$?

Let the random variables, for $i = 1, 2$, be
$$X_i = \begin{cases} 1 & \text{if friend answers call } i \\ 0 & \text{otherwise} \end{cases}, \quad X_i \sim \mathrm{Bern}(p),$$
and let $T \sim \mathrm{Exp}(\lambda)$ be the length of the conversation (if it was initiated). Note that $F_T(t) = \int_0^t f_T(x)\,dx = 1 - e^{-\lambda t}$, and $\Pr(T = t) = 0$ for any $t$ since $T$ is a continuous random variable.
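Returning briefly to part (c)(iii): the answer $\operatorname{Var}(X) = 0.5$ is easy to confirm by simulating uniform points on the circumference. This sketch is ours, not part of the exam.

```python
import math
import random

def estimate_var_x(trials=100_000, seed=1):
    """Sample a uniform angle, take the x-coordinate cos(theta),
    and return the sample variance."""
    rng = random.Random(seed)
    xs = [math.cos(rng.uniform(0.0, 2.0 * math.pi)) for _ in range(trials)]
    mean = sum(xs) / trials
    return sum((x - mean) ** 2 for x in xs) / trials

var_est = estimate_var_x()
print(var_est)   # close to 0.5
```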
$$\Pr(X < 5) = \Pr(X_1 = 1, T < 5) = p F_T(5) = p(1 - e^{-5\lambda})$$

$$\Pr(X \le 5) = \Pr(X_1 = 1, T \le 5) + \Pr(X_1 = 0, X_2 = 0) + \Pr(X_1 = 0, X_2 = 1, T = 0) = p F_T(5) + (1 - p)^2 + 0 = p(1 - e^{-5\lambda}) + (1 - p)^2$$

$$\Pr(X = 5) = \Pr(X \le 5) - \Pr(X < 5) = (1 - p)^2$$

$$\Pr(X = 6) = \Pr(X_1 = 1, T = 6) + \Pr(X_1 = 0, X_2 = 1, T = 1) = 0$$

(ii) (10 points) Find and plot the cdf of $X$.

For $0 \le x < 5$:
$$F_X(x) = \Pr(X \le x) = \Pr(X_1 = 1, T \le x) = p F_T(x) = p(1 - e^{-\lambda x})$$

For $x \ge 5$:
$$F_X(x) = \Pr(X_1 = 1, T \le x) + \Pr(X_1 = 0, X_2 = 1, T \le x - 5) + \Pr(X_1 = X_2 = 0)$$
$$= p F_T(x) + p(1 - p) F_T(x - 5) + (1 - p)^2 = p(1 - e^{-\lambda x}) + p(1 - p)(1 - e^{-\lambda(x - 5)}) + (1 - p)^2 = 1 - p e^{-\lambda x} - p(1 - p) e^{5\lambda} e^{-\lambda x}$$
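The cdf just derived (including the atom of size $(1-p)^2$ at $x = 5$) can be checked against a direct simulation of the calling experiment. A sketch under our own helper names, with arbitrary illustrative values of $p$ and $\lambda$:

```python
import math
import random

def sample_total_time(p, lam, rng):
    """One run of the experiment; returns the total duration X in minutes."""
    if rng.random() < p:                  # first call answered
        return rng.expovariate(lam)
    if rng.random() < p:                  # second call answered after a 5 min wait
        return 5.0 + rng.expovariate(lam)
    return 5.0                            # both calls unanswered: give up at 5 min

def cdf_x(x, p, lam):
    """Closed-form cdf of X derived above."""
    if x < 0:
        return 0.0
    if x < 5:
        return p * (1 - math.exp(-lam * x))
    return p * (1 - math.exp(-lam * x)) + (1 - p) * (1 - p * math.exp(-lam * (x - 5)))

p, lam = 0.6, 0.3
rng = random.Random(2)
samples = [sample_total_time(p, lam, rng) for _ in range(100_000)]
for x in (2.0, 5.0, 9.0):
    empirical = sum(s <= x for s in samples) / len(samples)
    print(x, empirical, cdf_x(x, p, lam))   # the two columns should agree
```

The empirical fraction of samples equal to exactly 5.0 also matches $(1-p)^2$, confirming the jump in the cdf.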
In particular, at $x = 5$:
$$F_X(5) = \Pr(X \le 5) = p(1 - e^{-5\lambda}) + (1 - p)^2$$

Thus,
$$F_X(x) = p(1 - e^{-\lambda x}) + (1 - p)\left(1 - p e^{-\lambda(x - 5)}\right) 1_{x \ge 5},$$
where $1_A$ is the indicator function of $A$.

[Plot of $F_X(x)$: it rises as $p(1 - e^{-\lambda x})$ for $x < 5$, jumps from $p(1 - e^{-5\lambda})$ to $p(1 - e^{-5\lambda}) + (1 - p)^2$ at $x = 5$, then continues toward 1.]

(e) (8 points) You choose a point at random on a stick of length $L$ and break the stick into two pieces at this point. What is the probability that the ratio of the shorter to the longer piece is less than $\frac{1}{3}$?

Let $X$ be the length of the left piece; then $L - X$ is the length of the right piece, and the ratio of the shorter piece to the longer piece is
$$Y = \min\left(\frac{X}{L - X}, \frac{L - X}{X}\right).$$
Also note that $X$ is uniformly distributed on the stick, thus $F_X(x) = x/L$ for $x \in [0, L]$.
$$\Pr(Y < 1/3) = 1 - \Pr(Y \ge 1/3),$$
$$\Pr(Y \ge 1/3) = \Pr\left(\frac{X}{L - X} \ge 1/3,\ \frac{L - X}{X} \ge 1/3\right) = \Pr(4X \ge L,\ 4X \le 3L) = \Pr(X \in [L/4, 3L/4])$$
$$= \Pr(X \le 3L/4) - \Pr(X < L/4) = F_X(3L/4) - F_X(L/4) = 0.5,$$
$$\Pr(Y < 1/3) = 1 - \Pr(Y \ge 1/3) = 0.5$$

(f) (10 points) Assume now that after you break the stick into two at a random point you throw away the piece in your left hand. Then you take the right-hand piece and break it once again into two at random. What is the pdf of the length of the middle piece you end up with (that does not have any of the original end points)?

Let $Z$ be the length of the middle piece and $X$ be the length of the piece in your right hand. Then $X$ is uniformly distributed on $[0, L]$ and, given $X = x$, $Z$ is uniformly distributed on $[0, x]$, thus:
$$f_X(x) = \frac{1_{x \in [0, L]}}{L}, \qquad f_{Z \mid X}(z \mid x) = \frac{1_{z \in [0, x]}}{x}$$

Finally, for $z \in (0, L]$:
$$f_Z(z) = \int_0^L f_{Z \mid X}(z \mid x) f_X(x)\,dx = \int_0^L \frac{1_{z \in [0, x]}}{xL}\,dx = \int_z^L \frac{dx}{xL} = \frac{\ln(L/z)}{L}$$
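Part (f)'s pdf implies the cdf $F_Z(z) = \frac{z}{L}\left(1 + \ln(L/z)\right)$ and the mean $E[Z] = L/4$; both are easy to verify by simulating the two breaks. The sketch below is ours, not part of the exam.

```python
import math
import random

def sample_middle_piece(L, rng):
    """Break at a uniform point, keep the right piece, break it again;
    return the length of the middle piece."""
    right = rng.uniform(0.0, L)        # length of the right-hand piece
    return rng.uniform(0.0, right)     # middle piece from the second break

def cdf_z(z, L):
    """cdf implied by f_Z(z) = ln(L/z)/L, obtained by integrating the pdf."""
    return (z / L) * (1.0 + math.log(L / z))

L = 2.0
rng = random.Random(3)
samples = [sample_middle_piece(L, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(mean)                            # E[Z] = L/4 = 0.5
for z in (0.2, 1.0, 1.5):
    empirical = sum(s <= z for s in samples) / len(samples)
    print(z, empirical, cdf_z(z, L))   # the two columns should agree
```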
2. Communication over an Optical Channel (38 points)

Consider the problem of communicating one bit of information across an optical fiber, which is equally likely to be 0 or 1. If the bit $B = 1$, we switch on an LED and its light is carried across the optical fiber to a photodetector at the receiver side. The photodetector outputs the number of photons $Y \in \mathbb{N}$ it detects. If the bit $B = 0$, we keep the LED off. What makes the problem interesting is that even if the LED is off, the detector is likely to detect some photons (e.g. due to ambient light). A good model is that $Y$ is Poisson distributed with intensity that depends on whether the LED is on or off. Assume that when $B = 1$ (the LED is on), the number of photons counted by the photodetector is a Poisson random variable with parameter 200, and when $B = 0$ (the LED is off), the number of photons counted by the photodetector is a Poisson random variable with parameter 50. Remember that a Poisson random variable $Y$ with parameter $\lambda$ has mean and variance $\lambda$, and the following probability distribution:
$$\Pr(Y = k) = \frac{\lambda^k e^{-\lambda}}{k!}, \quad k = 0, 1, 2, \ldots$$

(a) (10 points) Apply the ML rule to decide on the transmitted bit when the photodetector detects $Y = y$ photons. Reduce the decision rule to a simple threshold form and specify the value of the threshold.

Let $\lambda_1 = 200$, $\lambda_0 = 50$. The decision rule:
$$\hat{B} = 0 \iff \Pr(Y = y \mid B = 1) < \Pr(Y = y \mid B = 0) \iff \frac{\lambda_1^y e^{-\lambda_1}}{y!} < \frac{\lambda_0^y e^{-\lambda_0}}{y!}$$
$$\iff y \ln\lambda_1 - \lambda_1 < y \ln\lambda_0 - \lambda_0 \iff y < \frac{\lambda_1 - \lambda_0}{\ln\lambda_1 - \ln\lambda_0} = \frac{150}{\ln 4} \approx 108.2$$

$$\hat{B} = \begin{cases} 0 & \text{if } y \le 108 \\ 1 & \text{if } y \ge 109 \end{cases}$$

(b) (4 points) Given that the transmitted bit $B = 0$, what is the probability that we erroneously declare the bit to be 1, i.e. $\hat{B} = 1$? Hint: you can express the result as a summation.
$$\Pr(\hat{B} = 1 \mid B = 0) = \Pr(Y \ge 109 \mid B = 0) = \sum_{k=109}^{\infty} \frac{\lambda_0^k e^{-\lambda_0}}{k!} = 1 - e^{-50} \sum_{k=0}^{108} \frac{50^k}{k!}$$

(c) (7 points) Use Chebyshev's inequality to provide a bound on the probability in part (b).

Let $Z$ be a Poisson random variable with parameter $\lambda_0$. Then $E[Z] = \lambda_0$ and $\operatorname{Var}(Z) = \lambda_0$.
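The threshold in part (a) and the exact error probability in part (b) are both easy to compute numerically. Below is a sketch with our own helper names; the Poisson cdf is accumulated in log space to avoid overflow of $50^k$ and $k!$.

```python
import math

LAM1, LAM0 = 200.0, 50.0

# ML threshold: decide B-hat = 1 iff y exceeds (lam1 - lam0) / ln(lam1/lam0)
threshold = (LAM1 - LAM0) / math.log(LAM1 / LAM0)
print(threshold)   # about 108.2, so B-hat = 0 for y <= 108 and B-hat = 1 for y >= 109

def poisson_tail(lam, k_min):
    """Pr(Y >= k_min) for Y ~ Poisson(lam), via 1 - CDF(k_min - 1)."""
    log_pmf = -lam                    # log Pr(Y = 0)
    cdf = math.exp(log_pmf)
    for k in range(1, k_min):
        # Recurrence: Pr(Y = k) = Pr(Y = k-1) * lam / k
        log_pmf += math.log(lam) - math.log(k)
        cdf += math.exp(log_pmf)
    return 1.0 - cdf

err = poisson_tail(LAM0, 109)         # Pr(B-hat = 1 | B = 0)
print(err)                            # tiny; far below the Chebyshev bound 50/59^2
```

The exact tail probability turns out to be many orders of magnitude smaller than the Chebyshev bound computed in part (c), which is the point of parts (c) and (d).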
$Y \mid B = 0$ has the same distribution as $Z$, thus
$$\Pr(\hat{B} = 1 \mid B = 0) = \Pr(Z \ge 109) = \Pr(Z - E[Z] \ge 109 - E[Z]) \le \Pr(|Z - E[Z]| \ge 109 - E[Z])$$
$$\le \frac{\operatorname{Var}(Z)}{(109 - E[Z])^2} = \frac{\lambda_0}{(109 - \lambda_0)^2} = \frac{50}{59^2} \approx 0.0144$$

(d) (7 points) Use the central limit theorem to provide an approximate estimate for the probability in part (b). Hint: recall that the sum of two independent Poisson random variables with parameters $\lambda$ and $\mu$ is Poisson with parameter $\lambda + \mu$.

If $X_1, X_2, \ldots, X_{50}$ are i.i.d. Poisson with parameter 1, then $Z = \sum_{i=1}^{50} X_i$ has a Poisson distribution with parameter 50. Thus,
$$\Pr(\hat{B} = 1 \mid B = 0) = \Pr(Z \ge 109) = \Pr\left(\frac{Z - 50}{\sqrt{50}} \ge \frac{59}{\sqrt{50}}\right) \approx Q\left(\frac{59}{\sqrt{50}}\right) \approx 3.60 \times 10^{-17}$$
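The two bounds can be compared directly; the Q-function is available via the complementary error function, $Q(x) = \tfrac{1}{2}\operatorname{erfc}(x/\sqrt{2})$.

```python
import math

lam0 = 50.0
k = 109

# Chebyshev bound from part (c)
chebyshev = lam0 / (k - lam0) ** 2               # 50 / 59^2, about 0.0144

# CLT approximation from part (d)
z = (k - lam0) / math.sqrt(lam0)                 # 59 / sqrt(50), about 8.34
clt = 0.5 * math.erfc(z / math.sqrt(2.0))        # Q(z)

print(chebyshev, clt)                            # ~1.4e-2 vs ~3.6e-17
```

Chebyshev uses only the mean and variance, so it is loose by some fifteen orders of magnitude here; the CLT estimate exploits the full shape of the distribution.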
(e) (10 points) Assume that now the transmission is repeated $N$ times. That is, if $B = 1$ the receiver observes $Y_1, \ldots, Y_N$, where the $Y_i$'s are i.i.d. Poisson(200) random variables; if $B = 0$ the receiver observes $Y_1, \ldots, Y_N$, where the $Y_i$'s are i.i.d. Poisson(50). Use the ML rule to decide on the value of the transmitted bit given a certain observation $Y_1 = y_1, \ldots, Y_N = y_N$ at the receiver.

$$\hat{B} = 0 \iff \Pr\left(\bigcap_{i=1}^{N} \{Y_i = y_i\} \,\middle|\, B = 1\right) < \Pr\left(\bigcap_{i=1}^{N} \{Y_i = y_i\} \,\middle|\, B = 0\right) \iff \prod_{i=1}^{N} \frac{\lambda_1^{y_i} e^{-\lambda_1}}{y_i!} < \prod_{i=1}^{N} \frac{\lambda_0^{y_i} e^{-\lambda_0}}{y_i!}$$
$$\iff \ln\lambda_1 \sum_{i=1}^{N} y_i - N\lambda_1 < \ln\lambda_0 \sum_{i=1}^{N} y_i - N\lambda_0 \iff \frac{1}{N} \sum_{i=1}^{N} y_i < \frac{\lambda_1 - \lambda_0}{\ln\lambda_1 - \ln\lambda_0} = \frac{150}{\ln 4} \approx 108.2$$

The decision rule:
$$\hat{B} = \begin{cases} 0 & \text{if } \frac{1}{N} \sum_{i=1}^{N} y_i < \frac{150}{\ln 4} \approx 108.2 \\ 1 & \text{otherwise} \end{cases}$$
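The $N$-observation rule above reduces to comparing the sample mean against the same threshold as in part (a). A small sketch (helper names and the test values are ours): with even a handful of observations, the sample means under the two hypotheses concentrate around 50 and 200, so the decisions below should all be correct.

```python
import math
import random

LAM1, LAM0 = 200.0, 50.0
THRESHOLD = (LAM1 - LAM0) / math.log(LAM1 / LAM0)   # about 108.2

def ml_decide(ys):
    """ML decision from N i.i.d. photon counts: compare the sample mean
    to the single-observation threshold."""
    return 0 if sum(ys) / len(ys) < THRESHOLD else 1

def poisson_sample(lam, rng):
    """Sample Poisson(lam) by counting unit-rate exponential arrivals in [0, lam].
    (Avoids the exp(-lam) underflow of Knuth's method for lam = 200.)"""
    t, count = 0.0, 0
    while True:
        t += rng.expovariate(1.0)
        if t > lam:
            return count
        count += 1

rng = random.Random(4)
N = 5
for b, lam in ((0, LAM0), (1, LAM1)):
    for _ in range(100):
        ys = [poisson_sample(lam, rng) for _ in range(N)]
        assert ml_decide(ys) == b
print("all decisions correct")
```

Averaging $N$ observations shrinks the standard deviation of the sample mean by $\sqrt{N}$, so the error probability decays rapidly in $N$ while the threshold itself is unchanged.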