UCSD ECE 53    Handout #0
Prof. Young-Han Kim    Thursday, April 4, 04

Solutions to Homework Set #3
(Prepared by TA Fatemeh Arbabjolfaei)

1. Time until the n-th arrival. Let the random variable $N(t)$ be the number of packets arriving during time $(0,t]$. Suppose $N(t)$ is Poisson with pmf
$$p_{N(t)}(n) = \frac{(\lambda t)^n e^{-\lambda t}}{n!} \quad \text{for } n = 0, 1, 2, \ldots.$$
Let the random variable $Y$ be the time to get the $n$-th packet. Find the pdf of $Y$.

Solution: To find the pdf $f_Y(t)$ of the random variable $Y$, note that the event $\{Y \le t\}$ occurs iff the time of the $n$-th packet is in $[0,t]$, that is, iff the number $N(t)$ of packets arriving in $[0,t]$ is at least $n$. Alternatively, $\{Y > t\}$ occurs iff $\{N(t) < n\}$. Hence, the cdf $F_Y(t)$ of $Y$ is given by
$$F_Y(t) = P\{Y \le t\} = P\{N(t) \ge n\} = \sum_{k=n}^{\infty} \frac{(\lambda t)^k e^{-\lambda t}}{k!}.$$
Differentiating $F_Y(t)$ with respect to $t$, we get the pdf $f_Y(t)$ as
$$f_Y(t) = \sum_{k=n}^{\infty} \left[ -\lambda e^{-\lambda t} \frac{(\lambda t)^k}{k!} + \lambda e^{-\lambda t} \frac{(\lambda t)^{k-1}}{(k-1)!} \right]
= \lambda e^{-\lambda t} \frac{(\lambda t)^{n-1}}{(n-1)!} - \sum_{k=n}^{\infty} \lambda e^{-\lambda t} \frac{(\lambda t)^k}{k!} + \sum_{k=n+1}^{\infty} \lambda e^{-\lambda t} \frac{(\lambda t)^{k-1}}{(k-1)!}
= \lambda e^{-\lambda t} \frac{(\lambda t)^{n-1}}{(n-1)!}$$
for $t > 0$, since the last two sums cancel after reindexing.

Alternatively, since the interarrival time $T$ between packets is an exponential random variable with pdf
$$f_T(t) = \lambda e^{-\lambda t}, \quad t \ge 0,$$
let $T_1, T_2, \ldots, T_n$ denote the i.i.d. exponential interarrival times; then $Y = T_1 + T_2 + \cdots + T_n$. By convolving $f_T(t)$ with itself $n-1$ times, which can also be computed via the Fourier transform (characteristic function), we can show that the pdf of $Y$ is the same Erlang density
$$f_Y(t) = \frac{\lambda e^{-\lambda t} (\lambda t)^{n-1}}{(n-1)!}, \quad t \ge 0.$$
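As a numerical sanity check (not part of the original solution), the following Python sketch simulates $Y$ as a sum of i.i.d. exponential interarrival times and compares the empirical mean and cdf against the Erlang formulas above. The parameters $n = 4$, $\lambda = 2$, the evaluation point $t = 1.5$, and the sample size are arbitrary illustrative choices.

```python
import math
import random

rng = random.Random(0)
n, lam, trials = 4, 2.0, 200_000

# Y = T_1 + ... + T_n with T_i i.i.d. Exp(lam) interarrival times.
samples = [sum(rng.expovariate(lam) for _ in range(n)) for _ in range(trials)]

# Erlang(n, lam) has mean n/lam.
mean = sum(samples) / trials
print(mean)  # should be close to n/lam = 2.0

# Empirical cdf vs. F_Y(t) = 1 - sum_{k=0}^{n-1} (lam*t)^k e^{-lam*t} / k!
t = 1.5
emp = sum(s <= t for s in samples) / trials
cdf = 1 - sum((lam * t) ** k * math.exp(-lam * t) / math.factorial(k) for k in range(n))
print(emp, cdf)  # the two values should agree to about two decimal places
```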
2. Diamond distribution. Consider the random variables $X$ and $Y$ with the joint pdf
$$f_{X,Y}(x,y) = \begin{cases} c & \text{if } |x| + |y| \le 1, \\ 0 & \text{otherwise}, \end{cases}$$
where $c$ is a constant.

(a) Find $c$.
(b) Find $f_X(x)$ and $f_{X|Y}(x|y)$.
(c) Are $X$ and $Y$ independent random variables? Justify your answer.

Solution:

(a) The integral of the pdf $f_{X,Y}(x,y)$ over $-\infty < x < \infty$, $-\infty < y < \infty$ is $c$ times the area of the diamond $\{|x| + |y| \le 1\}$, which is $2$. Therefore, by the definition of a joint density, $2c = 1$, i.e., $c = 1/2$.

(b) The marginal pdf is obtained by integrating the joint pdf with respect to $y$. For $0 \le x \le 1$,
$$f_X(x) = \int_{-(1-x)}^{1-x} c \, dy = 2c(1 - x) = 1 - x,$$
and for $-1 \le x \le 0$,
$$f_X(x) = \int_{-(1+x)}^{1+x} c \, dy = 1 + x.$$
So the marginal pdf may be written as
$$f_X(x) = \begin{cases} 1 - |x| & |x| \le 1, \\ 0 & \text{otherwise}. \end{cases}$$
Now since $f_{X,Y}(x,y)$ is symmetric in $x$ and $y$, $f_Y(y) = f_X(y)$. Thus,
$$f_{X|Y}(x|y) = \frac{f_{X,Y}(x,y)}{f_Y(y)} = \begin{cases} \dfrac{1}{2(1-|y|)} & |x| \le 1 - |y|,\ |y| < 1, \\[4pt] 0 & \text{otherwise}. \end{cases}$$

(c) $X$ and $Y$ are not independent since
$$f_{X,Y}(x,y) \ne f_X(x) f_Y(y).$$
Alternatively, $X$ and $Y$ are not independent since $f_{X|Y}(x|y)$ depends on the value of $y$.
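The values $c = 1/2$ and $f_X(x) = 1 - |x|$ can be spot-checked by simulation (this is an optional sketch, not part of the original solution). Rejection sampling from the square $[-1,1]^2$ gives uniform points on the diamond, and the marginal predicts $P\{|X| \le 1/2\} = \int_{-1/2}^{1/2} (1-|x|)\,dx = 3/4$.

```python
import random

rng = random.Random(1)

# Rejection sampling: draw uniformly on the square [-1,1]^2, keep |x|+|y| <= 1.
# The acceptance rate ~ area(diamond)/area(square) = 2/4, consistent with c = 1/2.
pts = []
while len(pts) < 200_000:
    x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
    if abs(x) + abs(y) <= 1:
        pts.append((x, y))

p_half = sum(abs(x) <= 0.5 for x, _ in pts) / len(pts)
print(p_half)  # should be close to 3/4
```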
3. First available teller. Consider a bank with two tellers. The service times for the tellers are independent exponentially distributed random variables $X_1 \sim \text{Exp}(\lambda_1)$ and $X_2 \sim \text{Exp}(\lambda_2)$, respectively. You arrive at the bank and find that both tellers are busy but that nobody else is waiting to be served. You are served by the first available teller once he/she is free. What is the probability that you are served by the first teller?

Solution: By the memoryless property of the exponential distribution, the remaining service times for the tellers are also independent exponentially distributed random variables with parameters $\lambda_1$ and $\lambda_2$, respectively. The probability that you will be served by the first teller is the probability that the first teller finishes service before the second teller does. Thus,
$$P\{X_1 < X_2\} = \iint_{x_2 > x_1} f_{X_1,X_2}(x_1,x_2) \, dx_2 \, dx_1
= \int_{x_1=0}^{\infty} \int_{x_2=x_1}^{\infty} \lambda_1 e^{-\lambda_1 x_1} \lambda_2 e^{-\lambda_2 x_2} \, dx_2 \, dx_1
= \int_{x_1=0}^{\infty} \lambda_1 e^{-(\lambda_1+\lambda_2)x_1} \, dx_1
= \frac{\lambda_1}{\lambda_1 + \lambda_2}.$$

4. Coin with random bias. You are given a coin but are not told what its bias (probability of heads) is. You are told instead that the bias is the outcome of a random variable $P \sim \text{Unif}[0,1]$. To get more information about the coin bias, you flip it independently 10 times. Let $X$ be the number of heads you get. Thus $X \sim \text{B}(10, P)$. Assuming that $X = 9$, find and sketch the a posteriori probability of $P$, i.e., $f_{P|X}(p|9)$.

Solution: In order to find the conditional pdf of $P$, apply Bayes' rule for mixed random variables to get
$$f_{P|X}(p|x) = \frac{p_{X|P}(x|p)}{p_X(x)} f_P(p) = \frac{p_{X|P}(x|p)}{\int_0^1 p_{X|P}(x|p') f_P(p') \, dp'} f_P(p).$$
Now it is given that $X = 9$; thus for $0 \le p \le 1$,
$$f_{P|X}(p|9) = \frac{\binom{10}{9} p^9 (1-p)}{\int_0^1 \binom{10}{9} p^9 (1-p) \, dp} = \frac{p^9(1-p)}{1/110} = 110\, p^9 (1-p).$$
Figure 1 compares the unconditional and the conditional pdfs of $P$. It may be seen that given the information that 10 independent tosses resulted in 9 heads, the pdf is shifted towards the value $\frac{9}{10}$.
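As an optional check of the posterior $f_{P|X}(p|9) = 110\,p^9(1-p)$, whose mean is $110 \int_0^1 p^{10}(1-p)\,dp = 110\left(\frac{1}{11} - \frac{1}{12}\right) = \frac{5}{6}$, the following sketch simulates the joint experiment and conditions on $X = 9$. The sample size is an arbitrary choice.

```python
import random

rng = random.Random(2)

# Simulate (P, X): P ~ Unif[0,1], then X ~ Binomial(10, P); keep samples with X = 9.
post = []
for _ in range(400_000):
    p = rng.random()
    x = sum(rng.random() < p for _ in range(10))
    if x == 9:
        post.append(p)

# The posterior mean of Beta(10, 2) = 110 p^9 (1-p) is 5/6.
mean = sum(post) / len(post)
print(mean)  # should be close to 5/6 ≈ 0.833
```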
[Figure 1: Comparison of the a priori pdf $f_P(p)$ and the a posteriori pdf $f_{P|X}(p|9)$, which is concentrated near $p = 0.9$.]

5. Optical communication channel. Let the signal input to an optical channel be
$$X = \begin{cases} 1 & \text{with probability } 1/2, \\ 0 & \text{with probability } 1/2. \end{cases}$$
The conditional pmf of the output of the channel is $Y \mid \{X = 1\} \sim \text{Poisson}(1)$, i.e., Poisson with intensity $\lambda = 1$, and $Y \mid \{X = 0\} \sim \text{Poisson}(10)$. Show that the MAP rule reduces to
$$D(y) = \begin{cases} 1 & y < y^*, \\ 0 & \text{otherwise}. \end{cases}$$
Find $y^*$ and the corresponding probability of error.

Solution: The MAP rule
$$D(y) = \begin{cases} 1 & p_{X|Y}(1|y) > p_{X|Y}(0|y), \\ 0 & \text{otherwise} \end{cases}$$
minimizes the probability of decoding error. Since the a priori probabilities for the two $X$ values are equal, the MAP rule is equivalent to the ML rule
$$D(y) = \begin{cases} 1 & p_{Y|X}(y|1) > p_{Y|X}(y|0), \\ 0 & \text{otherwise}. \end{cases}$$
Now,
$$\frac{p_{Y|X}(y|1)}{p_{Y|X}(y|0)} = \frac{e^{-1}/y!}{e^{-10}\,10^y/y!} = e^{9 - y\ln(10)}.$$
This ratio is greater than $1$ if $y < 9/\ln(10)$. Therefore
$$D(y) = \begin{cases} 1 & y < 9/\ln(10), \\ 0 & \text{otherwise}, \end{cases}$$
and
$$y^* = \frac{9}{\ln(10)} \approx 3.9.$$
The probability of error is
$$P_e = P\{D(Y) \ne X\} = P\{Y > y^* \mid X = 1\} P\{X = 1\} + P\{Y < y^* \mid X = 0\} P\{X = 0\}
= 0.5 \sum_{y=4}^{\infty} \frac{e^{-1}}{y!} + 0.5 \sum_{y=0}^{3} \frac{e^{-10}\,10^y}{y!} \approx 0.0147.$$

6. Iocane or Sennari. An absent-minded chemistry professor forgets to label two identically looking bottles. One contains a chemical named Iocane and the other contains a chemical named Sennari. It is well known that the radioactivity level of Iocane has the Unif[0,1] distribution, while the radioactivity level of Sennari has the Exp(1) distribution.

(a) Let $X$ be the radioactivity level measured from one of the bottles. What is the optimal decision rule (based on the measurement $X$) that maximizes the chance of correctly identifying the content of the bottle?
(b) What is the associated probability of error?

Solution: Let $\Theta = 0$ denote the case in which the content of the bottle is Iocane and let $\Theta = 1$ denote the case in which the content of the bottle is Sennari. Implicit in the problem statement is that $P(\Theta = 0) = P(\Theta = 1) = 1/2$.

(a) The optimal MAP rule is equivalent to the ML rule
$$D(x) = \begin{cases} 0 & f_{X|\Theta}(x|0) > f_{X|\Theta}(x|1), \\ 1 & \text{otherwise}. \end{cases}$$
Since the Unif[0,1] pdf $f_{X|\Theta}(x|0) = 1$ is larger than the Exp(1) pdf $f_{X|\Theta}(x|1) = e^{-x}$ for $0 < x < 1$, we have
$$D(x) = \begin{cases} 0 & 0 < x < 1, \\ 1 & \text{otherwise}. \end{cases}$$

(b) The probability of error is given by
$$P(\Theta \ne D(X)) = \tfrac{1}{2} P(\Theta \ne D(X) \mid \Theta = 0) + \tfrac{1}{2} P(\Theta \ne D(X) \mid \Theta = 1)
= \tfrac{1}{2} P(X > 1 \mid \Theta = 0) + \tfrac{1}{2} P(0 < X < 1 \mid \Theta = 1)
= 0 + \tfrac{1}{2}\left(1 - e^{-1}\right) = \tfrac{1}{2}\left(1 - e^{-1}\right).$$
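The Iocane/Sennari rule and its error probability $\frac{1}{2}(1 - e^{-1}) \approx 0.316$ can be verified by a short Monte Carlo sketch (again, an optional check rather than part of the solution); the sample size is arbitrary.

```python
import math
import random

rng = random.Random(6)

# Θ = 0: Iocane, X ~ Unif[0,1].  Θ = 1: Sennari, X ~ Exp(1).  Each bottle w.p. 1/2.
# Decision rule from part (a): guess Iocane iff 0 < x < 1.
N, errors = 300_000, 0
for _ in range(N):
    theta = rng.randrange(2)
    x = rng.random() if theta == 0 else rng.expovariate(1.0)
    guess = 0 if 0 < x < 1 else 1
    errors += (guess != theta)

p_emp = errors / N
p_theory = 0.5 * (1 - math.exp(-1))
print(p_emp, p_theory)  # both should be close to 0.316
```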
7. Two independent uniform random variables. Let $X$ and $Y$ be independently and uniformly drawn from the interval $[0,1]$.

(a) Find the pdf of $U = \max(X,Y)$.
(b) Find the pdf of $V = \min(X,Y)$.
(c) Find the pdf of $W = U - V$.
(d) Find the probability $P\{|X - Y| \le 1/2\}$.

Solution:

(a) We have
$$F_U(u) = P\{U \le u\} = P\{\max(X,Y) \le u\} = P\{X \le u, Y \le u\} = P\{X \le u\} P\{Y \le u\} = u^2$$
for $0 \le u \le 1$. Hence,
$$f_U(u) = \begin{cases} 2u & 0 \le u \le 1, \\ 0 & \text{otherwise}. \end{cases}$$

(b) Similarly,
$$1 - F_V(v) = P\{V > v\} = P\{\min(X,Y) > v\} = P\{X > v, Y > v\} = P\{X > v\} P\{Y > v\} = (1 - v)^2,$$
or equivalently, $F_V(v) = 1 - (1 - v)^2$ for $0 \le v \le 1$. Hence,
$$f_V(v) = \begin{cases} 2(1 - v) & 0 \le v \le 1, \\ 0 & \text{otherwise}. \end{cases}$$

(c) First note that $W = U - V = |X - Y|$. (Why?) Hence,
$$P\{W \le w\} = P\{|X - Y| \le w\} = P(-w \le X - Y \le w).$$
Since $X$ and $Y$ are uniformly distributed over $[0,1]$, the above probability is equal to the area of the shaded region in the following figure:
[Figure: the unit square in the $(x,y)$-plane with the band $\{|x - y| \le w\}$ shaded; the two unshaded corners are right triangles with legs of length $1 - w$.]

The area can be easily calculated as $1 - (1 - w)^2$ for $0 \le w \le 1$. Hence $F_W(w) = 1 - (1 - w)^2$ and
$$f_W(w) = \begin{cases} 2(1 - w) & 0 \le w \le 1, \\ 0 & \text{otherwise}. \end{cases}$$

(d) From the figure above,
$$P\{|X - Y| \le 1/2\} = P\{W \le 1/2\} = 1 - (1 - 1/2)^2 = 3/4.$$

8. Waiting time at the bank. Consider a bank with two tellers. The service times for the tellers are independent exponentially distributed random variables $X_1 \sim \text{Exp}(\lambda_1)$ and $X_2 \sim \text{Exp}(\lambda_2)$, respectively. You arrive at the bank and find that both tellers are busy but that nobody else is waiting to be served. You are served by the first available teller once he/she becomes free. Let the random variable $Y$ denote your waiting time. Find the pdf of $Y$.

Solution: First observe that $Y = \min(X_1, X_2)$. Since
$$P\{Y > y\} = P\{X_1 > y, X_2 > y\} = P\{X_1 > y\} P\{X_2 > y\} = e^{-\lambda_1 y} e^{-\lambda_2 y} = e^{-(\lambda_1 + \lambda_2)y}$$
for $y \ge 0$, $Y$ is an exponential random variable with pdf
$$f_Y(y) = \begin{cases} (\lambda_1 + \lambda_2) e^{-(\lambda_1 + \lambda_2)y} & y \ge 0, \\ 0 & \text{otherwise}. \end{cases}$$
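The three cdfs derived above, $F_U(u) = u^2$, $F_V(v) = 1 - (1-v)^2$, and $F_W(w) = 1 - (1-w)^2$, can be spot-checked by simulation. The evaluation points $0.6$ and $0.5$ below are arbitrary; this sketch is not part of the original solution.

```python
import random

rng = random.Random(3)

N = 200_000
xs = [rng.random() for _ in range(N)]
ys = [rng.random() for _ in range(N)]

# F_U(0.6) = 0.36, F_V(0.6) = 1 - 0.16 = 0.84, F_W(0.5) = 3/4.
u_emp = sum(max(x, y) <= 0.6 for x, y in zip(xs, ys)) / N
v_emp = sum(min(x, y) <= 0.6 for x, y in zip(xs, ys)) / N
w_emp = sum(abs(x - y) <= 0.5 for x, y in zip(xs, ys)) / N
print(u_emp, v_emp, w_emp)  # close to 0.36, 0.84, 0.75
```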
Additional Exercises

Do not turn in solutions to these problems.

1. Independence. Let $X \in \mathcal{X}$ and $Y \in \mathcal{Y}$ be two independent discrete random variables.

(a) Show that any two events $\{X \in A\}$ and $\{Y \in B\}$, with $A \subseteq \mathcal{X}$ and $B \subseteq \mathcal{Y}$, are independent.
(b) Show that any two functions of $X$ and $Y$ separately are independent; that is, if $U = g(X)$ and $V = h(Y)$, then $U$ and $V$ are independent.

Solution:

(a) Recall that the probability of any event $\{X \in A\}$ is given by $P\{X \in A\} = \sum_{x \in A} p_X(x)$. Because of the independence of $X$ and $Y$, we have
$$P\{X \in A, Y \in B\} = \sum_{x \in A} \sum_{y \in B} p_{X,Y}(x,y) = \sum_{x \in A} \sum_{y \in B} p_X(x) p_Y(y) = \sum_{x \in A} p_X(x) \sum_{y \in B} p_Y(y) = P\{X \in A\} P\{Y \in B\}.$$
Therefore, the events $\{X \in A\}$ and $\{Y \in B\}$ are independent.

(b) Let $A_u = \{x : g(x) \le u\}$ and $B_v = \{y : h(y) \le v\}$. Then the joint cdf of $U$ and $V$ is
$$F_{U,V}(u,v) = P\{U \le u, V \le v\} = P\{g(X) \le u, h(Y) \le v\} = P\{X \in A_u, Y \in B_v\}.$$
However, because of the independence of $X$ and $Y$,
$$F_{U,V}(u,v) = P\{X \in A_u\} P\{Y \in B_v\} = P\{g(X) \le u\} P\{h(Y) \le v\} = P\{U \le u\} P\{V \le v\} = F_U(u) F_V(v),$$
so that $U$ and $V$ are independent.

2. Family planning. Alice and Bob choose a number $X$ at random from the set $\{2, 3, 4\}$ (so the outcomes are equally probable). If the outcome is $X = x$, they decide to have children until they have a girl or $x$ children, whichever comes first. Assume that each child is a girl with probability $1/2$ (independent of the number of children and the gender of other children). Let $Y$ be the number of children they will have.

(a) Find the conditional pmf $p_{Y|X}(y|x)$ for all possible values of $x$ and $y$.
(b) Find the pmf of $Y$.
Solution:

(a) Note that $Y \in \{1, 2, 3, 4\}$. The conditional pmf is as follows:
$$p_{Y|X}(1|2) = \tfrac{1}{2}, \quad p_{Y|X}(2|2) = \tfrac{1}{2},$$
$$p_{Y|X}(1|3) = \tfrac{1}{2}, \quad p_{Y|X}(2|3) = \tfrac{1}{4}, \quad p_{Y|X}(3|3) = \tfrac{1}{4},$$
$$p_{Y|X}(1|4) = \tfrac{1}{2}, \quad p_{Y|X}(2|4) = \tfrac{1}{4}, \quad p_{Y|X}(3|4) = \tfrac{1}{8}, \quad p_{Y|X}(4|4) = \tfrac{1}{8}.$$

(b) The pmf of $Y$ is:
$$p_Y(1) = \sum_{x=2}^{4} p_X(x) p_{Y|X}(1|x) = \tfrac{1}{2}, \qquad p_Y(2) = \sum_{x=2}^{4} p_X(x) p_{Y|X}(2|x) = \tfrac{1}{3},$$
$$p_Y(3) = \sum_{x=3}^{4} p_X(x) p_{Y|X}(3|x) = \tfrac{1}{8}, \qquad p_Y(4) = p_X(4) p_{Y|X}(4|4) = \tfrac{1}{24}.$$

3. Joint cdf or not. Consider the function
$$G(x,y) = \begin{cases} 1 & \text{if } x + y \ge 0, \\ 0 & \text{otherwise}. \end{cases}$$
Can $G$ be a joint cdf for a pair of random variables? Justify your answer.

Solution: No. Note that for every $x$,
$$\lim_{y \to \infty} G(x,y) = 1.$$
But for any genuine marginal cdf,
$$\lim_{x \to -\infty} F_X(x) = 0,$$
whereas here the implied marginal would be $F_X(x) = 1$ for all $x$. Therefore $G(x,y)$ is not a cdf. Alternatively, assume that $G(x,y)$ is a joint cdf for $X$ and $Y$; then
$$P\{-1 < X \le 1, -1 < Y \le 1\} = G(1,1) - G(-1,1) - G(1,-1) + G(-1,-1) = 1 - 1 - 1 + 0 = -1.$$
But this violates the property that the probability of any event must be nonnegative.
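The pmf of $Y$ can also be obtained by a short exact computation with rational arithmetic (an optional check, not part of the original solution). The helper `p_y_given_x` below is an illustrative encoding of the stopping rule: $Y = y < x$ requires $y-1$ boys then a girl, and $Y = x$ requires $x-1$ boys regardless of the last child's gender.

```python
from fractions import Fraction

half = Fraction(1, 2)

def p_y_given_x(y, x):
    # Y = y < x: (y-1) boys then a girl, probability (1/2)^y.
    # Y = x: (x-1) boys, last child of either gender, probability (1/2)^(x-1).
    if y < x:
        return half ** y
    if y == x:
        return half ** (x - 1)
    return Fraction(0)

# X is uniform on {2, 3, 4}, so p_X(x) = 1/3.
p_y = {y: sum(Fraction(1, 3) * p_y_given_x(y, x) for x in (2, 3, 4)) for y in range(1, 5)}
print(p_y)  # pmf values 1/2, 1/3, 1/8, 1/24
```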
[Figure 2: The conditional pdfs $f_{Y|S}(y|-1)$, $f_{Y|S}(y|0)$, and $f_{Y|S}(y|+1)$ for $\lambda = 1$.]

4. Ternary signaling. Let the signal $S$ be a random variable defined as follows:
$$S = \begin{cases} -1 & \text{with probability } \tfrac{1}{3}, \\ 0 & \text{with probability } \tfrac{1}{3}, \\ +1 & \text{with probability } \tfrac{1}{3}. \end{cases}$$
The signal is sent over a channel with additive Laplacian noise $Z$, i.e., $Z$ is a Laplacian random variable with pdf
$$f_Z(z) = \frac{\lambda}{2} e^{-\lambda |z|}, \quad -\infty < z < \infty.$$
The signal $S$ and the noise $Z$ are assumed to be independent, and the channel output is their sum $Y = S + Z$.

(a) Find $f_{Y|S}(y|s)$ for $s = -1, 0, +1$. Sketch the conditional pdfs on the same graph.
(b) Find the optimal decoding rule $D(Y)$ for deciding whether $S$ is $-1$, $0$, or $+1$. Give your answer in terms of ranges of values of $Y$.
(c) Find the probability of decoding error for $D(y)$ in terms of $\lambda$.

Solution:

(a) We use a trick here that is used several times in the lecture notes. Since $Y = S + Z$ and $Z$ and $S$ are independent, the conditional pdf is
$$f_{Y|S}(y|s) = f_Z(y - s) = \frac{\lambda}{2} e^{-\lambda |y - s|}.$$
The plots are shown for $\lambda = 1$ in Figure 2.
(b) The optimal decoding rule is MAP: $D(y) = s$, where $s$ maximizes
$$p(s|y) = \frac{f(y|s)\, p(s)}{f(y)}.$$
Since $p_S(s)$ is the same for $s = -1, 0, +1$, the MAP rule becomes the maximum-likelihood decoding rule $D(y) = \arg\max_s f(y|s)$. The conditional pdfs are plotted in Figure 2. By inspection, the ML rule reduces to
$$D(y) = \begin{cases} -1 & y < -\tfrac{1}{2}, \\ 0 & -\tfrac{1}{2} < y < +\tfrac{1}{2}, \\ +1 & y > +\tfrac{1}{2}. \end{cases}$$

(c) Inspection of Figure 2 shows how to calculate the probability of error:
$$P_e = \sum_i P\{\text{error} \mid i \text{ sent}\} P\{i \text{ sent}\} = \frac{1}{3} \sum_i P\{\text{error} \mid i \text{ sent}\}$$
$$= \tfrac{1}{3}\left(1 - P\{-\tfrac{1}{2} < S + Z < +\tfrac{1}{2} \mid S = 0\}\right) + \tfrac{1}{3} P\{S + Z > -\tfrac{1}{2} \mid S = -1\} + \tfrac{1}{3} P\{S + Z < +\tfrac{1}{2} \mid S = +1\}$$
$$= \tfrac{1}{3}\left(1 - P\{-\tfrac{1}{2} < Z < \tfrac{1}{2}\}\right) + \tfrac{1}{3} P\{Z > \tfrac{1}{2}\} + \tfrac{1}{3} P\{Z < -\tfrac{1}{2}\}$$
$$= \tfrac{1}{3} P\{Z < -\tfrac{1}{2}\} + \tfrac{1}{3} P\{Z > \tfrac{1}{2}\} + \tfrac{1}{3} P\{Z > \tfrac{1}{2}\} + \tfrac{1}{3} P\{Z < -\tfrac{1}{2}\}$$
$$= \tfrac{2}{3}\left(P\{Z < -\tfrac{1}{2}\} + P\{Z > \tfrac{1}{2}\}\right) = \tfrac{4}{3} P\{Z > \tfrac{1}{2}\} \quad \text{(by symmetry)}$$
$$= \tfrac{4}{3} \int_{1/2}^{\infty} \frac{\lambda}{2} e^{-\lambda z} \, dz = \tfrac{2}{3} e^{-\lambda/2}.$$

5. Signal or no signal. Consider a communication system that is operated only from time to time. When the communication system is in the normal mode (denoted by $M = 1$), it transmits a random signal $S = X$ with
$$X = \begin{cases} +1 & \text{with probability } 1/2, \\ -1 & \text{with probability } 1/2. \end{cases}$$
When the system is in the idle mode (denoted by $M = 0$), it does not transmit any signal ($S = 0$). Both normal and idle modes occur with equal probability. Thus
$$S = \begin{cases} X & \text{with probability } 1/2, \\ 0 & \text{with probability } 1/2. \end{cases}$$
The receiver observes $Y = S + Z$, where the ambient noise $Z \sim \text{U}[-1, 1]$ is independent of $S$.
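A Monte Carlo sketch of the ternary scheme (again, an optional check rather than part of the solution) confirms $P_e = \frac{2}{3} e^{-\lambda/2}$. It uses the fact that a Laplacian($\lambda$) variable can be generated as an Exp($\lambda$) magnitude with a random sign; the choice $\lambda = 1$ and the sample size are arbitrary.

```python
import math
import random

rng = random.Random(4)

lam, N = 1.0, 300_000
errors = 0
for _ in range(N):
    s = rng.choice((-1, 0, 1))
    z = rng.expovariate(lam) * rng.choice((-1, 1))  # Laplacian(lam) noise
    y = s + z
    s_hat = -1 if y < -0.5 else (0 if y < 0.5 else 1)  # ML rule from part (b)
    errors += (s_hat != s)

p_emp = errors / N
p_theory = (2 / 3) * math.exp(-lam / 2)
print(p_emp, p_theory)  # both should be close to 0.404
```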
(a) Find and sketch the conditional pdf $f_{Y|M}(y|1)$ of the receiver observation $Y$ given that the system is in the normal mode.
(b) Find and sketch the conditional pdf $f_{Y|M}(y|0)$ of the receiver observation $Y$ given that the system is in the idle mode.
(c) Find the optimal decoder $D(y)$ for deciding whether the system is normal or idle. Provide the answer in terms of intervals of $y$.
(d) Find the associated probability of error.

Solution:

(a) If $M = 1$,
$$Y = \begin{cases} +1 + Z & \text{with probability } 1/2, \\ -1 + Z & \text{with probability } 1/2. \end{cases}$$
Hence, we have
$$f_{Y|M}(y|1) = \tfrac{1}{2} f_Z(y - 1) + \tfrac{1}{2} f_Z(y + 1) = \begin{cases} \tfrac{1}{4} & -2 \le y \le 2, \\ 0 & \text{otherwise}. \end{cases}$$

(b) If $M = 0$, then $Y = Z$, so
$$f_{Y|M}(y|0) = f_Z(y) = \begin{cases} \tfrac{1}{2} & -1 \le y \le 1, \\ 0 & \text{otherwise}. \end{cases}$$

(c) Since both modes are equally likely, the optimal MAP decoding rule reduces to the ML rule
$$d(y) = \begin{cases} 0 & \text{if } f_{Y|M}(y|0) > f_{Y|M}(y|1), \\ 1 & \text{otherwise} \end{cases}
= \begin{cases} 0 & \text{if } -1 < y < 1, \\ 1 & \text{otherwise}. \end{cases}$$

(d) The probability of error is given by
$$P\{M \ne d(Y)\} = P\{M = 1, -1 < Y < 1\} = P\{M = 1\} P\{-1 < Y < 1 \mid M = 1\} = \tfrac{1}{2} \cdot \tfrac{1}{2} = \tfrac{1}{4}.$$

6. Function of uniform random variables. Let $X$ and $Y$ be two independent $\text{U}[0,1]$ random variables. Find the probability density function (pdf) of $Z = (X + Y) \bmod 1$ (i.e., $Z = X + Y$ if $X + Y \le 1$ and $Z = X + Y - 1$ if $X + Y > 1$).

Solution: Let $W = X + Y$. Since $X$ and $Y$ are independent, the pdf of $W$ is simply the convolution of the pdf of $X$ and the pdf of $Y$. The convolution of these two uniform densities is the triangular pdf
$$f_W(w) = \begin{cases} w & \text{if } 0 \le w \le 1, \\ 2 - w & \text{if } 1 \le w \le 2, \\ 0 & \text{otherwise}. \end{cases}$$
Since $Z = W \bmod 1$, for $0 \le z \le 1$ we have $f_Z(z) = f_W(z) + f_W(z + 1) = z + (2 - (z + 1)) = 1$, so it is easy to see that $Z \sim \text{U}[0,1]$.

7. Maximal correlation.

(a) For any pair of random variables $(X, Y)$, show that
$$F_{X,Y}(x,y) \le \min\{F_X(x), F_Y(y)\}.$$
Now let $F$ and $G$ be continuous and invertible cdfs and let $X \sim F$.

(b) Find the distribution of
$$Y = G^{-1}(F(X)).$$

(c) Show that
$$F_{X,Y}(x,y) = \min\{F(x), G(y)\}.$$

Solution:

(a) We have
$$F_{X,Y}(x,y) = P\{X \le x, Y \le y\} \le P\{X \le x\} = F_X(x),$$
and similarly, $F_{X,Y}(x,y) \le F_Y(y)$. Thus,
$$F_{X,Y}(x,y) \le \min\{F_X(x), F_Y(y)\}.$$

(b) We have
$$F_Y(y) = P\{Y \le y\} = P\{G^{-1}(F(X)) \le y\} = P\{F(X) \le G(y)\} = P\{X \le F^{-1}(G(y))\} = F(F^{-1}(G(y))) = G(y).$$

(c) We have
$$F_{X,Y}(x,y) = P(X \le x, Y \le y) = P(X \le x, X \le F^{-1}(G(y))) = P(X \le \min\{x, F^{-1}(G(y))\}) = \min\{F(x), F(F^{-1}(G(y)))\} = \min\{F(x), G(y)\}.$$
From part (a), this is the maximal joint cdf for any $(X, Y)$ with the given marginal cdfs $F(x)$ and $G(y)$.
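The claim in Problem 6 that $Z = (X + Y) \bmod 1$ is uniform on $[0,1]$ can be spot-checked empirically (an optional sketch, not part of the original solution); the evaluation points and sample size below are arbitrary.

```python
import random

rng = random.Random(5)

N = 200_000
zs = [(rng.random() + rng.random()) % 1.0 for _ in range(N)]

# If Z ~ Unif[0,1], then P{Z <= t} = t for every t in [0,1].
emp = {t: sum(z <= t for z in zs) / N for t in (0.25, 0.5, 0.75)}
print(emp)  # each empirical cdf value should be close to t itself
```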