Math 4 - Spring 8 - Practice for the Final Exam

1. Let X, Y, Z be three independent random variables uniformly distributed on [0, 1]. Let W := X² + Y². Compute P(W ≤ t) for 0 ≤ t ≤ 1. Honors: Compute the CDF of W.

2. Assume that you are throwing a dart at a round target of radius 1, and that every point of the target is equally likely to be hit. Take the center of the target as the origin and let (X, Y) be the point where the dart hits. Let W := |X| be the distance of the first coordinate from the center. Compute E[W].

3. (Honors) Use the Chernoff bound to obtain the following bound for the tail function Φ(x) := P(Z ≥ x) of a standard normal Z: Φ(x) ≤ e^{−x²/2} for x > 0.

4. Let X, Y be independent exponential random variables with expectation 1. Compute the pdf of W := X − Y. Compute the following probabilities: P(|W| ≥ t + s | |W| ≥ s) and P(|W| ≥ t). What can you say about |W|?

5. Let X, Y be two random variables with joint density function f_{X,Y}(x, y) = x e^{−x} if 0 ≤ y ≤ x, 0 ≤ x < ∞, and f_{X,Y} = 0 otherwise. Compute Cov(X, Y).

6. Let X, Y, Z be independent and uniformly distributed on (0, 1). a. Compute the probability P(X ≥ Y + Z). b. Compute the expectation of the random variable W = e^X Y Z.

7. Let X be a random variable with M_X(t) = 1/10 + (2/5)e^{−t} + (1/5)e^{5t} + (3/10)e^{t}. Compute E[X] and P(X ≤ 0). Let X_1, X_2, X_3 be three independent copies of X. Compute the moment generating function of W := X_1 + X_2 + X_3. Then compute the probability P(W = 0).

8. Let X, Y, Z be the number of units produced by three different factories. Assume that E[X] = 2, E[Y] = 3, E[Z] = 5. Let W := X + Y + Z be the total production.
1. Compute P(W ≥ 15).
2. Assume that X, Y are independent and var(X) = var(Y) = var(Z) = 1, Cov(Z, X + Y) = 2. Compute again the probability P(W ≥ 15).
3. Assume that X, Y, Z are independent Poisson with expectations 2, 3, 5. Compute P(W = 15).

9. Each day there are 4 different jobs that may take some of your time to complete. On each day, each job appears with probability p, independently of the others. Job i requires time X_i to be completed, where the X_i are independent (also of which jobs appear) and exponentially distributed with expectation μ. Let Y be the total time you spend in a day completing all (if any) of the four jobs. Compute E[Y] and var(Y). (Honors) Compute also the moment generating function of Y, and use the Chernoff bound to bound the probability P(Y ≥ 7).

10. A system consists of 100 parts that work independently, each of which fails with probability 1/2 during a given period of time. The system stays functional as long as at least 70 of the parts are working. Use the central limit theorem to approximate the probability that the system does not fail.

11. The number N of people entering a store in a day follows a Poisson distribution with expectation λ. Let X_i be the amount spent by the i-th customer, and assume that the X_i are exponential with expectation μ, independent of each other and of N. Let W be the total amount spent in the store. Compute the expectation, the variance and the moment generating function of W.

12. Let N be a discrete random variable uniformly distributed on the numbers 1, 2, 3. Let Z_i, i = 1, 2, 3, be independent standard normal random variables (independent of N). Let Y := Z_1 + ⋯ + Z_N. Is it true that Y is also normal? Explain your answer.

13. (Honors) Let X_1, X_2, … be a sequence of random variables converging to 0 with probability 1. Show that X_n converges to 0 in probability.

Bring a Blue book with you. Additional office hours: May st, :3-:3 and 4:-5:. The 3rd exam will be on Chapters 3 (only 3.4, 3.5 and 3.6), 4, 5.
The final exam schedule: 4- and 4-5 is on May 4th, 8: till :; 4-53 is on May 4th, :3 till :3; and 4-5 is on May 7th, 8: till :, in class.

Hints-Solutions
1. Let B be the unit disk in R². For 0 ≤ t ≤ 1, the quarter of the disk √t·B with x, y ≥ 0 lies inside the unit square, on which (X, Y) is uniform, so
P(W ≤ t) = P(X² + Y² ≤ t) = (1/4) vol(√t·B) = (π/4) t.

2. Let (X, Y) be the random vector with density f_{X,Y}(x, y) = 1/π if x² + y² ≤ 1 and 0 otherwise. We have that
f_X(t) = ∫_{−√(1−t²)}^{√(1−t²)} (1/π) dy = (2/π)√(1 − t²), −1 ≤ t ≤ 1.
Hence
F_{|X|}(s) = P(|X| ≤ s) = P(−s ≤ X ≤ s) = (2/π) ∫_{−s}^{s} √(1 − t²) dt = (4/π) ∫_0^{s} √(1 − t²) dt.
Taking derivatives we get that
f_{|X|}(x) = (4/π)√(1 − x²), 0 ≤ x ≤ 1,
and, substituting z = 1 − x²,
E[|X|] = (4/π) ∫_0^1 x√(1 − x²) dx = (2/π) ∫_0^1 √z dz = 4/(3π).

3. By Chernoff's bound, Φ(t) = P(Z ≥ t) ≤ e^{−st} M_Z(s) = e^{−st + s²/2} for all s > 0. Since max_{s>0} {st − s²/2} = t²/2 (attained at s = t when t > 0), we conclude that Φ(t) ≤ e^{−t²/2}.

4. We have that F_{−Y}(t) = P(−Y ≤ t) = P(Y ≥ −t) = e^{t} for t ≤ 0, so f_{−Y}(t) = e^{t}, t ≤ 0. For s ≥ 0,
f_W(s) = ∫ f_X(t) f_{−Y}(s − t) dt = ∫_s^∞ e^{−t} e^{s−t} dt = e^{s} ∫_s^∞ e^{−2t} dt = (1/2) e^{−s}.
Working in the same way for s ≤ 0, we conclude that f_W(s) = (1/2) e^{−|s|} for all s. Moreover,
F_{|W|}(t) = P(|W| ≤ t) = P(−t ≤ W ≤ t) = ∫_{−t}^{t} (1/2) e^{−|s|} ds = 1 − e^{−t} = F_Q(t),
where Q is an exponential random variable with expectation 1. So |W| is exponential with expectation 1 and has the memoryless property: P(|W| ≥ t + s | |W| ≥ s) = P(|W| ≥ t) = e^{−t}.
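The closed-form answers in solutions 1, 2 and 4 are easy to sanity-check numerically. Below is a minimal Monte Carlo sketch; the target values π t/4, 4/(3π) and e^{−1} are the ones derived above, while the sample size, seed and tolerance are arbitrary choices, not part of the problems.

```python
# Monte Carlo sanity checks for solutions 1, 2 and 4.
# These only verify the closed-form answers numerically; they are not proofs.
import math
import random

random.seed(0)
N = 200_000

# Solution 1: X, Y ~ Unif[0,1], W = X^2 + Y^2, so P(W <= t) = (pi/4) t for t <= 1.
t = 0.5
hits = sum(random.random() ** 2 + random.random() ** 2 <= t for _ in range(N))
print(abs(hits / N - math.pi * t / 4) < 0.01)   # True (within Monte Carlo error)

# Solution 2: (X, Y) uniform on the unit disk, so E|X| = 4 / (3*pi).
total, n = 0.0, 0
while n < N:
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    if x * x + y * y <= 1:          # rejection sampling from the unit disk
        total += abs(x)
        n += 1
print(abs(total / N - 4 / (3 * math.pi)) < 0.01)  # True

# Solution 4: X, Y ~ Exp(1) independent implies |X - Y| ~ Exp(1),
# so P(|X - Y| > 1) = e^{-1}.
w = [abs(random.expovariate(1) - random.expovariate(1)) for _ in range(N)]
tail = sum(v > 1 for v in w) / N
print(abs(tail - math.exp(-1)) < 0.01)            # True
```

With 200,000 samples the Monte Carlo standard error is roughly 0.001, so the 0.01 tolerance is comfortably wide.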
5. Cov(X, Y) = E[XY] − E[X]E[Y], where
E[XY] = ∫_0^∞ ∫_0^x xy f_{X,Y}(x, y) dy dx.
Work in the same way to compute E[X] and E[Y].

6. a. P(X ≥ Y + Z) = ∫_0^1 ∫_0^{1−y} ∫_{y+z}^{1} dx dz dy = ∫_0^1 (1 − y)²/2 dy = 1/6.
b. By independence, E[e^X Y Z] = E[e^X] E[Y] E[Z] = (∫_0^1 e^x dx)(∫_0^1 y dy)(∫_0^1 z dz) = (e − 1)/4.

7. Set Y to be the discrete random variable whose PMF puts, at each value a appearing in an exponent e^{at} of M_X(t), the corresponding coefficient as its probability mass (the constant term is the mass at 0). Then M_X(t) = M_Y(t), and since the moment generating function determines the distribution, X has the same PMF as Y. Hence P(X ≤ 0) is the sum of the masses at the non-positive values, and E[X] = dM_X(t)/dt |_{t=0}. Also,
M_{X_1+X_2+X_3}(t) = M_X(t)³,
and expanding this into a combination of exponentials e^{at}, the coefficient of e^{at} is P(X_1 + X_2 + X_3 = a); in particular, P(X_1 + X_2 + X_3 = 0) is the constant term of the expansion.

8. 1. By Markov's inequality, P(W ≥ 15) ≤ E[W]/15 = 10/15 = 2/3.
2. We have that var(X + Y + Z) = var(X + Y) + var(Z) + 2 Cov(X + Y, Z) = 1 + 1 + 1 + 2·2 = 7, using that var(X + Y) = var(X) + var(Y) by independence. By Chebyshev's inequality, P(W ≥ 15) = P(W − 10 ≥ 5) ≤ P(|W − 10| ≥ 5) ≤ 7/25.
3. X + Y + Z is Poisson with parameter 2 + 3 + 5 = 10, so P(W = 15) = e^{−10} 10^{15}/15!.

9. Let N be the number of jobs appearing during the day; then N is binomial with parameters 4 and p, where p is the probability that a given job appears, and write μ for the expectation of each X_i. The total time is Y = X_1 + ⋯ + X_N, so
E[Y] = E[N] E[X_1] = 4pμ,
var(Y) = E[N] var(X_1) + (E[X_1])² var(N) = 4pμ² + μ² · 4p(1 − p) = 4pμ²(2 − p).
(Honors) M_Y(t) = M_N(log M_{X_1}(t)) = (1 − p + p/(1 − μt))⁴ for t < 1/μ, and by Chernoff's bound P(Y ≥ 7) ≤ min_{t>0} e^{−7t} M_Y(t).

10. Let X_i be independent Bernoulli random variables taking the value 1 if the i-th part works (each with probability 1/2), and let X = X_1 + ⋯ + X_100 be the total number of working parts, so E[X] = 50 and sd(X) = 5. Using the De Moivre–Laplace theorem we get that
P(X ≥ 70) ≈ 1 − Φ((70 − 50)/5) = 1 − Φ(4).
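Solutions 6a and 6b can also be checked by simulation. A small sketch follows; it assumes, as the solution above does, that the event in 6a is {X ≥ Y + Z}, and the sample size and seed are arbitrary.

```python
# Numerical check of 6a and 6b for X, Y, Z independent Unif(0,1).
import math
import random

random.seed(1)
N = 200_000
samples = [(random.random(), random.random(), random.random()) for _ in range(N)]

# 6a: P(X >= Y + Z) = 1/6 (volume of the corner region of the unit cube).
p = sum(x >= y + z for x, y, z in samples) / N
print(abs(p - 1 / 6) < 0.01)                   # True (within Monte Carlo error)

# 6b: E[e^X * Y * Z] = E[e^X] E[Y] E[Z] = (e - 1)/4 by independence.
m = sum(math.exp(x) * y * z for x, y, z in samples) / N
print(abs(m - (math.e - 1) / 4) < 0.01)        # True
```

Reusing the same triples for both estimates is fine here, since each line is an ordinary sample average over i.i.d. draws.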
11. Let W := X_1 + ⋯ + X_N, and write λ := E[N] and μ := E[X_1]. Since N is Poisson, var(N) = λ and M_N(t) = e^{λ(e^t − 1)}; since X_1 is exponential, var(X_1) = μ² and M_{X_1}(t) = 1/(1 − μt) for t < 1/μ. So we get that
E[W] = E[N] E[X_1] = λμ,
var(W) = E[N] var(X_1) + (E[X_1])² var(N) = λμ² + μ²λ = 2λμ²,
M_W(t) = M_N(log M_{X_1}(t)) = e^{λ(M_{X_1}(t) − 1)} = e^{λμt/(1 − μt)}, t < 1/μ.

12. Suppose first, say, that N were binomial with parameters (2, 1/2), so that M_N(t) = (1/4)(1 + e^t)². If W := Z_1 + ⋯ + Z_N, then M_W(t) = E[(e^{t²/2})^N] = (1/4)(1 + e^{t²/2})². But this is not the moment generating function of a normal, so W is not a normal r.v. The same computation with N uniform on {1, 2, 3} gives M_Y(t) = (1/3)(e^{t²/2} + e^{t²} + e^{3t²/2}), which again is not the MGF of a normal; so Y is not normal.

13. See book, page 9.
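The conclusion of 12 can also be seen numerically: with Y = Z_1 + ⋯ + Z_N and N uniform on {1, 2, 3}, var(Y) = E[N] = 2, so if Y were normal it would be N(0, 2) with fourth moment 3·2² = 12; conditioning on N gives the true fourth moment E[3N²] = 14. A simulation sketch (sample size and seed arbitrary):

```python
# Problem 12: show Y = Z_1 + ... + Z_N (N uniform on {1,2,3}) is not normal,
# by estimating E[Y^4]. A normal with the same variance would give 12;
# the true value is E[3 N^2] = 14.
import random

random.seed(2)
N_SAMPLES = 400_000
fourth = 0.0
for _ in range(N_SAMPLES):
    n = random.randint(1, 3)                       # N uniform on {1, 2, 3}
    y = sum(random.gauss(0, 1) for _ in range(n))  # Y | N = n  ~  N(0, n)
    fourth += y ** 4
fourth /= N_SAMPLES
print(13 < fourth < 15)   # True: the estimate is near 14, not 12
```

Matching a single higher moment against the normal one is enough here, because the MGF computation above already identifies exactly where the discrepancy comes from.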