1 Basic continuous random variable problems


Name ____________    M362K Final

Here are problems concerning material from Chapters 5 and 6. To review the other chapters, look over previous practice sheets for the two exams, previous quizzes, previous homeworks, and work out problems in the book that weren't on the homework.

Contents

1 Basic continuous random variable problems
2 Expected value
3 Computational problems
4 Independent random variables
5 Marginal and conditional densities
6 Covariance/Variance/Expected Value
7 Chebyshev/Markov
8 Normal random variables
9 Normal approximations / central limit theorem

1 Basic continuous random variable problems

1. The lifetime of a certain battery is an exponential random variable X with mean 100 (in hours). Given that the battery has already lasted 100 hours, what is the probability that it will last another 100 hours? In other words, P(X ≥ 200 | X ≥ 100) = ? Hint: the density function of X is

    f_X(x) = (1/100) e^{−x/100}   if x ≥ 0,
           = 0                    otherwise.

Solution. By the memoryless property,

    P(X ≥ 200 | X ≥ 100) = P(X ≥ 100) = ∫_100^∞ (1/100) e^{−x/100} dx = −e^{−x/100} ]_100^∞ = e^{−1}.
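The memoryless calculation above is easy to sanity-check numerically. The following is a small sketch (not part of the original solutions), assuming the reconstructed mean of 100 hours; it compares the conditional probability with e^{−1}.

```python
import math
import random

# Monte Carlo check of problem 1: for X exponential with mean 100 (assumed value),
# P(X >= 200 | X >= 100) should equal P(X >= 100) = exp(-1).
random.seed(0)
n = 200_000
samples = [random.expovariate(1 / 100) for _ in range(n)]  # rate = 1/mean

past_100 = [x for x in samples if x >= 100]
conditional = sum(x >= 200 for x in past_100) / len(past_100)
unconditional = len(past_100) / n

print(conditional, unconditional, math.exp(-1))  # all three should be close
```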

2. Let X be a random variable with probability density function

    f(x) = c e^{−|x|},   −∞ < x < ∞.

Find c. Then find P(−2 < X < 1).

2 Expected value

3. A certain road is 100 miles long from point A to point B. There are bus repair stations at points A, B and C, where C is located 30 miles from point A. If the next accident occurs at a point that is uniformly distributed along the length of the road, what is the expected distance from the accident to the closest repair station?

Solution. The question asks for E[g(X)] where X is uniformly distributed on [0, 100] and

    g(x) = x          if 0 ≤ x < 15,
         = |30 − x|   if 15 ≤ x < 65,
         = 100 − x    if 65 ≤ x ≤ 100.

So the answer is

    E[g(X)] = ∫_0^100 g(x)/100 dx
            = (1/100) ( ∫_0^15 x dx + ∫_15^30 (30 − x) dx + ∫_30^65 (x − 30) dx + ∫_65^100 (100 − x) dx )
            = (1/100) ( 15²/2 + 15²/2 + 35²/2 + 35²/2 ) = 14.5.
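A quick numerical check of the 14.5 figure; a sketch only, using the station locations 0, 30 and 100 reconstructed above.

```python
import random

# Monte Carlo estimate of E[g(X)] for problem 3: X uniform on [0, 100],
# repair stations at miles 0, 30 and 100.
random.seed(0)
trials = 500_000
total = 0.0
for _ in range(trials):
    x = random.uniform(0, 100)
    total += min(x, abs(x - 30), 100 - x)  # distance to the closest station
print(total / trials)  # should be close to 14.5
```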

4. You and your friend agree to meet at a coffee shop around noon. Suppose your arrival time (in minutes after noon) is exponential with mean 10 and your friend's is also exponential with mean 10. Assuming the two arrival times are independent, how much time will you wait on average for your friend? It is OK to set up the integral without evaluating.

3 Computational problems

5. Let X be a continuous random variable with density

    f(x) = 1/x   if 1 < x < e,
         = 0     otherwise.

(a) Find the distribution function of X.
(b) P(X > 2) = ?
(c) E[X] = ?
(d) E[X²] = ?
(e) Find the distribution function of Y = X².
(f) Find the density function of Y.

Solution. The distribution function of X is

    F_X(t) = P(X ≤ t) = ∫_1^t 1/x dx = log(t)   for 1 ≤ t ≤ e

(F_X(t) = 0 if t < 1 and F_X(t) = 1 if t > e). Let Y = X². Then F_Y, the distribution function of Y, is

    F_Y(y) = P(Y ≤ y) = P(X² ≤ y) = P(X ≤ √y) = log(√y) = (1/2) log(y)   for 1 ≤ y ≤ e².

The density function f_Y is the derivative: f_Y(y) = F_Y'(y) = 1/(2y) for 1 ≤ y ≤ e², and f_Y(y) = 0 otherwise.

6. Let X, Y be random variables with joint density function

    f(x, y) = cxy   if 0 ≤ x ≤ y ≤ 2,
            = 0     otherwise,

where c > 0 is a constant. In the questions below, it is OK to leave your answer in integral form without evaluating it.

(a) Find c.
(b) Find P(X + Y ≤ 2).
(c) Find E[X].
(d) Find the density of Y.
(e) Find f_X(x | Y = 1), the density of X given that Y = 1.
(f) Are X and Y independent?

Solution. Note

    1 = ∫_0^2 ∫_0^y cxy dx dy = c ∫_0^2 y [x²/2]_0^y dy = c ∫_0^2 y³/2 dy = c y⁴/8 ]_0^2 = 2c.

So c = 1/2. For the second part,

    P(X + Y ≤ 2) = ∫∫_{0 ≤ x ≤ y, x + y ≤ 2} xy/2 dx dy.

For the remaining parts,

    E[X] = ∫∫ x f_{X,Y}(x, y) dx dy,    f_Y(y) = ∫ f_{X,Y}(x, y) dx,    f_X(x | Y = 1) = f(x, 1) / ∫ f(t, 1) dt.

X and Y are not independent.
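Problem 6 leaves P(X + Y ≤ 2) in integral form. As a sketch, plain Monte Carlo integration over the square [0, 2] × [0, 2] confirms that c = 1/2 normalizes the density and puts the probability near 1/6; the 1/6 value is my own evaluation of the integral, not stated in the original.

```python
import random

# Monte Carlo integration for problem 6: with c = 1/2 the density f(x,y) = xy/2
# on {0 <= x <= y <= 2} integrates to 1, and P(X + Y <= 2) comes out near 1/6.
# Sample uniformly on the square [0,2] x [0,2], whose area is 4.
random.seed(0)
trials = 500_000
total_mass = 0.0
mass_in_event = 0.0
for _ in range(trials):
    x, y = random.uniform(0, 2), random.uniform(0, 2)
    if x <= y:                     # inside the support of f
        f = x * y / 2
        total_mass += f
        if x + y <= 2:
            mass_in_event += f
print(4 * total_mass / trials, 4 * mass_in_event / trials)  # ~ 1.0 and ~ 0.167
```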

7. Let X, Y be random variables with joint density function

    f(x, y) = c(x + y)   if 0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
            = 0          otherwise,

where c > 0 is a constant. Compute the following. You can leave your answers in integral form.

(a) c = ?
(b) P(X + Y ≤ 1)

4 Independent random variables

8. Let X, Y be exponential random variables with parameters λ_1, λ_2. Suppose X and Y are independent.

(a) Compute the density function of X + Y.
(b) P(X + Y ≤ 1) = ?

9. Let X and Y be uniformly distributed over the interval [0, 1]. Compute the density function of X − Y assuming X and Y are independent. Hint: find the distribution function of X − Y first.

10. Let X, Y be independent exponential random variables such that E[X] = 1 and E[Y] = 2.

(a) Find the joint density function of (X, Y).
(b) Find P(X ≤ 5, Y ≤ …).

11. Suppose X, Y, Z are uniformly distributed over (0, 1). Suppose they are also independent.

(a) Find P(X² + Y² ≤ Z).
(b) Compute P(X ≤ Y + Z). (It is OK to just set up the integral without computing.)

12. You have two friends, Alice and Bob, and they both promised to call you sometime after 7pm on Saturday. So you are sitting at home at 7pm on Saturday and you decide to go out with either Alice or Bob, whomever calls you first. Let A be the amount of time before Alice calls you, and B the amount of time before Bob calls you. Assume A and B are independent exponential random variables; the expectation of A is 6 minutes and the expectation of B is 3 minutes. What is the probability that you will go out with Alice?

Solution. The question asks for P(B ≥ A). Note that the joint density function of (A, B) is f(x, y) = (1/6)(1/3) exp(−x/6 − y/3) if x, y > 0 and 0 otherwise. So the answer is

    P(B ≥ A) = ∫_0^∞ ∫_x^∞ (1/6)(1/3) exp(−x/6 − y/3) dy dx
             = ∫_0^∞ ∫_0^y (1/6)(1/3) exp(−x/6 − y/3) dx dy
             = ∫_0^∞ [ −(1/3) exp(−x/6 − y/3) ]_{x=0}^{x=y} dy
             = ∫_0^∞ (1/3) [ exp(−y/3) − exp(−y/6 − y/3) ] dy
             = ∫_0^∞ (1/3) [ exp(−y/3) − exp(−y/2) ] dy
             = 1 − 2/3 = 1/3.
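A simulation sketch of problem 12. Note that the answer depends only on the ratio of the two means, so it is 1/3 whether the means are 6 and 3 minutes (as read above) or any other 2-to-1 ratio.

```python
import random

# Simulation of problem 12: A, B independent exponentials with means 6 and 3.
# P(go out with Alice) = P(A <= B), which should come out near 1/3.
random.seed(0)
trials = 500_000
alice_first = sum(
    random.expovariate(1 / 6) <= random.expovariate(1 / 3) for _ in range(trials)
)
print(alice_first / trials)  # ~ 0.333
```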

5 Marginal and conditional densities

13. Suppose X, Y are random variables with joint density function

    f_{X,Y}(x, y) = 4xy   if 0 ≤ x, y ≤ 1,
                  = 0     otherwise.

(a) Find the density of X.
(b) Are X and Y independent?

Solution.

    f_X(x) = ∫ f_{X,Y}(x, y) dy = ∫_0^1 4xy dy = 2xy² ]_0^1 = 2x   if 0 ≤ x ≤ 1.

It equals zero otherwise. Similarly, f_Y(y) = 2y for 0 ≤ y ≤ 1 and equals zero otherwise. X and Y are independent because f_{X,Y}(x, y) = f_X(x) f_Y(y).

14. Let (X, Y) be uniformly distributed over the circle of radius 10 with center at (0, 0).

(a) Compute the density of X.
(b) Compute f_X(x | Y = 6), the density of X conditioned on Y = 6.

6 Covariance/Variance/Expected Value

15. Suppose that every day, the price of a certain stock either increases by 30% or decreases by 10%, and that the probability that it increases is 1/2. If it starts out at 100, after n days what is the expected value of the stock? You can assume the different days are independent.

Solution. The answer is E[100 · X_1 ⋯ X_n] where each X_i is a random variable satisfying P(X_i = 1.3) = P(X_i = 0.9) = 1/2. Because of independence this is 100 E[X_1] ⋯ E[X_n]. Since E[X_i] = 1.1, the expected value of the stock is 100 (1.1)^n.
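A short simulation of problem 15 as a sanity check; the starting price of 100 and the 30%/10% moves are the values reconstructed above.

```python
import random

# Simulation of problem 15: each day the price is multiplied by 1.3 or 0.9 with
# probability 1/2 each, so after n days the expected price is 100 * 1.1**n.
random.seed(0)
n_days, trials = 10, 200_000
total = 0.0
for _ in range(trials):
    price = 100.0
    for _ in range(n_days):
        price *= 1.3 if random.random() < 0.5 else 0.9
    total += price
print(total / trials, 100 * 1.1 ** n_days)  # both ~ 259.4
```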

16. Suppose X, Y, Z are pairwise uncorrelated (this means Cov(X, Y) = Cov(Y, Z) = Cov(X, Z) = 0), E[X] = E[Y] = E[Z] = 0 and Var(X) = Var(Y) = Var(Z) = 1. Find Cov(X + Y, Z − Y).

Answer. Because they are uncorrelated, Cov(X, Y) = 0 (for example). Because covariance is bilinear,

    Cov(X + Y, Z − Y) = Cov(X, Z) − Cov(X, Y) + Cov(Y, Z) − Cov(Y, Y) = −Cov(Y, Y) = −Var(Y) = −1.

17. Ten hunters are shooting at 20 different ducks. Each hunter aims at a duck at random and hits it with probability 1/2, independently of the other hunters.

(a) Let X be the number of hunters that hit a duck. What is E[X]? What is Var(X)?
(b) What is the probability that the first hunter does not hit the first duck? (This means that either he does not aim at the first duck or he aims and misses.)
(c) Let Y be the number of ducks hit. What is E[Y]? What is Var(Y)?

Partial Solution. (a) X = X_1 + ⋯ + X_10 where X_i = 1 if the i-th hunter hits a duck and X_i = 0 otherwise. Since E[X_i] = P(X_i = 1) = 1/2, we have E[X] = 5.

(b) This probability is 1 − (1/20)(1/2) = 39/40.

(c) Y = Y_1 + ⋯ + Y_20 where Y_i = 1 if the i-th duck is hit. Since E[Y_i] = 1 − P(Y_i = 0) and P(Y_i = 0) = (39/40)^10, we have E[Y] = 20(1 − (39/40)^10).

18. Suppose n people go to a dinner party. Each person checks in their hat. The host accidentally mixes up the hats and, in a vain attempt to avoid embarrassment, hands them back to the guests at random. Let X be the number of people that receive his/her hat back. Find the variance of X. (Hint: again you can write X = X_1 + ⋯ + X_n for a suitable choice of X_i's.)

Solution. Now

    Var(X) = Var(Σ_i X_i) = Σ_{i=1}^n Var(X_i) + 2 Σ_{i<j} Cov(X_i, X_j).

Also

    Var(X_i) = E[X_i²] − E[X_i]² = E[X_i] − E[X_i]² = (1/n) − (1/n)²,

and if i ≠ j then

    Cov(X_i, X_j) = E[X_i X_j] − E[X_i]E[X_j] = (1/n)(1/(n − 1)) − (1/n)².

So

    Var(X) = n[(1/n) − (1/n)²] + 2 (n choose 2) [(1/n)(1/(n − 1)) − (1/n)²]
           = 1 − (1/n) + 1 − (n − 1)/n = 1 − 1/n + 1/n = 1.
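Problem 18's surprisingly clean answer (variance exactly 1, for every n ≥ 2) is easy to confirm by simulating random permutations; a sketch:

```python
import random

# Simulation of problem 18: X = number of fixed points of a uniformly random
# permutation of n items. Both E[X] and Var(X) should come out close to 1.
random.seed(0)
n, trials = 10, 200_000
counts = []
for _ in range(trials):
    perm = list(range(n))
    random.shuffle(perm)
    counts.append(sum(i == p for i, p in enumerate(perm)))
mean = sum(counts) / trials
variance = sum((c - mean) ** 2 for c in counts) / trials
print(mean, variance)  # both ~ 1
```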

19. Let (X, Y) be uniformly distributed over the triangle with vertices (−1, 0), (1, 0), (0, 1). Compute Cov(X, Y). Are X and Y positively correlated, negatively correlated or uncorrelated? Are X and Y independent?

Solution. Because the triangle is symmetric about the y-axis, E[X] = 0. The joint density function is f(x, y) = 1 for (x, y) in the triangle and 0 otherwise (because the triangle has area 1). So the covariance is

    Cov(X, Y) = E[XY] − E[X]E[Y] = ∫_0^1 ∫_{y−1}^{1−y} xy dx dy.

Because (X, Y) and (−X, Y) have the same distribution, we must have Cov(X, Y) = Cov(−X, Y) = −Cov(X, Y). But this implies Cov(X, Y) = 0. So we didn't actually have to compute it. X and Y are uncorrelated (because Cov(X, Y) = 0) but not independent.

20. Let (X, Y) be uniformly distributed over the triangle with vertices (0, 0), (1, 0), (0, 1). Compute Cov(X, Y). Are X and Y positively correlated, negatively correlated or uncorrelated? Are X and Y independent?

Solution. The area of the triangle is 1/2, so the density function = 2 on the triangle and 0 outside of it. So

    Cov(X, Y) = E[XY] − E[X]E[Y]
              = ∫_0^1 ∫_0^{1−y} 2xy dx dy − ( ∫_0^1 ∫_0^{1−y} 2x dx dy ) ( ∫_0^1 ∫_0^{1−y} 2y dx dy ).

X and Y are negatively correlated (because Cov(X, Y) < 0) and so are dependent.

21. Let (X, Y) be uniformly distributed over the triangle with vertices (0, 0), (1, 0), (0, 2).

(a) Compute Cov(X, Y). (You can leave your answer in terms of integrals without evaluating.)
(b) Are X and Y positively correlated, negatively correlated or uncorrelated?

Solution. The area of the triangle is 1, so the density function = 1 on the triangle and 0 outside of it. The triangle is bounded by the x-axis, the y-axis and the line y = 2 − 2x. Solving for x we have x = 1 − y/2. So

    Cov(X, Y) = E[XY] − E[X]E[Y]
              = ∫_0^2 ∫_0^{1−y/2} xy dx dy − ( ∫_0^2 ∫_0^{1−y/2} x dx dy ) ( ∫_0^2 ∫_0^{1−y/2} y dx dy ).

X and Y are negatively correlated (because Cov(X, Y) < 0) and so are dependent. The density of X is

    f_X(x) = ∫_0^{2−2x} dy = 2 − 2x   for 0 ≤ x ≤ 1,

and the density of Y is

    f_Y(y) = ∫_0^{1−y/2} dx = 1 − y/2   for 0 ≤ y ≤ 2.

So you could also write

    Cov(X, Y) = ∫_0^2 ∫_0^{1−y/2} xy dx dy − ( ∫_0^1 x(2 − 2x) dx ) ( ∫_0^2 y(1 − y/2) dy ).

Note

    ∫_0^1 x(2 − 2x) dx = x² − 2x³/3 ]_0^1 = 1 − 2/3 = 1/3,
    ∫_0^2 y(1 − y/2) dy = y²/2 − y³/6 ]_0^2 = 2 − 8/6 = 2/3.
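For problem 21, evaluating the integrals gives E[XY] = 1/6, E[X] = 1/3 and E[Y] = 2/3, so Cov(X, Y) = 1/6 − 2/9 = −1/18; that evaluation is mine, not in the original, and the sketch below checks it by rejection sampling.

```python
import random

# Monte Carlo check of problem 21: sample (X, Y) uniformly from the triangle with
# vertices (0,0), (1,0), (0,2) by rejection, then estimate Cov(X, Y) ~ -1/18.
random.seed(0)
points = []
while len(points) < 200_000:
    x, y = random.uniform(0, 1), random.uniform(0, 2)
    if y <= 2 - 2 * x:             # accept points under the line y = 2 - 2x
        points.append((x, y))
n = len(points)
ex = sum(x for x, _ in points) / n
ey = sum(y for _, y in points) / n
exy = sum(x * y for x, y in points) / n
print(exy - ex * ey)  # ~ -0.0556
```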

22. Roll two dice. Let X be the number on the first die and Y be the number on the second die.

(a) Compute Cov(X + Y, X − Y).
(b) Are X + Y and X − Y independent? Explain!

Solution. Because of bilinearity,

    Cov(X + Y, X − Y) = Cov(X, X) − Cov(X, Y) + Cov(Y, X) − Cov(Y, Y).

Because X and Y are independent, Cov(X, Y) = Cov(Y, X) = 0. Because X and Y are identically distributed, Cov(X, X) = Cov(Y, Y). So Cov(X + Y, X − Y) = 0.

Are they independent? Well, suppose X + Y = 12. Then we know X = Y = 6 and therefore X − Y = 0. This is nontrivial information. So they cannot be independent. To be more rigorous, P(X + Y = 12, X − Y = 0) = 1/36. However, P(X + Y = 12) P(X − Y = 0) = (1/36)(1/6) = 1/216. So they are not independent.

23. Suppose an urn contains 3 red balls and 7 blue balls. You pick 2 balls at random without replacement. Let X be the number of red balls chosen. Compute the variance of X.

24. Suppose an urn contains 30 red balls, 40 blue balls and 50 yellow balls (for a total of 120). You pick 12 balls at random. Let X be the number of red balls chosen. For the questions below, compute the answer exactly without any long summations.

(a) Assume that the balls are chosen with replacement. Find E[X].
(b) Assume that the balls are chosen without replacement. Find E[X].

Solution. X = X_1 + ⋯ + X_12 where X_i = 1 if the i-th ball drawn is red and X_i = 0 otherwise. Since E[X_i] = 30/120 = 1/4, E[X] = 12/4 = 3. This answer does not depend on whether or not the balls are chosen with replacement.

Alternative Solution. X = X_1 + ⋯ + X_30 where X_i = 1 if the i-th red ball is selected at some time and X_i = 0 otherwise. Then

    E[X_i] = (119 choose 11) / (120 choose 12) = 12/120 = 1/10.

So E[X] = 30/10 = 3. Again, the answer does not depend on whether or not the balls are chosen with replacement.

25. (Balls and Boxes) Suppose there are n balls and n boxes. Each ball is placed in a box at random. Each box can contain an unlimited number of balls. Moreover the function assigning balls to boxes is equally likely to be any of the n^n functions. Find the expected value and the variance of the number of empty boxes.
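Problem 25 is stated without a solution. Writing the number of empty boxes as a sum of indicators gives E[# empty] = n(1 − 1/n)^n, since each box is empty with probability (1 − 1/n)^n; the variance can be checked the same way. The sketch below simulates both (it is my check, not the original's answer).

```python
import random

# Simulation for problem 25: drop n balls into n boxes uniformly at random and
# count empty boxes. The indicator argument gives mean n*(1 - 1/n)**n.
random.seed(0)
n, trials = 10, 100_000
empties = []
for _ in range(trials):
    occupied = set(random.randrange(n) for _ in range(n))
    empties.append(n - len(occupied))
mean = sum(empties) / trials
variance = sum((e - mean) ** 2 for e in empties) / trials
print(mean, n * (1 - 1 / n) ** n, variance)  # simulated mean ~ theoretical mean
```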

7 Chebyshev/Markov

26. If a post office handles, on average, 10,000 letters a day, then provide an upper bound on the probability that they will handle more than 20,000 letters tomorrow. Use Markov's inequality.

Answer. P(X ≥ 20,000) ≤ E[X]/20,000 = 1/2.

27. If the number of letters a post office handles in any given day is a random variable with mean 10,000 and variance 2,000, then use Chebyshev's inequality to provide an upper bound on the probability that they will handle more than 20,000 letters tomorrow.

Answer. Because X ≥ 0 always,

    P(X ≥ 20,000) ≤ P(|X − 10,000| ≥ 10,000) ≤ 2,000/(10,000)² = 1/50,000 = 0.00002.

Notice how much smaller 0.00002 is than 1/2. So this estimate is much better than the previous one. The moral of the story is that it helps to know the variance is of modest size! (We would not have achieved a better bound if the variance was 500,000,000.)
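The two bounds in problems 26 and 27, computed side by side (a sketch using the figures reconstructed above: mean 10,000 letters and variance 2,000):

```python
# Problems 26-27: Markov versus Chebyshev bounds for P(X >= 20,000).
mean, variance, threshold = 10_000, 2_000, 20_000

markov = mean / threshold                          # E[X]/t
chebyshev = variance / (threshold - mean) ** 2     # Var(X)/(t - mean)^2
print(markov, chebyshev)                           # 0.5 versus 2e-05
```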

28. Suppose X is a non-negative random variable and P(X ≥ 100) = 1/2. What is the smallest possible value of E[X]?

Solution. Since 1/2 = P(X ≥ 100) ≤ E[X]/100, we have 50 ≤ E[X]. The smallest possible value is 50. This occurs when X = 100 with probability 1/2 and X = 0 with probability 1/2.

8 Normal random variables

29. Suppose that X_1, X_2, ... are i.i.d. normal random variables with mean 0, variance 1 and S_n = Σ_{i=1}^n X_i. Determine the smallest value of n so that P(|S_n/n| ≤ 1) ≥ 0.95. Hint: if χ is the standard normal then P(|χ| ≤ 2) = 0.95.

Solution. The standard deviation of S_n is √n. So S_n/√n is the standard normal. So

    P(|S_n/n| ≤ 1) = P(|S_n/√n| ≤ √n) = P(|χ| ≤ √n).

So n = 4.

30. You and your friend agree to meet at noon. Your arrival time in minutes after noon is normally distributed with mean 0 and standard deviation 3. Your friend's arrival time in minutes after noon is normally distributed with mean 1 and standard deviation 4. Assume that the two arrival times are independent. What is the probability that you will have to wait at least 4 minutes?

Solution. Let X be your arrival time and Y your friend's arrival time. The question asks for P(Y − X ≥ 4). Note Y − X is normally distributed with mean 1 and variance 3² + 4² = 5². So

    P(Y − X ≥ 4) = P( (Y − X − 1)/5 ≥ 3/5 ) = 1 − Φ(3/5) = Φ(−3/5).

31. The distributions of the grades of the students of probability and calculus at a certain university are normally distributed with parameters µ = 65, σ² = 418 and µ = 72, σ² = 448 respectively. Dr. Olwell teaches a probability section with 22 students and a calculus section with 28 students. What is the probability that the difference between the averages of the final grades of these two classes is at least 2? You may assume independence. Write your answer in terms of the function

    Φ(t) = (1/√(2π)) ∫_{−∞}^t e^{−x²/2} dx.

Solution. Let X̄ and Ȳ be the averages of the final grades in the probability and calculus classes respectively. The question asks for P(|X̄ − Ȳ| ≥ 2). I'm assuming that X̄ = (X_1 + ⋯ + X_22)/22 where each X_i is a normal random variable with mean 65 and variance 418 (and similarly for Ȳ). Therefore X̄ is a normal random variable with mean 65 and variance 418/22 (to see this observe that Var(X_1 + ⋯ + X_22) = 22 · 418 because these random variables are independent; next use the fact that Var(tZ) = t² Var(Z) for any constant t and any random variable Z). Similarly, Ȳ is a normal random variable with mean 72 and variance 448/28. So X̄ − Ȳ is a normal random variable with µ = 65 − 72 = −7 and variance σ² = 418/22 + 448/28 = 19 + 16 = 35. Now

    P(|X̄ − Ȳ| ≥ 2) = P(X̄ − Ȳ ≥ 2) + P(X̄ − Ȳ ≤ −2)
                    = P( (X̄ − Ȳ + 7)/√35 ≥ 9/√35 ) + P( (X̄ − Ȳ + 7)/√35 ≤ 5/√35 )
                    = 1 − Φ(9/√35) + Φ(5/√35).

32. A weighted coin lands on heads with probability 0.8. Suppose it is flipped 100 times and let X denote the number of times it lands on heads.

(a) What is E[X]?
(b) What is Var(X)?
(c) Use normal random variables to approximate P(X ≥ 90). Leave your answer in terms of the Φ-function.

Answer. X is a binomial random variable, E[X] = np = 80 and SD(X) = √(np(1 − p)) = √((100)(.8)(.2)) = 4. So

    P(X ≥ 90) = P(X ≥ 89.5) = 1 − P(X ≤ 89.5) ≈ 1 − Φ( (89.5 − 80)/4 ) = 1 − Φ(2.375).
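For problem 32(c), the normal approximation with the continuity correction can be compared against the exact binomial tail; a small sketch (the 89.5 cut-off mirrors the correction used above):

```python
import math

# Problem 32(c): P(X >= 90) for X ~ Binomial(100, 0.8), normal approximation
# with continuity correction versus the exact binomial sum.
def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

n, p = 100, 0.8
mu, sd = n * p, math.sqrt(n * p * (1 - p))
approx = 1 - phi((89.5 - mu) / sd)
exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(90, n + 1))
print(approx, exact)  # ~ 0.0088 versus ~ 0.0057
```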

33. … numbers are selected independently and uniformly from the interval [0, 1]. Use the central limit theorem to approximate the probability that the average of these numbers is within 0.1 of 1/2. Hint: the standard deviation of a random variable X that is uniformly distributed in [0, 1] is 1/√12. Leave your answer in terms of the Φ-function.

34. Toss a fair coin twice. You win $1 if at least one of the two tosses comes out heads. You neither win nor lose anything otherwise.

(a) Assume that you play this game 300 times. What is, approximately, the probability that you win at least $250? (Use the normal approximation.)

Solution. Let S = X_1 + ⋯ + X_300 where X_i = 1 if you win a dollar on the i-th game and X_i = 0 otherwise. Then E[X_i] = 3/4 and E[S] = 300 · 3/4 = 225. Note SD(X_i) = √(3/4 · 1/4) = √3/4. So SD(S) = √300 · √3/4 = 30/4 = 7.5. So

    P(S ≥ 250) = P(S ≥ 249.5) = P( (S − 225)/7.5 ≥ 24.5/7.5 ) ≈ P( χ ≥ 24.5/7.5 ) = 1 − Φ(24.5/7.5).

(b) Approximately how many times do you need to play so that you win at least $250 with probability at least 0.99?

Solution. Let n be the number of times you play. So S = X_1 + ⋯ + X_n and E[S] = 3n/4, SD(S) = √(3n)/4. Then

    P(S ≥ 250) = P(S ≥ 249.5) = P( (S − 3n/4)/(√(3n)/4) ≥ (249.5 − 3n/4)/(√(3n)/4) )
               ≈ P( χ ≥ (249.5 − 3n/4)/(√(3n)/4) ) = 1 − Φ( (249.5 − 3n/4)/(√(3n)/4) ).

So to solve this problem, find a number x so that 1 − Φ(x) = 0.99. Then solve for n in the equation (249.5 − 3n/4)/(√(3n)/4) = x.

9 Normal approximations / central limit theorem

35. Twelve numbers are selected independently and uniformly from the interval [−1/2, 1/2]. Use the normal approximation to approximate the probability that the average of these numbers lies in the interval (−0.1, 0.1). You can leave your answer in terms of the Φ-function.

Solution. If S denotes the sum of the numbers then Var(S) = 12/12 = 1, so the standard deviation of S is √1 = 1. Then

    P(−0.1 ≤ S/12 ≤ 0.1) = P(−1.2 ≤ S ≤ 1.2) ≈ P(|χ| ≤ 1.2) = 2Φ(1.2) − 1.
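A simulation sketch of problem 35, comparing the empirical frequency with the CLT value 2Φ(1.2) − 1 ≈ 0.77 (the count of 12 numbers is the reconstruction used above):

```python
import math
import random

# Problem 35: average of 12 uniform(-1/2, 1/2) numbers falling in (-0.1, 0.1),
# simulation versus the CLT approximation 2*Phi(1.2) - 1.
def phi(x):
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

random.seed(0)
trials = 200_000
hits = 0
for _ in range(trials):
    avg = sum(random.uniform(-0.5, 0.5) for _ in range(12)) / 12
    hits += -0.1 < avg < 0.1
print(hits / trials, 2 * phi(1.2) - 1)  # both ~ 0.77
```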

36. For a survey, we contact N people at random and ask them whether they plan to vote for candidate A. How large should N be so that there is at least a 99% chance that the sample mean determined by the survey is within 1% of the true mean? Use the normal approximation.

Solution. If p̂ is the percentage of people in our survey who plan to vote for candidate A and σ̂ is the standard deviation of p̂, then there is a 99% chance that p ∈ [p̂ − 3σ̂, p̂ + 3σ̂]. Now

    σ̂ = √( p(1 − p)/N ) ≤ 1/(2√N).

So we want to choose N so that 3/(2√N) ≤ 0.01. In other words,

    N ≥ ( 3/(2 · 0.01) )² = 150² = 22,500.

37. Suppose we contact 10,000 people at random and ask them whether they plan to vote for candidate A. Suppose that 60% of them say yes. Give a 95% confidence interval for the true percentage of people that plan to vote for A.

Solution. If p̂ is the sample mean then the standard deviation of p̂ is

    √( p(1 − p)/10,000 ) = √( p(1 − p) )/100.

We don't know what the true percentage p is. However, we can either substitute 0.6 for p above or use the fact that p(1 − p) ≤ 1/4 for any p. In the first case we end up with

    [0.6 − 2σ, 0.6 + 2σ] ≈ [0.590, 0.610].

With the second method we end up with [0.59, 0.61] for our 95% confidence interval.
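The two intervals in problem 37 can be reproduced in a couple of lines (10,000 respondents and p̂ = 0.6 are the figures reconstructed above):

```python
import math

# Problem 37: 95% confidence intervals for the true proportion, two ways.
n, p_hat = 10_000, 0.6

sigma_plugin = math.sqrt(p_hat * (1 - p_hat) / n)  # substitute p-hat for p
sigma_worst = math.sqrt(0.25 / n)                  # use p(1-p) <= 1/4

print(p_hat - 2 * sigma_plugin, p_hat + 2 * sigma_plugin)  # ~ 0.590, 0.610
print(p_hat - 2 * sigma_worst, p_hat + 2 * sigma_worst)    # 0.59, 0.61
```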

38. A fair die is rolled 100 times. Use the central limit theorem to approximate the probability that the sum of the outcomes is more than 400. Leave your answer in terms of the Φ-function. You do not have to simplify. Hint: if X is the value of a single die roll then E[X] = 3.5 = 7/2 and Var(X) = 35/12.

Solution. Let X_i be the result of the i-th roll and S = X_1 + ⋯ + X_100. So E[S] = 100(3.5) = 350 and Var(S) = 100 · 35/12 = 3500/12. So

    P(S ≥ 400) = P(S − 350 ≥ 50) = P( (S − 350)/√(3500/12) ≥ 50/√(3500/12) ) ≈ 1 − Φ( 50/√(3500/12) ).

39. An astronomer is trying to estimate the distance from her observatory to a certain star. She makes 100 different measurements of this distance and takes the average of these 100 measurements to be her estimate. Suppose that the error in each measurement is a continuous random variable with mean 0 and variance σ² = 4 light years. Suppose also that the measurements are independent. Use the Central Limit Theorem to estimate the probability that her estimate of the distance is less than the true distance by at least 1 light-year. Leave your answer in terms of the Φ-function. Hint: if X_i is the error in the i-th measurement, then the difference between her estimate and the true distance is the mean Y = (X_1 + ⋯ + X_100)/100. The question asks for P(Y ≤ −1) (which is the same thing as P(−Y ≥ 1)).

Solution. Let X_i be the error in the i-th measurement. This is the measured distance minus the true distance. So E[X_i] = 0 and Var(X_i) = 4. Let S = X_1 + ⋯ + X_100. So E[S] = 0, Var(S) = 400 and SD(S) = 20. This random variable is continuous, so there is no need to make a histogram correction. Her estimate of the distance is S/100 plus the true distance. So we are interested in P(S/100 ≤ −1) (this is the probability that her estimate of the distance is less than the true distance by at least 1 light-year):

    P(S/100 ≤ −1) = P(S/20 ≤ −5) ≈ P(χ ≤ −5) = Φ(−5) = 1 − Φ(5).

40. Suppose that 40% of all voters in this county are Republicans. If 100 are selected at random, what is the probability that over 50 Republicans are selected? Use normal random variables to obtain an approximation. Leave your answer in terms of the Φ-function.

Solution. The question asks for P(S_100 > 50) where S_100 is a Binomial random variable with n = 100, p = 0.4. Using the histogram correction, this is the same as

    P(S_100 ≥ 50.5) = P( (S_100 − 40)/√(100(.4)(.6)) ≥ 10.5/√(100(.4)(.6)) ).

By the Central Limit Theorem, this is approximately equal to

    P( χ ≥ 10.5/√24 ) = 1 − Φ( 10.5/√24 ).

To get a rough estimate, note that √24 ≈ 5, so this is about 1 − Φ(2.1).

41. A fair die is rolled 20 times. Use the central limit theorem to approximate the probability that the sum of the outcomes is between 65 and 75. Leave your answer in terms of the Φ-function. Hint: if X denotes the value of a single die roll then E[X] = 7/2 and Var(X) = 35/12.

Solution. Let X_i be the result of the i-th roll and S = X_1 + ⋯ + X_20. So E[S] = 20 · 7/2 = 70 and Var(S) = 20 · 35/12 = 175/3. So

    P(65 ≤ S ≤ 75) = P(64.5 ≤ S ≤ 75.5) = P( |S − 70| ≤ 5.5 )
                   = P( |S − 70|/SD(S) ≤ 5.5/SD(S) )
                   ≈ P( |χ| ≤ 5.5/√(175/3) ) = 2Φ( 5.5/√(175/3) ) − 1.

To get a rough estimate, 175/3 ≈ 58, √58 ≈ 7.5, and 5.5/7.5 = 11/15 ≈ 0.73.
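A simulation sketch for problem 41, comparing the empirical probability with the continuity-corrected CLT value 2Φ(5.5/√(175/3)) − 1 ≈ 0.53:

```python
import math
import random

# Problem 41: P(65 <= S <= 75) for the sum S of 20 fair die rolls.
def phi(x):
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

random.seed(0)
trials = 200_000
hits = sum(
    65 <= sum(random.randint(1, 6) for _ in range(20)) <= 75 for _ in range(trials)
)
clt = 2 * phi(5.5 / math.sqrt(175 / 3)) - 1
print(hits / trials, clt)  # both ~ 0.53
```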

Some helpful formulae (this will be on the exam; no need to memorize it):

    Type of Random Variable   Density or Mass Function                              Expected Value   Variance
    Binomial(n, p)            P(X = k) = (n choose k) p^k (1 − p)^{n−k}, 0 ≤ k ≤ n   np               np(1 − p)
    Poisson(λ)                P(X = k) = e^{−λ} λ^k / k!, k ∈ N                      λ                λ
    Geometric(p)              P(X = k) = (1 − p)^{k−1} p, k ∈ N                      1/p              (1 − p)/p²
    Exponential(λ)            f(x) = λ e^{−λx}, x ≥ 0                                1/λ              1/λ²
    Normal(µ, σ)              f(x) = (1/√(2πσ²)) e^{−(x−µ)²/(2σ²)}                   µ                σ²
    Uniform over [a, b]       f(x) = 1/(b − a), x ∈ [a, b]                           (a + b)/2        (b − a)²/12

Var(X) = E[(X − µ)²] = E[X²] − E[X]².

Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y].

(Markov's inequality) If P(X ≥ 0) = 1, then P(X ≥ t) ≤ E[X]/t.

(Chebyshev's inequality) P(|X − µ| ≥ t) ≤ Var(X)/t².
