Math 151. Rumbos Spring Solutions to Review Problems for Exam 3
1. Suppose that a book with $n$ pages contains on average $\lambda$ misprints per page. What is the probability that there will be at least $m$ pages which contain more than $k$ misprints?

Solution: Let $Y$ denote the number of misprints on one page. Then we may assume that $Y$ follows a Poisson$(\lambda)$ distribution, so that
$$\Pr[Y = r] = \frac{\lambda^r}{r!}\, e^{-\lambda}, \quad \text{for } r = 0, 1, 2, \ldots$$
Thus, the probability that there will be more than $k$ misprints on a given page is $p = \Pr(Y > k) = 1 - \Pr(Y \le k)$, so that
$$p = 1 - \sum_{r=0}^{k} \frac{\lambda^r}{r!}\, e^{-\lambda}. \tag{1}$$
Next, let $X$ denote the number of pages out of the $n$ that contain more than $k$ misprints. Then $X \sim \text{Binomial}(n, p)$, where $p$ is as given in (1). Hence the probability that there will be at least $m$ pages which contain more than $k$ misprints is
$$\Pr[X \ge m] = \sum_{l=m}^{n} \binom{n}{l} p^l (1-p)^{n-l}, \quad \text{where } p = 1 - \sum_{r=0}^{k} \frac{\lambda^r}{r!}\, e^{-\lambda}.$$

2. Suppose that the total number of items produced by a certain machine has a Poisson distribution with mean $\lambda$, all items are produced independently of one another, and the probability that any given item produced by the machine will be defective is $p$. Let $X$ denote the number of defective items produced by the machine.
(a) Determine the marginal distribution of the number of defective items, $X$.

Solution: Let $N$ denote the number of items produced by the machine. Then
$$N \sim \text{Poisson}(\lambda), \tag{2}$$
so that
$$\Pr[N = n] = \frac{\lambda^n}{n!}\, e^{-\lambda}, \quad \text{for } n = 0, 1, 2, \ldots$$
Now, since all items are produced independently of one another, and the probability that any given item produced by the machine will be defective is $p$, the conditional distribution of $X$ given $N = n$ is Binomial$(n, p)$; thus,
$$\Pr[X = k \mid N = n] = \begin{cases} \dbinom{n}{k} p^k (1-p)^{n-k}, & \text{for } k = 0, 1, \ldots, n; \\[1ex] 0, & \text{elsewhere.} \end{cases} \tag{3}$$
Then,
$$\Pr[X = k] = \sum_{n=0}^{\infty} \Pr[X = k, N = n] = \sum_{n=0}^{\infty} \Pr[N = n]\, \Pr[X = k \mid N = n],$$
where $\Pr[X = k \mid N = n] = 0$ for $n < k$, so that, using (2) and (3),
$$\Pr[X = k] = \sum_{n=k}^{\infty} \frac{\lambda^n}{n!}\, e^{-\lambda} \binom{n}{k} p^k (1-p)^{n-k} = \frac{e^{-\lambda}}{k!}\, p^k \sum_{n=k}^{\infty} \frac{\lambda^n}{(n-k)!}\, (1-p)^{n-k}. \tag{4}$$
Next, make the change of variables $l = n - k$ in the last summation in (4) to get
$$\Pr[X = k] = \frac{e^{-\lambda}}{k!}\, p^k \sum_{l=0}^{\infty} \frac{\lambda^{l+k}}{l!}\, (1-p)^l,$$
so that
$$\Pr[X = k] = \frac{(\lambda p)^k}{k!}\, e^{-\lambda} \sum_{l=0}^{\infty} \frac{[\lambda(1-p)]^l}{l!} = \frac{(\lambda p)^k}{k!}\, e^{-\lambda}\, e^{\lambda(1-p)} = \frac{(\lambda p)^k}{k!}\, e^{-\lambda p},$$
which shows that
$$X \sim \text{Poisson}(\lambda p). \tag{5}$$

(b) Let $Y$ denote the number of non-defective items produced by the machine. Show that $X$ and $Y$ are independent random variables.

Solution: Calculations similar to those leading to (5) show that
$$Y \sim \text{Poisson}(\lambda(1-p)), \tag{6}$$
since the probability of an item coming out non-defective is $1 - p$. Next, observe that $Y = N - X$ and compute the joint probability
$$\Pr[X = k, Y = l] = \Pr[X = k, N = k + l] = \Pr[N = k + l]\, \Pr[X = k \mid N = k + l] = \frac{\lambda^{k+l}}{(k+l)!}\, e^{-\lambda} \binom{k+l}{k} p^k (1-p)^l,$$
by virtue of (2) and (3). Thus,
$$\Pr[X = k, Y = l] = \frac{\lambda^{k+l}}{k!\, l!}\, e^{-\lambda}\, p^k (1-p)^l,$$
where
$$e^{-\lambda} = e^{-[p + (1-p)]\lambda} = e^{-p\lambda}\, e^{-(1-p)\lambda}.$$
Thus,
$$\Pr[X = k, Y = l] = \frac{(p\lambda)^k}{k!}\, e^{-p\lambda} \cdot \frac{[(1-p)\lambda]^l}{l!}\, e^{-(1-p)\lambda} = p_X(k)\, p_Y(l),$$
in view of (5) and (6). Hence, $X$ and $Y$ are independent.

3. Suppose that the proportion of color blind people in a certain population is 0.005. Estimate the probability that there will be more than one color blind person in a random sample of 600 people from that population.

Solution: Set $p = 0.005$ and $n = 600$. Denote by $Y$ the number of color blind people in the sample. Then we may assume that $Y \sim \text{Binomial}(n, p)$. Since $p$ is small and $n$ is large, we may use the Poisson approximation to the binomial distribution to get
$$\Pr[Y = k] \approx \frac{\lambda^k}{k!}\, e^{-\lambda},$$
where $\lambda = np = 3$. Then,
$$\Pr[Y > 1] = 1 - \Pr[Y \le 1] \approx 1 - e^{-3} - 3e^{-3} \approx 0.8009.$$
Thus, the probability that there will be more than one color blind person in a random sample of 600 people from that population is about 80%.

4. An airline sells 200 tickets for a certain flight on an airplane that has 198 seats because, on average, 1% of purchasers of airline tickets do not appear for the departure of their flight. Estimate the probability that everyone who appears for the departure of this flight will have a seat.
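The Poisson approximation used in Problem 3 can be compared against the exact binomial probability; a quick numeric check (a sketch, using only Python's standard library):

```python
import math

n, p = 600, 0.005          # sample size and proportion of color blind people
lam = n * p                # Poisson parameter, lambda = n p = 3

# Exact binomial tail: Pr[Y > 1] = 1 - Pr[Y = 0] - Pr[Y = 1]
exact = 1 - (1 - p)**n - n * p * (1 - p)**(n - 1)

# Poisson approximation: 1 - e^{-3} - 3 e^{-3}
approx = 1 - math.exp(-lam) - lam * math.exp(-lam)

print(round(exact, 4), round(approx, 4))
```

The two values agree to about three decimal places, which is why the Poisson approximation is adequate here.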
Solution: Set $p = 0.01$, $n = 200$, and let $Y$ denote the number of ticket purchasers that do not appear for departure. Then we may assume that $Y \sim \text{Binomial}(n, p)$. We want to estimate the probability $\Pr[Y \ge 2]$. Using the Poisson$(\lambda)$ approximation, with $\lambda = np = 2$, to the distribution of $Y$, we get
$$\Pr[Y \ge 2] = 1 - \Pr[Y \le 1] \approx 1 - e^{-2} - 2e^{-2} \approx 0.5940.$$
Thus, the probability that everyone who appears for the departure of this flight will have a seat is about 59.4%.

5. Let $X$ denote a positive random variable such that $\ln(X)$ has a Normal$(0, 1)$ distribution.

(a) Give the pdf of $X$ and compute its expectation.

Solution: Set $Z = \ln(X)$, so that $Z \sim \text{Normal}(0, 1)$; thus,
$$f_Z(z) = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}, \quad \text{for } z \in \mathbb{R}. \tag{7}$$
Next, compute the cdf of $X$, $F_X(x) = \Pr(X \le x)$, for $x > 0$, to get $F_X(x) = \Pr[\ln(X) \le \ln(x)] = \Pr[Z \le \ln(x)] = F_Z(\ln(x))$, so that
$$F_X(x) = \begin{cases} F_Z(\ln(x)), & \text{for } x > 0; \\ 0, & \text{for } x \le 0. \end{cases} \tag{8}$$
Differentiating (8) with respect to $x$, for $x > 0$, we obtain
$$f_X(x) = F_Z'(\ln(x)) \cdot \frac{1}{x},$$
so that
$$f_X(x) = f_Z(\ln(x)) \cdot \frac{1}{x}, \tag{9}$$
where we have used the Chain Rule. Combining (7) and (9) yields
$$f_X(x) = \begin{cases} \dfrac{1}{\sqrt{2\pi}\, x}\, e^{-(\ln x)^2/2}, & \text{for } x > 0; \\[1ex] 0, & \text{for } x \le 0. \end{cases} \tag{10}$$
In order to compute the expected value of $X$, use the pdf in (10) to get
$$E(X) = \int_0^{\infty} x f_X(x)\, dx = \int_0^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-(\ln x)^2/2}\, dx. \tag{11}$$
Make the change of variables $u = \ln x$ in the last integral in (11); then $du = \frac{1}{x}\, dx$, so that $dx = e^u\, du$ and
$$E(X) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{u - u^2/2}\, du. \tag{12}$$
Complete the square in the exponent of the integrand in (12) to obtain
$$E(X) = e^{1/2} \cdot \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-(u-1)^2/2}\, du. \tag{13}$$
Next, make the change of variables $w = u - 1$ in the integral in (13) to get
$$E(X) = e^{1/2} \cdot \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-w^2/2}\, dw = \sqrt{e}.$$

(b) Estimate $\Pr(X \le 6.5)$.

Solution: Use the result in (8) to compute
$$\Pr(X \le 6.5) = F_Z(\ln(6.5)) = F_Z(1.8718) \approx 0.9693,$$
where $Z \sim \text{Normal}(0, 1)$, or about 97%.
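Both parts of Problem 5 can be checked numerically. The sketch below (standard library only) approximates the integral in (12) with Simpson's rule and evaluates $F_Z(\ln 6.5)$ via `math.erf`:

```python
import math

def simpson(f, a, b, m):
    """Composite Simpson's rule with m (even) subintervals."""
    h = (b - a) / m
    s = f(a) + f(b)
    for i in range(1, m):
        s += f(a + i * h) * (4 if i % 2 else 2)
    return s * h / 3

# E(X) = (1/sqrt(2 pi)) * integral of exp(u - u^2/2) du, as in (12);
# the integrand is negligible outside [-10, 12].
EX = simpson(lambda u: math.exp(u - u * u / 2) / math.sqrt(2 * math.pi),
             -10.0, 12.0, 2000)

# Pr(X <= 6.5) = F_Z(ln 6.5), with the standard normal cdf written via erf
Phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
p = Phi(math.log(6.5))

print(EX, math.sqrt(math.e), p)
```

The numeric integral reproduces $E(X) = \sqrt{e} \approx 1.6487$, and `p` lands near the tabulated value of about 0.97.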
6. Forty-seven digits are chosen at random and with replacement from $\{0, 1, 2, \ldots, 9\}$. Estimate the probability that their average lies between 4 and 6.

Solution: Let $X_1, X_2, \ldots, X_n$, where $n = 47$, denote the 47 digits. Since the sampling is done with replacement, the random variables $X_1, X_2, \ldots, X_n$ are independent and identically uniformly distributed over the digits $\{0, 1, 2, \ldots, 9\}$, with pmf given by
$$p_X(k) = \begin{cases} \dfrac{1}{10}, & \text{for } k = 0, 1, 2, \ldots, 9; \\[1ex] 0, & \text{elsewhere.} \end{cases} \tag{14}$$
Consequently, the mean of the distribution is
$$\mu = \sum_{k=0}^{9} k\, p_X(k) = \frac{1}{10} \sum_{k=1}^{9} k = \frac{45}{10} = 4.5. \tag{15}$$
Before we compute the variance, we first compute the second moment of $X$:
$$E(X^2) = \sum_{k=0}^{9} k^2 p_X(k) = \frac{1}{10} \sum_{k=1}^{9} k^2 = \frac{1}{10} \cdot \frac{9\,(9+1)(2 \cdot 9 + 1)}{6} = \frac{3 \cdot 19}{2} = 28.5.$$
Thus, the variance of $X$ is
$$\sigma^2 = E(X^2) - \mu^2 = 28.5 - 20.25 = \frac{33}{4};$$
so that
$$\sigma^2 = \frac{33}{4} = 8.25. \tag{16}$$
We would like to estimate $\Pr(4 \le \overline{X}_n \le 6)$, or $\Pr(4 - \mu \le \overline{X}_n - \mu \le 6 - \mu)$, where $\mu$ is given in (15), so that
$$\Pr(4 \le \overline{X}_n \le 6) = \Pr(-0.5 \le \overline{X}_n - \mu \le 1.5). \tag{17}$$
Next, divide the last inequality in (17) by $\sigma/\sqrt{n}$, where $\sigma$ is as given in (16), to get
$$\Pr(4 \le \overline{X}_n \le 6) = \Pr\!\left(-1.19 \le \frac{\overline{X}_n - \mu}{\sigma/\sqrt{n}} \le 3.58\right). \tag{18}$$
Since $n = 47$ can be considered a large sample size, we can apply the Central Limit Theorem to obtain from (18) that
$$\Pr(4 \le \overline{X}_n \le 6) \approx \Pr(-1.19 \le Z \le 3.58), \quad \text{where } Z \sim \text{Normal}(0, 1). \tag{19}$$
It follows from (19) and the definition of the cdf that
$$\Pr(4 \le \overline{X}_n \le 6) \approx F_Z(3.58) - F_Z(-1.19), \tag{20}$$
where $F_Z$ is the cdf of $Z \sim \text{Normal}(0, 1)$. Using the symmetry of the pdf of $Z \sim \text{Normal}(0, 1)$, we can rewrite (20) as
$$\Pr(4 \le \overline{X}_n \le 6) \approx F_Z(1.19) + F_Z(3.58) - 1. \tag{21}$$
Finally, using a table of standard normal probabilities, we obtain from (21) that
$$\Pr(4 \le \overline{X}_n \le 6) \approx 0.883.$$
Thus, the probability that the average of the 47 digits is between 4 and 6 is about 88.3%.
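Because each digit takes only the values 0 through 9, the distribution of the sum in Problem 6 can also be computed exactly by repeated convolution of the pmf in (14), which gives a check on the CLT estimate (a sketch, standard library only):

```python
from fractions import Fraction

n = 47  # number of digits, each uniform on 0..9

# dist[s] = Pr(X_1 + ... + X_i = s), built up one digit at a time
dist = {0: Fraction(1)}
for _ in range(n):
    new = {}
    for s, pr in dist.items():
        for d in range(10):
            new[s + d] = new.get(s + d, Fraction(0)) + pr / 10
    dist = new

# The average lies in [4, 6] exactly when the sum lies in [4n, 6n] = [188, 282]
exact = float(sum(pr for s, pr in dist.items() if 4 * n <= s <= 6 * n))
print(exact)
```

The exact value comes out near 0.89, consistent with the normal approximation of about 0.883 (the small gap reflects the continuity correction the approximation omits).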
7. Let $X_1, X_2, \ldots, X_{30}$ be independent random variables each having a discrete distribution with pmf
$$p(x) = \begin{cases} 1/4, & \text{if } x = 0 \text{ or } x = 2; \\ 1/2, & \text{if } x = 1; \\ 0, & \text{otherwise.} \end{cases}$$
Estimate the probability that $X_1 + X_2 + \cdots + X_{30}$ is at most 33.

Solution: First, compute the mean, $\mu = E(X)$, and variance, $\sigma^2 = \text{Var}(X)$, of the distribution:
$$\mu = 0 \cdot \frac{1}{4} + 1 \cdot \frac{1}{2} + 2 \cdot \frac{1}{4} = 1, \tag{22}$$
and
$$\sigma^2 = E(X^2) - [E(X)]^2, \tag{23}$$
where
$$E(X^2) = 0 \cdot \frac{1}{4} + 1 \cdot \frac{1}{2} + 4 \cdot \frac{1}{4} = 1.5; \tag{24}$$
so that, combining (22), (23) and (24),
$$\sigma^2 = 1.5 - 1 = 0.5. \tag{25}$$
Next, let $Y = \sum_{k=1}^{n} X_k$, where $n = 30$. We would like to estimate $\Pr[Y \le 33]$, or, using the continuity correction,
$$\Pr[Y \le 33.5]. \tag{26}$$
By the Central Limit Theorem,
$$\Pr\!\left(\frac{Y - n\mu}{\sqrt{n}\,\sigma} \le z\right) \approx \Pr(Z \le z), \quad \text{for } z \in \mathbb{R}, \tag{27}$$
where $Z \sim \text{Normal}(0, 1)$, $\mu = 1$, $\sigma^2 = 0.5$ and $n = 30$. Since $(33.5 - 30)/\sqrt{30 \cdot 0.5} \approx 0.90$, it follows from (27) that we can estimate the probability in (26) by
$$\Pr[Y \le 33.5] \approx \Pr(Z \le 0.90) \approx 0.8159. \tag{28}$$
Thus, according to (28), the probability that $X_1 + X_2 + \cdots + X_{30}$ is at most 33 is about 82%.
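The estimate in Problem 7 can be checked exactly: each $X_i$ has the same pmf as the number of heads in two fair coin tosses, so $Y = X_1 + \cdots + X_{30}$ is distributed as a Binomial$(60, 1/2)$ random variable. A sketch of the exact computation:

```python
from math import comb

# Y ~ Binomial(60, 1/2), so Pr[Y <= 33] is an exact finite sum
exact = sum(comb(60, k) for k in range(34)) / 2**60
print(exact)
```

The exact probability is close to 0.82, in line with the continuity-corrected CLT estimate.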
8. Roll a balanced die 36 times. Let $Y$ denote the sum of the outcomes of the 36 rolls. Estimate the probability that $108 \le Y \le 144$.

Suggestion: Since the event of interest is $(Y \in \{108, 109, \ldots, 144\})$, rewrite $\Pr(108 \le Y \le 144)$ as $\Pr(107.5 < Y \le 144.5)$.

Solution: Let $X_1, X_2, \ldots, X_n$, where $n = 36$, denote the outcomes of the 36 rolls. Since we are assuming that the die is balanced, the random variables $X_1, X_2, \ldots, X_n$ are identically uniformly distributed over the digits $\{1, 2, \ldots, 6\}$; in other words, $X_1, X_2, \ldots, X_n$ is a random sample from the discrete Uniform$(6)$ distribution. Consequently, the mean of the distribution is
$$\mu = \frac{6 + 1}{2} = 3.5, \tag{29}$$
and the variance is
$$\sigma^2 = \frac{(6+1)(6-1)}{12} = \frac{35}{12}. \tag{30}$$
We also have that $Y = \sum_{k=1}^{n} X_k$, where $n = 36$. By the Central Limit Theorem,
$$\Pr(107.5 < Y \le 144.5) \approx \Pr\!\left(\frac{107.5 - n\mu}{\sqrt{n}\,\sigma} < Z \le \frac{144.5 - n\mu}{\sqrt{n}\,\sigma}\right), \tag{31}$$
where $Z \sim \text{Normal}(0, 1)$, $n = 36$, and $\mu$ and $\sigma$ are given in (29) and (30), respectively. We then have from (31) that
$$\Pr(107.5 < Y \le 144.5) \approx \Pr(-1.81 < Z \le 1.81) = F_Z(1.81) - F_Z(-1.81) = 2F_Z(1.81) - 1 \approx 2(0.9649) - 1 = 0.9298;$$
so that the probability that $108 \le Y \le 144$ is about 93%.
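The normal approximation in Problem 8 can be compared with the exact distribution of the sum, obtained by convolving the die's pmf 36 times (a sketch, standard library only):

```python
# pmf[s] = Pr(sum of rolls so far = s); start with the empty sum
pmf = [1.0]
for _ in range(36):
    new = [0.0] * (len(pmf) + 6)
    for s, pr in enumerate(pmf):
        for face in range(1, 7):   # balanced die: each face has probability 1/6
            new[s + face] += pr / 6
    pmf = new

exact = sum(pmf[108:145])          # Pr(108 <= Y <= 144)
print(exact)
```

The exact value lands close to the CLT estimate of 0.9298, which shows how well the continuity correction works even for 36 rolls.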
9. Let $Y \sim \text{Binomial}(100, 1/2)$. Use the Central Limit Theorem to estimate the value of $\Pr(Y = 50)$.

Solution: We use the continuity correction and estimate $\Pr(49.5 < Y \le 50.5)$. By the Central Limit Theorem,
$$\Pr(49.5 < Y \le 50.5) \approx \Pr\!\left(\frac{49.5 - n\mu}{\sqrt{n}\,\sigma} < Z \le \frac{50.5 - n\mu}{\sqrt{n}\,\sigma}\right), \tag{32}$$
where $Z \sim \text{Normal}(0, 1)$, $n = 100$, $n\mu = 50$, and
$$\sigma = \sqrt{\frac{1}{2}\left(1 - \frac{1}{2}\right)} = \frac{1}{2}.$$
We then obtain from (32) that
$$\Pr(49.5 < Y \le 50.5) \approx \Pr(-0.1 < Z \le 0.1) = F_Z(0.1) - F_Z(-0.1) = 2F_Z(0.1) - 1 \approx 2(0.5398) - 1 = 0.0796.$$
Thus, $\Pr(Y = 50) \approx 0.08$, or about 8%.

10. Let $Y \sim \text{Binomial}(n, 0.55)$. Find the smallest value of $n$ such that, approximately,
$$\Pr(Y/n > 1/2) \ge 0.95. \tag{33}$$
Solution: By the Central Limit Theorem,
$$\frac{Y/n - 0.55}{\sqrt{(0.55)(1 - 0.55)/n}} \xrightarrow{D} Z \sim \text{Normal}(0, 1) \quad \text{as } n \to \infty. \tag{34}$$
Thus, according to (33) and (34), we need to find the smallest value of $n$ such that
$$\Pr\!\left(Z > \frac{0.5 - 0.55}{0.4975/\sqrt{n}}\right) \ge 0.95,$$
or
$$\Pr\!\left(Z > -\frac{\sqrt{n}}{10}\right) \ge 0.95. \tag{35}$$
The expression in (35) is equivalent to
$$1 - \Pr\!\left(Z \le -\frac{\sqrt{n}}{10}\right) \ge 0.95,$$
which can be rewritten as
$$1 - F_Z\!\left(-\frac{\sqrt{n}}{10}\right) \ge 0.95, \tag{36}$$
where $F_Z$ is the cdf of $Z \sim \text{Normal}(0, 1)$. By the symmetry of the pdf of $Z \sim \text{Normal}(0, 1)$, (36) is equivalent to
$$F_Z\!\left(\frac{\sqrt{n}}{10}\right) \ge 0.95. \tag{37}$$
The smallest value of $n$ for which (37) holds true occurs when
$$\frac{\sqrt{n}}{10} \ge z^*, \tag{38}$$
where $z^*$ is the positive real number with the property
$$F_Z(z^*) = 0.95. \tag{39}$$
The equality in (39) occurs approximately when
$$z^* = 1.645. \tag{40}$$
It follows from (38) and (40) that (33) holds approximately when
$$\frac{\sqrt{n}}{10} \ge 1.645, \quad \text{or} \quad n \ge 270.6.$$
Thus, $n = 271$ is the smallest value of $n$ such that, approximately, $\Pr(Y/n > 1/2) \ge 0.95$.
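Both answers above can be verified numerically; the sketch below computes the exact probability in Problem 9 with `math.comb` and searches for the smallest $n$ satisfying condition (37) in Problem 10, writing $F_Z$ in terms of `math.erf`:

```python
import math

# Problem 9: exact Pr(Y = 50) for Y ~ Binomial(100, 1/2)
p_exact = math.comb(100, 50) / 2**100

# Problem 10: smallest n with F_Z(sqrt(n)/10) >= 0.95, as in (37)
Phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
n = 1
while Phi(math.sqrt(n) / 10) < 0.95:
    n += 1

print(round(p_exact, 4), n)
```

The exact binomial probability agrees with the CLT estimate of 0.0796 to four decimal places, and the search confirms $n = 271$.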
11. Let $X_1, X_2, \ldots, X_n$ be a random sample from a Poisson distribution with mean $\lambda$. Thus, $Y = \sum_{i=1}^{n} X_i$ has a Poisson distribution with mean $n\lambda$. Moreover, by the Central Limit Theorem, $\overline{X} = Y/n$ has, approximately, a Normal$(\lambda, \lambda/n)$ distribution for large $n$. Show that $u(Y/n) = \sqrt{Y/n}$ is a function of $Y/n$ whose distribution is essentially free of $\lambda$.

Solution: We will show that, for large values of $n$, the distribution of
$$2\sqrt{n}\left(\sqrt{\frac{Y}{n}} - \sqrt{\lambda}\right) \tag{41}$$
is independent of $\lambda$. In fact, we will show that, for large values of $n$, the distribution of the random variables in (41) can be approximated by a Normal$(0, 1)$ distribution. First, note that, by the Law of Large Numbers,
$$\frac{Y}{n} \xrightarrow{\Pr} \lambda, \quad \text{as } n \to \infty.$$
Thus, for large values of $n$, we can approximate $u(Y/n)$ by its linear approximation around $\lambda$,
$$u(Y/n) \approx u(\lambda) + u'(\lambda)\left(\frac{Y}{n} - \lambda\right), \tag{42}$$
where
$$u'(\lambda) = \frac{1}{2\sqrt{\lambda}}. \tag{43}$$
Combining (42) and (43), we see that, for large values of $n$,
$$\sqrt{\frac{Y}{n}} - \sqrt{\lambda} \approx \frac{1}{2\sqrt{n}} \cdot \frac{\dfrac{Y}{n} - \lambda}{\sqrt{\lambda/n}}. \tag{44}$$
Since, by the Central Limit Theorem,
$$\frac{\dfrac{Y}{n} - \lambda}{\sqrt{\lambda/n}} \xrightarrow{D} Z \sim \text{Normal}(0, 1) \quad \text{as } n \to \infty, \tag{45}$$
it follows from (44) and (45) that
$$2\sqrt{n}\left(\sqrt{\frac{Y}{n}} - \sqrt{\lambda}\right) \xrightarrow{D} Z \sim \text{Normal}(0, 1) \quad \text{as } n \to \infty,$$
which was to be shown.

12. Suppose that a factory produces a number $X$ of items in a week, where $X$ can be modeled by a random variable with mean 50. Suppose also that the variance for a week's production is known to be 25. What can be said about the probability that this week's production will be between 40 and 60?

Solution: Write
$$\Pr(40 < X < 60) = \Pr(-10 < X - \mu < 10),$$
where $\mu = 50$, so that
$$\Pr(40 < X < 60) = \Pr(|X - \mu| < 2\sigma),$$
where $\sigma = 5$. We can then write
$$\Pr(40 < X < 60) = 1 - \Pr(|X - \mu| \ge 2\sigma),$$
where, by Chebyshev's inequality,
$$\Pr(|X - \mu| \ge 2\sigma) \le \frac{\sigma^2}{4\sigma^2} = \frac{1}{4}.$$
Consequently,
$$\Pr(40 < X < 60) \ge \frac{3}{4};$$
so that the probability that this week's production will be between 40 and 60 is at least 75%.

13. Let $(X_n)$ denote a sequence of nonnegative random variables with means $\mu_n = E(X_n)$, for each $n = 1, 2, 3, \ldots$. Assume that $\lim_{n \to \infty} \mu_n = 0$. Show that $X_n$ converges in probability to 0 as $n \to \infty$.
Solution: We need to show that, for any $\varepsilon > 0$,
$$\lim_{n \to \infty} \Pr(|X_n| < \varepsilon) = 1. \tag{46}$$
We will establish (46) by first applying the Markov inequality: given a random variable $X$ and any $\varepsilon > 0$,
$$\Pr(|X| \ge \varepsilon) \le \frac{E(|X|)}{\varepsilon}, \tag{47}$$
to $X = X_n$, for each $n$. Observe that
$$E(|X_n|) = E(X_n) = \mu_n, \quad \text{for all } n,$$
since we are assuming that $X_n$ is nonnegative. It then follows from (47) that
$$\Pr(|X_n| \ge \varepsilon) \le \frac{\mu_n}{\varepsilon}, \quad \text{for } n = 1, 2, 3, \ldots \tag{48}$$
It follows from (48) that $\Pr(|X_n| < \varepsilon) \ge 1 - \dfrac{\mu_n}{\varepsilon}$, so that
$$1 - \frac{\mu_n}{\varepsilon} \le \Pr(|X_n| < \varepsilon) \le 1, \quad \text{for } n = 1, 2, 3, \ldots \tag{49}$$
Next, use the assumption that $\lim_{n \to \infty} \mu_n = 0$ and the Squeeze Lemma to obtain from (49) that
$$\lim_{n \to \infty} \Pr(|X_n| < \varepsilon) = 1,$$
which is the statement in (46). Hence, $X_n$ converges in probability to 0 as $n \to \infty$.

14. Let $X_1, X_2, \ldots, X_n$ be a random sample from a Poisson distribution with mean $\lambda$. Thus, $Y_n = \sum_{i=1}^{n} X_i$ has a Poisson distribution with mean $n\lambda$. Moreover, by the Central Limit Theorem, $\overline{X}_n = Y_n/n$ has, approximately, a Normal$(\lambda, \lambda/n)$ distribution for large $n$. Show that, for large values of $n$, the distribution of
$$2\sqrt{n}\left(\sqrt{\frac{Y_n}{n}} - \sqrt{\lambda}\right)$$
is independent of $\lambda$.

Solution: See solution to Problem 11.
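The variance-stabilizing effect claimed in Problems 11 and 14 can be checked numerically: since $2\sqrt{n}\,\sqrt{Y/n} = 2\sqrt{Y}$, and the statistic in (41) differs from $2\sqrt{Y}$ only by a constant, $\text{Var}(2\sqrt{Y})$ should be close to 1 for $Y \sim \text{Poisson}(n\lambda)$, whatever the value of $\lambda$. The sketch below (illustrative parameter choices, standard library only) evaluates that variance from the Poisson pmf, computed in log space to avoid overflow:

```python
import math

def var_2sqrt_poisson(m):
    """Var(2*sqrt(Y)) for Y ~ Poisson(m), summing the pmf near the mean."""
    lo = max(0, int(m - 10 * math.sqrt(m)))
    hi = int(m + 10 * math.sqrt(m))
    e1 = e2 = 0.0
    for y in range(lo, hi + 1):
        # log pmf: -m + y log m - log(y!), exponentiated only at the end
        p = math.exp(-m + y * math.log(m) - math.lgamma(y + 1))
        e1 += 2 * math.sqrt(y) * p      # E[2 sqrt(Y)]
        e2 += 4 * y * p                  # E[(2 sqrt(Y))^2] = 4 E[Y]
    return e2 - e1 * e1

# Two different values of lambda with n = 100; both variances should be near 1
for lam in (4.0, 16.0):
    print(lam, var_2sqrt_poisson(100 * lam))
```

Both runs return values very close to 1, illustrating that the square-root transformation stabilizes the variance independently of $\lambda$.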
More informationACM 116: Lectures 3 4
1 ACM 116: Lectures 3 4 Joint distributions The multivariate normal distribution Conditional distributions Independent random variables Conditional distributions and Monte Carlo: Rejection sampling Variance
More informationChapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University
Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real
More informationMath 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14
Math 325 Intro. Probability & Statistics Summer Homework 5: Due 7/3/. Let X and Y be continuous random variables with joint/marginal p.d.f. s f(x, y) 2, x y, f (x) 2( x), x, f 2 (y) 2y, y. Find the conditional
More informationStochastic Models in Computer Science A Tutorial
Stochastic Models in Computer Science A Tutorial Dr. Snehanshu Saha Department of Computer Science PESIT BSC, Bengaluru WCI 2015 - August 10 to August 13 1 Introduction 2 Random Variable 3 Introduction
More informationDefinition: A random variable X is a real valued function that maps a sample space S into the space of real numbers R. X : S R
Random Variables Definition: A random variable X is a real valued function that maps a sample space S into the space of real numbers R. X : S R As such, a random variable summarizes the outcome of an experiment
More informationChapter 5. Statistical Models in Simulations 5.1. Prof. Dr. Mesut Güneş Ch. 5 Statistical Models in Simulations
Chapter 5 Statistical Models in Simulations 5.1 Contents Basic Probability Theory Concepts Discrete Distributions Continuous Distributions Poisson Process Empirical Distributions Useful Statistical Models
More informationSDS 321: Introduction to Probability and Statistics
SDS 321: Introduction to Probability and Statistics Lecture 14: Continuous random variables Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin www.cs.cmu.edu/
More informationRelationship between probability set function and random variable - 2 -
2.0 Random Variables A rat is selected at random from a cage and its sex is determined. The set of possible outcomes is female and male. Thus outcome space is S = {female, male} = {F, M}. If we let X be
More informationMidterm #1. Lecture 10: Joint Distributions and the Law of Large Numbers. Joint Distributions - Example, cont. Joint Distributions - Example
Midterm #1 Midterm 1 Lecture 10: and the Law of Large Numbers Statistics 104 Colin Rundel February 0, 01 Exam will be passed back at the end of class Exam was hard, on the whole the class did well: Mean:
More informationSTAT2201. Analysis of Engineering & Scientific Data. Unit 3
STAT2201 Analysis of Engineering & Scientific Data Unit 3 Slava Vaisman The University of Queensland School of Mathematics and Physics What we learned in Unit 2 (1) We defined a sample space of a random
More informationProbability and Statistics Notes
Probability and Statistics Notes Chapter Five Jesse Crawford Department of Mathematics Tarleton State University Spring 2011 (Tarleton State University) Chapter Five Notes Spring 2011 1 / 37 Outline 1
More informationIntroduction to Probability
LECTURE NOTES Course 6.041-6.431 M.I.T. FALL 2000 Introduction to Probability Dimitri P. Bertsekas and John N. Tsitsiklis Professors of Electrical Engineering and Computer Science Massachusetts Institute
More informationJoint Probability Distributions and Random Samples (Devore Chapter Five)
Joint Probability Distributions and Random Samples (Devore Chapter Five) 1016-345-01: Probability and Statistics for Engineers Spring 2013 Contents 1 Joint Probability Distributions 2 1.1 Two Discrete
More information1 Basic continuous random variable problems
Name M362K Final Here are problems concerning material from Chapters 5 and 6. To review the other chapters, look over previous practice sheets for the two exams, previous quizzes, previous homeworks and
More informationContinuous Expectation and Variance, the Law of Large Numbers, and the Central Limit Theorem Spring 2014
Continuous Expectation and Variance, the Law of Large Numbers, and the Central Limit Theorem 18.5 Spring 214.5.4.3.2.1-4 -3-2 -1 1 2 3 4 January 1, 217 1 / 31 Expected value Expected value: measure of
More informationThis does not cover everything on the final. Look at the posted practice problems for other topics.
Class 7: Review Problems for Final Exam 8.5 Spring 7 This does not cover everything on the final. Look at the posted practice problems for other topics. To save time in class: set up, but do not carry
More informationStatistics STAT:5100 (22S:193), Fall Sample Final Exam B
Statistics STAT:5 (22S:93), Fall 25 Sample Final Exam B Please write your answers in the exam books provided.. Let X, Y, and Y 2 be independent random variables with X N(µ X, σ 2 X ) and Y i N(µ Y, σ 2
More informationHW1 (due 10/6/05): (from textbook) 1.2.3, 1.2.9, , , (extra credit) A fashionable country club has 100 members, 30 of whom are
HW1 (due 10/6/05): (from textbook) 1.2.3, 1.2.9, 1.2.11, 1.2.12, 1.2.16 (extra credit) A fashionable country club has 100 members, 30 of whom are lawyers. Rumor has it that 25 of the club members are liars
More informationELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables
Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Random Variable Discrete Random
More informationLecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable
Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed
More information(Practice Version) Midterm Exam 2
EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2014 Kannan Ramchandran November 7, 2014 (Practice Version) Midterm Exam 2 Last name First name SID Rules. DO NOT open
More information1 Presessional Probability
1 Presessional Probability Probability theory is essential for the development of mathematical models in finance, because of the randomness nature of price fluctuations in the markets. This presessional
More informationSuppose that you have three coins. Coin A is fair, coin B shows heads with probability 0.6 and coin C shows heads with probability 0.8.
Suppose that you have three coins. Coin A is fair, coin B shows heads with probability 0.6 and coin C shows heads with probability 0.8. Coin A is flipped until a head appears, then coin B is flipped until
More informationAnalysis of Engineering and Scientific Data. Semester
Analysis of Engineering and Scientific Data Semester 1 2019 Sabrina Streipert s.streipert@uq.edu.au Example: Draw a random number from the interval of real numbers [1, 3]. Let X represent the number. Each
More informationX = X X n, + X 2
CS 70 Discrete Mathematics for CS Fall 2003 Wagner Lecture 22 Variance Question: At each time step, I flip a fair coin. If it comes up Heads, I walk one step to the right; if it comes up Tails, I walk
More informationProbability reminders
CS246 Winter 204 Mining Massive Data Sets Probability reminders Sammy El Ghazzal selghazz@stanfordedu Disclaimer These notes may contain typos, mistakes or confusing points Please contact the author so
More information(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise.
54 We are given the marginal pdfs of Y and Y You should note that Y gamma(4, Y exponential( E(Y = 4, V (Y = 4, E(Y =, and V (Y = 4 (a With U = Y Y, we have E(U = E(Y Y = E(Y E(Y = 4 = (b Because Y and
More informationProbability Distributions
Probability Distributions Series of events Previously we have been discussing the probabilities associated with a single event: Observing a 1 on a single roll of a die Observing a K with a single card
More information