STAT/MA 416 Answers Homework 4 September 27, 2007 Solutions by Mark Daniel Ward PROBLEMS

2. We just examine the 36 possible products of two dice. We see that

P(X = i) = 1/36 for i = 1, 9, 16, 25, 36
P(X = i) = 2/36 for i = 2, 3, 5, 8, 10, 15, 18, 20, 24, 30
P(X = i) = 3/36 for i = 4
P(X = i) = 4/36 for i = 6, 12
P(X = i) = 0 otherwise

4. The probabilities are as follows:

P(X = 1) = 5/10 = 1/2
P(X = 2) = (5/10)(5/9) = 5/18
P(X = 3) = (5/10)(4/9)(5/8) = 5/36
P(X = 4) = (5/10)(4/9)(3/8)(5/7) = 5/84
P(X = 5) = (5/10)(4/9)(3/8)(2/7)(5/6) = 5/252
P(X = 6) = (5/10)(4/9)(3/8)(2/7)(1/6)(5/5) = 1/252
P(X = i) = 0 for i = 7, 8, 9, 10

6. There are 8 equally likely possibilities: HHH, which yields P(X = 3) = 1/8; HHT, HTH, THH, which yield P(X = 1) = 3/8; TTH, THT, HTT, which yield P(X = -1) = 3/8; TTT, which yields P(X = -3) = 1/8.

8a. Define X to be the maximum value of two rolls. Then, considering all 36 possibilities,

P(X = 1) = 1/36, P(X = 2) = 3/36, P(X = 3) = 5/36, P(X = 4) = 7/36, P(X = 5) = 9/36, P(X = 6) = 11/36
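As a quick check (a sketch, not part of the original solution), the pmf in Problem 2 can be verified by brute-force enumeration of the 36 equally likely outcomes:

```python
# Sketch: enumerate all 36 equally likely outcomes of two fair dice and
# tally the pmf of their product, as in Problem 2.
from collections import Counter
from fractions import Fraction

counts = Counter(a * b for a in range(1, 7) for b in range(1, 7))
pmf = {i: Fraction(c, 36) for i, c in counts.items()}

# e.g. the product 6 arises from (1,6), (6,1), (2,3), (3,2)
assert pmf[6] == pmf[12] == Fraction(4, 36)
```

The same loop, with max(a, b) in place of a * b, reproduces the answer to Problem 8a.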

8b. Define X to be the minimum value of two rolls. Then, considering all 36 possibilities,

P(X = 1) = 11/36, P(X = 2) = 9/36, P(X = 3) = 7/36, P(X = 4) = 5/36, P(X = 5) = 3/36, P(X = 6) = 1/36

8c. Define X to be the sum of the two rolls. Then, considering all 36 possibilities,

P(X = 2) = 1/36, P(X = 3) = 2/36, P(X = 4) = 3/36, P(X = 5) = 4/36, P(X = 6) = 5/36, P(X = 7) = 6/36, P(X = 8) = 5/36, P(X = 9) = 4/36, P(X = 10) = 3/36, P(X = 11) = 2/36, P(X = 12) = 1/36

8d. Define X to be the value of the first roll minus the second roll. Then, considering all 36 possibilities,

P(X = -5) = 1/36, P(X = -4) = 2/36, P(X = -3) = 3/36, P(X = -2) = 4/36, P(X = -1) = 5/36, P(X = 0) = 6/36, P(X = 1) = 5/36, P(X = 2) = 4/36, P(X = 3) = 3/36, P(X = 4) = 2/36, P(X = 5) = 1/36

14. Write v_i for the number that Player i gets. Then P(X = 0) = P(v_2 > v_1) = 1/2 (i.e., Player 2 beats Player 1). Also P(X = 1) = P(v_3 > v_1 > v_2) = 1/3! = 1/6 (i.e., Player 3 beats Player 1, who beats Player 2). Also, P(X = 2) = P(v_4 > v_1 > v_2, v_3) = 2/4! = 1/12 (i.e., Player 4 beats Player 1, who beats Players 2 and 3; there are 4! orderings of these 4 players, and either Player 2 or 3 can be last, i.e., 2 ways). Also, P(X = 3) = P(v_5 > v_1 > v_2, v_3, v_4) = 3!/5! = 1/20 (i.e., Player 5 beats Player 1, who beats Players 2, 3, and 4; there are 5! orderings of these 5 players, and Players 2, 3, 4 can be arranged in any order at the lower end, i.e., 3! ways). Finally, P(X = 4) = 1/5, since Player 1 must have the highest of the five values in this case.

18. Define Y = X - 2. Then the probability mass function of Y is

p(-2) = P(Y = -2) = P(X = 0) = 1/16
p(-1) = P(Y = -1) = P(X = 1) = 4/16
p(0) = P(Y = 0) = P(X = 2) = 6/16
p(1) = P(Y = 1) = P(X = 3) = 4/16
p(2) = P(Y = 2) = P(X = 4) = 1/16
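The four pmfs in Problem 8 can likewise be confirmed by enumeration; the snippet below is a sketch, not part of the original solution:

```python
# Sketch: tally the pmfs of the max, min, sum, and difference of two
# fair dice over the 36 equally likely ordered pairs (counts out of 36).
from collections import Counter

rolls = [(a, b) for a in range(1, 7) for b in range(1, 7)]
pmf_max = Counter(max(a, b) for a, b in rolls)
pmf_min = Counter(min(a, b) for a, b in rolls)
pmf_sum = Counter(a + b for a, b in rolls)
pmf_diff = Counter(a - b for a, b in rolls)

assert pmf_max[6] == 11 and pmf_min[1] == 11   # 11/36 each
```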

19. The probability mass function of X is

p(0) = 1/2, p(1) = 1/10, p(2) = 1/5, p(3) = 1/10, p(3.5) = 1/10

21a. As discussed on page 143, the expected number of students on the bus of a randomly chosen student is larger than the average number of students on a bus. This is a general phenomenon, and it occurs because the more students there are on a bus, the more likely it is that a randomly chosen student was on that bus. So we expect E[X] ≥ E[Y].

21b. We compute

E[X] = 40(40/148) + 33(33/148) + 25(25/148) + 50(50/148) ≈ 39.28
E[Y] = 40(1/4) + 33(1/4) + 25(1/4) + 50(1/4) = 37

22a. The series of games is a best 2-out-of-3. We use q = 1 - p for convenience. Let X denote the number of games needed until A wins two games, and let Y denote the number of games needed until B wins two games. Then X is a negative binomial random variable with parameters p and r = 2; also, Y is a negative binomial random variable with parameters q = 1 - p and r = 2. Writing C(n, k) for the binomial coefficient, the probability that the series ends in exactly two games is

P(X = 2) + P(Y = 2) = C(1, 1) p^2 (1 - p)^{2-2} + C(1, 1) q^2 (1 - q)^{2-2} = p^2 + q^2

The probability that the series ends in exactly three games is

P(X = 3) + P(Y = 3) = C(2, 1) p^2 (1 - p)^{3-2} + C(2, 1) q^2 (1 - q)^{3-2} = 2p^2 q + 2q^2 p

So the expected number of games is

2(p^2 + q^2) + 3(2p^2 q + 2q^2 p) = 2(1 + p - p^2)

To maximize this with respect to p, we differentiate with respect to p, yielding 2(1 - 2p). Then we solve 2(1 - 2p) = 0, which yields p = 1/2.

22b. The series of games is a best 3-out-of-5. We use q = 1 - p for convenience. Let X denote the number of games needed until A wins three games, and let Y denote the number of games needed until B wins three games. Then X is a negative binomial random variable with parameters p and r = 3; also, Y is a negative binomial random variable with parameters q = 1 - p and r = 3.

The probability that the series ends in exactly three games is

P(X = 3) + P(Y = 3) = C(2, 2) p^3 (1 - p)^{3-3} + C(2, 2) q^3 (1 - q)^{3-3} = p^3 + q^3

The probability that the series ends in exactly four games is

P(X = 4) + P(Y = 4) = C(3, 2) p^3 (1 - p)^{4-3} + C(3, 2) q^3 (1 - q)^{4-3} = 3p^3 q + 3q^3 p

The probability that the series ends in exactly five games is

P(X = 5) + P(Y = 5) = C(4, 2) p^3 (1 - p)^{5-3} + C(4, 2) q^3 (1 - q)^{5-3} = 6p^3 q^2 + 6q^3 p^2

So the expected number of games is

3(p^3 + q^3) + 4(3p^3 q + 3q^3 p) + 5(6p^3 q^2 + 6q^3 p^2) = 3(1 + p + p^2 - 4p^3 + 2p^4)

To maximize this with respect to p, we differentiate with respect to p, yielding 3(1 + 2p - 12p^2 + 8p^3). Then we solve 3(1 + 2p - 12p^2 + 8p^3) = 0, which yields p = 1/2.

28. Let X denote the number of defective items in the sample. Then

p(n) = P(X = n) = C(4, n) C(16, 3 - n) / C(20, 3) for n = 0, 1, 2, 3

So the expected value of X is

E[X] = (0) C(4,0)C(16,3)/C(20,3) + (1) C(4,1)C(16,2)/C(20,3) + (2) C(4,2)C(16,1)/C(20,3) + (3) C(4,3)C(16,0)/C(20,3) = 3/5

30a. We note that p(2^n) = P(X = 2^n) = 1/2^n, since n - 1 heads and then 1 tail are needed in order to win 2^n dollars. Also, p(i) = 0 when i is not a power of 2. So the expected value of X is

E[X] = Σ_i i p(i) = Σ_{n=1}^{∞} 2^n (1/2^n) = 1 + 1 + 1 + 1 + ⋯ = +∞

It is not advisable to pay $1 million to play the game only once, because unless at least 19 consecutive heads are thrown, the winnings are less than 2^20 dollars. So the player would quite likely lose money on the first game.

30b. The player is expected to win infinitely many dollars each time the game is played, on average. So, by playing several times, the player would be virtually guaranteed success. So it is advisable to play as many times as we like, if we are allowed to settle up only when we stop playing.

33. By Proposition 6.1 on page 156, the most likely number of customers is the largest integer less than or equal to (n + 1)p = (11)(1/3), namely, the integer 3. So the newsboy should buy 3 papers. One way to see this rigorously is the following: When X = i, the profit is (15 - 10)i - 10(y - i) for i ≤ y; when X = i, the profit is (15 - 10)y for i > y. So the expected profit when buying y papers is

Σ_{i=0}^{y} ((15 - 10)i - 10(y - i)) P(X = i) + Σ_{i=y+1}^{10} (15 - 10)y P(X = i)

The probability that X = i is C(10, i) (1/3)^i (2/3)^{10-i}. So we obtain the following table:

number of papers purchased | expected profit
y = 0 | 0
y = 1 | 4.74
y = 2 | 8.18
y = 3 | 8.69
y = 4 | 5.30
y = 5 | -1.50
y = 6 | -10.35
etc.

So the newsboy should buy three papers.

36. As discussed in Problem 22a, we have p(2) = p^2 + q^2 and p(3) = 2p^2 q + 2q^2 p. We write X for the number of games played. In Problem 22a, we already saw that E[X] = 2(1 + p - p^2). Also

E[X^2] = 2^2 p(2) + 3^2 p(3) = 2(2 + 5p - 5p^2)

So the variance is

Var(X) = E[X^2] - (E[X])^2 = 2(2 + 5p - 5p^2) - (2(1 + p - p^2))^2 = 2p - 6p^2 + 8p^3 - 4p^4

Also, (d/dp) Var(X) = 2 - 12p + 24p^2 - 16p^3. Solving (d/dp) Var(X) = 0 yields p = 1/2, as desired.

37. We already saw in Problem 21b that

E[X] = 40(40/148) + 33(33/148) + 25(25/148) + 50(50/148) ≈ 39.28
E[Y] = 40(1/4) + 33(1/4) + 25(1/4) + 50(1/4) = 37

Now we compute

E[X^2] = 40^2(40/148) + 33^2(33/148) + 25^2(25/148) + 50^2(50/148) ≈ 1625.42
E[Y^2] = 40^2(1/4) + 33^2(1/4) + 25^2(1/4) + 50^2(1/4) = 1453.5

Thus Var(X) ≈ 82.20 and Var(Y) = 84.5.

38a. Since E[X] = 1 and Var(X) = 5, we compute 5 = Var(X) = E[X^2] - (E[X])^2 = E[X^2] - 1^2, so E[X^2] = 6. Thus

E[(2 + X)^2] = E[4 + 4X + X^2] = 4 + 4E[X] + E[X^2] = 4 + (4)(1) + 6 = 14

38b. We compute

Var(4 + 3X) = E[(4 + 3X)^2] - (E[4 + 3X])^2 = E[16 + 24X + 9X^2] - (4 + 3E[X])^2 = 16 + 24E[X] + 9E[X^2] - (16 + 24E[X] + 9(E[X])^2) = 16 + 24(1) + 9(6) - (16 + 24(1) + 9(1)^2) = 45
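Returning to Problem 33, the expected-profit table can be reproduced numerically; the code below is a sketch under the stated assumptions (demand X ~ Binomial(10, 1/3), papers bought at 10 cents and sold at 15 cents):

```python
# Sketch: expected profit for the newsboy of Problem 33 who buys y papers,
# with demand X ~ Binomial(10, 1/3); profit is 15*min(X, y) - 10*y.
from math import comb

def expected_profit(y):
    total = 0.0
    for i in range(11):
        p_i = comb(10, i) * (1/3)**i * (2/3)**(10 - i)
        total += (15 * min(i, y) - 10 * y) * p_i
    return total

table = {y: round(expected_profit(y), 2) for y in range(7)}
# y = 3 maximizes the expected profit, so three papers should be bought
```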

44. Given rainy weather, the probability that exactly j of the components will function is C(n, j) p_1^j (1 - p_1)^{n-j}. Thus, given rainy weather, the probability that the system will function (i.e., at least k components function) is Σ_{j=k}^{n} C(n, j) p_1^j (1 - p_1)^{n-j}.

Given dry weather, the probability that exactly j of the components will function is C(n, j) p_2^j (1 - p_2)^{n-j}. Thus, given dry weather, the probability that the system will function (i.e., at least k components function) is Σ_{j=k}^{n} C(n, j) p_2^j (1 - p_2)^{n-j}.

Conditioning on whether or not it rains yields

P(system functions) = P(system functions | rain)P(rain) + P(system functions | dry)P(dry)

So the probability that the system functions is exactly

(Σ_{j=k}^{n} C(n, j) p_1^j (1 - p_1)^{n-j}) α + (Σ_{j=k}^{n} C(n, j) p_2^j (1 - p_2)^{n-j}) (1 - α)

46a. Given a guilty person, the probability that exactly j of the 12 jurors will render a correct decision is C(12, j)(.8)^j(.2)^{12-j}. Thus, given a guilty person, the probability that the jury gives a correct decision (i.e., at least 9 of the 12 vote guilty) is Σ_{j=9}^{12} C(12, j)(.8)^j(.2)^{12-j}.

Given an innocent person, the probability that exactly j of the jurors will render a correct decision is C(12, j)(.9)^j(.1)^{12-j}. Thus, given an innocent person, the probability that the jury gives a correct decision (i.e., fewer than 9 vote guilty, i.e., at least 4 of the 12 vote innocent) is Σ_{j=4}^{12} C(12, j)(.9)^j(.1)^{12-j}.

Conditioning on whether the defendant is actually guilty or innocent, the probability that the jury renders a correct decision is

P(correct decision) = P(correct decision | guilty person)P(guilty person) + P(correct decision | innocent person)P(innocent person)

So the probability that the jury gives a correct decision is exactly

(Σ_{j=9}^{12} C(12, j)(.8)^j(.2)^{12-j})(.65) + (Σ_{j=4}^{12} C(12, j)(.9)^j(.1)^{12-j})(.35) ≈ .8665

46b. Given a guilty person, the probability that exactly j of the jurors will convict is C(12, j)(.8)^j(.2)^{12-j}. Thus, given a guilty person, the probability that the jury convicts (i.e., at least 9 of the 12 vote guilty) is Σ_{j=9}^{12} C(12, j)(.8)^j(.2)^{12-j}.

Given an innocent person, the probability that exactly j of the jurors will convict is C(12, j)(.1)^j(.9)^{12-j}. Thus, given an innocent person, the probability that the jury convicts (i.e., at least 9 of the 12 vote guilty) is Σ_{j=9}^{12} C(12, j)(.1)^j(.9)^{12-j}.

Conditioning on whether the defendant is actually guilty or innocent, the probability that the jury convicts is

P(convicts) = P(convicts | guilty person)P(guilty person) + P(convicts | innocent person)P(innocent person)

So the probability that the jury convicts is exactly

(Σ_{j=9}^{12} C(12, j)(.8)^j(.2)^{12-j})(.65) + (Σ_{j=9}^{12} C(12, j)(.1)^j(.9)^{12-j})(.35) ≈ .5165
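The two answers in Problem 46 can be checked numerically; the sketch below assumes, as above, a 12-person jury with at least 9 guilty votes needed to convict:

```python
# Sketch: Problem 46. binom_tail(n, p, k) = P(at least k successes
# in n independent trials, each succeeding with probability p).
from math import comb

def binom_tail(n, p, k):
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

# 46a: P(correct verdict), conditioning on guilt (prob .65) vs innocence
correct = binom_tail(12, .8, 9) * .65 + binom_tail(12, .9, 4) * .35
# 46b: P(conviction); given innocence, each juror votes guilty with prob .1
convict = binom_tail(12, .8, 9) * .65 + binom_tail(12, .1, 9) * .35
```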

52a. Since the average number of airplane crashes in a month is 3.5, the number of crashes next month is Poisson with parameter λ = 3.5. So the probability that at least 2 such accidents will occur next month is

P(X ≥ 2) = 1 - P(X = 0) - P(X = 1) = 1 - e^{-3.5} 3.5^0/0! - e^{-3.5} 3.5^1/1! ≈ .8641

52b. The probability that at most 1 such accident will occur next month is

P(X ≤ 1) = P(X = 0) + P(X = 1) = e^{-3.5} 3.5^0/0! + e^{-3.5} 3.5^1/1! ≈ .1359

54a. Since the average number of abandoned cars in a week is 2.2, the number of abandoned cars next week is Poisson with parameter λ = 2.2. So the probability that no cars are abandoned next week is

P(X = 0) = e^{-2.2} 2.2^0/0! ≈ .1108

54b. The probability that at least 2 cars are abandoned next week is

P(X ≥ 2) = 1 - P(X = 0) - P(X = 1) = 1 - e^{-2.2} 2.2^0/0! - e^{-2.2} 2.2^1/1! ≈ .6454

58a. The binomial probability and its Poisson approximation are, respectively,

P(X = 2) = C(8, 2)(.1)^2(.9)^6 ≈ .1488 and e^{-(8)(.1)} ((8)(.1))^2/2! ≈ .1438

58b. The binomial probability and its Poisson approximation are, respectively,

P(X = 9) = C(10, 9)(.95)^9(.05)^1 ≈ .3151 and e^{-(10)(.95)} ((10)(.95))^9/9! ≈ .1300

58c. The binomial probability and its Poisson approximation are, respectively,

P(X = 0) = C(10, 0)(.1)^0(.9)^10 ≈ .3487 and e^{-(10)(.1)} ((10)(.1))^0/0! ≈ .3679

58d. The binomial probability and its Poisson approximation are, respectively,

P(X = 4) = C(9, 4)(.2)^4(.8)^5 ≈ .0661 and e^{-(9)(.2)} ((9)(.2))^4/4! ≈ .0723

59a. Since the average number of prizes won is 50/100 = 1/2, the number of prizes won is Poisson with parameter λ = 1/2. So the probability that at least one prize is won is

P(X ≥ 1) = 1 - P(X = 0) = 1 - e^{-1/2} (1/2)^0/0! ≈ .3935

59b. The probability that exactly one prize is won is

P(X = 1) = e^{-1/2} (1/2)^1/1! ≈ .3033

59c. The probability that at least two prizes are won is

P(X ≥ 2) = 1 - P(X = 0) - P(X = 1) = 1 - e^{-1/2} (1/2)^0/0! - e^{-1/2} (1/2)^1/1! ≈ .0902

64a. Throughout the problem below, we make the assumption that all suicides are independent. Since the average number of suicides per month is 400,000/100,000 = 4, then the

number of suicides per month is Poisson with parameter λ = 4. So the probability that at least 8 suicides will occur in a month is

P(X ≥ 8) = 1 - Σ_{i=0}^{7} P(X = i) = 1 - Σ_{i=0}^{7} e^{-4} 4^i/i! ≈ .0511

64b. Write X for the number of months during the year that each have 8 or more suicides. Then P(X ≥ 2) = 1 - P(X = 0) - P(X = 1). Writing simply p ≈ .0511 to denote the probability from 64a above that at least 8 suicides take place in a given month, it follows that

P(X ≥ 2) = 1 - C(12, 0) p^0 (1 - p)^{12} - C(12, 1) p^1 (1 - p)^{11} ≈ .1229

64c. Let Y denote the first month to have 8 or more suicides. Then for i ≥ 1 we have

P(Y = i) = (1 - p)^{i-1} p

72. We use A to denote the stronger team. The series of games is a best 4-out-of-7. Let X denote the number of games needed until A wins four games. Then X is a negative binomial random variable with parameters p = .6 and r = 4. The probability that A wins the first four games is

P(X = 4) = C(3, 3) p^4 (1 - p)^{4-4} = .1296

The probability that A wins four out of the first five games is

P(X = 5) = C(4, 3) p^4 (1 - p)^{5-4} = .20736

The probability that A wins four out of the first six games is

P(X = 6) = C(5, 3) p^4 (1 - p)^{6-4} = .20736

The probability that A wins four out of the first seven games is

P(X = 7) = C(6, 3) p^4 (1 - p)^{7-4} = .165888

So the probability that A wins the series is .1296 + .20736 + .20736 + .165888 = .710208.

In the case where the series of games is a best 2-out-of-3, let X denote the number of games needed until A wins two games. Then X is a negative binomial random variable with parameters p = .6 and r = 2. The probability that A wins the first two games is

P(X = 2) = C(1, 1) p^2 (1 - p)^{2-2} = .36

The probability that A wins two out of the first three games is

P(X = 3) = C(2, 1) p^2 (1 - p)^{3-2} = .288

So the probability that A wins the series is .36 + .288 = .648.
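The series probabilities in Problem 72 (and, with p = 1/2, Problem 73) can be checked with a short negative binomial computation; this is a sketch, not part of the original solution:

```python
# Sketch: P(team A wins a best r-of-(2r-1) series), where A wins each
# game independently with probability p (negative binomial tail).
from math import comb

def win_series(p, r):
    return sum(comb(n - 1, r - 1) * p**r * (1 - p)**(n - r)
               for n in range(r, 2 * r))

best_of_7 = win_series(.6, 4)   # Problem 72, best 4-out-of-7
best_of_3 = win_series(.6, 2)   # Problem 72, best 2-out-of-3
```

With p = .6 these give .710208 and .648, so the longer series favors the stronger team.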

73. We use the same notation as in Problem 72 above, but now p = 1/2 instead of p = .6. We have

P(X = 4) = C(3, 3) p^4 (1 - p)^{4-4} = .0625

The probability that A wins four out of the first five games is

P(X = 5) = C(4, 3) p^4 (1 - p)^{5-4} = .125

The probability that A wins four out of the first six games is

P(X = 6) = C(5, 3) p^4 (1 - p)^{6-4} = .15625

The probability that A wins four out of the first seven games is

P(X = 7) = C(6, 3) p^4 (1 - p)^{7-4} = .15625

Since p = 1/2, then 1 - p = 1/2 also, so the probability that team B wins in 4, 5, 6, or 7 games is the same in each case as listed above, i.e., .0625, .125, .15625, .15625, respectively. So the expected number of games in the series is

(4)(.0625 + .0625) + (5)(.125 + .125) + (6)(.15625 + .15625) + (7)(.15625 + .15625) = 5.8125

75. We note that X is a negative binomial random variable with p = 1/2 and r = 10. So the probability mass function of X is

p(n) = P(X = n) = C(n - 1, r - 1) p^r (1 - p)^{n-r}

for n = r, r + 1, r + 2, ....

78. Let X denote a hypergeometric random variable for the N = 8 balls in the urn, of which m = 4 are white, the other N - m = 4 are black, and for which we draw n = 4 balls each time. On each turn, the probability that exactly two of the four selected balls are black is P(X = 2) = C(4, 2)C(4, 2)/C(8, 4) = 18/35. So the probability that we shall make exactly n selections is

(1 - 18/35)^{n-1} (18/35) = (17/35)^{n-1} (18/35) = 17^{n-1} · 18/35^n

79a. We note that X is a hypergeometric random variable with N = 100 items altogether, of which m = 6 are defective, the other N - m = 94 are nondefective, and for which we draw n = 10 items. Then

P(X = 0) = C(6, 0)C(94, 10)/C(100, 10) = 1,886,711/3,612,280 ≈ .5223

79b. We also note that

P(X > 2) = 1 - P(X = 0) - P(X = 1) - P(X = 2) = 1 - C(6, 0)C(94, 10)/C(100, 10) - C(6, 1)C(94, 9)/C(100, 10) - C(6, 2)C(94, 8)/C(100, 10) = 22,669/1,806,140 ≈ .01255
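The hypergeometric answers in Problem 79 can be verified exactly with integer arithmetic; the snippet below is a sketch of that check:

```python
# Sketch: Problem 79, X hypergeometric with N = 100 items, m = 6
# defective, and a sample of n = 10 items.
from fractions import Fraction
from math import comb

def hyper_pmf(k):
    return Fraction(comb(6, k) * comb(94, 10 - k), comb(100, 10))

p0 = hyper_pmf(0)                                 # Problem 79a
p_gt2 = 1 - sum(hyper_pmf(k) for k in range(3))   # Problem 79b

assert p0 == Fraction(1886711, 3612280)
assert p_gt2 == Fraction(22669, 1806140)
```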