Introduction to Probability 2017/18 Supplementary Problems


Problem 1: Let $A$ and $B$ denote two events with $P(A \cup B) = 0$. Show that $P(A) = 0$ and $P(B) = 0$.

$A \subseteq A \cup B$ implies $P(A) \le P(A \cup B) = 0$, hence $P(A) = 0$. Similarly $B \subseteq A \cup B$ gives $P(B) = 0$.

Problem 2:

a) Show that the conditional probability obeys $0 \le P(A \mid B) \le 1$.

b) Show that for disjoint events $A$ and $B$ the conditional probability obeys $P(A \cup B \mid C) = P(A \mid C) + P(B \mid C)$.

a) $P(A \mid B) = P(A \cap B)/P(B) \ge 0$ since the numerator is non-negative and the denominator is positive. Since $A \cap B \subseteq B$ we have $P(A \cap B) \le P(B)$. Hence
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)} \le \frac{P(B)}{P(B)} = 1.$$

b) $(A \cup B) \cap C = (A \cap C) \cup (B \cap C)$, and the sets $A \cap C$ and $B \cap C$ are disjoint since $A$ and $B$ are disjoint. Hence
$$P(A \cup B \mid C) = \frac{P((A \cup B) \cap C)}{P(C)} = \frac{P(A \cap C) + P(B \cap C)}{P(C)} = P(A \mid C) + P(B \mid C).$$

Problem 3: A bag contains $n$ red balls and $n$ blue balls. A selection of $n$ balls is made from the bag at random without replacement. Let $A_i$ be the event that the selection contains exactly $i$ red balls.

a) Find $P(A_i)$.

b) What can you say about the events $A_0, A_1, \ldots, A_n$?

c) Deduce that $\sum_{i=0}^{n} \binom{n}{i}^2 = \binom{2n}{n}$.

a) We are making an unordered selection of $n$ balls from $2n$ balls without replacement, so $|S| = \binom{2n}{n}$. The number of ways of choosing $i$ red balls from the $n$ red balls in the bag is $\binom{n}{i}$. The number of ways of choosing the remaining $n - i$ balls from the $n$ blue balls in the bag is $\binom{n}{n-i}$. Hence
$$P(A_i) = \frac{\binom{n}{i}\binom{n}{n-i}}{\binom{2n}{n}}.$$
But
$$\binom{n}{n-i} = \frac{n!}{(n-(n-i))!\,(n-i)!} = \frac{n!}{i!\,(n-i)!} = \binom{n}{i},$$
so
$$P(A_i) = \frac{\binom{n}{i}^2}{\binom{2n}{n}}.$$

b) The events $A_0, A_1, \ldots, A_n$ are pairwise disjoint (selecting $k$ red balls excludes selecting $l$ red balls if $l \neq k$) and $A_0 \cup A_1 \cup \cdots \cup A_n = S$ (any outcome in $S$ contains some number of red balls and is thus contained in some $A_k$), so they are a partition of $S$.

c) The previous part and definition 2.1 b) and c) show that
$$\sum_{i=0}^{n} P(A_i) = P(A_0 \cup A_1 \cup \cdots \cup A_n) = P(S) = 1.$$
Substituting in our formula from part a) and rearranging, we get
$$\sum_{i=0}^{n} \binom{n}{i}^2 = \binom{2n}{n}.$$
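As a numerical sanity check of the identity in part c), here is a minimal Python sketch (not part of the original solution; the use of `math.comb` and the range of $n$ tested are my choices):

```python
from math import comb

# Verify sum_{i=0}^{n} C(n, i)^2 == C(2n, n) for small n.
for n in range(1, 11):
    lhs = sum(comb(n, i) ** 2 for i in range(n + 1))
    rhs = comb(2 * n, n)
    assert lhs == rhs, (n, lhs, rhs)
print("identity holds for n = 1, ..., 10")
```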

Problem 4: A student observes that out of all students graduating with first class degrees in mathematics from Queen Mary, 96% of them passed the mid-term test in Introduction to Probability. He claims that since he has just passed the test he has a very good chance of getting a first.

a) What is the obvious mistake with this argument?

b) Assume further that 80% of students pass the test and 10% of students get a first. What is the probability that a randomly chosen student who has passed the test gets a first?

a) Let $F$ be the event that a randomly chosen student obtains a first and $T$ be the event that a randomly chosen student passes the test. Our student has observed that $P(T \mid F)$ (that is, the conditional probability of passing the test given that a first is obtained) is large (0.96) and assumed that consequently $P(F \mid T)$ (that is, the conditional probability of getting a first given that the test is passed) is also large. However, in general there is no such connection between $P(F \mid T)$ and $P(T \mid F)$.

b) By Bayes' theorem
$$P(F \mid T) = P(T \mid F)\,\frac{P(F)}{P(T)}.$$
We are told that 10% of students get a first, i.e. $P(F) = 0.1$, and 96% of these passed the test, so $P(T \mid F) = 0.96$. We are also told that $P(T) = 0.8$. Putting these into the formula above we get
$$P(F \mid T) = \frac{0.096}{0.8} = 0.12$$
(rather smaller than $P(T \mid F)$).
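The computation in part b) is short enough to check directly; a minimal sketch (the variable names are mine):

```python
# Bayes' theorem: P(F|T) = P(T|F) * P(F) / P(T).
p_T_given_F = 0.96  # pass rate among students who get a first
p_F = 0.10          # proportion of students getting a first
p_T = 0.80          # overall pass rate

print(p_T_given_F * p_F / p_T)  # 0.12
```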

Problem 5: I toss a fair coin repeatedly, stopping when I have either tossed it 4 times or I have seen 2 heads (whichever happens first). I record the sequence of heads/tails seen.

a) Write down the sample space.

b) Are all elements of the sample space equally likely? Justify your answer.

c) Calculate the probability that I see 2 heads.

d) Calculate the conditional probability that I see 2 heads given that the first toss is a tail.

e) Calculate the conditional probability that the first toss is a tail given that I see 2 heads.

a) The sample space is
$$S = \{hh,\ hth,\ htth,\ httt,\ thh,\ thth,\ thtt,\ tthh,\ ttht,\ ttth,\ tttt\}.$$

b) By independence of subsequent coin tosses and the fact that each coin toss is equally likely to be a head or a tail we have
$$P(\{hh\}) = \left(\frac{1}{2}\right)^2 = \frac{1}{4}.$$
Similarly,
$$P(\{hth\}) = \left(\frac{1}{2}\right)^3 = \frac{1}{8}.$$
So $P(\{hh\}) \neq P(\{hth\})$, and so not all elements are equally likely. Note that to answer this part we don't need to calculate the probability of every element of the sample space. It suffices just to find two which have different probabilities. (Of course, other pairs would do as well.)

c) The event in question is $\{hh, hth, htth, thh, thth, tthh\}$. The probability of the simple events making this up can be calculated as in part b). We get
$$P(\text{2 heads}) = P(\{hh, hth, htth, thh, thth, tthh\}) = \frac{1}{4} + \frac{1}{8} + \frac{1}{16} + \frac{1}{8} + \frac{1}{16} + \frac{1}{16} = \frac{11}{16}.$$

d) Let $A$ be the event "I see 2 heads" and $B$ be the event "the first toss is a tail". We want $P(A \mid B)$. Obviously $P(B) = 1/2$,
$$P(A \cap B) = P(\{thh, thth, tthh\}) = \frac{1}{8} + \frac{1}{16} + \frac{1}{16} = \frac{1}{4},$$
and so
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{1/4}{1/2} = \frac{1}{2}.$$

e) This time we want $P(B \mid A)$. Using parts c) and d),
$$P(B \mid A) = \frac{P(A \cap B)}{P(A)} = \frac{1/4}{11/16} = \frac{4}{11}.$$
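The sample space and all three probabilities can also be recovered mechanically by applying the stopping rule to every length-4 sequence of tosses. A sketch of that enumeration (not part of the original solution; the encoding as h/t strings is my choice):

```python
from fractions import Fraction
from itertools import product

# Apply the stopping rule to a length-4 h/t string: truncate as soon
# as 2 heads have appeared, otherwise keep all 4 tosses.
def observed(seq):
    heads = 0
    for i, c in enumerate(seq):
        heads += c == "h"
        if heads == 2:
            return seq[: i + 1]
    return seq

# Each length-4 string has probability 1/16; an outcome of the stopped
# experiment accumulates the mass of all strings that truncate to it.
prob = {}
for tosses in product("ht", repeat=4):
    out = observed("".join(tosses))
    prob[out] = prob.get(out, Fraction(0)) + Fraction(1, 16)

print(sorted(prob))                             # the 11-element sample space
A = {s for s in prob if s.count("h") == 2}      # "I see 2 heads"
p_A = sum(prob[s] for s in A)
p_B = sum(p for s, p in prob.items() if s[0] == "t")
p_AB = sum(prob[s] for s in A if s[0] == "t")
print(p_A, p_AB / p_B, p_AB / p_A)              # 11/16, 1/2, 4/11
```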

Problem 6:

a) Suppose that $A$ and $B$ are independent events. Show that $A^c$ and $B$ are independent events.

b) Suppose that $A$ and $A^c$ are independent events. Show that either $P(A) = 0$ or $P(A) = 1$.

a) We have $A \cup A^c = S$ and $B = (B \cap A) \cup (B \cap A^c)$, with the two sets $B \cap A$ and $B \cap A^c$ being disjoint. Hence by definition 2.1 c),
$$P(B) = P(A^c \cap B) + P(A \cap B), \qquad \text{i.e.} \qquad P(A^c \cap B) = P(B) - P(A \cap B).$$
Using independence of $A$ and $B$, $P(A \cap B) = P(A)P(B)$, we obtain
$$P(A^c \cap B) = P(B) - P(A)P(B) = P(B)(1 - P(A)) = P(A^c)P(B),$$
and so $A^c$ and $B$ are independent.

b) By definition $A \cap A^c = \emptyset$ and so $P(A \cap A^c) = 0$. If $A$ and $A^c$ are independent then
$$0 = P(A \cap A^c) = P(A)P(A^c) = P(A)(1 - P(A)),$$
and so one of $P(A)$ and $1 - P(A)$ is 0. It follows that $P(A) = 0$ or $P(A) = 1$.

Problem 7: There are two roads from A to B and two roads from B to C. Suppose that each road is closed with probability $p$ and that the state of each road is independent of the others. What condition on $p$ ensures that the probability that I can travel from A to C is at least 1/2?

I can travel from A to B unless both roads are closed. By independence the probability that both roads are closed is $p^2$. So if I write $X$ for the event "I can travel from A to B" then $P(X) = 1 - p^2$. Similarly, if $Y$ is the event "I can travel from B to C" then $P(Y) = 1 - p^2$. I can travel from A to C if and only if both $X$ and $Y$ occur, so we want $P(X \cap Y)$ to be at least 1/2. Now, $P(X \cap Y) = P(X)P(Y)$. (The fact that $X$ and $Y$ are independent follows from the mutual independence of the states of the roads and the fact that $X$ and $Y$ depend on disjoint sets of roads; see below for the actual computation.) So we require that
$$(1 - p^2)^2 \ge \frac{1}{2}.$$
This is satisfied if
$$1 - p^2 \ge \frac{1}{\sqrt{2}},$$
that is, if
$$p^2 \le 1 - \frac{1}{\sqrt{2}}.$$
So we need that $p \le \sqrt{1 - 1/\sqrt{2}} \approx 0.541$.

To show that $P(X \cap Y) = P(X)P(Y)$, denote by $R_1$ and $R_2$ the events that the roads from A to B are open, and by $S_1$ and $S_2$ the events that the roads from B to C are open. Obviously $X = R_1 \cup R_2$ and $Y = S_1 \cup S_2$.
$$P(X \cap Y) = P((R_1 \cap Y) \cup (R_2 \cap Y)) = P(R_1 \cap Y) + P(R_2 \cap Y) - P(R_1 \cap R_2 \cap Y).$$
Since $R_1$ and $Y$ are independent (see e.g. the problem sheets) and $R_2$ and $Y$ are independent, we have
$$P(X \cap Y) = (P(R_1) + P(R_2))P(Y) - P((R_1 \cap R_2 \cap S_1) \cup (R_1 \cap R_2 \cap S_2)).$$
For the last term the inclusion-exclusion principle and mutual independence yield
$$P((R_1 \cap R_2 \cap S_1) \cup (R_1 \cap R_2 \cap S_2)) = P(R_1 \cap R_2 \cap S_1) + P(R_1 \cap R_2 \cap S_2) - P(R_1 \cap R_2 \cap S_1 \cap S_2)$$
$$= P(R_1)P(R_2)\bigl(P(S_1) + P(S_2) - P(S_1)P(S_2)\bigr) = P(R_1)P(R_2)P(S_1 \cup S_2) = P(R_1)P(R_2)P(Y).$$
Hence
$$P(X \cap Y) = (P(R_1) + P(R_2))P(Y) - P(R_1)P(R_2)P(Y) = P(R_1 \cup R_2)P(Y) = P(X)P(Y).$$
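A Monte Carlo check of the threshold, as a sketch (not part of the original solution; the trial count and seed are arbitrary choices):

```python
import random

# Estimate P(travel A -> C) for a few values of p and compare with (1 - p^2)^2.
def can_travel(p, rng):
    ab = any(rng.random() > p for _ in range(2))  # some road A -> B is open
    bc = any(rng.random() > p for _ in range(2))  # some road B -> C is open
    return ab and bc

rng = random.Random(0)
p_star = (1 - 1 / 2**0.5) ** 0.5  # threshold sqrt(1 - 1/sqrt(2)) ~ 0.541
for p in (0.5, p_star, 0.6):
    est = sum(can_travel(p, rng) for _ in range(200_000)) / 200_000
    print(f"p = {p:.3f}: simulated {est:.3f}, exact {(1 - p * p) ** 2:.3f}")
```

At $p = p^\ast$ the exact value is exactly 1/2, and the simulated frequency should sit close to it.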

Problem 8: Two important members of a cricket team are injured, and each has probability 1/3 of recovering before the match. The recoveries of the two players are independent of each other. If both are able to play then the team has probability 3/4 of winning the match; if only one of them plays then the probability of winning is 1/2; and if neither plays the probability of winning is 1/16. What is the probability that the match is won?

Let $E_0$, $E_1$, $E_2$ be the events "neither player recovers", "exactly one player recovers" and "both players recover" respectively. Let $W$ be the event "the match is won". By the theorem of total probability (Theorem 6.1)
$$P(W) = P(W \mid E_0)P(E_0) + P(W \mid E_1)P(E_1) + P(W \mid E_2)P(E_2).$$
Since the recoveries of the two players are independent we have
$$P(E_0) = \left(\frac{2}{3}\right)^2 = \frac{4}{9}, \qquad P(E_1) = 2 \cdot \frac{1}{3} \cdot \frac{2}{3} = \frac{4}{9}, \qquad P(E_2) = \left(\frac{1}{3}\right)^2 = \frac{1}{9}.$$
We are told the relevant conditional probabilities in the question, and so
$$P(W) = \frac{1}{16} \cdot \frac{4}{9} + \frac{1}{2} \cdot \frac{4}{9} + \frac{3}{4} \cdot \frac{1}{9} = \frac{1}{3}.$$
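The total-probability computation, done exactly with rational arithmetic as a sanity check (a sketch; the dictionary encoding is my choice):

```python
from fractions import Fraction as F

p_recover = F(1, 3)
p_E = {0: (1 - p_recover) ** 2,             # neither recovers: 4/9
       1: 2 * p_recover * (1 - p_recover),  # exactly one recovers: 4/9
       2: p_recover ** 2}                   # both recover: 1/9
p_W_given = {0: F(1, 16), 1: F(1, 2), 2: F(3, 4)}

print(sum(p_W_given[k] * p_E[k] for k in p_E))  # 1/3
```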

Problem 9: In a game of tennis, once the score has reached deuce play continues until (effectively) one player has a lead of two points. The score has reached deuce in a game of tennis between Andre and Boris. Suppose that each point is won by Andre with probability 1/4 (and otherwise by Boris). Suppose also (probably unrealistically) that the outcome of each point is independent of all other points. Let $x$ be the probability that Andre wins the game, $u$ be the conditional probability that Andre wins the game given that he wins the first point, and $v$ be the conditional probability that Andre wins the game given that he loses the first point. Use the theorem of total probability (and its analogue for conditional probability) to show that:
$$4x = u + 3v, \qquad 4u = 3x + 1, \qquad 4v = x,$$
and hence determine the probability that Andre wins the game.

Let $X$ be the event "Andre wins the game", $W_1$ be the event "Andre wins the first point" and $L_1$ be the event "Andre loses the first point". The events $W_1$ and $L_1$ partition the sample space (since $L_1 = W_1^c$) and so by the theorem of total probability:
$$x = P(X) = P(X \mid W_1)P(W_1) + P(X \mid L_1)P(L_1) = u \cdot \frac{1}{4} + v \cdot \frac{3}{4},$$
i.e. $4x = u + 3v$. This is the first equation we wanted. To derive the second we need to consider $u$, that is $P(X \mid W_1)$. Let $W_2$ be the event "Andre wins the second point" and $L_2$ be the event "Andre loses the second point". The events $W_2$ and $L_2$ partition the sample space and so by Theorem 7.4 (total probability applied to conditional probabilities)
$$u = P(X \mid W_1) = P(X \mid W_1 \cap W_2)P(W_2 \mid W_1) + P(X \mid W_1 \cap L_2)P(L_2 \mid W_1)$$
$$= P(X \mid W_1 \cap W_2)P(W_2) + P(X \mid W_1 \cap L_2)P(L_2),$$
since the outcome of the second point is independent of the outcome of the first. If Andre wins the first two points then he wins the game, and so $P(X \mid W_1 \cap W_2) = 1$. If he wins the first point and loses the second then the score is back to deuce (each player needs to establish a lead of 2 to win), and so $P(X \mid W_1 \cap L_2) = x$. So
$$u = 1 \cdot \frac{1}{4} + x \cdot \frac{3}{4}, \qquad \text{i.e.} \qquad 4u = 1 + 3x.$$
Finally a similar argument applied to $v$ gives
$$v = P(X \mid L_1) = P(X \mid L_1 \cap W_2)P(W_2 \mid L_1) + P(X \mid L_1 \cap L_2)P(L_2 \mid L_1)$$
$$= P(X \mid L_1 \cap W_2)P(W_2) + P(X \mid L_1 \cap L_2)P(L_2) = x \cdot \frac{1}{4} + 0 \cdot \frac{3}{4},$$
i.e. $4v = x$. This gives the three equations. Solving them (say by using the second and third equations to substitute for $u$ and $v$ respectively in the first equation, so that $16x = (3x + 1) + 3x$) we obtain $x = 1/10$.
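As an independent cross-check of $x = 1/10$: the same recursion yields, for a general point-probability $p$, the standard closed form $x = p^2/(p^2 + (1-p)^2)$ for winning from deuce. This formula is not part of the original solution; the sketch below simply evaluates it at $p = 1/4$ and confirms agreement:

```python
from fractions import Fraction as F

p = F(1, 4)  # probability that Andre wins a single point

# Closed form for winning from deuce: x = p^2 / (p^2 + (1 - p)^2).
x = p**2 / (p**2 + (1 - p) ** 2)
print(x)  # 1/10, agreeing with the three linear equations
```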

Problem 10: There are three boxes: a box containing two gold coins, a box containing two silver coins, and a box containing one gold coin and one silver coin. You pick a box at random. Then you pick a coin from the box at random. Assume you have picked a gold coin. Compute the probability that you have picked the box with the two gold coins (i.e. the probability that the remaining coin in the box is a gold coin as well).

Consider the events

$A_{GG}$: You pick the box with two gold coins.
$A_{GS}$: You pick the box with one gold coin and one silver coin.
$A_{SS}$: You pick the box with two silver coins.
$B$: You pick a gold coin from the selected box.

We need to compute $P(A_{GG} \mid B)$. Since we pick boxes at random we have
$$P(A_{GG}) = \frac{1}{3}, \qquad P(A_{GS}) = \frac{1}{3}, \qquad P(A_{SS}) = \frac{1}{3}.$$
The conditional probabilities of picking a gold coin from a given box follow by sampling:
$$P(B \mid A_{GG}) = 1, \qquad P(B \mid A_{GS}) = \frac{1}{2}, \qquad P(B \mid A_{SS}) = 0.$$
Bayes' theorem tells us
$$P(A_{GG} \mid B) = P(B \mid A_{GG})\,\frac{P(A_{GG})}{P(B)}.$$
Using the partition $A_{GG}, A_{GS}, A_{SS}$, the theorem of total probability gives
$$P(B) = P(B \mid A_{GG})P(A_{GG}) + P(B \mid A_{GS})P(A_{GS}) + P(B \mid A_{SS})P(A_{SS}).$$
Altogether
$$P(A_{GG} \mid B) = \frac{P(B \mid A_{GG})P(A_{GG})}{P(B \mid A_{GG})P(A_{GG}) + P(B \mid A_{GS})P(A_{GS}) + P(B \mid A_{SS})P(A_{SS})} = \frac{1 \cdot \frac{1}{3}}{1 \cdot \frac{1}{3} + \frac{1}{2} \cdot \frac{1}{3} + 0 \cdot \frac{1}{3}} = \frac{2}{3}.$$
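A simulation makes the (perhaps counter-intuitive) answer 2/3 easy to believe; a minimal sketch (the sampling scheme, seed and trial count are my choices):

```python
import random

rng = random.Random(1)
boxes = [("g", "g"), ("g", "s"), ("s", "s")]

gold_draws = both_gold = 0
for _ in range(200_000):
    coins = list(rng.choice(boxes))  # pick a box at random
    rng.shuffle(coins)               # pick a coin from it at random
    if coins[0] == "g":              # condition on drawing a gold coin
        gold_draws += 1
        both_gold += coins[1] == "g"
print(both_gold / gold_draws)        # close to 2/3
```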

Problem 11: A bag contains three coins: a fair coin, a biased coin which has probability 1/3 of coming up heads, and a biased coin which has probability 2/3 of coming up heads. You pick a coin at random and toss it once. Assume the coin comes up tails. Compute the probability that the coin comes up heads if you toss the coin again.

Consider the events

$F$: You pick the fair coin.
$B_s$: You pick the biased coin which has probability 1/3 of coming up heads.
$B_l$: You pick the biased coin which has probability 2/3 of coming up heads.
$T_1$: The first toss of the selected coin is a tail.
$H_2$: The second toss of the selected coin is a head.

We need to compute $P(H_2 \mid T_1)$. Since we pick the coin at random we have
$$P(F) = \frac{1}{3}, \qquad P(B_s) = \frac{1}{3}, \qquad P(B_l) = \frac{1}{3}.$$
The (conditional) probabilities of getting a tail on a toss of the fair coin and the two biased coins are
$$P(T_1 \mid F) = \frac{1}{2}, \qquad P(T_1 \mid B_s) = \frac{2}{3}, \qquad P(T_1 \mid B_l) = \frac{1}{3},$$
and the (conditional) probabilities of getting a head are
$$P(H_2 \mid F) = \frac{1}{2}, \qquad P(H_2 \mid B_s) = \frac{1}{3}, \qquad P(H_2 \mid B_l) = \frac{2}{3}.$$
Using the partition $F, B_s, B_l$, theorem 6.2 tells us
$$P(H_2 \mid T_1) = P(H_2 \mid T_1 \cap F)P(F \mid T_1) + P(H_2 \mid T_1 \cap B_s)P(B_s \mid T_1) + P(H_2 \mid T_1 \cap B_l)P(B_l \mid T_1).$$
Let us first consider the first factor in each term, $P(H_2 \mid T_1 \cap X)$, where $X$ denotes $F$, $B_s$, or $B_l$. Using the definition of conditional probability we have
$$P(H_2 \mid T_1 \cap X) = \frac{P(H_2 \cap T_1 \cap X)}{P(T_1 \cap X)} = \frac{P(H_2 \cap T_1 \mid X)\,P(X)}{P(T_1 \mid X)\,P(X)} = \frac{P(H_2 \cap T_1 \mid X)}{P(T_1 \mid X)}.$$
Using conditional independence of subsequent coin tosses for a given coin, $P(H_2 \cap T_1 \mid X) = P(H_2 \mid X)P(T_1 \mid X)$, we arrive at
$$P(H_2 \mid T_1 \cap X) = \frac{P(H_2 \mid X)P(T_1 \mid X)}{P(T_1 \mid X)} = P(H_2 \mid X).$$
Hence
$$P(H_2 \mid T_1 \cap F) = P(H_2 \mid F) = \frac{1}{2}, \qquad P(H_2 \mid T_1 \cap B_s) = P(H_2 \mid B_s) = \frac{1}{3}, \qquad P(H_2 \mid T_1 \cap B_l) = P(H_2 \mid B_l) = \frac{2}{3}.$$
For the second factor $P(X \mid T_1)$, Bayes' theorem gives
$$P(X \mid T_1) = P(T_1 \mid X)\,\frac{P(X)}{P(T_1)}.$$
For the numerator we have $P(X) = 1/3$, whereas the denominator can be written, using the theorem of total probability with partition $F, B_s, B_l$, as
$$P(T_1) = P(T_1 \mid F)P(F) + P(T_1 \mid B_s)P(B_s) + P(T_1 \mid B_l)P(B_l) = \frac{1}{2} \cdot \frac{1}{3} + \frac{2}{3} \cdot \frac{1}{3} + \frac{1}{3} \cdot \frac{1}{3} = \frac{1}{2}.$$
Hence $P(X \mid T_1) = \frac{2}{3}\,P(T_1 \mid X)$, and
$$P(F \mid T_1) = \frac{2}{3}\,P(T_1 \mid F) = \frac{1}{3}, \qquad P(B_s \mid T_1) = \frac{2}{3}\,P(T_1 \mid B_s) = \frac{4}{9}, \qquad P(B_l \mid T_1) = \frac{2}{3}\,P(T_1 \mid B_l) = \frac{2}{9}.$$
Combining all results we finally arrive at
$$P(H_2 \mid T_1) = \frac{1}{2} \cdot \frac{1}{3} + \frac{1}{3} \cdot \frac{4}{9} + \frac{2}{3} \cdot \frac{2}{9} = \frac{25}{54}.$$
The probability is slightly smaller than 1/2, i.e. it is more likely to toss a tail again.
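The same two-stage computation carried out exactly in code, as a sketch (the coin labels mirror $F$, $B_s$, $B_l$):

```python
from fractions import Fraction as F

p_head = {"F": F(1, 2), "Bs": F(1, 3), "Bl": F(2, 3)}
prior = F(1, 3)  # each coin equally likely to be picked

# P(T1) by total probability over the three coins.
p_T1 = sum((1 - p_head[c]) * prior for c in p_head)          # 1/2
# Posterior P(coin | T1) by Bayes' theorem.
post = {c: (1 - p_head[c]) * prior / p_T1 for c in p_head}   # 1/3, 4/9, 2/9
# Conditional independence of the two tosses given the coin.
print(sum(p_head[c] * post[c] for c in p_head))              # 25/54
```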

a) Find the probability mass function of T. b) Find the expectation of T. c) Find the probability mass function of D. d) Find the expectation D. e) Say in words what the random variable M 1 (T + D) measures. 2 a) The random variable T takes values 2, 3, 4,..., 12. The probability mass function of T is as follows: b) t 2 3 4 5 6 7 8 9 10 11 12 P(T t) 1/36 2/36 3/36 4/36 5/36 6/36 5/36 4/36 3/36 2/36 1/36 To see this consider the sample space S {(k, l) : 1 k 6 and 1 l 6} Count, for each t, how many elements of the sample space there are for which the two numbers sum to t. For example if t 2 there is just the single outcome (1, 1) with this property and P(T 2) 1/36. Or P(T 5) 4/36 because of the four outcomes (1, 4), (2, 3), (3, 2), (4, 1). E(T ) 2 1/36 + 3 2/36 + 4 3/36 + 5 4/36 + 6 5/36 + 7 6/36+ 8 5/36 + 9 4/36 + 10 3/36 + 11 2/36 + 12 1/36 252/36 7. c) The random variable D takes values 0, 1, 2, 3, 4, 5. The probability mass function of D is as follows: d) d 0 1 2 3 4 5 P(D d) 6/36 10/36 8/36 6/36 4/36 2/36 To see this consider the sample space and count, for each d, how many elements of the sample space there are for which the difference between the two numbers is d. For example P(D 4) 4/36 because of the four outcomes (1, 5), (2, 6), (6, 2), (5, 1). E(D) 0 6/36 + 1 10/36 + 2 8/36 + 3 6/36 + 4 4/36 + 5 2/36 70/36 35/18. 11

Problem 13:

a) Adam has five keys in his pocket, one of which opens his front door. On arriving home he takes out a key at random and attempts to open the door with it. If it doesn't fit the door then he replaces it in his pocket. This is repeated until he manages to open the door. Let $A$ be the random variable "the number of attempts he takes". Find the probability mass function of $A$.

b) Eve has five keys in her pocket, one of which opens her front door. On arriving home she takes out a key at random and attempts to open the door with it. If it doesn't fit the door then she holds on to it and picks a key from those remaining in her pocket. This is repeated until she manages to open the door. Let $E$ be the random variable "the number of attempts she takes". Find the probability mass function of $E$.

a) The random variable $A$ takes values in $\mathbb{N}$. If $n \in \mathbb{N}$ then
$$P(A = n) = P(\text{first } n - 1 \text{ picks are incorrect, } n\text{th pick is correct}).$$
Each pick has probability 4/5 of being incorrect and 1/5 of being correct, and picks are independent. We get that the pmf is
$$P(A = n) = \left(\frac{4}{5}\right)^{n-1} \frac{1}{5},$$
that is, $A \sim \mathrm{Geom}(1/5)$.

b) The random variable $E$ takes values in $\{1, 2, 3, 4, 5\}$ (since the incorrect keys are not replaced there is no way that 5 incorrect keys can be chosen). If $1 \le n \le 5$ then the event $E = n$ occurs if the first $n - 1$ picks are incorrect and the next is correct. For $E = 1$ the first pick is correct, and that happens with probability 1/5. For $E = 2$ the first pick is incorrect (with probability 4/5) and the second is correct (with probability 1/4, the key not being replaced). For $E = 3$ the first pick is incorrect (with probability 4/5), the second is incorrect (with probability 3/4), and the third is correct (with probability 1/3). And so on. Hence we get that the pmf is
$$P(E = 1) = \frac{1}{5}, \quad P(E = 2) = \frac{4}{5} \cdot \frac{1}{4} = \frac{1}{5}, \quad P(E = 3) = \frac{4}{5} \cdot \frac{3}{4} \cdot \frac{1}{3} = \frac{1}{5},$$
$$P(E = 4) = \frac{4}{5} \cdot \frac{3}{4} \cdot \frac{2}{3} \cdot \frac{1}{2} = \frac{1}{5}, \quad P(E = 5) = \frac{4}{5} \cdot \frac{3}{4} \cdot \frac{2}{3} \cdot \frac{1}{2} \cdot \frac{1}{1} = \frac{1}{5}.$$
That is, $P(E = n) = 1/5$ for all $n \in \{1, 2, 3, 4, 5\}$.
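A quick simulation of both strategies, as a sketch (the trial count and seed are my choices): Adam's empirical frequencies should track the Geom(1/5) pmf, while Eve's should be flat near 1/5.

```python
import random
from collections import Counter

rng = random.Random(2)
N = 100_000

def adam():  # keys replaced: success probability 1/5 on every attempt
    n = 1
    while rng.random() >= 0.2:
        n += 1
    return n

def eve():   # keys not replaced: position of the good key in a shuffle
    keys = [True, False, False, False, False]
    rng.shuffle(keys)
    return keys.index(True) + 1

for name, trial in (("Adam", adam), ("Eve", eve)):
    freq = Counter(trial() for _ in range(N))
    print(name, {k: round(freq[k] / N, 3) for k in sorted(freq)[:6]})
```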

Problem 14: Each match played by a football team is won by that team with probability 1/2, is a draw with probability 1/6, and is lost with probability 1/3, with the result of each match being independent of all other results. Find the probability mass function of each of the following random variables related to this. (Some but not all of these take one of the special distributions studied in lectures; in these cases you should use the name of the distribution, otherwise just write down the pmf.)

a) The number of drawn matches in a season lasting 38 matches.

b) The number of matches up to and including their first win.

c) The number of matches following their first defeat up to and including their next win.

d) The number of wins in the 10 matches following their first defeat.

e) The number of matches up to and including their second loss.

f) The number of matches up to and including their $m$th loss, where $m$ is a fixed positive integer.

a) If we regard a draw as success then we have 38 independent trials with the probability of success being 1/6 in each, and our random variable is the number of successes. The distribution is therefore $\mathrm{Bin}(38, 1/6)$.

b) If we regard a win as success then this is the number of Bernoulli(1/2) trials up to and including the first success. The distribution is therefore $\mathrm{Geom}(1/2)$.

c) Starting from the first defeat we have a sequence of independent Bernoulli(1/2) trials. So the number of games up to and including the next win has the $\mathrm{Geom}(1/2)$ distribution. The fact that we start from the first defeat is irrelevant to the distribution of the number of games until the next win.

d) The 10 matches following the first defeat consist of a fixed number of Bernoulli trials, each with probability 1/2 of success (thinking of a win as success). The number of wins then has a $\mathrm{Bin}(10, 1/2)$ distribution.

e) This is not a distribution we have studied, so we will have to work the pmf out directly. Let $N$ be the number of matches up to and including the second loss. Then $N$ takes values $2, 3, 4, \ldots$ and if $k \ge 2$ we have
$$P(N = k) = P(\text{1 loss and } k - 2 \text{ others in the first } k - 1 \text{ matches, followed by a loss in the } k\text{th}) = (k - 1)\left(\frac{1}{3}\right)^2 \left(\frac{2}{3}\right)^{k-2}.$$

f) Similarly to part e) you can work out the pmf of the number of matches up to and including the $m$th loss, where $m$ is a fixed positive integer. If we call this $M$ then $M$ takes values $m, m + 1, m + 2, \ldots$ and if $k \ge m$ we have
$$P(M = k) = P(m - 1 \text{ losses and } k - m \text{ others in the first } k - 1 \text{ matches, then a loss in the } k\text{th}) = \binom{k-1}{m-1}\left(\frac{1}{3}\right)^m \left(\frac{2}{3}\right)^{k-m}.$$
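For part f), a numerical check that the proposed pmf sums to 1 (a sketch; the choice $m = 2$, matching part e), and the truncation point of the infinite sum are mine, with the truncated tail being negligible):

```python
from math import comb

# Check numerically that P(M = k) = C(k-1, m-1) (1/3)^m (2/3)^(k-m)
# sums to 1 over k = m, m+1, ... (truncated at k = 2000).
m = 2
total = sum(comb(k - 1, m - 1) * (1 / 3) ** m * (2 / 3) ** (k - m)
            for k in range(m, 2000))
print(total)  # ~ 1.0
```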

Problem 15: Prove that if $X$ is a discrete random variable with $\mathrm{Var}(X) = 0$ then $X$ is constant; that is, there exists some $a$ with $P(X = a) = 1$.

Suppose that $X$ takes values $x_1, x_2, \ldots, x_n$ and that $E(X) = \mu$. Then
$$\mathrm{Var}(X) = \sum_{i=1}^{n} (x_i - \mu)^2 P(X = x_i).$$
Each summand is $\ge 0$, and so the only way the sum can be equal to 0 is if all summands are equal to 0. Now $(x_i - \mu)^2$ is 0 only if $x_i = \mu$, and so we must have $P(X = x_j) = 0$ for all $x_j \neq \mu$. It follows that $P(X = \mu) = 1$ and so $X$ is a constant random variable.

Problem 16: Let $X$ be the number of fish caught by a fisherman and $Y$ be the number of fish caught by a second fisherman in one afternoon of fishing. Suppose that $X$ is distributed $\mathrm{Poisson}(\lambda)$ and $Y$ is distributed $\mathrm{Poisson}(\mu)$. Suppose further that $X$ and $Y$ are independent random variables.

a) Show that
$$P(X + Y = n) = \sum_{k=0}^{n} e^{-\lambda} \frac{\lambda^k}{k!}\, e^{-\mu} \frac{\mu^{n-k}}{(n-k)!}.$$

b) Hence find the distribution of the total number of fish caught.

a) To have $X + Y = n$ we need $X = k$ and $Y = n - k$ for some $k$ with $0 \le k \le n$. Hence
$$P(X + Y = n) = \sum_{k=0}^{n} P(X = k \text{ and } Y = n - k).$$
By independence,
$$P(X + Y = n) = \sum_{k=0}^{n} P(X = k)P(Y = n - k).$$
Now substituting in the pmf of a Poisson random variable,
$$P(X + Y = n) = \sum_{k=0}^{n} e^{-\lambda} \frac{\lambda^k}{k!}\, e^{-\mu} \frac{\mu^{n-k}}{(n-k)!}.$$

b)
$$P(X + Y = n) = \sum_{k=0}^{n} e^{-\lambda} \frac{\lambda^k}{k!}\, e^{-\mu} \frac{\mu^{n-k}}{(n-k)!} = e^{-(\lambda+\mu)} \frac{1}{n!} \sum_{k=0}^{n} \binom{n}{k} \lambda^k \mu^{n-k} = e^{-(\lambda+\mu)} \frac{(\lambda + \mu)^n}{n!},$$
where the last step is an application of the binomial theorem. This is the pmf of a $\mathrm{Poisson}(\lambda + \mu)$ distribution, and so $X + Y \sim \mathrm{Poisson}(\lambda + \mu)$.
b) Hence find the distribution of the total number of fish caught. a) To have X + Y n we need X k and Y n k for some k with 0 k n. Hence P(X + Y n) P(X k and Y n k). By independence k0 P(X + Y n) P(X k)p(y n k). k0 Now substituting in the pmf of a Poisson random variable P(X + Y n) e k0 λ λk k! e µ µn k (n k)!. b) λ λk µn k P(X + Y n) e e µ k! (n k)! k0 ( e (λ+µ) 1 n n! k k0 e (λ+µ) 1 ( n n! k k0 e (λ+µ) 1 n! (λ + µ)n. ) λ k µ n k ) λ k µ n k where the last line is an application of the binomial theorem. This is the pmf of a Poisson(λ + µ) distribution and so X + Y Poisson(λ + µ). 15