Introduction to Probability 2017/18 Supplementary Problems


Problem 1: Let A and B denote two events with P(A ∩ B) > 0. Show that P(A) > 0 and P(B) > 0.

Since A ∩ B ⊆ A we have P(A) ≥ P(A ∩ B) > 0, hence P(A) > 0. Similarly A ∩ B ⊆ B gives P(B) > 0.

Problem 2: a) Show that the conditional probability obeys 0 ≤ P(A | B) ≤ 1.
b) Show that for disjoint events A and B the conditional probability obeys P(A ∪ B | C) = P(A | C) + P(B | C).

a) P(A | B) = P(A ∩ B)/P(B) ≥ 0 since the numerator is non-negative and the denominator is positive. Since A ∩ B ⊆ B we have P(A ∩ B) ≤ P(B). Hence
P(A | B) = P(A ∩ B)/P(B) ≤ P(B)/P(B) = 1.

b) (A ∪ B) ∩ C = (A ∩ C) ∪ (B ∩ C) and the sets A ∩ C and B ∩ C are disjoint since A and B are disjoint. Hence
P(A ∪ B | C) = P((A ∪ B) ∩ C)/P(C) = P((A ∩ C) ∪ (B ∩ C))/P(C) = (P(A ∩ C) + P(B ∩ C))/P(C) = P(A | C) + P(B | C).

Problem 3: A bag contains n red balls and n blue balls. A selection of n balls is made from the bag at random without replacement. Let A_i be the event that the selection contains exactly i red balls.
a) Find P(A_i).
b) What can you say about the events A_0, A_1, ..., A_n?
c) Deduce that Σ_{i=0}^{n} C(n, i)² = C(2n, n), where C(n, i) denotes the binomial coefficient "n choose i".

a) We are making an unordered selection of n balls from 2n balls without replacement so |S| = C(2n, n). The number of ways of choosing i red balls from the n red balls in the bag is C(n, i). The number of ways of choosing the remaining n − i balls from the n blue balls in the bag is C(n, n − i). Hence
P(A_i) = C(n, i) C(n, n − i) / C(2n, n).
But C(n, n − i) = n!/((n − (n − i))! (n − i)!) = n!/(i! (n − i)!) = C(n, i), so
P(A_i) = C(n, i)² / C(2n, n).

b) The events A_0, A_1, ..., A_n are pairwise disjoint (selecting k red balls excludes selecting l red balls if l ≠ k) and A_0 ∪ A_1 ∪ ... ∪ A_n = S (any outcome in S contains some number of red balls and is thus contained in some A_k), so they form a partition of S.

c) The previous part and Definition 2.1 b) and c) show that
Σ_{i=0}^{n} P(A_i) = P(A_0 ∪ A_1 ∪ ... ∪ A_n) = P(S) = 1.
Substituting in our formula from part a) and rearranging we get that
Σ_{i=0}^{n} C(n, i)² = C(2n, n).
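As a quick numerical check of part c), the short Python sketch below (using the standard library's math.comb) verifies the identity, i.e. that the probabilities P(A_i) sum to 1, for the first few values of n:

from math import comb

# Check that sum_{i=0}^n C(n, i)^2 = C(2n, n) for a few values of n,
# i.e. that the probabilities P(A_i) = C(n, i)^2 / C(2n, n) sum to 1.
for n in range(1, 11):
    lhs = sum(comb(n, i) ** 2 for i in range(n + 1))
    rhs = comb(2 * n, n)
    assert lhs == rhs, (n, lhs, rhs)
print("identity verified for n = 1, ..., 10")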

Problem 4: A student observes that out of all students graduating with first class degrees in mathematics from Queen Mary, 96% of them passed the mid-term test in Introduction to Probability. He claims that since he has just passed the test he has a very good chance of getting a first.
a) What is the obvious mistake with this argument?
b) Assume further that 80% of students pass the test and 10% of students get a first. What is the probability that a randomly chosen student who has passed the test gets a first?

a) Let F be the event that a randomly chosen student obtains a first and T be the event that a randomly chosen student passes the test. Our student has observed that P(T | F) (that is, the conditional probability of passing the test given that a first is obtained) is large (0.96) and assumed that consequently P(F | T) (that is, the conditional probability of getting a first given that the test is passed) is also large. However, in general there is no such connection between P(F | T) and P(T | F).

b) By Bayes' theorem
P(F | T) = P(T | F) P(F) / P(T).
We are told that 10% of students get a first, i.e. P(F) = 0.1, and 96% of these passed the test so P(T | F) = 0.96. We are also told that P(T) = 0.8. Putting these into the formula above we get that
P(F | T) = 0.96 × 0.1 / 0.8 = 0.12
(rather smaller than P(T | F)).
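Part b) is a one-line Bayes update, so it is easy to reproduce (or to explore other values) in Python; a minimal sketch, where the function name posterior_first_given_pass is just for illustration:

def posterior_first_given_pass(p_pass_given_first, p_first, p_pass):
    """Bayes' theorem: P(F | T) = P(T | F) P(F) / P(T)."""
    return p_pass_given_first * p_first / p_pass

# Values from the problem: P(T | F) = 0.96, P(F) = 0.1, P(T) = 0.8.
print(posterior_first_given_pass(0.96, 0.1, 0.8))  # 0.12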

Problem 5: I toss a fair coin repeatedly, stopping when I have either tossed it 4 times or I have seen 2 heads (whichever happens first). I record the sequence of heads/tails seen.
a) Write down the sample space.
b) Are all elements of the sample space equally likely? Justify your answer.
c) Calculate the probability that I see 2 heads.
d) Calculate the conditional probability that I see 2 heads given that the first toss is a tail.
e) Calculate the conditional probability that the first toss is a tail given that I see 2 heads.

a) The sample space is
S = {hh, hth, htth, httt, thh, thth, thtt, tthh, ttht, ttth, tttt}.

b) By independence of subsequent coin tosses and the fact that each coin toss is equally likely to be a head or a tail we have
P({hh}) = (1/2)² = 1/4.
Similarly,
P({hth}) = (1/2)³ = 1/8.
So P({hh}) ≠ P({hth}) and so not all elements are equally likely. Note that to answer this part we don't need to calculate the probability of every element of the sample space. It suffices just to find two which have different probabilities. (Of course, other pairs would do as well.)

c) The event in question is {hh, hth, htth, thh, thth, tthh}. The probabilities of the simple events making this up can be calculated as in part b). We get
P(2 heads) = P({hh, hth, htth, thh, thth, tthh}) = 1/4 + 1/8 + 1/16 + 1/8 + 1/16 + 1/16 = 11/16.

d) Let A be the event "I see 2 heads" and B be the event "the first toss is a tail". We want P(A | B). Obviously P(B) = 1/2,
P(A ∩ B) = P({thh, thth, tthh}) = 1/8 + 1/16 + 1/16 = 1/4,
and so
P(A | B) = P(A ∩ B)/P(B) = (1/4)/(1/2) = 1/2.

e) This time we want P(B | A). Using parts c) and d),
P(B | A) = P(A ∩ B)/P(A) = (1/4)/(11/16) = 4/11.
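The whole of Problem 5 can be checked by enumerating the stopping rule directly; the Python sketch below (exact arithmetic via fractions.Fraction) rebuilds the sample space and reproduces 11/16, 1/2 and 4/11:

from fractions import Fraction
from itertools import product

# Apply the stopping rule (stop after 2 heads or 4 tosses) to every
# length-4 sequence of fair tosses; duplicates record the same word.
outcomes = {}
for seq in product("ht", repeat=4):
    recorded = []
    for toss in seq:
        recorded.append(toss)
        if recorded.count("h") == 2 or len(recorded) == 4:
            break
    word = "".join(recorded)
    outcomes[word] = Fraction(1, 2 ** len(word))   # each word has probability (1/2)^length

assert sum(outcomes.values()) == 1
two_heads = {w: p for w, p in outcomes.items() if w.count("h") == 2}
p_a = sum(two_heads.values())                                  # P(2 heads) = 11/16
p_b = sum(p for w, p in outcomes.items() if w[0] == "t")       # P(first toss is a tail) = 1/2
p_ab = sum(p for w, p in two_heads.items() if w[0] == "t")     # P(A and B) = 1/4
print(p_a, p_ab / p_b, p_ab / p_a)                             # 11/16, 1/2, 4/11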

Problem 6: a) Suppose that A and B are independent events. Show that A^c and B are independent events.
b) Suppose that A and A^c are independent events. Show that either P(A) = 0 or P(A) = 1.

a) We have A ∪ A^c = S and B = (B ∩ A) ∪ (B ∩ A^c), with the two sets B ∩ A and B ∩ A^c being disjoint. Hence by Definition 2.1 c)
P(B) = P(A^c ∩ B) + P(A ∩ B), i.e. P(A^c ∩ B) = P(B) − P(A ∩ B).
Using independence of A and B, P(A ∩ B) = P(A)P(B), we obtain
P(A^c ∩ B) = P(B) − P(A)P(B) = P(B)(1 − P(A)) = P(A^c)P(B)
and so A^c and B are independent.

b) By definition A ∩ A^c = ∅ and so P(A ∩ A^c) = 0. If A and A^c are independent then
0 = P(A ∩ A^c) = P(A)P(A^c) = P(A)(1 − P(A))
and so one of P(A) and 1 − P(A) is 0. It follows that P(A) = 0 or P(A) = 1.

Problem 7: There are two roads from A to B and two roads from B to C. Suppose that each road is closed with probability p and that the state of each road is independent of the others. What condition on p ensures that the probability that I can travel from A to C is at least 1/2?

I can travel from A to B unless both roads are closed. By independence the probability that both roads are closed is p². So if I write X for the event "I can travel from A to B" then P(X) = 1 − p². Similarly, if Y is the event "I can travel from B to C" then P(Y) = 1 − p². I can travel from A to C if and only if both X and Y occur, so we want P(X ∩ Y) to be at least 1/2. Now, P(X ∩ Y) = P(X)P(Y). (The fact that X and Y are independent follows from the mutual independence of the states of the roads and the fact that X and Y depend on disjoint sets of roads; see below for the actual computation.) So we require that
(1 − p²)² ≥ 1/2.
This is satisfied if 1 − p² ≥ 1/√2, that is, if p² ≤ 1 − 1/√2. So we need that
p ≤ √(1 − 1/√2) ≈ 0.54.

To show that P(X ∩ Y) = P(X)P(Y), denote by R1 and R2 the events that the roads from A to B are open, and by S1 and S2 the events that the roads from B to C are open. Obviously X = R1 ∪ R2 and Y = S1 ∪ S2. Then
P(X ∩ Y) = P((R1 ∩ Y) ∪ (R2 ∩ Y)) = P(R1 ∩ Y) + P(R2 ∩ Y) − P(R1 ∩ R2 ∩ Y).
Since R1 and Y are independent (see e.g. problem sheets) and R2 and Y are independent we have
P(X ∩ Y) = (P(R1) + P(R2))P(Y) − P((R1 ∩ R2 ∩ S1) ∪ (R1 ∩ R2 ∩ S2)).
For the last term the inclusion-exclusion principle and mutual independence yield
P((R1 ∩ R2 ∩ S1) ∪ (R1 ∩ R2 ∩ S2)) = P(R1 ∩ R2 ∩ S1) + P(R1 ∩ R2 ∩ S2) − P(R1 ∩ R2 ∩ S1 ∩ S2)
= P(R1)P(R2)(P(S1) + P(S2) − P(S1)P(S2)) = P(R1)P(R2)P(S1 ∪ S2) = P(R1)P(R2)P(Y).
Hence
P(X ∩ Y) = (P(R1) + P(R2))P(Y) − P(R1)P(R2)P(Y) = P(R1 ∪ R2)P(Y) = P(X)P(Y).
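The cut-off p ≤ √(1 − 1/√2) can also be checked numerically; the rough Monte Carlo sketch below (the helper p_travel is just for illustration) compares simulated travel probabilities with the exact value (1 − p²)² on either side of the threshold:

import math
import random

def p_travel(p, trials=200_000, seed=1):
    """Estimate by simulation the probability of an open route from A to C
    when each of the four roads is closed independently with probability p."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        ab_open = rng.random() >= p or rng.random() >= p   # at least one A-B road open
        bc_open = rng.random() >= p or rng.random() >= p   # at least one B-C road open
        hits += ab_open and bc_open
    return hits / trials

threshold = math.sqrt(1 - 1 / math.sqrt(2))   # approximately 0.5412
for p in (0.5, threshold, 0.6):
    print(round(p, 4), round(p_travel(p), 3), round((1 - p * p) ** 2, 3))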

Problem 8: Two important members of a cricket team are injured, and each has probability 1/3 of recovering before the match. The recoveries of the two players are independent of each other. If both are able to play then the team has probability 3/4 of winning the match, if only one of them plays then the probability of winning is 1/2, and if neither plays the probability of winning is 1/16. What is the probability that the match is won?

Let E0, E1, E2 be the events "neither player recovers", "exactly one player recovers" and "both players recover" respectively. Let W be the event "the match is won". By the theorem of total probability (Theorem 6.1)
P(W) = P(W | E0)P(E0) + P(W | E1)P(E1) + P(W | E2)P(E2).
Since the recoveries of the two players are independent we have
P(E0) = (2/3)² = 4/9,
P(E1) = 2 × (1/3) × (2/3) = 4/9,
P(E2) = (1/3)² = 1/9.
We are told the relevant conditional probabilities in the question and so
P(W) = (1/16)(4/9) + (1/2)(4/9) + (3/4)(1/9) = 1/36 + 2/9 + 1/12 = 1/3.
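The arithmetic giving P(W) = 1/3 can be reproduced exactly with fractions; a minimal Python sketch:

from fractions import Fraction

p_recover = Fraction(1, 3)
# Partition by the number of players who recover: 0, 1 or 2.
p_e = [(1 - p_recover) ** 2, 2 * p_recover * (1 - p_recover), p_recover ** 2]
p_win_given_e = [Fraction(1, 16), Fraction(1, 2), Fraction(3, 4)]
p_w = sum(pw * pe for pw, pe in zip(p_win_given_e, p_e))
print(p_w)   # 1/3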

Problem 9: In a game of tennis, once the score has reached deuce play continues until (effectively) one player has a lead of two points. The score has reached deuce in a game of tennis between Andre and Boris. Suppose that each point is won by Andre with probability 1/4 (and otherwise by Boris). Suppose also (probably unrealistically) that the outcome of each point is independent of all other points. Let x be the probability that Andre wins the game, u be the conditional probability that Andre wins the game given that he wins the first point, and v be the conditional probability that Andre wins the game given that he loses the first point. Use the theorem of total probability (and its analogue for conditional probability) to show that:
4x = u + 3v
4u = 3x + 1
4v = x
and hence determine the probability that Andre wins the game.

Let X be the event "Andre wins the game", W1 be the event "Andre wins the first point" and L1 be the event "Andre loses the first point". The events W1 and L1 partition the sample space (since L1^c = W1) and so by the theorem of total probability:
P(X) = P(X | W1)P(W1) + P(X | L1)P(L1), i.e. x = u × (1/4) + v × (3/4), i.e. 4x = u + 3v.
This is the first equation we wanted. To derive the second we need to consider u, that is P(X | W1). Let W2 be the event "Andre wins the second point" and L2 be the event "Andre loses the second point". The events W2 and L2 partition the sample space and so by Theorem 7.4 (total probability applied to conditional probabilities)
u = P(X | W1) = P(X | W1 ∩ W2)P(W2 | W1) + P(X | W1 ∩ L2)P(L2 | W1)
= P(X | W1 ∩ W2)P(W2) + P(X | W1 ∩ L2)P(L2)
since the outcome of the second point is independent of the outcome of the first. If Andre wins the first two points then he wins the game and so P(X | W1 ∩ W2) = 1. If he wins the first point and loses the second then the score is back to deuce (each player needs to establish a lead of 2 to win) and so P(X | W1 ∩ L2) = x. So
u = 1 × (1/4) + x × (3/4), i.e. 4u = 1 + 3x.
Finally a similar argument applied to v gives
v = P(X | L1) = P(X | L1 ∩ W2)P(W2 | L1) + P(X | L1 ∩ L2)P(L2 | L1)
= P(X | L1 ∩ W2)P(W2) + P(X | L1 ∩ L2)P(L2)
= x × (1/4) + 0 × (3/4), i.e. 4v = x.
This gives the three equations. Solving them (say by using the second and third equations to substitute for u and v respectively in the first equation) we obtain x = 1/10.
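The value x = 1/10 can also be confirmed by simulating the deuce game directly; a rough Monte Carlo sketch in Python (the helper andre_wins_from_deuce is just for illustration):

import random

def andre_wins_from_deuce(p, rng):
    """Play out a game from deuce; Andre wins each point with probability p."""
    lead = 0                      # Andre's points minus Boris's points
    while abs(lead) < 2:
        lead += 1 if rng.random() < p else -1
    return lead == 2

rng = random.Random(2)
trials = 200_000
wins = sum(andre_wins_from_deuce(1 / 4, rng) for _ in range(trials))
print(wins / trials)              # should be close to x = 1/10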

Problem 10: There are three boxes: a box containing two gold coins, a box containing two silver coins, and a box containing one gold coin and a silver coin. You pick a box at random. Then you pick a coin from the box at random. Assume you have picked a gold coin. Compute the probability that you have picked the box with the two gold coins (i.e. the probability that the remaining coin in the box is a gold coin as well).

Consider the events
A_GG: you pick the box with two gold coins;
A_GS: you pick the box with one gold coin and a silver coin;
A_SS: you pick the box with two silver coins;
B: you pick a gold coin from the selected box.
We need to compute P(A_GG | B). Since we pick boxes at random we have
P(A_GG) = 1/3, P(A_GS) = 1/3, P(A_SS) = 1/3.
The conditional probabilities of picking a gold coin from a given box follow by sampling:
P(B | A_GG) = 1, P(B | A_GS) = 1/2, P(B | A_SS) = 0.
Bayes' theorem tells us
P(A_GG | B) = P(B | A_GG) P(A_GG) / P(B).
Using the partition A_GG, A_GS, A_SS the theorem of total probability gives
P(B) = P(B | A_GG)P(A_GG) + P(B | A_GS)P(A_GS) + P(B | A_SS)P(A_SS).
Altogether
P(A_GG | B) = P(B | A_GG)P(A_GG) / (P(B | A_GG)P(A_GG) + P(B | A_GS)P(A_GS) + P(B | A_SS)P(A_SS))
= (1 × 1/3) / (1 × 1/3 + 1/2 × 1/3 + 0 × 1/3) = (1/3)/(1/2) = 2/3.
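The answer 2/3 is easy to confirm by simulation; a rough Python sketch:

import random

rng = random.Random(3)
boxes = [("g", "g"), ("g", "s"), ("s", "s")]
gold_first = both_gold = 0
for _ in range(300_000):
    box = list(rng.choice(boxes))     # pick a box at random
    rng.shuffle(box)                  # pick a coin from it at random
    if box[0] == "g":                 # condition on drawing a gold coin
        gold_first += 1
        both_gold += box == ["g", "g"]
print(both_gold / gold_first)         # close to 2/3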

Problem 11: A bag contains three coins: a fair coin, a biased coin which has probability 1/3 of coming up heads, and a biased coin which has probability 2/3 of coming up heads. You pick a coin at random and toss it once. Assume the coin comes up tails. Compute the probability that the coin comes up heads if you toss the coin again.

Consider the events
F: you pick the fair coin;
B_s: you pick the biased coin which has probability 1/3 of coming up heads;
B_l: you pick the biased coin which has probability 2/3 of coming up heads;
T1: the first toss of the selected coin is a tail;
H2: the second toss of the selected coin is a head.
We need to compute P(H2 | T1). Since we pick the coin at random we have
P(F) = 1/3, P(B_s) = 1/3, P(B_l) = 1/3.
The (conditional) probabilities of getting a tail when tossing the fair coin, the biased coin coming up heads with probability 1/3, and the biased coin coming up heads with probability 2/3 are
P(T1 | F) = 1/2, P(T1 | B_s) = 2/3, P(T1 | B_l) = 1/3.
The (conditional) probabilities of getting a head when tossing the fair coin, the biased coin coming up heads with probability 1/3, and the biased coin coming up heads with probability 2/3 are
P(H2 | F) = 1/2, P(H2 | B_s) = 1/3, P(H2 | B_l) = 2/3.
Using the partition F, B_s, B_l, Theorem 6.2 tells us
P(H2 | T1) = P(H2 | T1 ∩ F)P(F | T1) + P(H2 | T1 ∩ B_s)P(B_s | T1) + P(H2 | T1 ∩ B_l)P(B_l | T1).
Let us first consider the first factor in each term, P(H2 | T1 ∩ X), where X denotes F, B_s or B_l. Using the definition of conditional probability we have
P(H2 | T1 ∩ X) = P(H2 ∩ T1 ∩ X)/P(T1 ∩ X) = (P(H2 ∩ T1 | X) P(X)) / (P(X) P(T1 | X)) = P(H2 ∩ T1 | X)/P(T1 | X).
Using conditional independence of subsequent coin tosses for a given coin, P(H2 ∩ T1 | X) = P(H2 | X)P(T1 | X), we arrive at
P(H2 | T1 ∩ X) = P(H2 | X)P(T1 | X)/P(T1 | X) = P(H2 | X).
Hence
P(H2 | T1 ∩ F) = P(H2 | F) = 1/2,
P(H2 | T1 ∩ B_s) = P(H2 | B_s) = 1/3,
P(H2 | T1 ∩ B_l) = P(H2 | B_l) = 2/3.
For the second factor P(X | T1) Bayes' theorem gives
P(X | T1) = P(T1 | X) P(X)/P(T1).
For the numerator we have P(X) = 1/3, whereas the denominator can be written, using the theorem of total probability with partition F, B_s, B_l, as
P(T1) = P(T1 | F)P(F) + P(T1 | B_s)P(B_s) + P(T1 | B_l)P(B_l) = (1/2)(1/3) + (2/3)(1/3) + (1/3)(1/3) = 1/2.
Hence P(X | T1) = (2/3) P(T1 | X) and
P(F | T1) = (2/3)P(T1 | F) = 1/3,
P(B_s | T1) = (2/3)P(T1 | B_s) = 4/9,
P(B_l | T1) = (2/3)P(T1 | B_l) = 2/9.
Combining all results we finally arrive at
P(H2 | T1) = (1/2)(1/3) + (1/3)(4/9) + (2/3)(2/9) = 25/54.
The probability is slightly smaller than 1/2, i.e. it is slightly more likely to toss a tail again.
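The exact value 25/54 can be reproduced in a few lines by combining the prior on the coins with each coin's tail/head probabilities; a Python sketch using exact fractions:

from fractions import Fraction

# For each coin: probability of picking it, P(tails on a toss), P(heads on a toss).
coins = {
    "fair":   (Fraction(1, 3), Fraction(1, 2), Fraction(1, 2)),
    "bias13": (Fraction(1, 3), Fraction(2, 3), Fraction(1, 3)),
    "bias23": (Fraction(1, 3), Fraction(1, 3), Fraction(2, 3)),
}
p_t1 = sum(prior * p_tail for prior, p_tail, _ in coins.values())                    # P(T1) = 1/2
p_h2_and_t1 = sum(prior * p_tail * p_head for prior, p_tail, p_head in coins.values())
print(p_t1, p_h2_and_t1 / p_t1)   # 1/2 and 25/54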

Problem 12: You roll two fair standard dice and record the numbers showing. Let T be the random variable "the sum of the two numbers rolled" and D be the random variable "the absolute value of the difference between the two numbers rolled".
a) Find the probability mass function of T.
b) Find the expectation of T.
c) Find the probability mass function of D.
d) Find the expectation of D.
e) Say in words what the random variable M = (T + D)/2 measures.

a) The random variable T takes values 2, 3, 4, ..., 12. The probability mass function of T is as follows:

t         2     3     4     5     6     7     8     9     10    11    12
P(T = t)  1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

To see this consider the sample space
S = {(k, l) : 1 ≤ k ≤ 6 and 1 ≤ l ≤ 6}
and count, for each t, how many elements of the sample space there are for which the two numbers sum to t. For example if t = 2 there is just the single outcome (1, 1) with this property and P(T = 2) = 1/36. Or P(T = 5) = 4/36 because of the four outcomes (1, 4), (2, 3), (3, 2), (4, 1).

b) E(T) = 2 × 1/36 + 3 × 2/36 + 4 × 3/36 + 5 × 4/36 + 6 × 5/36 + 7 × 6/36 + 8 × 5/36 + 9 × 4/36 + 10 × 3/36 + 11 × 2/36 + 12 × 1/36 = 252/36 = 7.

c) The random variable D takes values 0, 1, 2, 3, 4, 5. The probability mass function of D is as follows:

d         0     1      2     3     4     5
P(D = d)  6/36  10/36  8/36  6/36  4/36  2/36

To see this consider the sample space and count, for each d, how many elements of the sample space there are for which the difference between the two numbers is d. For example P(D = 4) = 4/36 because of the four outcomes (1, 5), (2, 6), (6, 2), (5, 1).

d) E(D) = 0 × 6/36 + 1 × 10/36 + 2 × 8/36 + 3 × 6/36 + 4 × 4/36 + 5 × 2/36 = 70/36 = 35/18.

e) T is the sum of the two rolls, that is, the maximum of the two rolls, say x_max, plus the minimum of the two rolls, say x_min. Hence T = x_max + x_min. D is the difference of the two rolls in absolute value, that is, the maximum of the two rolls minus the minimum of the two rolls. Hence D = x_max − x_min. Therefore T + D = 2 x_max is twice the maximum of the two rolls, and so M is the maximum of the two rolls.
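Both pmfs, both expectations and the interpretation of M in part e) can be checked by enumerating the 36 equally likely outcomes; a short Python sketch:

from fractions import Fraction
from collections import Counter
from itertools import product

rolls = list(product(range(1, 7), repeat=2))                 # 36 equally likely outcomes
pmf_t = Counter(a + b for a, b in rolls)                     # counts out of 36
pmf_d = Counter(abs(a - b) for a, b in rolls)
e_t = sum(t * c for t, c in pmf_t.items()) / Fraction(36)    # 7
e_d = sum(d * c for d, c in pmf_d.items()) / Fraction(36)    # 35/18
# M = (T + D)/2 equals the larger of the two numbers for every outcome.
assert all((a + b + abs(a - b)) / 2 == max(a, b) for a, b in rolls)
print(dict(pmf_t), dict(pmf_d), e_t, e_d)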

Problem 13: a) Adam has five keys in his pocket, one of which opens his front door. On arriving home he takes out a key at random and attempts to open the door with it. If it doesn't fit the door then he replaces it in his pocket. This is repeated until he manages to open the door. Let A be the random variable "the number of attempts he takes". Find the probability mass function of A.
b) Eve has five keys in her pocket, one of which opens her front door. On arriving home she takes out a key at random and attempts to open the door with it. If it doesn't fit the door then she holds on to it and picks a key from those remaining in her pocket. This is repeated until she manages to open the door. Let E be the random variable "the number of attempts she takes". Find the probability mass function of E.

a) The random variable A takes values in N. If n ∈ N then
P(A = n) = P(first n − 1 picks are incorrect, nth pick is correct).
Each pick has probability 4/5 of being incorrect and 1/5 of being correct, and picks are independent. We get that the pmf is
P(A = n) = (4/5)^(n−1) × 1/5,
that is, A ~ Geom(1/5).

b) The random variable E takes values in {1, 2, 3, 4, 5} (since the incorrect keys are not replaced there is no way that 5 incorrect keys can be chosen). If 1 ≤ n ≤ 5 then the event E = n occurs if the first n − 1 picks are incorrect and the next is correct. For E = 1 the first pick is correct and that happens with probability 1/5. For E = 2 the first pick is incorrect (with probability 4/5) and the second is correct (with probability 1/4, key not replaced). For E = 3 the first pick is incorrect (with probability 4/5), the second is incorrect (with probability 3/4), and the third is correct (with probability 1/3). And so on. Hence we get that the pmf is:
P(E = 1) = 1/5
P(E = 2) = (4/5)(1/4) = 1/5
P(E = 3) = (4/5)(3/4)(1/3) = 1/5
P(E = 4) = (4/5)(3/4)(2/3)(1/2) = 1/5
P(E = 5) = (4/5)(3/4)(2/3)(1/2)(1) = 1/5
That is, P(E = n) = 1/5 for all n ∈ {1, 2, 3, 4, 5}.
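Both pmfs can be checked by simulating the two key-picking schemes; a rough Python sketch (the helper attempts is just for illustration) shows the Geom(1/5) shape for Adam and the uniform 1/5 values for Eve:

import random
from collections import Counter

def attempts(replace, rng, n_keys=5):
    """Number of attempts needed when wrong keys are (replace=True) or are not replaced."""
    keys = list(range(n_keys))          # key 0 opens the door
    tries = 0
    while True:
        tries += 1
        pick = rng.choice(keys)
        if pick == 0:
            return tries
        if not replace:
            keys.remove(pick)

rng = random.Random(4)
trials = 100_000
adam = Counter(attempts(True, rng) for _ in range(trials))
eve = Counter(attempts(False, rng) for _ in range(trials))
print([round(adam[n] / trials, 3) for n in range(1, 6)])   # ~ (4/5)^(n-1) * 1/5
print([round(eve[n] / trials, 3) for n in range(1, 6)])    # ~ 1/5 each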

Problem 14: Each match played by a football team is won by that team with probability 1/2, is a draw with probability 1/6, and is lost with probability 1/3, with the result of each match being independent of all other results. Find the probability mass function of each of the following random variables related to this. (Some but not all of these take one of the special distributions studied in lectures; in these cases you should use the name of the distribution, otherwise just write down the pmf.)
a) The number of drawn matches in a season lasting 38 matches.
b) The number of matches up to and including their first win.
c) The number of matches following their first defeat up to and including their next win.
d) The number of wins in the 10 matches following their first defeat.
e) The number of matches up to and including their second loss.
f) The number of matches up to and including their mth loss, where m is a fixed positive integer.

a) If we regard a draw as success then we have 38 independent trials with the probability of success being 1/6 in each, and our random variable is the number of successes. The distribution is therefore Bin(38, 1/6).

b) If we regard a win as success then this is the number of Bernoulli(1/2) trials up to and including the first success. The distribution is therefore Geom(1/2).

c) Starting from the first defeat we have a sequence of independent Bernoulli(1/2) trials. So the number of games up to and including the next win has the Geom(1/2) distribution. The fact that we start from the first defeat is irrelevant to the distribution of the number of games until the next win.

d) The 10 matches following the first defeat consist of a fixed number of Bernoulli trials each with probability 1/2 of success (thinking of a win as success). The number of wins then has a Bin(10, 1/2) distribution.

e) This is not a distribution we have studied so we will have to work the pmf out directly. Let N be the number of matches up to and including the second loss. Then N takes values 2, 3, 4, ... and if k ≥ 2 we have
P(N = k) = P(1 loss and k − 2 others in the first k − 1 matches, followed by a loss in the kth)
= (k − 1) × (1/3) × (2/3)^(k−2) × (1/3) = (k − 1) (1/3)² (2/3)^(k−2).

f) Similarly to part e) you can work out the pmf of the number of matches up to and including the mth loss, where m is a fixed positive integer. If we call this M then M takes values m, m + 1, m + 2, ... and if k ≥ m we have
P(M = k) = P(m − 1 losses and k − m others in the first k − 1 matches, then a loss in the kth)
= C(k − 1, m − 1) × (1/3)^(m−1) × (2/3)^(k−m) × (1/3) = C(k − 1, m − 1) (1/3)^m (2/3)^(k−m).

Problem 15: Prove that if X is a discrete random variable with Var(X) = 0 then X is constant. That is, there exists some a with P(X = a) = 1.

Suppose that X takes values x_1, x_2, ..., x_n and that E(X) = µ. Then
Var(X) = Σ_{i=1}^{n} (x_i − µ)² P(X = x_i).
Each summand is ≥ 0 and so the only way the sum can be equal to 0 is if all summands are equal to 0. Now (x_i − µ)² is 0 only if x_i = µ, and so we must have P(X = x_j) = 0 for all x_j ≠ µ. It follows that P(X = µ) = 1 and so X is a constant random variable.
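As a sanity check on parts e) and f) of Problem 14, the proposed pmfs should sum to 1 over k (part e) is the m = 2 case of part f)); a short numerical Python sketch (the helper pmf_mth_loss is just for illustration):

from math import comb

def pmf_mth_loss(k, m, p_loss=1/3):
    """P(M = k): m - 1 losses among the first k - 1 matches, then a loss in match k."""
    return comb(k - 1, m - 1) * p_loss ** m * (1 - p_loss) ** (k - m)

# Part e) is the case m = 2; both truncated sums should be very close to 1.
total_e = sum((k - 1) * (1/3) ** 2 * (2/3) ** (k - 2) for k in range(2, 400))
total_f = sum(pmf_mth_loss(k, 5) for k in range(5, 400))
print(round(total_e, 10), round(total_f, 10))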

Problem 16: Let X be the number of fish caught by a fisherman and Y be the number of fish caught by a second fisherman in one afternoon of fishing. Suppose that X is distributed Poisson(λ) and Y is distributed Poisson(µ). Suppose further that X and Y are independent random variables.
a) Show that
P(X + Y = n) = Σ_{k=0}^{n} e^(−λ) (λ^k / k!) × e^(−µ) (µ^(n−k) / (n − k)!).
b) Hence find the distribution of the total number of fish caught.

a) To have X + Y = n we need X = k and Y = n − k for some k with 0 ≤ k ≤ n. Hence
P(X + Y = n) = Σ_{k=0}^{n} P(X = k and Y = n − k).
By independence
P(X + Y = n) = Σ_{k=0}^{n} P(X = k) P(Y = n − k).
Now substituting in the pmf of a Poisson random variable,
P(X + Y = n) = Σ_{k=0}^{n} e^(−λ) (λ^k / k!) × e^(−µ) (µ^(n−k) / (n − k)!).

b) Continuing from part a),
P(X + Y = n) = Σ_{k=0}^{n} e^(−λ) (λ^k / k!) × e^(−µ) (µ^(n−k) / (n − k)!)
= e^(−(λ+µ)) (1/n!) Σ_{k=0}^{n} C(n, k) λ^k µ^(n−k)
= e^(−(λ+µ)) (1/n!) (λ + µ)^n,
where the last line is an application of the binomial theorem. This is the pmf of a Poisson(λ + µ) distribution and so X + Y ~ Poisson(λ + µ).
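The conclusion of part b) can be checked numerically by comparing the convolution sum from part a) with the Poisson(λ + µ) pmf for sample values of λ and µ; a short Python sketch:

from math import exp, factorial

def poisson_pmf(n, rate):
    """Poisson pmf: e^(-rate) rate^n / n!"""
    return exp(-rate) * rate ** n / factorial(n)

lam, mu = 2.5, 1.3   # arbitrary sample rates
for n in range(10):
    conv = sum(poisson_pmf(k, lam) * poisson_pmf(n - k, mu) for k in range(n + 1))
    assert abs(conv - poisson_pmf(n, lam + mu)) < 1e-12
print("convolution matches the Poisson(lam + mu) pmf for n = 0, ..., 9")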
