University of Waterloo
Faculty of Mathematics
Centre for Education in Mathematics and Computing

Senior Math Circles, November 19, 2008: Probability II

Counting

There are many situations where each of the outcomes in a sample space is equally likely. In these situations, the probability of an event A can be calculated by dividing the number of outcomes in A by the total number of outcomes in S.

If we have 3 objects that are all different, we can arrange them in a row in 6 ways. For example, if the objects are denoted by the letters X, Y, Z, then there are 6 ways to arrange them: XYZ, XZY, YXZ, YZX, ZXY, ZYX. We have 3 choices for the first letter, for each of those we have 2 choices for the second letter, and then there is only one choice left for the third letter. So there are $3 \times 2 \times 1 = 6$ different ways they can be arranged. If there are n distinct objects, we can arrange them in $n \times (n-1) \times (n-2) \times \cdots \times 2 \times 1$ ways. We define $n! = n(n-1)(n-2) \cdots 2 \cdot 1$ to represent this number.
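As a quick check of these counts, here is a minimal Python sketch; the XYZ example and the factorial formula come from the text above, while the use of Python and its standard library is my own choice:

```python
from itertools import permutations
from math import factorial

# All arrangements of 3 distinct objects: should print 6 orderings.
for p in permutations("XYZ"):
    print("".join(p))

# The number of arrangements of n distinct objects is n!.
n = 3
print(factorial(n))  # 6
```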

In many applications, we need to know the number of ways to select a subset of outcomes from the entire set. For example, we might want to select 2 items from 4 when we do not care about the order in which we select them (so XY and YX are treated the same way). If we have the letters W, X, Y, Z, there are 6 subsets of size 2: WX, WY, WZ, XY, XZ, YZ. To count these, we could say that there are 4 choices for the first letter and 3 for the second. But that counts XY and YX differently. There are 2! ways to arrange the two letters we chose, so each selected pair is counted twice. Overall, there are $\frac{4 \times 3}{2!} = 6$ ways to choose 2 letters from 4 if the order does not matter. We define a special notation for this, and say "4 choose 2":
$$\binom{4}{2} = \frac{4!}{2! \, 2!}$$
is the number of ways to choose 2 items from 4 if order does not matter. In general, the number of ways to choose x items from n if order does not matter is
$$\binom{n}{x} = \frac{n!}{x! \, (n-x)!}.$$

Examples: In Lotto 649, you choose 6 numbers from 49. A random set of 6 numbers is chosen on the night of the draw. To calculate the probability that your numbers will be drawn, we first note that the sample space consists of the $\binom{49}{6}$ possible ways to choose 6 numbers from 49. Because the numbers are chosen at random, each of the $\binom{49}{6} = 13{,}983{,}816$ possible combinations is equally likely. Hence the probability that your numbers will be drawn is
$$P(\text{win}) = \frac{1}{\binom{49}{6}} = \frac{1}{13{,}983{,}816} \approx 0.0000000715.$$

Problem: The various other prizes for Lotto 649, taken from the Lotto 649 website, are given below. Can you verify them? (Some are quite tricky.)

Match               Prize                            Probability
6 of 6              Share of 80.50% of Pools Fund    1 in 13,983,816
5 of 6 + Bonus      Share of 5.75% of Pools Fund     1 in 2,330,636
5 of 6 (no bonus)   Share of 4.75% of Pools Fund     1 in 55,491
4 of 6              Share of 9.00% of Pools Fund     1 in 1,032
3 of 6              $10                              1 in 57
2 of 6 + Bonus      $5                               1 in 81
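A short sketch to confirm the jackpot calculation above; it checks only the worked example, leaving the prize-table entries as the exercise intends:

```python
from math import comb

total = comb(49, 6)   # size of the sample space: 13,983,816
p_win = 1 / total     # each combination is equally likely
print(total)          # 13983816
print(p_win)          # ~7.15e-08
```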

Independent Experiments

If we conduct two experiments so that the outcome of the second experiment is not influenced by the outcome of the first experiment, we say that the experiments are independent.

Example: Suppose experiment 1 consists of tossing a fair coin, and experiment 2 consists of rolling a fair die, independently of the result of experiment 1. Then the sample space for the combined experiment (experiment 1 followed by experiment 2) is
$$S = \{(H,1), (H,2), (H,3), (H,4), (H,5), (H,6), (T,1), (T,2), (T,3), (T,4), (T,5), (T,6)\}.$$
Since the coin and die are fair, the probability of a Head on the coin is $\frac{1}{2}$, and the probability of a 1 on the die is $\frac{1}{6}$. All 12 outcomes in this sample space should be equally likely, since the result of the second experiment does not depend on the result of the first. Hence the probability of the combined outcome (H, 1) is $\frac{1}{2} \times \frac{1}{6} = \frac{1}{12}$.

In general, if $p_i$ is the probability of outcome i in experiment 1, and $q_j$ is the probability of outcome j in experiment 2, and the experiments are independent, then $p_i q_j$ is the probability of observing outcome i in experiment 1 followed by outcome j in experiment 2.

Independent Repetitions

A special case of independent experiments occurs when experiment 2 is an independent repetition of experiment 1. Then, unless things have changed somehow, the probability distribution for the outcomes will be the same for each repetition. So the probability of outcome i on repetition 1 and outcome j on repetition 2 is $p_i p_j$.

Examples: A fair coin is tossed 4 times independently. Since there are 2 possibilities for each toss (H or T), there will be $2^4$ possible outcomes. Since the coin is fair, each of these will be equally likely, so the probability distribution will assign probability $\frac{1}{16}$ to each point in the combined sample space.

A fair coin is tossed, independently, until a head appears. Then the sample space is S = {H, (T,H), (T,T,H), (T,T,T,H), ...}. Since the tosses are independent, the probabilities assigned to the points of S are $\frac{1}{2}, \frac{1}{4}, \frac{1}{8}, \frac{1}{16}, \ldots$ This is our first example of a sample space with an infinite number of outcomes. But can a sum of an infinite number of real numbers equal 1? Our sum is
$$T = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \frac{1}{16} + \cdots.$$
This is an example of an infinite geometric series, where each term after the first is $\frac{1}{2}$ of the term before it. There is a formula for finding the sums of such series. If a is the first term in the sum, and r is the common ratio between consecutive terms, then $T = \frac{a}{1-r}$, provided $-1 < r < 1$. This works here since $a = \frac{1}{2}$ and $r = \frac{1}{2}$, so $T = 1$.
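A minimal numeric check of the geometric-series claim; the series and the formula $\frac{a}{1-r}$ are from the text, while the cutoff of 50 terms is an arbitrary choice of mine:

```python
# Partial sums of 1/2 + 1/4 + 1/8 + ... should approach a / (1 - r) = 1.
a, r = 0.5, 0.5
total, term = 0.0, a
for _ in range(50):
    total += term
    term *= r
print(total)        # ~1.0
print(a / (1 - r))  # formula value: 1.0
```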

Bernoulli Trials - The Binomial Distribution

A common application occurs when we have Bernoulli Trials: independent repetitions of an experiment in which there are only two possible outcomes. Label the outcomes as a Success or a Failure. If we have n independent repetitions of an experiment, and P(Success) = p on each repetition, then
$$P(x \text{ Successes in } n \text{ repetitions}) = \binom{n}{x} p^x (1-p)^{n-x}.$$

Example: A multiple choice exam consists of 25 questions with 5 responses to each question. If a student guesses randomly on each question, what is the probability that he will answer 13 questions correctly? At least 13 questions correctly?

Solution: This is a Bernoulli Trials situation with P(Success) = $\frac{1}{5}$. Hence
$$P(13 \text{ correct}) = \binom{25}{13} \left(\frac{1}{5}\right)^{13} \left(\frac{4}{5}\right)^{12} \approx 0.00029.$$
The probability of at least 13 correct is
$$P(13 \text{ correct}) + P(14 \text{ correct}) + \cdots + P(25 \text{ correct}) \approx 0.00037.$$

Negative Binomial Distribution

In a sequence of Bernoulli Trials, suppose we want to know the probability of having to wait x trials to see our kth success. If the kth success occurs on the xth trial, we must have k-1 successes in the first x-1 trials and then a success on the xth trial. So the probability is
$$P(k\text{th Success on } x\text{th trial}) = \binom{x-1}{k-1} p^{k-1} (1-p)^{x-k} \cdot p = \binom{x-1}{k-1} p^{k} (1-p)^{x-k}.$$
Note that there is no upper limit on x, so our sample space here is $S = \{k, k+1, k+2, \ldots\}$.
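Both formulas above can be checked numerically. The exam numbers (25 questions, 5 responses, 13 correct) come from the example; the function names, and the values k = 2, p = 1/2 used to check that the negative binomial probabilities total 1, are my own choices:

```python
from math import comb

def binom_pmf(n, x, p):
    # P(x successes in n Bernoulli trials with success probability p)
    return comb(n, x) * p**x * (1 - p)**(n - x)

p = 1 / 5                                                 # guessing among 5 responses
print(binom_pmf(25, 13, p))                               # ~0.00029
print(sum(binom_pmf(25, x, p) for x in range(13, 26)))    # ~0.00037

def neg_binom_pmf(x, k, p):
    # P(the kth success occurs on trial x)
    return comb(x - 1, k - 1) * p**k * (1 - p)**(x - k)

# Summed over x = k, k+1, ..., the probabilities should total 1.
print(sum(neg_binom_pmf(x, 2, 0.5) for x in range(2, 200)))  # ~1.0
```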

Expected Values

In the above examples, we are counting the number of successes in n trials, or the number of trials until some event occurs. It is natural to want to know, on average, how many successes we will see, or, on average, how many trials it takes until some event occurs.

Definition: "On average" is meant here as a theoretical concept, referring to repeating the process over and over again and averaging the numbers from an infinite number of trials. We refer to this theoretical average as the Expected Value of the number of successes or the number of trials. If $p_i$ represents the probability of i successes, or of i trials until some event, then the expected number of successes or trials is
$$0 \cdot p_0 + 1 \cdot p_1 + 2 \cdot p_2 + \cdots + n \cdot p_n$$
when we are counting the number of successes. If we are counting the number of trials, we get a sum that goes to infinity.

Example: If you buy one ticket on Lotto 649 per week, how many weeks on average will you have to wait until you win the big prize?

Solution: From the earlier example, the probability of winning the big prize on any draw is
$$p = \frac{1}{\binom{49}{6}} = \frac{1}{13{,}983{,}816}.$$
So the probability that you have to wait x weeks to win the big prize is
$$P(\text{wait } x \text{ weeks}) = (1-p)^{x-1} p.$$
Thus, the expected number of weeks to wait is
$$E = 1(p) + 2(1-p)(p) + 3(1-p)^2(p) + \cdots.$$
This is a combined arithmetic-geometric series. There are several ways to sum it. If you consider $E - (1-p)E$, you can verify that $E = 1/p$. So you need to wait 13,983,816 weeks on average to win the big prize.
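A quick numeric sanity check of $E = 1/p$; the series is from the text, while the illustrative value p = 0.01 and the cutoff of 5000 terms are mine, chosen so the neglected tail is negligible:

```python
# Partial sum of E = sum over x of x * (1-p)^(x-1) * p, which approaches 1/p.
p = 0.01
E = sum(x * (1 - p)**(x - 1) * p for x in range(1, 5001))
print(E)      # ~100.0 (the tail beyond 5000 terms is tiny)
print(1 / p)  # 100.0
```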

Example (Montmort's Letter Problem): Suppose n letters are distributed at random into n envelopes so that one letter goes into each envelope. On average, how many letters will end up in their correct envelopes? Does this depend on n?

There are two ways to do this question: the hard way and the easy way. Figuring out the probability that exactly x letters end up in the correct envelopes is not that easy. So let's try this another way. Define $X_i$ to be 0 if letter i does not end up in the correct envelope, and define $X_i$ to be 1 if letter i does end up in the correct envelope. Then $T = X_1 + X_2 + \cdots + X_n$ is the number of letters in the correct envelopes, and the expected value of T is the sum of the expected values of the $X_i$'s. Now $P(X_i = 1) = \frac{1}{n}$, since there are n equally likely envelopes for letter i. So the expected value of $X_i$ is also $\frac{1}{n}$, and the expected value of T is $n \left(\frac{1}{n}\right) = 1$. So, on average, regardless of n, we expect to find only one letter in the correct envelope.
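A simulation sketch of the letter problem; the conclusion that the average is 1 regardless of n is from the text, while the particular values of n and the trial count are arbitrary choices:

```python
import random

def average_matches(n, trials=100_000):
    # Distribute n letters into n envelopes at random and count how many
    # land in their own envelope, averaged over many repetitions.
    total = 0
    for _ in range(trials):
        envelopes = list(range(n))
        random.shuffle(envelopes)
        total += sum(1 for i, e in enumerate(envelopes) if i == e)
    return total / trials

for n in (3, 10, 50):
    print(n, average_matches(n))  # each average should be close to 1
```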

Problem Set

1. The Paradox of the Chevalier de Mere: This famous problem was presented by a French gambler (the Chevalier de Mere) to Pascal who, in turn, discussed it with Fermat. Pascal and Fermat then began the formal study of probability. The problem was: "Is it more likely to roll at least one 1 in four rolls of a fair die, or to roll at least one pair of 1s in 24 rolls of two dice?" De Mere argued that the probability of a 1 when a fair die is rolled is $\frac{1}{6}$, and the probability of rolling two 1s when two dice are rolled is $\frac{1}{36}$. So he claimed that the pair of dice must be rolled 6 times more often to have the same probability of rolling at least one pair of 1s as of rolling at least one 1 in four rolls of a fair die. Was he right?

2. For Lotto 649, suppose you buy 5 different tickets for a draw. What is the probability that you win the grand prize? If you use this strategy every week for a year, what is the probability that you will win the grand prize at least once? How many tickets would you need to buy every week for one year so that the probability you win the grand prize at least once is 10%?

3. You want to find someone whose birthday matches yours, so you approach people on the street, asking each one for their birthday. Ignoring leap-year babies, if we assume that each person you approach has probability $\frac{1}{365}$ of sharing your birthday, on average how many people would you have to approach to find someone with the same birthday as you?

4. Eight red blocks and four blue blocks are arranged at random in a row. What is the probability that no two blue blocks are side by side?

5. You will be presented with 10 offers to buy your car. You, quite naturally, want to choose the best offer. However, you see the offers one at a time, and if you refuse any offer, you do not get another chance at that offer. You know that if you choose an offer at random, there is probability $\frac{1}{10}$ of choosing the best offer. So you try the following strategy: examine the first four offers but refuse each one, noting which of these is the best. Then, for each of the remaining offers, refuse it if it is lower than the best one you saw in the first four, and accept it if it is better than the best one you saw in the first four. If you have not selected an offer by the time the last one comes in, you take the last one (by now you are desperate). What is the probability that by using this strategy you will select the best offer?

6. In a sequence of p zeros and q ones, the ith term, $t_i$, is called a change point if $t_i \neq t_{i-1}$, for $i = 2, 3, 4, \ldots, p+q$. For example, the sequence 0, 1, 1, 0, 0, 1, 0, 1 has p = q = 4 and five change points: $t_2, t_4, t_6, t_7, t_8$. For all possible sequences of p zeros and q ones, with $p \geq q$, determine:
(a) the minimum and maximum number of change points.
(b) the average number of change points.

7. A group of n married couples arrives at a dinner party and is seated around a circular table. The distance between the members of a couple is defined to be the number of people sitting between them, measured either clockwise or counter-clockwise, whichever gives the smaller result.
(a) Considering all possible seating arrangements of the 2n people, what is the average distance between the members of a particular couple A?
(b) Considering all possible seating arrangements of the 2n people, what is the average number of couples, per arrangement, where both members of the couple are seated side by side?

(c) Repeat the question if the 2n people are attending the theatre and are seated in a row. (The distance between members of a couple is defined here as the number of people sitting between them.)

8. A sample of size n is chosen at random with replacement from the integers 1, 2, ..., N. (In sampling with replacement, an item is drawn, we note which item it was, and it is then replaced. Note that not all integers in the sample need be different under sampling with replacement.) Find the average number of different integers in the sample. What happens if $n \gg N$?

9. A random graph with n vertices is generated by connecting pairs of vertices at random. Each of the $\binom{n}{2}$ possible edges is inserted with probability p, independently of the other edges. Find the average number of triangles in the graph.

10. A group of n people each toss their hats in the air, and the hats are picked up randomly. Each person who gets his or her own hat leaves, and the remaining people toss the hats again. The process continues until every person has received his or her own hat. Find the average number of rounds of tosses required.