CONDITIONAL PROBABILITY AND INDEPENDENCE WORKSHEET, MTH 1210

This worksheet supplements our textbook material on the concepts of conditional probability and independence. The exercises at the end of each section of this worksheet provide good preparation for Exam 1 problems that cover these concepts. Solutions to the exercises are given in the Appendix at the end of the worksheet.

1. Conditional Probability

Conditional probability is a key concept in probability theory. For two events E and F in a sample space, the conditional probability of E given F assesses the chance of the event E occurring if we know that the event F will occur. The formula for this new probability of E, called the conditional probability of E given F, is

P[E | F] = P[E and F] / P[F]

Note that the symbol P[E | F] on the left side is just our notation for conditional probability; the right-hand side of the equation is our definition. It says that we need to compute P[E and F] and P[F], then divide P[E and F] by P[F]. The following two examples should help illustrate the conditional probability concept.

Three coin flip example

Suppose we flip three fair coins: a dime, a nickel, and then a penny. There are eight possible outcomes of this experiment, which we will symbolize with sequences of the capital letters H or T, representing heads or tails respectively. All eight possible outcomes, in the order dime, nickel, then penny, can be listed as follows:

S = {HHH, HHT, HTH, THH, TTT, TTH, THT, HTT}

Define E as the event that at least two heads appear. The four outcomes that include at least two heads are HHH, HHT, HTH, and THH, so

P[E] = P[{HHH, HHT, HTH, THH}] = 4/8 = 0.5

In other words, there is a 50% chance of getting at least two heads.
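Because all eight outcomes are equally likely, counts like this are easy to double-check by brute-force enumeration. The short Python sketch below is only an illustrative check (the variable names are our own choices, and Python is not required for this worksheet); it lists the eight outcomes and counts those with at least two heads.

from itertools import product

# All eight equally likely outcomes, in the order dime, nickel, penny.
outcomes = [''.join(flips) for flips in product('HT', repeat=3)]

# Event E: at least two heads appear.
E = [o for o in outcomes if o.count('H') >= 2]

# P[E] = (# outcomes in E) / (# outcomes in S) = 4/8
print(len(E) / len(outcomes))  # 0.5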

We need to identify another event F to condition on if we want to compute a conditional probability. Define F as the event that the first coin (the dime) is tails; the four outcomes that make up F are THH, TTT, TTH, and THT. The intersection (E and F) contains the single outcome THH.

We can symbolize the conditional probability of getting at least two heads, given that the first coin we flip (the dime) shows a tail, as P[E | F]. Intuitively this probability should be lower than 50%, because the first coin being tails should decrease the chance of getting at least two heads. Indeed,

P[E | F] = P[E and F] / P[F] = P[{THH}] / P[{THH, TTT, TTH, THT}] = (1/8) / (4/8) = 0.125 / 0.5 = 0.25

We have found that the conditional probability of E given F is 0.25; that is, there is a 25% chance of getting at least two heads given that the first coin shows tails.
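The same kind of enumeration check works for a conditional probability: restrict attention to the outcomes in F and ask what fraction of them also lie in E. Again, this Python sketch is only illustrative.

from itertools import product

outcomes = [''.join(flips) for flips in product('HT', repeat=3)]

# F: the first coin (the dime) is tails; E: at least two heads.
F = [o for o in outcomes if o[0] == 'T']
E_and_F = [o for o in F if o.count('H') >= 2]

# P[E | F] = P[E and F] / P[F] = (1/8) / (4/8)
print(len(E_and_F) / len(F))  # 0.25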

Two dice roll example

A familiar example of a probability experiment is the rolling of two fair six-sided dice. There are 36 possible outcomes of this experiment, one for each ordered pair of faces:

S = {(1,1), (1,2), (1,3), ..., (6,4), (6,5), (6,6)}

If we actually roll the dice, only one of these 36 outcomes will occur. If the dice are fair, in the sense that every face is equally likely on each of the dice, then we expect all 36 of the two-dice outcomes listed above to be equally likely. To demonstrate the concept of conditional probability in this context, we define two events in this sample space:

E = The sum of the pips on both dice is at least 7
F = Doubles (both dice have the same number of pips)

The doubles event F consists of the six outcomes (1,1), (2,2), (3,3), (4,4), (5,5), and (6,6), and the intersection (E and F), the doubles whose sum is at least 7, consists of the three outcomes (4,4), (5,5), and (6,6).

Now we compute the conditional probability of E given F; verbally, this is the chance that the sum of the dice will be at least 7 given that we know we will roll doubles. Using the definition,

P[E | F] = P[E and F] / P[F] = P[{(4,4), (5,5), (6,6)}] / P[{(1,1), (2,2), (3,3), (4,4), (5,5), (6,6)}] = (3/36) / (6/36) = 0.5

It is interesting to observe that the regular probability of E is P[E] = 21/36 ≈ 0.583. Thus knowing that the dice will show doubles makes the conditional probability of E less than the regular probability of E, since 0.5 < 0.583.
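If you want to verify both numbers in this comparison, the same brute-force approach applies to the 36 dice outcomes; the sketch below (illustrative only) computes P[E | F] and P[E].

from itertools import product

# All 36 equally likely ordered pairs of die faces.
rolls = list(product(range(1, 7), repeat=2))

E = [r for r in rolls if sum(r) >= 7]    # sum of pips is at least 7
F = [r for r in rolls if r[0] == r[1]]   # doubles
E_and_F = [r for r in E if r in F]

print(len(E_and_F) / len(F))  # P[E | F] = 3/6 = 0.5
print(len(E) / len(rolls))    # P[E] = 21/36, about 0.583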

Exercises:

1.1. Compute the conditional probability of F given E for both of the examples above and compare it to the probability of F. Is the conditional probability of F given E more than, less than, or equal to the probability of F?

1.2. If we flip five fair coins, compute the following conditional probabilities:
(1) The conditional probability that at least 3 heads show, given that at least 1 tail shows.
(2) The conditional probability that at least 3 heads show, given that at least 1 head shows.
(3) The conditional probability that exactly 3 heads show, given that at least 1 head shows.
(4) The conditional probability that all five coins show the same face, given that at least 3 heads show.

1.3. For a certain airline, 93% of all flights take off on time, 18% of all flights take off from Denver, and 16% of all flights take off from Denver on time. Find the conditional probability that one of their flights takes off on time given that it takes off from Denver.

1.4. If we roll two fair six-sided dice, compute the following conditional probabilities:
(1) The conditional probability of rolling a sum of at least 9 pips, given that the first die shows 5 pips.
(2) The conditional probability of rolling a sum of at least 9 pips, given that the first die shows at least 5 pips.
(3) The conditional probability of rolling a sum of at least 9 pips, given that the first die shows less than 5 pips.
(4) The conditional probability of rolling a sum of exactly 7 pips, given that the first die shows a specific face.

2. Independence

Both of the examples we looked at in the previous section featured conditional probabilities P[E | F] that differed from the regular probabilities P[E]. Events E and F with P[E | F] ≠ P[E] are called dependent events; this is a technical term used in probability and does not mean the same thing as our everyday sense of the word dependent. For example, if two non-empty events E and F have no outcomes in common (that is, E and F are mutually exclusive), then the events must be dependent, because if one event occurs the conditional probability of the other event must be zero.

If the events are not dependent, then P[E] must be the same as P[E | F]. This gives us the definition of independence: if P[E | F] = P[E], then the events E and F are said to be independent. The two examples below give a pair of independent events in each of our two sample spaces. We will use the symbols A and B for the events in these examples to distinguish them from the previous examples.

Three coin flip example

Again we consider the three coin flip sample space:

S = {HHH, HHT, HTH, THH, TTT, TTH, THT, HTT}

Consider the events A and B defined as follows:

A = All three coins show the same face
B = The last coin (the penny) is a heads

Intuitively, you might realize that neither event's occurrence will affect the probability of the other event occurring; this in fact means that the events are independent. We do not have to rely on intuition, though; we can verify that the events are independent mathematically, as follows. The outcome contained in A but not in B is TTT, the outcomes contained in B but not in A are HTH, THH, and TTH, and the intersection (A and B) is the single outcome HHH.

To check if A and B are independent, we compare P[A | B] to P[A]. First we find that

P[A | B] = P[A and B] / P[B] = P[{HHH}] / P[{HHH, HTH, THH, TTH}] = (1/8) / (4/8) = 0.125 / 0.5 = 0.25

Second, we find that

P[A] = P[{HHH, TTT}] = 2/8 = 0.25

Since P[A | B] = P[A], we have verified that the events A and B are independent.
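An independence check translates directly into the same style of enumeration: compute P[A | B] and P[A] separately and compare them. The sketch below (again only an illustrative aid, not part of the required work) does this for the coin events A and B above.

from itertools import product

outcomes = [''.join(flips) for flips in product('HT', repeat=3)]

A = [o for o in outcomes if len(set(o)) == 1]  # all three coins show the same face
B = [o for o in outcomes if o[-1] == 'H']      # the last coin (the penny) is heads
A_and_B = [o for o in A if o in B]

p_A_given_B = len(A_and_B) / len(B)  # 1/4 = 0.25
p_A = len(A) / len(outcomes)         # 2/8 = 0.25
print(p_A_given_B == p_A)            # True: A and B are independent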

Two fair dice roll example

Again we consider the two dice roll sample space

S = {(1,1), (1,2), (1,3), ..., (6,4), (6,5), (6,6)}

Consider the events A and B defined as follows:

A = The first die rolled shows a specific face (one particular number of pips)
B = Doubles (both dice have the same number of pips)

Before moving on, contemplate these two events and ask yourself whether the occurrence of one of them changes the probability of the other occurring. If there is no change in this probability, the events are independent. We check whether the events are independent mathematically below. The event A consists of the six outcomes in which the first die shows that face, the event B consists of the six doubles, and the intersection (A and B) is the single doubles outcome showing that face on both dice.

To check if A and B are independent, we compare P[A | B] to P[A]. First we find that

P[A | B] = P[A and B] / P[B] = (1/36) / (6/36) = 1/6 ≈ 0.167

Second, we find that

P[A] = 6/36 = 1/6 ≈ 0.167

Since P[A | B] = P[A], we have verified that the events A and B are independent.

Exercises:

2.1. Compute the conditional probability of B given A for both of the examples above and compare it to the probability of B. Is the conditional probability of B given A more than, less than, or equal to the probability of B?

2.2. For the five fair coin flips of Exercise 1.2, which of the following pairs of events are independent? Show your comparison of P[E | F] to P[E] in each case.
(1) The event that at least 3 heads show and the event that at least 1 tail shows.
(2) The event that at least 3 heads show and the event that at least 1 head shows.
(3) The event that exactly 3 heads show and the event that at least 1 head shows.
(4) The event that all five coins show the same face and the event that at least 3 heads show.

2.3. For a certain airline, 93% of all flights take off on time, 18% of all flights take off from Denver, and 16% of all flights take off from Denver on time. Determine whether the event that a flight takes off on time is independent of the event that the flight takes off from Denver.

2.4. If we roll two fair six-sided dice, determine whether the following pairs of events are independent:
(1) The event of rolling a sum of at least 9 pips and the event that the first die shows 5 pips.
(2) The event of rolling a sum of at least 9 pips and the event that the first die shows at least 5 pips.
(3) The event of rolling a sum of at least 9 pips and the event that the first die shows less than 5 pips.
(4) The event of rolling a sum of exactly 7 pips and the event that the first die shows a specific face.

Solutions to Section 1 Exercises:

1.1: Coins: .25 (less than P[F]). Dice: .143 (less than P[F]).
1.2(1): .484
1.2(2): .516
1.2(3): .323
1.2(4): .0625
1.3: .889
1.4(1): .5
1.4(2): .583
1.4(3): .125
1.4(4): .167

Solutions to Section 2 Exercises:

2.1: Coins: .5 (same as P[B]). Dice: .167 (same as P[B]).
2.2(1): .484 ≠ .5, not independent
2.2(2): .516 ≠ .5, not independent
2.2(3): .323 ≠ .313, not independent
2.2(4): .0625 = .0625, independent
2.3: .889 ≠ .93, not independent
2.4(1): .5 ≠ .278, not independent
2.4(2): .583 ≠ .278, not independent
2.4(3): .125 ≠ .278, not independent
2.4(4): .167 = .167, independent