Chapter 3: Conditional Probability and Independence


STAT/MATH 394 A - Probability I
UW, Autumn Quarter 2016
Néhémy Lim

1 Conditional Probabilities

How should we modify the probability of an event when some supplementary information concerning the result of an experiment is available? The concept of conditional probability allows us to answer this question.

Example 1. A professional tennis player has the following yearly record: 45 victories (W) and 17 losses (L). Before he plays a match, one would bet that he wins with probability 45/(45 + 17) ≈ 0.726. However, we may suspect that the surface he plays on has some effect on his performance. To get better insight into this assumption, we break the player's record down by surface:

          W    L
Grass     7    3
Hard     27    9
Clay     11    5

Intuitively, if we know that the player is going to play a match on clay, we feel that the probability of winning should be updated to 11/(11 + 5) = 0.6875. This intuition turns out to be correct, and it is captured by the definition of conditional probability.

Definition 1.1 (Conditional probability). Let (Ω, A, P) be a probability space and F ∈ A an event such that P(F) > 0. The conditional probability that an event E ∈ A occurs given (or knowing) that F has occurred, denoted P(E | F), is defined by:

    P(E | F) = P(E ∩ F) / P(F)    (1)

Proposition 1.1. Let (Ω, A, P) be a probability space and F ∈ A an event such that P(F) > 0. The function P_F : A → R defined by P_F(E) = P(E | F) is a probability measure on (Ω, A).

Proof. We need to check that P_F satisfies the three axioms of a probability measure:

1. For any event E ∈ A, is P_F(E) ≥ 0? By definition, P_F(E) = P(E ∩ F)/P(F). Since P(E ∩ F) ≥ 0 and P(F) > 0, we have P_F(E) ≥ 0.

2. Is P_F(Ω) = 1? By definition, P_F(Ω) = P(Ω ∩ F)/P(F) = P(F)/P(F) = 1.

3. Let (E_i) be a countably infinite sequence of mutually exclusive events (i.e. E_i ∩ E_j = ∅ for i ≠ j). Do we have P_F(∪_i E_i) = Σ_i P_F(E_i)? We compute:

    P_F(∪_i E_i) = P((∪_i E_i) ∩ F) / P(F)       by definition
                 = P(∪_i (E_i ∩ F)) / P(F)       distributivity of ∩ over ∪
                 = Σ_i P(E_i ∩ F) / P(F)         since P is a probability measure and the events E_i ∩ F are mutually exclusive
                 = Σ_i P_F(E_i)

Remark: When the sample space Ω is finite with the uniform probability measure, that is, all outcomes are equally likely to occur, then, conditional on the event that the outcome lies in a subset F ⊆ Ω, it may be convenient to compute conditional probabilities of the form P(E | F) by using F as a reduced sample space.

Proposition 1.2. Let Ω be a finite sample space and consider the associated probability space (Ω, P(Ω), P) with uniform probability measure P. Let F ⊆ Ω be an event such that P(F) > 0. The conditional probability of an event E ⊆ Ω given F can be computed as:

    P(E | F) = |E ∩ F| / |F|    (2)

Example 2. You draw a card from a standard deck of 52 cards. Given that the card you draw is an ace, what is the probability that it is the ace of spades? Let A be the event that the card is an ace and S the event that the card is a spade. Using the above proposition, the desired probability is:

    P(S | A) = |S ∩ A| / |A| = 1/4
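Examples 1 and 2 can be reproduced with a short Python sketch; the variable names below are my own, not from the notes:

```python
# Sketch of Examples 1 and 2. The records table and the deck come from
# the text; all names here are illustrative.

# --- Example 1: win probability conditioned on surface --------------------
records = {"Grass": (7, 3), "Hard": (27, 9), "Clay": (11, 5)}  # surface -> (W, L)

wins = sum(w for w, l in records.values())
losses = sum(l for w, l in records.values())
p_win = wins / (wins + losses)           # unconditional: 45/62
w_clay, l_clay = records["Clay"]
p_win_clay = w_clay / (w_clay + l_clay)  # given clay: 11/16
print(round(p_win, 3), p_win_clay)  # 0.726 0.6875

# --- Example 2: reduced sample space, formula (2) -------------------------
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["spades", "hearts", "diamonds", "clubs"]
deck = [(r, s) for r in ranks for s in suits]

aces = [c for c in deck if c[0] == "A"]             # reduced sample space A
spade_aces = [c for c in aces if c[1] == "spades"]  # S ∩ A
print(len(spade_aces) / len(aces))  # P(S|A) = 0.25
```

Counting within the reduced sample space (the four aces) is exactly what formula (2) prescribes.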

Proposition 1.3 (Multiplication rule). Let E_1, ..., E_n be a sequence of n ∈ N, n ≥ 1 events. Then we have:

    P(E_1 ∩ ⋯ ∩ E_n) = P(E_1) P(E_2 | E_1) P(E_3 | E_1 ∩ E_2) ⋯ P(E_n | E_1 ∩ ⋯ ∩ E_{n−1})    (3)

In particular, for n = 2, we have:

    P(E_1 ∩ E_2) = P(E_1) P(E_2 | E_1)

Proof. To prove the multiplication rule, apply the definition of conditional probability to each factor on the right-hand side; the product telescopes.

Example 3. Three cards are dealt successively at random and without replacement from a standard deck of 52 playing cards. What is the probability of receiving, in order, a king, a queen, and a jack? Let K_1, Q_2 and J_3 be the events of being dealt a king on the first draw, a queen on the second draw and a jack on the third draw, respectively. We want to find P(K_1 ∩ Q_2 ∩ J_3). By the multiplication rule:

    P(K_1 ∩ Q_2 ∩ J_3) = P(K_1) P(Q_2 | K_1) P(J_3 | K_1 ∩ Q_2) = (4/52)(4/51)(4/50)

For any two events E and F such that P(E) > 0 and P(F) > 0, we have P(E | F) = P(E ∩ F)/P(F), so P(E ∩ F) = P(E | F) P(F). Similarly, P(F | E) = P(E ∩ F)/P(E). Replacing P(E ∩ F) by P(E | F) P(F) in the last expression, we obtain the so-called first Bayes's formula, stated as follows:

Property 1.1 (First Bayes's formula). Let E and F be two events such that P(E) > 0 and P(F) > 0. We have that:

    P(F | E) = P(E | F) P(F) / P(E)    (4)

Example 4. According to a study, 0.5% of the population exhibit symptoms of video game addiction. The study highlights that the proportion of addicts is 2% among teenagers. We also know that teenagers make up 15% of the population. What is the probability that a randomly selected addict is a teenager? Let A denote the event that a person is an addict and T the event that the person is a teenager. The desired probability is:

    P(T | A) = P(A | T) P(T) / P(A) = (0.02 × 0.15) / 0.005 = 0.6

In Chapter 2, one of the main results is the law of total probability. Thanks to the multiplication rule, this result can be rewritten using conditional probabilities instead of probabilities of intersections.
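Examples 3 and 4 can be checked with exact rational arithmetic; a minimal sketch, with all numbers taken from the text:

```python
from fractions import Fraction

# Sketch checking Example 3 (multiplication rule) and Example 4 (first
# Bayes's formula) exactly; variable names are mine.

# Example 3: P(K1 ∩ Q2 ∩ J3) = P(K1) P(Q2|K1) P(J3|K1 ∩ Q2)
p_kqj = Fraction(4, 52) * Fraction(4, 51) * Fraction(4, 50)
print(p_kqj)  # 8/16575

# Example 4: P(T|A) = P(A|T) P(T) / P(A)
p_t_given_a = Fraction(2, 100) * Fraction(15, 100) / Fraction(5, 1000)
print(p_t_given_a)  # 3/5
```

Using `Fraction` avoids any floating-point rounding in these small products of ratios.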

Property 1.2 (Law of total probability). Let E ∈ A be an event and F_1, ..., F_n be n ∈ N, n ≥ 1 events that form a partition of the sample space Ω (i.e. F_1 ∪ ⋯ ∪ F_n = Ω and F_i ∩ F_j = ∅ for i ≠ j). We have that:

    P(E) = Σ_{i=1}^{n} P(E | F_i) P(F_i)    (5)

The law of total probability entails the following result, known as the second Bayes's formula:

Property 1.3 (Second Bayes's formula). Let E ∈ A be an event and F_1, ..., F_n be n ∈ N, n ≥ 1 events that form a partition of the sample space Ω (i.e. F_1 ∪ ⋯ ∪ F_n = Ω and F_i ∩ F_j = ∅ for i ≠ j). For a given i = 1, ..., n, we have that:

    P(F_i | E) = P(E | F_i) P(F_i) / ( Σ_{j=1}^{n} P(E | F_j) P(F_j) )    (6)

Example 5. A company called Fruit designs smartphones called e-phones. The items are produced in three factories, which account for 20%, 30%, and 50% of the output, respectively. The fractions of defective e-phones produced are 5% for the first factory, 3% for the second, and 1% for the third. If an e-phone chosen at random from the total output is found to be defective, what is the probability that it was produced by the third factory? Let F_i be the event that a given item is produced in the ith factory, for i = 1, 2, 3, and let D denote the event that the item is defective. We are given that P(F_1) = 0.2, P(F_2) = 0.3 and P(F_3) = 0.5. In addition, P(D | F_1) = 0.05, P(D | F_2) = 0.03 and P(D | F_3) = 0.01. We want to find P(F_3 | D). The second Bayes's formula gives:

    P(F_3 | D) = P(D | F_3) P(F_3) / (P(D | F_1) P(F_1) + P(D | F_2) P(F_2) + P(D | F_3) P(F_3))
               = (0.01 × 0.5) / (0.05 × 0.2 + 0.03 × 0.3 + 0.01 × 0.5) ≈ 0.208

2 Independent Events

Definition 2.1 (Independence). Let (Ω, A, P) be a probability space and E, F ∈ A two events such that P(E) > 0 and P(F) > 0. Events E and F are said to be independent if one of the following equivalent identities holds:

(a) P(E ∩ F) = P(E) P(F)
(b) P(E | F) = P(E)
(c) P(F | E) = P(F)

Example 6. A card is selected at random from an ordinary deck of 52 playing cards. If A is the event that the selected card is an ace and S is the event that it is a spade, then A and S are independent. Indeed, P(A) = 4/52 = 1/13, P(S) = 13/52 = 1/4 and P(A ∩ S) = 1/52 = P(A) P(S).

Property 2.1. If events E and F are independent, then so are:
- E and F^c
- E^c and F
- E^c and F^c

Proof. Let E and F be two independent events and let us prove that E and F^c are independent, that is, P(E ∩ F^c) = P(E) P(F^c). We have:

    P(E ∩ F^c) = P(E) − P(E ∩ F)       (law of total probability)
               = P(E) − P(E) P(F)      (since E and F are independent)
               = P(E)(1 − P(F))
               = P(E) P(F^c)

The proofs of the remaining statements are left as an exercise.

Example 7. You roll one die followed by another. Let A be the event that the sum of the dice is 9, B the event that the first die lands on an even number, and C the event that the second die lands on an odd number. Show that:
- A and B are independent
- A and C are independent
- B and C are independent
- P(A ∩ B ∩ C) ≠ P(A) P(B) P(C)

In a situation like this we say that A, B and C are pairwise independent but not mutually independent.

Definition 2.2 (Independence of n events). Let E_1, ..., E_n be a sequence of n ∈ N, n ≥ 1 events. We say that the events (E_i)_{i=1,...,n} are pairwise independent if E_i and E_j are independent for all i ≠ j, that is, P(E_i ∩ E_j) = P(E_i) P(E_j) for all i ≠ j. We say that the events (E_i)_{i=1,...,n} are mutually independent, or simply independent, if for all subsets I ⊆ {1, ..., n}:

    P(∩_{i∈I} E_i) = ∏_{i∈I} P(E_i)    (7)
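Example 5 (second Bayes's formula) and Example 7 (pairwise versus mutual independence) can both be verified by exact computation; a sketch, with the numbers from the text and names of my own choosing:

```python
from fractions import Fraction
from itertools import product

# --- Example 5: defective e-phones ---------------------------------------
priors = [Fraction(2, 10), Fraction(3, 10), Fraction(5, 10)]     # P(F1..F3)
defect = [Fraction(5, 100), Fraction(3, 100), Fraction(1, 100)]  # P(D|Fi)

p_d = sum(p * d for p, d in zip(priors, defect))  # law of total probability: P(D)
p_f3_given_d = defect[2] * priors[2] / p_d        # formula (6)
print(p_f3_given_d)  # 5/24, i.e. about 0.208

# --- Example 7: two dice, enumerate all 36 equally likely outcomes --------
omega = list(product(range(1, 7), repeat=2))

def prob(event):
    return Fraction(sum(event(o) for o in omega), len(omega))

A = lambda o: o[0] + o[1] == 9  # sum of the dice is 9
B = lambda o: o[0] % 2 == 0     # first die is even
C = lambda o: o[1] % 2 == 1     # second die is odd
both = lambda e, f: (lambda o: e(o) and f(o))

# Pairwise independent ...
assert prob(both(A, B)) == prob(A) * prob(B)
assert prob(both(A, C)) == prob(A) * prob(C)
assert prob(both(B, C)) == prob(B) * prob(C)
# ... but not mutually independent:
assert prob(both(both(A, B), C)) != prob(A) * prob(B) * prob(C)
```

Brute-force enumeration over the 36 outcomes makes the pairwise checks and the failure of the triple product condition immediate.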

For instance, for n = 3, if E_1, E_2 and E_3 are mutually independent, this means:

    P(E_1 ∩ E_2) = P(E_1) P(E_2)
    P(E_1 ∩ E_3) = P(E_1) P(E_3)
    P(E_2 ∩ E_3) = P(E_2) P(E_3)
    P(E_1 ∩ E_2 ∩ E_3) = P(E_1) P(E_2) P(E_3)

Example 8. A person has two coins. The first coin comes up heads with probability 1/4 and tails with probability 3/4. The second coin comes up heads with probability 3/4 and tails with probability 1/4. One of the two coins is selected, each with the same probability, and is then flipped twice.

1. Show that the event that the first flip is heads is not independent of the event that the second flip is heads.
2. Let F be the event that the first coin is selected, H_1 the event that the first flip is heads, and H_2 the event that the second flip is heads. Show that P(H_1 ∩ H_2 | F) = P(H_1 | F) P(H_2 | F).

Definition 2.3. Let E, F and G be three events such that P(G) > 0. We say that E and F are conditionally independent given G (or that, conditioned on G, the events E and F are independent) if:

    P(E ∩ F | G) = P(E | G) P(F | G)    (8)

It is easy to check that if E and F are conditionally independent given G, then so are E and F^c. It is also easy to extend this to three or more events being conditionally independent given G.
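The two claims of Example 8 can be checked with exact arithmetic; a sketch, assuming the coin biases stated in the text (variable names are mine):

```python
from fractions import Fraction

# Sketch of Example 8: one of two biased coins is chosen uniformly at
# random and flipped twice.
p_heads = {1: Fraction(1, 4), 2: Fraction(3, 4)}  # P(heads | coin c)
p_coin = Fraction(1, 2)                           # each coin equally likely

# Average over the coin choice (law of total probability):
p_h1 = sum(p_coin * p_heads[c] for c in (1, 2))          # P(H1) = 1/2
p_h1_h2 = sum(p_coin * p_heads[c] ** 2 for c in (1, 2))  # P(H1 ∩ H2) = 5/16

# 1. Unconditionally, H1 and H2 are NOT independent:
assert p_h1_h2 != p_h1 * p_h1

# 2. Given F (coin 1 selected), the flips ARE conditionally independent:
#    P(H1 ∩ H2 | F) = (1/4)^2 = P(H1|F) P(H2|F).
assert p_heads[1] ** 2 == p_heads[1] * p_heads[1]
```

Marginally, the first flip's outcome shifts our belief about which coin was chosen, which is exactly why the second flip is no longer independent of the first.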