Conditional Probability & Independence. Conditional Probabilities

Conditional Probability & Independence. Conditional Probabilities. Question: how should we modify P(E) if we learn that event F has occurred? Definition: the conditional probability of E given F is P(E | F) = P(E ∩ F) / P(F), for P(F) > 0.

Conditional probabilities are useful because: (1) we often want to calculate probabilities when some partial information about the result of the probabilistic experiment is available; (2) conditional probabilities are a useful tool for computing ordinary (unconditional) probabilities.

Example 1. Two random cards are selected from a deck of cards. (a) What is the probability that both cards are aces given that one of the cards is the ace of spades? (b) What is the probability that both cards are aces given that at least one of the cards is an ace?
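
The answers can be checked by brute force. The following is a small enumeration sketch of my own (not part of the original notes), listing all C(52, 2) two-card hands; the card labels and variable names are just for illustration.

from itertools import combinations
from fractions import Fraction

ranks = "A23456789TJQK"
suits = "SHDC"
deck = [r + s for r in ranks for s in suits]   # 52 card labels, e.g. "AS" = ace of spades

hands = list(combinations(deck, 2))            # all 1326 two-card hands
aces = {"AS", "AH", "AD", "AC"}

has_spade_ace = [h for h in hands if "AS" in h]
has_any_ace = [h for h in hands if aces & set(h)]

# (a) P(both aces | hand contains the ace of spades) = 3/51 = 1/17
print(Fraction(sum(set(h) <= aces for h in has_spade_ace), len(has_spade_ace)))
# (b) P(both aces | hand contains at least one ace) = 6/198 = 1/33
print(Fraction(sum(set(h) <= aces for h in has_any_ace), len(has_any_ace)))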

Conditional probability satisfies the usual probability axioms. Suppose (S, P(·)) is a probability space. Then (S, P(· | F)) is also a probability space (for F ⊆ S with P(F) > 0). Thus we get all the usual identities, for example: P(E^c | F) = 1 − P(E | F) and P(A ∪ B | F) = P(A | F) + P(B | F) − P(A ∩ B | F).

Conditional probability satisfies the usual probability axioms. Suppose (S, P(·)) is a probability space. Then (S, P(· | F)) is also a probability space (for F ⊆ S with P(F) > 0): 0 ≤ P(ω | F) ≤ 1 for all ω ∈ S; Σ_{ω ∈ S} P(ω | F) = 1; and if E_1, E_2, ..., E_n are disjoint, then P(∪_{i=1}^n E_i | F) = Σ_{i=1}^n P(E_i | F). Thus all our previous propositions for probabilities give analogous results for conditional probabilities.

The Multiplication Rule. Rearranging the conditional probability formula gives P(E ∩ F) = P(F) · P(E | F). This is often useful in computing the probability of the intersection of events. Example. Draw 2 balls at random without replacement from an urn with 8 red balls and 4 white balls. Find the chance that both are red.
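
A quick check of the answer, 8/12 · 7/11 = 14/33, using exact arithmetic (my own sketch, not part of the notes):

from fractions import Fraction

# P(both red) = P(first red) * P(second red | first red)
p_first_red = Fraction(8, 12)
p_second_red_given_first = Fraction(7, 11)
print(p_first_red * p_second_red_given_first)   # 14/33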

The General Multiplication Rule. P(E_1 ∩ E_2 ∩ ... ∩ E_n) = P(E_1) · P(E_2 | E_1) · P(E_3 | E_1 ∩ E_2) ··· P(E_n | E_1 ∩ E_2 ∩ ... ∩ E_{n−1}). Example 1. I have n keys, one of which opens a lock. Trying keys at random without replacement, find the chance that the kth try opens the lock.
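
A sketch of my own working for the keys example: the first k − 1 tries must all fail and the kth must succeed, and the chained product collapses to 1/n regardless of k. The function name below is just illustrative.

from fractions import Fraction

def prob_kth_try_opens(n, k):
    # first k-1 tries fail: (n-1)/n * (n-2)/(n-1) * ..., then the kth try succeeds
    p = Fraction(1)
    for j in range(k - 1):
        p *= Fraction(n - 1 - j, n - j)
    return p * Fraction(1, n - (k - 1))

print([prob_kth_try_opens(5, k) for k in range(1, 6)])   # every entry is 1/5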

The Law of Total Probability. We know that P(E) = P(E ∩ F) + P(E ∩ F^c). Using the definition of conditional probability, P(E) = P(E | F) · P(F) + P(E | F^c) · P(F^c). This is extremely useful. It may be difficult to compute P(E) directly, but easy to compute it once we know whether or not F has occurred.

The Law of Total Probability. To generalize, say events F_1, ..., F_n form a partition if they are disjoint and ∪_{i=1}^n F_i = S. Since E ∩ F_1, E ∩ F_2, ..., E ∩ F_n are a disjoint partition of E, P(E) = Σ_{i=1}^n P(E ∩ F_i). Applying the definition of conditional probability gives the law of total probability: P(E) = Σ_{i=1}^n P(E | F_i) · P(F_i).

Bayes' Formula. Sometimes P(E | F) may be specified and we would like to find P(F | E). A simple manipulation gives Bayes' formula, P(F | E) = P(E | F) · P(F) / P(E). Combining this with the law of total probability, P(F | E) = P(E | F) · P(F) / [P(E | F) · P(F) + P(E | F^c) · P(F^c)].

Bayes' Formula. P(F | E) = P(E | F) · P(F) / [P(E | F) · P(F) + P(E | F^c) · P(F^c)]. Example. Eric's girlfriend comes round on a given evening with probability 0.4. If she does not come round, the chance Eric watches The Wire is 0.8. If she does, this chance drops to 0.3. I call Eric and he says he is watching The Wire. What is the chance his girlfriend is around?
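
A sketch of the calculation (my own working): with F = {girlfriend comes round} and E = {Eric is watching The Wire}, Bayes' formula gives 0.12 / 0.60 = 0.2.

p_F = 0.4                 # girlfriend comes round
p_E_given_F = 0.3         # watching The Wire if she is there
p_E_given_not_F = 0.8     # watching The Wire if she is not

p_E = p_E_given_F * p_F + p_E_given_not_F * (1 - p_F)   # law of total probability
print(p_E_given_F * p_F / p_E)                           # approximately 0.2, i.e. 1/5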

Sometimes conditional probability calculations can give quite unintuitive results. Example 3. I have three cards. One is red on both sides, another is red on one side and black on the other, and the third is black on both sides. I shuffle the cards and put one on the table, so you can see that the upper side is red. What is the chance that the other side is black? Is it 1/2, greater than 1/2, or less than 1/2? Solution.
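
A simulation sketch of my own (not the lecture's solution): of the three red faces that could be showing, two belong to the red/red card, so the answer works out to 1/3, i.e. less than 1/2.

import random

cards = [("R", "R"), ("R", "B"), ("B", "B")]
red_showing = black_other = 0
for _ in range(100_000):
    up, down = random.sample(random.choice(cards), 2)   # random card, random side up
    if up == "R":
        red_showing += 1
        black_other += (down == "B")
print(black_other / red_showing)   # close to 1/3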

Discussion problem. Suppose 99% of people with HIV test positive, 95% of people without HIV test negative, and 0.1% of people have HIV. What is the chance that someone testing positive has HIV?
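
A sketch of the computation (my own working): by Bayes' formula the answer is roughly 2%, because the 5% false-positive rate among the large HIV-negative population swamps the true positives.

p_H = 0.001                      # prevalence
p_pos_given_H = 0.99             # sensitivity
p_pos_given_not_H = 1 - 0.95     # false-positive rate

p_pos = p_pos_given_H * p_H + p_pos_given_not_H * (1 - p_H)
print(p_pos_given_H * p_H / p_pos)   # about 0.019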

Example: Alice and Bob play a game where A tosses a coin; A wins $1 if it lands H and loses $1 if it lands T. B is surprised to find that he loses the first ten times they play. If B's prior belief is that the chance of A having a two-headed coin is 0.01, what is his posterior belief? Note. Prior and posterior beliefs are assessments of probability before and after seeing an outcome. The outcome is called data or evidence. Solution.
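
A sketch of the posterior calculation (my own working): B losing ten times means seeing ten heads, which has probability 1 under a two-headed coin and (1/2)^10 under a fair coin.

p_two_headed = 0.01
p_data_given_two_headed = 1.0
p_data_given_fair = 0.5 ** 10

p_data = p_data_given_two_headed * p_two_headed + p_data_given_fair * (1 - p_two_headed)
print(p_data_given_two_headed * p_two_headed / p_data)   # about 0.91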

Independence. Intuitively, E is independent of F if the chance of E occurring is not affected by whether F occurs. Formally, P(E | F) = P(E). (1) We say that E and F are independent if P(E ∩ F) = P(E) · P(F). (2) Note. (1) and (2) are equivalent whenever P(F) > 0.

Independence. P(E ∩ F) = P(E) · P(F). (2) Note 1. It is clear from (2) that independence is a symmetric relationship. Also, (2) remains properly defined even when P(F) = 0. Note 2. (1) gives a useful way to think about independence; (2) is usually better for doing the math.

Note: We can study a probabilistic model and determine whether certain events are independent, or we can define our probabilistic model via independence. Example: Suppose a biased coin comes up heads with probability p, independently of other flips. P(n heads in n flips) = p^n. P(n tails in n flips) = (1 − p)^n. P(HHTHTTT) = p^2 (1 − p) p (1 − p)^3 = p^{#H} (1 − p)^{#T}. P(exactly k heads in n flips) = C(n, k) p^k (1 − p)^{n−k}.
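
A small sketch (my addition) that checks the binomial formula against brute-force enumeration of all 2^n sequences; the parameter values are arbitrary.

from itertools import product
from math import comb

def p_exactly_k_heads(n, k, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def p_by_enumeration(n, k, p):
    # sum the probability of every length-n sequence with exactly k heads
    return sum(p**k * (1 - p)**(n - k)
               for seq in product("HT", repeat=n) if seq.count("H") == k)

print(p_exactly_k_heads(7, 3, 0.6), p_by_enumeration(7, 3, 0.6))   # agree up to rounding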

Example 1: Independence can be obvious Draw a card from a shuffled deck of 52 cards. Let E = card is a spade and F = card is an ace. Are E and F independent? Solution
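
A one-line check of definition (2) for this example (my own sketch): P(E) = 13/52, P(F) = 4/52, and P(E ∩ F) = 1/52.

from fractions import Fraction

p_E, p_F, p_EF = Fraction(13, 52), Fraction(4, 52), Fraction(1, 52)
print(p_EF == p_E * p_F)   # True, so E and F are independent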

Example 2: Independence can be surprising. Toss a coin 3 times. Define A = {at most one T} = {HHH, HHT, HTH, THH} and B = {both H and T occur} = {HHH, TTT}^c. Are A and B independent? Solution.
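
An enumeration sketch of my own: over the 8 equally likely outcomes, P(A) = 1/2, P(B) = 3/4, and P(A ∩ B) = 3/8, which is exactly the product.

from itertools import product
from fractions import Fraction

outcomes = list(product("HT", repeat=3))
A = {o for o in outcomes if o.count("T") <= 1}
B = {o for o in outcomes if "H" in o and "T" in o}

prob = lambda ev: Fraction(len(ev), len(outcomes))
print(prob(A & B) == prob(A) * prob(B))   # True: 3/8 = (1/2) * (3/4)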

Proposition. If E and F are independent, then so are E and F^c. Proof.
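
One possible way to fill in the proof (a sketch of my own, not necessarily the lecture's argument): since E splits into the disjoint pieces E ∩ F and E ∩ F^c, P(E ∩ F^c) = P(E) − P(E ∩ F) = P(E) − P(E) · P(F) = P(E) · (1 − P(F)) = P(E) · P(F^c).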

Independence as an Assumption. It is often convenient to suppose independence. People sometimes assume it without noticing. Example. A sky diver has two chutes. Let E = {main chute opens}, P(E) = 0.98; F = {backup opens}, P(F) = 0.90. Find the chance that at least one opens, making any necessary assumption clear. Note. Assuming independence does not justify the assumption! Both chutes could fail because of the same rare event, such as freezing rain.
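
A sketch of my own working, assuming the two chutes open independently (the assumption the example asks you to make explicit):

p_main, p_backup = 0.98, 0.90
# P(at least one opens) = 1 - P(both fail), assuming independence
print(1 - (1 - p_main) * (1 - p_backup))   # approximately 1 - 0.02 * 0.10 = 0.998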

Independence of Several Events. Three events E, F, G are independent if P(E ∩ F) = P(E) · P(F), P(F ∩ G) = P(F) · P(G), P(E ∩ G) = P(E) · P(G), and P(E ∩ F ∩ G) = P(E) · P(F) · P(G). If E, F, G are independent, then E will be independent of any event formed from F and G. Example. Show that E is independent of F ∪ G. Proof.

Pairwise Independence. E, F and G are pairwise independent if E is independent of F, F is independent of G, and E is independent of G. Example. Toss a coin twice. Set E = {HH, HT}, F = {TH, HH} and G = {HH, TT}. (a) Show that E, F and G are pairwise independent. (b) By considering P(E ∩ F ∩ G), show that E, F and G are NOT independent.
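
An enumeration sketch of my own for both parts, over the 4 equally likely outcomes of two tosses:

from itertools import product
from fractions import Fraction

outcomes = list(product("HT", repeat=2))
E = {("H", "H"), ("H", "T")}     # first toss is H
F = {("T", "H"), ("H", "H")}     # second toss is H
G = {("H", "H"), ("T", "T")}     # both tosses agree

prob = lambda ev: Fraction(len(ev), len(outcomes))
print(prob(E & F) == prob(E) * prob(F),          # True
      prob(F & G) == prob(F) * prob(G),          # True
      prob(E & G) == prob(E) * prob(G))          # True: pairwise independent
print(prob(E & F & G) == prob(E) * prob(F) * prob(G))   # False: 1/4 != 1/8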

Duel. Suppose that Alice and Bob are involved in a duel. The rules of the duel are that they are to pick up their guns and shoot at each other simultaneously. If one or both are hit, then the duel is over. If both shots miss, then they repeat the process. Suppose that the results of the shots are independent, that each shot of Alice hits Bob with probability p_A, and that each shot of Bob hits Alice with probability p_B. What is (a) the probability that the duel ends after the nth round of shots, (b) the probability that both duelists are hit, and (c) the conditional probability that the duel ends after the nth round of shots given that both duelists are hit?
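
A sketch of my own working (reading "ends after the nth round" as "ends on round n"): with q = (1 − p_A)(1 − p_B) the probability that a round is a double miss, the three answers come out as below; note that (c) equals (a), so the number of rounds is independent of how the duel ends. The function name and values are illustrative.

def duel_answers(pA, pB, n):
    q = (1 - pA) * (1 - pB)                    # both miss in a given round
    p_ends_round_n = q**(n - 1) * (1 - q)      # (a) n-1 double misses, then a hit
    p_both_hit = pA * pB / (1 - q)             # (b) sum over rounds of q^(k-1) * pA * pB
    p_ends_round_n_given_both_hit = q**(n - 1) * pA * pB / p_both_hit   # (c)
    return p_ends_round_n, p_both_hit, p_ends_round_n_given_both_hit

print(duel_answers(0.3, 0.4, n=2))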

Baseball card collecting. Suppose that you continually collect baseball cards and that there are m different types. Suppose also that each time a new card is obtained, it is a type i card with probability p_i, 1 ≤ i ≤ m. Suppose that you have just collected your nth card. What is the probability that this is a new type?
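
A sketch of my own working: the nth card is a new type exactly when its type did not appear among the first n − 1 cards, so the answer is Σ_i p_i (1 − p_i)^{n−1}. The function name and values below are just illustrative.

def prob_nth_card_is_new(p, n):
    # condition on the type of the nth card and use independence of draws
    return sum(p_i * (1 - p_i)**(n - 1) for p_i in p)

print(prob_nth_card_is_new([0.5, 0.3, 0.2], n=4))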