Conditional Probability & Independence. Conditional Probabilities


Conditional Probability & Independence

Conditional Probabilities. Question: How should we modify P(E) if we learn that event F has occurred? Definition: the conditional probability of E given F is

P(E|F) = P(E ∩ F) / P(F), for P(F) > 0

Conditional probabilities are useful because: Often we want to calculate probabilities when some partial information about the result of the probabilistic experiment is available. Conditional probabilities are also useful for computing regular (unconditional) probabilities.

Example 1. 2 random cards are selected from a deck of cards. What is the probability that both cards are aces, given that one of the cards is the ace of spades? What is the probability that both cards are aces, given that at least one of the cards is an ace?
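Both answers can be checked by brute-force enumeration of all C(52, 2) = 1326 two-card hands; this sketch encodes a standard deck as rank+suit strings.

```python
from itertools import combinations
from fractions import Fraction

ranks = "23456789TJQKA"
suits = "SHDC"
deck = [r + s for r in ranks for s in suits]   # 52 cards, "AS" = ace of spades
aces = {"AS", "AH", "AD", "AC"}

hands = list(combinations(deck, 2))            # all 1326 unordered hands

# Condition on the hand containing the ace of spades.
has_as = [h for h in hands if "AS" in h]
p_given_as = Fraction(sum(set(h) <= aces for h in has_as), len(has_as))

# Condition on the hand containing at least one ace.
has_ace = [h for h in hands if aces & set(h)]
p_given_any_ace = Fraction(sum(set(h) <= aces for h in has_ace), len(has_ace))

print(p_given_as)       # 1/17
print(p_given_any_ace)  # 1/33
```

The two conditioning events are different, which is why the answers differ: knowing *which* ace is in the hand gives more information than knowing only that some ace is.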

Example 2. Deal a 5-card poker hand, and let E = {at least 2 aces}, F = {at least 1 ace}, G = {hand contains ace of spades}. (a) Find P(E) (b) Find P(E|F) (c) Find P(E|G)

Conditional probability satisfies the usual probability axioms. Suppose (S, P(·)) is a probability space. Then (S, P(·|F)) is also a probability space (for F ⊆ S with P(F) > 0):

0 ≤ P(ω|F) ≤ 1

Σ_{ω∈S} P(ω|F) = 1

If E1, E2, ... are disjoint, then P(∪_{i=1}^∞ Ei | F) = Σ_{i=1}^∞ P(Ei|F)

Thus all our previous propositions for probabilities give analogous results for conditional probabilities. Examples:

P(E^c|F) = 1 − P(E|F)

P(A ∪ B | F) = P(A|F) + P(B|F) − P(A ∩ B | F)

The Multiplication Rule. Re-arranging the conditional probability formula gives

P(E ∩ F) = P(F) P(E|F)

This is often useful in computing the probability of the intersection of events. Example. Draw 2 balls at random without replacement from an urn with 8 red balls and 4 white balls. Find the chance that both are red.
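A quick check of the urn example with exact arithmetic: P(both red) = P(first red) · P(second red | first red).

```python
from fractions import Fraction

# Urn with 8 red and 4 white balls; draw 2 without replacement.
p_first_red = Fraction(8, 12)               # 8 of 12 balls are red
p_second_red_given_first = Fraction(7, 11)  # 7 red remain among 11 balls
p_both_red = p_first_red * p_second_red_given_first

print(p_both_red)  # 14/33
```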

The General Multiplication Rule.

P(E1 ∩ E2 ∩ ... ∩ En) = P(E1) P(E2|E1) P(E3|E1 ∩ E2) ... P(En|E1 ∩ E2 ∩ ... ∩ E_{n−1})

Example 1. Alice and Bob play a game as follows. A die is thrown, and each time it is thrown it is equally likely to show any of the 6 numbers. If it shows 5, Alice wins. If it shows 1, 2 or 6, Bob wins. Otherwise, they play a second round, and so on. Find P(An), for An = {Alice wins on nth round}.
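For the die game, the general multiplication rule chains the rounds: the game survives each round with probability 2/6 (a 3 or 4 is thrown) and Alice wins a round with probability 1/6 (a 5), so P(An) = (1/3)^(n−1) · (1/6). A small sketch:

```python
from fractions import Fraction

def p_alice_wins_round(n):
    # n-1 rounds showing 3 or 4 (probability 2/6 each), then a 5 (probability 1/6)
    return Fraction(2, 6) ** (n - 1) * Fraction(1, 6)

print(p_alice_wins_round(1))  # 1/6
print(p_alice_wins_round(2))  # 1/18
```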

Example 2. I have n keys, one of which opens a lock. Trying keys at random without replacement, find the chance that the kth try opens the lock.
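Here the general multiplication rule telescopes: the first k−1 tries must fail and the kth must succeed, and the product collapses to 1/n for every k. A sketch that verifies this numerically:

```python
from fractions import Fraction

def p_opens_on_kth(n, k):
    p = Fraction(1)
    for i in range(k - 1):                   # the first k-1 keys all fail
        p *= Fraction(n - 1 - i, n - i)      # wrong keys left / keys left
    return p * Fraction(1, n - (k - 1))      # the kth key is the right one

# The answer is 1/n regardless of k:
assert all(p_opens_on_kth(10, k) == Fraction(1, 10) for k in range(1, 11))
```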

The Law of Total Probability. We know that P(E) = P(E ∩ F) + P(E ∩ F^c). Using the definition of conditional probability,

P(E) = P(E|F) P(F) + P(E|F^c) P(F^c)

This is extremely useful. It may be difficult to compute P(E) directly, but easy to compute it once we know whether or not F has occurred. To generalize, say events F1, ..., Fn form a partition if they are disjoint and ∪_{i=1}^n Fi = S. Since E ∩ F1, E ∩ F2, ..., E ∩ Fn are disjoint and their union is E, P(E) = Σ_{i=1}^n P(E ∩ Fi). Applying the definition of conditional probability gives the law of total probability,

P(E) = Σ_{i=1}^n P(E|Fi) P(Fi)

Example 1. Eric's girlfriend comes round on a given evening with probability 0.4. If she does not come round, the chance Eric watches The Wire is 0.8. If she does, this chance drops to 0.3. Find the probability that Eric gets to watch The Wire.

Bayes' Formula. Sometimes P(E|F) may be specified and we would like to find P(F|E). Example 2. I call Eric and he says he is watching The Wire. What is the chance his girlfriend is around? A simple manipulation gives Bayes' formula,

P(F|E) = P(E|F) P(F) / P(E)

Combining this with the law of total probability,

P(F|E) = P(E|F) P(F) / [P(E|F) P(F) + P(E|F^c) P(F^c)]
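Putting the two Eric examples together: the law of total probability gives P(Wire), and Bayes' formula then inverts the conditioning.

```python
from fractions import Fraction

p_f = Fraction(4, 10)             # P(girlfriend comes round)
p_wire_given_f = Fraction(3, 10)  # P(Wire | girlfriend)
p_wire_given_fc = Fraction(8, 10) # P(Wire | no girlfriend)

# Law of total probability:
p_wire = p_wire_given_f * p_f + p_wire_given_fc * (1 - p_f)
print(p_wire)           # 3/5

# Bayes' formula: P(girlfriend | watching The Wire)
p_f_given_wire = p_wire_given_f * p_f / p_wire
print(p_f_given_wire)   # 1/5
```

Watching The Wire is evidence *against* the girlfriend being around: the probability drops from 0.4 to 0.2.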

Sometimes conditional probability calculations can give quite unintuitive results. Example 3. I have three cards. One is red on both sides, another is red on one side and black on the other, the third is black on both sides. I shuffle the cards and put one on the table, so you can see that the upper side is red. What is the chance that the other side is black? Is it 1/2, or > 1/2, or < 1/2? Solution
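One way to see the answer is to enumerate the six equally likely (visible side, hidden side) outcomes: each card is equally likely, and either of its sides is equally likely to face up.

```python
from fractions import Fraction

# The three cards as (side1, side2) pairs: red/red, red/black, black/black.
cards = [("R", "R"), ("R", "B"), ("B", "B")]

# Six equally likely (up, down) outcomes: either side of each card may face up.
outcomes = [(a, b) for a, b in cards] + [(b, a) for a, b in cards]

red_up = [o for o in outcomes if o[0] == "R"]
p_other_black = Fraction(sum(down == "B" for up, down in red_up), len(red_up))

print(p_other_black)  # 1/3
```

Conditioning on "the visible side is red" leaves three outcomes, two of which come from the red/red card, so the answer is less than 1/2.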

Example: Spam Filtering 60% of email is spam. 10% of spam has the word Viagra. 1% of non-spam has the word Viagra. Let V be the event that a message contains the word Viagra. Let J be the event that the message is spam. What is the probability of J given V? Solution.
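With the stated numbers, Bayes' formula combined with the law of total probability gives P(J|V) directly:

```python
from fractions import Fraction

p_j = Fraction(6, 10)             # P(spam)
p_v_given_j = Fraction(1, 10)     # P("Viagra" | spam)
p_v_given_jc = Fraction(1, 100)   # P("Viagra" | non-spam)

p_v = p_v_given_j * p_j + p_v_given_jc * (1 - p_j)
p_j_given_v = p_v_given_j * p_j / p_v

print(p_j_given_v)         # 15/16
print(float(p_j_given_v))  # 0.9375
```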

Discussion problem. Suppose 99% of people with HIV test positive, 95% of people without HIV test negative, and 0.1% of people have HIV. What is the chance that someone testing positive has HIV?
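The discussion problem is a classic base-rate calculation; with exact arithmetic:

```python
from fractions import Fraction

p_hiv = Fraction(1, 1000)               # prevalence: 0.1%
p_pos_given_hiv = Fraction(99, 100)     # sensitivity: 99%
p_pos_given_no = 1 - Fraction(95, 100)  # false positive rate: 5%

p_pos = p_pos_given_hiv * p_hiv + p_pos_given_no * (1 - p_hiv)
p_hiv_given_pos = p_pos_given_hiv * p_hiv / p_pos

print(p_hiv_given_pos)         # 11/566
print(float(p_hiv_given_pos))  # about 0.019
```

Despite the accurate test, under 2% of positives actually have HIV, because the disease is so rare that false positives swamp true positives.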

Example: Statistical inference via Bayes' formula. Alice and Bob play a game where Bob tosses a coin, and wins $1 if it lands on H or loses $1 on T. Alice is surprised to find that she loses the first ten times they play. If Alice's prior belief is that the chance of Bob having a two-headed coin is 0.01, what is her posterior belief? Note. Prior and posterior beliefs are assessments of probability before and after seeing an outcome. The outcome is called data or evidence. Solution.
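A sketch of the posterior update, assuming (as the setup implies) that a fair coin gives heads with probability 1/2 on each independent toss:

```python
from fractions import Fraction

prior_biased = Fraction(1, 100)          # prior belief in a two-headed coin
p_data_given_biased = Fraction(1)        # two-headed coin: 10 heads is certain
p_data_given_fair = Fraction(1, 2) ** 10 # fair coin: (1/2)^10

# Law of total probability over the two hypotheses, then Bayes' formula:
p_data = (p_data_given_biased * prior_biased
          + p_data_given_fair * (1 - prior_biased))
posterior_biased = p_data_given_biased * prior_biased / p_data

print(posterior_biased)         # 1024/1123
print(float(posterior_biased))  # about 0.91
```

Ten losses in a row move the belief in the two-headed coin from 1% to over 91%.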

Example: A plane is missing, and it is equally likely to have gone down in any of three possible regions. Let βi be the probability that the plane will be found in region i given that it is actually there. What is the conditional probability that the plane is in the second region, given that a search of the first region is unsuccessful?
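Conditioning on the failed search of region 1, Bayes' formula gives P(region 2 | miss) = (1/3) / ((1 − β1)/3 + 2/3) = 1/(3 − β1). The βi are left unspecified in the problem, so this sketch plugs in an illustrative value β1 = 0.9:

```python
from fractions import Fraction

beta1 = Fraction(9, 10)   # assumed detection probability in region 1
prior = Fraction(1, 3)    # plane equally likely in each region

# P(search of region 1 fails) = P(there but missed) + P(in region 2 or 3)
p_miss = (1 - beta1) * prior + prior + prior
p_r2_given_miss = prior / p_miss

print(p_r2_given_miss)  # 10/21, i.e. 1/(3 - beta1)
```

A failed search raises the probability for the unsearched regions above 1/3, and the better the search (larger β1), the more it rises.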

Independence. Intuitively, E is independent of F if the chance of E occurring is not affected by whether F occurs. Formally,

P(E|F) = P(E) (1)

We say that E and F are independent if

P(E ∩ F) = P(E) P(F) (2)

Note. (1) and (2) are equivalent when P(F) > 0. Note 1. It is clear from (2) that independence is a symmetric relationship. Also, (2) remains properly defined even when P(F) = 0. Note 2. (1) gives a useful way to think about independence; (2) is usually better for doing the math.

Proposition. If E and F are independent, then so are E and F^c. Proof.

Example 1: Independence can be obvious. Draw a card from a shuffled deck of 52 cards. Let E = {card is a spade} and F = {card is an ace}. Are E and F independent? Solution. Example 2: Independence can be surprising. Toss a coin 3 times. Define A = {at most one T} = {HHH, HHT, HTH, THH}, B = {both H and T occur} = {HHH, TTT}^c. Are A and B independent? Solution.
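Example 2 can be checked by listing the 8 equally likely outcomes of three tosses:

```python
from fractions import Fraction
from itertools import product

outcomes = ["".join(t) for t in product("HT", repeat=3)]  # 8 outcomes

A = {o for o in outcomes if o.count("T") <= 1}            # at most one T
B = {o for o in outcomes if "H" in o and "T" in o}        # both H and T occur

def prob(event):
    return Fraction(len(event), len(outcomes))

# Surprisingly, A and B are independent: 3/8 = (1/2) * (3/4)
assert prob(A & B) == prob(A) * prob(B)
```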

Independence as an Assumption. It is often convenient to suppose independence. People sometimes assume it without noticing. Example. A skydiver has two chutes. Let E = {main chute opens}, P(E) = 0.98; F = {backup opens}, P(F) = 0.90. Find the chance that at least one opens, making any necessary assumption clear. Note. Assuming independence does not justify the assumption! Both chutes could fail because of the same rare event, such as freezing rain.
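Assuming independence (the assumption the example asks us to make explicit), P(at least one opens) = 1 − P(both fail):

```python
from fractions import Fraction

p_main_fails = 1 - Fraction(98, 100)     # 0.02
p_backup_fails = 1 - Fraction(90, 100)   # 0.10

# Under the independence assumption, the failure probabilities multiply:
p_at_least_one = 1 - p_main_fails * p_backup_fails

print(float(p_at_least_one))  # 0.998
```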

Independence of Several Events. Three events E, F, G are independent if

P(E ∩ F) = P(E) P(F)
P(F ∩ G) = P(F) P(G)
P(E ∩ G) = P(E) P(G)
P(E ∩ F ∩ G) = P(E) P(F) P(G)

If E, F, G are independent, then E will be independent of any event formed from F and G. Example. Show that E is independent of F ∪ G. Proof.

Pairwise Independence. E, F and G are pairwise independent if E is independent of F, F is independent of G, and E is independent of G. Example. Toss a coin twice. Set E = {HH, HT}, F = {TH, HH} and G = {HH, TT}. (a) Show that E, F and G are pairwise independent. (b) By considering P(E ∩ F ∩ G), show that E, F and G are NOT independent. Note. Another way to see the dependence is that P(E | F ∩ G) = 1 ≠ P(E).
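Checking (a) and (b) by enumeration over the 4 equally likely outcomes of two tosses:

```python
from fractions import Fraction
from itertools import product

outcomes = ["".join(t) for t in product("HT", repeat=2)]  # HH, HT, TH, TT
E, F, G = {"HH", "HT"}, {"TH", "HH"}, {"HH", "TT"}

def prob(event):
    return Fraction(len(event), len(outcomes))

# (a) every pair is independent: each intersection is {HH}, with prob 1/4
for X, Y in [(E, F), (F, G), (E, G)]:
    assert prob(X & Y) == prob(X) * prob(Y)

# (b) but the three events are NOT independent: 1/4 != 1/8
assert prob(E & F & G) != prob(E) * prob(F) * prob(G)
```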

Example: Insurance policies Insurance companies categorize people into two groups: accident prone (30%) or not. An accident prone person will have an accident within one year with probability 0.4; otherwise, 0.2. What is the conditional probability that a new policyholder will have an accident in his second year, given that the policyholder has had an accident in the first year?
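This is a two-stage calculation: condition on the policyholder's type, then apply Bayes' formula. The sketch below additionally assumes (as is usual for this example) that, given the type, accidents in different years are independent:

```python
from fractions import Fraction

p_prone = Fraction(3, 10)      # P(accident prone)
p_acc_prone = Fraction(4, 10)  # P(accident in a year | prone)
p_acc_safe = Fraction(2, 10)   # P(accident in a year | not prone)

# Law of total probability over the two types:
p_a1 = p_prone * p_acc_prone + (1 - p_prone) * p_acc_safe
# Given the type, assume years are independent:
p_a1_and_a2 = p_prone * p_acc_prone**2 + (1 - p_prone) * p_acc_safe**2

p_a2_given_a1 = p_a1_and_a2 / p_a1
print(p_a2_given_a1)  # 19/65, about 0.29
```

Note that 19/65 ≈ 0.29 exceeds P(accident in year 1) = 0.26: a first-year accident is evidence that the policyholder is accident prone.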

Note: We can study a probabilistic model and determine whether certain events are independent, or we can define our probabilistic model via independence. Example: Suppose a biased coin comes up heads with probability p, independently of other flips. Then

P(n heads in n flips) = p^n
P(n tails in n flips) = (1 − p)^n
P(exactly k heads in n flips) = C(n, k) p^k (1 − p)^{n−k}
P(HHTHTTT) = p^2 (1 − p) p (1 − p)^3 = p^{#H} (1 − p)^{#T}
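A sketch that defines the model via independence and sanity-checks the binomial formula against brute-force enumeration, for an assumed bias p = 1/3:

```python
from fractions import Fraction
from itertools import product
from math import comb

p = Fraction(1, 3)  # assumed bias of the coin

def p_exactly_k_heads(n, k):
    # C(n, k) sequences, each with probability p^k (1-p)^(n-k)
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Check against summing p^{#H} (1-p)^{#T} over all 2^n sequences:
n = 5
for k in range(n + 1):
    brute = sum(p**s.count("H") * (1 - p)**s.count("T")
                for s in map("".join, product("HT", repeat=n))
                if s.count("H") == k)
    assert brute == p_exactly_k_heads(n, k)
```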