Conditional Probability


Conditional Probability

When we obtain additional information about a probability experiment, we want to use that information to reassess the probabilities of events.

Example: A box has 5 computer chips. Two are defective. A random sample of size 2 is selected from the box. (All subsets of size 2 are equally likely.)

1. Compute the probability that the second chip is defective.

Intuition/symmetry: P(second chip defective) = 2/5.

More formally,

P(second chip defective) = (# ordered outcomes with second chip defective) / (# ways to draw two chips in order) = (4)(2) / ((5)(4)) = 8/20 = 2/5.

2. If we know that the first chip is good, what is the probability that the second chip is defective?
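Both counts can be checked by brute-force enumeration of the ordered draws. A minimal sketch (which two chips are labeled defective is an arbitrary choice for illustration):

```python
from itertools import permutations

chips = range(5)
defective = {0, 1}  # assume chips 0 and 1 are the defective ones

# All ordered draws of two distinct chips: 5 * 4 = 20 equally likely outcomes.
draws = list(permutations(chips, 2))
p_second_def = sum(1 for a, b in draws if b in defective) / len(draws)

# Condition on the first chip being good: restrict to those outcomes.
first_good = [(a, b) for a, b in draws if a not in defective]
p_second_def_given_first_good = sum(1 for a, b in first_good if b in defective) / len(first_good)

print(p_second_def)                    # 8/20 = 0.4 = 2/5
print(p_second_def_given_first_good)   # 6/12 = 0.5
```

The conditional answer, 1/2, is just the unconditional counting redone on the smaller set of outcomes consistent with "first chip good".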

Defn. The conditional probability of an event A given an event B is

P(A | B) := P(A ∩ B) / P(B),

provided P(B) ≠ 0.

The definition makes some sense... the conditional probability of A given B is the fraction of outcomes in B that are also in A.

An important implication of the definition is as follows:

P(A ∩ B) = P(A | B) P(B) = P(B | A) P(A).   (**)

(**) holds even if P(A) = 0 or P(B) = 0.

Example: Re-compute the probability that the second chip is defective given that the first chip is good, using the definition.

Example: More computer chips... A box has 500 computer chips with a speed of 400 MHz and 500 computer chips with a speed of 500 MHz. The numbers of good (G) and defective (D) chips at the two different speeds are as shown in the table below.

            400 MHz   500 MHz
    G         480       490      970
    D          20        10       30
    Total     500       500     1000

We select a chip at random and observe its speed. What is the probability that the chip is defective given that its speed is 400 MHz?
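With the table's counts, the conditional probability is a ratio of cell counts; a quick sketch of the computation via the definition:

```python
# Cell counts from the table, keyed by (quality, speed in MHz).
counts = {("G", 400): 480, ("D", 400): 20, ("G", 500): 490, ("D", 500): 10}
total = sum(counts.values())  # 1000 chips

p_400 = (counts[("G", 400)] + counts[("D", 400)]) / total   # P(400 MHz) = 500/1000
p_def_and_400 = counts[("D", 400)] / total                  # P(D ∩ 400 MHz) = 20/1000
p_def_given_400 = p_def_and_400 / p_400                     # definition of conditional probability

print(p_def_given_400)  # 20/500 = 0.04
```

Note that the denominator 500 is just the 400 MHz column total: conditioning shrinks the sample space to that column.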

Example: Consider three cards. One card has two green sides, one card has two red sides, and the third card has one green side and one red side: ({G, G}, {R, R}, {R, G}).

- I pick a card at random and show you a randomly selected side.
- What is the probability that the flip side is green, given that the side I show you is green?
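One way to see the answer is to enumerate the six equally likely (card, shown side) outcomes; the enumeration gives 2/3, not the tempting 1/2. A sketch:

```python
from fractions import Fraction

cards = [("G", "G"), ("R", "R"), ("R", "G")]  # each card's two sides

# Equally likely outcomes: pick a card, then show side 0 or side 1 (3 x 2 = 6).
outcomes = [(card, i) for card in cards for i in (0, 1)]

shown_green = [(card, i) for card, i in outcomes if card[i] == "G"]        # 3 outcomes
flip_green = [(card, i) for card, i in shown_green if card[1 - i] == "G"]  # 2 outcomes (both sides of GG)

p = Fraction(len(flip_green), len(shown_green))
print(p)  # 2/3
```

Intuitively, a shown green side is twice as likely to have come from the all-green card as from the mixed card.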

Independence

Sometimes, knowledge that B occurred does not change our assessment of P(A). Let's say I toss a fair coin. I tell you that I got a tail. I then give you the coin to toss. Does the knowledge that I got a tail affect what you think the chance is that you will get a head?

Intuitively, two events A and B are independent if the event B does not have any influence on the probability that A happens (and vice versa). Mathematically, independence of two events is defined as follows:

Defn. Two events A and B are called independent if P(A ∩ B) = P(A) P(B).

Result: If P(B) ≠ 0, then A and B are independent ⟺ P(A | B) = P(A).

Proof of Result: (HW... Use the definitions of conditional probability and independence.)

The result gives us another way to think of independence: the fraction of A out of B is the same as the fraction of A out of Ω.

Example: An alternative model for logging on to the AOL network using dial-up. Suppose I log on to AOL using dial-up. I connect successfully if and only if the phone line works and the AOL network works. The probability that the phone line works is .9, and the probability that the network works is .6. Suppose that the status of the phone line and the status of the AOL network are independent. What is the probability that I connect successfully?

Result: Events A and B are independent ⟺ A and Bᶜ are independent ⟺ Aᶜ and B are independent ⟺ Aᶜ and Bᶜ are independent.

Proof of Result: (HW... Use the definition of independence and consequence 1 of Kolmogorov's Axioms.)
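For the dial-up example above, independence reduces the connect probability to a product; a quick check:

```python
p_phone, p_network = 0.9, 0.6

# Independence: P(phone works ∩ network works) = P(phone works) * P(network works).
p_connect = p_phone * p_network

print(round(p_connect, 2))  # 0.54
```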

I defined independence of two events. We can also talk about independence of a collection of events.

Defn. Events A_1, ..., A_n are mutually independent if for any {i_1, ..., i_k} ⊆ {1, ..., n},

P(A*_{i_1} ∩ ... ∩ A*_{i_k}) = P(A*_{i_1}) ⋯ P(A*_{i_k}),

where each A*_{i_j} may be either A_{i_j} or its complement A_{i_j}ᶜ.

Events A_1, ..., A_n are pairwise independent if for any i ≠ j in {1, ..., n}, A_i and A_j are independent.

Note: Mutual independence implies pairwise independence, but pairwise independence does not imply mutual independence. (See supplementary exercises for HW 2/3.)
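The note above can be illustrated with a standard example (not from these notes): toss a fair coin twice and let A = "first toss heads", B = "second toss heads", C = "the two tosses agree". A sketch:

```python
from fractions import Fraction
from itertools import product

omega = list(product("HT", repeat=2))  # two fair coin tosses, 4 equally likely outcomes

A = {w for w in omega if w[0] == "H"}   # first toss is heads
B = {w for w in omega if w[1] == "H"}   # second toss is heads
C = {w for w in omega if w[0] == w[1]}  # the two tosses agree

def P(event):
    return Fraction(len(event), len(omega))

# Every pair multiplies, so A, B, C are pairwise independent.
pairwise_ok = all(P(X & Y) == P(X) * P(Y) for X, Y in [(A, B), (A, C), (B, C)])

# But the triple does not: P(A ∩ B ∩ C) = 1/4, while P(A)P(B)P(C) = 1/8.
mutual_ok = (P(A & B & C) == P(A) * P(B) * P(C))

print(pairwise_ok, mutual_ok)  # True False
```

Knowing any one of A, B, C tells you nothing about any other one, yet any two of them together determine the third.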

A Little Bit on Systems in Series, Systems in Parallel, and Reliability (Reference: Hofmann, pp. 17-18.)

A parallel system consists of k components c_1, ..., c_k arranged in such a way that the system works if and only if at least one of the k components functions properly. A series system consists of k components c_1, ..., c_k arranged in such a way that the system works if and only if all of the components function properly.

The system consisting of the AOL network and the phone line is an example of a series system: I connect only if both work.

The reliability of a system is the probability that the system works. For example, the reliability of the system consisting of the AOL network and the phone line is .54.

We can also construct larger systems with sub-systems that are connected in series and in parallel.

Example: Parallel system with k mutually independent components. Let c_1, ..., c_k denote the k components in a parallel system. Assume the k components operate independently, and P(c_j works) = p_j. What is the reliability of the system?

P(system works) = P(at least one component works)
= 1 − P(all components fail)
= 1 − P(c_1 fails and c_2 fails ... and c_k fails)
= 1 − ∏_{j=1}^k (1 − p_j).

Example: System in series with k mutually independent components. Let c_1, ..., c_k denote the k components in a system. Assume the k components are connected in series, operate independently, and P(c_j works) = p_j. What is the reliability of the system?

P(system works) = P(all components work) = ∏_{j=1}^k p_j.

Example: Let's compute the reliability of a system consisting of sub-systems connected in series and in parallel.
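The two formulas package naturally as helper functions, and sub-systems compose by feeding one function's output into the other. A minimal sketch (function names and the mixed-system numbers are mine):

```python
from math import prod

def series_reliability(ps):
    # Independent components in series: all must work.
    return prod(ps)

def parallel_reliability(ps):
    # Independent components in parallel: at least one must work.
    return 1 - prod(1 - p for p in ps)

# The phone-line/AOL numbers from the notes, wired in series:
print(round(series_reliability([0.9, 0.6]), 2))   # 0.54

# A hypothetical mixed system: two parallel pairs connected in series.
sub1 = parallel_reliability([0.9, 0.8])   # 1 - (0.1)(0.2) = 0.98
sub2 = parallel_reliability([0.7, 0.7])   # 1 - (0.3)(0.3) = 0.91
print(round(series_reliability([sub1, sub2]), 4))
```

Independence of all components is what justifies multiplying at every level of the composition.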

Disjointness and Independence are Different Ideas

Disjoint/Mutually Exclusive: P(A ∩ B) = P(∅) = 0. If I know B happened, then I know A did not happen: P(A | B) = 0.

Independent: P(A ∩ B) = P(A) P(B). Knowing that B happened tells me nothing about P(A): P(A | B) = P(A).

Law of Total Probability and Bayes Rule. This stuff is not new. The Law of Total Probability and Bayes Rule are just restatements of what we already know.

Example: A ridiculous game... Box 1 (B1) has two gold coins and one penny. Box 2 (B2) has one gold coin and two pennies. Box 3 (B3) has four gold coins and one penny.

- Player 1 rolls a fair 6-sided die. Call the outcome D. Player 1 picks a box according to the outcome of the die roll as follows:

    D = 1, 2: pick B1
    D = 3, 4, 5: pick B2
    D = 6: pick B3.

  Then, player 1 selects a coin at random from the chosen box and tells player 2 whether the coin is a gold coin or a penny.
- Player 2 then guesses which box the coin came from.
- If player 2 guesses correctly, then player 2 keeps the selected coin. Otherwise, player 1 keeps the chosen coin.

a.) What is the probability that player 1 selects a gold coin?
b.) What box will player 2 pick if player 1 selects a gold coin?
c.) What is the probability that player 2 guesses the correct box?
d.) Would you prefer to be player 1 or player 2?

a.) A tree diagram shows all possible outcomes of the two-step procedure.

- There are 3 distinct ways to get a gold coin: E_1 = (B1, G), E_2 = (B2, G), and E_3 = (B3, G).
- E_1, E_2, and E_3 are mutually disjoint.
- E_1 ∪ E_2 ∪ E_3 = G.
- Axiom (iii): P(G) = P(E_1 ∪ E_2 ∪ E_3) = P(E_1) + P(E_2) + P(E_3).
- By definition of conditional probability,

P(E_1) = P(B1 and G) = P(G | B1) P(B1) = (2/3)(1/3) = 2/9.

- Likewise,

P(E_2) = P(G | B2) P(B2) = (1/3)(1/2) = 1/6,
P(E_3) = P(G | B3) P(B3) = (4/5)(1/6) = 4/30.

- Then, P(G) = 2/9 + 1/6 + 4/30 = 47/90 ≈ .522.

*** We just used the Law of Total Probability to compute the probability of a gold coin.
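The same Law of Total Probability computation can be double-checked with exact fractions; a sketch:

```python
from fractions import Fraction

# Priors on the boxes from the die roll, and P(gold | box) from the coin counts.
p_box = {"B1": Fraction(2, 6), "B2": Fraction(3, 6), "B3": Fraction(1, 6)}
p_gold_given_box = {"B1": Fraction(2, 3), "B2": Fraction(1, 3), "B3": Fraction(4, 5)}

# Law of Total Probability: sum P(G | Bi) P(Bi) over the cover B1, B2, B3.
p_gold = sum(p_gold_given_box[b] * p_box[b] for b in p_box)

print(p_gold, float(p_gold))  # 47/90, about 0.522
```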

Defn. A collection of events B_1, ..., B_k is called a cover or partition of Ω if (i) the events are disjoint (B_i ∩ B_j = ∅ for i ≠ j), and (ii) the union of the events is Ω (∪_{i=1}^k B_i = Ω).

If we represent a multi-step procedure with a tree diagram, then the branches of the tree form a cover. We can also represent a cover with a different kind of diagram:

Thrm. Law of Total Probability: If the collection of events B_1, ..., B_k is a cover of Ω, and A is an event, then

P(A) = ∑_{i=1}^k P(A | B_i) P(B_i).

Proof of the Law of Total Probability: By definition of conditional probability,

P(A | B_i) P(B_i) = P(A ∩ B_i).

Because B_1, ..., B_k partition Ω, the events A ∩ B_1, ..., A ∩ B_k are disjoint, and ∪_{i=1}^k (A ∩ B_i) = A. By Axiom (iii),

P(A) = ∑_{i=1}^k P(A ∩ B_i) = ∑_{i=1}^k P(A | B_i) P(B_i).

Pictures for the law of total probability... A tree diagram... A Venn diagram...

b.) I tell you that I got a gold coin. Which box do you think it came from? Specifically, we want to compute P(B_j | G) for j = 1, 2, 3 and pick the highest one. By definition of conditional probability,

P(B_j | G) = P(B_j ∩ G) / P(G) = P(G | B_j) P(B_j) / P(G)
= P(G | B_j) P(B_j) / [P(G | B_1) P(B_1) + P(G | B_2) P(B_2) + P(G | B_3) P(B_3)].

P(B_1 | G) = (2/9) / (47/90) = 20/47
P(B_2 | G) = (1/6) / (47/90) = 15/47
P(B_3 | G) = (4/30) / (47/90) = 12/47

So player 2 should guess B1. To figure out these probabilities, we used Bayes rule.

Thrm. Bayes Rule: If B_1, ..., B_k is a cover or partition of Ω, and A is an event, then

P(B_j | A) = P(A | B_j) P(B_j) / ∑_{i=1}^k P(A | B_i) P(B_i).

Proof of Bayes Rule:

P(B_j | A) = P(B_j ∩ A) / P(A) = P(A | B_j) P(B_j) / P(A) = P(A | B_j) P(B_j) / ∑_{i=1}^k P(A | B_i) P(B_i),

where the last step uses the Law of Total Probability on P(A).

We can represent Bayes rule with tree diagrams and Venn diagrams as well.
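Applied to the gold-coin game, Bayes rule gives the posterior probability of each box, again checkable with exact fractions; a sketch:

```python
from fractions import Fraction

# Same model as before: priors from the die, P(gold | box) from the coin counts.
p_box = {"B1": Fraction(2, 6), "B2": Fraction(3, 6), "B3": Fraction(1, 6)}
p_gold_given_box = {"B1": Fraction(2, 3), "B2": Fraction(1, 3), "B3": Fraction(4, 5)}

# Denominator of Bayes rule = Law of Total Probability: P(G) = 47/90.
p_gold = sum(p_gold_given_box[b] * p_box[b] for b in p_box)

# Bayes rule: P(Bj | G) = P(G | Bj) P(Bj) / P(G).
posterior = {b: p_gold_given_box[b] * p_box[b] / p_gold for b in p_box}
best_guess = max(posterior, key=posterior.get)

print(posterior)   # B1: 20/47, B2: 15/47, B3: 12/47
print(best_guess)  # B1
```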