Lecture 4: Bayes' Theorem


Lecture 4: Bayes' Theorem
September 09, 2010

Lesson Plan
1. Bayes' Theorem
2. Simpson's Paradox
3. More worked problems

Why Study Probability?
A probability model describes the random phenomenon that determines how data are produced. Say we want to know whether a coin is fair.
Unknown parameter: the probability of heads.
Population: an infinite number of tosses.
Sample: 10 tosses, of which 4 are heads.
Estimate: the sample estimate 4/10 = 0.40.
We will later learn how to quantify the uncertainty in our estimate and whether it is considerably different from 0.5.

Bayes' Theorem
Let's consider the events A and B. By the conditional probability formula we have
P(A and B) = P(A | B) P(B)
P(A and B) = P(B | A) P(A).
By combining the two formulas we have
P(A | B) P(B) = P(B | A) P(A)
and therefore
P(A | B) = P(B | A) P(A) / P(B).
This is a simplified version of Bayes' theorem, first proposed by Thomas Bayes and published in 1763.
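To make the formula concrete, here is a minimal Python sketch (not from the original slides; the function name bayes_posterior and the check values are our own):

    # Bayes' theorem: P(A | B) = P(B | A) * P(A) / P(B).
    def bayes_posterior(p_b_given_a, p_a, p_b):
        return p_b_given_a * p_a / p_b

    # Sanity check with made-up numbers P(B | A) = 0.5, P(A) = 0.4, P(B) = 0.25:
    print(bayes_posterior(0.5, 0.4, 0.25))  # prints 0.8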

Bayes' Theorem
P(A | B) = P(B | A) P(A) / P(B).
We establish the relation between P(A | B) and P(B | A). The textbook calls this tree reversal. This is very important in Bayesian analysis, which we will see later in the course. Suppose we are interested in a parameter θ and we observe some data. Then
P(θ | data) = P(data | θ) P(θ) / P(data),
where P(data | θ) describes how the data are generated and P(θ) describes our prior information on θ.

Paternity Suits
Legal cases of disputed paternity in many countries are resolved using blood tests. Laboratories make genetic determinations concerning the mother, child, and alleged father. Suppose you are on a jury considering a paternity suit:
The mother has blood type O.
The alleged father has blood type AB.
The child has blood type B.

Paternity Suits
Here is some information we need to solve the problem. According to genetics:
There is a 50% chance that this child would have blood type B if the alleged father is the real father.
Based on the percentage of B genes in the population, there is a 9% chance that this child would have blood type B if the alleged father is not the real father.
Based on other evidence (e.g., testimonials, physical evidence, records) presented before the DNA test, you believe there is a 75% chance that the alleged father is the real father. This assessment is your prior belief.

Paternity Suits
Given:
P(son = B | father = true) = 0.5
P(son = B | father = false) = 0.09
P(father = true) = 0.75
Our goal is
P(father = true | son = B) = P(father = true and son = B) / P(son = B),
where
P(son = B) = P(son = B and father = true) + P(son = B and father = false).
(Draw a tree to see this.)

Paternity Suits
Given:
P(son = B | father = true) = 0.5
P(son = B | father = false) = 0.09
P(father = true) = 0.75
Therefore:
P(father = false) = 0.25
P(son = B and father = true) = 0.5 × 0.75
P(son = B and father = false) = 0.09 × 0.25
P(son = B) = 0.5 × 0.75 + 0.09 × 0.25
Finally,
P(father = true | son = B) = (0.5 × 0.75) / (0.5 × 0.75 + 0.09 × 0.25) = 0.9434
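A few lines of Python reproduce this arithmetic (a sketch we add for checking; the variable names are ours):

    # Paternity example: posterior probability that the alleged father is the real father.
    p_b_given_true = 0.5    # P(son = B | father = true)
    p_b_given_false = 0.09  # P(son = B | father = false)
    prior_true = 0.75       # P(father = true), the prior belief

    p_b = p_b_given_true * prior_true + p_b_given_false * (1 - prior_true)
    posterior = p_b_given_true * prior_true / p_b
    print(round(posterior, 4))  # prints 0.9434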

Paternity Suits: Impact of Prior
[Figure: the posterior P(father = true | son = B) plotted against the prior belief, with both axes running from 0.0 to 1.0.]
We see that
P(father = true | son = B) = 0.5 ρ / (0.5 ρ + 0.09 (1 − ρ)),
where ρ is the prior belief that the child belongs to the alleged father.
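To see the impact of the prior numerically, here is a small sketch standing in for the plotted curve (the grid of ρ values is an arbitrary choice of ours):

    # Posterior as a function of the prior belief rho.
    for rho in (0.1, 0.25, 0.5, 0.75, 0.9):
        posterior = 0.5 * rho / (0.5 * rho + 0.09 * (1 - rho))
        print(rho, round(posterior, 3))
    # prints: 0.1 0.382, 0.25 0.649, 0.5 0.847, 0.75 0.943, 0.9 0.98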

Roommate Example
The simple version of Bayes' Rule is just an application of the definition of conditional probability. Suppose that 90% of statistics majors are looking for a roommate, compared with 10% of econ majors and 50% of bio majors. And suppose that Duke has only those three majors, with 25% of the students majoring in stats, 45% majoring in econ, and the rest in bio (with no double majors). Suppose your friend finds a roommate. What is the probability that the roommate is a statistics major?

Roommate Example
From the definition of conditional probability,
P(stat | roommate) = P(stat and roommate) / P(roommate).
For the numerator,
P(stat and roommate) = P(roommate | stat) P(stat) = 0.9 × 0.25.
For the denominator (let d denote looking for a roommate),
P(roommate) = P((s and d) or (e and d) or (b and d))
= P(s and d) + P(e and d) + P(b and d)
= P(d | s) P(s) + P(d | e) P(e) + P(d | b) P(b)
= 0.9 × 0.25 + 0.1 × 0.45 + 0.5 × 0.3

Roommate Example
Putting everything together we have
P(stat | roommate) = P(stat and roommate) / P(roommate)
= (0.9 × 0.25) / (0.9 × 0.25 + 0.1 × 0.45 + 0.5 × 0.3)
= 0.5357
We can similarly compute
P(econ | roommate) = P(econ and roommate) / P(roommate)
= (0.1 × 0.45) / (0.9 × 0.25 + 0.1 × 0.45 + 0.5 × 0.3)
= 0.107

Roommate Example
P(bio | roommate) = P(bio and roommate) / P(roommate)
= (0.5 × 0.3) / (0.9 × 0.25 + 0.1 × 0.45 + 0.5 × 0.3)
= 0.357
Note that P(stat | roommate) + P(econ | roommate) + P(bio | roommate) = 1.
Reverse conditioning: do you see a pattern yet?
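Here is a short numerical check of all three reverse-conditional probabilities (our own sketch; the dictionary layout is an assumption, not from the slides):

    # Roommate example: P(major | roommate) for each major.
    p_major = {"stat": 0.25, "econ": 0.45, "bio": 0.30}          # P(major)
    p_room_given_major = {"stat": 0.9, "econ": 0.1, "bio": 0.5}  # P(roommate | major)

    p_room = sum(p_major[m] * p_room_given_major[m] for m in p_major)
    for m in p_major:
        print(m, round(p_major[m] * p_room_given_major[m] / p_room, 4))
    # prints: stat 0.5357, econ 0.1071, bio 0.3571 -- and these three sum to 1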

Law of Total Probability
In general, let A_1, ..., A_n be mutually exclusive and suppose that P(A_1 or A_2 or ... or A_n) = 1. Then the Bayes formula is
P(A_1 | B) = P(B | A_1) P(A_1) / P(B)
= P(B | A_1) P(A_1) / [ P(B | A_1) P(A_1) + ... + P(B | A_n) P(A_n) ].
Note that the reversal from P(B | A_k) to P(A_k | B) requires the calculation of P(B) using the conditional probabilities P(B | A_i).
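The formula translates directly into a small generic function (a sketch under our own naming, not from the slides):

    # General Bayes rule over a mutually exclusive, exhaustive partition A_1, ..., A_n.
    def posteriors(priors, likelihoods):
        """priors[i] = P(A_i), likelihoods[i] = P(B | A_i); returns all P(A_i | B)."""
        p_b = sum(p * l for p, l in zip(priors, likelihoods))  # law of total probability
        return [p * l / p_b for p, l in zip(priors, likelihoods)]

    # The paternity example as a two-event partition:
    print(posteriors([0.75, 0.25], [0.5, 0.09]))  # prints [0.9434..., 0.0566...]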

HIV Test
ELISA is a test for HIV. If a person has HIV, ELISA is positive with probability 0.997 (true positive). If a person does not have HIV, ELISA is negative with probability 0.985 (true negative). About 0.32% of the U.S. population has HIV.
Suppose you get an HIV test (e.g., as part of a marriage license) and your test comes back positive. What is the chance that you have HIV?
We can use Bayes' Rule. Let A_1 = {have HIV} and A_2 = {do not have HIV}.

HIV Test
P(A_1 | pos) = P(pos | A_1) P(A_1) / [ P(pos | A_1) P(A_1) + P(pos | A_2) P(A_2) ]
= (0.997 × 0.0032) / (0.997 × 0.0032 + (1 − 0.985) × (1 − 0.0032))
= 0.1758
So even though the test is positive, you are still unlikely to have HIV. This is because the background rate of HIV is quite low.
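The same arithmetic in Python (a sketch we add; the names sens and spec are ours):

    # HIV example: P(HIV | positive ELISA).
    sens = 0.997         # P(pos | HIV), true positive rate
    spec = 0.985         # P(neg | no HIV), true negative rate
    prevalence = 0.0032  # P(HIV) in the population

    p_pos = sens * prevalence + (1 - spec) * (1 - prevalence)
    print(round(sens * prevalence / p_pos, 4))  # prints 0.1759, the slide's 0.1758 up to rounding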

Buying a Car: Bayes with a Tree Diagram
Suppose you need to buy a car. Based on previous experience you know that 30% of the cars have a FAULTY transmission. To be sure, you bring the car to a mechanic. Based on his past experience you know that:
He correctly identifies cars with a good transmission in 80% of cases.
He correctly identifies cars with a bad transmission in 90% of cases.
What is the probability that you buy a faulty car if the mechanic says it is faulty? What is the probability that you buy a faulty car if the mechanic says it is good?

Buying a Car: Bayes with a Tree Diagram
[Figure: tree diagram for the car example, branching first on the true state of the transmission and then on the mechanic's verdict.]

Buying a Car: Bayes with a Tree Diagram
Write F and G for a faulty or good transmission, and F* and G* for the mechanic saying "faulty" or "good". From the tree it turns out that:
true positive: P(F | F*) = 27% / (27% + 14%) = 66%
false positive: P(F | G*) = 3% / (3% + 56%) = 5%
true negative: P(G | G*) = 56% / (3% + 56%) = 95%
false negative: P(G | F*) = 14% / (27% + 14%) = 34%
The tree model is easy to understand, but it is not practical when you have many cases! Here we only have two.

Buying a Car: Bayes without a Tree Diagram
With the same notation, we are given:
P(F) = 30%, P(G) = 70%
P(F* | F) = 90%, P(G* | F) = 10%
P(G* | G) = 80%, P(F* | G) = 20%
Then
P(F | F*) = P(F* | F) P(F) / P(F*) = (90% × 30%) / (90% × 30% + 20% × 70%) = 66%
P(F | G*) = P(G* | F) P(F) / P(G*) = (10% × 30%) / (10% × 30% + 80% × 70%) = 5%
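Both posteriors can be verified in a few lines (our sketch; the verdicts F* and G* are encoded in the variable names):

    # Car example: P(faulty | mechanic's verdict).
    p_faulty, p_good = 0.30, 0.70
    p_sayF_given_faulty = 0.90  # mechanic correctly flags a bad transmission
    p_sayG_given_good = 0.80    # mechanic correctly clears a good transmission

    p_sayF = p_sayF_given_faulty * p_faulty + (1 - p_sayG_given_good) * p_good
    p_sayG = 1 - p_sayF
    print(round(p_sayF_given_faulty * p_faulty / p_sayF, 2))        # 0.66 when he says "faulty"
    print(round((1 - p_sayF_given_faulty) * p_faulty / p_sayG, 2))  # 0.05 when he says "good"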

The Monty Hall Problem with Bayes
Define O_xy to be the event that the TV show presenter opens door x when you pick door y, and W_x to be the event that door x is the winning door. Suppose you pick door A and the presenter opens door B; you are then interested in knowing P(W_A | O_BA). Conditional probability tells us that
P(W_A | O_BA) = P(W_A and O_BA) / P(O_BA) = P(O_BA | W_A) P(W_A) / P(O_BA)

The Monty Hall Problem with Bayes
P(W_A) = 1/3 represents the probability that the door you picked is the winning one!
P(O_BA | W_A) = 1/2 is the probability that the TV show presenter opens door B given that you picked door A and A is the winning door.
Finally, the marginal probability that the TV show presenter opens door B is
P(O_BA) = P(W_A and O_BA) + P(W_B and O_BA) + P(W_C and O_BA) = 1/6 + 0 + 1/3 = 3/6

The Monty Hall Problem with Bayes
Note that P(W_A) represents our prior distribution (personal probability). Opening door B represents the data that update our personal probability. The marginal probability P(O_BA) re-weights the numerator, and here the data turn out to be irrelevant for door A: the posterior equals the prior,
P(W_A | O_BA) = (1/2 × 1/3) / (3/6) = 1/3.
Since P(W_C | O_BA) = 1 − 1/3 − 0 = 2/3, you double your chance of winning by switching to door C.
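As a closing check, a small Monte Carlo simulation (our own sketch, not part of the lecture) agrees with the Bayes calculation:

    # Monty Hall: estimate P(W_A | O_BA) by simulation. You always pick door A.
    import random

    opened_b = stay_wins = 0
    for _ in range(100_000):
        winning = random.choice("ABC")
        if winning == "A":
            opened = random.choice("BC")  # presenter may open either losing door
        elif winning == "B":
            opened = "C"                  # presenter never opens the winning door
        else:
            opened = "B"
        if opened == "B":
            opened_b += 1
            stay_wins += (winning == "A")

    print(stay_wins / opened_b)  # close to 1/3, so switching to door C wins about 2/3 of the time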