Bayes' Formula. MATH 107: Finite Mathematics, University of Louisville. March 26, 2014


Test Accuracy

Conditional reversal

A motivating question

A rare disease occurs in 1 out of every 10,000 people. A test for this disease is 99.9% accurate (that is, it correctly determines whether the disease is present or not 99.9% of the time). You tested positive for the disease. Is the probability you actually have the disease about... 1%? 10%? 50%? 90%? 99%?

Believe it or not, the majority of positive test results are false, despite this test's high accuracy!

Justification of that result

There are four possible classifications for any person: they could have the disease or not, and could test positive or negative. Let's look at how we would expect 10,000,000 people to be classified:

                    Test positive   Test negative        Total
    Have disease              999               1        1,000
    No disease              9,999       9,989,001    9,999,000
    Total                  10,998       9,989,002   10,000,000

1 of every 10,000 people has the disease. In each category, 1 of every 1,000 people gets the wrong test result. We know we're one of the 10,998 people who test positive, but only 999 of those have the disease, so our chance of having the disease is 999/10,998 ≈ 9%.

What we're doing here: as trees

We could have phrased the original setup in the problem as a tree:

    Start ─┬─ 0.0001 ─ Disease ────┬─ 0.999 ─ Positive test
           │                       └─ 0.001 ─ Negative test
           └─ 0.9999 ─ No disease ─┬─ 0.001 ─ Positive test
                                   └─ 0.999 ─ Negative test

And then the problem raised was reversing the tree, branching on the test result first:

    Start ─┬─ ? ─ Positive test ─┬─ ? ─ Disease
           │                     └─ ? ─ No disease
           └─ ? ─ Negative test ─┬─ ? ─ Disease
                                 └─ ? ─ No disease

where the ?s are the probabilities we want to find.

What we're doing here: as events

We could identify the disease as event D, and a positive test as T. The scenario told us that P(D) = 0.0001, and that P(T|D) = P(T′|D′) = 0.999. The question at the end of the scenario was to evaluate P(D|T). Thus, in a nutshell, what we want to do is reverse conditional probabilities which we know.

Bayes' Formula

Deriving a formula

Suppose we know P(A), P(B|A), and P(B|A′). How do we find P(A|B)?

    P(A|B) = P(A ∩ B) / P(B)
           = P(B|A)P(A) / P(B)
           = P(B|A)P(A) / [P(A ∩ B) + P(A′ ∩ B)]
           = P(B|A)P(A) / [P(B|A)P(A) + P(B|A′)P(A′)]
           = P(B|A)P(A) / [P(B|A)P(A) + P(B|A′)(1 − P(A))]

This is one of several variations on what is known as Bayes' formula.
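The derivation above translates directly into a few lines of code. This is a minimal sketch; the function name `bayes` and its argument names are our own, not from the slides:

```python
# A minimal sketch of the two-hypothesis form of Bayes' formula just derived.
def bayes(p_a, p_b_given_a, p_b_given_not_a):
    """Return P(A|B) = P(B|A)P(A) / [P(B|A)P(A) + P(B|A')(1 - P(A))]."""
    numerator = p_b_given_a * p_a
    return numerator / (numerator + p_b_given_not_a * (1 - p_a))

# The disease scenario: P(D) = 0.0001, P(T|D) = 0.999, P(T|D') = 0.001.
print(bayes(0.0001, 0.999, 0.001))  # about 0.0908 -- only a 9% chance
```

Note that the tiny prior P(D) = 0.0001 is what drags the answer down to 9% despite the 99.9% accuracy.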

Using Bayes' formula

    P(A|B) = P(B|A)P(A) / [P(B|A)P(A) + P(B|A′)(1 − P(A))]

In our disease example, the disease (event A) had probability 0.0001, while the test (event B) occurred with conditional probabilities P(B|A′) = 0.001 and P(B|A) = 0.999, so:

    P(A|B) = (0.999 × 0.0001) / (0.999 × 0.0001 + 0.001 × 0.9999) ≈ 0.0908

as we had previously worked out.

Detecting a fake

An unfair coin problem: we have two coins, one of which is fair and the other of which lands heads-up 75% of the time, but we don't know which is which. We pick a random coin, flip it eight times, and get the results HHHHHTHT. This looks like it's the unfair one, but how certain can we be?

We have two events: A is the event that the chosen coin is in fact the unfair one, and B is the event that 8 coin flips come out the way ours did. What we want to find is P(A|B). We know:

    P(A) = 1/2;  P(B|A) = (3/4)^6 (1/4)^2;  P(B|A′) = (1/2)^8

Detecting a fake, continued

    P(A) = 1/2;  P(B|A) = (3/4)^6 (1/4)^2 = 729/2^16;  P(B|A′) = (1/2)^8 = 256/2^16

We apply Bayes' formula:

    P(A|B) = P(B|A)P(A) / [P(B|A)P(A) + P(B|A′)(1 − P(A))]
           = (729/2^16 · 1/2) / (729/2^16 · 1/2 + 256/2^16 · 1/2)
           = 729/985 ≈ 74%

which makes the unfair coin the likelier choice, but is far from ironclad certainty.

Another variation on Bayes' theorem

Sometimes instead of having a single prior result we want to find, we might have one of several mutually exclusive possibilities. We might have mutually exclusive events A_1, A_2, ..., A_n with total probability 1 and an event B, and want to find P(A_1|B) from all the P(A_i) and P(B|A_i):

    P(A_1|B) = P(A_1 ∩ B) / P(B)
             = P(B|A_1)P(A_1) / P(B)
             = P(B|A_1)P(A_1) / [P(A_1 ∩ B) + P(A_2 ∩ B) + ... + P(A_n ∩ B)]
             = P(B|A_1)P(A_1) / [P(B|A_1)P(A_1) + P(B|A_2)P(A_2) + ... + P(B|A_n)P(A_n)]
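The coin arithmetic is easy to get wrong by hand, so it is worth double-checking with exact fractions. This sketch is our own; the helper `seq_prob` is not from the slides:

```python
# Double-check the unfair-coin calculation with exact rational arithmetic.
from fractions import Fraction

def seq_prob(p_heads, seq):
    """Probability of flipping this exact sequence with P(heads) = p_heads."""
    prob = Fraction(1)
    for flip in seq:
        prob *= p_heads if flip == "H" else 1 - p_heads
    return prob

p_b_unfair = seq_prob(Fraction(3, 4), "HHHHHTHT")  # (3/4)^6 (1/4)^2 = 729/65536
p_b_fair = seq_prob(Fraction(1, 2), "HHHHHTHT")    # (1/2)^8 = 256/65536

# With equal priors of 1/2, the priors cancel out of Bayes' formula:
posterior_unfair = p_b_unfair / (p_b_unfair + p_b_fair)
print(posterior_unfair)  # 729/985, about 74%
```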

A multiple-possibility scenario

Two fake coins! We now happen to have a collection of three coins: one fair, one of which lands heads-up 3/4 of the time, and one of which lands tails-up 3/4 of the time. We pick one at random, flip it 7 times, and get the result TTHHHTH. What is the chance we chose the fair coin?

Let A_1 be the event of picking the fair coin, A_2 the event of picking the heads-biased coin, A_3 the event of picking the tails-biased coin, and B the event of flipping our chosen coin to get TTHHHTH. We want to know P(A_1|B). Clearly, each P(A_i) = 1/3, and:

    P(B|A_1) = (1/2)^7;  P(B|A_2) = (3/4)^4 (1/4)^3;  P(B|A_3) = (3/4)^3 (1/4)^4

A multiple-possibility scenario, cont'd

    P(A_1) = P(A_2) = P(A_3) = 1/3;  P(B|A_1) = 1/2^7;  P(B|A_2) = 81/4^7;  P(B|A_3) = 27/4^7

We apply Bayes' formula:

    P(A_1|B) = P(B|A_1)P(A_1) / [P(B|A_1)P(A_1) + P(B|A_2)P(A_2) + P(B|A_3)P(A_3)]
             = (1/2^7 · 1/3) / (1/2^7 · 1/3 + 81/4^7 · 1/3 + 27/4^7 · 1/3)
             = 128/236 ≈ 54%

which is better than a blind guess (right 33% of the time), but not by much!
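The multiple-possibility formula boils down to "multiply each prior by its likelihood, then normalize". A sketch under our own naming (`posteriors` is not from the slides):

```python
# The multiple-possibility form of Bayes' formula: normalize prior * likelihood.
from fractions import Fraction

def posteriors(priors, likelihoods):
    """Return [P(A_i|B)] given the priors [P(A_i)] and likelihoods [P(B|A_i)]."""
    products = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(products)  # this is P(B), by the law of total probability
    return [prod / total for prod in products]

# Three coins with equal priors; B is the sequence TTHHHTH (4 heads, 3 tails).
priors = [Fraction(1, 3)] * 3
likelihoods = [
    Fraction(1, 2) ** 7,                        # fair coin
    Fraction(3, 4) ** 4 * Fraction(1, 4) ** 3,  # heads-biased coin
    Fraction(3, 4) ** 3 * Fraction(1, 4) ** 4,  # tails-biased coin
]
post = posteriors(priors, likelihoods)
print(post[0])  # 32/59, i.e. 128/236: about a 54% chance it was the fair coin
```

Because the three posteriors share the same denominator, they are guaranteed to sum to 1.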

Another unequal-probability scenario

Two unfair coins... and lots of fair ones! Our heads-up (75%) and tails-up (75%) coins are mixed in with eight normal coins. We pick one at random, flip it 6 times, and get all heads. What are the probabilities we got the heads-biased coin? The tails-biased coin? A fair coin?

Let A_1 be the event of picking a fair coin, A_2 the event of picking the heads-biased coin, A_3 the event of picking the tails-biased coin, and B the event of flipping six heads. We're interested in the three conditional probabilities P(A_i|B).

    P(A_1) = 0.8;  P(A_2) = P(A_3) = 0.1
    P(B|A_1) = 1/2^6;  P(B|A_2) = 3^6/4^6 = 729/4^6;  P(B|A_3) = 1/4^6

Another unequal-probability scenario, cont'd

Applying Bayes' formula:

    P(A_1|B) = P(B|A_1)P(A_1) / [P(B|A_1)P(A_1) + P(B|A_2)P(A_2) + P(B|A_3)P(A_3)]
             = (1/2^6 · 0.8) / (1/2^6 · 0.8 + 729/4^6 · 0.1 + 1/4^6 · 0.1)
             = 512/1242 ≈ 41%

    P(A_2|B) = P(B|A_2)P(A_2) / [P(B|A_1)P(A_1) + P(B|A_2)P(A_2) + P(B|A_3)P(A_3)]
             = (729/4^6 · 0.1) / (1/2^6 · 0.8 + 729/4^6 · 0.1 + 1/4^6 · 0.1)
             = 729/1242 ≈ 59%

    P(A_3|B) = P(B|A_3)P(A_3) / [P(B|A_1)P(A_1) + P(B|A_2)P(A_2) + P(B|A_3)P(A_3)]
             = (1/4^6 · 0.1) / (1/2^6 · 0.8 + 729/4^6 · 0.1 + 1/4^6 · 0.1)
             = 1/1242 ≈ 0.08%

Where we use this

In many problems, the real-world state is not yet known, and the test is all we have. Even the probabilities associated with the real-world state are often unknown! In practice, we often assume some simple probability distribution (called the prior probability) for the real-world state, and use Bayes' formula to figure out a distribution which better fits the evidence (called the posterior probability). By using the posterior of one experiment to determine the prior of another, we can get a very accurate idea of the true underlying probabilities! This technique is known as Bayesian statistics.
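The posterior-becomes-prior idea can be seen on the unfair-coin problem itself: updating once per flip gives exactly the same answer as processing all eight flips at once. A toy sketch of our own (the name `update` is not from the slides):

```python
# Sequential Bayesian updating: each flip's posterior is the next flip's prior.
from fractions import Fraction

def update(prior, p_b_given_a, p_b_given_not_a):
    """One Bayes step: fold in evidence B with the given likelihoods."""
    num = p_b_given_a * prior
    return num / (num + p_b_given_not_a * (1 - prior))

belief = Fraction(1, 2)  # start undecided between the fake and the fair coin
for flip in "HHHHHTHT":
    if flip == "H":
        belief = update(belief, Fraction(3, 4), Fraction(1, 2))
    else:
        belief = update(belief, Fraction(1, 4), Fraction(1, 2))
print(belief)  # 729/985 -- the same answer as updating on all 8 flips at once
```

Each heads nudges the belief toward the fake coin by a factor of 3/2 in odds, and each tails nudges it back by 1/2; the order of the flips doesn't matter.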