Probabilistic and Bayesian Analytics

Note to other teachers and users of these slides: Andrew would be delighted if you found this source material useful in giving your own lectures. Feel free to use these slides verbatim, or to modify them to fit your own needs. PowerPoint originals are available. If you make use of a significant portion of these slides in your own lecture, please include this message, or the following link to the source repository of Andrew's tutorials: http://www.cs.cmu.edu/~awm/tutorials. Comments and corrections gratefully received.

Andrew W. Moore
Professor, School of Computer Science, Carnegie Mellon University
www.cs.cmu.edu/~awm, awm@cs.cmu.edu, 412-268-7599

Probability

The world is a very uncertain place. Thirty years of Artificial Intelligence and Database research danced around this fact. And then a few AI researchers decided to use some ideas from the eighteenth century.

What we're going to do

We will review the fundamentals of probability. It's really going to be worth it. In this lecture, you'll see an example of probabilistic analytics in action: Bayes Classifiers.

Discrete Random Variables

A is a Boolean-valued random variable if A denotes an event, and there is some degree of uncertainty as to whether A occurs. Examples:

A = The US president in 2023 will be male
A = You wake up tomorrow with a headache
A = You have Ebola

Probabilities

We write P(A) as the fraction of possible worlds in which A is true. We could at this point spend 2 hours on the philosophy of this. But we won't.

Visualizing A

Picture the event space of all possible worlds as a box of area 1. The worlds in which A is true form a reddish oval inside it; P(A) is the area of that oval, and the worlds in which A is false fill the rest of the box. You can think of a probability as the measurement of a set. In this case, the measurement is area.

The Axioms of Probability

0 <= P(A) <= 1
P(True) = 1
P(False) = 0
P(A or B) = P(A) + P(B) - P(A and B)

These plus set theory are all you need. Where do these axioms come from? Were they discovered? Answers coming up later.

Interpreting the axioms

The first axiom, 0 <= P(A) <= 1, says the area of A can't get any smaller than 0, and a zero area would mean no world could ever have A true. Likewise the area of A can't get any bigger than 1, and an area of 1 would mean all worlds will have A true.

Interpreting the axioms

For the fourth axiom, P(A or B) = P(A) + P(B) - P(A and B), picture A and B as overlapping ovals. If we simply added the two areas, the overlap P(A and B) would be counted twice, so we subtract it once. Simple addition and subtraction.

These Axioms are Not to be Trifled With

There have been attempts to do different methodologies for uncertainty: Fuzzy Logic, three-valued logic, Dempster-Shafer, non-monotonic reasoning. But the axioms of probability are the only system with this property: if you gamble using them you can't be unfairly exploited by an opponent using some other system [de Finetti 1931].

Theorems from the Axioms

From the axioms above we can prove: P(not A) = P(~A) = 1 - P(A). How? Substitute B = ~A into the fourth axiom, and notice that P(A and ~A) = 0 while P(A or ~A) = 1.
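Written out, the substitution the slide suggests is a two-line derivation from the axioms:

```latex
P(A \lor \lnot A) = P(A) + P(\lnot A) - P(A \land \lnot A)
% P(A or ~A) = P(True) = 1 and P(A and ~A) = P(False) = 0, hence:
1 = P(A) + P(\lnot A) - 0 \quad\Longrightarrow\quad P(\lnot A) = 1 - P(A)
```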

Side Note

I am inflicting these proofs on you for two reasons:
1. These kinds of manipulations will need to be second nature to you if you use probabilistic analytics in depth (or even not in depth -- Dan).
2. Suffering is good for you.

Another important theorem

From the axioms we can also prove: P(A) = P(A ^ B) + P(A ^ ~B). How? One way: let C = A ^ B and D = A ^ ~B, substitute into the fourth axiom, and notice that P(C ^ D) = 0 while P(C or D) = P(A).

Multivalued Random Variables

Suppose A can take on more than 2 values. A is a random variable with arity k if it can take on exactly one value out of {v_1, v_2, ..., v_k}. Thus:

P(A = v_i ^ A = v_j) = 0 if i ≠ j
P(A = v_1 or A = v_2 or ... or A = v_k) = 1

An easy fact about Multivalued Random Variables

Using the axioms of probability, and assuming that A obeys the two facts above, it's easy to prove that

P(A = v_1 or A = v_2 or ... or A = v_i) = Σ_{j=1}^{i} P(A = v_j)

And thus we can prove

Σ_{j=1}^{k} P(A = v_j) = 1

Another fact about Multivalued Random Variables

Using the axioms of probability, and again assuming that A obeys the two facts above, it's easy to prove that

P(B ^ [A = v_1 or A = v_2 or ... or A = v_i]) = Σ_{j=1}^{i} P(B ^ A = v_j)

And thus we can prove

P(B) = Σ_{j=1}^{k} P(B ^ A = v_j)

Elementary Probability in Pictures

Σ_{j=1}^{k} P(A = v_j) = 1

Elementary Probability in Pictures

P(B) = Σ_{j=1}^{k} P(B ^ A = v_j)

Conditional Probability

P(A|B) = the fraction of worlds in which B is true that also have A true.

H = "Have a headache", F = "Coming down with flu".
P(H) = 1/10
P(F) = 1/40
P(H|F) = 1/2

Headaches are rare and flu is rarer, but if you're coming down with flu there's a 50-50 chance you'll have a headache.

Conditional Probability

P(H|F) = fraction of flu-inflicted worlds in which you have a headache
= (# worlds with flu and headache) / (# worlds with flu)
= (area of "H and F" region) / (area of "F" region)
= P(H ^ F) / P(F)

Definition of Conditional Probability

P(A|B) = P(A ^ B) / P(B)

Corollary, the Chain Rule: P(A ^ B) = P(A|B) P(B)

Probabilistic Inference

H = "Have a headache", F = "Coming down with flu". P(H) = 1/10, P(F) = 1/40, P(H|F) = 1/2.

One day you wake up with a headache. You think: "Drat! 50% of flus are associated with headaches so I must have a 50-50 chance of coming down with flu." Is this reasoning good? Working it out from the definitions:

P(F ^ H) = P(H|F) P(F) = 1/2 × 1/40 = 1/80
P(F|H) = P(F ^ H) / P(H) = (1/80) / (1/10) = 1/8
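The same computation as a minimal Python sketch (the numbers are the slide's; the variable names are mine):

```python
# Headache/flu inference from the slides.
p_h = 1 / 10         # P(H): probability of waking with a headache
p_f = 1 / 40         # P(F): probability of coming down with flu
p_h_given_f = 1 / 2  # P(H|F)

# Chain rule: P(F ^ H) = P(H|F) * P(F)
p_f_and_h = p_h_given_f * p_f

# Definition of conditional probability: P(F|H) = P(F ^ H) / P(H)
p_f_given_h = p_f_and_h / p_h

print(p_f_and_h)    # 0.0125 = 1/80
print(p_f_given_h)  # 0.125  = 1/8, not the naive 1/2
```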

Another way to understand the intuition

Thanks to Jahanzeb Sherwani for contributing this explanation. (The slide is a picture; it is not reproduced in this transcription.)

What we just did

P(B|A) = P(A ^ B) / P(A) = P(A|B) P(B) / P(A)

This is Bayes Rule.

Bayes, Thomas (1763). An essay towards solving a problem in the doctrine of chances. Philosophical Transactions of the Royal Society of London, 53:370-418.

Aside: Fixing my mistakes

Independent: P(A and B) = P(A) P(B). This is the special case of P(A and B) = P(A) P(B|A) in which P(B|A) = P(B).

Mutually exclusive: P(A or B) = P(A) + P(B). This is the special case of P(A or B) = P(A) + P(B) - P(A and B) in which P(A and B) = 0.

These are not the same thing!

Using Bayes Rule to Gamble

The Win envelope has some krónur and four beads in it. The Lose envelope has three beads and no money.

Trivial question: someone draws an envelope at random (i.e. a 50% chance of winning) and offers to sell it to you. How much should you pay?

Interesting question: before deciding, you are allowed to see one bead drawn from the envelope. Suppose it's black: how much should you pay? Suppose it's red: how much should you pay?
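The slide's envelope pictures (and hence the bead colours) are not reproduced in this transcription, so the counts below are invented purely for illustration; the shape of the Bayes Rule calculation is what matters:

```python
# Posterior probability of holding the Win envelope after seeing one bead.
# Bead counts are ASSUMED for illustration; the slide's pictures are not
# reproduced in this transcription.
win_beads = {"black": 2, "red": 2}   # assumed: Win envelope, four beads
lose_beads = {"black": 2, "red": 1}  # assumed: Lose envelope, three beads

p_win = 0.5  # the envelope was drawn at random

def p_bead(env, colour):
    """P(bead colour | envelope), read off the bead counts."""
    return env[colour] / sum(env.values())

def p_win_given(colour):
    """Bayes Rule: P(win | bead colour)."""
    num = p_bead(win_beads, colour) * p_win
    evidence = num + p_bead(lose_beads, colour) * (1 - p_win)
    return num / evidence

for colour in ("black", "red"):
    # A risk-neutral fair price is this posterior times the envelope's value.
    print(colour, p_win_given(colour))
```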

More General Forms of Bayes Rule

P(A|B) = P(B|A) P(A) / (P(B|A) P(A) + P(B|~A) P(~A))

P(A|B ^ X) = P(B|A ^ X) P(A ^ X) / P(B ^ X)

In the first form, the denominator just computes P(B). In the gambling example, we could have done:

P(black) = P(black|win) P(win) + P(black|~win) P(~win)

For a multivalued variable A:

P(A = v_i | B) = P(B|A = v_i) P(A = v_i) / Σ_{k=1}^{n_A} P(B|A = v_k) P(A = v_k)

Useful Easy-to-prove facts

P(A|B) + P(~A|B) = 1

Σ_{k=1}^{n_A} P(A = v_k | B) = 1

Both are implied by the definition of conditional probability, P(A|B) = P(A ^ B) / P(B), and the marginalization formula P(A) = P(A ^ B) + P(A ^ ~B).

The Joint Distribution

Recipe for making a joint distribution of M variables. Example: Boolean variables A, B, C.

The Joint Distribution

Recipe for making a joint distribution of M variables:

1. Make a truth table listing all combinations of values of your variables (if there are M Boolean variables then the table will have 2^M rows).
2. For each combination of values, say how probable it is.

3. If you subscribe to the axioms of probability, those numbers must sum to 1.

Example with Boolean variables A, B, C:

A B C Prob
0 0 0 0.30
0 0 1 0.05
0 1 0 0.10
0 1 1 0.05
1 0 0 0.05
1 0 1 0.10
1 1 0 0.25
1 1 1 0.10

(The slide also draws the same distribution as a Venn diagram over A, B and C, with each region labelled by its probability.)

Using the Joint

Once you have the Joint Distribution, you can ask for the probability of any logical expression involving your attributes:

P(E) = Σ_{rows matching E} P(row)
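A sketch of this in Python, using the table above (the representation and helper name are mine):

```python
# Joint distribution over Boolean A, B, C from the slides,
# keyed by (A, B, C) truth values.
joint = {
    (0, 0, 0): 0.30, (0, 0, 1): 0.05,
    (0, 1, 0): 0.10, (0, 1, 1): 0.05,
    (1, 0, 0): 0.05, (1, 0, 1): 0.10,
    (1, 1, 0): 0.25, (1, 1, 1): 0.10,
}
assert abs(sum(joint.values()) - 1.0) < 1e-9  # step 3: rows sum to 1

def p(event):
    """P(E) = sum of P(row) over the rows matching E.

    `event` is any predicate over the (a, b, c) truth values."""
    return sum(pr for row, pr in joint.items() if event(*row))

print(p(lambda a, b, c: a and not b))  # P(A ^ ~B) = 0.05 + 0.10 = 0.15
```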

Using the Joint

From the census joint shown on the slide:

P(Poor ^ Male) = 0.4654
P(Poor) = 0.7604

each computed with P(E) = Σ_{rows matching E} P(row).

Inference with the Joint

P(E_1 | E_2) = P(E_1 ^ E_2) / P(E_2) = (Σ_{rows matching E_1 and E_2} P(row)) / (Σ_{rows matching E_2} P(row))

For example, P(Male | Poor) = 0.4654 / 0.7604 = 0.612.
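Continuing the Python sketch above, a conditional query is just a ratio of two such sums:

```python
def p_given(e1, e2):
    """P(E1 | E2) = P(E1 ^ E2) / P(E2), both read off the joint."""
    return p(lambda *row: e1(*row) and e2(*row)) / p(e2)

# P(A | B) from the A, B, C joint above:
print(p_given(lambda a, b, c: a, lambda a, b, c: b))  # 0.35 / 0.50 = 0.7
```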

Inference with the Joint

Likewise, P(Poor | Male) = 0.4654 / 0.6685 = 0.696.

Inference is a big deal

"I've got this evidence. What's the chance that this conclusion is true?"
"I've got a sore neck: how likely am I to have meningitis?"
"I see my lights are out and it's 9pm. What's the chance my spouse is already asleep?"

Inference is a big deal

There's a thriving set of industries growing based around Bayesian Inference. Highlights are: Medicine, Pharma, Help Desk Support, Engine Fault Diagnosis.

Where do Joint Distributions come from?

Idea One: Expert Humans.
Idea Two: Simpler probabilistic facts and some algebra (the axioms, Bayes Rule, ...).

Example: suppose you had three binary variables A, B, C, and you knew

P(A) = 0.7
P(B|A) = 0.2, P(B|~A) = 0.1
P(C|A^B) = 0.1, P(C|A^~B) = 0.8, P(C|~A^B) = 0.3, P(C|~A^~B) = 0.1

Then you can automatically compute the joint using the chain rule:

P(A=x ^ B=y ^ C=z) = P(C=z | A=x ^ B=y) P(B=y | A=x) P(A=x)

Once you have recovered the joint, you can ask whatever you want!
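A minimal sketch of Idea Two in Python, using the slide's numbers (the data structures and names are mine):

```python
from itertools import product

# The simpler probabilistic facts given on the slide:
p_a = 0.7
p_b_given_a = {True: 0.2, False: 0.1}                   # P(B | A)
p_c_given_ab = {(True, True): 0.1, (True, False): 0.8,  # P(C | A, B)
                (False, True): 0.3, (False, False): 0.1}

def pr(p_true, value):
    """Probability a Boolean variable takes `value`, given P(value=True)."""
    return p_true if value else 1.0 - p_true

# Chain rule: P(A=x ^ B=y ^ C=z) = P(C=z|A=x ^ B=y) P(B=y|A=x) P(A=x)
joint = {
    (x, y, z): pr(p_c_given_ab[(x, y)], z) * pr(p_b_given_a[x], y) * pr(p_a, x)
    for x, y, z in product((True, False), repeat=3)
}

assert abs(sum(joint.values()) - 1.0) < 1e-9  # a proper joint sums to 1
print(joint[(True, True, False)])  # P(A ^ B ^ ~C) = 0.9 * 0.2 * 0.7 = 0.126
```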

Where do Joint Distributions come from?

Idea Three: Learn them from data! Prepare to see one of the most impressive learning algorithms you'll come across in the entire course.

Learning a joint distribution

Build a JD table for your attributes in which the probabilities are unspecified, then fill in each row with

P̂(row) = (# records matching row) / (total number of records)

For example, the entry for a row is the fraction of all records in which, say, A and B are true but C is false. The result is a filled-in table like the A, B, C example above.
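A sketch of this estimator in Python (the toy dataset is made up purely for illustration):

```python
from collections import Counter

# Toy dataset of (A, B, C) records -- invented for illustration only.
records = [
    (1, 1, 0), (1, 1, 0), (0, 0, 0), (1, 0, 1),
    (0, 1, 0), (1, 1, 0), (0, 0, 0), (0, 0, 1),
]

# P-hat(row) = (# records matching row) / (total number of records)
counts = Counter(records)
joint_hat = {row: n / len(records) for row, n in counts.items()}

print(joint_hat[(1, 1, 0)])  # 3/8 = 0.375
```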

Example of Learning a Joint

This joint was obtained by learning from three attributes in the UCI Adult Census Database [Kohavi 1995]. (The learned table is shown on the slide but not reproduced in this transcription.)

Where are we?

We have recalled the fundamentals of probability. We have become content with what Joint Distributions are and how to use them. And we even know how to learn Joint Distributions from data.