Probabilistic Models


582636 Probabilistic Models, Spring 2009
Petri Myllymäki
Department of Computer Science, University of Helsinki, Finland
http://www.cs.helsinki.fi/group/cosco/teaching/probability/2009

Three concepts: Information, Probability, Utility
Related themes: compression, coding, modeling; uncertain reasoning; (stochastic) search; decision making

Our motivation: uncertain reasoning, at the intersection of Information Theory, Mathematical Statistics, and Computer Science

Yet another probability course?
- Computer science point of view
- Artificial intelligence point of view: the agent point of view, knowledge representation (KR), reasoning, rationality, decision making, grounding
- Machine learning point of view: computational methods for data analysis, large databases

Reasoning under uncertainty
- The world is a very uncertain place
- Thirty years of Artificial Intelligence and Database research danced around this fact
- And then a few AI researchers decided to use some ideas from the eighteenth century
- Uncertainty in Artificial Intelligence conference series, 1985-
- Probabilistic reasoning is now mainstream AI

But first there was logic
- Historically, too, it was first: syllogisms as a model of rationality
- Certainty, correctness, modularity, monotonicity
- BUT limited applicability, since agents almost never have access to the whole truth about their environment!

Acting by certain knowledge only?
- Is it enough to leave home 90 minutes before the flight departure? Anything can happen.
- How about X minutes before departure? Are you bound to stay home?
- Qualification problem: what are the things that have to be taken into account?

Knowledge representation in FOPL
Let us try to use FOPL for dental diagnosis:
∀p Symptom(p, toothache) ⇒ Disease(p, cavity). Wrong!
∀p Symptom(p, toothache) ⇒ Disease(p, cavity) ∨ Disease(p, GumDisease) ∨ Disease(p, Abscess) ∨ ... Incomplete!
∀p Disease(p, cavity) ⇒ Symptom(p, toothache). Wrong again, need to add qualifications!

FOPL representation fails because of
- Laziness: it is too much work to list all the factors to ensure exceptionless rules
- Theoretical ignorance: we do not know all the factors that play a role in the phenomenon
- Practical ignorance: even if we know all the factors in general, we do not know them for each particular case

Probability to the rescue
Probability provides a way to summarize the uncertainty that comes from our laziness and ignorance.

On modeling
In building intelligent systems, in statistics and in the rest of the world...

Modeling framework: Problem → Modeling → Prediction → Decision making

What does this mean?
- Problem: there is a need to model some part of the universe and make decisions based on the model
- Modeling: build the best model possible from the a priori knowledge and data available
- Prediction: use the model to predict properties of interest
- Decision making: decide actions based on the predictions

For example
- Problem: online troubleshooting of software/hardware
- Modeling: build a latent variable (Bayes) model of the problems a user encounters, based on knowledge about the software and symptom data
- Prediction: use the model to predict the underlying problem given the symptoms
- Decision making: propose actions to remove the problem (or to find more symptoms)

Microsoft Technical support

Bayesian email spam filters
SpamBayes, OPFile, Outclass, bayespam, bogofilter, ifile, PASP, spamoracle, Spam Assassin, Annoyance Filter, BSpam, Spam Bully, Death2Spam, InBoxer, ...
Software: http://spambayes.sourceforge.net/related.html
Background: http://spambayes.sourceforge.net/background.html

Real questions are...
- An infinite number of models: what models do we consider? A model is always chosen from a set of possible models!
- How do we compare models (i.e., measure that one model is better than another one) given some data?
- How do we find good models?

...and more
- How do we use the models to predict unobserved quantities of interest?
- What actions do we choose given the predictions?

General rational agent framework
[Diagram with components: problem, problem domain, model family, sampling, data, learning, domain model, inference, predictive distribution, utilities, decision making, actions]

Choice of models
- Simple models vs. complex models
- Linear models vs. non-linear models
- Parametric models vs. non-parametric models
- Flat models vs. structural models
What is complex is a totally nontrivial question. One intuition: a complex model has more effective parameters.

The Occam's razor principle
The problem: you are given the following sequence: -1, 3, 7, 11. Question: what are the two next numbers?
Solution 1: Answer: 15 and 19. Explanation: add 4 to the previous number.
Solution 2: Answer: -19.9 and 1043.8. Explanation: if the previous number is x, the next one is -x^3/11 + 9x^2/11 + 23/11.
Of two competing hypotheses both conforming to our observations, choose the simpler one.
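As a quick arithmetic check of the two explanations, here is a minimal Python sketch (the function names are mine); the polynomial rule indeed reproduces -1, 3, 7, 11 and then predicts roughly -19.9 and 1043.8.

```python
# Two competing rules for the sequence -1, 3, 7, 11 (Occam's razor example).

def rule_simple(x):
    """Solution 1: add 4 to the previous number."""
    return x + 4

def rule_complex(x):
    """Solution 2: next = -x^3/11 + 9*x^2/11 + 23/11."""
    return -x**3 / 11 + 9 * x**2 / 11 + 23 / 11

seq = [-1, 3, 7, 11]
for rule in (rule_simple, rule_complex):
    # Both rules reproduce the observed sequence exactly...
    assert all(abs(rule(a) - b) < 1e-9 for a, b in zip(seq, seq[1:]))
    # ...but their predictions for the next two numbers differ wildly.
    nxt = rule(seq[-1])
    print(rule.__name__, round(nxt, 1), round(rule(nxt), 1))
# rule_simple 15 19
# rule_complex -19.9 1043.8
```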

Occam's Razor in modeling
There is a trade-off between the model complexity and the fit to the data.
[Figure: number of car accidents vs. age, with a too simple fit, a too complex fit, and the Occam compromise in between]

Simpler models are better than complex models
- Interpretation: they are easier to understand
- Computation: predictions are typically easier to compute (not necessarily!)
- Universality: they can be applied in more domains (more accurate predictions)
- Models should be only as complex as the data justifies
- BUT: simpler models are NOT more probable a priori!
- Bayesian model selection: automatic Occam's razor for model complexity regularization

Two types of modeling
- Descriptive models ("statistical modeling"): describe objects (e.g., data) as they are; typically exploratory structures
- Predictive models ("predictive inference"): models that are able to predict unknown objects (e.g., future data); models of the underlying process

Some viewpoints
- Prediction is our business
- Why the best fit to the data is not the best predictor: data can be erroneous, so a perfect fit is too specialized and models the errors also!
- A sample can only identify up to a certain level of complexity
- Intuitive goal: minimize model complexity + prediction error. It keeps you honest!

Alternatives
- Probabilistic inference
- Statistical inference
- Bayesian inference
- Fuzzy inference
- Dempster-Shafer inference
- Non-monotonic logic

Bayesian inference: basic concepts

Some early history
Bernoulli (1654-1705), Bayes (1701-1761), Laplace (1749-1827)
- Prediction problem ("forward probability"): if the probability of an outcome in a single trial is p, what is the relative frequency of occurrence of this outcome in a series of trials?
- Learning problem ("inverse probability"): given a number of observations in a series of trials, what are the probabilities of the different possible outcomes?

The Bayes rule
Axioms of probability theory:
- The sum rule: P(A|C) + P(Ā|C) = 1
- The product rule: P(AB|C) = P(A|BC) P(B|C)
The Bayes rule: P(A|BC) = P(A|C) P(B|AC) / P(B|C)
A rule for updating our beliefs after obtaining new information. With H = hypothesis (model), I = background information, D = data (observations):
P(H|DI) = P(H|I) P(D|HI) / P(D|I)

Do I have a good test?
A new home HIV test is assumed to have 95% sensitivity and 98% specificity, and the population has an HIV prevalence of 1/1000. If you use the test, what is the chance that someone testing positive actually has HIV?

Test continued...
P(HIV+ | test HIV+) = ?
We know that P(test HIV+ | HIV+) = 0.95 and P(test HIV+ | HIV-) = 0.02. From Bayes we have learned that we can calculate the probability of having HIV given a positive test result by
P(HIV+ | test HIV+) = P(test HIV+ | HIV+) P(HIV+) / [P(test HIV+ | HIV+) P(HIV+) + P(test HIV+ | HIV-) P(HIV-)]
= (0.95 × 0.001) / (0.95 × 0.001 + 0.02 × 0.999) ≈ 0.045
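The same calculation as a minimal Python sketch (the variable names are mine, not from the slides):

```python
# Bayes rule for the home HIV test example.
sensitivity = 0.95      # P(test+ | HIV+)
specificity = 0.98      # P(test- | HIV-), so P(test+ | HIV-) = 0.02
prevalence = 0.001      # P(HIV+) = 1/1000

p_test_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
p_hiv_given_pos = sensitivity * prevalence / p_test_pos
print(round(p_hiv_given_pos, 3))    # 0.045
```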

Thus, finally
Over 95% of those testing positive will, in fact, not have HIV. The right question is: how should the test result change our belief that we are HIV positive?

Bayesian?
Probabilities can be interpreted in various ways:
- Frequentist interpretation (Fisher, Neyman, Cramér)
- Degree-of-belief interpretation (Bernoulli, Bayes, Laplace, Jeffreys, Lindley, Jaynes)

Frequentist says...
- The long-run frequency of an event is the proportion of the time it occurs in a long sequence of trials; probability is this frequency
- Probability can only be attached to random variables, not to individual events

Bayesian says...
- An event x = a state of some part of the universe
- The probability of x is the degree of belief that event x will occur
- Probability will always depend on the state of knowledge
- p(x | y, C) means the probability of event x given that event y is true and background knowledge C

Frequentist language for solving problems
P(data | model): sampling distributions

Bayesian language for solving problems
Bayesian: P(data | model) & P(model | data); prior knowledge + data

Isn't this what I already do? No.
Hypothesis testing: [Figure: sampling distribution of the estimator (a function of the data), with the observed estimate marked]

The Bayesian way
[Figure: posterior distribution of the models = likelihood × prior distribution of the models, given the data]

Reasons for using probability theory
- Cox/Jaynes argument: probability is an appealing choice as the language for plausible inference
- Berger argument: decision theory offers a theoretical framework for optimal decision making, and decision theory needs probabilities
- Pragmatic argument: it is a very general framework and it works

On plausible reasoning
- "The actual science of logic is conversant at present only with things either certain, impossible, or entirely doubtful, none of which (fortunately) we have to reason on. Therefore the true logic for this world is the calculus of Probabilities, which takes account of the magnitude of the probability which is, or ought to be, in a reasonable man's mind." (James Clerk Maxwell)
- Probabilistic reasoning is intuitively easy to understand, but on the other hand intuition may be a poor guide when facing probabilistic evidence
- "Inside every non-Bayesian there is a Bayesian struggling to get out." (Dennis V. Lindley)

Real questions
Q1: Given plausibilities Plaus(A) and Plaus(B), what is Plaus(AB)?
Q2: How is Plaus(~A) related to Plaus(A)?

Qualitative properties of p.r.
D1. Degrees of plausibility are represented by real numbers
D2. Direction of inference has a qualitative correspondence with common sense. For example: if Plaus(A|C') > Plaus(A|C) and Plaus(B|C') = Plaus(B|C), then Plaus(AB|C') > Plaus(AB|C). Ensures consistency in the limit (with perfect certainty) with deductive logic
D3. If a conclusion can be inferred in more than one way, every possible way should lead to the same result
D4. All relevant information is always taken into account
D5. Equivalent states of knowledge must be represented by equivalent plausibility assignments

Cox/Jaynes/Cheeseman argument
Every allowed extension of Aristotelian logic to plausibility theory is isomorphic to Bayesian probability theory.
- Product rule (answers question Q1): P(AB|C) = P(A|BC) P(B|C)
- Sum rule (answers question Q2): P(A|C) + P(Ā|C) = 1

Bayesian inference: How to update beliefs?
- Select the model space
- Use Bayes theorem to obtain the posterior probability of models (given data):
P(Model | Data) = P(Data | Model) P(Model) / P(Data)
- The posterior distribution is the result of the inference; what one needs from the posterior depends on what decisions are to be made

The Bayesian modeling viewpoint
- Explicitly include prediction (and intervention) in modeling
- Models are a means (a language) to describe interesting properties of the phenomenon to be studied, but they are not intrinsic to the phenomenon itself.
- "All models are false, but some are useful."

(Being predictive)
- True prediction performance is a function of future data, not of a model's fit to current data
- Good predictive models describe useful regularities of the data generating mechanism, while models that give a high probability to the observed data have possibly only learnt to memorize it.

Bayesian decision making for kids
- Assign a benefit to every possible outcome (for every possible decision)
- Assign a probability to every possible outcome given every possible decision
- What is the best decision?

Decision theory argument
Decision theory offers a theoretical framework for optimal decision making.
- P($100) = 0.1, P($50) = 0.9. Expected utility: 0.1 × $100 + 0.9 × $50 = $55
- P($200) = 0.2, P($5) = 0.8. Expected utility: 0.2 × $200 + 0.8 × $5 = $44
- P($80) = 0.5, P($50) = 0.5. Expected utility: 0.5 × $80 + 0.5 × $50 = $65
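The three expected utilities are just probability-weighted sums of the payoffs; a minimal sketch in Python (the lottery labels A, B, C are mine):

```python
# Expected utility = sum over outcomes of P(outcome) * utility(outcome).
lotteries = {
    "A": [(0.1, 100), (0.9, 50)],   # -> 55
    "B": [(0.2, 200), (0.8, 5)],    # -> 44
    "C": [(0.5, 80), (0.5, 50)],    # -> 65
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

for name, outcomes in lotteries.items():
    print(name, round(expected_utility(outcomes), 2))

# Optimal policy (next slide): choose the action with maximal expected utility.
print(max(lotteries, key=lambda name: expected_utility(lotteries[name])))   # C
```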

Optimal actions
- Optimal policy: choose the action with maximal expected utility
- The Dutch book argument: betting agencies must be Bayesians
- Where to get the utilities? (decision theory)

Pragmatic reasons for using probability theory
- The predictor and predicted variables (the inference task) do not have to be determined in advance
- Probabilistic models can be used for solving classification (discriminative) tasks, configuration problems, and prediction (regression) problems
- Predictions can also be used as a criterion for data mining (explorative structures)

More pragmatic reasons for using probability theory
- A consistent calculus: creating a consistent calculus for uncertain inference is not easy (the Cox theorem); cf. fuzzy logic
- Probabilistic models can handle both discrete and continuous variables at the same time
- Various approaches for handling missing data (both in model building and in reasoning)

Nice theory, but...
Isn't probabilistic reasoning counter-intuitive, something totally different from human reasoning?
- A cause for confusion: the old frequentist interpretation. But probabilities do NOT have to be thought of as frequencies, but as measures of belief
- The so-called paradoxes are often misleading. A: P(1,000,000) = 1.0; B: P(1,000,000) = 0.25, P(4,000,000) = 0.25, P(0) = 0.5
- Even if that were true, maybe that would be a good thing!

Nice theory, but...
Where do all the numbers come from?
- Bayesian networks: a small number of parameters
- The numbers do not have to be accurate
- Probability theory offers a framework for constructing models from sample data, from domain knowledge, or from their combination

We can learn from Bayesians :-)
I rest my case:
- Bayesian approaches never overfit (in principle)
- Bayesian approaches infer only from observed data (not possible data)
- Bayesian inference is always relative to a model family
Does all this semi-philosophical debate really matter in practice? YES!! (See e.g. "The great health hoax" by Robert Matthews, The Sunday Telegraph, September 13, 1998.)

Fundamental questions
- What is the model space?
- How do we compare models?
- How do we search?

Bayesian answers
- The model family (space) is made explicit
- The comparison criterion is a probability
- No restrictions on the search algorithm
Classical statistics answers
- The model family is implicit (normal distributions)
- The comparison criterion is fit to data, deviation from random behavior, a model index
- Simple deterministic greedy algorithms

Bayesian inference: basic operations

Probability of propositions
- Notation P(x): read "probability of x-pression"
- Expressions are statements about the contents of random variables
- Random variables are very much like variables in computer programming languages:
  - Boolean: statements, propositions
  - Enumerated, discrete: small set of possible values
  - Integers or natural numbers: idealized to infinity
  - Floating point (continuous): real numbers to ease calculations

Elementary propositions P(X=x)
- The probability that random variable X has value x
- We like to use words starting with capital letters to denote random variables
- For example:
  P(It_will_snow_tomorrow = true)
  P(The_weekday_I'll_graduate = sunday)
  P(Number_of_planets_around_Gliese_581 = 7)
  P(The_average_height_of_adult_Finns = 1702mm)

Semantics of P(X=x)=p
So what does it mean?
P(The_weekday_I'll_graduate = sunday) = 0.20
P(Number_of_planets_around_Gliese_581 = 7) = 0.3
Bayesian interpretation: the proposition is either true or false, nothing in between, but we may be unsure about the truth. Probabilities measure that uncertainty. The greater the p, the more we believe that X=x:
- P(X=x) = 1: the agent totally believes that X = x.
- P(X=x) = 0: the agent does not believe that X=x at all.

Compound propositions
- Elementary propositions can be combined using logical operators ∧, ∨ and ¬, like P(X=x ∧ Y=y) etc.
- Possible shorthands: P(X ∈ S), P(X ≤ x) for continuous variables
- The operator ∧ is the most common one, and is often replaced by just a comma, like P(A=a, B=b).
- Naturally, other logical operators can also be defined as derivatives.

Axioms of probability
Kolmogorov's axioms:
1. 0 ≤ P(x) ≤ 1
2. P(true) = 1, P(false) = 0
3. P(x ∨ y) = P(x) + P(y) - P(x ∧ y)
Some extra technical axioms are needed to make the theory rigorous. The axioms can also be derived from common sense requirements (Cox/Jaynes argument).

Axiom 3 again
P(x or y) = P(x) + P(y) - P(x and y)
It is there to avoid double counting:
P(day_is_sunday or day_is_in_july) = 1/7 + 31/365 - 4/365.
[Venn diagram: A, B, and their overlap A and B]
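The double counting can also be checked by brute force; a small sketch that enumerates the days of 2009 (the course year, assumed here to be the intended calendar), using only the Python standard library:

```python
from datetime import date, timedelta

# Enumerate all 365 days of 2009 and count the events directly.
days = [date(2009, 1, 1) + timedelta(n) for n in range(365)]
n_sunday = sum(d.weekday() == 6 for d in days)                    # 52
n_july = sum(d.month == 7 for d in days)                          # 31
n_both = sum(d.weekday() == 6 and d.month == 7 for d in days)     # 4 Sundays in July

# Sum rule with the correction term vs. counting the union directly.
p_sum_rule = (n_sunday + n_july - n_both) / 365
p_direct = sum(d.weekday() == 6 or d.month == 7 for d in days) / 365
print(round(p_sum_rule, 4), round(p_direct, 4))                   # 0.2164 0.2164
```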

Discrete probability distribution
Instead of stating that P(D=d_1) = p_1, P(D=d_2) = p_2, ..., and P(D=d_n) = p_n, we often compactly say P(D) = (p_1, p_2, ..., p_n).
P(D) is called a probability distribution of D. NB! p_1 + p_2 + ... + p_n = 1.
[Figure: bar chart of P(D) over the weekdays Mon-Fri]

Continuous probability distribution
In the continuous case, the area under P(X=x) must equal one. For example, P(X=x) = exp(-x) for x ≥ 0.
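A quick numerical sanity check that the density exp(-x) on x ≥ 0 has unit area (a sketch; the truncation point and grid size are arbitrary choices of mine):

```python
import math

# Midpoint Riemann sum for the area under p(x) = exp(-x), x >= 0,
# truncated at x = 50 where the remaining tail mass (~2e-22) is negligible.
n, upper = 200_000, 50.0
dx = upper / n
area = sum(math.exp(-(i + 0.5) * dx) for i in range(n)) * dx
print(round(area, 6))   # 1.0
```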

Main toolbox of the Bayesians
- Definition of conditional probability
- Chain rule
- The Bayes rule
- Marginalization
NB: these are all direct derivatives of the axioms of probability theory.

Conditional probability
Let us define a notation for the probability of x given that we know (for sure) that y, and we know nothing else:
P(x | y) = P(x ∧ y) / P(y)
Bayesians say that all probabilities are conditional, since they are relative to the agent's knowledge K:
P(x | y, K) = P(x ∧ y | K) / P(y | K)
But Bayesians are lazy too, so they often drop K. Notice that P(x, y) = P(x | y) P(y) is also very useful!

Chain rule
From the definition of conditional probability, we get:
P(X_1, X_2) = P(X_2 | X_1) P(X_1)
And more generally:
P(X_1, ..., X_n) = ∏_i P(X_i | X_1, ..., X_{i-1}) = P(X_1) P(X_2 | X_1) ... P(X_n | X_1, X_2, ..., X_{n-1})

Marginalization
Let us assume we have a joint probability distribution for a set S of random variables. Let us further assume S1 and S2 partition the set S. Now
P(S1 = s_1) = Σ_{s ∈ dom(S2)} P(S1 = s_1, S2 = s)
            = Σ_{s ∈ dom(S2)} P(S1 = s_1 | S2 = s) P(S2 = s),
where s_1 and s are vectors of possible value combinations of S1 and S2 respectively.

Joint probability distribution
P(Toothache=x, Catch=y, Cavity=z) for all combinations of truth values (x, y, z):

Toothache  Catch  Cavity  probability
true       true   true    0.108
true       true   false   0.016
true       false  true    0.012
true       false  false   0.064
false      true   true    0.072
false      true   false   0.144
false      false  true    0.008
false      false  false   0.576
                  total   1.000

You may also think of this as P(Too_Cat_Cav = x), where x is a 3-dimensional vector of truth values. Generalizes naturally to any set of discrete variables, not only Booleans.

Joys of the joint probability distribution
By summing the condition-matching numbers from the joint probability table you can calculate the probability of any subset of events. For example, summing the rows of the table above where Cavity=true or Toothache=true gives
P(Cavity=true or Toothache=true) = 0.108 + 0.016 + 0.012 + 0.064 + 0.072 + 0.008 = 0.280.

Marginal probabilities are probabilities too
P(Cavity=x, Toothache=y): the probabilities of the lines of the joint table with equal values for the marginal variables are simply summed, e.g. P(Cavity=true, Toothache=true) = 0.108 + 0.012 = 0.120.

Conditioning
Marginalization can be used to calculate conditional probability:
P(Cavity=t | Toothache=t) = P(Cavity=t ∧ Toothache=t) / P(Toothache=t)
= (0.108 + 0.012) / (0.108 + 0.016 + 0.012 + 0.064) = 0.6
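All three operations on the toothache table (summing matching rows, marginalizing, conditioning) are mechanical; a minimal Python sketch over the joint distribution given above:

```python
# Joint distribution P(Toothache, Catch, Cavity): (toothache, catch, cavity) -> probability.
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.016,
    (True,  False, True):  0.012, (True,  False, False): 0.064,
    (False, True,  True):  0.072, (False, True,  False): 0.144,
    (False, False, True):  0.008, (False, False, False): 0.576,
}

def prob(event):
    """P(event): sum the probabilities of the rows that satisfy the event."""
    return sum(p for row, p in joint.items() if event(*row))

# Probability of any subset of events: P(Cavity=true or Toothache=true)
print(round(prob(lambda t, c, v: v or t), 3))        # 0.28

# Marginalization: P(Cavity=true, Toothache=true), with Catch summed out
print(round(prob(lambda t, c, v: v and t), 3))       # 0.12

# Conditioning: P(Cavity=true | Toothache=true)
print(round(prob(lambda t, c, v: v and t) / prob(lambda t, c, v: t), 3))   # 0.6
```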

Conditioning via marginalization
P(X | Y) = Σ_Z P(X, Z | Y)
         = Σ_Z P(X, Z, Y) / P(Y)                  (definition)
         = Σ_Z P(X | Z, Y) P(Z | Y) P(Y) / P(Y)   (chain rule)
         = Σ_Z P(X | Z, Y) P(Z | Y).

Bayes' rule
Combining
P(x | y, K) = P(x ∧ y | K) / P(y | K)
and
P(x ∧ y | K) = P(y ∧ x | K) = P(y | x, K) P(x | K)
yields the famous Bayes formula
P(x | y, K) = P(x | K) P(y | x, K) / P(y | K), or P(h | e) = P(h) P(e | h) / P(e).

Bayes formula as an update rule
Prior belief P(h) is updated to posterior belief P(h | e_1). This, in turn, gets updated to P(h | e_1, e_2) using the very same formula with P(h | e_1) as a prior. Finally, denoting P(· | e_1) by P_1, we get
P(h | e_1, e_2) = P(h, e_1, e_2) / P(e_1, e_2)
               = P(h, e_1) P(e_2 | h, e_1) / (P(e_1) P(e_2 | e_1))
               = P(h | e_1) P(e_2 | h, e_1) / P(e_2 | e_1)
               = P_1(h) P_1(e_2 | h) / P_1(e_2)
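A small sketch of this updating over a discrete hypothesis space (the two coin-bias hypotheses and the prior are invented for illustration); updating on e_1 and then on e_2 gives the same posterior as updating on both observations at once:

```python
# Hypotheses about a coin's heads probability, with a made-up prior P(h).
prior = {0.5: 0.7, 0.9: 0.3}

def likelihood(h, e):
    """P(e | h) for a single coin flip."""
    return h if e == "heads" else 1.0 - h

def update(belief, e):
    """One application of the Bayes formula: posterior(h) = belief(h) * P(e | h) / P(e)."""
    unnorm = {h: p * likelihood(h, e) for h, p in belief.items()}
    z = sum(unnorm.values())            # P(e) under the current belief
    return {h: p / z for h, p in unnorm.items()}

# Sequential: P(h) -> P(h | e1) -> P(h | e1, e2), reusing the posterior as the new prior.
sequential = update(update(prior, "heads"), "heads")

# Batch: condition on both observations at once.
unnorm = {h: p * likelihood(h, "heads") ** 2 for h, p in prior.items()}
z = sum(unnorm.values())
batch = {h: p / z for h, p in unnorm.items()}

print(sequential)   # {0.5: ~0.419, 0.9: ~0.581}
print(batch)        # same values
```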

Bayes formula for diagnostics
Bayes formula can be used to calculate the probabilities of possible causes for observed symptoms:
P(cause | symptoms) = P(cause) P(symptoms | cause) / P(symptoms)
Causal probabilities P(symptoms | cause) are usually easier for experts to estimate than diagnostic probabilities P(cause | symptoms).

Bayes formula for model selection
Bayes formula can be used to calculate the probabilities of hypotheses, given observations:
P(H_1 | D) = P(H_1) P(D | H_1) / P(D)
P(H_2 | D) = P(H_2) P(D | H_2) / P(D)
...

General recipe for Bayesian inference
- X: something you don't know and need to know
- Y: the things you know
- Z: the things you don't know and don't need to know
Compute: P(X | Y) = Σ_Z P(X | Z, Y) P(Z | Y)
That's it - we're done.
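The recipe is a one-liner once P(Z | Y) and P(X | Z, Y) are available as tables; a minimal sketch with hypothetical numbers:

```python
# General recipe: P(X | Y) = sum_Z P(X | Z, Y) P(Z | Y), for an already-fixed observation Y.
p_z_given_y = {"z1": 0.3, "z2": 0.7}                    # P(Z | Y), made-up numbers
p_x_given_zy = {"z1": {"x1": 0.9, "x2": 0.1},           # P(X | Z, Y), made-up numbers
                "z2": {"x1": 0.2, "x2": 0.8}}

p_x_given_y = {
    x: sum(p_x_given_zy[z][x] * p_z_given_y[z] for z in p_z_given_y)
    for x in ("x1", "x2")
}
print(p_x_given_y)   # {'x1': 0.41, 'x2': 0.59} (up to float rounding)
```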