PROBABILITY AND INFERENCE


Progress Report

We've finished Part I: Problem Solving! Part II: Reasoning with Uncertainty. Part III: Machine Learning.

Today

Random variables and probabilities; joint, marginal, and conditional distributions; inference (leading to independence). We're going to be using these concepts a lot, so it's worth learning them well the first time!

Handling Uncertainty

The world is an uncertain place: partially observable and non-deterministic. On the way to the bank, you get in a car crash! Examples: medical diagnosis, driving to Seattle, noisy sensors. Probability theory gives us a language to reason about an uncertain world. Probability theory is beautiful!

Random Variables

A random variable is some aspect of the world about which we (may) have uncertainty. A discrete random variable takes values from a finite set, and a probability distribution assigns a probability to each value.
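As a concrete sketch, a discrete distribution can be represented as a Python dict mapping values to probabilities. The weather values echo the slides, but these particular numbers are illustrative:

```python
import math

# A discrete random variable W (weather) and its distribution.
# Values match the slides; the probabilities here are illustrative.
p_weather = {'sunny': 0.6, 'rainy': 0.1, 'cloudy': 0.29, 'snowy': 0.01}

# A valid distribution is non-negative and sums to 1.
assert all(p >= 0 for p in p_weather.values())
assert math.isclose(sum(p_weather.values()), 1.0)
```

Non-negativity and summing to one are exactly the two conditions a discrete distribution must satisfy.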

Distributions of Interest to Us

Joint: p(x, y). Marginal: p(x) = Σ_y p(x, y). Conditional: p(x | y) = p(x, y) / p(y).

Joint probability distribution P(W = w, T = t):

  W       T     P
  sunny   hot   0.05
  rainy   hot   0.01
  cloudy  hot   0.01
  snowy   hot   0.01
  sunny   cold  0.20
  rainy   cold  0.29
  cloudy  cold  0.29
  snowy   cold  0.14

Events and Marginal Distributions

An event is a set of outcomes, and its probability is the sum of the joint entries consistent with it. A marginal distribution is obtained from the joint by summing out the other variables. Joint probability distribution P(W = w, T = t):

  W       T     P
  sunny   hot   0.07
  rainy   hot   0.01
  cloudy  hot   0.01
  snowy   hot   0.01
  sunny   cold  0.15
  rainy   cold  0.30
  cloudy  cold  0.30
  snowy   cold  0.15
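Marginalization can be sketched directly from this table: summing the joint over the other variable gives each marginal. The `marginal` helper below is illustrative, not from the slides:

```python
from collections import defaultdict

# Joint distribution P(W, T), values from the table above.
joint = {
    ('sunny', 'hot'): 0.07, ('rainy', 'hot'): 0.01,
    ('cloudy', 'hot'): 0.01, ('snowy', 'hot'): 0.01,
    ('sunny', 'cold'): 0.15, ('rainy', 'cold'): 0.30,
    ('cloudy', 'cold'): 0.30, ('snowy', 'cold'): 0.15,
}

def marginal(joint, axis):
    """Sum out all variables except the one at position `axis`."""
    dist = defaultdict(float)
    for assignment, p in joint.items():
        dist[assignment[axis]] += p
    return dict(dist)

p_w = marginal(joint, 0)   # P(W), e.g. p(sunny) = 0.07 + 0.15 = 0.22
p_t = marginal(joint, 1)   # P(T), e.g. p(hot) = 0.07 + 0.01 + 0.01 + 0.01 = 0.10
```

Both marginals still sum to one, since we have only regrouped the same probability mass.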

Conditional (Posterior) Distribution

Often we observe some information (evidence) and want to know the probability of an event conditioned on that evidence: p(W | T = cold), where T = cold is the evidence. In all the worlds where T = cold, what is the probability that W = sunny? That W = rainy? That W = cloudy? That W = snowy? This is called the conditional (posterior) distribution of W given the evidence T = cold.

Normalization Trick

Conditional and joint are just a constant apart! To condition on evidence, select the joint entries consistent with the evidence and divide each by their sum; that sum is exactly the probability of the evidence, so there is no need to compute the marginal separately.
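A minimal sketch of the trick, using the joint P(W, T) from the earlier table (the `condition_on_t` helper is illustrative):

```python
# Joint distribution P(W, T), values from the earlier table.
joint = {
    ('sunny', 'hot'): 0.07, ('rainy', 'hot'): 0.01,
    ('cloudy', 'hot'): 0.01, ('snowy', 'hot'): 0.01,
    ('sunny', 'cold'): 0.15, ('rainy', 'cold'): 0.30,
    ('cloudy', 'cold'): 0.30, ('snowy', 'cold'): 0.15,
}

def condition_on_t(joint, t_value):
    # Select the entries consistent with the evidence T = t_value.
    selected = {w: p for (w, t), p in joint.items() if t == t_value}
    # Normalize: z equals the marginal p(T = t_value).
    z = sum(selected.values())
    return {w: p / z for w, p in selected.items()}

p_w_given_cold = condition_on_t(joint, 'cold')
# e.g. p(sunny | cold) = 0.15 / 0.90
```

The unnormalized selected entries are the joint; dividing by z turns them into the conditional, which is exactly the "constant apart" claim.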

Probabilistic Inference

Probabilistic inference refers to the task of computing some desired probability given other known probabilities (evidence). Typically we compute the conditional (posterior) probability of an event: p(on time to airport | no accidents) = 0.80. Probabilities change with new evidence: p(on time to airport | no accidents, 5 a.m.) = 0.95; p(on time to airport | no accidents, 5 a.m., raining) = 0.80.

Inference by Enumeration

Step One: select the entries in the table consistent with the evidence (this becomes our world). Step Two: sum over the hidden variables H to get the joint distribution of the query and evidence variables. Step Three: normalize.

Query: p(W | S = winter)?

  S       W      T     P
  summer  sunny  hot   0.30
  summer  rain   hot   0.05
  summer  sunny  cold  0.10
  summer  rain   cold  0.05
  winter  sunny  hot   0.10
  winter  rain   hot   0.05
  winter  sunny  cold  0.15
  winter  rain   cold  0.20

For p(W | S = winter), Step One keeps only the entries consistent with S = winter:

  S       W      T     P
  winter  sunny  hot   0.10
  winter  rain   hot   0.05
  winter  sunny  cold  0.15
  winter  rain   cold  0.20

Step Two sums over the hidden variables:

  p(q, e_1, ..., e_k) = Σ_{(h_1, ..., h_r)} p(q, e_1, ..., e_k, h_1, ..., h_r)

Step Three normalizes:

  Z = Σ_q p(Q = q, e_1, ..., e_k)
  p(q | e_1, ..., e_k) = (1/Z) p(q, e_1, ..., e_k)
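The three steps can be sketched over the season/weather/temperature table above (the `enumerate_query` helper is illustrative, not the course's code):

```python
# Inference by enumeration over the full joint table from the slides.
joint = {
    ('summer', 'sunny', 'hot'): 0.30, ('summer', 'rain', 'hot'): 0.05,
    ('summer', 'sunny', 'cold'): 0.10, ('summer', 'rain', 'cold'): 0.05,
    ('winter', 'sunny', 'hot'): 0.10, ('winter', 'rain', 'hot'): 0.05,
    ('winter', 'sunny', 'cold'): 0.15, ('winter', 'rain', 'cold'): 0.20,
}
variables = ('S', 'W', 'T')

def enumerate_query(joint, variables, query_var, evidence):
    """Return p(query_var | evidence) by select, sum out, normalize."""
    idx = {v: i for i, v in enumerate(variables)}
    unnormalized = {}
    for assignment, p in joint.items():
        # Step One: keep only entries consistent with the evidence.
        if any(assignment[idx[v]] != val for v, val in evidence.items()):
            continue
        # Step Two: sum out hidden variables by accumulating per query value.
        q = assignment[idx[query_var]]
        unnormalized[q] = unnormalized.get(q, 0.0) + p
    # Step Three: normalize by Z, the total mass consistent with the evidence.
    z = sum(unnormalized.values())
    return {q: p / z for q, p in unnormalized.items()}

p_w = enumerate_query(joint, variables, 'W', {'S': 'winter'})
# sunny: (0.10 + 0.15) / 0.50 = 0.5, rain: (0.05 + 0.20) / 0.50 = 0.5
```

Note that T never appears in the query or the evidence here, so it plays the role of the hidden variable H that Step Two sums out.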

The same procedure answers any query over the joint table: P(S | T = hot)? P(W | S = winter, T = hot)? P(S, W)? P(S, W | T = hot)?

With n random variables, each with domain size d, worst-case time is O(d^n), and space is O(d^n) to store the entire table in memory. Is there something better?
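The blow-up is easy to check with the d^n count itself (the sizes below are illustrative; the `table_entries` helper is not from the slides):

```python
# A full joint over n variables with domain size d has d**n entries,
# which is why enumeration over the raw table does not scale.
def table_entries(d, n):
    return d ** n

tiny = table_entries(2, 3)    # like the 8-row S/W/T table above
huge = table_entries(2, 50)   # 50 binary variables: 2**50 entries
```

Even 50 binary variables already put the table far beyond memory, which motivates the compact representations (Bayes nets) that come next.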