Reasoning Under Uncertainty: Belief Network Inference

CPSC 322, Uncertainty 5 (Textbook 10.4)

Lecture Overview
1. Recap
2. Belief Network Inference

Components of a belief network
Definition (belief network). A belief network consists of:
- a directed acyclic graph with nodes labeled with random variables;
- a domain for each random variable;
- a set of conditional probability tables for each variable given its parents (including prior probabilities for nodes with no parents).
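
To make the definition concrete, here is a minimal sketch of one way such a network could be represented in code: a DAG given by parent lists, a domain per variable, and a conditional probability table per variable. The two-variable example and all probability numbers are assumptions made up for illustration; they are not from the slides.

```python
# A minimal belief-network data structure (hypothetical example values).
# Each variable has a domain, a list of parents (the DAG), and a CPT that maps
# an assignment of the parents to a distribution over the variable.

network = {
    "Fire": {
        "domain": [True, False],
        "parents": [],
        # Prior for a node with no parents: P(Fire). Assumed numbers.
        "cpt": {(): {True: 0.01, False: 0.99}},
    },
    "Smoke": {
        "domain": [True, False],
        "parents": ["Fire"],
        # P(Smoke | Fire) for each value of the parent. Assumed numbers.
        "cpt": {
            (True,):  {True: 0.90, False: 0.10},
            (False,): {True: 0.01, False: 0.99},
        },
    },
}

def prob(var, value, assignment):
    """P(var = value | parents(var) as given in assignment)."""
    node = network[var]
    parent_values = tuple(assignment[p] for p in node["parents"])
    return node["cpt"][parent_values][value]

print(prob("Smoke", True, {"Fire": True}))  # 0.9 under the assumed CPT
```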

Belief network summary
- A belief network is a directed acyclic graph (DAG) whose nodes are random variables. A belief network is automatically acyclic by construction.
- The parents of a node n are those variables on which n directly depends.
- A belief network is a graphical representation of dependence and independence: a variable is conditionally independent of its non-descendants given its parents.

Relating BNs to the joint
Belief networks are compact representations of the joint. To encode the joint as a BN:
1. Totally order the variables of interest: X_1, ..., X_n.
2. Write down the chain-rule decomposition of the joint, using this ordering:
   P(X_1, ..., X_n) = ∏_{i=1}^{n} P(X_i | X_{i-1}, ..., X_1)
3. For every variable X_i, find the smallest set pX_i ⊆ {X_1, ..., X_{i-1}} such that P(X_i | X_{i-1}, ..., X_1) = P(X_i | pX_i). If pX_i ⊂ {X_1, ..., X_{i-1}}, X_i is conditionally independent of some of its ancestors given pX_i.
4. Now we can write P(X_1, ..., X_n) = ∏_{i=1}^{n} P(X_i | pX_i), as sketched in the code below.
5. Construct the BN:
   - nodes are the variables;
   - each variable X_i has an incoming edge from each variable in pX_i;
   - the conditional probability table for variable X_i is P(X_i | pX_i).
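
A small numeric sketch of steps 2 to 4: multiplying the factors P(X_i | pX_i) recovers a proper joint distribution. The two-variable network Fire -> Smoke and its numbers are the same hypothetical ones as above, not values given in the slides.

```python
from itertools import product

# Hypothetical two-variable network Fire -> Smoke (assumed numbers, for illustration only).
p_fire = {True: 0.01, False: 0.99}                        # P(Fire)
p_smoke_given_fire = {True:  {True: 0.90, False: 0.10},   # P(Smoke | Fire = true)
                      False: {True: 0.01, False: 0.99}}   # P(Smoke | Fire = false)

def joint(fire, smoke):
    # Step 4 of the construction: P(Fire, Smoke) = P(Fire) * P(Smoke | Fire)
    return p_fire[fire] * p_smoke_given_fire[fire][smoke]

# Sanity check: the product of the factors defines a distribution that sums to 1.
total = sum(joint(f, s) for f, s in product([True, False], repeat=2))
print(total)  # 1.0 (up to floating point)
```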


Example: Fire Diagnosis
Suppose you want to diagnose whether there is a fire in a building. You receive a noisy report about whether everyone is leaving the building:
- if everyone is leaving, this may have been caused by a fire alarm;
- if there is a fire alarm, it may have been caused by a fire or by tampering;
- if there is a fire, there may be smoke.

Example: Fire Diagnosis
First you choose the variables. In this case, all are Boolean:
- Tampering is true when the alarm has been tampered with
- Fire is true when there is a fire
- Alarm is true when there is an alarm
- Smoke is true when there is smoke
- Leaving is true if there are lots of people leaving the building
- Report is true if the sensor reports that people are leaving the building

Example: Fire Diagnosis
Next, you order the variables: Fire; Tampering; Alarm; Smoke; Leaving; Report. Now evaluate which variables are conditionally independent given their parents:
- Fire is independent of Tampering (learning that one is true would not change your beliefs about the probability of the other).
- Alarm depends on both Fire and Tampering: it could be caused by either or both.
- Smoke is caused by Fire, and so is independent of Tampering and Alarm given whether there is a Fire.
- Leaving is caused by Alarm, and thus is independent of the other variables given Alarm.
- Report is caused by Leaving, and thus is independent of the other variables given Leaving.

Example: Fire Diagnosis
This corresponds to the following belief network:
[Belief network diagram with nodes Tampering, Fire, Alarm, Smoke, Leaving, Report: Tampering and Fire are parents of Alarm, Fire is the parent of Smoke, Alarm is the parent of Leaving, and Leaving is the parent of Report.]
Of course, we're not done until we also come up with conditional probability tables for each node in the graph.
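
To make this structure concrete, the sketch below encodes exactly the edges described on the previous slide (Tampering and Fire into Alarm, Fire into Smoke, Alarm into Leaving, Leaving into Report) and computes one entry of the joint as the product of the factors. The slides do not give conditional probability tables, so every number below is a placeholder assumption.

```python
# Structure of the fire-diagnosis network from the slides.
parents = {
    "Tampering": [],
    "Fire": [],
    "Alarm": ["Tampering", "Fire"],
    "Smoke": ["Fire"],
    "Leaving": ["Alarm"],
    "Report": ["Leaving"],
}

# P(variable = true | parent values). All numbers are assumptions for illustration.
p_true = {
    "Tampering": {(): 0.02},
    "Fire": {(): 0.01},
    "Alarm": {(True, True): 0.5, (True, False): 0.85,
              (False, True): 0.99, (False, False): 0.0001},
    "Smoke": {(True,): 0.9, (False,): 0.01},
    "Leaving": {(True,): 0.88, (False,): 0.001},
    "Report": {(True,): 0.75, (False,): 0.01},
}

variables = list(parents)

def joint(assignment):
    """P(X_1, ..., X_n) as the product of P(X_i | parents(X_i))."""
    p = 1.0
    for var in variables:
        key = tuple(assignment[parent] for parent in parents[var])
        pt = p_true[var][key]
        p *= pt if assignment[var] else 1.0 - pt
    return p

# One full joint entry, e.g. everything false:
print(joint({v: False for v in variables}))
```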

Example: Circuit Diagnosis
[Figure: an electrical wiring diagram (outside power, circuit breakers cb1 and cb2, switches s1-s3, wires w0-w6, lights l1 and l2, power outlets p1 and p2) together with the corresponding belief network over variables such as Outside_power, Cb1_st, Cb2_st, S1_pos, S1_st, S2_pos, S2_st, S3_pos, S3_st, W0, ..., W6, L1_lit, L1_st, L2_lit, L2_st, P1, P2.]
The belief network also specifies:
- The domains of the variables:
  - W0, ..., W6 have domain {live, dead}
  - S1_pos, S2_pos, and S3_pos have domain {up, down}
  - S1_st has domain {ok, upside_down, short, intermittent, broken}
- Conditional probabilities, including:
  - P(W1 = live | S1_pos = up ∧ S1_st = ok ∧ W3 = live)
  - P(W1 = live | S1_pos = up ∧ S1_st = ok ∧ W3 = dead)
  - P(S1_pos = up)
  - P(S1_st = upside_down)
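
Unlike the fire example, these variables are not Boolean. Here is a small sketch of how multi-valued domains and one slice of a table such as P(W1 = live | S1_pos, S1_st, W3) might be stored; the slides name the tables but do not give their entries, so the values below are placeholders.

```python
# Multi-valued domains from the circuit example.
domains = {
    "W3": ["live", "dead"],
    "S1_pos": ["up", "down"],
    "S1_st": ["ok", "upside_down", "short", "intermittent", "broken"],
    "W1": ["live", "dead"],
}

# A few entries of P(W1 = live | S1_pos, S1_st, W3), with assumed values.
p_w1_live = {
    ("up", "ok", "live"): 1.0,   # switch up and working, input wire live -> output live
    ("up", "ok", "dead"): 0.0,   # input wire dead -> output dead
    # remaining parent combinations omitted in this sketch
}

print(p_w1_live[("up", "ok", "live")])
```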

Example: Circuit Diagnosis
The power network can be used in a number of ways:
- Conditioning on the status of the switches and circuit breakers, on whether there is outside power, and on the positions of the switches, you can simulate the lighting.
- Given values for the switches, the outside power, and whether the lights are lit, you can determine the posterior probability that each switch or circuit breaker is ok or not.
- Given some switch positions, some outputs, and some intermediate values, you can determine the probability of any other variable in the network.
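
The circuit network is too large to reproduce here, but the second use listed above (computing a posterior given observed outputs) can be sketched on the smaller fire-diagnosis network: fix the evidence and sum the joint over all assignments consistent with it. The CPT numbers are the same placeholder assumptions as before, and the brute-force enumeration is only for illustration; practical inference algorithms avoid enumerating every assignment.

```python
from itertools import product

# Same assumed fire-diagnosis network as in the previous sketch.
parents = {"Tampering": [], "Fire": [], "Alarm": ["Tampering", "Fire"],
           "Smoke": ["Fire"], "Leaving": ["Alarm"], "Report": ["Leaving"]}
p_true = {"Tampering": {(): 0.02}, "Fire": {(): 0.01},
          "Alarm": {(True, True): 0.5, (True, False): 0.85,
                    (False, True): 0.99, (False, False): 0.0001},
          "Smoke": {(True,): 0.9, (False,): 0.01},
          "Leaving": {(True,): 0.88, (False,): 0.001},
          "Report": {(True,): 0.75, (False,): 0.01}}
variables = list(parents)

def joint(assignment):
    """Product of P(X_i | parents(X_i)) over all variables."""
    p = 1.0
    for var in variables:
        key = tuple(assignment[parent] for parent in parents[var])
        pt = p_true[var][key]
        p *= pt if assignment[var] else 1.0 - pt
    return p

def posterior(query_var, evidence):
    """P(query_var = true | evidence) by brute-force enumeration of all assignments."""
    num = den = 0.0
    for values in product([True, False], repeat=len(variables)):
        a = dict(zip(variables, values))
        if any(a[v] != val for v, val in evidence.items()):
            continue  # inconsistent with the observed evidence
        den += joint(a)
        if a[query_var]:
            num += joint(a)
    return num / den

# Posterior probability of a fire given that a report of people leaving was received.
print(posterior("Fire", {"Report": True}))
```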

Example: Liver Diagnosis
[Figure: a large belief network for liver diagnosis. Source: Onisko et al., 1999.]