EE562 ARTIFICIAL INTELLIGENCE FOR ENGINEERS


EE562 ARTIFICIAL INTELLIGENCE FOR ENGINEERS Lecture 16, 6/1/2005 University of Washington, Department of Electrical Engineering Spring 2005 Instructor: Professor Jeff A. Bilmes

Uncertainty & Bayesian Networks Chapter 13/14

Outline
- Inference: independence and Bayes' rule
- Chapter 14: syntax, semantics, parameterized distributions
- Inference in Bayesian networks

On the final: same format as the midterm, closed book/closed notes. It might cover all material of the quarter, including today (i.e., Chapters 1-9, 13, 14), but will not cover fuzzy logic. It will be weighted toward the latter half of the course, though.

Homework: last HW of the quarter, due next Wednesday, June 1st, in class. Chapter 13: 13.3, 13.7, 13.16; Chapter 14: 14.2, 14.3, 14.10.

Bayesian Networks Chapter 14

Bayesian networks
A simple, graphical notation for conditional independence assertions, and hence for compact specification of full joint distributions.
Syntax:
- a set of nodes, one per variable
- a directed, acyclic graph (link = "directly influences")
- a conditional distribution for each node given its parents: P(Xi | Parents(Xi))
In the simplest case, the conditional distribution is represented as a conditional probability table (CPT) giving the distribution over Xi for each combination of parent values.
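The node-plus-CPT syntax above can be sketched directly in code. This is an illustrative encoding (the dict layout and function names are my own choices) of the burglary alarm network from AIMA Chapter 14, whose CPT numbers are standard:

```python
# Sketch: each node's CPT maps a tuple of parent values to P(node = True);
# P(node = False) follows by complement. Layout is illustrative, not canonical.
cpts = {
    "Burglary":   {(): 0.001},
    "Earthquake": {(): 0.002},
    "Alarm": {  # parents: (Burglary, Earthquake)
        (True, True): 0.95, (True, False): 0.94,
        (False, True): 0.29, (False, False): 0.001,
    },
    "JohnCalls": {(True,): 0.90, (False,): 0.05},  # parent: Alarm
    "MaryCalls": {(True,): 0.70, (False,): 0.01},  # parent: Alarm
}
parents = {
    "Burglary": (), "Earthquake": (),
    "Alarm": ("Burglary", "Earthquake"),
    "JohnCalls": ("Alarm",), "MaryCalls": ("Alarm",),
}

def prob(var, value, assignment):
    """P(var = value | parent values taken from the assignment dict)."""
    key = tuple(assignment[p] for p in parents[var])
    p_true = cpts[var][key]
    return p_true if value else 1.0 - p_true
```

Each CPT row for Boolean variables needs only the `True` entry, which is why the table count stays manageable when parent sets are small.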

Example contd.

Semantics
The full joint distribution is defined as the product of the local conditional distributions:
P(X1, ..., Xn) = ∏i=1..n P(Xi | Parents(Xi))
e.g., P(j ∧ m ∧ a ∧ ¬b ∧ ¬e) = P(j | a) P(m | a) P(a | ¬b, ¬e) P(¬b) P(¬e)
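The factorization can be checked numerically. The sketch below multiplies out the example entry P(j ∧ m ∧ a ∧ ¬b ∧ ¬e) using the standard alarm-network CPT values from AIMA Figure 14.2:

```python
# One entry of the full joint, as a product of local conditional probabilities:
# P(j, m, a, not-b, not-e) = P(j|a) P(m|a) P(a|not-b, not-e) P(not-b) P(not-e)
p_b, p_e = 0.001, 0.002                  # priors on Burglary, Earthquake
p_a_given_nb_ne = 0.001                  # P(a | no burglary, no earthquake)
p_j_given_a, p_m_given_a = 0.90, 0.70    # calls given the alarm sounded

joint = (p_j_given_a * p_m_given_a * p_a_given_nb_ne
         * (1 - p_b) * (1 - p_e))
print(round(joint, 8))  # 0.00062811
```

Five numbers specify this entry, whereas the raw joint over five Boolean variables would need 2^5 entries in total.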

Local semantics: each node is conditionally independent of its nondescendants given its parents.
Theorem: local semantics ⟺ global semantics.

Example: car diagnosis
Initial evidence: car won't start. Testable variables (green); "broken, so fix it" variables (orange); hidden variables (gray) ensure sparse structure and reduce parameters.

Example: car insurance

Compact conditional distributions
A CPT grows exponentially with the number of parents, and becomes infinite with a continuous-valued parent or child.
Solution: canonical distributions that are defined compactly.
Deterministic nodes are the simplest case: X = f(Parents(X)) for some deterministic function f (possibly a logical form).
E.g., Boolean functions: NorthAmerican ⟺ Canadian ∨ US ∨ Mexican
E.g., numerical relationships among continuous variables

Compact conditional distributions (contd.)
Noisy-OR distributions model multiple noninteracting causes:
1) Parents U1, ..., Uk include all possible causes
2) Each cause alone has an independent failure probability qi
⟹ P(x | u1, ..., uj, ¬uj+1, ..., ¬uk) = 1 − ∏i=1..j qi
The number of parameters is linear in the number of parents.
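The noisy-OR rule above is a one-liner in code. This sketch evaluates it on AIMA's fever example, where Cold, Flu, and Malaria have failure probabilities 0.6, 0.2, and 0.1; the function name is my own:

```python
import math

def noisy_or(q, present):
    """Noisy-OR: P(X = true) = 1 - product of q_i over the causes present.

    q:       per-cause failure probabilities q_i
    present: booleans saying which causes are active
    """
    fail = math.prod(qi for qi, on in zip(q, present) if on)
    return 1.0 - fail

# Fever given cold and flu but not malaria: 1 - 0.6 * 0.2 = 0.88
print(noisy_or([0.6, 0.2, 0.1], [True, True, False]))
```

With no causes present the empty product is 1, so P(X) = 0; a "leak" cause can be added as an always-on parent if spontaneous occurrence is needed.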

Hybrid (discrete + continuous) networks
Discrete variables (Subsidy? and Buys?); continuous variables (Harvest and Cost).
Option 1: discretization (possibly large errors and large CPTs)
Option 2: finitely parameterized canonical families: Gaussians, logistic distributions (as used in neural networks)
Two cases: continuous variables with discrete + continuous parents (e.g., Cost); discrete variables with continuous parents (e.g., Buys?).
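For the discrete-child, continuous-parent case, one canonical family the slide alludes to is the logistic (sigmoid) model. This is an illustrative sketch of P(Buys = true | Cost = c); the midpoint `mu` and steepness `s` are made-up values, not from the lecture:

```python
import math

def p_buys_given_cost(c, mu=6.0, s=1.0):
    """Logistic CPD: probability of buying falls smoothly as cost rises.

    mu is the cost at which the buyer is indifferent (P = 0.5);
    s controls how sharp the transition is. Both are illustrative.
    """
    return 1.0 / (1.0 + math.exp((c - mu) / s))

print(round(p_buys_given_cost(6.0), 2))  # 0.5 at the midpoint
```

Two parameters replace what discretization would turn into a long, error-prone CPT column per cost bucket.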

Inference
- by enumeration
- by variable elimination
- by stochastic simulation
- by Markov chain Monte Carlo

Inference tasks
- Simple queries: compute a posterior marginal P(Xi | E = e), e.g., P(NoGas | Gauge = empty, Lights = on, Starts = false)
- Conjunctive queries: P(Xi, Xj | E = e) = P(Xi | E = e) P(Xj | Xi, E = e)
- Optimal decisions: decision networks include utility information; probabilistic inference is required for P(outcome | action, evidence)
- Value of information: which evidence to seek next?
- Sensitivity analysis: which probability values are most critical?
- Explanation: why do I need a new starter motor?
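A simple query can be answered by summing the joint over the hidden variables, which is exactly what the enumeration algorithm on the next slides does. This is a compact sketch in the spirit of AIMA's ENUMERATION-ASK, run on the alarm network (CPT values from AIMA Figure 14.2; dict layout and names are my own):

```python
# Inference by enumeration on the burglary alarm network.
parents = {"B": (), "E": (), "A": ("B", "E"), "J": ("A",), "M": ("A",)}
cpt = {
    "B": {(): 0.001}, "E": {(): 0.002},
    "A": {(True, True): 0.95, (True, False): 0.94,
          (False, True): 0.29, (False, False): 0.001},
    "J": {(True,): 0.90, (False,): 0.05},
    "M": {(True,): 0.70, (False,): 0.01},
}
order = ["B", "E", "A", "J", "M"]  # a topological ordering of the variables

def p(var, val, ev):
    """P(var = val | parent values, which must already be in ev)."""
    p_true = cpt[var][tuple(ev[x] for x in parents[var])]
    return p_true if val else 1.0 - p_true

def enumerate_all(vars_, ev):
    """Sum of the joint over all extensions of ev to the listed variables."""
    if not vars_:
        return 1.0
    v, rest = vars_[0], vars_[1:]
    if v in ev:  # evidence (or query) variable: single term
        return p(v, ev[v], ev) * enumerate_all(rest, ev)
    return sum(p(v, val, {**ev, v: val}) * enumerate_all(rest, {**ev, v: val})
               for val in (True, False))  # hidden variable: sum it out

def ask(query, ev):
    """Posterior P(query = True | ev), obtained by normalization."""
    dist = {val: enumerate_all(order, {**ev, query: val})
            for val in (True, False)}
    return dist[True] / (dist[True] + dist[False])

# Classic query: P(Burglary | JohnCalls = true, MaryCalls = true)
print(round(ask("B", {"J": True, "M": True}), 3))  # 0.284
```

The recursion visits the tree evaluated on the "Evaluation Tree" slide; repeated subexpressions like P(j | a) P(m | a) are recomputed here, which is the inefficiency that variable elimination removes by caching factors.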

Inference By Enumeration

Enumeration Algorithm

Evaluation Tree

Inference by Variable Elimination

Variable Elimination: Basic operations

Variable Elimination: Algorithm

Irrelevant variables

Irrelevant variables continued:

Complexity of exact inference

Inference by stochastic simulation

Sampling from empty network

Example

Rejection Sampling

Analysis of rejection sampling

Likelihood weighting

MCMC

Summary
- Bayesian networks provide a natural representation for (causally induced) conditional independence
- Topology + CPTs = compact representation of the joint distribution
- Generally easy for domain experts to construct
- Exact inference by variable elimination: polytime on polytrees, NP-hard on general graphs; space can be exponential as well
- Sampling approaches can help, as they only do approximate inference
- Take my Graphical Models class if more interested (much more theoretical depth)