Probabilistic Graphical Models and Bayesian Networks. Artificial Intelligence, Bert Huang, Virginia Tech



Concept Map for Segment: Probabilistic Graphical Models, Probabilistic Time Series Models, Particle Filters, (Neural Networks)

Outline: Probabilistic graphical models; Bayesian networks; Inference in Bayes nets

Probabilistic Graphical Models. PGMs represent probability distributions. They encode conditional independence structure with graphs, and they enable graph algorithms for inference and learning.

Probability Identities. Random variables are written in capitals (A), values in lowercase: A = a, or just a for shorthand.
P(a | b) = P(a, b) / P(b) (conditional probability)
P(a, b) = P(a | b) P(b) (joint probability)
P(b | a) = P(a | b) P(b) / P(a) (Bayes' rule)

Probability via Counting (a picture of 8 objects; 3 are red, and 2 of the red ones are circles)

Probability via Counting: P(circle, red) = 2/8 = 0.25

Probability via Counting: P(circle | red) = P(circle, red) / P(red) = (2/8) / (3/8) = 2/3

Probability via Counting: P(circle | red) P(red) = P(circle, red): (2/3) (3/8) = 2/8
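The counting argument above can be sketched directly. The collection of shapes below is hypothetical, but it matches the slide's counts (8 objects, 3 red, 2 of them red circles):

```python
from fractions import Fraction

# Hypothetical collection of 8 shapes matching the slide's counts:
# 3 red (2 circles, 1 square) and 5 blue.
objects = [("circle", "red"), ("circle", "red"), ("square", "red"),
           ("circle", "blue"), ("square", "blue"), ("square", "blue"),
           ("square", "blue"), ("square", "blue")]

def p(event):
    """Probability of an event (a predicate over objects) by counting."""
    return Fraction(sum(event(o) for o in objects), len(objects))

p_joint = p(lambda o: o == ("circle", "red"))  # P(circle, red) = 2/8
p_red = p(lambda o: o[1] == "red")             # P(red) = 3/8
p_cond = p_joint / p_red                       # P(circle | red) = 2/3

print(p_joint, p_red, p_cond)  # 1/4 3/8 2/3
```

Note that `p_cond * p_red == p_joint`, which is exactly the product rule from the identities slide.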


Bayesian Networks. Nodes: Win Lottery (L), Rain (R), Wet Ground (W), Slip (S). The conditional independence structure gives P(L, R, W) = P(L) P(R) P(W | R), and P(L, R, W, S) = P(L) P(R) P(W | R) P(S | W), using the fact that P(S | W) = P(S | W, R).

Bayesian Networks. Nodes: Rain (R), Car Wash (C), Wet Ground (W), Slip (S). P(R, W, S, C) = P(R) P(C) P(W | C, R) P(S | W). In general, each variable contributes a factor P(X | Parents(X)).
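As a sketch, the factorized joint for the Rain / Car Wash network can be written out directly. The structure is from the slide; the CPT numbers below are invented for illustration:

```python
from itertools import product

# Hypothetical CPTs (the graph structure is from the slide, the numbers are not).
P_R = {True: 0.3, False: 0.7}                   # P(R)
P_C = {True: 0.2, False: 0.8}                   # P(C)
P_W = {(True, True): 0.99, (True, False): 0.9,  # P(W=true | C, R), keyed by (C, R)
       (False, True): 0.8, (False, False): 0.05}
P_S = {True: 0.4, False: 0.01}                  # P(S=true | W), keyed by W

def joint(r, w, s, c):
    """P(R, W, S, C) as the product of the network's local factors."""
    pw = P_W[(c, r)] if w else 1 - P_W[(c, r)]
    ps = P_S[w] if s else 1 - P_S[w]
    return P_R[r] * P_C[c] * pw * ps

# The factorized joint is a proper distribution: it sums to 1 over all assignments.
total = sum(joint(r, w, s, c)
            for r, w, s, c in product([True, False], repeat=4))
print(round(total, 10))  # 1.0
```

The point of the factorization is size: four small tables instead of one table with 2^4 entries.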

Independence in Bayes Nets (example graph with nodes A, B, C, D, E). Each variable is conditionally independent of its non-descendants given its parents. Each variable is conditionally independent of any other variable given its Markov blanket: its parents, its children, and its children's parents.

Inference. Given a Bayesian network describing P(X, Y, Z), what is P(Y)? First approach: inference by enumeration.

P(R, W, S, C) = P(R) P(C) P(W | C, R) P(S | W)
P(r | s) = Σ_w Σ_c P(r, w, s, c) / P(s)
P(r | s) ∝ Σ_w Σ_c P(r) P(c) P(w | c, r) P(s | w)
P(r | s) ∝ P(r) Σ_w P(s | w) Σ_c P(c) P(w | c, r)
Enumerating the full joint costs O(2^n).
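Enumeration can be sketched numerically on the Rain / Car Wash network. The structure follows the slide; the CPT numbers here are invented:

```python
from itertools import product

# Hypothetical CPTs for the Rain / Car Wash / Wet Ground / Slip network.
P_R = {True: 0.3, False: 0.7}
P_C = {True: 0.2, False: 0.8}
P_W = {(True, True): 0.99, (True, False): 0.9,   # P(W=true | C, R), keyed by (C, R)
       (False, True): 0.8, (False, False): 0.05}
P_S = {True: 0.4, False: 0.01}                   # P(S=true | W), keyed by W

def joint(r, w, s, c):
    pw = P_W[(c, r)] if w else 1 - P_W[(c, r)]
    ps = P_S[w] if s else 1 - P_S[w]
    return P_R[r] * P_C[c] * pw * ps

def p_r_given_s(s=True):
    """Inference by enumeration: sum the joint over every hidden variable, then normalize."""
    scores = {}
    for r in [True, False]:
        scores[r] = sum(joint(r, w, s, c)
                        for w, c in product([True, False], repeat=2))
    z = sum(scores.values())  # the normalizer is P(s)
    return {r: v / z for r, v in scores.items()}

posterior = p_r_given_s(s=True)
print(posterior[True])  # rain is more likely once we observe a slip
```

The nested sum touches every joint assignment, which is exactly the O(2^n) cost the slide warns about.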

Second Approach: Variable Elimination
P(r | s) ∝ Σ_w Σ_c P(r) P(c) P(w | c, r) P(s | w)
f_C(w) = Σ_c P(c) P(w | c, r)
P(r | s) ∝ Σ_w P(r) P(s | w) f_C(w)

P(W, X, Y, Z) = P(W) P(X | W) P(Y | X) P(Z | Y). What is P(Y)?
P(Y) = Σ_w Σ_x Σ_z P(w) P(x | w) P(Y | x) P(z | Y)
f_W(x) = Σ_w P(w) P(x | w)
P(Y) = Σ_x Σ_z f_W(x) P(Y | x) P(z | Y)
f_X(Y) = Σ_x f_W(x) P(Y | x)
P(Y) = Σ_z f_X(Y) P(z | Y)

W → X → Y → Z: the same elimination, shown on the chain-structured graph.
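The chain elimination can be sketched as follows. The chain W → X → Y → Z is from the slide; the binary CPT numbers are invented:

```python
# Variable elimination for P(Y) on the chain W -> X -> Y -> Z.
# Hypothetical binary CPTs (values 0/1); only the structure follows the slide.
P_W = {0: 0.6, 1: 0.4}                                        # P(W)
P_X = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8}    # P(X=x | W=w), keyed (w, x)
P_Y = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.25, (1, 1): 0.75}  # P(Y=y | X=x), keyed (x, y)

# Eliminate W: f_W(x) = sum_w P(w) P(x | w)
f_W = {x: sum(P_W[w] * P_X[(w, x)] for w in (0, 1)) for x in (0, 1)}

# Eliminate X: f_X(y) = sum_x f_W(x) P(y | x)
f_X = {y: sum(f_W[x] * P_Y[(x, y)] for x in (0, 1)) for y in (0, 1)}

# Z is not an ancestor of the query, so sum_z P(z | y) = 1 and Z drops out.
P_Y_marginal = f_X
print(P_Y_marginal)  # a proper distribution over Y; it sums to 1
```

Each elimination step replaces a sum over one variable with a small cached factor, which is what keeps the chain computation linear rather than exponential.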

Variable Elimination. Every variable that is not an ancestor of a query variable or evidence variable is irrelevant to the query. Iterate: choose a variable to eliminate; sum the terms relevant to that variable, generating a new factor; repeat until no more variables remain to eliminate. Exact inference is #P-hard in general, but in tree-structured BNs it takes linear time (in the number of table entries).

Learning in Bayes Nets. Super easy (with fully observed data): estimate each conditional probability table by counting.
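The counting estimate can be sketched for a single edge. The data below is hypothetical (r, w) observations for the Rain → Wet Ground edge; the maximum-likelihood CPT entry is just a ratio of counts:

```python
from collections import Counter

# Hypothetical fully observed (r, w) pairs for the edge R -> W.
data = [(1, 1), (1, 1), (1, 0), (0, 0), (0, 0), (0, 1), (1, 1), (0, 0)]

pair_counts = Counter(data)            # count(r, w)
r_counts = Counter(r for r, _ in data)  # count(r)

# Maximum-likelihood estimate: P(W = w | R = r) = count(r, w) / count(r)
cpt = {(r, w): pair_counts[(r, w)] / r_counts[r]
       for (r, w) in pair_counts}

print(cpt[(1, 1)])  # 0.75: fraction of rainy observations with wet ground
```

Each node's CPT is estimated the same way, conditioning on its parents, which is why learning with fully observed data reduces to counting.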