Bayesian Networks: Exact Inference by Variable Elimination. Emma Rollon and Javier Larrosa


1  Bayesian Networks: Exact Inference by Variable Elimination
   Emma Rollon and Javier Larrosa

2  Recall the most usual queries

Let $X = E \cup Q \cup Z$, where $E$ are the evidence variables, $e$ are the observed values, $Q$ are the variables of interest, and $Z$ are the rest of the variables.

- Probability of Evidence ($Q = \emptyset$): $\Pr(e) = \sum_Z \Pr(e, Z)$
- Prior Marginals ($E = \emptyset$): $\Pr(Q) = \sum_Z \Pr(Q, Z)$
- Posterior Marginals ($E \neq \emptyset$): $\Pr(Q \mid e) = \sum_Z \Pr(Q, Z \mid e)$
- Most Probable Explanation (MPE) ($E \neq \emptyset$, $Z = \emptyset$): $q^* = \arg\max_Q \Pr(Q \mid e)$
- Maximum a Posteriori Probability (MAP): $q^* = \arg\max_Q \Pr(Q \mid e) = \arg\max_Q \sum_Z \Pr(Q, Z \mid e)$
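Before introducing VE, it may help to see the three marginal queries answered by brute-force enumeration over the full joint. The Python sketch below is not from the slides; the tiny network (A -> B -> C) and all numbers are invented for illustration.

```python
# Brute-force query answering over an explicitly built joint distribution.
# This is exponential in the number of variables; VE avoids building this table.
import itertools

p_a = {0: 0.6, 1: 0.4}                                        # Pr(A)
p_b_a = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}  # Pr(B=b | A=a), key (b, a)
p_c_b = {(0, 0): 0.8, (1, 0): 0.2, (0, 1): 0.3, (1, 1): 0.7}  # Pr(C=c | B=b), key (c, b)

joint = {(a, b, c): p_a[a] * p_b_a[(b, a)] * p_c_b[(c, b)]
         for a, b, c in itertools.product([0, 1], repeat=3)}

# Probability of evidence C = 1: sum out all other variables.
pr_e = sum(joint[(a, b, 1)] for a in (0, 1) for b in (0, 1))

# Prior marginal Pr(B): sum out A and C.
pr_b = {b: sum(joint[(a, b, c)] for a in (0, 1) for c in (0, 1)) for b in (0, 1)}

# Posterior marginal Pr(B | C = 1): sum out A, then normalize by Pr(C = 1).
pr_b_given_e = {b: sum(joint[(a, b, 1)] for a in (0, 1)) / pr_e for b in (0, 1)}

print(pr_e, pr_b, pr_b_given_e)
```

The point of the rest of the session is that the same answers can be obtained without ever materializing the exponentially large joint table.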

3  In this session

We are going to see a generic algorithm for inference (inference = query answering). It is called Variable Elimination because it eliminates, one by one, those variables which are irrelevant for the query. It relies on some basic operations on a class of functions known as factors, and it uses an algorithmic technique called dynamic programming.

We will illustrate it for the computation of prior and posterior marginals,
$$\Pr(Y) = \sum_Z \Pr(Y, Z) \qquad\qquad \Pr(Y \mid e) = \sum_Z \Pr(Y, Z \mid e)$$
It can be easily extended to other queries.

4  Factors

A factor $f$ is a function over a set of variables $\{X_1, X_2, \ldots, X_k\}$, called its scope, mapping each instantiation of these variables to a real number in $\mathbb{R}$.

[Table of an example factor $f(X, Y, Z)$ omitted in this transcription.]

Observation: Conditional Probability Tables (CPTs) are factors.

We define three operations on factors:
- Product: $f(X, Y, Z) \cdot g(Q, Y, R)$
- Conditioning: $f(X, y, Z)$
- Marginalization: $\sum_{y} f(X, y, Z)$
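One way to represent a factor in code, an assumption of this sketch rather than the slides' notation, is as a pair (scope, table), where the table maps complete assignments of the scope to real numbers; all values below are invented.

```python
import itertools

def make_factor(scope, values):
    """scope: tuple of variable names; values: numbers listed in the order
    produced by itertools.product([0, 1], repeat=len(scope))."""
    assignments = itertools.product([0, 1], repeat=len(scope))
    return (tuple(scope), dict(zip(assignments, values)))

# An example factor f(X, Y, Z) over three Boolean variables (8 entries).
f = make_factor(("X", "Y", "Z"), [0.1, 0.9, 0.4, 0.6, 0.3, 0.7, 0.8, 0.2])
print(f[1][(0, 1, 1)])   # value of f(X=0, Y=1, Z=1) -> 0.6
```

A CPT such as $\Pr(X \mid Y, Z)$ is simply a factor of this form whose entries happen to be conditional probabilities.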

5  Factors: Product

[Example tables for $f(X, Y)$, $g(Y, Z)$ and the product $f(X, Y) \cdot g(Y, Z)$ omitted in this transcription.]

Factor multiplication is commutative and associative. It is time and space exponential in the number of variables in the resulting factor.
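A sketch of factor product under the (scope, table) representation above; the result's scope is the union of the two scopes, which is why the operation is exponential in that union. Inputs are invented.

```python
import itertools

def product(f, g):
    """Pointwise product of factors f and g over the union of their scopes."""
    scope_f, tab_f = f
    scope_g, tab_g = g
    scope = scope_f + tuple(v for v in scope_g if v not in scope_f)
    table = {}
    for assign in itertools.product([0, 1], repeat=len(scope)):
        val = dict(zip(scope, assign))
        table[assign] = (tab_f[tuple(val[v] for v in scope_f)]
                         * tab_g[tuple(val[v] for v in scope_g)])
    return (scope, table)

f = (("X", "Y"), {(0, 0): 0.5, (0, 1): 0.5, (1, 0): 0.2, (1, 1): 0.8})
g = (("Y", "Z"), {(0, 0): 0.3, (0, 1): 0.7, (1, 0): 0.9, (1, 1): 0.1})
h = product(f, g)          # scope ("X", "Y", "Z"), 8 entries
print(h[1][(0, 1, 0)])     # f(0, 1) * g(1, 0) = 0.5 * 0.9 = 0.45
```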

6  Factors: Conditioning

[Example tables for $f(X, Y)$ and the conditioned factor $f(0, Y)$ omitted in this transcription.]
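A sketch of conditioning under the same representation: keep the rows consistent with the observed value and drop the variable from the scope. Numbers are invented.

```python
def condition(factor, var, value):
    """Restrict factor to var = value and remove var from the scope."""
    scope, table = factor
    i = scope.index(var)
    new_scope = scope[:i] + scope[i + 1:]
    new_table = {k[:i] + k[i + 1:]: v for k, v in table.items() if k[i] == value}
    return (new_scope, new_table)

f = (("X", "Y"), {(0, 0): 0.5, (0, 1): 0.5, (1, 0): 0.2, (1, 1): 0.8})
print(condition(f, "X", 0))   # (("Y",), {(0,): 0.5, (1,): 0.5}), i.e. f(0, Y)
```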

7  Factors: Marginalization

[Example tables for $f(X, Y)$ and the marginal $\sum_{x} f(x, Y)$ omitted in this transcription.]

Factor marginalization is commutative. It is time exponential in the number of variables in the original factor, and space exponential in the number of variables in the resulting factor.
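A sketch of marginalization: for each assignment of the remaining variables, add up the entries over all values of the eliminated variable. Numbers are invented.

```python
def sum_out(factor, var):
    """Sum var out of the factor (marginalization)."""
    scope, table = factor
    i = scope.index(var)
    new_scope = scope[:i] + scope[i + 1:]
    new_table = {}
    for k, v in table.items():
        key = k[:i] + k[i + 1:]
        new_table[key] = new_table.get(key, 0.0) + v
    return (new_scope, new_table)

f = (("X", "Y"), {(0, 0): 0.5, (0, 1): 0.5, (1, 0): 0.2, (1, 1): 0.8})
print(sum_out(f, "X"))   # (("Y",), {(0,): 0.7, (1,): 1.3})
```

The loop visits every entry of the original table, which is why the operation is time exponential in the original scope even though the result is smaller.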

8  Factors: Example

Given the following BN [figure omitted in this transcription], compute:
- $\Pr(S \mid I) \cdot \Pr(I)$
- $\Pr(G \mid i_1, D)$
- $\sum_G \Pr(L \mid G)$
- $\sum_L \Pr(L \mid G)$

9  The basic property: distributivity

When dealing with numbers, we know that
$$\sum_{i=1}^{n} K \cdot a_i = K \cdot \sum_{i=1}^{n} a_i$$
The same happens when dealing with factors (e.g., with $D(X) = \{x_0, x_1\}$):
$$\sum_X f(Z) \cdot g(X, Y) = f(Z) \cdot g(x_0, Y) + f(Z) \cdot g(x_1, Y) = f(Z) \cdot \big(g(x_0, Y) + g(x_1, Y)\big) = f(Z) \cdot \sum_X g(X, Y)$$
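A quick numeric check of the property on invented numbers: summing $X$ out of $f(Z) \cdot g(X, Y)$ gives the same result as multiplying $f(Z)$ by the already-summed $\sum_X g(X, Y)$.

```python
f_z = {0: 0.4, 1: 0.6}                          # f(Z)
g = {(0, 0): 1.0, (0, 1): 2.0,                  # g(X, Y), key (x, y)
     (1, 0): 3.0, (1, 1): 4.0}

for z in (0, 1):
    for y in (0, 1):
        lhs = sum(f_z[z] * g[(x, y)] for x in (0, 1))   # sum of products
        rhs = f_z[z] * sum(g[(x, y)] for x in (0, 1))   # product with the summed factor
        assert abs(lhs - rhs) < 1e-12
print("distributivity holds on this example")
```

The saving is that the right-hand side touches $f(Z)$ once per value of $(Z, Y)$ instead of once per value of $(X, Z, Y)$; VE exploits exactly this rewriting.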

10  VE: prior marginal on a chain

We will illustrate VE in the simplest case (a chain with no evidence):
- Boolean variables $X_i$
- The BN is a chain: $\Pr(X_1, X_2, X_3, X_4) = \Pr(X_1) \Pr(X_2 \mid X_1) \Pr(X_3 \mid X_2) \Pr(X_4 \mid X_3)$
- We have no evidence ($E = \emptyset$)
- The only variable of interest is $X_4$

Thus, we want to compute:
$$\Pr(X_4) = \sum_{X_1} \sum_{X_2} \sum_{X_3} \Pr(X_1) \Pr(X_2 \mid X_1) \Pr(X_3 \mid X_2) \Pr(X_4 \mid X_3)$$

11  VE: prior marginal on a chain

Elimination of $X_1$:
$$\Pr(X_4) = \sum_{X_1} \sum_{X_2} \sum_{X_3} \Pr(X_1) \Pr(X_2 \mid X_1) \Pr(X_3 \mid X_2) \Pr(X_4 \mid X_3)$$
$$= \sum_{X_2} \sum_{X_3} \sum_{X_1} \Pr(X_1) \Pr(X_2 \mid X_1) \Pr(X_3 \mid X_2) \Pr(X_4 \mid X_3)$$
$$= \sum_{X_2} \sum_{X_3} \Pr(X_3 \mid X_2) \Pr(X_4 \mid X_3) \Big( \sum_{X_1} \Pr(X_1) \Pr(X_2 \mid X_1) \Big)$$
$$= \sum_{X_2} \sum_{X_3} \Pr(X_3 \mid X_2) \Pr(X_4 \mid X_3) \, \lambda_1(X_2)$$
where $\lambda_1(X_2) = \sum_{X_1} \Pr(X_1) \Pr(X_2 \mid X_1)$.

12  VE: prior marginal on a chain

Elimination of $X_2$:
$$\Pr(X_4) = \sum_{X_2} \sum_{X_3} \Pr(X_3 \mid X_2) \Pr(X_4 \mid X_3) \, \lambda_1(X_2)$$
$$= \sum_{X_3} \sum_{X_2} \Pr(X_3 \mid X_2) \Pr(X_4 \mid X_3) \, \lambda_1(X_2)$$
$$= \sum_{X_3} \Pr(X_4 \mid X_3) \Big( \sum_{X_2} \Pr(X_3 \mid X_2) \, \lambda_1(X_2) \Big)$$
$$= \sum_{X_3} \Pr(X_4 \mid X_3) \, \lambda_2(X_3)$$
where $\lambda_2(X_3) = \sum_{X_2} \Pr(X_3 \mid X_2) \, \lambda_1(X_2)$.

13  VE: prior marginal on a chain

Elimination of $X_3$:
$$\Pr(X_4) = \sum_{X_3} \Pr(X_4 \mid X_3) \, \lambda_2(X_3) = \lambda_3(X_4)$$
Question: what would be the time and space complexity of this execution?
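A sketch of the whole chain computation with invented CPTs. Each step collapses the previous lambda and one CPT into a new one-variable lambda, so (one possible answer to the question above) the run is linear in the chain length: roughly $O(n \cdot d^2)$ time and $O(d)$ space per message for domain size $d$.

```python
p_x1 = {0: 0.7, 1: 0.3}                                          # Pr(X1)
p_x2_x1 = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.4, (1, 1): 0.6}   # Pr(X2 | X1), key (x2, x1)
p_x3_x2 = {(0, 0): 0.8, (1, 0): 0.2, (0, 1): 0.3, (1, 1): 0.7}   # Pr(X3 | X2)
p_x4_x3 = {(0, 0): 0.6, (1, 0): 0.4, (0, 1): 0.1, (1, 1): 0.9}   # Pr(X4 | X3)

def eliminate(lam, cpt):
    """Given lambda(parent) and Pr(child | parent), return lambda(child)."""
    return {c: sum(cpt[(c, p)] * lam[p] for p in (0, 1)) for c in (0, 1)}

lam1 = eliminate(p_x1, p_x2_x1)    # lambda_1(X2)
lam2 = eliminate(lam1, p_x3_x2)    # lambda_2(X3)
p_x4 = eliminate(lam2, p_x4_x3)    # lambda_3(X4) = Pr(X4)
print(p_x4, sum(p_x4.values()))    # a distribution over X4; the values sum to 1
```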

14  VE: prior marginal on general networks

Joint probability distribution:
$$\Pr(C, D, I, G, L, S, H, J) = \Pr(H \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid I) \Pr(I) \Pr(G \mid D, I) \Pr(D \mid C) \Pr(C)$$
Goal: $\Pr(G, L, S, H, J)$. Elimination order: $C, D, I$.

15  VE: prior marginal on general networks

Elimination of $C$:
$$\sum_{I, D, C} \Pr(H \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid I) \Pr(I) \Pr(G \mid D, I) \Pr(D \mid C) \Pr(C)$$
$$= \sum_{I, D} \Pr(H \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid I) \Pr(I) \Pr(G \mid D, I) \Big( \sum_C \Pr(D \mid C) \Pr(C) \Big)$$
$$= \sum_{I, D} \Pr(H \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid I) \Pr(I) \Pr(G \mid D, I) \, \lambda_C(D)$$

16  VE: prior marginal on general networks

Elimination of $D$:
$$\sum_{I, D} \Pr(H \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid I) \Pr(I) \Pr(G \mid D, I) \, \lambda_C(D)$$
$$= \sum_{I} \Pr(H \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid I) \Pr(I) \Big( \sum_D \Pr(G \mid D, I) \, \lambda_C(D) \Big)$$
$$= \sum_{I} \Pr(H \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid I) \Pr(I) \, \lambda_D(G, I)$$

17  VE: prior marginal on general networks

Elimination of $I$:
$$\sum_{I} \Pr(H \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid I) \Pr(I) \, \lambda_D(G, I)$$
$$= \Pr(H \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Big( \sum_I \Pr(S \mid I) \Pr(I) \, \lambda_D(G, I) \Big)$$
$$= \Pr(H \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \, \lambda_I(S, G)$$

18  VE: posterior marginal on general networks

Given the joint probability
$$\Pr(C, D, I, G, L, S, H, J) = \Pr(H \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid I) \Pr(I) \Pr(G \mid D, I) \Pr(D \mid C) \Pr(C)$$
and evidence $\{I = i, H = h\}$, we want to compute
$$\Pr(J, S, L \mid i, h) = \frac{\Pr(J, S, L, i, h)}{\Pr(i, h)} = \frac{\sum_{C, D, G} \Pr(C, D, i, G, L, S, h, J)}{\sum_{J, S, L, C, D, G} \Pr(C, D, i, G, L, S, h, J)}$$
using elimination order $C, D, G$.

Trick: compute $\Pr(J, S, L, i, h)$ and normalize.

19  VE: posterior marginal on general networks

Elimination of $C$:
$$\sum_{C, D, G} \Pr(h \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid i) \Pr(i) \Pr(G \mid D, i) \Pr(D \mid C) \Pr(C)$$
$$= \sum_{D, G} \Pr(h \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid i) \Pr(i) \Pr(G \mid D, i) \Big( \sum_C \Pr(D \mid C) \Pr(C) \Big)$$
$$= \sum_{D, G} \Pr(h \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid i) \Pr(i) \Pr(G \mid D, i) \, \lambda_C(D)$$

20  VE: posterior marginal on general networks

Elimination of $D$:
$$\sum_{D, G} \Pr(h \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid i) \Pr(i) \Pr(G \mid D, i) \, \lambda_C(D)$$
$$= \sum_{G} \Pr(h \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid i) \Pr(i) \Big( \sum_D \Pr(G \mid D, i) \, \lambda_C(D) \Big)$$
$$= \sum_{G} \Pr(h \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid i) \Pr(i) \, \lambda_D(G)$$

21  VE: posterior marginal on general networks

Elimination of $G$:
$$\sum_{G} \Pr(h \mid G, J) \Pr(J \mid L, S) \Pr(L \mid G) \Pr(S \mid i) \Pr(i) \, \lambda_D(G)$$
$$= \Pr(J \mid L, S) \Pr(S \mid i) \Pr(i) \Big( \sum_G \Pr(h \mid G, J) \Pr(L \mid G) \, \lambda_D(G) \Big)$$
$$= \Pr(J \mid L, S) \Pr(S \mid i) \Pr(i) \, \lambda_G(J, L)$$

22  VE: Summary

- Condition all factors on the evidence.
- Eliminate (marginalize out) each non-query variable $X$:
  - Move the factors mentioning $X$ towards the right.
  - Push the summation over $X$ towards the right.
  - Replace the resulting expression by a $\lambda$ function.
- Normalize to get a distribution.
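A minimal, self-contained sketch of this whole procedure (condition, eliminate, normalize) using the (scope, table) representation from the earlier sketches. The three-node chain A -> B -> C and its numbers are invented, and the function names are this sketch's own rather than any library API.

```python
import itertools

def product(f, g):
    """Pointwise product over the union of the two scopes."""
    sf, tf = f
    sg, tg = g
    scope = sf + tuple(v for v in sg if v not in sf)
    table = {}
    for a in itertools.product([0, 1], repeat=len(scope)):
        val = dict(zip(scope, a))
        table[a] = tf[tuple(val[v] for v in sf)] * tg[tuple(val[v] for v in sg)]
    return (scope, table)

def sum_out(f, var):
    """Marginalize var out of factor f."""
    scope, tab = f
    i = scope.index(var)
    out = {}
    for k, v in tab.items():
        key = k[:i] + k[i + 1:]
        out[key] = out.get(key, 0.0) + v
    return (scope[:i] + scope[i + 1:], out)

def condition(f, var, value):
    """Restrict f to var = value; factors not mentioning var are unchanged."""
    scope, tab = f
    if var not in scope:
        return f
    i = scope.index(var)
    return (scope[:i] + scope[i + 1:],
            {k[:i] + k[i + 1:]: v for k, v in tab.items() if k[i] == value})

def variable_elimination(factors, evidence, order):
    """factors: list of (scope, table); evidence: {var: value};
    order: the non-query, non-evidence variables, in elimination order."""
    # 1. Condition every factor on the evidence.
    for var, val in evidence.items():
        factors = [condition(f, var, val) for f in factors]
    # 2. Eliminate the variables in `order`: multiply the factors mentioning
    #    the variable, sum it out, and put the resulting lambda back in the pool.
    for var in order:
        mentioning = [f for f in factors if var in f[0]]
        factors = [f for f in factors if var not in f[0]]
        prod = mentioning[0]
        for f in mentioning[1:]:
            prod = product(prod, f)
        factors.append(sum_out(prod, var))
    # 3. Multiply what is left and normalize to obtain the posterior.
    result = factors[0]
    for f in factors[1:]:
        result = product(result, f)
    z = sum(result[1].values())
    return (result[0], {k: v / z for k, v in result[1].items()})

# Invented chain A -> B -> C; query Pr(C | A = 1), eliminating B.
p_a = (("A",), {(0,): 0.6, (1,): 0.4})
p_b_a = (("B", "A"), {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.3, (1, 1): 0.7})
p_c_b = (("C", "B"), {(0, 0): 0.8, (1, 0): 0.2, (0, 1): 0.5, (1, 1): 0.5})
print(variable_elimination([p_a, p_b_a, p_c_b], {"A": 1}, ["B"]))
# -> roughly (("C",), {(0,): 0.59, (1,): 0.41})
```

Normalizing at the end is exactly the trick of slide 18: the unnormalized result equals the joint of the query variables with the evidence, and dividing by its total mass gives the posterior.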

23  VE: Exercise

Given the following BN [figure omitted in this transcription], eliminate Letter and Difficulty from the network.

24  VE: Some tricks

Given a BN, a set of query variables $Q$, and a set of evidence variables $E$:
- Pruning edges: the effect of conditioning is to remove every edge going out of a node in $E$ (i.e., from an evidence node to its children).
- Pruning nodes: we can remove any leaf node (together with its CPT) from the network, as long as it does not belong to $Q \cup E$; this can be repeated until no such leaf remains.
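A sketch of the node-pruning trick only, assuming the network structure is given as a dict from each variable to its parent list (the variable names follow the slides' student-style network; the helper name is this sketch's own). Barren leaves can be dropped repeatedly because summing such a node out of its own CPT yields a constant factor of 1.

```python
def prune_nodes(parents, query, evidence):
    """Repeatedly drop leaf nodes that are neither query nor evidence."""
    parents = {v: list(ps) for v, ps in parents.items()}
    keep = set(query) | set(evidence)
    while True:
        leaves = [v for v in parents
                  if v not in keep and all(v not in ps for ps in parents.values())]
        if not leaves:
            return parents
        for v in leaves:
            del parents[v]

structure = {"C": [], "D": ["C"], "I": [], "G": ["D", "I"], "S": ["I"],
             "L": ["G"], "J": ["L", "S"], "H": ["G", "J"]}
print(sorted(prune_nodes(structure, query=["L"], evidence=["I"])))
# H, then J, then S disappear; C, D, G, I, L remain.
```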

25  Complexity and Elimination Order

$$\Pr(C, X_1, \ldots, X_n) = \Pr(C) \Pr(X_1 \mid C) \Pr(X_2 \mid C) \cdots \Pr(X_n \mid C)$$

Eliminating $X_n$ first:
$$\sum_{C, X_1, X_2, \ldots, X_{n-1}} \Pr(C) \Pr(X_1 \mid C) \Pr(X_2 \mid C) \cdots \Pr(X_{n-1} \mid C) \Big( \sum_{X_n} \Pr(X_n \mid C) \Big)$$
$$= \sum_{C, X_1, X_2, \ldots, X_{n-1}} \Pr(C) \Pr(X_1 \mid C) \Pr(X_2 \mid C) \cdots \Pr(X_{n-1} \mid C) \, \lambda_{X_n}(C)$$

Eliminating $C$ first:
$$\sum_{X_1, X_2, \ldots, X_n} \Big( \sum_C \Pr(C) \Pr(X_1 \mid C) \Pr(X_2 \mid C) \cdots \Pr(X_n \mid C) \Big)$$
$$= \sum_{X_1, X_2, \ldots, X_n} \lambda_C(X_1, X_2, \ldots, X_n)$$
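A small check of the contrast above, under the assumption that the scope of the lambda created by eliminating a variable is the union of the scopes of the factors mentioning it, minus the variable itself. The structure is the naive-Bayes network of the slide; n is an invented size.

```python
n = 10
scopes = [("C",)] + [(f"X{i}", "C") for i in range(1, n + 1)]   # Pr(C) and the Pr(Xi | C)

def lambda_scope(scopes, var):
    """Scope of the lambda produced by eliminating `var` from these factors."""
    touched = [s for s in scopes if var in s]
    joined = {v for s in touched for v in s} - {var}
    return tuple(sorted(joined))

print(lambda_scope(scopes, f"X{n}"))   # ('C',): a table with just 2 entries
print(lambda_scope(scopes, "C"))       # all ten Xi: a table with 2**10 entries
```

Eliminating the leaves first keeps every intermediate table tiny, while eliminating $C$ first couples all of its children in a single exponentially large lambda; this is why the elimination order matters.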
