CS281A/Stat241A Lecture 19


Slide 1: CS281A/Stat241A Lecture 19
Junction Tree Algorithm
Peter Bartlett

Slide 2: Announcements
My office hours:
- Tuesday Nov 3 (today), 1-2pm, in 723 Sutardja Dai Hall.
- Thursday Nov 5, 1-2pm, in 723 Sutardja Dai Hall.
Homework 5 is due 5pm Monday, November 16.

Slide 3: Key ideas of this lecture
Junction Tree Algorithm:
1. (For directed graphical models:) Moralize.
2. Triangulate.
3. Construct a junction tree.
4. Define potentials on maximal cliques.
5. Introduce evidence.
6. Propagate probabilities.

Slide 4: Junction Tree Algorithm
Inference: given a graph $G = (V, E)$, evidence $x_E$ for $E \subseteq V$, and a set $F \subseteq V$, compute $p(x_F \mid x_E)$.
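
To see what is at stake, here is a naive reference implementation (my own sketch, not from the lecture; `conditional`, its parameter format, and the toy numbers are all illustrative). It computes $p(x_F \mid x_E)$ by brute-force enumeration, which is exponential in the number of variables; the junction tree algorithm is the structured alternative.

```python
from itertools import product

def conditional(variables, card, factors, F, evidence):
    """Naive reference: p(x_F | x_E) by enumerating every joint
    assignment consistent with the evidence and normalizing.
    factors: list of (scope, table), table keyed by value tuples."""
    names = list(variables)
    dist = {}
    for x in product(*(range(card[v]) for v in names)):
        a = dict(zip(names, x))
        if any(a[v] != val for v, val in evidence.items()):
            continue  # inconsistent with the observed x_E
        p = 1.0
        for scope, table in factors:
            p *= table[tuple(a[v] for v in scope)]
        key = tuple(a[v] for v in F)
        dist[key] = dist.get(key, 0.0) + p
    z = sum(dist.values())
    return {k: v / z for k, v in dist.items()}

# Toy example: p(A | B = 1) from a single pairwise potential.
psi = {(0, 0): 4.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}
print(conditional(["A", "B"], {"A": 2, "B": 2},
                  [(("A", "B"), psi)], ["A"], {"B": 1}))
# {(0,): 0.25, (1,): 0.75}
```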

Slide 5: Junction Tree Algorithm
- Elimination: a single set $F$; any $G$.
- Sum-product: all singleton sets $F$ simultaneously; $G$ a tree.
- Junction tree: all cliques $F$ simultaneously; any $G$.

Slide 6: Junction Tree Algorithm
1. (For directed graphical models:) Moralize.
2. Triangulate.
3. Construct a junction tree.
4. Define potentials on maximal cliques.
5. Introduce evidence.
6. Propagate probabilities.

Slide 7: 1. Moralize
In a directed graphical model, local conditionals are functions of a variable and its parents: $p(x_i \mid x_{\pi(i)}) = \psi_{\pi(i) \cup \{i\}}$. To represent this as an undirected model, the set $\pi(i) \cup \{i\}$ must be a clique. We therefore consider the moral graph: all parents are connected.
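
A minimal sketch of moralization (my own illustration; the `moralize` function and its input format are assumptions, not the course's code):

```python
from itertools import combinations

def moralize(parents):
    """Moralize a DAG given as {node: set of parents}: make each
    directed edge undirected and 'marry' the parents of every node,
    so that pi(i) together with i forms a clique."""
    edges = set()
    for child, pa in parents.items():
        for p in pa:
            edges.add(frozenset((p, child)))   # drop edge direction
        for p, q in combinations(sorted(pa), 2):
            edges.add(frozenset((p, q)))       # connect co-parents
    return edges

# A v-structure A -> C <- B gains the moral edge A - B.
print(moralize({"A": set(), "B": set(), "C": {"A", "B"}}))
```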

Slide 8: Junction Tree Algorithm
1. (For directed graphical models:) Moralize.
2. Triangulate (e.g., run UNDIRECTEDGRAPHELIMINATE).
3. Construct a junction tree.
4. Define potentials on maximal cliques.
5. Introduce evidence.
6. Propagate probabilities.

Slide 9: 2. Triangulate: Motivation
Theorem: A graph $G$ is chordal iff it has a junction tree.
Recall that all of the following are equivalent:
- $G$ is chordal;
- $G$ is decomposable;
- $G$ is recursively simplicial;
- $G$ is the fixed point of UNDIRECTEDGRAPHELIMINATE;
- an oriented version of $G$ has moral graph $G$;
- $G$ implies the same conditional independences as some directed graph.

Slide 10: Chordal/Triangulated
Definitions:
- A cycle for a graph $G = (V, E)$ is a vertex sequence $v_1, \ldots, v_n$ with $v_1 = v_n$, all other vertices distinct, and $\{v_i, v_{i+1}\} \in E$.
- A cycle has a chord if it contains a pair $v_i, v_j$, nonconsecutive in the cycle, with $\{v_i, v_j\} \in E$.
- A graph is chordal or triangulated if every cycle of length at least four has a chord.

Slide 11: Junction Tree: Definition
A clique tree for a graph $G = (V, E)$ is a tree $T = (V_T, E_T)$ where $V_T$ is a set of cliques of $G$ and $V_T$ contains all maximal cliques of $G$.
A junction tree for a graph $G$ is a clique tree $T = (V_T, E_T)$ for $G$ that has the running intersection property: for any cliques $C_1$ and $C_2$ in $V_T$, every clique on the path connecting $C_1$ and $C_2$ contains $C_1 \cap C_2$.
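
The running intersection property is easy to check directly. A small sketch (my own; the function name and the index-pair edge format are illustrative):

```python
def has_running_intersection(cliques, edges):
    """Check the running intersection property of a clique tree:
    for every pair of cliques, each clique on the tree path between
    them must contain their intersection.
    cliques: list of frozensets; edges: list of (i, j) index pairs."""
    adj = {i: [] for i in range(len(cliques))}
    for i, j in edges:
        adj[i].append(j)
        adj[j].append(i)

    def tree_path(i, j):
        stack = [(i, [i])]          # DFS for the unique path in a tree
        while stack:
            u, p = stack.pop()
            if u == j:
                return p
            stack.extend((v, p + [v]) for v in adj[u] if v not in p)

    for i in range(len(cliques)):
        for j in range(i + 1, len(cliques)):
            common = cliques[i] & cliques[j]
            if not all(common <= cliques[k] for k in tree_path(i, j)):
                return False
    return True

cliques = [frozenset("AB"), frozenset("BC"), frozenset("CD")]
print(has_running_intersection(cliques, [(0, 1), (1, 2)]))  # True
print(has_running_intersection(cliques, [(0, 2), (1, 2)]))  # False
```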

Slide 12: Junction Tree: Example
[Figure: an example graph with maximal cliques $C_1, \ldots, C_6$, and a clique tree built on those cliques.]

Slide 13: 2. Triangulate
Theorem: A graph $G$ is chordal iff it has a junction tree.
Recall Chordal $\Leftrightarrow$ Recursively Simplicial, and we'll show:
Recursively Simplicial $\Rightarrow$ Junction Tree $\Rightarrow$ Chordal.

Slide 14: Recursively Simplicial $\Rightarrow$ Junction Tree
Recall: a graph $G$ is recursively simplicial if it contains a simplicial vertex $v$ (one whose neighbors form a clique) and, when $v$ is removed, the remaining graph is recursively simplicial.
Proof idea (induction step): Consider a recursively simplicial graph $G$ of size $n + 1$. Removing a simplicial vertex $v$ leaves a subgraph $G'$ with a junction tree $T$ (by the inductive hypothesis). Let $N$ be a clique in $T$ containing $v$'s neighbors, and let $C$ be the new clique consisting of $v$ and its neighbors. To obtain a junction tree for $G$:
- If $N$ contains only $v$'s neighbors, replace it with $C$.
- Otherwise, add $C$ with an edge to $N$.
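
The "recursively simplicial" characterization gives a direct chordality test: repeatedly delete a simplicial vertex and see whether you get stuck. A sketch (mine, with an illustrative adjacency-dict graph encoding):

```python
def is_chordal(adj):
    """Test chordality by recursive simplicial elimination:
    repeatedly delete a vertex whose neighbors form a clique."""
    adj = {v: set(nb) for v, nb in adj.items()}  # defensive copy
    while adj:
        simplicial = next(
            (v for v, nb in adj.items()
             if all(q in adj[p] for p in nb for q in nb if p != q)),
            None)
        if simplicial is None:
            return False  # no simplicial vertex: not recursively simplicial
        for u in adj.pop(simplicial):
            adj[u].discard(simplicial)
    return True

# A 4-cycle has no chord, so it is not chordal.
c4 = {1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}
print(is_chordal(c4))  # False
```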

Slide 15: Junction Tree $\Rightarrow$ Chordal
Proof idea (induction step): Consider a junction tree $T$ of size $n + 1$. Take a leaf $C$ of $T$ and its neighbor $N$ in $T$. Remove $C$ from $T$ and remove the vertices $C \setminus N$ from $V$. The remaining tree is a junction tree for the remaining (chordal) graph. All cycles have a chord, since every cycle either
1. lies completely in the remaining graph,
2. lies completely in $C$, or
3. passes through $C \setminus N$, $C \cap N$, and $V \setminus C$ (and $C \cap N$ is complete, so the cycle contains a chord).

Slide 16: Junction Tree Algorithm
1. (For directed graphical models:) Moralize.
2. Triangulate.
3. Construct a junction tree: find a maximal spanning tree.
4. Define potentials on maximal cliques.
5. Introduce evidence.
6. Propagate probabilities.

Slide 17: Junction Tree is Maximal Spanning Tree
Define: the weight of a clique tree $T = (\mathcal{C}, E)$ is
$$w(T) = \sum_{(C, C') \in E} |C \cap C'|.$$
A maximal spanning tree for a clique set $\mathcal{C}$ of a graph is a clique tree $T = (\mathcal{C}, E^*)$ with maximal weight over all clique trees of the form $(\mathcal{C}, E)$.

Slide 18: Junction Tree is Maximal Spanning Tree
Theorem: Suppose that a graph $G$ with clique set $\mathcal{C}$ has a junction tree. Then a clique tree $(\mathcal{C}, E)$ is a junction tree iff it is a maximal spanning tree for $\mathcal{C}$.
There are efficient greedy algorithms for finding the maximal spanning tree: while the graph is not connected, add the edge with the largest-weight separating set that does not create a cycle.
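
A Kruskal-style sketch of this greedy procedure (my own code; it assumes cliques are given as frozensets and uses union-find to avoid cycles). By the theorem above, when the cliques come from a chordal graph, the result is a junction tree.

```python
from itertools import combinations

def max_spanning_clique_tree(cliques):
    """Greedy maximal spanning tree over cliques: sort candidate
    edges by separator size |C ∩ C'| (descending) and accept any
    edge that does not close a cycle. Returns index pairs."""
    parent = list(range(len(cliques)))

    def find(i):                      # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    candidates = sorted(
        combinations(range(len(cliques)), 2),
        key=lambda e: len(cliques[e[0]] & cliques[e[1]]),
        reverse=True)
    tree = []
    for i, j in candidates:
        ri, rj = find(i), find(j)
        if ri != rj:                  # no cycle: accept the edge
            parent[ri] = rj
            tree.append((i, j))
    return tree

cliques = [frozenset("AB"), frozenset("BC"), frozenset("CD")]
print(max_spanning_clique_tree(cliques))  # [(0, 1), (1, 2)]
```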

Slide 19: Maximal Spanning Tree: Proof
Consider a graph $G$ with vertices $V$. For a clique tree $(\mathcal{C}, E)$, define the (multiset of) separators $\mathcal{S} = \{ C \cap C' : (C, C') \in E \}$. Since $T$ is a tree, for any $k \in V$,
$$\sum_{S \in \mathcal{S}} 1[k \in S] \le \sum_{C \in \mathcal{C}} 1[k \in C] - 1,$$
with equality iff the subgraph of $T$ on the nodes containing $k$ is a tree (i.e., connected).

Slide 20: Maximal Spanning Tree: Proof
Thus,
$$w(T) = \sum_{S \in \mathcal{S}} |S| = \sum_{S \in \mathcal{S}} \sum_k 1[k \in S] \le \sum_k \left( \sum_{C \in \mathcal{C}} 1[k \in C] - 1 \right) = \sum_{C \in \mathcal{C}} |C| - n,$$
where $n = |V|$, with equality iff $T$ is a junction tree.

Slide 21: Junction Tree Algorithm
1. (For directed graphical models:) Moralize.
2. Triangulate.
3. Construct a junction tree.
4. Define potentials on maximal cliques.
5. Introduce evidence.
6. Propagate probabilities.

Slide 22: Define potentials on maximal cliques
To express a product of clique potentials as a product of maximal clique potentials,
$$\psi_{S_1}(x_{S_1}) \cdots \psi_{S_N}(x_{S_N}) = \prod_{C \in \mathcal{C}} \psi_C(x_C):$$
1. Set all maximal clique potentials to 1.
2. For each potential, incorporate (multiply) it into a maximal clique potential containing its variables.
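
A sketch of this two-step assignment (my own; the `(scope, table)` factor format and the toy tables are illustrative):

```python
def assign_factors(max_cliques, factors):
    """Express a product of potentials as a product over maximal
    cliques: each clique potential starts at 1 (an empty product),
    then each factor is multiplied into some maximal clique that
    covers its scope.
    max_cliques: list of frozensets; factors: list of (scope, table)."""
    assignment = {i: [] for i in range(len(max_cliques))}
    for scope, table in factors:
        # Any maximal clique containing the factor's scope will do.
        home = next(i for i, c in enumerate(max_cliques) if scope <= c)
        assignment[home].append((scope, table))
    return assignment

cliques = [frozenset("ABC"), frozenset("CD")]
factors = [(frozenset("AB"), "psi_AB"), (frozenset("C"), "psi_C"),
           (frozenset("CD"), "psi_CD")]
print(assign_factors(cliques, factors))
# {0: [('AB', psi_AB), ('C', psi_C)], 1: [('CD', psi_CD)]}
```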

Slide 23: Junction Tree Algorithm
1. (For directed graphical models:) Moralize.
2. Triangulate.
3. Construct a junction tree.
4. Define potentials on maximal cliques.
5. Introduce evidence.
6. Propagate probabilities.

Slide 24: Introduce Evidence
Two equivalent approaches:
1. Introduce evidence potentials $\delta(x_i, \bar{x}_i)$ for $i \in E$, so that marginalizing fixes $x_E = \bar{x}_E$.
2. Take a slice of each clique potential:
$$\bar{\psi}_C(x_C) := \psi_C\left(x_{C \cap (V \setminus E)}, \bar{x}_{C \cap E}\right).$$
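
A sketch of the slicing approach (approach 2; my own code, and the table-potential format keyed by value tuples is an assumption):

```python
def condition_potential(scope, table, evidence):
    """Slice a table potential on observed values: keep only the
    entries consistent with the evidence.
    scope: tuple of variable names; table: {assignment_tuple: value};
    evidence: {var: observed value}."""
    fixed = {i: evidence[v] for i, v in enumerate(scope) if v in evidence}
    return {a: val for a, val in table.items()
            if all(a[i] == x for i, x in fixed.items())}

psi = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.6}
print(condition_potential(("A", "B"), psi, {"B": 1}))
# {(0, 1): 0.1, (1, 1): 0.6}
```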

Slide 25: Junction Tree Algorithm
1. (For directed graphical models:) Moralize.
2. Triangulate.
3. Construct a junction tree.
4. Define potentials on maximal cliques.
5. Introduce evidence.
6. Propagate probabilities.

Slide 26: Hugin Algorithm
Add potentials for separator sets.
Recall: if $G$ has a junction tree $T = (\mathcal{C}, \mathcal{S})$, then any probability distribution that satisfies the conditional independences implied by $G$ can be factorized as
$$p(x) = \frac{\prod_{C \in \mathcal{C}} p(x_C)}{\prod_{S \in \mathcal{S}} p(x_S)},$$
where, if $S = \{C_1, C_2\}$, then $x_S$ denotes $x_{C_1 \cap C_2}$.
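
As a quick concrete instance (a standard example, not spelled out on the slide): the chain $A - B - C$ has cliques $\{A, B\}$ and $\{B, C\}$ with separator $\{B\}$, and since $C \perp A \mid B$,
$$p(a, b, c) = p(a, b)\, p(c \mid b) = \frac{p(a, b)\, p(b, c)}{p(b)}.$$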

Slide 27: Hugin Algorithm
Represent $p$ with potentials on cliques and separators:
$$p(x) = \frac{\prod_{C \in \mathcal{C}} \psi_C(x_C)}{\prod_{S \in \mathcal{S}} \phi_S(x_S)}.$$
This can represent any distribution that an undirected graphical model can (since we can set $\phi_S \equiv 1$), but nothing more (since we can incorporate each $\phi_S$ into one of the $\psi_C$'s with $S \subseteq C$).

Slide 28: Hugin Algorithm
1. Initialize: $\psi_C(x_C) = {}$ the appropriate clique potential; $\phi_S(x_S) = 1$.
2. Update the potentials so that $p(x)$ is invariant and $\psi_C, \phi_S$ become the marginal distributions.

Slide 29: Algorithm
Messages are passed between cliques in the junction tree. A message is passed from $V$ to an adjacent vertex $W$ once $V$ has received messages from all of its other neighbors. The message corresponds to the updates
$$\phi_S^{(1)}(x_S) = \sum_{x_{V \setminus S}} \psi_V(x_V), \qquad \psi_W^{(1)}(x_W) = \psi_W(x_W)\, \frac{\phi_S^{(1)}(x_S)}{\phi_S(x_S)}, \qquad \psi_V^{(1)}(x_V) = \psi_V(x_V),$$
where $S = V \cap W$ is the separator.
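
A sketch of one such update on table potentials (my own code, not the course's; scopes are tuples of variable names, and $\phi_S$ is assumed to be keyed in separator order):

```python
def marginalize(scope, table, keep):
    """Sum a table potential down to the variables in `keep`
    (output keys follow the ordering of `keep`)."""
    idx = [scope.index(v) for v in keep]
    out = {}
    for assignment, val in table.items():
        key = tuple(assignment[i] for i in idx)
        out[key] = out.get(key, 0.0) + val
    return out

def pass_message(scope_V, psi_V, scope_W, psi_W, sep, phi_S):
    """One Hugin update V -> W over separator S = V ∩ W:
    phi' = sum over x_{V\\S} of psi_V;  psi_W *= phi'/phi_S;
    psi_V is left unchanged."""
    sep = tuple(sep)
    phi_new = marginalize(scope_V, psi_V, sep)
    idx = [scope_W.index(v) for v in sep]
    psi_W_new = {
        a: val * phi_new[tuple(a[i] for i in idx)]
               / phi_S[tuple(a[i] for i in idx)]
        for a, val in psi_W.items()
    }
    return psi_W_new, phi_new

# Toy usage: V = (A, B), W = (B, C), separator (B,).
psi_V = {(0, 0): 2., (0, 1): 1., (1, 0): 1., (1, 1): 3.}
psi_W = {(0, 0): 1., (0, 1): 2., (1, 0): 4., (1, 1): 1.}
phi_S = {(0,): 1., (1,): 1.}
psi_W, phi_S = pass_message(("A", "B"), psi_V, ("B", "C"), psi_W,
                            ("B",), phi_S)
print(phi_S)  # {(0,): 3.0, (1,): 4.0}
```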

Slide 30: Analysis
1. $p(x)$ is invariant under these updates.
2. In a tree, messages can pass in both directions over every edge.
3. The potentials are locally consistent after messages have passed in both directions.
4. In a graph with a junction tree, local consistency implies global consistency.

Slide 31: Analysis
$$p(x) = \frac{\prod_{C \in \mathcal{C}} \psi_C(x_C)}{\prod_{S \in \mathcal{S}} \phi_S(x_S)}.$$
In this ratio, only the factor $\frac{\psi_V \psi_W}{\phi_S}$ changes, to
$$\frac{\psi_V^{(1)} \psi_W^{(1)}}{\phi_S^{(1)}} = \frac{\psi_V \psi_W \phi_S^{(1)}}{\phi_S^{(1)} \phi_S} = \frac{\psi_V \psi_W}{\phi_S}.$$

Slide 32: Analysis
1. $p(x)$ is invariant under these updates.
2. In a tree, messages can pass in both directions over every edge.
3. The potentials are locally consistent after messages have passed in both directions.
4. In a graph with a junction tree, local consistency implies global consistency.

Slide 33: Analysis: Local Consistency
Suppose that messages pass both ways, from $V$ to $W$ and back.
1. A message passes from $V$ to $W$:
$$\phi_S^{(1)}(x_S) = \sum_{x_{V \setminus S}} \psi_V(x_V), \qquad \psi_W^{(1)}(x_W) = \psi_W(x_W)\, \frac{\phi_S^{(1)}(x_S)}{\phi_S(x_S)}, \qquad \psi_V^{(1)}(x_V) = \psi_V(x_V).$$

Slide 34: Analysis: Local Consistency
2. Other messages pass to $W$. These do not cross the separator $S$, so
$$\phi_S^{(2)} = \phi_S^{(1)}, \qquad \psi_V^{(2)} = \psi_V^{(1)},$$
while $\psi_W^{(2)}$ is whatever the other incoming updates make it.

Slide 35: Analysis: Local Consistency
3. A message passes from $W$ to $V$:
$$\phi_S^{(3)}(x_S) = \sum_{x_{W \setminus S}} \psi_W^{(2)}(x_W), \qquad \psi_V^{(3)}(x_V) = \psi_V^{(2)}(x_V)\, \frac{\phi_S^{(3)}(x_S)}{\phi_S^{(2)}(x_S)}, \qquad \psi_W^{(3)}(x_W) = \psi_W^{(2)}(x_W).$$
Subsequently, $\phi_S, \psi_V, \psi_W$ remain unchanged.

Slide 36: Analysis: Local Consistency
After messages have passed in both directions, these potentials are locally consistent:
$$\sum_{x_{V \setminus S}} \psi_V^{(3)}(x_V) = \sum_{x_{W \setminus S}} \psi_W^{(3)}(x_W) = \phi_S^{(3)}(x_S),$$
c.f.
$$\sum_{x_{V \setminus S}} p(x_V) = \sum_{x_{W \setminus S}} p(x_W) = p(x_S).$$
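
To make local consistency concrete, here is a tiny numeric check (my own example, not from the lecture): two cliques $V = \{A, B\}$ and $W = \{B, C\}$ with separator $\{B\}$; after one message each way, both clique marginals match the separator potential.

```python
import numpy as np

# Clique tables: psi_V[a, b] and psi_W[b, c]; separator phi_S[b].
psi_V = np.array([[2., 1.], [1., 3.]])
psi_W = np.array([[1., 2.], [4., 1.]])
phi_S = np.ones(2)

# Message V -> W, then W -> V (the two Hugin updates above).
phi1 = psi_V.sum(axis=0)                  # sum out A
psi_W = psi_W * (phi1 / phi_S)[:, None]
phi3 = psi_W.sum(axis=1)                  # sum out C
psi_V = psi_V * (phi3 / phi1)[None, :]

# Local consistency: all three marginals over B agree.
print(psi_V.sum(axis=0), psi_W.sum(axis=1), phi3)
# [ 9. 20.] [ 9. 20.] [ 9. 20.]
```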

Slide 37: Analysis: Local Consistency Proof
$$\sum_{x_{V \setminus S}} \psi_V^{(3)}(x_V) = \sum_{x_{V \setminus S}} \psi_V^{(2)}(x_V)\, \frac{\phi_S^{(3)}(x_S)}{\phi_S^{(2)}(x_S)} \qquad (W \to V \text{ update})$$
$$= \frac{\phi_S^{(3)}(x_S)}{\phi_S^{(1)}(x_S)} \sum_{x_{V \setminus S}} \psi_V(x_V) \qquad (\psi_V^{(2)} = \psi_V,\ \phi_S^{(2)} = \phi_S^{(1)})$$
$$= \phi_S^{(3)}(x_S) \qquad (V \to W \text{ update: } \textstyle\sum_{x_{V \setminus S}} \psi_V = \phi_S^{(1)})$$
$$= \sum_{x_{W \setminus S}} \psi_W^{(2)}(x_W) = \sum_{x_{W \setminus S}} \psi_W^{(3)}(x_W).$$

Slide 38: Analysis
1. $p(x)$ is invariant under these updates.
2. In a tree, messages can pass in both directions over every edge.
3. The potentials are locally consistent after messages have passed in both directions.
4. In a graph with a junction tree, local consistency implies global consistency.

Slide 39: Analysis: Local implies Global
Local consistency: for all adjacent cliques $V, W$ with separator $S$,
$$\sum_{x_{V \setminus S}} \psi_V^{(3)}(x_V) = \sum_{x_{W \setminus S}} \psi_W^{(3)}(x_W) = \phi_S^{(3)}(x_S).$$
Global consistency: for all cliques $C$,
$$\psi_C(x_C) = \sum_{x_{C^c}} \frac{\prod_{C' \in \mathcal{C}} \psi_{C'}(x_{C'})}{\prod_{S \in \mathcal{S}} \phi_S(x_S)} = p(x_C).$$

Slide 40: Local implies Global: Proof
Induction on the number of cliques. Trivial for one clique. Assume the result holds for all junction trees with $n$ cliques, and consider a tree $T_W$ of size $n + 1$. Fix a leaf $B$, attached by separator $R$. Define:
- $W$ = all variables,
- $N = B \setminus R$,
- $V = W \setminus N$,
- $T_V$ = the junction tree without $B$.

Slide 41: Local implies Global: Proof
The junction tree $T_V$:
- has only the variables $V$ (from the junction tree property);
- satisfies local consistency;
- hence satisfies global consistency for $V$: for all $A \in T_V$,
$$\psi_A(x_A) = \sum_{x_{V \setminus A}} \frac{\prod_{C \in \mathcal{C}_V} \psi_C(x_C)}{\prod_{S \in \mathcal{S}_V} \phi_S(x_S)}.$$

Slide 42: Local implies Global: Proof
But viewing this $A$ in the larger $T_W$, we have
$$\sum_{x_{W \setminus A}} \frac{\prod_{C \in \mathcal{C}_W} \psi_C(x_C)}{\prod_{S \in \mathcal{S}_W} \phi_S(x_S)} = \sum_{x_{V \setminus A}} \frac{\prod_{C \in \mathcal{C}_V} \psi_C(x_C)}{\prod_{S \in \mathcal{S}_V} \phi_S(x_S)} \sum_{x_N} \frac{\psi_B(x_B)}{\phi_R(x_R)} = \sum_{x_{V \setminus A}} \frac{\prod_{C \in \mathcal{C}_V} \psi_C(x_C)}{\prod_{S \in \mathcal{S}_V} \phi_S(x_S)} = \psi_A(x_A),$$
since $\mathcal{C}_W = \mathcal{C}_V \cup \{B\}$, $\mathcal{S}_W = \mathcal{S}_V \cup \{R\}$, and local consistency gives $\sum_{x_N} \psi_B(x_B) = \phi_R(x_R)$.

Slide 43: Local implies Global: Proof
And for the new clique $B$: suppose $A$ is the neighbor of $B$ in $T_W$. Then
$$\sum_{x_{W \setminus B}} \frac{\prod_{C \in \mathcal{C}_W} \psi_C(x_C)}{\prod_{S \in \mathcal{S}_W} \phi_S(x_S)} = \frac{\psi_B(x_B)}{\phi_R(x_R)} \sum_{x_{A \setminus R}} \sum_{x_{V \setminus A}} \frac{\prod_{C \in \mathcal{C}_V} \psi_C(x_C)}{\prod_{S \in \mathcal{S}_V} \phi_S(x_S)} = \frac{\psi_B(x_B)}{\phi_R(x_R)} \sum_{x_{A \setminus R}} \psi_A(x_A) = \psi_B(x_B),$$
using the previous slide and then local consistency, $\sum_{x_{A \setminus R}} \psi_A(x_A) = \phi_R(x_R)$.

Slide 44: Junction Tree Algorithm
1. (For directed graphical models:) Moralize.
2. Triangulate.
3. Construct a junction tree.
4. Define potentials on maximal cliques.
5. Introduce evidence.
6. Propagate probabilities.

Slide 45: Junction Tree Algorithm: Computation
The size of the largest maximal clique determines the run-time: if variables are categorical and potentials are represented as tables, marginalizing takes time exponential in the clique size. Finding a triangulation that minimizes the size of the largest clique is NP-hard.

Slide 46: Key ideas of this lecture
Junction Tree Algorithm:
1. (For directed graphical models:) Moralize.
2. Triangulate.
3. Construct a junction tree.
4. Define potentials on maximal cliques.
5. Introduce evidence.
6. Propagate probabilities.
