Exact Inference: Variable Elimination


1 Readings: K&F. Exact Inference: Variable Elimination. Lecture 6-7, Apr 1/, CSE 515 Statistical Methods, Spring 2011. Instructor: Su-In Lee, University of Washington, Seattle.

Let's revisit the Student Network. Assumptions: binary RVs: Coherence, Difficulty, Intelligence, Grade, SAT, Letter, Job, Happy. Local probabilistic models: table CPDs. Parameters and structure are given.

Notations & abbreviations:
X: a random variable
X (bold): a set of random variables
Val(X): the set of values of X
x_j: a value of X
|Val(X)|: the size of Val(X)

2 Inference Tasks in the Student Network. Conditional probability queries, e.g. P(l1), P(h0), P(j1); P(j1 | i1, d1), P(j1 | i0, d1); P(j1, h0 | i1); P(j1 | i0, h0). Y: query RVs; E: evidence RVs. How to compute these probabilities? Use the joint distribution.

Naive Approach: use the full joint distribution. Computing P(j1) means summing the joint over every assignment to the other seven variables:

P(j1) = Σ_{c,d,i,g,s,l,h} P(c, d, i, g, s, l, j1, h)
      = P(c0, d0, i0, g0, s0, l0, j1, h0) + P(c0, d0, i0, g0, s0, l0, j1, h1)
      + P(c0, d0, i0, g0, s0, l1, j1, h0) + ...   (2^7 = 128 terms in all)

Computing P(i0 | j1) = P(i0, j1) / P(j1) similarly requires summing the joint over c, d, g, s, l, h. Computational complexity: exponential blowup. Can we exploit the independence properties?
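To make the blowup concrete, here is a minimal sketch of the naive approach on a hypothetical three-variable chain I -> G -> J (the CPD numbers are made up; on the eight-variable Student network the analogous loop would run over 2^7 = 128 hidden assignments):

```python
from itertools import product

# Hypothetical chain I -> G -> J with made-up CPD numbers.
P_I = {0: 0.7, 1: 0.3}                                      # P(I=i)
P_G = {(0, 0): 0.8, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.7}  # P(G=g | I=i), keyed (i, g)
P_J = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.6}  # P(J=j | G=g), keyed (g, j)

# Naive approach: P(J=1) by summing the full joint over every hidden assignment.
p_j1 = sum(P_I[i] * P_G[(i, g)] * P_J[(g, 1)]
           for i, g in product([0, 1], repeat=2))
print(round(p_j1, 4))   # 0.275
```

The loop body is cheap; the problem is that the number of iterations grows exponentially with the number of hidden variables.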

3 Naive Approach, continued. Computing P(j1): each joint entry factorizes into the product of CPDs given by the network, so the sum expands into terms of the form

P(c) P(d | c) P(i) P(g | d, i) P(s | i) P(l | g) P(j1 | l, s) P(h | g, j1),

one term per assignment to c, d, i, g, s, l, h. Certain terms (e.g. P(c0) P(d0 | c0)) are repeated several times across the sum. Exploiting the structure can reduce computation. Let's systematically analyze the computational complexity.
Let's start with the simplest network.

4 Exact Inference: Variable Elimination. Inference in a simple chain X1 -> X2 -> X3 -> X4.

Computing P(X2):
P(x2) = Σ_{x1} P(x1, x2) = Σ_{x1} P(x1) P(x2 | x1).
All the numbers for this computation are in the CPDs of the original Bayesian network. O(|Val(X1)| × |Val(X2)|) operations.

Computing P(X3):
P(x3) = Σ_{x2} P(x2, x3) = Σ_{x2} P(x2) P(x3 | x2).
P(x3 | x2) is a given CPD; P(x2) was computed above. O(|Val(X2)| × |Val(X3)|) operations.
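The recursion above can be sketched as a forward pass along the chain (the prior and transition CPD below are made up for illustration; each step is one k × k matrix-vector product):

```python
# Forward recursion P(x_i) = sum_{x_{i-1}} P(x_{i-1}) P(x_i | x_{i-1}) on a chain
# X1 -> X2 -> ... -> Xn.
def chain_marginal(prior, cpds):
    """prior[a] = P(X1=a); cpds[t][a][b] = P(X_{t+2}=b | X_{t+1}=a).
    Each step costs O(k^2), so the whole pass is O(n k^2)."""
    p = prior[:]
    for cpd in cpds:
        p = [sum(p[a] * cpd[a][b] for a in range(len(p)))
             for b in range(len(cpd[0]))]
    return p

prior = [0.6, 0.4]
step = [[0.9, 0.1], [0.2, 0.8]]        # one shared transition CPD, reused 3 times
print(chain_marginal(prior, [step] * 3))
```

The returned list is the marginal P(X4); note that only the current marginal is kept between steps, never the full joint.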

5 Exact Inference: Variable Elimination. Inference in a general chain X1 -> ... -> Xn.

Computing P(Xn): compute each P(Xi) from P(X_{i-1}); k^2 operations for each computation of P(Xi), assuming |Val(Xi)| ≤ k. O(n k^2) operations for the whole inference. Compare to the k^n operations required when summing over all entries of the joint distribution over X1, ..., Xn. Inference in a general chain can be done in linear time!

Pushing summations in = dynamic programming.

6 Inference With a Loop. Computing a marginal in a network with a loop. Differences: the summations are not pushed in as far as before, and the scope of the intermediate factor includes two variables, not one. How far summations can be pushed depends on the network structure.

Efficient Inference in Bayes nets. Properties that allow us to avoid exponential blowup in the joint distribution: because of the Bayesian network structure, some subexpressions depend on only a small number of variables. Computing these subexpressions and caching the results avoids generating them exponentially many times.

7 Variable Elimination: Factors. The inference algorithm is defined in terms of factors. Factors generalize the notion of CPDs. A factor is a function from value assignments of a set of random variables to positive real numbers R+. The set of variables is the scope of the factor. Thus the algorithm we describe applies both to Bayesian networks and to Markov networks.

Operations on Factors: Product. Let X, Y, Z be three disjoint sets of RVs and let φ1(X, Y) and φ2(Y, Z) be two factors. We define the factor product φ1 × φ2 to be a factor ψ: Val(X, Y, Z) -> R with ψ(X, Y, Z) = φ1(X, Y) · φ2(Y, Z). Example:

φ1[X, Y]:  (x0, y0) = 2,  (x0, y1) = 3,  (x1, y0) = 10,  (x1, y1) = 5
φ2[Y, Z]:  (y0, z0) = 6,  (y0, z1) = 1,  (y1, z0) = 7,   (y1, z1) = 12

ψ[X, Y, Z]:
(x0, y0, z0) = 2 × 6 = 12     (x0, y0, z1) = 2 × 1 = 2
(x0, y1, z0) = 3 × 7 = 21     (x0, y1, z1) = 3 × 12 = 36
(x1, y0, z0) = 10 × 6 = 60    (x1, y0, z1) = 10 × 1 = 10
(x1, y1, z0) = 5 × 7 = 35     (x1, y1, z1) = 5 × 12 = 60
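A minimal sketch of the factor product using the worked table above (the (scope, table) representation and variable names are my own; binary values assumed, matching the slides):

```python
from itertools import product

# A factor is a pair (scope, table): scope is a tuple of variable names, and
# table maps a tuple of values (aligned with scope) to a positive number.
def factor_product(f1, f2):
    scope1, t1 = f1
    scope2, t2 = f2
    scope = scope1 + tuple(v for v in scope2 if v not in scope1)
    table = {}
    for vals in product([0, 1], repeat=len(scope)):     # binary RVs, as in the slides
        a = dict(zip(scope, vals))
        table[vals] = t1[tuple(a[v] for v in scope1)] * t2[tuple(a[v] for v in scope2)]
    return scope, table

# The worked example from the slide.
phi1 = (("X", "Y"), {(0, 0): 2, (0, 1): 3, (1, 0): 10, (1, 1): 5})
phi2 = (("Y", "Z"), {(0, 0): 6, (0, 1): 1, (1, 0): 7, (1, 1): 12})
scope, psi = factor_product(phi1, phi2)
print(scope)            # ('X', 'Y', 'Z')
print(psi[(0, 0, 0)])   # 2 * 6 = 12
print(psi[(1, 1, 1)])   # 5 * 12 = 60
```

Each entry of ψ multiplies the two entries that agree with it on the shared variable Y, exactly as in the table.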

8 Operations on Factors: Marginalization. Let X be a set of RVs, Y ∉ X a RV, and φ(X, Y) a factor. We define the factor marginalization of Y in φ to be a factor ψ: Val(X) -> R with ψ(X) = Σ_Y φ(X, Y). Also called summing out. In a Bayesian network, summing out all variables yields 1; in a Markov network, summing out all variables yields the partition function.

More on Factors. For factors φ1 and φ2:
Factor products are commutative: φ1 × φ2 = φ2 × φ1.
Summations commute: Σ_X Σ_Y φ = Σ_Y Σ_X φ.
Products are associative: (φ1 × φ2) × φ3 = φ1 × (φ2 × φ3).
If X ∉ Scope[φ1], then Σ_X (φ1 × φ2) = φ1 × Σ_X φ2 (we used this in the elimination above).
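Summing out can be sketched in the same hypothetical (scope, table) representation as the product example (the numbers below are made up):

```python
# Sum a variable out of a factor: drop its coordinate and accumulate.
def sum_out(factor, var):
    scope, table = factor
    i = scope.index(var)
    new_table = {}
    for vals, p in table.items():
        key = vals[:i] + vals[i + 1:]                 # assignment without var
        new_table[key] = new_table.get(key, 0.0) + p  # accumulate over var's values
    return scope[:i] + scope[i + 1:], new_table

phi = (("X", "Y"), {(0, 0): 2.0, (0, 1): 3.0, (1, 0): 10.0, (1, 1): 5.0})
print(sum_out(phi, "Y"))   # (('X',), {(0,): 5.0, (1,): 15.0})
```

Summing out every variable of a factor built from a BN's CPDs would return 1; for Markov-network factors it returns the partition function.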

9 Inference in a Chain by Factors. The scopes of φ3 and φ4 do not contain X1, and the scope of φ4 does not contain X2, so those summations can be pushed inward.

Sum-Product Inference. Let Y be the query RVs and Z be all other RVs. We can generalize this task as that of computing the value of an expression of the form Σ_Z Π_{φ ∈ F} φ. Call it the sum-product inference task. Effective computation: the scope of each factor is limited, so push in some of the summations, performing them over the product of only a subset of factors F' ⊆ F.

10 Sum-Product Variable Elimination Algorithm. Given an ordering of variables Z1, ..., Zn, sum out the variables one at a time. When summing out each variable Z: multiply all the factors φ that mention the variable, generating a product factor Ψ; then sum the variable out of the combined factor Ψ, generating a new factor without the variable Z.

Sum-Product Variable Elimination Theorem. Let U be a set of RVs, Y ⊆ U a set of query RVs, and Z = U − Y. For any ordering α over Z, the above algorithm returns a factor φ*(Y) such that φ*(Y) = Σ_Z Π_{φ ∈ F} φ. For a Bayesian network query, F consists of all the CPDs, each φ_i = P(X_i | Pa(X_i)); applying variable elimination to Z = U − Y (summing out Z) yields P(Y).
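Putting the two factor operations together gives a minimal sum-product VE sketch (not the slides' exact pseudocode; the (scope, table) representation, variable names, and CPD numbers are made up, with binary values assumed):

```python
from itertools import product

# A factor is (scope, table) as in the earlier examples.
def multiply(f1, f2):
    s1, t1 = f1
    s2, t2 = f2
    scope = s1 + tuple(v for v in s2 if v not in s1)
    table = {}
    for vals in product([0, 1], repeat=len(scope)):
        a = dict(zip(scope, vals))
        table[vals] = t1[tuple(a[v] for v in s1)] * t2[tuple(a[v] for v in s2)]
    return scope, table

def sum_out(f, var):
    scope, table = f
    i = scope.index(var)
    new = {}
    for vals, p in table.items():
        key = vals[:i] + vals[i + 1:]
        new[key] = new.get(key, 0.0) + p
    return scope[:i] + scope[i + 1:], new

def variable_elimination(factors, order):
    for z in order:
        touched = [f for f in factors if z in f[0]]    # factors mentioning z
        factors = [f for f in factors if z not in f[0]]
        psi = touched[0]
        for f in touched[1:]:
            psi = multiply(psi, f)                     # product factor Psi
        factors.append(sum_out(psi, z))                # tau = sum over z of Psi
    result = factors[0]                                # combine what remains
    for f in factors[1:]:
        result = multiply(result, f)
    return result

# Chain I -> G -> J with made-up CPDs; query P(J), eliminating I then G.
f_I = (("I",), {(0,): 0.7, (1,): 0.3})
f_G = (("I", "G"), {(0, 0): 0.8, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.7})
f_J = (("G", "J"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.6})
scope, table = variable_elimination([f_I, f_G, f_J], ["I", "G"])
print(scope, {k: round(v, 4) for k, v in table.items()})
```

On this chain the run reduces to the forward recursion seen earlier; on a general graph the same loop works for any elimination ordering, with cost governed by the scopes of the intermediate factors Ψ.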

11-15 Example: let's consider a slightly more complex network.

A More Complex Network (worked example). Goal: compute a marginal by eliminating the non-query variables one at a time; each slide shows the Compute step for the next variable in the chosen ordering. [The figures with the intermediate factor computations are not recoverable from this transcription.] The final slide repeats the same query under a different ordering. Note: intermediate factors tend to be large under a bad ordering; ordering matters.

16 Inference With Evidence. Computing P(Y | E = e). Let Y be the query RVs, E the evidence RVs with assignment e, and Z = U − Y − E all other RVs. The general inference task is

P(Y | E = e) = P(Y, e) / P(e), where P(Y, e) = Σ_Z P(Y, e, Z) and P(e) = Σ_Y P(Y, e).

Inference With Evidence (example). Goal: a conditional query on J given evidence on H and I; compute the numerator by eliminating the remaining variables. Differences from the unconditioned query: fewer variables need to be eliminated (H and I are excluded), and the scopes of the factors tend to be smaller, since the evidence variables are instantiated in the CPDs that mention them.
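Evidence can be handled by first reducing each factor to the rows consistent with E = e, then running VE as before and renormalizing. A sketch of the reduction step, in the same hypothetical (scope, table) representation with made-up numbers:

```python
# Reduce a factor by evidence: keep only rows consistent with E = e and drop
# the evidence variables from the scope.  After reducing every factor, run VE
# over Z and renormalize: P(Y | e) = P(Y, e) / sum_y P(y, e).
def reduce_factor(factor, evidence):
    scope, table = factor
    keep = tuple(v for v in scope if v not in evidence)
    new = {}
    for vals, p in table.items():
        a = dict(zip(scope, vals))
        if all(a[v] == e for v, e in evidence.items() if v in a):
            new[tuple(a[v] for v in keep)] = p
    return keep, new

f_G = (("I", "G"), {(0, 0): 0.8, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.7})
print(reduce_factor(f_G, {"I": 1}))   # (('G',), {(0,): 0.3, (1,): 0.7})
```

Reduction shrinks factor scopes before elimination even starts, which is exactly the "scopes tend to be smaller" point above.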

17 What's the complexity of variable elimination?

Complexity of Variable Elimination. Variable elimination consists of generating the factors ψ that will be summed out, and summing out.

Generating the factor ψ_i = φ1 × ... × φ_{k_i} through the factor product operation: let X_i be the scope of ψ_i. Each entry requires k_i multiplications to generate, so generating factor ψ_i is O(k_i |Val(X_i)|).

Summing out: addition operations, at most |Val(X_i)| per factor.

Per factor: O(k N), where N = max_i |Val(X_i)| and k = max_i k_i.

18 Complexity of Variable Elimination, continued. Start with n factors (n = number of variables). We generate exactly one factor at each iteration, so there are at most 2n factors in total.

Generating factors: with N = max_i |Val(X_i)|, the multiplication work is at most Σ_i |Val(X_i)| k_i ≤ N Σ_i k_i ≤ 2nN, since each factor is multiplied in exactly once and there are at most 2n factors.

Summing out: Σ_i |Val(X_i)| ≤ nN, since we have at most n summing-outs to do.

Total work is linear in N and n, where N = max_i |Val(X_i)|. The exponential blowup can be hidden in N: |Val(X_i)| can be v^m if factor ψ_i has m variables with v values each. Interpretation: the maximum scope size is what matters.

Factors and Undirected Graphs. The algorithm does not care whether the graph that generated the factors is directed or undirected. The algorithm's input is a set of factors, and the only aspect relevant to the computation is the scope of the factors. Let's view the algorithm as operating on an undirected graph. For Bayesian networks, we consider the moralized Markov network of the original BN. How does the network structure change in each variable elimination step?
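Moralization can be sketched as follows; the parent lists encode the Student-network structure named on the first slide (the exact edge set is assumed from the standard K&F example):

```python
# Moralize a Bayesian network: connect every node to its parents, "marry" each
# pair of co-parents of the same child, and drop edge directions.
parents = {"C": [], "D": ["C"], "I": [], "G": ["D", "I"], "S": ["I"],
           "L": ["G"], "J": ["L", "S"], "H": ["G", "J"]}

def moralize(parents):
    edges = set()
    for child, ps in parents.items():
        for p in ps:                         # undirected child-parent edges
            edges.add(frozenset((child, p)))
        for i, a in enumerate(ps):           # marry co-parents of the same child
            for b in ps[i + 1:]:
                edges.add(frozenset((a, b)))
    return edges

moral = moralize(parents)
print(frozenset(("D", "I")) in moral)   # True: D and I are co-parents of G
```

The VE steps that follow operate on this undirected graph, regardless of whether the factors came from a BN or a Markov network.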

19 VE as Graph Transformation. At each step we look at a graph in which there is an undirected edge X-Y iff variables X and Y appear in the same factor. Note: this is the Markov network of the distribution over the variables that have not been eliminated yet.

[Slide: VE as Graph Transformation, worked example. Goal and elimination steps are shown as figures and are not recoverable from this transcription.]

20-22 VE as Graph Transformation (worked example, continued). Eliminating a variable connects all of its neighbors to each other; edges added this way are called fill edges. [The figures showing each elimination step are not recoverable from this transcription.]

23 The Induced Graph. The induced graph I(F, α) over factors F and ordering α: the union of all the graphs resulting from the different steps of the variable elimination algorithm. X_i and X_j are connected iff they appeared together in some factor during the VE run using α as the ordering. [Figure: original graph vs. induced graph.]

24 The Induced Graph, continued. The induced graph I(F, α) over factors F and ordering α is undirected; X_i and X_j are connected iff they appeared together in some factor during the VE run using α as the ordering. The width of an induced graph is the number of nodes in its largest clique, minus 1. The minimal induced width of a graph K is min_α width(I(K, α)). The minimal induced width provides a lower bound on the best performance achievable by applying VE to a model that factorizes over K. How can we compute the minimal induced width of the graph, and the elimination ordering achieving that width? There is no easy way to answer this question.

Finding the optimal ordering is NP-hard. Hopeless? No: heuristic techniques can find good elimination orderings. Greedy search using a heuristic cost function: eliminate variables one at a time, greedily, so that each step tends to lead to a small blowup in size. At each point, pick the node with the smallest cost. Possible costs: the number of neighbors in the current graph, the number of neighbors of neighbors, or the number of fill edges.
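The greedy min-fill heuristic from the slide can be sketched like this (the graph representation and the 4-cycle example are my own; the other costs on the slide drop in the same way):

```python
# Greedy elimination ordering with the min-fill cost: at each step, eliminate
# the node whose elimination would add the fewest fill edges.
def fill_edges(graph, v):
    nbrs = list(graph[v])
    return [(a, b) for i, a in enumerate(nbrs) for b in nbrs[i + 1:]
            if b not in graph[a]]

def min_fill_ordering(graph):
    graph = {v: set(ns) for v, ns in graph.items()}    # work on a copy
    order = []
    while graph:
        v = min(graph, key=lambda u: len(fill_edges(graph, u)))
        for a, b in fill_edges(graph, v):              # connect v's neighbors
            graph[a].add(b)
            graph[b].add(a)
        for n in graph[v]:                             # remove v from the graph
            graph[n].discard(v)
        del graph[v]
        order.append(v)
    return order

# A 4-cycle A-B-C-D: eliminating any node first adds exactly one fill edge.
g = {"A": {"B", "D"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"A", "C"}}
print(min_fill_ordering(g))   # ['A', 'B', 'C', 'D']
```

The union of all fill edges added during such a run is exactly the induced graph for the chosen ordering, so the heuristic is directly minimizing the quantity that controls VE's cost.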

25 Inference should be efficient for certain kinds of graphs.

Elimination on Trees. Tree Bayesian network: each variable has at most one parent, so all factors involve at most two variables. Elimination: eliminate leaf variables first; this maintains the tree structure. Induced width: 1.

26 Elimination on Polytrees. Polytree Bayesian network: at most one path between any two variables. Theorem: inference is linear in the network representation size.

For a fixed graph structure, is there any way to reduce the induced width?

27 Inference By Conditioning. General idea: enumerate the possible values of a variable; apply variable elimination in a simplified network for each value; aggregate the results. Transform the CPDs of the variable's children so that it is eliminated as a parent, then treat it as observed.


Directed and Undirected Graphical Models Directed and Undirected Graphical Models Adrian Weller MLSALT4 Lecture Feb 26, 2016 With thanks to David Sontag (NYU) and Tony Jebara (Columbia) for use of many slides and illustrations For more information,

More information

Exact Inference: Clique Trees. Sargur Srihari

Exact Inference: Clique Trees. Sargur Srihari Exact Inference: Clique Trees Sargur srihari@cedar.buffalo.edu 1 Topics 1. Overview 2. Variable Elimination and Clique Trees 3. Message Passing: Sum-Product VE in a Clique Tree Clique-Tree Calibration

More information

Bayesian networks. Chapter 14, Sections 1 4

Bayesian networks. Chapter 14, Sections 1 4 Bayesian networks Chapter 14, Sections 1 4 Artificial Intelligence, spring 2013, Peter Ljunglöf; based on AIMA Slides c Stuart Russel and Peter Norvig, 2004 Chapter 14, Sections 1 4 1 Bayesian networks

More information

CS Lecture 4. Markov Random Fields

CS Lecture 4. Markov Random Fields CS 6347 Lecture 4 Markov Random Fields Recap Announcements First homework is available on elearning Reminder: Office hours Tuesday from 10am-11am Last Time Bayesian networks Today Markov random fields

More information

6.867 Machine learning, lecture 23 (Jaakkola)

6.867 Machine learning, lecture 23 (Jaakkola) Lecture topics: Markov Random Fields Probabilistic inference Markov Random Fields We will briefly go over undirected graphical models or Markov Random Fields (MRFs) as they will be needed in the context

More information

Example: multivariate Gaussian Distribution

Example: multivariate Gaussian Distribution School of omputer Science Probabilistic Graphical Models Representation of undirected GM (continued) Eric Xing Lecture 3, September 16, 2009 Reading: KF-chap4 Eric Xing @ MU, 2005-2009 1 Example: multivariate

More information

Intelligent Systems (AI-2)

Intelligent Systems (AI-2) Intelligent Systems (AI-2) Computer Science cpsc422, Lecture 11 Oct, 3, 2016 CPSC 422, Lecture 11 Slide 1 422 big picture: Where are we? Query Planning Deterministic Logics First Order Logics Ontologies

More information

Exact Inference I. Mark Peot. In this lecture we will look at issues associated with exact inference. = =

Exact Inference I. Mark Peot. In this lecture we will look at issues associated with exact inference. = = Exact Inference I Mark Peot In this lecture we will look at issues associated with exact inference 10 Queries The objective of probabilistic inference is to compute a joint distribution of a set of query

More information

CSC 412 (Lecture 4): Undirected Graphical Models

CSC 412 (Lecture 4): Undirected Graphical Models CSC 412 (Lecture 4): Undirected Graphical Models Raquel Urtasun University of Toronto Feb 2, 2016 R Urtasun (UofT) CSC 412 Feb 2, 2016 1 / 37 Today Undirected Graphical Models: Semantics of the graph:

More information

COMP538: Introduction to Bayesian Networks

COMP538: Introduction to Bayesian Networks COMP538: Introduction to ayesian Networks Lecture 4: Inference in ayesian Networks: The VE lgorithm Nevin L. Zhang lzhang@cse.ust.hk Department of Computer Science and Engineering Hong Kong University

More information

Independencies. Undirected Graphical Models 2: Independencies. Independencies (Markov networks) Independencies (Bayesian Networks)

Independencies. Undirected Graphical Models 2: Independencies. Independencies (Markov networks) Independencies (Bayesian Networks) (Bayesian Networks) Undirected Graphical Models 2: Use d-separation to read off independencies in a Bayesian network Takes a bit of effort! 1 2 (Markov networks) Use separation to determine independencies

More information

Chris Bishop s PRML Ch. 8: Graphical Models

Chris Bishop s PRML Ch. 8: Graphical Models Chris Bishop s PRML Ch. 8: Graphical Models January 24, 2008 Introduction Visualize the structure of a probabilistic model Design and motivate new models Insights into the model s properties, in particular

More information

Bayesian Networks: Construction, Inference, Learning and Causal Interpretation. Volker Tresp Summer 2016

Bayesian Networks: Construction, Inference, Learning and Causal Interpretation. Volker Tresp Summer 2016 Bayesian Networks: Construction, Inference, Learning and Causal Interpretation Volker Tresp Summer 2016 1 Introduction So far we were mostly concerned with supervised learning: we predicted one or several

More information

COS402- Artificial Intelligence Fall Lecture 10: Bayesian Networks & Exact Inference

COS402- Artificial Intelligence Fall Lecture 10: Bayesian Networks & Exact Inference COS402- Artificial Intelligence Fall 2015 Lecture 10: Bayesian Networks & Exact Inference Outline Logical inference and probabilistic inference Independence and conditional independence Bayes Nets Semantics

More information

Announcements. CS 188: Artificial Intelligence Fall Causality? Example: Traffic. Topology Limits Distributions. Example: Reverse Traffic

Announcements. CS 188: Artificial Intelligence Fall Causality? Example: Traffic. Topology Limits Distributions. Example: Reverse Traffic CS 188: Artificial Intelligence Fall 2008 Lecture 16: Bayes Nets III 10/23/2008 Announcements Midterms graded, up on glookup, back Tuesday W4 also graded, back in sections / box Past homeworks in return

More information

Bayesian Networks: Construction, Inference, Learning and Causal Interpretation. Volker Tresp Summer 2014

Bayesian Networks: Construction, Inference, Learning and Causal Interpretation. Volker Tresp Summer 2014 Bayesian Networks: Construction, Inference, Learning and Causal Interpretation Volker Tresp Summer 2014 1 Introduction So far we were mostly concerned with supervised learning: we predicted one or several

More information

On the Relationship between Sum-Product Networks and Bayesian Networks

On the Relationship between Sum-Product Networks and Bayesian Networks On the Relationship between Sum-Product Networks and Bayesian Networks by Han Zhao A thesis presented to the University of Waterloo in fulfillment of the thesis requirement for the degree of Master of

More information

Announcements. Inference. Mid-term. Inference by Enumeration. Reminder: Alarm Network. Introduction to Artificial Intelligence. V22.

Announcements. Inference. Mid-term. Inference by Enumeration. Reminder: Alarm Network. Introduction to Artificial Intelligence. V22. Introduction to Artificial Intelligence V22.0472-001 Fall 2009 Lecture 15: Bayes Nets 3 Midterms graded Assignment 2 graded Announcements Rob Fergus Dept of Computer Science, Courant Institute, NYU Slides

More information

Probabilistic Reasoning. (Mostly using Bayesian Networks)

Probabilistic Reasoning. (Mostly using Bayesian Networks) Probabilistic Reasoning (Mostly using Bayesian Networks) Introduction: Why probabilistic reasoning? The world is not deterministic. (Usually because information is limited.) Ways of coping with uncertainty

More information

Computational Complexity of Inference

Computational Complexity of Inference Computational Complexity of Inference Sargur srihari@cedar.buffalo.edu 1 Topics 1. What is Inference? 2. Complexity Classes 3. Exact Inference 1. Variable Elimination Sum-Product Algorithm 2. Factor Graphs

More information

Inference methods in discrete Bayesian networks

Inference methods in discrete Bayesian networks University of Amsterdam MSc Mathematics Master Thesis Inference methods in discrete Bayesian networks Author: R. Weikamp Supervisors: prof. dr. R. Núñez Queija (UvA) dr. F. Phillipson (TNO) October 30,

More information

Bayesian Networks. Exact Inference by Variable Elimination. Emma Rollon and Javier Larrosa Q

Bayesian Networks. Exact Inference by Variable Elimination. Emma Rollon and Javier Larrosa Q Bayesian Networks Exact Inference by Variable Elimination Emma Rollon and Javier Larrosa Q1-2015-2016 Emma Rollon and Javier Larrosa Bayesian Networks Q1-2015-2016 1 / 25 Recall the most usual queries

More information

EE562 ARTIFICIAL INTELLIGENCE FOR ENGINEERS

EE562 ARTIFICIAL INTELLIGENCE FOR ENGINEERS EE562 ARTIFICIAL INTELLIGENCE FOR ENGINEERS Lecture 16, 6/1/2005 University of Washington, Department of Electrical Engineering Spring 2005 Instructor: Professor Jeff A. Bilmes Uncertainty & Bayesian Networks

More information

Graphical models. Sunita Sarawagi IIT Bombay

Graphical models. Sunita Sarawagi IIT Bombay 1 Graphical models Sunita Sarawagi IIT Bombay http://www.cse.iitb.ac.in/~sunita 2 Probabilistic modeling Given: several variables: x 1,... x n, n is large. Task: build a joint distribution function Pr(x

More information

Intelligent Systems:

Intelligent Systems: Intelligent Systems: Undirected Graphical models (Factor Graphs) (2 lectures) Carsten Rother 15/01/2015 Intelligent Systems: Probabilistic Inference in DGM and UGM Roadmap for next two lectures Definition

More information

Probabilistic Graphical Models

Probabilistic Graphical Models Probabilistic Graphical Models Lecture 11 CRFs, Exponential Family CS/CNS/EE 155 Andreas Krause Announcements Homework 2 due today Project milestones due next Monday (Nov 9) About half the work should

More information

Undirected Graphical Models

Undirected Graphical Models Undirected Graphical Models 1 Conditional Independence Graphs Let G = (V, E) be an undirected graph with vertex set V and edge set E, and let A, B, and C be subsets of vertices. We say that C separates

More information

1 : Introduction. 1 Course Overview. 2 Notation. 3 Representing Multivariate Distributions : Probabilistic Graphical Models , Spring 2014

1 : Introduction. 1 Course Overview. 2 Notation. 3 Representing Multivariate Distributions : Probabilistic Graphical Models , Spring 2014 10-708: Probabilistic Graphical Models 10-708, Spring 2014 1 : Introduction Lecturer: Eric P. Xing Scribes: Daniel Silva and Calvin McCarter 1 Course Overview In this lecture we introduce the concept of

More information

Uncertainty and knowledge. Uncertainty and knowledge. Reasoning with uncertainty. Notes

Uncertainty and knowledge. Uncertainty and knowledge. Reasoning with uncertainty. Notes Approximate reasoning Uncertainty and knowledge Introduction All knowledge representation formalism and problem solving mechanisms that we have seen until now are based on the following assumptions: All

More information

( x) f = where P and Q are polynomials.

( x) f = where P and Q are polynomials. 9.8 Graphing Rational Functions Lets begin with a deinition. Deinition: Rational Function A rational unction is a unction o the orm ( ) ( ) ( ) P where P and Q are polynomials. Q An eample o a simple rational

More information

UNIVERSITY OF CALIFORNIA, IRVINE. Fixing and Extending the Multiplicative Approximation Scheme THESIS

UNIVERSITY OF CALIFORNIA, IRVINE. Fixing and Extending the Multiplicative Approximation Scheme THESIS UNIVERSITY OF CALIFORNIA, IRVINE Fixing and Extending the Multiplicative Approximation Scheme THESIS submitted in partial satisfaction of the requirements for the degree of MASTER OF SCIENCE in Information

More information

Bayes Net Representation. CS 188: Artificial Intelligence. Approximate Inference: Sampling. Variable Elimination. Sampling.

Bayes Net Representation. CS 188: Artificial Intelligence. Approximate Inference: Sampling. Variable Elimination. Sampling. 188: Artificial Intelligence Bayes Nets: ampling Bayes Net epresentation A directed, acyclic graph, one node per random variable A conditional probability table (PT) for each node A collection of distributions

More information

Review: Directed Models (Bayes Nets)

Review: Directed Models (Bayes Nets) X Review: Directed Models (Bayes Nets) Lecture 3: Undirected Graphical Models Sam Roweis January 2, 24 Semantics: x y z if z d-separates x and y d-separation: z d-separates x from y if along every undirected

More information

PROBABILITY. Inference: constraint propagation

PROBABILITY. Inference: constraint propagation PROBABILITY Inference: constraint propagation! Use the constraints to reduce the number of legal values for a variable! Possible to find a solution without searching! Node consistency " A node is node-consistent

More information

3. Several Random Variables

3. Several Random Variables . Several Random Variables. Two Random Variables. Conditional Probabilit--Revisited. Statistical Independence.4 Correlation between Random Variables. Densit unction o the Sum o Two Random Variables. Probabilit

More information

Probabilistic Partial Evaluation: Exploiting rule structure in probabilistic inference

Probabilistic Partial Evaluation: Exploiting rule structure in probabilistic inference Probabilistic Partial Evaluation: Exploiting rule structure in probabilistic inference David Poole University of British Columbia 1 Overview Belief Networks Variable Elimination Algorithm Parent Contexts

More information

STAT 598L Probabilistic Graphical Models. Instructor: Sergey Kirshner. Bayesian Networks

STAT 598L Probabilistic Graphical Models. Instructor: Sergey Kirshner. Bayesian Networks STAT 598L Probabilistic Graphical Models Instructor: Sergey Kirshner Bayesian Networks Representing Joint Probability Distributions 2 n -1 free parameters Reducing Number of Parameters: Conditional Independence

More information