Bayesian Networks: Belief Propagation in Singly Connected Networks


1 Bayesian Networks: Belief Propagation in Singly Connected Networks

Huizhen Yu (janey.yu@cs.helsinki.fi)
Dept. of Computer Science, Univ. of Helsinki
Probabilistic Models, Spring 2010

2 Outline

Belief Propagation
- In Chains
- In Trees

3 Form of Evidence and Notation

We denote evidence (a finding) on $X = \{X_v, v \in V\}$ by $e$. Formally, we think of $e$ as a function of $x$ taking values in $\{0, 1\}$, representing a statement that some values of $x$ are impossible; that is, $\{x \mid e(x) = 1\}$ is the set of possible values of $x$ based on the evidence $e$. We also refer to this event as $e$.

We consider evidence $e$ that can be written in the factored form
$$e(x) = \prod_{v \in V} l_v(x_v), \qquad \text{where } l_v(x_v) \in \{0, 1\}.$$

For $A \subseteq V$, we use $e_A$ to denote the partial evidence on $X_A$:
$$e_A(x_A) = \prod_{v \in A} l_v(x_v).$$

Other shorthand notation we will use:
$$p(x_A \,\&\, e) = P(X_A = x_A, e),$$
$$p(x_A \,\&\, e_A \mid x_B) = P(X_A = x_A, e_A \mid X_B = x_B) = P(X_A = x_A \mid X_B = x_B)\, e_A(x_A),$$
and $p(x_A \mid e)$ denotes the conditional PMF of $X_A$ given the event $e$.
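
To make the factored-evidence representation concrete, here is a tiny sketch (my own illustration, not part of the original slides); the three variables and the finding are hypothetical:

```python
import numpy as np

# Hypothetical example: three binary variables X_a, X_b, X_c.
# Each l_v is a 0/1 indicator vector over that variable's states;
# l_v[s] = 1 means state s is possible under the evidence.
l = {
    "a": np.array([1, 1]),  # no finding on X_a
    "b": np.array([0, 1]),  # finding: X_b = 1
    "c": np.array([1, 1]),  # no finding on X_c
}

def e(x):
    """Factored evidence e(x) = prod_v l_v(x_v) for a full assignment x."""
    return int(all(l[v][x[v]] == 1 for v in l))

print(e({"a": 0, "b": 1, "c": 0}))  # 1: consistent with the evidence
print(e({"a": 0, "b": 0, "c": 1}))  # 0: contradicts the finding X_b = 1
```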

4 Motivation

Inference tasks we consider here: calculate $p(x_v \mid e)$ for all $x_v$, and $P(e)$, for a distribution $P$ that is directed Markov w.r.t. a DAG $G$.

Note that if we know $P(e)$, then we can easily calculate the posterior probability of a single configuration $x$ given $e$:
$$p(x \mid e) = p(x \,\&\, e)/P(e) = \Big[ \prod_{v \in V} p(x_v \mid x_{pa(v)})\, l_v(x_v) \Big] \Big/ P(e).$$

Since $P(X = x, e) = p(x)\, e(x)$, in principle we can calculate
$$P(X_v = x_v, e) = \sum_{x_{V \setminus \{v\}}} P(X_{V \setminus \{v\}} = x_{V \setminus \{v\}}, X_v = x_v, e), \qquad P(e) = \sum_{x} P(X = x, e).$$
But such calculation is not easy in most problems when $V$ is large.

The function $p(x_v \mid e)$ is referred to as the belief of $x_v$.
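
The cost of this naive approach can be seen in a brute-force sketch (my own illustration; the three-node chain and its CPTs are hypothetical). It enumerates all of $\{0,1\}^{|V|}$ and therefore scales as $O(2^{|V|})$ for binary variables:

```python
import itertools
import numpy as np

# Hypothetical chain a -> b -> c with binary variables.
p_a = np.array([0.6, 0.4])                     # P(X_a)
p_b_a = np.array([[0.7, 0.3], [0.2, 0.8]])     # P(X_b | X_a), rows index x_a
p_c_b = np.array([[0.9, 0.1], [0.5, 0.5]])     # P(X_c | X_b), rows index x_b
l = {"a": [1, 1], "b": [0, 1], "c": [1, 1]}    # evidence: X_b = 1

def joint_with_evidence(xa, xb, xc):
    """P(X = x, e) = p(x) e(x), using the factored evidence."""
    return (p_a[xa] * p_b_a[xa, xb] * p_c_b[xb, xc]
            * l["a"][xa] * l["b"][xb] * l["c"][xc])

P_e = sum(joint_with_evidence(*x) for x in itertools.product([0, 1], repeat=3))
# Belief of x_a: p(x_a & e) summed over the other variables, then normalized.
belief_a = [sum(joint_with_evidence(xa, xb, xc)
                for xb, xc in itertools.product([0, 1], repeat=2)) / P_e
            for xa in [0, 1]]
print(P_e, belief_a)
```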

5 Features of the Algorithms to be Introduced

In the algorithms to be introduced, the DAG $G$ is treated also as the architecture for distributed computation:
- Nodes: associated with autonomous processors
- Edges: communication links between processors

The independence relations represented by the DAG are exploited to separate the total evidence into pieces and streamline the computation.

The algorithms have a performance guarantee on DAGs with a simple structure: $G$ has no loops. But they have also been used successfully as approximate inference algorithms on loopy graphs.

6 In Chains: Outline

Belief Propagation
- In Chains
- In Trees

7 In Chains: Evidence Structure in a Chain

Suppose $G$ is a chain. Consider a vertex $v$ with parent $u$ and child $w$:

[Figure: chain $\cdots \to u \to v \to w \to \cdots$, with $e_u^+$ the evidence on $u$'s side and $e_v^-$ the evidence on $w$'s side.]

We write $e$ as three pieces of evidence, $e = (e_u^+, e_v, e_v^-)$, where
- $e_u^+$: partial evidence on $X_{an(v)}$
- $e_v$: partial evidence on $X_v$
- $e_v^-$: partial evidence on $X_{de(v)}$

We want to compute $p(x_v \,\&\, e) = P(X_v = x_v, e)$ for all $x_v$. Since we have
$$P(X_{an(v)}, X_v, X_{de(v)}) = P(X_{an(v)})\, P(X_v \mid X_u)\, P(X_{de(v)} \mid X_v),$$
it follows that
$$p((x_u, x_v) \,\&\, e) = p(x_u \,\&\, e_u^+)\, p(x_v \,\&\, e_v \mid x_u)\, p(e_v^- \mid x_v).$$

If $v$ can get the first and third terms from $u$ and $w$ respectively, then $v$ can calculate its marginal $p(x_v \,\&\, e)$ by summing over $x_u$.

8 In Chains: Message Passing in a Chain

If node $v$ receives
- from parent $u$, the probabilities of $x_u$ and the partial evidence $e_u^+$ on $u$'s side:
  $$\pi_{u,v}(x_u) = p(x_u \,\&\, e_u^+), \quad \forall x_u;$$
- from child $w$, the likelihoods of $x_v$ based on the partial evidence $e_v^-$ on $w$'s side:
  $$\lambda_{w,v}(x_v) = p(e_v^- \mid x_v), \quad \forall x_v,$$

then node $v$ can calculate
$$p(x_v \,\&\, e) = \sum_{x_u} \pi_{u,v}(x_u)\, p(x_v \mid x_u)\, l_v(x_v)\, \lambda_{w,v}(x_v), \quad \forall x_v.$$

What do $u$ and $w$ need from $v$ in order to calculate their marginal probabilities?
- Parent $u$ needs, for all $x_u$, the likelihood of $x_u$ based on $e_u^- = (e_v, e_v^-)$:
  $$\lambda_{v,u}(x_u) = p(e_u^- \mid x_u) = \sum_{x_v} p(x_v \,\&\, e_v \mid x_u)\, p(e_v^- \mid x_v) = \sum_{x_v} p(x_v \mid x_u)\, l_v(x_v)\, \lambda_{w,v}(x_v).$$
- Child $w$ needs, for all $x_v$, the probability of $x_v$ and $e_v^+ = (e_u^+, e_v)$:
  $$\pi_{v,w}(x_v) = p(x_v \,\&\, e_v^+) = \sum_{x_u} \pi_{u,v}(x_u)\, p(x_v \mid x_u)\, l_v(x_v).$$

9 In Chains: Algorithm Summary

λ-messages carry likelihoods; π-messages carry probabilities. Each node $v$:
- when receiving the message $\lambda_{w,v}$ from its child, sends to its parent $u$
  $$\lambda_{v,u}(x_u) = \sum_{x_v} p(x_v \mid x_u)\, l_v(x_v)\, \lambda_{w,v}(x_v), \quad \forall x_u;$$
- when receiving the message $\pi_{u,v}$ from its parent, sends to its child $w$
  $$\pi_{v,w}(x_v) = \sum_{x_u} \pi_{u,v}(x_u)\, p(x_v \mid x_u)\, l_v(x_v), \quad \forall x_v;$$
- when it has received both messages, calculates
  $$p(x_v \,\&\, e) = \sum_{x_u} \pi_{u,v}(x_u)\, p(x_v \mid x_u)\, l_v(x_v)\, \lambda_{w,v}(x_v), \quad \forall x_v,$$
  $$P(e) = \sum_{x_v} p(x_v \,\&\, e), \qquad p(x_v \mid e) = p(x_v \,\&\, e)/P(e).$$
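
A minimal vectorized sketch of these chain updates (my own illustration, not from the slides), assuming binary variables, CPTs stored as matrices $T_i[x_{i-1}, x_i] = p(x_i \mid x_{i-1})$, and a hypothetical finding $X_1 = 1$:

```python
import numpy as np

# Hypothetical 3-node binary chain X0 -> X1 -> X2.
prior0 = np.array([0.6, 0.4])                       # P(X0)
T = [np.array([[0.7, 0.3], [0.2, 0.8]]),            # P(X1 | X0)
     np.array([[0.9, 0.1], [0.5, 0.5]])]            # P(X2 | X1)
l = [np.array([1., 1.]), np.array([0., 1.]), np.array([1., 1.])]  # X1 = 1

n = len(l)
# Forward pass: pi[i](x_i) = p(x_i & e_i^+), evidence at and above node i.
pi = [prior0 * l[0]]
for i in range(1, n):
    pi.append((pi[i - 1] @ T[i - 1]) * l[i])
# Backward pass: lam[i](x_i) = p(e_i^- | x_i); lam = 1 at the last node.
lam = [None] * n
lam[-1] = np.ones(2)
for i in range(n - 2, -1, -1):
    lam[i] = T[i] @ (l[i + 1] * lam[i + 1])

# Beliefs: p(x_i & e) = pi_i(x_i) * lam_i(x_i); normalizing constant is P(e).
for i in range(n):
    joint = pi[i] * lam[i]
    print(f"P(e) from node {i}: {joint.sum():.4f}, belief: {joint / joint.sum()}")
```

Since every node's belief sums to the same $P(e)$, the printout serves as a consistency check on the two passes.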

10 In Trees: Outline

Belief Propagation
- In Chains
- In Trees

11 In Trees: Evidence Structure in a Rooted Tree

Suppose $G$ is a rooted tree; then $G^m = G$. Consider a vertex $v$ with parent $u$ and children $w_1, \ldots, w_m$.

[Figure: $v$ with parent $u$ above and children $w_1, \ldots, w_m$ below; evidence $e_{nd(v)}$ above $v$, evidence $e_{T_{w_1}}$ in the first child subtree.]

We write the total evidence $e$ as several pieces of evidence,
$$e = (e_{nd(v)}, e_v, e_{T_{w_1}}, \ldots, e_{T_{w_m}}),$$
where
- $e_{nd(v)}$: partial evidence on $X_{nd(v)}$
- $e_v$: partial evidence on $X_v$
- $e_{T_w}$, $w \in ch(v)$: partial evidence on the variables associated with the subtree $T_w$ rooted at $w$, i.e., $X_{\{w\} \cup de(w)}$

Since
$$P(X_{nd(v)}, X_v, X_{de(v)}) = P(X_{nd(v)})\, P(X_v \mid X_u) \prod_{w \in ch(v)} P(X_{T_w} \mid X_v),$$
we have
$$p((x_u, x_v) \,\&\, e) = p(x_u \,\&\, e_{nd(v)})\, p(x_v \,\&\, e_v \mid x_u) \prod_{w \in ch(v)} p(e_{T_w} \mid x_v).$$

12 In Trees: Message Passing in a Rooted Tree

From
$$p((x_u, x_v) \,\&\, e) = p(x_u \,\&\, e_{nd(v)})\, p(x_v \,\&\, e_v \mid x_u) \prod_{w \in ch(v)} p(e_{T_w} \mid x_v),$$
we see that if $v$ receives
- from parent $u$, the probabilities of $x_u$ and the evidence $e_{nd(v)}$, for all $x_u$:
  $$\pi_{u,v}(x_u) = p(x_u \,\&\, e_{nd(v)}), \quad \forall x_u;$$
- from every child $w$, the likelihoods of all $x_v$ based on the evidence $e_{T_w}$:
  $$\lambda_{w,v}(x_v) = p(e_{T_w} \mid x_v), \quad \forall x_v,$$

then node $v$ can calculate
$$p(x_v \,\&\, e) = \sum_{x_u} \pi_{u,v}(x_u)\, p(x_v \mid x_u)\, l_v(x_v) \prod_{w \in ch(v)} \lambda_{w,v}(x_v).$$

13 In Trees: Message Passing in a Rooted Tree (cont.)

What do nodes $u$ and $w$ need from $v$ in order to calculate their marginals?
- Parent $u$ needs the likelihoods of $x_u$ based on $e_{T_v}$, for all $x_u$:
  $$\lambda_{v,u}(x_u) = \sum_{x_v} p(x_v \,\&\, e_v \mid x_u) \prod_{w \in ch(v)} p(e_{T_w} \mid x_v) = \sum_{x_v} p(x_v \mid x_u)\, l_v(x_v) \prod_{w \in ch(v)} \lambda_{w,v}(x_v).$$
- Child $w$ needs, for all $x_v$, the probability of $x_v$ and $e_{nd(w)} = \big(e_{nd(v)}, e_v, (e_{T_{w'}})_{w' \in ch(v) \setminus \{w\}}\big)$:
  $$\pi_{v,w}(x_v) = p(x_v \,\&\, e_{nd(w)}) = \sum_{x_u} p(x_u \,\&\, e_{nd(v)})\, p(x_v \,\&\, e_v \mid x_u) \prod_{w' \in ch(v) \setminus \{w\}} p(e_{T_{w'}} \mid x_v)$$
  $$= \sum_{x_u} \pi_{u,v}(x_u)\, p(x_v \mid x_u)\, l_v(x_v) \prod_{w' \in ch(v) \setminus \{w\}} \lambda_{w',v}(x_v).$$

14 In Trees: Algorithm Summary

Each node $v$:
- sends to its parent $u$
  $$\lambda_{v,u}(x_u) = \sum_{x_v} p(x_v \mid x_u)\, l_v(x_v) \prod_{w \in ch(v)} \lambda_{w,v}(x_v), \quad \forall x_u;$$
- sends to its child $w$
  $$\pi_{v,w}(x_v) = \sum_{x_u} \pi_{u,v}(x_u)\, p(x_v \mid x_u)\, l_v(x_v) \prod_{w' \in ch(v) \setminus \{w\}} \lambda_{w',v}(x_v), \quad \forall x_v;$$
- when it has received all messages, calculates
  $$p(x_v \,\&\, e) = \sum_{x_u} \pi_{u,v}(x_u)\, p(x_v \mid x_u)\, l_v(x_v) \prod_{w \in ch(v)} \lambda_{w,v}(x_v), \quad \forall x_v,$$
  $$P(e) = \sum_{x_v} p(x_v \,\&\, e), \qquad p(x_v \mid e) = p(x_v \,\&\, e)/P(e).$$

Message passing schemes:
(i) Each node can send a message to a linked node once it has received messages from all the other linked nodes.
(ii) Each node can send updated messages to linked nodes whenever it gets a new message from some node.
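
A compact recursive sketch of these tree updates (my own illustration under the message definitions above; the tree, CPTs, and finding are hypothetical), with one upward λ-pass followed by one downward π-pass:

```python
import numpy as np

# Hypothetical rooted tree, all variables binary: root r with children a, b.
children = {"r": ["a", "b"], "a": [], "b": []}
prior_r = np.array([0.5, 0.5])                   # P(X_r)
T = {"a": np.array([[0.8, 0.2], [0.3, 0.7]]),    # T[v][x_u, x_v] = p(x_v | x_u)
     "b": np.array([[0.6, 0.4], [0.1, 0.9]])}
l = {"r": np.array([1., 1.]),                    # evidence indicators l_v
     "a": np.array([0., 1.]),                    # finding: X_a = 1
     "b": np.array([1., 1.])}

lam = {}      # lam[v] = lambda message v sends to its parent (indexed by x_u)
beliefs = {}  # beliefs[v](x_v) = p(x_v & e)

def upward(v):
    """lambda_{v,u}(x_u) = sum_{x_v} p(x_v|x_u) l_v(x_v) prod_w lam[w](x_v)."""
    val = l[v].copy()
    for w in children[v]:
        upward(w)
        val = val * lam[w]
    if v in T:                 # the root has no parent, hence no lambda message
        lam[v] = T[v] @ val

def downward(v, pi_v):
    """pi_v(x_v) = p(x_v & e_nd(v)); at the root this is just the prior."""
    prod_all = np.ones_like(pi_v)
    for w in children[v]:
        prod_all = prod_all * lam[w]
    beliefs[v] = pi_v * l[v] * prod_all          # p(x_v & e)
    for w in children[v]:
        pi_vw = pi_v * l[v]                      # pi_{v,w}: all lambdas but w's
        for w2 in children[v]:
            if w2 != w:
                pi_vw = pi_vw * lam[w2]
        downward(w, pi_vw @ T[w])                # push through w's CPT

upward("r")
downward("r", prior_r)
P_e = beliefs["r"].sum()
for v, b in beliefs.items():
    print(v, b / P_e)                            # posterior p(x_v | e)
```

This follows scheme (i) above: the upward pass supplies every λ-message before the downward pass distributes the π-messages.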

15 In Trees: Illustration of Parallel Updating

From J. Pearl's book, 1988: At time 0, each node of the tree has calculated its own marginal. At time 1, two new pieces of evidence arrive and trigger new messages. After time 5, all nodes have updated their marginals, incorporating the new evidence.

[Figure: snapshots of the tree at $t = 0, 1, 2, 3, 4, 5$, showing λ-messages and π-messages propagating outward from the two data nodes.]

16 Outline

Belief Propagation
- In Chains
- In Trees

17 Definition of a Singly Connected Network

Definition: a DAG $G$ is singly connected if its undirected version is a tree. Such a $G$ is also called a polytree.

In a polytree $G$:
- Each node can have multiple parents and children, but there is only one trail between each pair of nodes.
- Consider a vertex $v$ with parents $u_1, \ldots, u_n$ and children $w_1, \ldots, w_m$. When $v$ is viewed as the center, the branch of the polytree containing one of its parents or children is a sub-polytree. Denote
  - $T_{vu_i}$, $i = 1, \ldots, n$: the sub-polytree containing the node $u_i$, resulting from removing the edge $(u_i, v)$;
  - $T_{vw_i}$, $i = 1, \ldots, m$: the sub-polytree containing the node $w_i$, resulting from removing the edge $(v, w_i)$.

18 Evidence Structure in a Singly Connected Network

For a sub-polytree $T$, denote
- $X_T$: the variables associated with nodes in $T$
- $e_T$: the partial evidence on $X_T$

[Figure: $v$ with parents $u_1, \ldots, u_n$ above (evidence $e_{T_{vu_1}}$ in the first parent branch) and children $w_1, \ldots, w_m$ below (evidence $e_{T_{vw_1}}$ in the first child branch).]

We have
$$P(X_{T_{vu_1}}, \ldots, X_{T_{vu_n}}) = P(X_{T_{vu_1}}) \cdots P(X_{T_{vu_n}}),$$
and
$$P(X_{T_{vw_1}}, \ldots, X_{T_{vw_m}} \mid X_v) = P(X_{T_{vw_1}} \mid X_v) \cdots P(X_{T_{vw_m}} \mid X_v).$$
(Why? We may argue this using (DG) or d-separation; the latter is also simple in this case because there is only one trail between each pair of nodes.)

Therefore,
$$p(x_{pa(v)}, x_v \,\&\, e) = \prod_{u \in pa(v)} p(x_u \,\&\, e_{T_{vu}})\; p(x_v \,\&\, e_v \mid x_{pa(v)}) \prod_{w \in ch(v)} p(e_{T_{vw}} \mid x_v).$$

19 Message Passing in a Singly Connected Network

From
$$p(x_{pa(v)}, x_v \,\&\, e) = \prod_{u \in pa(v)} p(x_u \,\&\, e_{T_{vu}})\; p(x_v \,\&\, e_v \mid x_{pa(v)}) \prod_{w \in ch(v)} p(e_{T_{vw}} \mid x_v),$$
we see that $v$ can calculate its marginal if it receives the messages $\pi_{u,v}$ from all parents, where
$$\pi_{u,v}(x_u) = p(x_u \,\&\, e_{T_{vu}}), \quad \forall x_u,$$
and $\lambda_{w,v}$ from all children, where
$$\lambda_{w,v}(x_v) = p(e_{T_{vw}} \mid x_v), \quad \forall x_v.$$
Then $p(x_v \,\&\, e)$ is given by
$$p(x_v \,\&\, e) = \sum_{x_{pa(v)}} \prod_{u \in pa(v)} \pi_{u,v}(x_u)\; p(x_v \mid x_{pa(v)})\, l_v(x_v) \prod_{w \in ch(v)} \lambda_{w,v}(x_v).$$

20 Message Passing in a Singly Connected Network (cont.)

What do parents need from $v$ in order to calculate their marginals? A parent $u$ needs the likelihoods of all $x_u$ based on the partial evidence $e_{T_{uv}}$ from the sub-polytree on $v$'s side with respect to $u$:
$$\lambda_{v,u}(x_u) = \sum_{x_v} \sum_{x_{pa(v) \setminus \{u\}}} p(x_v \,\&\, e_v \mid x_{pa(v)}) \prod_{u' \in pa(v) \setminus \{u\}} p(x_{u'} \,\&\, e_{T_{vu'}}) \prod_{w \in ch(v)} p(e_{T_{vw}} \mid x_v)$$
$$= \sum_{x_v} \sum_{x_{pa(v) \setminus \{u\}}} p(x_v \mid x_{pa(v)})\, l_v(x_v) \prod_{u' \in pa(v) \setminus \{u\}} \pi_{u',v}(x_{u'}) \prod_{w \in ch(v)} \lambda_{w,v}(x_v).$$

21 Message Passing in a Singly Connected Network (cont.)

What do children need from $v$ in order to calculate their marginals? A child $w$ needs, for all $x_v$, the probability of $x_v$ and the partial evidence $e_{T_{wv}}$ from the sub-polytree on $v$'s side with respect to $w$:
$$p(x_v \,\&\, e_{T_{wv}}) = \sum_{x_{pa(v)}} p(x_v \,\&\, e_v \mid x_{pa(v)}) \prod_{u \in pa(v)} p(x_u \,\&\, e_{T_{vu}}) \prod_{w' \in ch(v) \setminus \{w\}} p(e_{T_{vw'}} \mid x_v)$$
$$= \sum_{x_{pa(v)}} p(x_v \mid x_{pa(v)})\, l_v(x_v) \prod_{u \in pa(v)} \pi_{u,v}(x_u) \prod_{w' \in ch(v) \setminus \{w\}} \lambda_{w',v}(x_v).$$

22 Algorithm Summary

Each node $v$:
- sends to each parent $u$
  $$\lambda_{v,u}(x_u) = \sum_{x_v} \sum_{x_{pa(v) \setminus \{u\}}} p(x_v \mid x_{pa(v)})\, l_v(x_v) \prod_{u' \in pa(v) \setminus \{u\}} \pi_{u',v}(x_{u'}) \prod_{w \in ch(v)} \lambda_{w,v}(x_v), \quad \forall x_u;$$
- sends to each child $w$
  $$\pi_{v,w}(x_v) = \prod_{w' \in ch(v) \setminus \{w\}} \lambda_{w',v}(x_v) \sum_{x_{pa(v)}} p(x_v \mid x_{pa(v)})\, l_v(x_v) \prod_{u \in pa(v)} \pi_{u,v}(x_u), \quad \forall x_v;$$
- when it has received all messages from parents and children, calculates
  $$p(x_v \,\&\, e) = \prod_{w \in ch(v)} \lambda_{w,v}(x_v) \sum_{x_{pa(v)}} \prod_{u \in pa(v)} \pi_{u,v}(x_u)\, p(x_v \mid x_{pa(v)})\, l_v(x_v), \quad \forall x_v,$$
  $$P(e) = \sum_{x_v} p(x_v \,\&\, e), \qquad p(x_v \mid e) = p(x_v \,\&\, e)/P(e).$$

Message passing schemes:
(i) Each node can send a message to a linked node once it has received messages from all the other linked nodes.
(ii) Each node can send updated messages to linked nodes whenever it gets a new message from some node.
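
As an illustration of these polytree updates (my own sketch; the two-parent, two-child node, its CPT, and the incoming messages are all hypothetical), the following computes $p(x_v \,\&\, e)$ and one outgoing λ-message at a single node $v$ by summing over joint parent configurations:

```python
import itertools
import numpy as np

# Hypothetical node v with parents u1, u2 and children w1, w2 (all binary).
# CPT indexed as cpt[x_u1, x_u2, x_v] = p(x_v | x_u1, x_u2).
cpt = np.array([[[0.9, 0.1], [0.4, 0.6]],
                [[0.5, 0.5], [0.1, 0.9]]])
l_v = np.array([1., 1.])                          # no finding at v itself
pi_in = {"u1": np.array([0.7, 0.3]),              # pi_{u,v}(x_u) = p(x_u & e_Tvu)
         "u2": np.array([0.4, 0.6])}
lam_in = {"w1": np.array([0.2, 0.8]),             # lam_{w,v}(x_v) = p(e_Tvw | x_v)
          "w2": np.array([1.0, 1.0])}
parents = ["u1", "u2"]

def belief_v():
    """p(x_v & e): sum over joint parent states, then apply l_v and the lambdas."""
    out = np.zeros(2)
    for xs in itertools.product([0, 1], repeat=len(parents)):
        weight = np.prod([pi_in[u][x] for u, x in zip(parents, xs)])
        out += weight * cpt[xs]                   # cpt[xs] is p(. | x_u1, x_u2)
    out *= l_v
    for lam in lam_in.values():
        out *= lam
    return out

def lam_out(u):
    """lambda_{v,u}(x_u): leave out u's own pi message, sum over everything else."""
    others = [u2 for u2 in parents if u2 != u]
    out = np.zeros(2)
    for xu in [0, 1]:
        for xs in itertools.product([0, 1], repeat=len(others)):
            weight = np.prod([pi_in[o][x] for o, x in zip(others, xs)])
            # Reassemble the full parent index in the CPT's parent order.
            idx = tuple(xu if p == u else xs[others.index(p)] for p in parents)
            val = cpt[idx] * l_v
            for lam in lam_in.values():
                val = val * lam
            out[xu] += weight * val.sum()         # sum over x_v
    return out

joint = belief_v()
print("P(e) at v:", joint.sum(), " belief:", joint / joint.sum())
print("lambda_{v,u1}:", lam_out("u1"))
```

Note that the sums over $x_{pa(v)}$ make the per-node cost exponential in the number of parents, which is one reason structured CPTs such as the noisy-OR gate of the next slides are attractive.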

23 Example of Noisy-OR Gate

Variables $x_i, y_i, y \in \{0, 1\}$, with
$$P(X_i = 1) = p_i, \qquad P(Y_i = 1 \mid X_i = 0) = 0, \qquad P(Y_i = 1 \mid X_i = 1) = 1 - q_i,$$
and $Y = \mathrm{OR}(Y_1, Y_2, \ldots, Y_n)$, so that
$$p(y \mid x_1, \ldots, x_n) = \begin{cases} \prod_{i : x_i = 1} q_i, & \text{if } y = 0; \\ 1 - \prod_{i : x_i = 1} q_i, & \text{if } y = 1. \end{cases}$$

[Figure: $X_1, X_2, \ldots, X_n$ with children $Y_1, Y_2, \ldots, Y_n$, and $Y = \mathrm{OR}(Y_1, Y_2, \ldots, Y_n)$.]

Express the message $\pi_{X_i, Y_i}(x_i)$ in the vector form $[\pi_{X_i,Y_i}(1), \pi_{X_i,Y_i}(0)]$:
$$\pi_{X_i, Y_i} = [\,p_i,\; 1 - p_i\,].$$
Similarly, express $\pi_{Y_i, Y}(y_i)$ as $[\pi_{Y_i,Y}(1), \pi_{Y_i,Y}(0)]$:
$$\pi_{Y_i, Y}(y_i) = \sum_{x_i \in \{0,1\}} \pi_{X_i, Y_i}(x_i)\, p(y_i \mid x_i), \qquad \text{so } \pi_{Y_i, Y} = [\,p_i (1 - q_i),\; p_i q_i + (1 - p_i)\,].$$

24 Example of Noisy-OR Gate (cont.)

Suppose the evidence $e: \{Y = 1\}$ is received. Then $Y$ sends a message $\lambda_{Y,Y_i} = [\lambda_{Y,Y_i}(1), \lambda_{Y,Y_i}(0)]$ to each $Y_i$, where
$$\lambda_{Y, Y_i}(y_i) = \sum_{y_k \in \{0,1\},\, k \neq i} p(1 \mid y_1, \ldots, y_n) \prod_{j \neq i} \pi_{Y_j, Y}(y_j).$$
(What are these values?)

Subsequently, each $Y_i$ sends to $X_i$ the message $\lambda_{Y_i, X_i}(x_i)$:
$$\lambda_{Y_i, X_i}(1) = (1 - q_i)\, \lambda_{Y, Y_i}(1) + q_i\, \lambda_{Y, Y_i}(0), \qquad \lambda_{Y_i, X_i}(0) = \lambda_{Y, Y_i}(0).$$

Each $X_i$ can then calculate its marginal and the posterior probability of $X_i = 1$:
$$P(X_i = 1, e) = P(X_i = 1)\, \lambda_{Y_i, X_i}(1), \qquad P(X_i = 0, e) = P(X_i = 0)\, \lambda_{Y_i, X_i}(0),$$
$$P(X_i = 1 \mid e) = \frac{p_i\, \lambda_{Y_i, X_i}(1)}{p_i\, \lambda_{Y_i, X_i}(1) + (1 - p_i)\, \lambda_{Y_i, X_i}(0)}.$$
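
A numeric sketch of this noisy-OR computation (my own illustration; the values of $p_i$ and $q_i$ are hypothetical). Rather than asserting a closed form for $\lambda_{Y,Y_i}$, it evaluates the displayed sum by enumeration, so running it also answers the slide's question numerically:

```python
import itertools
import numpy as np

# Hypothetical noisy-OR with n = 3 causes; p_i = P(X_i = 1), q_i = inhibitor prob.
p = np.array([0.3, 0.5, 0.1])
q = np.array([0.2, 0.4, 0.7])

# pi messages from the previous slide, as vectors [value at 1, value at 0].
pi_Yi_Y = np.stack([p * (1 - q), p * q + (1 - p)], axis=1)  # pi_{Y_i,Y}

def lam_Y_Yi(i):
    """lambda_{Y,Y_i} for evidence Y = 1, by enumerating the other Y_j."""
    others = [j for j in range(len(p)) if j != i]
    out = np.zeros(2)
    for yi in (1, 0):
        for ys in itertools.product((1, 0), repeat=len(others)):
            w = np.prod([pi_Yi_Y[j, 1 - yj] for j, yj in zip(others, ys)])
            all_zero = (yi == 0) and all(yj == 0 for yj in ys)
            out[1 - yi] += w * (0.0 if all_zero else 1.0)  # p(Y=1 | y_1..y_n)
    return out  # vector [lambda(1), lambda(0)]

for i in range(len(p)):
    lY = lam_Y_Yi(i)
    lam1 = (1 - q[i]) * lY[0] + q[i] * lY[1]   # lambda_{Y_i,X_i}(1)
    lam0 = lY[1]                                # lambda_{Y_i,X_i}(0)
    post = p[i] * lam1 / (p[i] * lam1 + (1 - p[i]) * lam0)
    print(f"P(X_{i+1} = 1 | Y = 1) = {post:.4f}")
```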

25 Generalizations and Further Reading

Find most probable configurations: $\max_x p(x \,\&\, e)$.

Conditioning: $G$ is not a tree. We condition on certain variables to create several singly connected networks, and then fuse the calculated results together.

Loopy belief propagation: $G$ is not a tree, but we apply the message passing algorithm anyway. Algorithm variants and convergence analysis are active research topics.

Further reading:
1. Judea Pearl. Probabilistic Reasoning in Intelligent Systems, Morgan Kaufmann, 1988, Chap. 4.
