Machine Learning Summer School
1 Machine Learning Summer School, Lecture 1: Introduction to Graphical Models. Zoubin Ghahramani, Department of Engineering, University of Cambridge, UK, and Machine Learning Department, Carnegie Mellon University, USA. August 2009
2 Three main kinds of graphical models: factor graphs, undirected graphs, and directed graphs. Nodes correspond to random variables. Edges represent statistical dependencies between the variables.
3 Why do we need graphical models? Graphs are an intuitive way of representing and visualising the relationships between many variables. (Examples: family trees, electric circuit diagrams, neural networks.) A graph allows us to abstract out the conditional independence relationships between the variables from the details of their parametric forms. Thus we can answer questions like "Is A dependent on B given that we know the value of C?" just by looking at the graph. Graphical models allow us to define general message-passing algorithms that implement probabilistic inference efficiently. Thus we can answer queries like "What is p(A | C = c)?" without enumerating all settings of all variables in the model. Graphical models = statistics × graph theory × computer science.
4 Conditional Independence. Conditional Independence: X ⊥ Y | V ⟺ p(X | Y, V) = p(X | V) when p(Y, V) > 0. Also X ⊥ Y | V ⟺ p(X, Y | V) = p(X | V) p(Y | V). In general we can think of conditional independence between sets of variables: X ⊥ Y | V ⟺ p(X, Y | V) = p(X | V) p(Y | V), where X, Y, and V may now denote sets. Marginal Independence: X ⊥ Y ⟺ X ⊥ Y | ∅ ⟺ p(X, Y) = p(X) p(Y).
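The defining equalities above can be checked mechanically on any small joint table. The following sketch is not from the lecture; the distribution and all numbers are invented. It builds a joint over three binary variables that factorizes as p(v) p(x | v) p(y | v) and verifies that p(x | y, v) = p(x | v) for every assignment:

```python
import itertools

# A made-up joint distribution over three binary variables X, Y, V,
# constructed so that X and Y are conditionally independent given V:
# p(x, y, v) = p(v) * p(x | v) * p(y | v)
p_v = {0: 0.6, 1: 0.4}
p_x_given_v = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.3, 1: 0.7}}   # p_x_given_v[v][x]
p_y_given_v = {0: {0: 0.2, 1: 0.8}, 1: {0: 0.5, 1: 0.5}}   # p_y_given_v[v][y]

joint = {(x, y, v): p_v[v] * p_x_given_v[v][x] * p_y_given_v[v][y]
         for x, y, v in itertools.product([0, 1], repeat=3)}

def marginal(keep):
    """Sum the joint over all variables not in `keep` (indices: 0=X, 1=Y, 2=V)."""
    out = {}
    for assignment, prob in joint.items():
        key = tuple(assignment[i] for i in sorted(keep))
        out[key] = out.get(key, 0.0) + prob
    return out

p_yv = marginal({1, 2})
p_xv = marginal({0, 2})
p_vm = marginal({2})

for x, y, v in itertools.product([0, 1], repeat=3):
    lhs = joint[(x, y, v)] / p_yv[(y, v)]      # p(x | y, v)
    rhs = p_xv[(x, v)] / p_vm[(v,)]            # p(x | v)
    assert abs(lhs - rhs) < 1e-12
print("p(x | y, v) = p(x | v) for all values: X is independent of Y given V")
```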
5 Conditional and Marginal Independence (Examples). Amount of Speeding Fine ⊥ Type of Car | Speed; Lung Cancer ⊥ Yellow Teeth | Smoking; (Position, Velocity)_{t+1} ⊥ (Position, Velocity)_{t−1} | (Position, Velocity)_t, Acceleration_t; Child's Genes ⊥ Grandparents' Genes | Parents' Genes; Ability of Team A ⊥ Ability of Team B; not (Ability of Team A ⊥ Ability of Team B | Outcome of A vs B Game).
6 Factor Graphs. Two types of nodes: the circles in a factor graph represent random variables (e.g. A); the filled dots represent factors in the joint distribution (e.g. g_1(A, C)).
(a) p(A, B, C, D, E) = (1/Z) g_1(A, C) g_2(B, C, D) g_3(C, D, E)
(b) p(A, B, C, D, E) = (1/Z) g_1(A, C) g_2(B, C) g_3(B, D) g_4(C, D) g_5(C, E) g_6(D, E)
The g_i are non-negative functions of their arguments, and Z is a normalization constant. E.g. in (a), if all variables are discrete and take values in some finite set:
Z = Σ_a Σ_b Σ_c Σ_d Σ_e g_1(A=a, C=c) g_2(B=b, C=c, D=d) g_3(C=c, D=d, E=e)
Two nodes are neighbors if they share a common factor.
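As a concrete illustration of the normalization constant, the sketch below (not part of the slides; the potential values are arbitrary) fills the factors of factorization (a) with random non-negative values over a three-letter alphabet and computes Z by brute-force enumeration, confirming that the normalized product sums to one:

```python
import itertools
import random

random.seed(0)
values = [0, 1, 2]   # each of A, B, C, D, E takes one of 3 values (arbitrary choice)

# Arbitrary non-negative factors matching factorization (a): g1(A,C), g2(B,C,D), g3(C,D,E)
g1 = {(a, c): random.random() for a, c in itertools.product(values, repeat=2)}
g2 = {(b, c, d): random.random() for b, c, d in itertools.product(values, repeat=3)}
g3 = {(c, d, e): random.random() for c, d, e in itertools.product(values, repeat=3)}

# Z sums the unnormalized product over every joint setting of (A, B, C, D, E).
Z = sum(g1[(a, c)] * g2[(b, c, d)] * g3[(c, d, e)]
        for a, b, c, d, e in itertools.product(values, repeat=5))

def p(a, b, c, d, e):
    """Normalized joint probability under factorization (a)."""
    return g1[(a, c)] * g2[(b, c, d)] * g3[(c, d, e)] / Z

# The normalized distribution sums to 1 (up to floating point error).
total = sum(p(*x) for x in itertools.product(values, repeat=5))
print(Z, total)
```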
7 Factor Graphs. The circles in a factor graph represent random variables. The filled dots represent factors in the joint distribution.
(a) p(A, B, C, D, E) = (1/Z) g_1(A, C) g_2(B, C, D) g_3(C, D, E)
(b) p(A, B, C, D, E) = (1/Z) g_1(A, C) g_2(B, C) g_3(B, D) g_4(C, D) g_5(C, E) g_6(D, E)
Two nodes are neighbors if they share a common factor. Definition: a path is a sequence of neighboring nodes. Fact: X ⊥ Y | V if every path between X and Y contains some node in V. Corollary: given the neighbors of X, the variable X is conditionally independent of all other variables: X ⊥ Y | ne(X), for all Y ∉ {X} ∪ ne(X).
8 Proving Conditional Independence. Assume:
p(X, Y, V) = (1/Z) g_1(X, V) g_2(Y, V)   (1)
[Figure: factor graph in which X and Y are each connected to V through the factors g_1 and g_2.]
We want to show conditional independence:
X ⊥ Y | V ⟺ p(X | Y, V) = p(X | V)   (2)
Summing (1) over X we get:
p(Y, V) = (1/Z) [Σ_X g_1(X, V)] g_2(Y, V)   (3)
Dividing (1) by (3) we get:
p(X | Y, V) = g_1(X, V) / Σ_X g_1(X, V)   (4)
Since the r.h.s. of (4) doesn't depend on Y, it follows that X is independent of Y given V. Therefore factorization (1) implies conditional independence (2).
9 Undirected Graphical Models. In an Undirected Graphical Model, the joint probability over all variables can be written in a factored form:
p(x) = (1/Z) ∏_j g_j(x_{C_j})
where x = (x_1, ..., x_K), the C_j ⊆ {1, ..., K} are subsets of the set of all variables, and x_S ≡ (x_k : k ∈ S). Graph specification: create a node for each variable; connect nodes i and k if there exists a set C_j such that both i ∈ C_j and k ∈ C_j. These sets form the cliques of the graph (fully connected subgraphs). Note: Undirected Graphical Models are also called Markov Networks. Very similar to factor graphs.
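The graph-specification rule above can be written in a few lines. The sketch below is illustrative only; the index sets are hypothetical and chosen to match the running five-variable example. It connects nodes i and k whenever they co-occur in some set C_j:

```python
import itertools

# Hypothetical index sets C_j over variables A..E, written as sets of variable names.
index_sets = [{"A", "C"}, {"B", "C", "D"}, {"C", "D", "E"}]

edges = set()
for C_j in index_sets:
    # Connect every pair of variables that co-occur in the same set C_j.
    for i, k in itertools.combinations(sorted(C_j), 2):
        edges.add((i, k))

print(sorted(edges))
# [('A', 'C'), ('B', 'C'), ('B', 'D'), ('C', 'D'), ('C', 'E'), ('D', 'E')]
```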
10 Undirected Graphical Models. p(A, B, C, D, E) = (1/Z) g_1(A, C) g_2(B, C, D) g_3(C, D, E). Fact: X ⊥ Y | V if every path between X and Y contains some node in V. Corollary: given the neighbors of X, the variable X is conditionally independent of all other variables: X ⊥ Y | ne(X), for all Y ∉ {X} ∪ ne(X). Markov Blanket: V is a Markov Blanket for X iff X ⊥ Y | V for all Y ∉ {X} ∪ V. Markov Boundary: the minimal Markov Blanket, which equals ne(X) for undirected graphs and factor graphs.
11 Comparing Undirected Graphs and Factor Graphs. All nodes in (a), (b), and (c) have exactly the same neighbors and therefore these three graphs represent exactly the same conditional independence relationships. (c) also represents the fact that the probability factors into a product of pairwise functions. Consider the case where each variable is discrete and can take on K possible values. Then the functions in (a) and (b) are tables with O(K^3) cells, whereas in (c) they are O(K^2).
12 Problems with Undirected Graphs and Factor Graphs. In UGs and FGs, many useful independencies are not represented: two variables are connected merely because some other variable depends on them. [Figures: Rain and Sprinkler as parents of Ground wet in the directed graph, versus Rain and Sprinkler directly linked in the corresponding undirected graph.] This highlights the difference between marginal independence and conditional independence. R and S are marginally independent (i.e. given nothing), but they are conditionally dependent given G. Explaining away: observing that the sprinkler is on would explain away the observation that the ground was wet, making it less probable that it rained.
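A small numerical example makes explaining away concrete. In the sketch below all probabilities are invented for illustration: observing wet ground raises the probability of rain, and additionally observing that the sprinkler is on lowers it again:

```python
import itertools

# Made-up numbers for the Rain / Sprinkler / Ground-wet example (R, S marginally
# independent, G depends on both): p(R, S, G) = p(R) p(S) p(G | R, S)
p_r = {1: 0.2, 0: 0.8}
p_s = {1: 0.3, 0: 0.7}
p_g_given_rs = {(1, 1): 0.99, (1, 0): 0.9, (0, 1): 0.8, (0, 0): 0.05}  # p(G=1 | R, S)

def joint(r, s, g):
    pg1 = p_g_given_rs[(r, s)]
    return p_r[r] * p_s[s] * (pg1 if g == 1 else 1.0 - pg1)

def prob_rain(given):
    """P(R=1 | evidence), where `given` fixes some of S and G, by enumeration."""
    num = den = 0.0
    for r, s, g in itertools.product([0, 1], repeat=3):
        if all({"R": r, "S": s, "G": g}[k] == v for k, v in given.items()):
            den += joint(r, s, g)
            num += joint(r, s, g) if r == 1 else 0.0
    return num / den

print(prob_rain({}))                  # prior P(R=1) = 0.2
print(prob_rain({"G": 1}))            # observing wet ground raises P(R=1)
print(prob_rain({"G": 1, "S": 1}))    # also observing the sprinkler lowers it again
```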
13 Directed Acyclic Graphical Models (Bayesian Networks). A DAG model / Bayesian network¹ corresponds to a factorization of the joint probability distribution:
p(A, B, C, D, E) = p(A) p(B) p(C | A, B) p(D | B, C) p(E | C, D)
In general:
p(X_1, ..., X_n) = ∏_{i=1}^{n} p(X_i | X_{pa(i)})
where pa(i) are the parents of node i.
¹ Bayesian networks can be, and often are, learned using non-Bayesian (i.e. frequentist) methods; Bayesian networks (i.e. DAGs) do not require parameter or structure learning using Bayesian methods. Also called belief networks.
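The general factorization is straightforward to evaluate once the parent sets and conditional tables are given. The following sketch uses a small binary DAG matching the five-variable factorization above; the conditional probability tables are invented for illustration:

```python
# Evaluation of a DAG-model joint: p(x_1,...,x_n) = prod_i p(x_i | x_pa(i)).
# The conditional probability tables below are made-up illustrations.
parents = {"A": [], "B": [], "C": ["A", "B"], "D": ["B", "C"], "E": ["C", "D"]}

# cpt[X][(parent values...)] = p(X = 1 | parents); all variables are binary here.
cpt = {
    "A": {(): 0.3},
    "B": {(): 0.6},
    "C": {(0, 0): 0.1, (0, 1): 0.5, (1, 0): 0.4, (1, 1): 0.9},
    "D": {(0, 0): 0.2, (0, 1): 0.7, (1, 0): 0.3, (1, 1): 0.8},
    "E": {(0, 0): 0.05, (0, 1): 0.6, (1, 0): 0.5, (1, 1): 0.95},
}

def joint_prob(assignment):
    """Probability of a full assignment {var: 0/1} under the factored model."""
    prob = 1.0
    for var, pa in parents.items():
        pa_vals = tuple(assignment[p] for p in pa)
        p1 = cpt[var][pa_vals]
        prob *= p1 if assignment[var] == 1 else 1.0 - p1
    return prob

print(joint_prob({"A": 1, "B": 0, "C": 1, "D": 1, "E": 0}))
```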
14 Directed Acyclic Graphical Models (Bayesian Networks). Semantics: X ⊥ Y | V if V d-separates X from Y². Definition: V d-separates X from Y if every undirected path³ between X and Y is blocked by V. A path is blocked by V if there is a node W on the path such that either: 1. W has converging arrows along the path (→ W ←)⁴ and neither W nor its descendants are observed (in V), or 2. W does not have converging arrows along the path (→ W → or ← W →) and W is observed (W ∈ V). Corollary: Markov Boundary for X: {parents(X) ∪ children(X) ∪ parents-of-children(X)}.
² See also the Bayes Ball algorithm in the Appendix. ³ An undirected path ignores the direction of the edges. ⁴ Note that converging arrows along the path only refers to what happens on that path. Also called a collider.
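The definition can be implemented directly by enumerating the undirected paths between two nodes and testing the two blocking conditions at each intermediate node. The sketch below is a naive checker written for illustration (exponential in the number of paths, with a made-up example DAG), not the Bayes-ball algorithm from the appendix:

```python
def d_separated(X, Y, V, edges):
    """Check whether the set V d-separates X from Y in the DAG given by `edges`
    (a set of (parent, child) pairs), by testing every undirected path directly."""
    nodes = {n for e in edges for n in e}
    children = {n: {c for p, c in edges if p == n} for n in nodes}
    neighbours = {n: {c for p, c in edges if p == n} | {p for p, c in edges if c == n}
                  for n in nodes}

    def descendants(n):
        out, stack = set(), [n]
        while stack:
            m = stack.pop()
            for c in children[m]:
                if c not in out:
                    out.add(c)
                    stack.append(c)
        return out

    def paths(a, b, visited):
        if a == b:
            yield [a]
            return
        for n in neighbours[a] - visited:
            for rest in paths(n, b, visited | {n}):
                yield [a] + rest

    def blocked(path):
        for i in range(1, len(path) - 1):
            prev, w, nxt = path[i - 1], path[i], path[i + 1]
            collider = (prev, w) in edges and (nxt, w) in edges
            if collider:
                if w not in V and not (descendants(w) & V):
                    return True   # converging arrows, neither W nor descendants observed
            elif w in V:
                return True       # non-collider that is observed
        return False

    return all(blocked(p) for p in paths(X, Y, {X}))

# A made-up DAG: A -> C <- B, C -> D, C -> E
edges = {("A", "C"), ("B", "C"), ("C", "D"), ("C", "E")}
print(d_separated("A", "B", set(), edges))    # True:  A and B marginally independent
print(d_separated("A", "B", {"C"}, edges))    # False: conditioning on the collider C
print(d_separated("D", "E", {"C"}, edges))    # True:  C blocks the path D - C - E
```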
15 Examples of D-Separation in DAGs. Examples: … ⊥ … since each path is blocked (case 1) by an unobserved collider; not (… ⊥ …) since some path is not blocked; … ⊥ … | {…, …} since each path is blocked (case 2) by an observed node; not (… ⊥ … | …) since some path is not blocked. Note that it is the absence of edges that conveys conditional independence.
16 From Directed Trees to Undirected Trees.
p(x_1, x_2, ..., x_7) = p(x_3) p(x_1 | x_3) p(x_2 | x_3) p(x_4 | x_3) p(x_5 | x_4) p(x_6 | x_4) p(x_7 | x_4)
= [p(x_1, x_3) p(x_2, x_3) p(x_3, x_4) p(x_4, x_5) p(x_4, x_6) p(x_4, x_7)] / [p(x_3) p(x_3) p(x_4) p(x_4) p(x_4)]
= (product of cliques) / (product of clique intersections)
= g_1(x_1, x_3) g_2(x_2, x_3) g_3(x_3, x_4) g_4(x_4, x_5) g_5(x_4, x_6) g_6(x_4, x_7) = ∏_i g_i(x_{C_i})
Any directed tree can be converted into an undirected tree representing the same conditional independence relationships, and vice versa.
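The clique/intersection identity is easy to verify numerically. The sketch below uses a made-up three-node tree (x_3 → x_1, x_3 → x_2 with random conditional tables) and checks that the directed-tree joint equals the product of pairwise clique marginals divided by the marginal of their intersection:

```python
import itertools
import random

random.seed(1)
vals = [0, 1]

# A made-up 3-node directed tree x3 -> x1, x3 -> x2 with random conditional tables.
p_x3 = {0: 0.4, 1: 0.6}
p_x1_given_x3 = {v: random.random() for v in vals}   # p(x1 = 1 | x3 = v)
p_x2_given_x3 = {v: random.random() for v in vals}   # p(x2 = 1 | x3 = v)

def bern(p1, x):
    return p1 if x == 1 else 1.0 - p1

def joint(x1, x2, x3):
    return p_x3[x3] * bern(p_x1_given_x3[x3], x1) * bern(p_x2_given_x3[x3], x2)

# Pairwise clique marginals and the clique-intersection marginal p(x3).
p_13 = {(a, c): sum(joint(a, b, c) for b in vals) for a, c in itertools.product(vals, vals)}
p_23 = {(b, c): sum(joint(a, b, c) for a in vals) for b, c in itertools.product(vals, vals)}
p_3 = {c: sum(joint(a, b, c) for a, b in itertools.product(vals, vals)) for c in vals}

# Directed-tree joint equals (product of cliques) / (product of clique intersections).
for x1, x2, x3 in itertools.product(vals, repeat=3):
    assert abs(joint(x1, x2, x3) - p_13[(x1, x3)] * p_23[(x2, x3)] / p_3[x3]) < 1e-12
print("p(x1, x2, x3) = p(x1, x3) p(x2, x3) / p(x3) for every assignment")
```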
17 Directed Graphs for Statistical Models: Plate Notation. Consider the following simple model. A data set of N points is generated i.i.d. from a Gaussian with mean µ and standard deviation σ:
p(x_1, ..., x_N, µ, σ) = p(µ) p(σ) ∏_{n=1}^{N} p(x_n | µ, σ)
This can be represented graphically as follows: [Figure: µ and σ point to x_n inside a plate labelled N; equivalently, the unrolled graph in which µ and σ point to each of x_1, x_2, ..., x_N.]
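The plate model's joint is just the product of the two priors and the N i.i.d. likelihood terms. The sketch below evaluates the log of this joint; the particular priors on µ and σ are arbitrary choices made for illustration, since the slide does not specify them:

```python
import math
import random

random.seed(2)

def log_gaussian(x, mean, std):
    return -0.5 * math.log(2 * math.pi * std ** 2) - (x - mean) ** 2 / (2 * std ** 2)

# Arbitrary priors for illustration: mu ~ N(0, 10^2), log(sigma) ~ N(0, 1).
def log_prior_mu(mu):
    return log_gaussian(mu, 0.0, 10.0)

def log_prior_sigma(sigma):
    return log_gaussian(math.log(sigma), 0.0, 1.0) - math.log(sigma)  # log-normal density

def log_joint(xs, mu, sigma):
    """log p(x_1,...,x_N, mu, sigma) = log p(mu) + log p(sigma) + sum_n log p(x_n | mu, sigma)."""
    return (log_prior_mu(mu) + log_prior_sigma(sigma)
            + sum(log_gaussian(x, mu, sigma) for x in xs))

data = [random.gauss(1.5, 0.5) for _ in range(20)]   # N = 20 made-up observations
print(log_joint(data, mu=1.5, sigma=0.5))
print(log_joint(data, mu=0.0, sigma=0.5))            # worse fit, lower log joint
```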
18 Expressive Power of Directed and Undirected Graphs. No Directed Graph (Bayesian network) can represent these and only these independencies: no matter how we direct the arrows there will always be two non-adjacent parents sharing a common child, giving a dependence in the Directed Graph where the Undirected Graph has an independence. No Undirected Graph or Factor Graph can represent these and only these independencies. Directed graphs are better at expressing causal generative models; undirected graphs are better at representing soft constraints between variables.
19 Summary. Three kinds of graphical models: directed, undirected, factor (there are other important classes, e.g. directed mixed graphs). Marginal and conditional independence. Markov boundaries and d-separation. Differences between directed and undirected graphs. Next lectures: exact inference and propagation algorithms; parameter and structure learning in graphs; nonparametric approaches to graph learning.
20 Appendix: Some Examples of Directed Graphical Models. [Figures: factor analysis / probabilistic PCA (latent variables X_1, ..., X_K with loadings Λ generating observations Y); hidden Markov models and linear dynamical systems (a chain of states X_1, ..., X_T emitting Y_1, ..., Y_T); switching state-space models (with variables S_t, U_t, X_t, Y_t).]
21 Appendix: Examples of Undirected Graphical Models. Markov Random Fields (used in Computer Vision). Exponential Language Models (used in Speech and Language Modelling):
p(s) = (1/Z) p_0(s) exp{ Σ_i λ_i f_i(s) }
Products of Experts (widely applicable):
p(x) = (1/Z) ∏_j p_j(x | θ_j)
Boltzmann Machines (a kind of Neural Network / Ising Model).
22 Appendix: Clique Potentials and Undirected Graphs. Definition: a clique is a fully connected subgraph. By clique we usually mean maximal clique (i.e. not contained within another clique). C_i denotes the set of variables in the i-th clique.
p(x_1, ..., x_K) = (1/Z) ∏_i g_i(x_{C_i})
where Z = Σ_{x_1} ... Σ_{x_K} ∏_i g_i(x_{C_i}) is the normalization. Associated with each clique C_i is a non-negative function g_i(x_{C_i}) which measures compatibility between settings of the variables. Example: let C_1 = {A, B} with A ∈ {0, 1} and B ∈ {0, 1}. What does this mean? [Table of potential values g_1(A, B).]
23 Appendix: Hammersley–Clifford Theorem (1971). Theorem: a probability function p formed by a normalized product of positive functions on cliques of G is a Markov Field relative to G. Definition: the distribution p is a Markov Field relative to G if all conditional independence relations represented by G are true of p. G represents the following CI relations: if every path between X and Y in G passes through some node in V, then X ⊥ Y | V.
Proof: we need to show that if p is a product of functions on cliques of G then a variable is conditionally independent of its non-neighbors in G given its neighbors in G. That is, ne(x_l) is a Markov Blanket for x_l. Let x_m ∉ {x_l} ∪ ne(x_l). Then
p(x_l, x_m, ...) = (1/Z) ∏_i g_i(x_{C_i}) = (1/Z) [∏_{i: l ∈ C_i} g_i(x_{C_i})] [∏_{j: l ∉ C_j} g_j(x_{C_j})] = (1/Z) f_1(x_l, ne(x_l)) f_2(ne(x_l), x_m, ...)
It follows that p(x_l, x_m | ne(x_l)) = p(x_l | ne(x_l)) p(x_m | ne(x_l)), i.e. x_l ⊥ x_m | ne(x_l).
24 Appendix: The Bayes-Ball Algorithm. Game: can you get a ball from X to Y without being blocked by V? Depending on the direction the ball came from and the type of node, the ball can pass through (from a parent to all children, from a child to all parents), bounce back (from any parent to all parents, or from any child to all children), or be blocked. An unobserved (hidden) node (W ∉ V) passes balls through, but also bounces back balls from children. An observed (given) node (W ∈ V) bounces back balls from parents, but blocks balls from children.