Markov Random Fields
1 Markov Random Fields
Umamahesh Srinivas, ipal Group Meeting, February 25, 2011
2 Outline
1. Basic graph-theoretic concepts
2. Markov chain
3. Markov random field (MRF)
4. Gauss-Markov random field (GMRF), and applications
5. Other popular MRFs
3 References
1. Charles Bouman, Markov random fields and stochastic image models. Tutorial presented at ICIP.
2. Mario Figueiredo, Bayesian methods and Markov random fields. Tutorial presented at CVPR.
4 Basic graph-theoretic concepts
A graph G = (V, E) is a finite collection of nodes (or vertices) V = {n_1, n_2, ..., n_N} and a set of edges E ⊆ (V choose 2). We consider only undirected graphs.
Neighbors: two nodes n_i, n_j ∈ V are neighbors if (n_i, n_j) ∈ E.
Neighborhood of a node: N(n_i) = {n_j : (n_i, n_j) ∈ E}. Neighborhood is a symmetric relation: n_i ∈ N(n_j) ⟺ n_j ∈ N(n_i).
Complete graph: every pair of distinct nodes is joined by an edge, i.e. for all n_i ∈ V, N(n_i) = V \ {n_i}.
Clique: a complete subgraph of G.
Maximal clique: a clique to which no further node can be added while still retaining complete connectedness.
7 Illustration
V = {1, 2, 3, 4, 5, 6}
E = {(1, 2), (1, 3), (2, 4), (2, 5), (3, 4), (3, 6), (4, 6), (5, 6)}
N(4) = {2, 3, 6}
Examples of cliques: {1}, {3, 4, 6}, {2, 5}
Set of all cliques: the singletons V, the edges E, and {3, 4, 6}
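The clique structure of this example can be checked mechanically. A brute-force sketch (the node and edge labels below are taken from the illustration; everything else is illustrative):

```python
from itertools import combinations

V = {1, 2, 3, 4, 5, 6}
E = {(1, 2), (1, 3), (2, 4), (2, 5), (3, 4), (3, 6), (4, 6), (5, 6)}
adj = {v: set() for v in V}
for a, b in E:
    adj[a].add(b)
    adj[b].add(a)

def is_clique(nodes):
    """A node set is a clique if every pair of its nodes is an edge."""
    return all(b in adj[a] for a, b in combinations(nodes, 2))

# All cliques: the 6 singletons, the 8 edges, and the triangle {3, 4, 6}.
cliques = [set(c) for r in range(1, len(V) + 1)
           for c in combinations(sorted(V), r) if is_clique(c)]
# Maximal cliques: those not strictly contained in a larger clique.
maximal = [c for c in cliques if not any(c < d for d in cliques)]
print(maximal)  # {3, 4, 6} plus the five edges not inside it
```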
8 Separation
Let A, B, C be three disjoint subsets of V. C separates A from B if every path from a node in A to a node in B contains some node in C.
Example: C = {1, 4, 6} separates A = {3} from B = {2, 5}.
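Separation can be tested with a breadth-first search that is forbidden from entering C. A minimal sketch, reusing the edge set of the illustration:

```python
from collections import deque

def separates(edges, A, B, C):
    """True iff every path from A to B passes through some node in C."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    # Explore from A without ever stepping into C; reaching B disproves separation.
    seen, queue = set(A), deque(A)
    while queue:
        u = queue.popleft()
        for w in adj.get(u, ()):
            if w in C or w in seen:
                continue
            if w in B:
                return False
            seen.add(w)
            queue.append(w)
    return True

E = [(1, 2), (1, 3), (2, 4), (2, 5), (3, 4), (3, 6), (4, 6), (5, 6)]
print(separates(E, {3}, {2, 5}, {1, 4, 6}))  # True, as in the slide's example
```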
9 Markov chains
Graphical model: associate each node of a graph with a random variable (or a collection thereof).
Homogeneous 1-D Markov chain: p(x_n | x_i, i < n) = p(x_n | x_{n-1})
Probability of a sequence: p(x) = p(x_0) ∏_{n=1}^{N} p(x_n | x_{n-1})
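The sequence probability factorizes into a product of one-step transitions. A toy two-state sketch (the initial distribution and transition matrix are made up for illustration):

```python
import numpy as np

# p(x) = p(x_0) * prod_n p(x_n | x_{n-1}) for a homogeneous chain.
# P[i, j] = p(x_n = j | x_{n-1} = i); values are illustrative.
p0 = np.array([0.5, 0.5])
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

def chain_prob(seq):
    """Probability of a state sequence under the chain above."""
    prob = p0[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        prob *= P[prev, cur]
    return prob

print(chain_prob([0, 0, 1]))  # 0.5 * 0.9 * 0.1 = 0.045
```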
10 2-D Markov chains
Advantages: simple expressions for probability; simple parameter estimation.
Disadvantages: no natural ordering of image pixels; anisotropic model behavior.
11 Random fields on graphs
Consider a collection of random variables x = (x_1, x_2, ..., x_N) with associated joint probability distribution p(x).
Let A, B, C be three disjoint subsets of V, and let x_A denote the collection of random variables indexed by A.
Conditional independence: A ⊥ B | C  ⟺  p(x_A, x_B | x_C) = p(x_A | x_C) p(x_B | x_C)
Markov random field: an undirected graphical model in which each node corresponds to a random variable (or a collection of random variables), and the edges identify conditional dependencies.
13 Markov properties
Pairwise Markovianity: (n_i, n_j) ∉ E implies that x_i and x_j are independent when conditioned on all other variables:
p(x_i, x_j | x_∖{i,j}) = p(x_i | x_∖{i,j}) p(x_j | x_∖{i,j})
Local Markovianity: given its neighborhood, a variable is independent of the rest of the variables:
p(x_i | x_V∖{i}) = p(x_i | x_N(i))
Global Markovianity: let A, B, C be three disjoint subsets of V. If C separating A from B implies
p(x_A, x_B | x_C) = p(x_A | x_C) p(x_B | x_C),
then p(·) is global Markov w.r.t. G.
16 Hammersley-Clifford theorem
Consider a random field x on a graph G such that p(x) > 0, and let C range over the set of all maximal cliques of the graph.
If the field has the local Markov property, then p(x) can be written as a Gibbs distribution:
p(x) = (1/Z) exp{ −∑_C V_C(x_C) },
where Z, the normalizing constant, is called the partition function, and the V_C(x_C) are the clique potentials.
Conversely, if p(x) can be written in Gibbs form for the cliques of some graph, then it has the global Markov property.
Fundamental consequence: every Markov random field can be specified via clique potentials.
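For a small state space the Gibbs form can be evaluated exhaustively. A sketch on the 3-node chain 1-2-3, whose maximal cliques are the edges {1,2} and {2,3}; the disagreement potential and the parameter beta are illustrative choices, not from the slides:

```python
import itertools
import math

beta = 1.0
cliques = [(0, 1), (1, 2)]  # maximal cliques of the chain, 0-indexed

def energy(x):
    """Sum of clique potentials V_C(x_C) = beta * 1[x_i != x_j]."""
    return sum(beta * (x[i] != x[j]) for i, j in cliques)

states = list(itertools.product([0, 1], repeat=3))
Z = sum(math.exp(-energy(x)) for x in states)        # partition function
p = {x: math.exp(-energy(x)) / Z for x in states}    # Gibbs distribution
print(p[(0, 0, 0)] > p[(0, 1, 0)])  # True: smoother configurations are more probable
```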
19 Regular rectangular lattices
V = {(i, j) : i = 1, ..., M, j = 1, ..., N}
Order-K neighborhood system: N_K(i, j) = {(m, n) ≠ (i, j) : (i − m)² + (j − n)² ≤ K}
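The order-K system can be enumerated directly from the definition. A minimal sketch (0-indexed lattice; the function name is ours):

```python
def neighborhood(i, j, K, M, N):
    """Order-K neighborhood on an M x N lattice:
    sites (m, n) != (i, j) with (i - m)^2 + (j - n)^2 <= K."""
    return sorted(
        (m, n)
        for m in range(M) for n in range(N)
        if 0 < (i - m) ** 2 + (j - n) ** 2 <= K
    )

print(neighborhood(2, 2, 1, 5, 5))  # first order: the 4-connected neighbors
print(neighborhood(2, 2, 2, 5, 5))  # second order: the 8-connected neighbors
```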
20 Auto-models
Only pairwise interactions; in terms of clique potentials, |C| > 2 ⟹ V_C(·) = 0.
These are the simplest possible neighborhood models.
21 Gauss-Markov random fields (GMRF)
Joint probability density (assuming zero mean):
p(x) = (2π)^{−n/2} |Σ|^{−1/2} exp{ −(1/2) xᵀ Σ⁻¹ x }
Quadratic form in the exponent: xᵀ Σ⁻¹ x = ∑_i ∑_j x_i x_j (Σ⁻¹)_{ij}, which involves only pairwise interactions: an auto-model.
The neighborhood system is determined by the potential matrix Σ⁻¹.
Local conditionals are univariate Gaussian.
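The claim that local conditionals are univariate Gaussian can be verified numerically: for a zero-mean Gaussian with precision J = Σ⁻¹, the conditional of x_0 given the rest has precision J_00 and mean −(1/J_00) ∑_{j≠0} J_0j x_j. A sketch with a made-up 3 x 3 precision matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative sparse precision (potential) matrix J = Sigma^{-1};
# the zero entries encode missing edges (conditional independence).
J = np.array([[ 2.0, -0.5,  0.0],
              [-0.5,  2.0, -0.5],
              [ 0.0, -0.5,  2.0]])

x = rng.standard_normal(3)
# Local conditional of x_0 given the rest, read off the precision matrix.
cond_mean = -(J[0, 1] * x[1] + J[0, 2] * x[2]) / J[0, 0]
cond_var = 1.0 / J[0, 0]

# Cross-check against the standard Gaussian conditioning formulas on Sigma.
Sigma = np.linalg.inv(J)
S_aa, S_ab, S_bb = Sigma[0, 0], Sigma[0, 1:], Sigma[1:, 1:]
ref_mean = S_ab @ np.linalg.solve(S_bb, x[1:])
ref_var = S_aa - S_ab @ np.linalg.solve(S_bb, S_ab)
print(np.isclose(cond_mean, ref_mean), np.isclose(cond_var, ref_var))  # True True
```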
23 Gauss-Markov random fields
Specification via clique potentials:
V_C(x_C) = (1/2) (∑_{i∈C} α_i^C x_i)² = (1/2) (∑_{i∈V} α_i^C x_i)², as long as α_i^C = 0 for i ∉ C.
The exponent of the GMRF density becomes:
∑_C V_C(x_C) = (1/2) ∑_C (∑_{i∈V} α_i^C x_i)²
= (1/2) ∑_{i∈V} ∑_{j∈V} (∑_C α_i^C α_j^C) x_i x_j = (1/2) xᵀ Σ⁻¹ x,
i.e. (Σ⁻¹)_{ij} = ∑_C α_i^C α_j^C.
24 GMRF: application to image processing
Classical image smoothing prior: consider an image as a rectangular lattice with first-order pixel neighborhoods.
Cliques: pairs of vertically or horizontally adjacent pixels.
Clique potentials: squares of first-order differences (an approximation of the continuous derivative):
V_{(i,j),(i,j−1)}(x_{i,j}, x_{i,j−1}) = (1/2) (x_{i,j} − x_{i,j−1})²
Resulting Σ⁻¹: block-tridiagonal with tridiagonal blocks.
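The precision matrix induced by these difference potentials can be assembled explicitly; the quadratic form ∑ (1/2)(x_s − x_t)² equals (1/2) xᵀ L x with L a graph Laplacian. A dense sketch for a tiny image (the function name and row-major flattening are our choices):

```python
import numpy as np

def smoothing_precision(M, N):
    """Precision matrix Sigma^{-1} for the first-order smoothing prior,
    i.e. sum of (1/2)(x_s - x_t)^2 over horizontally/vertically adjacent
    pixels of an M x N image flattened in row-major order."""
    n = M * N
    J = np.zeros((n, n))
    idx = lambda i, j: i * N + j
    for i in range(M):
        for j in range(N):
            for di, dj in ((0, 1), (1, 0)):      # right and down neighbors
                ii, jj = i + di, j + dj
                if ii < M and jj < N:
                    a, b = idx(i, j), idx(ii, jj)
                    J[a, a] += 1; J[b, b] += 1   # each adjacent pair adds
                    J[a, b] -= 1; J[b, a] -= 1   # a graph-Laplacian term
    return J

J = smoothing_precision(3, 3)
print(J.shape)  # (9, 9); with row-major order, block-tridiagonal
```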
25 Bayesian image restoration with GMRF prior
Observation model: y = Hx + n, with n ∼ N(0, σ²I)
Smoothing GMRF prior: p(x) ∝ exp{ −(1/2) xᵀ Σ⁻¹ x }
MAP estimate: x̂ = [σ² Σ⁻¹ + Hᵀ H]⁻¹ Hᵀ y
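The closed-form MAP estimate is a single linear solve. A 1-D denoising sketch under assumed values (no blur, H = I, first-difference prior; signal, noise level, and seed are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 16
x_true = np.cumsum(rng.standard_normal(n)) * 0.1            # a smooth signal
H = np.eye(n)                                               # denoising: no blur
sigma2 = 0.5
y = H @ x_true + np.sqrt(sigma2) * rng.standard_normal(n)

# First-difference smoothing prior: Sigma^{-1} = D^T D.
D = np.diff(np.eye(n), axis=0)
J = D.T @ D

# MAP estimate: x_hat = (sigma^2 Sigma^{-1} + H^T H)^{-1} H^T y.
x_hat = np.linalg.solve(sigma2 * J + H.T @ H, H.T @ y)

# The estimate trades data fidelity for smoothness: since it minimizes the
# penalized objective, it is never rougher than the observation itself.
print((D @ x_hat) @ (D @ x_hat) <= (D @ y) @ (D @ y))  # True
```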
26 Bayesian image restoration with GMRF prior
Figure: (a) Original image, (b) blurred and slightly noisy image, (c) restored version of (b), (d) no blur, severe noise, (e) restored version of (d).
Deblurring: good. Denoising: oversmoothing; edge discontinuities are smoothed out.
How to preserve discontinuities? Other prior models:
- Hidden/latent binary random variables
- Robust potential functions (e.g. ℓ2 vs. ℓ1 norm)
28 Compound GMRF
Insert binary variables v to turn off clique potentials. Modified clique potentials:
V(x_{i,j}, x_{i,j−1}, v_{i,j}) = (1/2) (1 − v_{i,j}) (x_{i,j} − x_{i,j−1})²
Intuitive explanation: v = 0 ⟹ the clique potential is quadratic ("on"); v = 1 ⟹ V_C(·) = 0, i.e. no smoothing; the image has an edge at this location.
Can choose separate latent variables v and h for vertical and horizontal edges respectively:
p(x | h, v) ∝ exp{ −(1/2) xᵀ Σ⁻¹(h, v) x }
MAP estimate: x̂ = [σ² Σ⁻¹(h, v) + Hᵀ H]⁻¹ Hᵀ y
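The gating effect of v is easy to see in isolation; a toy sketch of the modified potential with illustrative pixel values:

```python
def V(x1, x2, v):
    """Compound-GMRF clique potential: 0.5 * (1 - v) * (x1 - x2)^2.
    v = 1 declares an edge between the two pixels and disables smoothing."""
    return 0.5 * (1 - v) * (x1 - x2) ** 2

print(V(0.0, 5.0, 0))  # 12.5: a large jump is penalized when no edge is declared
print(V(0.0, 5.0, 1))  # 0.0: the same jump is free once an edge is declared
```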
32 Discontinuity-preserving restoration
Convex potentials:
- Generalized Gaussians: V(x) = |x|^p, p ∈ [1, 2]
- Stevenson: V(x) = x² for |x| < a; 2a|x| − a² for |x| ≥ a
- Green: V(x) = 2a² log cosh(x/a)
Non-convex potentials:
- Blake-Zisserman: V(x) = (min{|x|, a})²
- Geman-McClure: V(x) = x² / (x² + a²)
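The qualitative difference between these families shows up in the marginal cost of enlarging a discontinuity: convex potentials keep charging for larger jumps, while the non-convex ones saturate, which is what lets them preserve sharp edges. A sketch with a = 1 (the dictionary keys and the `jump` helper are ours):

```python
import numpy as np

a = 1.0
potentials = {
    "gen_gauss_l1":    lambda x: np.abs(x),                    # p = 1: convex
    "stevenson":       lambda x: np.where(np.abs(x) < a, x ** 2,
                                          2 * a * np.abs(x) - a ** 2),
    "green":           lambda x: 2 * a ** 2 * np.log(np.cosh(x / a)),
    "blake_zisserman": lambda x: np.minimum(np.abs(x), a) ** 2,  # truncated
    "geman_mcclure":   lambda x: x ** 2 / (x ** 2 + a ** 2),
}

def jump(f):
    """Extra cost of growing a discontinuity from 4 to 5 gray levels."""
    return float(f(np.array([5.0]))[0] - f(np.array([4.0]))[0])

print(jump(potentials["gen_gauss_l1"]))     # 1.0: cost keeps growing linearly
print(jump(potentials["blake_zisserman"]))  # 0.0: cost is capped at a^2
```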
33 Ising model (2-D MRF)
V(x_i, x_j) = β δ(x_i ≠ x_j), where β is a model parameter.
Energy function: ∑_C V_C(x_C) = β × (boundary length)
Longer boundaries are less probable.
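The boundary-length interpretation can be computed by counting disagreeing neighbor pairs. A minimal sketch on a binary label image (the function name and example arrays are ours):

```python
import numpy as np

def ising_energy(x, beta):
    """Ising prior energy: beta times the number of disagreeing
    horizontal/vertical neighbor pairs (the total boundary length)."""
    h = np.sum(x[:, 1:] != x[:, :-1])   # horizontal disagreements
    v = np.sum(x[1:, :] != x[:-1, :])   # vertical disagreements
    return beta * (h + v)

flat = np.zeros((4, 4), dtype=int)
speckled = flat.copy()
speckled[1, 1] = speckled[2, 3] = 1     # two isolated foreground pixels

print(ising_energy(flat, 1.0))      # 0.0: no boundary at all
print(ising_energy(speckled, 1.0))  # 7.0: longer boundary, higher energy
```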
34 Application: image segmentation
A discrete MRF models the segmentation field; each class is represented by a value X_s ∈ {0, ..., M − 1}.
Joint distribution: P{Y ∈ dy, X = x} = p(y | x) p(x)
(Bayesian) MAP estimation:
X̂ = arg max_x p_{X|Y}(x | Y) = arg max_x (log p(Y | x) + log p(x))
37 MAP optimization for segmentation
Data model: p_{y|x}(y | x) = ∏_{s∈S} p(y_s | x_s)
Prior (Ising) model: p_x(x) = (1/Z) exp{ −β t_1(x) }, where t_1(x) is the number of horizontal and vertical neighbor pairs of x having different values.
MAP estimate (a hard optimization problem):
x̂ = arg min_x { −log p_{y|x}(y | x) + β t_1(x) }
39 Some proposed approaches
- Iterated conditional modes (ICM): iterative minimization w.r.t. each pixel
- Simulated annealing: generate samples from the posterior distribution
- Multi-scale resolution segmentation
Figure: (a) Synthetic image with three textures, (b) ICM, (c) simulated annealing, (d) multi-resolution approach.
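ICM is the simplest of these: sweep over pixels, and at each one pick the label minimizing the local data cost plus the Ising disagreement cost. A minimal sketch under assumed Gaussian class likelihoods with known means (the function, the two-class test image, and all parameter values are illustrative):

```python
import numpy as np

def icm_segment(y, means, sigma2, beta, n_iters=10):
    """Iterated conditional modes for arg min { -log p(y|x) + beta t_1(x) },
    updating one pixel at a time. Gaussian likelihoods with known means."""
    M, N = y.shape
    x = np.argmin((y[..., None] - np.asarray(means)) ** 2, axis=-1)  # ML init
    for _ in range(n_iters):
        for i in range(M):
            for j in range(N):
                costs = []
                for label, mu in enumerate(means):
                    data = (y[i, j] - mu) ** 2 / (2 * sigma2)
                    disagree = sum(
                        x[i + di, j + dj] != label
                        for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                        if 0 <= i + di < M and 0 <= j + dj < N
                    )
                    costs.append(data + beta * disagree)
                x[i, j] = int(np.argmin(costs))
    return x

rng = np.random.default_rng(2)
truth = np.zeros((8, 8), dtype=int)
truth[:, 4:] = 1                                   # two-region ground truth
y = np.array([0.0, 1.0])[truth] + 0.3 * rng.standard_normal((8, 8))
seg = icm_segment(y, means=[0.0, 1.0], sigma2=0.09, beta=1.0)
print((seg == truth).mean())
```

ICM converges quickly but only to a local minimum of the MAP objective, which is why annealing and multi-resolution schemes are listed as alternatives.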
40 Summary
Graphical models study probability distributions whose conditional dependencies arise from a specific graph structure.
A Markov random field is an undirected graphical model with special factorization properties.
2-D MRFs have been widely used as priors in image processing problems.
The choice of potential functions leads to different optimization problems.
More informationComputer Vision Group Prof. Daniel Cremers. 11. Sampling Methods: Markov Chain Monte Carlo
Group Prof. Daniel Cremers 11. Sampling Methods: Markov Chain Monte Carlo Markov Chain Monte Carlo In high-dimensional spaces, rejection sampling and importance sampling are very inefficient An alternative
More informationProbabilistic Graphical Models
School of Computer Science Probabilistic Graphical Models Gaussian graphical models and Ising models: modeling networks Eric Xing Lecture 0, February 7, 04 Reading: See class website Eric Xing @ CMU, 005-04
More informationSubmodularity in Machine Learning
Saifuddin Syed MLRG Summer 2016 1 / 39 What are submodular functions Outline 1 What are submodular functions Motivation Submodularity and Concavity Examples 2 Properties of submodular functions Submodularity
More informationEstimation theory and information geometry based on denoising
Estimation theory and information geometry based on denoising Aapo Hyvärinen Dept of Computer Science & HIIT Dept of Mathematics and Statistics University of Helsinki Finland 1 Abstract What is the best
More informationChapter 10: Random Fields
LEARNING AND INFERENCE IN GRAPHICAL MODELS Chapter 10: Random Fields Dr. Martin Lauer University of Freiburg Machine Learning Lab Karlsruhe Institute of Technology Institute of Measurement and Control
More information4.1 Notation and probability review
Directed and undirected graphical models Fall 2015 Lecture 4 October 21st Lecturer: Simon Lacoste-Julien Scribe: Jaime Roquero, JieYing Wu 4.1 Notation and probability review 4.1.1 Notations Let us recall
More informationEnergy Based Models. Stefano Ermon, Aditya Grover. Stanford University. Lecture 13
Energy Based Models Stefano Ermon, Aditya Grover Stanford University Lecture 13 Stefano Ermon, Aditya Grover (AI Lab) Deep Generative Models Lecture 13 1 / 21 Summary Story so far Representation: Latent
More informationHigh-dimensional graphical model selection: Practical and information-theoretic limits
1 High-dimensional graphical model selection: Practical and information-theoretic limits Martin Wainwright Departments of Statistics, and EECS UC Berkeley, California, USA Based on joint work with: John
More informationRepresentation of undirected GM. Kayhan Batmanghelich
Representation of undirected GM Kayhan Batmanghelich Review Review: Directed Graphical Model Represent distribution of the form ny p(x 1,,X n = p(x i (X i i=1 Factorizes in terms of local conditional probabilities
More informationMarkov Networks. l Like Bayes Nets. l Graphical model that describes joint probability distribution using tables (AKA potentials)
Markov Networks l Like Bayes Nets l Graphical model that describes joint probability distribution using tables (AKA potentials) l Nodes are random variables l Labels are outcomes over the variables Markov
More informationMarkov Random Fields (A Rough Guide)
Sigmedia, Electronic Engineering Dept., Trinity College, Dublin. 1 Markov Random Fields (A Rough Guide) Anil C. Kokaram anil.kokaram@tcd.ie Electrical and Electronic Engineering Dept., University of Dublin,
More informationJoint Optimization of Segmentation and Appearance Models
Joint Optimization of Segmentation and Appearance Models David Mandle, Sameep Tandon April 29, 2013 David Mandle, Sameep Tandon (Stanford) April 29, 2013 1 / 19 Overview 1 Recap: Image Segmentation 2 Optimization
More informationChange Detection in Optical Aerial Images by a Multi-Layer Conditional Mixed Markov Model
Change Detection in Optical Aerial Images by a Multi-Layer Conditional Mixed Markov Model Csaba Benedek 12 Tamás Szirányi 1 1 Distributed Events Analysis Research Group Computer and Automation Research
More informationLearning discrete graphical models via generalized inverse covariance matrices
Learning discrete graphical models via generalized inverse covariance matrices Duzhe Wang, Yiming Lv, Yongjoon Kim, Young Lee Department of Statistics University of Wisconsin-Madison {dwang282, lv23, ykim676,
More informationSpectral Clustering. Spectral Clustering? Two Moons Data. Spectral Clustering Algorithm: Bipartioning. Spectral methods
Spectral Clustering Seungjin Choi Department of Computer Science POSTECH, Korea seungjin@postech.ac.kr 1 Spectral methods Spectral Clustering? Methods using eigenvectors of some matrices Involve eigen-decomposition
More informationEnergy Minimization via Graph Cuts
Energy Minimization via Graph Cuts Xiaowei Zhou, June 11, 2010, Journal Club Presentation 1 outline Introduction MAP formulation for vision problems Min-cut and Max-flow Problem Energy Minimization via
More informationIntroduction to Graphical Models
Introduction to Graphical Models STA 345: Multivariate Analysis Department of Statistical Science Duke University, Durham, NC, USA Robert L. Wolpert 1 Conditional Dependence Two real-valued or vector-valued
More informationMassachusetts Institute of Technology Department of Electrical Engineering and Computer Science Algorithms For Inference Fall 2014
Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science 6.438 Algorithms For Inference Fall 2014 Problem Set 3 Issued: Thursday, September 25, 2014 Due: Thursday,
More informationGraphical Models and Independence Models
Graphical Models and Independence Models Yunshu Liu ASPITRG Research Group 2014-03-04 References: [1]. Steffen Lauritzen, Graphical Models, Oxford University Press, 1996 [2]. Christopher M. Bishop, Pattern
More informationAn Introduction to Exponential-Family Random Graph Models
An Introduction to Exponential-Family Random Graph Models Luo Lu Feb.8, 2011 1 / 11 Types of complications in social network Single relationship data A single relationship observed on a set of nodes at
More information4 : Exact Inference: Variable Elimination
10-708: Probabilistic Graphical Models 10-708, Spring 2014 4 : Exact Inference: Variable Elimination Lecturer: Eric P. ing Scribes: Soumya Batra, Pradeep Dasigi, Manzil Zaheer 1 Probabilistic Inference
More informationIntroduction to Graphical Models. Srikumar Ramalingam School of Computing University of Utah
Introduction to Graphical Models Srikumar Ramalingam School of Computing University of Utah Reference Christopher M. Bishop, Pattern Recognition and Machine Learning, Jonathan S. Yedidia, William T. Freeman,
More informationHigh dimensional Ising model selection
High dimensional Ising model selection Pradeep Ravikumar UT Austin (based on work with John Lafferty, Martin Wainwright) Sparse Ising model US Senate 109th Congress Banerjee et al, 2008 Estimate a sparse
More informationThe Gibbs fields approach and related dynamics in image processing
Condensed Matter Physics 200, Vol. 11, No 2(54), pp. 293 312 The Gibbs fields approach and related dynamics in image processing X.Descombes 1, E.Zhizhina 2 1 Ariana, Joint group, CNRS/INRIA/UNSA, INRIA,
More informationIntroduction to Bayesian methods in inverse problems
Introduction to Bayesian methods in inverse problems Ville Kolehmainen 1 1 Department of Applied Physics, University of Eastern Finland, Kuopio, Finland March 4 2013 Manchester, UK. Contents Introduction
More informationCS 294-2, Visual Grouping and Recognition èprof. Jitendra Malikè Sept. 8, 1999
CS 294-2, Visual Grouping and Recognition èprof. Jitendra Malikè Sept. 8, 999 Lecture è6 èbayesian estimation and MRFè DRAFT Note by Xunlei Wu æ Bayesian Philosophy æ Markov Random Fields Introduction
More informationBayesian Learning in Undirected Graphical Models
Bayesian Learning in Undirected Graphical Models Zoubin Ghahramani Gatsby Computational Neuroscience Unit University College London, UK http://www.gatsby.ucl.ac.uk/ Work with: Iain Murray and Hyun-Chul
More informationConditional Random Fields and beyond DANIEL KHASHABI CS 546 UIUC, 2013
Conditional Random Fields and beyond DANIEL KHASHABI CS 546 UIUC, 2013 Outline Modeling Inference Training Applications Outline Modeling Problem definition Discriminative vs. Generative Chain CRF General
More informationNotes on Regularization and Robust Estimation Psych 267/CS 348D/EE 365 Prof. David J. Heeger September 15, 1998
Notes on Regularization and Robust Estimation Psych 67/CS 348D/EE 365 Prof. David J. Heeger September 5, 998 Regularization. Regularization is a class of techniques that have been widely used to solve
More informationGibbs Field & Markov Random Field
Statistical Techniques in Robotics (16-831, F12) Lecture#07 (Wednesday September 19) Gibbs Field & Markov Random Field Lecturer: Drew Bagnell Scribe:Minghao Ruan 1 1 SLAM (continued from last lecture)
More informationSummary STK 4150/9150
STK4150 - Intro 1 Summary STK 4150/9150 Odd Kolbjørnsen May 22 2017 Scope You are expected to know and be able to use basic concepts introduced in the book. You knowledge is expected to be larger than
More informationProbabilistic Graphical Models (I)
Probabilistic Graphical Models (I) Hongxin Zhang zhx@cad.zju.edu.cn State Key Lab of CAD&CG, ZJU 2015-03-31 Probabilistic Graphical Models Modeling many real-world problems => a large number of random
More informationIntelligent Systems (AI-2)
Intelligent Systems (AI-2) Computer Science cpsc422, Lecture 18 Oct, 21, 2015 Slide Sources Raymond J. Mooney University of Texas at Austin D. Koller, Stanford CS - Probabilistic Graphical Models CPSC
More information