Bayesian Microwave Breast Imaging


1 Bayesian Microwave Breast Imaging
Leila Gharsalli 1, Hacheme Ayasso 2, Bernard Duchêne 1, Ali Mohammad-Djafari 1
1 Laboratoire des Signaux et Systèmes (L2S), UMR 8506 (CNRS-SUPELEC-Univ Paris-Sud), Plateau de Moulon, Gif-sur-Yvette, France.
2 GIPSA-LAB, Département Image Signal (CNRS-Univ Grenoble), BP , Saint-Martin-d'Hères, France.

2 Summary
1 Introduction
2 Forward problem
3 Inversion
4 Inversion
5 Application and results
6 Conclusions and prospects
7 References

3 Motivation
Breast cancer
- Most commonly diagnosed cancer and second cause of cancer death among women
- Need for early-stage cancer detection
Detection techniques
- X-ray mammography: ionizing radiation, costly, high false-positive rate, low sensitivity
- Ultrasound and MRI: high spatial resolution but lack of functional information
- Microwave imaging: non-negligible dielectric contrast between cancerous and normal healthy breast tissues, non-ionizing nature, easy accessibility
Promising alternative for breast tumor detection

4 Microwave imaging
- Different source positions
- Several receivers and frequencies in the microwave range
- Goal: retrieve the unknown target (electrical permittivity and conductivity) from measurements of the scattered field, the incident wave being known
- Non-linear, ill-posed inverse problem

5 Domain integral representation
E(r): total field in the object
Observation equation: y(r) = k1² ∫_D G(r, r′) χ(r′) E(r′) dr′
Coupling equation: E(r) = E_inc(r) + k1² ∫_D G(r, r′) χ(r′) E(r′) dr′
χ(r) = (k(r)² − k1²)/k1²: normalized contrast function
k(r)² = ω² ɛ0 µ0 ɛ_r(r) + i ω µ0 σ(r); k1: propagation constant of the outside medium
G(r, r′): Green's function; D: test domain
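The contrast function above is straightforward to evaluate numerically. A minimal sketch, not from the talk: the 2 GHz frequency and the tissue/medium parameter values below are assumptions chosen for illustration.

```python
import numpy as np

EPS0 = 8.854187817e-12   # vacuum permittivity (F/m)
MU0 = 4e-7 * np.pi       # vacuum permeability (H/m)

def wavenumber_sq(eps_r, sigma, freq):
    """k^2 = w^2 eps0 mu0 eps_r + i w mu0 sigma, as on the slide."""
    w = 2.0 * np.pi * freq
    return w**2 * EPS0 * MU0 * eps_r + 1j * w * MU0 * sigma

def contrast(eps_r, sigma, eps_r1, sigma1, freq):
    """Normalized contrast chi = (k^2 - k1^2) / k1^2."""
    k_sq = wavenumber_sq(eps_r, sigma, freq)
    k1_sq = wavenumber_sq(eps_r1, sigma1, freq)
    return (k_sq - k1_sq) / k1_sq

# Tumor-like tissue against a coupling medium (assumed values), at 2 GHz
chi = contrast(eps_r=55.3, sigma=1.57, eps_r1=10.0, sigma1=0.5, freq=2e9)
```

A pixel identical to the outside medium has χ = 0, which is why only the scatterer contributes to the domain integrals.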

6 Discretized model
- Test domain D partitioned into N_D elementary pixels
- Coupling equation rewritten in terms of contrast sources: w(r) = χ(r) E(r)
- Method of moments [Harrington, 1993]: bilinear discrete model
  y = G_o w + ɛ (observation)
  w = χ E_inc + χ G_c w + ξ (coupling)
- ɛ, ξ: measurement and model errors
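For a known contrast, this bilinear model can be simulated directly: solve the coupling equation for the contrast sources w, then apply the observation operator. A sketch in which small random matrices stand in for the method-of-moments operators G_o and G_c; the sizes and values are assumptions, not the talk's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_meas = 16, 8  # hypothetical pixel and receiver counts

# Hypothetical complex operators standing in for the MoM matrices
G_o = rng.normal(size=(n_meas, n_pix)) + 1j * rng.normal(size=(n_meas, n_pix))
G_c = 0.05 * (rng.normal(size=(n_pix, n_pix)) + 1j * rng.normal(size=(n_pix, n_pix)))
E_inc = np.ones(n_pix, dtype=complex)

chi = np.zeros(n_pix, dtype=complex)
chi[5:8] = 0.8 + 0.2j  # a small scatterer, zero contrast elsewhere

# Coupling: w = chi*E_inc + chi*(G_c w)  <=>  (I - diag(chi) G_c) w = diag(chi) E_inc
X = np.diag(chi)
w = np.linalg.solve(np.eye(n_pix) - X @ G_c, X @ E_inc)

# Observation: noiseless scattered field at the receivers
y = G_o @ w
```

The noise terms ɛ and ξ of the slide would simply be added to y and w when generating synthetic data.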

7 Bayesian framework
  y = G_o w + ɛ (observation)
  w = χ E_inc + χ G_c w + ξ (coupling)
Consider the first equation: y = G_o w + ɛ, with p(ɛ) = N(ɛ | 0, σ_ɛ² I)
Bayes rule: p(w | y) = p(y | w) p(w) / p(y)
  p(y | w) = N(y | G_o w, σ_ɛ² I)
  p(w) = N(w | 0, σ_w² I)
  p(w | y) = N(w | ŵ, Σ̂) with
  ŵ = (G_oᵀ G_o + λ I)⁻¹ G_oᵀ y
  Σ̂ = σ_ɛ² (G_oᵀ G_o + λ I)⁻¹, λ = σ_ɛ²/σ_w²
The same can be done with the second equation.
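For this linear-Gaussian step, the posterior mean is the classical Tikhonov/ridge solution with λ = σ_ɛ²/σ_w². A small sketch on synthetic real-valued data; the dimensions and noise level are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 20, 10
G = rng.normal(size=(m, n))            # stand-in for the observation operator
w_true = rng.normal(size=n)
sigma_eps, sigma_w = 0.1, 1.0
y = G @ w_true + sigma_eps * rng.normal(size=m)

lam = sigma_eps**2 / sigma_w**2        # regularization = noise/prior variance ratio
A = G.T @ G + lam * np.eye(n)
w_hat = np.linalg.solve(A, G.T @ y)    # posterior mean (= MAP estimate)
Sigma = sigma_eps**2 * np.linalg.inv(A)  # posterior covariance
```

The posterior mean satisfies the regularized normal equations exactly, and the covariance quantifies the remaining uncertainty, which is what the fully Bayesian treatment of the next slides exploits.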

8 Bayesian framework
p(χ, w, Θ | y; M) = p(y | w, Θ; M) p(w | χ, Θ; M) p(χ | Θ; M) p(Θ | M) / p(y | M)
Θ: hyperparameters
- p(y | w, Θ; M): likelihood (observation equation)
- p(w | χ, Θ; M): coupling equation
- p(χ | Θ; M): a priori on the object
- p(Θ | M): a priori information on the hyperparameters
- p(χ, w, Θ | y; M): a posteriori

9 Gauss-Markov-Potts prior
Objects composed of a finite number K of materials distributed into homogeneous and compact regions
- Mixture of independent Gaussian laws (MGI): pixels of the same class are assumed independent
- Gauss-Markov mixture model (MGM): high correlation between pixels of the same class, by means of a Markov field

10 Gauss-Markov-Potts prior
Homogeneity of a class:
  p(χ(r) | z(r) = k) = N(m_k(r), v_k(r)), k = 1, ..., K
  v_k(r) = v_k, r ∈ R_k
  m_k(r) = m_k if C(r) = 1
  m_k(r) = (1/N_V) Σ_{r′ ∈ V(r)} χ(r′) if C(r) = 0
  C(r) = Π_{r′ ∈ V(r)} δ(z(r′) − z(r)),
where V(r) is a neighborhood of r made of the four nearest pixels and N_V = card(V(r)).
Compactness of the regions (Potts):
  p(z | λ) ∝ exp( Σ_{r ∈ D} [ Φ(z(r)) + (λ/2) Σ_{r′ ∈ V(r)} δ(z(r) − z(r′)) ] )
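The Potts interaction term simply counts agreeing 4-nearest-neighbour label pairs, so compact label fields receive higher prior weight than fragmented ones. A toy illustration; the two 8×8 label fields are invented, not from the talk.

```python
import numpy as np

def potts_interaction(z, lam):
    """Sum of lam * delta(z(r), z(r')) over 4-nearest-neighbour pairs.

    Each horizontal and vertical pair is counted once; larger values mean
    a smoother, more compact label field (higher prior probability)."""
    same_h = (z[:, 1:] == z[:, :-1]).sum()
    same_v = (z[1:, :] == z[:-1, :]).sum()
    return lam * (same_h + same_v)

# A compact two-region field vs. a maximally fragmented checkerboard
z_blocky = np.zeros((8, 8), dtype=int)
z_blocky[:, 4:] = 1
z_checker = np.indices((8, 8)).sum(axis=0) % 2

e_blocky = potts_interaction(z_blocky, lam=1.0)
e_checker = potts_interaction(z_checker, lam=1.0)
```

The compact field scores 104 agreeing pairs against 0 for the checkerboard, which is exactly the mechanism that favours homogeneous, compact regions in the segmentation.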

11 Probabilistic modeling
Model and measurement errors: ɛ ~ N(0, v_ɛ I), ξ ~ N(0, v_ξ I)
Hyperparameters:
  p(m_k) = N(µ, τ), p(v_k) = IG(η, φ)
  p(v_ɛ) = IG(η_ɛ, φ_ɛ), p(v_ξ) = IG(η_ξ, φ_ξ)
A posteriori:
  p(χ, w, z, Θ | y) ∝ p(y | w, v_ɛ) p(w | χ, v_ξ) p(χ | z, m, v) p(z | λ) p(m | µ, τ) p(v | η, φ) p(v_ɛ | η_ɛ, φ_ɛ) p(v_ξ | η_ξ, φ_ξ)
with Θ = {m, v, v_ɛ, v_ξ}.
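The inverse-gamma laws IG(η, φ) used above are conjugate priors for Gaussian variances, and drawing from them only requires a gamma sampler. A sketch with assumed hyperparameter values (the η, φ below are illustrative, not the talk's settings):

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_inverse_gamma(eta, phi, size, rng):
    """Draw from IG(eta, phi): if g ~ Gamma(shape=eta, scale=1/phi),
    then 1/g ~ IG(eta, phi)."""
    return 1.0 / rng.gamma(shape=eta, scale=1.0 / phi, size=size)

# Weakly informative hyper-prior on a noise variance (assumed values)
v_eps = sample_inverse_gamma(eta=2.5, phi=1.0, size=10000, rng=rng)
# For eta > 1 the IG mean is phi / (eta - 1); here 1 / 1.5
```

Sampling the prior like this is a quick sanity check that the chosen η, φ put mass on plausible noise levels before running the inversion.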

12 Bayesian inversion techniques
p(χ, w, z, Θ | y) ∝ p(y | w, v_ɛ) p(w | χ, v_ξ) p(χ | z, m, v) p(z | λ) p(m | µ, τ) p(v | η, φ) p(v_ɛ | η_ɛ, φ_ɛ) p(v_ξ | η_ξ, φ_ξ)
- Joint Maximum A Posteriori (JMAP): intractable solutions in non-linear cases [Gharsalli et al., 2014], hence the need to approximate the joint a posteriori distribution by:
- Markov chain Monte Carlo sampling (MCMC): costly [Féron, 2006]
- Variational Bayesian Approach (VBA): less costly alternative [Ayasso et al., 2012]

13 Variational Bayesian Approach (VBA) [Mackay, 2003]
Aim: approximate p(χ, w, z, Θ | y; M) by a separable law:
  q(χ, w, z, Θ) = q1(χ) q2(w) q3(z) q4(Θ)
Criterion (Kullback-Leibler divergence):
  KL(q : p) = ∫ q ln(q/p) = ln p(y | M) − F(q)
Negative free energy:
  F(q) = ⟨ln( p(u, y | M) / q(u) )⟩_q, u = {χ, w, z, Θ}
Minimizing KL(q : p) ⇔ maximizing F(q): find q_opt = arg max_q F(q)
Coordinate updates: q(u_i) ∝ exp( ⟨ln p(u, y)⟩_{q(u_{j≠i})} )
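The coordinate update q(u_i) ∝ exp⟨ln p(u, y)⟩ can be demonstrated on a model far simpler than the imaging posterior: a Gaussian with unknown mean µ and precision τ under a Normal-Gamma prior, where both factor updates are in closed form. This standard textbook example (data and prior values assumed, not from the talk) shows the alternating scheme that VBA iterates:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(loc=3.0, scale=0.5, size=200)   # synthetic data, true tau = 4
N, xbar = len(x), x.mean()

# Weakly informative Normal-Gamma prior (assumed values)
mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3

E_tau = 1.0  # initial guess for <tau>
for _ in range(50):
    # Update q(mu) = N(mu_N, 1/lam_N), holding q(tau) fixed
    mu_N = (lam0 * mu0 + N * xbar) / (lam0 + N)
    lam_N = (lam0 + N) * E_tau
    E_mu, E_mu2 = mu_N, mu_N**2 + 1.0 / lam_N
    # Update q(tau) = Gamma(a_N, b_N), holding q(mu) fixed
    a_N = a0 + 0.5 * (N + 1)
    b_N = b0 + 0.5 * (np.sum(x**2) - 2 * E_mu * np.sum(x) + N * E_mu2
                      + lam0 * (E_mu2 - 2 * E_mu * mu0 + mu0**2))
    E_tau = a_N / b_N
```

Each update is the exponentiated expected log-joint with respect to the other factor, exactly the q(u_i) formula above; the imaging problem applies the same alternation to q1(χ), q2(w), q3(z), q4(Θ).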

14 Measurement configuration: application to breast cancer detection
[Figure: source S and receivers located 7.5 cm from the center of the test domain D (12.2 cm wide, N_x × N_y pixels), which contains the breast (D3, 10 cm) covered by the skin (D2) and enclosing a tumor (D4, 2 cm); D1 is the surrounding medium]
Breast model built up from an MRI scan [Zastrow and Hagness, 2008]
64 sources, 64 receivers, 6 frequencies in the band GHz
Dielectric parameters (ɛ_r, σ (S/m)):
  D1: (ɛ1, σ1) = (10, 0.5)
  D2: (ɛ2, σ2) = (6.12, 0.11)
  D3: (ɛ3, σ3) = ([2.46, 60.6], [0.01, 2.28])
  D4: (ɛ4, σ4) = (55.3, 1.57)

15 Reconstruction
[Reconstruction images]
CSI: Contrast Source Inversion; VBA: Variational Bayesian Approach; MGI: independent Gaussian mixture; MGM: Gauss-Markov mixture

16 Reconstruction
[Images: original object, VBA-MGI and VBA-MGM reconstructions, and the hidden field]
3029 pixels have undergone a change of class during the iterative process.
[Plots: evolution of Re(m_k) and Im(m_k) over the iterations, for classes 1 to 4]

17 Conclusions
- Bayesian approach to microwave imaging: increased complexity of breast imaging compared to previous applications [Ayasso, 2010]
- Better resolution with VBA than with CSI
- Better identification of heterogeneous areas with the MGM prior than with the MGI prior
- Risk of hiding small details of interest
Prospects
- Accelerate convergence by introducing a gradient step
- Account for the frequency dispersivity of breast tissues by means of Debye models
- Test on controlled laboratory experimental data involving breast phantoms
- Study the false alarm rate of the method (requires a reference database)

18 References
Ayasso, H. (2010). PhD thesis, Université Paris-Sud 11.
Ayasso, H., Duchêne, B., and Mohammad-Djafari, A. (2012). Inverse Problems in Science and Engineering, 20(1).
Féron, O. (2006). PhD thesis, Université Paris-Sud 11.
Gharsalli, L., Ayasso, H., Duchêne, B., and Mohammad-Djafari, A. (2014). Inverse scattering in a Bayesian framework: application to microwave imaging for breast cancer detection. Inverse Problems, 30(11).
Harrington, R. F. (1993). Field Computation by Moment Methods. Wiley-IEEE Press.
Mackay, D. J. C. (2003). Information Theory, Inference, and Learning Algorithms. Cambridge University Press.
Zastrow, E., Davis, S. K., Lazebnik, M., Kelcz, F., Van Veen, B. D., and Hagness, S. C. (2008). Database of 3D grid-based numerical breast phantoms for use in computational electromagnetics simulations.

19 Thanks! Questions?


More information

Lecture 7 and 8: Markov Chain Monte Carlo

Lecture 7 and 8: Markov Chain Monte Carlo Lecture 7 and 8: Markov Chain Monte Carlo 4F13: Machine Learning Zoubin Ghahramani and Carl Edward Rasmussen Department of Engineering University of Cambridge http://mlg.eng.cam.ac.uk/teaching/4f13/ Ghahramani

More information

Chapter 3: Maximum-Likelihood & Bayesian Parameter Estimation (part 1)

Chapter 3: Maximum-Likelihood & Bayesian Parameter Estimation (part 1) HW 1 due today Parameter Estimation Biometrics CSE 190 Lecture 7 Today s lecture was on the blackboard. These slides are an alternative presentation of the material. CSE190, Winter10 CSE190, Winter10 Chapter

More information

Probabilistic Graphical Models Lecture 17: Markov chain Monte Carlo

Probabilistic Graphical Models Lecture 17: Markov chain Monte Carlo Probabilistic Graphical Models Lecture 17: Markov chain Monte Carlo Andrew Gordon Wilson www.cs.cmu.edu/~andrewgw Carnegie Mellon University March 18, 2015 1 / 45 Resources and Attribution Image credits,

More information

Bayesian Knowledge Fusion in Prognostics and Health Management A Case Study

Bayesian Knowledge Fusion in Prognostics and Health Management A Case Study Bayesian Knowledge Fusion in Prognostics and Health Management A Case Study Masoud Rabiei Mohammad Modarres Center for Risk and Reliability University of Maryland-College Park Ali Moahmamd-Djafari Laboratoire

More information

Outline. 1 Introduction. 2 Problem and solution. 3 Bayesian tracking model of group debris. 4 Simulation results. 5 Conclusions

Outline. 1 Introduction. 2 Problem and solution. 3 Bayesian tracking model of group debris. 4 Simulation results. 5 Conclusions Outline 1 Introduction 2 Problem and solution 3 Bayesian tracking model of group debris 4 Simulation results 5 Conclusions Problem The limited capability of the radar can t always satisfy the detection

More information

Lecture 6: Graphical Models: Learning

Lecture 6: Graphical Models: Learning Lecture 6: Graphical Models: Learning 4F13: Machine Learning Zoubin Ghahramani and Carl Edward Rasmussen Department of Engineering, University of Cambridge February 3rd, 2010 Ghahramani & Rasmussen (CUED)

More information

BAYESIAN METHODS FOR VARIABLE SELECTION WITH APPLICATIONS TO HIGH-DIMENSIONAL DATA

BAYESIAN METHODS FOR VARIABLE SELECTION WITH APPLICATIONS TO HIGH-DIMENSIONAL DATA BAYESIAN METHODS FOR VARIABLE SELECTION WITH APPLICATIONS TO HIGH-DIMENSIONAL DATA Intro: Course Outline and Brief Intro to Marina Vannucci Rice University, USA PASI-CIMAT 04/28-30/2010 Marina Vannucci

More information

Hybrid Dirichlet processes for functional data

Hybrid Dirichlet processes for functional data Hybrid Dirichlet processes for functional data Sonia Petrone Università Bocconi, Milano Joint work with Michele Guindani - U.T. MD Anderson Cancer Center, Houston and Alan Gelfand - Duke University, USA

More information

Hmms with variable dimension structures and extensions

Hmms with variable dimension structures and extensions Hmm days/enst/january 21, 2002 1 Hmms with variable dimension structures and extensions Christian P. Robert Université Paris Dauphine www.ceremade.dauphine.fr/ xian Hmm days/enst/january 21, 2002 2 1 Estimating

More information

Latent Variable Models

Latent Variable Models Latent Variable Models Stefano Ermon, Aditya Grover Stanford University Lecture 5 Stefano Ermon, Aditya Grover (AI Lab) Deep Generative Models Lecture 5 1 / 31 Recap of last lecture 1 Autoregressive models:

More information

Longitudinal breast density as a marker of breast cancer risk

Longitudinal breast density as a marker of breast cancer risk Longitudinal breast density as a marker of breast cancer risk C. Armero (1), M. Rué (2), A. Forte (1), C. Forné (2), H. Perpiñán (1), M. Baré (3), and G. Gómez (4) (1) BIOstatnet and Universitat de València,

More information

Bayesian Learning. HT2015: SC4 Statistical Data Mining and Machine Learning. Maximum Likelihood Principle. The Bayesian Learning Framework

Bayesian Learning. HT2015: SC4 Statistical Data Mining and Machine Learning. Maximum Likelihood Principle. The Bayesian Learning Framework HT5: SC4 Statistical Data Mining and Machine Learning Dino Sejdinovic Department of Statistics Oxford http://www.stats.ox.ac.uk/~sejdinov/sdmml.html Maximum Likelihood Principle A generative model for

More information

Outline lecture 2 2(30)

Outline lecture 2 2(30) Outline lecture 2 2(3), Lecture 2 Linear Regression it is our firm belief that an understanding of linear models is essential for understanding nonlinear ones Thomas Schön Division of Automatic Control

More information

Quantitative Microwave Imaging Based on a Huber regularization

Quantitative Microwave Imaging Based on a Huber regularization Quantitative Microwave Imaging Based on a Huber regularization Funing Bai, Wilfried Philips and Aleksandra Pižurica Department of Telecommunications and Information Processing (IPI-TELIN-iMinds) Ghent

More information

REDUCTION OF ELETROMAGNETIC PERTURBATIONS BY OPTIMIZING THE PRINTED CIRCUIT BOARD

REDUCTION OF ELETROMAGNETIC PERTURBATIONS BY OPTIMIZING THE PRINTED CIRCUIT BOARD REDUCTION OF ELETROMAGNETIC PERTURBATIONS BY OPTIMIZING THE PRINTED CIRCUIT BOARD J Taki, M Bensetti, D Sadarnac To cite this version: J Taki, M Bensetti, D Sadarnac. REDUCTION OF ELETROMAGNETIC PERTURBATIONS

More information

Expectation Propagation Algorithm

Expectation Propagation Algorithm Expectation Propagation Algorithm 1 Shuang Wang School of Electrical and Computer Engineering University of Oklahoma, Tulsa, OK, 74135 Email: {shuangwang}@ou.edu This note contains three parts. First,

More information

Statistical Approaches to Learning and Discovery. Week 4: Decision Theory and Risk Minimization. February 3, 2003

Statistical Approaches to Learning and Discovery. Week 4: Decision Theory and Risk Minimization. February 3, 2003 Statistical Approaches to Learning and Discovery Week 4: Decision Theory and Risk Minimization February 3, 2003 Recall From Last Time Bayesian expected loss is ρ(π, a) = E π [L(θ, a)] = L(θ, a) df π (θ)

More information

Chapter 4 Dynamic Bayesian Networks Fall Jin Gu, Michael Zhang

Chapter 4 Dynamic Bayesian Networks Fall Jin Gu, Michael Zhang Chapter 4 Dynamic Bayesian Networks 2016 Fall Jin Gu, Michael Zhang Reviews: BN Representation Basic steps for BN representations Define variables Define the preliminary relations between variables Check

More information