Bayesian X-ray Computed Tomography using a Three-level Hierarchical Prior Model
Slide 1/26: Bayesian X-ray Computed Tomography using a Three-level Hierarchical Prior Model
L. Wang, A. Mohammad-Djafari, N. Gac, MaxEnt 16, Ghent, Belgium.
Li Wang, Ali Mohammad-Djafari, Nicolas Gac
Laboratoire des Signaux et Systèmes (L2S), UMR 8506 CNRS-CentraleSupélec-Univ. Paris-Sud, Gif-sur-Yvette, France
li.wang@lss.supelec.fr, djafari@lss.supelec.fr, nicolas.gac@lss.supelec.fr
Slide 2/26: General presentation
- Context: X-ray Computed Tomography (CT)
- Classical methods: analytic (FBP); algebraic and iterative (LS, QR, L1, regularization, TV, etc.)
- Proposed Bayesian method: a sparsity-enforcing, unsupervised three-level hierarchical model [1]
- Simulation performances
- Conclusion and perspectives
[1] L. Wang, A. Mohammad-Djafari, N. Gac, M. Dumitru, "Computed tomography reconstruction based on a hierarchical model and variational Bayesian method," ICASSP 16.
Slide 3/26: Context: X-ray Computed Tomography (CT) (I)
Low dose:
- reduce the intensity of the rays;
- reduce the number of projections.
Figure: X-ray CT scanner.
Slide 4/26: Context: X-ray Computed Tomography (CT) (II)
Non Destructive Testing:
- limited number of projections;
- limited angles of projection.
Figure: X-ray CT scanner.
Slide 5/26: Context: Radon Transformation
Figure: parallel-beam tomography projection: a ray l(r, φ) with incident intensity I_0 crosses the object f(x, y) and yields the projection g(r, φ).
Continuous model: g(r, φ) = ∫_{l(r,φ)} f(x, y) dl
Discretized, without noise: g[i] = Σ_j H[i, j] f[j]
Accounting for errors: g[i] = Σ_j H[i, j] f[j] + ε[i]
Notations:
- f = {f_1, f_2, ..., f_N}: the quantified object (pixel values);
- ε = {ε_1, ε_2, ..., ε_M}: additive noise;
- H = (h_mn), 1 ≤ m ≤ M, 1 ≤ n ≤ N: the projection operator;
- g = {g_1, g_2, ..., g_M}: the measured projection data.
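The discretized model above is just a matrix-vector product plus noise. A minimal pure-Python sketch with a hypothetical hand-made 2 x 4 system (the intersection lengths and noise values are illustrative, not the talk's actual projector):

```python
# Toy discretized forward model g = H f + eps: H[i][j] is the
# contribution of pixel j to ray i (hand-picked illustrative values).
def matvec(H, f):
    """Multiply an M x N matrix (list of rows) by a length-N vector."""
    return [sum(h_ij * f_j for h_ij, f_j in zip(row, f)) for row in H]

# 4-pixel object (flattened 2x2 image) and 2 hypothetical rays:
# ray 0 crosses pixels 0 and 1, ray 1 crosses pixels 1 and 3.
f = [1.0, 2.0, 0.0, 3.0]
H = [[1.0, 1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0, 1.0]]

eps = [0.1, -0.1]                       # additive measurement noise
g = [gi + ei for gi, ei in zip(matvec(H, f), eps)]
print(g)                                # noisy projections [3.1, 4.9]
```

In a real scanner M and N are both huge, so H is applied on the fly rather than stored.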
Slide 6/26: Operators and back-projections
g = H f + ε   (1)
- Forward operator: g = H f
- Adjoint operator: f̂ = H^T g
- Back-projection: f̂ = H^T g
- Filtered back-projection: f̂ = H^T (H H^T)^{-1} g
H is of huge dimension and, in general, not directly accessible as a stored matrix.
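Back-projection smears each measurement back along its ray; without the filtering step it blurs rather than inverts. A minimal sketch, reusing the same kind of tiny illustrative H as above (all values hypothetical):

```python
# Unfiltered back-projection f_bp = H^T g, computed without forming
# H^T explicitly: each measurement g[i] is distributed back over the
# pixels that ray i crossed, weighted by H[i][j].
def transpose_matvec(H, g):
    n = len(H[0])
    out = [0.0] * n
    for row, gi in zip(H, g):
        for j, h_ij in enumerate(row):
            out[j] += h_ij * gi
    return out

H = [[1.0, 1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0, 1.0]]
g = [3.0, 5.0]
f_bp = transpose_matvec(H, g)
print(f_bp)   # [3.0, 8.0, 0.0, 5.0]: smeared, not the original object
```

This is why FBP applies the filter (H H^T)^{-1} (in practice a ramp filter on the projections) before back-projecting.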
Slide 7/26: An example of the direct model
Figure: an example of projection. Original image f; projection g (64 x 128); back-projection; filtered back-projections f_FBP computed with 64 angles in [0, π], 32 angles in [0, π], and 64 angles in [0, π/2].
Slide 8/26: Outline
- Classical methods
- Proposed method: the Hierarchical Haar Transformation Based Method (HHBM)
- Conclusion and Perspectives
Slide 9/26: Context: Classical methods
f̂ = arg min_f ||g - H f||^2 + λ R(f)
- Least squares: R(f) = 0
- l2, quadratic regularization (QR): R(f) = Σ_j |f_j|^2 = ||f||^2
- l2, general QR: R(f) = Σ_j |[D f]_j|^2 = ||D f||^2
- l1, LASSO: R(f) = Σ_j |f_j| = ||f||_1
- Total Variation (1D): R(f) = ||D f||_1
- Total Variation (2D): R(f) = ||D_x f||_1 + ||D_y f||_1
Main difficulties:
- How to choose λ? (cross validation, L-curve)
- How to quantify uncertainties?
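For the quadratic-regularization case the criterion is smooth, so plain gradient descent already works. A minimal pure-Python sketch on a toy 2 x 2 system (H, g, λ and the step size are all illustrative assumptions, not values from the talk):

```python
# Gradient descent on J(f) = ||g - H f||^2 + lam * ||f||^2.
def grad_step(H, g, f, lam, step):
    # residual r = H f - g; gradient = 2 H^T r + 2 lam f
    r = [sum(h * fj for h, fj in zip(row, f)) - gi
         for row, gi in zip(H, g)]
    grad = [2.0 * lam * fj for fj in f]
    for row, ri in zip(H, r):
        for j, h in enumerate(row):
            grad[j] += 2.0 * h * ri
    return [fj - step * gj for fj, gj in zip(f, grad)]

H = [[1.0, 0.5], [0.0, 1.0]]
g = [2.0, 1.0]
f = [0.0, 0.0]
for _ in range(500):                   # hand-picked step, small enough
    f = grad_step(H, g, f, lam=0.01, step=0.1)
print(f)   # close to (H^T H + lam I)^{-1} H^T g, i.e. about [1.49, 1.00]
```

The slide's two difficulties show up even here: the answer depends on λ, and a point estimate carries no uncertainty, which is what motivates the Bayesian treatment on the next slide.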
Slide 10/26: Bayesian inference
Supervised method:
p(f | g) = p(g | f) p(f) / p(g)   (2)
Unsupervised method:
p(f, θ | g) = p(g | f, θ) p(f | θ) p(θ) / p(g)   (3)
where θ denotes the hyper-parameters.
Main steps:
- choose the prior models p(f | θ) and p(θ);
- computation: JMAP, VBA, ...
Slide 11/26: Outline
- Classical methods
- Proposed method: the Hierarchical Haar Transformation Based Method (HHBM)
- Conclusion and Perspectives
Slide 12/26: Sparsity in a transform domain
Sparsity:
- specific case: f is sparse in the volume domain;
- general case: D f is sparse in a transformation domain (D: Fourier, wavelet, dictionary, etc.).
Sparsity-enforcing distributions:
- generalized Gaussian distributions (Laplace, double exponential, etc.);
- Gaussian mixture distributions;
- heavy-tailed distributions (Student-t, Cauchy, etc.).
Slide 13/26: Multilevel Haar Transformation
Figure: (a) original image; (b) level-1 Haar transformation; (c) level-2 Haar transformation.
Model: f = D z + ξ, where z and ξ are sparse.
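The sparsity of z comes from the structure of the Haar transform itself: on (nearly) piecewise-constant images, all detail coefficients vanish. A minimal sketch of one level on a single 2 x 2 block (orthonormal normalization assumed):

```python
# One level of the 2D Haar transform on a 2x2 block: one approximation
# coefficient plus three detail coefficients (horizontal, vertical,
# diagonal). The 1/2 factor makes the transform orthonormal.
def haar2x2(a, b, c, d):
    """(a b / c d) -> (approx, horiz detail, vert detail, diag detail)."""
    return ((a + b + c + d) / 2.0,
            (a - b + c - d) / 2.0,
            (a + b - c - d) / 2.0,
            (a - b - c + d) / 2.0)

# A flat block: all three detail coefficients vanish (perfect sparsity).
print(haar2x2(3.0, 3.0, 3.0, 3.0))   # (6.0, 0.0, 0.0, 0.0)
```

A full level-1 transform applies this to every 2 x 2 block; level 2 recurses on the approximation coefficients, as in panels (b) and (c) of the figure.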
Slide 14/26: (Generalized) Student-t and Sparsity
Figure: pdfs of the Normal, Student-t and generalized Student-t distributions.
St(f | ν) = ∫_0^∞ N(f | 0, z) IG(z | ν/2, ν/2) dz
          = Γ((ν+1)/2) / (√(νπ) Γ(ν/2)) · (1 + f²/ν)^{-(ν+1)/2}
St_g(f | α, β) = ∫_0^∞ N(f | 0, v) IG(v | α, β) dv
               = Γ(α + 1/2) / (√(2βπ) Γ(α)) · (1 + f²/(2β))^{-(α+1/2)}
[1] M. Dumitru, L. Wang, A. Mohammad-Djafari, "A generalization of Student-t based on Infinite Gaussian Scaled Mixture model and its use as a sparsity enforcing prior in Bayesian signal processing."
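The scale-mixture identity above can be checked numerically: integrating the Gaussian against the inverse-gamma mixing density reproduces the closed-form Student-t pdf. A small sketch using a simple Riemann sum (grid size and ν = 3 are illustrative choices):

```python
import math

def student_t_pdf(f, nu):
    """Closed-form Student-t density St(f | nu)."""
    return (math.gamma((nu + 1) / 2.0)
            / (math.sqrt(nu * math.pi) * math.gamma(nu / 2.0))
            * (1.0 + f * f / nu) ** (-(nu + 1) / 2.0))

def mixture_pdf(f, nu, n=200000, vmax=200.0):
    """Integrate N(f | 0, v) IG(v | nu/2, nu/2) dv on a fine grid."""
    a = b = nu / 2.0
    # N(f|0,v) * IG(v|a,b) = const * v^(-a-3/2) * exp(-(b + f^2/2)/v)
    const = b ** a / math.gamma(a) / math.sqrt(2.0 * math.pi)
    dv = vmax / n
    total = 0.0
    for i in range(1, n + 1):
        v = i * dv
        total += const * v ** (-a - 1.5) * math.exp(-(b + f * f / 2.0) / v) * dv
    return total

print(student_t_pdf(1.0, 3.0), mixture_pdf(1.0, 3.0))  # nearly equal
```

The same check works for St_g with a general IG(v | α, β) mixing density; heavier tails (small ν or α) are what enforce sparsity on z.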
Slide 15/26: Proposed three-level hierarchical model
Graphical model: f → H → g, with additive noise ε.
Direct model: g = H f + ε
Slide 16/26: Proposed three-level hierarchical model
Graphical model: z → D → f → H → g, with errors ξ and ε.
Direct model: g = H f + ε, f = D z + ξ
- D: Haar transformation operator;
- z: Haar transformation coefficients of f.
Slide 17/26: Proposed three-level hierarchical model
Graphical model: (α_z0, β_z0) → v_z → z → D → f → H → g, with errors ξ and ε.
Direct model: g = H f + ε, f = D z + ξ
- D: Haar transformation operator;
- z: Haar transformation coefficients of f.
Slide 18/26: Proposed three-level hierarchical model
Graphical model: the hyper-parameters (α_ε0, β_ε0), (α_ξ0, β_ξ0) and (α_z0, β_z0) govern the variances v_ε, v_ξ and v_z of ε, ξ and z, respectively. (Details: Appendix I.)
Direct model: g = H f + ε, f = D z + ξ
- D: Haar transformation operator;
- z: Haar transformation coefficients of f.
Likelihood: p(g | f, v_ε).
Priors: p(f | z, v_ξ), p(z | v_z), p(v_ε), p(v_ξ), p(v_z).
Posterior: p(f, z, v_ε, v_ξ, v_z | g).
Slide 19/26: Estimation Algorithms
Bayesian inference:
p(f, z, v_ε, v_ξ, v_z | g) ∝ p(g | f, v_ε) · p(f | z, v_ξ) p(z | v_z) p(v_ε) p(v_ξ) p(v_z)
(the first factor is the likelihood, the rest are the priors).
Joint Maximum A Posteriori (JMAP):
(f̂, ẑ, v̂_ε, v̂_ξ, v̂_z) = arg max_{f, z, v_ε, v_ξ, v_z} p(f, z, v_ε, v_ξ, v_z | g)
Posterior Mean (PM) via the Variational Bayesian Approach (VBA):
p(f, z, v_ε, v_ξ, v_z | g) ≈ q(f, z, v_ε, v_ξ, v_z) = q_1(f) q_2(z) q_3(v_ε) q_4(v_ξ) q_5(v_z),
obtained by minimizing the Kullback-Leibler divergence
KL(q(θ) : p(θ | g)) = ∫ q(θ) ln [q(θ) / p(θ | g)] dθ.
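JMAP is typically computed by alternating maximization: optimize each variable with the others fixed and iterate. The following is only a toy sketch of that idea on a hypothetical scalar model g = f + ε with f ~ N(0, v) and v ~ IG(α, β) (not the talk's full model; all parameter values are illustrative):

```python
# Alternating JMAP for a scalar toy model:
#   g = f + eps, eps ~ N(0, v_eps), f ~ N(0, v), v ~ IG(alpha, beta).
# With v fixed, the maximizer over f is the Gaussian posterior mode;
# with f fixed, p(v | f) is IG(alpha + 1/2, beta + f^2/2), whose mode
# is (beta + f^2/2) / (alpha + 3/2).
def jmap(g, v_eps=0.1, alpha=2.0, beta=1.0, iters=50):
    f, v = 0.0, 1.0                       # arbitrary initialization
    for _ in range(iters):
        f = v / (v + v_eps) * g           # argmax over f, v fixed
        v = (beta + 0.5 * f * f) / (alpha + 1.5)  # argmax over v, f fixed
    return f, v

f_hat, v_hat = jmap(2.0)
print(f_hat, v_hat)    # converges to a joint fixed point
```

VBA has the same alternating flavor, but each step updates a whole distribution q_i (keeping its mean and variance) instead of a point estimate, which is what makes uncertainty quantification possible.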
Slide 20/26: Results (I)
Original images and reconstructed results using 128 projections, SNR = ... dB.
Relative error: δ_f = ||f - f̂||_2 / ||f||_2
Original image; FBP (δ_f = 0.32); QR (δ_f = 0.056); LASSO (δ_f = ...); TV (δ_f = ...); HHBM (δ_f = 0.038).
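The figure of merit δ_f is just the relative l2 reconstruction error. A minimal sketch (the vectors are illustrative, not the Shepp-Logan data):

```python
import math

# Relative reconstruction error: delta_f = ||f - f_hat||_2 / ||f||_2.
def delta_f(f, f_hat):
    num = math.sqrt(sum((a - b) ** 2 for a, b in zip(f, f_hat)))
    den = math.sqrt(sum(a ** 2 for a in f))
    return num / den

# One wrong pixel out of three: error norm 1 over object norm 3.
print(delta_f([1.0, 2.0, 2.0], [1.0, 2.0, 1.0]))  # 0.333...
```

Smaller is better: on the slide, HHBM's 0.038 beats FBP's 0.32 by roughly an order of magnitude.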
Slide 21/26: Results (II)
Figure: zoomed zone of the profiles of the original and reconstructed Shepp-Logan images (Original, FBP, QR, LASSO, L1-Haar, TV, HHBM), obtained with 128 projections and SNR = ... dB.
Slide 22/26: Results (III)
Figure: relative error δ_f versus iteration for FBP, QR, LASSO, TV and HHBM. Left: 128 projections, SNR = ... dB; right: 64 projections, SNR = ... dB.
Slide 23/26: Outline
- Classical methods
- Proposed method: the Hierarchical Haar Transformation Based Method (HHBM)
- Conclusion and Perspectives
Slide 24/26: Conclusion and Perspectives
Conclusions:
- the sparsity of the Haar transformation coefficients is exploited;
- joint MAP estimation of the unknowns;
- the method is realizable in the 3D case.
Perspectives:
- reconstruct the object and its contours simultaneously;
- use our methods with the ASTRA toolbox to solve large-scale data problems;
- find ways to decrease the cost of the iterative methods.
Slide 25/26: Thank you very much!
Slide 26/26: Appendix (I)
The three-level hierarchical Haar-transformation-based model:
p(g | f, v_ε) = N(g | H f, V_ε) ∝ |V_ε|^{-1/2} exp[-1/2 (g - H f)^T V_ε^{-1} (g - H f)]
p(f | z, v_ξ) = N(f | D z, V_ξ) ∝ |V_ξ|^{-1/2} exp[-1/2 (f - D z)^T V_ξ^{-1} (f - D z)]
p(z | v_z) = N(z | 0, V_z) ∝ |V_z|^{-1/2} exp[-1/2 z^T V_z^{-1} z]
p(v_z | α_z0, β_z0) = Π_{j=1}^N IG(v_zj | α_z0, β_z0) ∝ Π_{j=1}^N v_zj^{-(α_z0+1)} exp[-β_z0 v_zj^{-1}]
p(v_ε | α_ε0, β_ε0) = Π_{i=1}^M IG(v_εi | α_ε0, β_ε0) ∝ Π_{i=1}^M v_εi^{-(α_ε0+1)} exp[-β_ε0 v_εi^{-1}]
p(v_ξ | α_ξ0, β_ξ0) = Π_{j=1}^N IG(v_ξj | α_ξ0, β_ξ0) ∝ Π_{j=1}^N v_ξj^{-(α_ξ0+1)} exp[-β_ξ0 v_ξj^{-1}]
(V_ε, V_ξ, V_z are the diagonal covariance matrices built from v_ε, v_ξ, v_z.)
More informationApproximating mixture distributions using finite numbers of components
Approximating mixture distributions using finite numbers of components Christian Röver and Tim Friede Department of Medical Statistics University Medical Center Göttingen March 17, 2016 This project has
More informationData assimilation with and without a model
Data assimilation with and without a model Tyrus Berry George Mason University NJIT Feb. 28, 2017 Postdoc supported by NSF This work is in collaboration with: Tim Sauer, GMU Franz Hamilton, Postdoc, NCSU
More informationBayesian Multi-Task Compressive Sensing with Dirichlet Process Priors
1 Bayesian Multi-Task Compressive Sensing with Dirichlet Process Priors 1 Yuting Qi, 1 Dehong Liu, David Dunson and 1 Lawrence Carin 1 Department of Electrical and Computer Engineering Department of Statistical
More informationThe Expectation Maximization or EM algorithm
The Expectation Maximization or EM algorithm Carl Edward Rasmussen November 15th, 2017 Carl Edward Rasmussen The EM algorithm November 15th, 2017 1 / 11 Contents notation, objective the lower bound functional,
More informationBayesian Regularization
Bayesian Regularization Aad van der Vaart Vrije Universiteit Amsterdam International Congress of Mathematicians Hyderabad, August 2010 Contents Introduction Abstract result Gaussian process priors Co-authors
More informationBayesian estimation of the discrepancy with misspecified parametric models
Bayesian estimation of the discrepancy with misspecified parametric models Pierpaolo De Blasi University of Torino & Collegio Carlo Alberto Bayesian Nonparametrics workshop ICERM, 17-21 September 2012
More informationBayesian methods in economics and finance
1/26 Bayesian methods in economics and finance Linear regression: Bayesian model selection and sparsity priors Linear Regression 2/26 Linear regression Model for relationship between (several) independent
More informationIntroduction to Probability and Statistics (Continued)
Introduction to Probability and Statistics (Continued) Prof. icholas Zabaras Center for Informatics and Computational Science https://cics.nd.edu/ University of otre Dame otre Dame, Indiana, USA Email:
More information