Dimension-Independent Likelihood-Informed (DILI) MCMC


1 Dimension-Independent Likelihood-Informed (DILI) MCMC
Tiangang Cui (1), Kody Law (2), Youssef Marzouk (1)
(1) Massachusetts Institute of Technology, (2) Oak Ridge National Laboratory
August 2015
TC, KL, YM — DILI MCMC — USC UQ summer school — 1 / 25

2 Inverse Problems
Data ↔ parameter: y = F(u) + e, where F is the forward model (a PDE) and e collects observation/model errors;
y ∈ R^{N_y}, u ∈ H, F : H → R^{N_y}.
Data y are limited in number, noisy, and indirect.
Parameter u is often a function, discretized on some mesh; continuous, bounded, and 1st-order differentiable.

3 Infinite-Dimensional Bayesian Inference
Assume Gaussian observation noise, e ∼ N(0, Γ_obs).
Data-misfit function: Φ(u; y) = (1/2) ‖y − F(u)‖²_{Γ_obs}.
Likelihood function: L(y|u) ∝ exp(−Φ(u; y)).
Posterior measure: dµ^y/dµ_0 (u) ∝ L(y|u), with prior µ_0 = N(m_0, Γ_pr), where Γ_pr is a trace-class operator, so µ_0(H) = 1.
Goal: sample the posterior µ^y using the intrinsic low-dimensional structure of inverse problems.

4 MCMC Sampling
[figure: autocorrelations of different samplers versus parameter dimension, for random walk and MALA]
Mixing cost grows with dimension: random walk O(N_u); MALA O(N_u^{1/3}).
Standard MCMC is not dimension-independent. Look at the infinite-dimensional limit!

5 MCMC Sampling: Metropolis–Hastings
Given a proposal q(u, du′):
transition measure ν(du, du′) = µ^y(du) q(u, du′), and its transpose ν^T(du, du′) = µ^y(du′) q(u′, du);
acceptance probability α(u, u′) = min{1, dν^T/dν (u, u′)}.
A well-defined MCMC for functions requires ν^T ≪ ν. Many MCMC methods defined in the finite-dimensional setting have ν^T ⊥ ν.
The preconditioned Crank–Nicolson (pCN) proposal u′ = (1 − β²)^{1/2} u + β ξ, ξ ∼ N(0, Γ_pr), satisfies ν^T ≪ ν (Beskos et al. 2008, Stuart 2010, Cotter et al. 2013).
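A key practical consequence of the pCN construction is that the prior terms cancel in the Metropolis–Hastings ratio, leaving only the data-misfit difference. A minimal finite-dimensional sketch (the toy problem below and all parameter values are illustrative assumptions, not from the slides):

```python
import numpy as np

def pcn_sample(log_likelihood, L_pr, u0, beta, n_steps, seed=0):
    """pCN MCMC: u' = sqrt(1 - beta^2) u + beta xi, xi ~ N(0, Gamma_pr),
    with Gamma_pr = L_pr @ L_pr.T.  The prior terms cancel in the
    Metropolis-Hastings ratio, so we accept with probability
    min(1, exp(Phi(u) - Phi(u'))), where Phi = -log_likelihood."""
    rng = np.random.default_rng(seed)
    u = np.array(u0, dtype=float)
    phi = -log_likelihood(u)
    samples = np.empty((n_steps, u.size))
    for k in range(n_steps):
        xi = L_pr @ rng.standard_normal(u.size)      # draw from N(0, Gamma_pr)
        u_prop = np.sqrt(1.0 - beta**2) * u + beta * xi
        phi_prop = -log_likelihood(u_prop)
        if np.log(rng.uniform()) < phi - phi_prop:   # misfit-only accept/reject
            u, phi = u_prop, phi_prop
        samples[k] = u
    return samples

# Toy linear example: y = u1 + e, e ~ N(0, sigma^2), prior N(0, I).
sigma, y = 0.5, 1.0
loglik = lambda u: -0.5 * (y - u[0])**2 / sigma**2
chain = pcn_sample(loglik, np.eye(2), np.zeros(2), beta=0.3, n_steps=20000)
# exact posterior mean of u1 is y / (1 + sigma^2) = 0.8
```

Note that `log_likelihood` is the only problem-specific ingredient; the proposal itself never touches the data, which is exactly why its acceptance rate is dimension-independent.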

7 MCMC Sampling
[figure: autocorrelations of different samplers versus parameter dimension, for random walk, MALA, and Crank–Nicolson]
Mixing cost: random walk O(N_u); MALA O(N_u^{1/3}); pCN O(1).

8 Likelihood Information
The pCN proposal u′ = (1 − β²)^{1/2} u + β ξ, ξ ∼ N(0, Γ_pr), is isotropic w.r.t. the prior, Γ_pr.
The likelihood constrains the variability of the posterior in some directions. What will happen to pCN?
Consider the linear example (Law 2014): y = u₁ + e, e ∼ N(0, σ²), u = (u₁, u₂) ∼ N(0, I).
[figure: prior and posterior contours in (u₁, u₂)]

9 Likelihood Information
[figures: prior/posterior contours with CN samples; trace of u₁ over MCMC iterations]
For the pCN/CN proposal, the summed sample autocorrelation Σ_n corr(u₁^{(0)}, u₁^{(n)}) ≳ const/σ.
Problem: µ^y is anisotropic w.r.t. µ_0.

10 Likelihood Information
[figures: prior/posterior contours with CN samples; trace of u₁ over MCMC iterations]
To adapt to this anisotropy, consider an alternative likelihood-informed proposal with direction-dependent scales,
(u₁′, u₂′) = (a₁ u₁ + b₁ ξ₁, a₂ u₂ + b₂ ξ₂), ξ ∼ N(0, I),
likelihood-informed in u₁, and prior-informed in u₂.
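One concrete way to realize such a proposal in this 2D example is a Crank–Nicolson step that is reversible with respect to a Gaussian reference whose scale matches the posterior in u₁ and the prior in u₂; the acceptance ratio then involves only the mismatch between posterior and reference. The closed-form reference moments, step size, and σ below are illustrative assumptions (in DILI the scales come from the Hessian, not from closed form):

```python
import numpy as np

# 2D linear example from the slides: y = u1 + e, e ~ N(0, sigma^2), prior N(0, I).
sigma, y = 0.1, 1.0

def log_post(u):                              # unnormalized log posterior
    return -0.5 * (y - u[0])**2 / sigma**2 - 0.5 * (u @ u)

# Likelihood-informed Gaussian reference N(m_ref, diag(s^2)): posterior
# scale in u1 (data-dominated), prior scale in u2 (prior-dominated).
m_ref = np.array([y / (1 + sigma**2), 0.0])
s = np.array([np.sqrt(sigma**2 / (1 + sigma**2)), 1.0])

def log_ref(u):                               # unnormalized log reference density
    return -0.5 * np.sum(((u - m_ref) / s)**2)

def li_chain(n_steps, beta=0.5, seed=1):
    """Crank-Nicolson proposal reversible w.r.t. the reference measure,
    so the acceptance ratio only involves log_post - log_ref."""
    rng = np.random.default_rng(seed)
    u = m_ref.copy()
    out = np.empty((n_steps, 2))
    for k in range(n_steps):
        xi = s * rng.standard_normal(2)       # reference-scaled noise
        u_prop = m_ref + np.sqrt(1 - beta**2) * (u - m_ref) + beta * xi
        log_a = log_post(u_prop) - log_ref(u_prop) - log_post(u) + log_ref(u)
        if np.log(rng.uniform()) < log_a:
            u = u_prop
        out[k] = u
    return out

chain = li_chain(20000)
```

Because the reference matches the posterior scales here, the chain mixes equally well in both directions even though σ is small, which is precisely the failure mode of plain CN shown on the slide.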

11 Likelihood Information
[figures: prior/posterior contours; traces of u₁ over MCMC iterations for CN and LI; CN and LI samples plotted against the prior and posterior]

12 Likelihood Information
[figures: prior/posterior contours; traces of u₁ over MCMC iterations for LI and CN]
Messages:
The performance of pCN is limited by the data-dominated directions.
We want our proposals to adapt to the likelihood information.
In function space this leads to operator-weighted proposals.

13 Likelihood Information
How does data information impact the parameters?
1. Limited information carried in the data (e.g., sensor quality, amount of data, ...).
2. The forward model filters the parameters (ill-posedness).
3. Smoothing property of the prior (e.g., correlation structure).
We first look at a linear example: y = F u + e, e ∼ N(0, Γ_obs), µ_0 = N(0, Γ_pr).
This leads to a Gaussian posterior N(m_y, Γ_pos).

14 Data Information
Posterior covariance: Γ_pos = (Γ_pr^{-1} + H)^{-1}, where H = F* Γ_obs^{-1} F is the data-misfit Hessian.
Woodbury identity: Γ_pos = Γ_pr − Γ_pr F* Γ_y^{-1} F Γ_pr, where Γ_y = F Γ_pr F* + Γ_obs.
Low dimensionality lies in the change from prior to posterior: Γ_pos ≈ Γ_pr − K_r K_r*, with rank(K_r) ≤ r.
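The two expressions for Γ_pos, and the low rank of the prior-to-posterior update, can be checked numerically on a small linear problem (the matrices below are random illustrative stand-ins):

```python
import numpy as np

# Small random stand-ins for the operators on this slide.
rng = np.random.default_rng(0)
n, m = 8, 3                                   # parameter and data dimensions
F = rng.standard_normal((m, n))
G_pr = np.eye(n)                              # Gamma_pr
G_obs = 0.1 * np.eye(m)                       # Gamma_obs

# Gamma_pos = (Gamma_pr^{-1} + H)^{-1} with H = F^T Gamma_obs^{-1} F ...
H = F.T @ np.linalg.inv(G_obs) @ F
G_pos_a = np.linalg.inv(np.linalg.inv(G_pr) + H)

# ... equals the Woodbury form Gamma_pr - Gamma_pr F^T Gamma_y^{-1} F Gamma_pr
G_y = F @ G_pr @ F.T + G_obs
G_pos_b = G_pr - G_pr @ F.T @ np.linalg.inv(G_y) @ F @ G_pr

# The prior-to-posterior update has rank at most m, the data dimension.
update_rank = np.linalg.matrix_rank(G_pr - G_pos_b)
```

The rank bound is the whole point: however large n is, the data can only inform an m-dimensional subspace.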

15 Likelihood-Informed Subspace
Theorem (optimal approximation, Spantini et al. 2014): the eigendecomposition of the prior-preconditioned Hessian,
Γ_pr^{1/2} H Γ_pr^{1/2} z_i = λ_i z_i, λ_i > λ_{i+1},
provides the optimal basis Γ_pr^{1/2} z_i, i = 1, ..., r, in terms of the information update from prior to posterior:
Γ_pos ≈ Γ_pr − Σ_{i=1}^r [λ_i / (1 + λ_i)] (Γ_pr^{1/2} z_i)(Γ_pr^{1/2} z_i)*.
Since Γ_pr^{1/2} H Γ_pr^{1/2} = Γ_pr^{1/2} F* Γ_obs^{-1} F Γ_pr^{1/2}, noisy data, the ill-posed forward operator, and the smoothing prior are integrated together.
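A sketch of this construction on a small linear-Gaussian problem: eigendecompose the prior-preconditioned Hessian and form the low-rank update. When r equals the rank of H, the formula reproduces the exact posterior covariance (all problem data below are illustrative; a Cholesky factor is used in place of the symmetric square root, which is valid since Γ_pr = L Lᵀ):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 20, 4
F = rng.standard_normal((m, n))
G_pr = np.diag(1.0 / (1.0 + np.arange(n))**2)   # decaying prior spectrum
G_obs = 0.05 * np.eye(m)

H = F.T @ np.linalg.inv(G_obs) @ F              # data-misfit Hessian
L = np.linalg.cholesky(G_pr)                    # Gamma_pr = L @ L.T
pph = L.T @ H @ L                               # prior-preconditioned Hessian
lam, Z = np.linalg.eigh(pph)                    # eigh returns ascending order
order = np.argsort(lam)[::-1]
lam, Z = lam[order], Z[:, order]                # sort descending

r = m                                           # H has rank <= m here
W = L @ Z[:, :r]                                # directions Gamma_pr^{1/2} z_i
G_pos_lr = G_pr - W @ np.diag(lam[:r] / (1 + lam[:r])) @ W.T

G_pos_exact = np.linalg.inv(np.linalg.inv(G_pr) + H)
```

The eigenvalues λ_i measure how strongly the data constrain each direction relative to the prior; truncating where λ_i ≪ 1 discards only directions the data barely touch.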

16 Likelihood-Informed Subspace
Nonlinear forward model F(u) or non-Gaussian noise e.
Idea behind the algorithm: combine locally important directions, over the posterior, to yield a global reduced basis:
S = ∫_H Γ_pr^{1/2} H(u) Γ_pr^{1/2} µ^y(du) ≈ (1/m) Σ_{i=1}^m Γ_pr^{1/2} H(u_i) Γ_pr^{1/2} = Ψ Λ Ψ*.
Use the Gauss–Newton Hessian, or the Fisher information (for non-Gaussian noise), as H(u).
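The Monte Carlo construction of S can be sketched as follows. The toy nonlinear forward map, its Jacobian, and the use of prior draws in place of posterior samples are all illustrative assumptions; the map observes only the first two components, so the informed subspace is two-dimensional by construction:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10
L = np.eye(n)                                  # Gamma_pr^{1/2} (identity prior)
sigma = 0.1                                    # observation noise std

def jacobian(u):
    """Jacobian of the toy map F(u) = [sin(u0), u0*u1]: only u0, u1 matter."""
    J = np.zeros((2, n))
    J[0, 0] = np.cos(u[0])
    J[1, 0], J[1, 1] = u[1], u[0]
    return J

# Average the prior-preconditioned Gauss-Newton Hessian over samples.
S = np.zeros((n, n))
m_samp = 200
for _ in range(m_samp):
    u = rng.standard_normal(n)                 # stand-in for posterior samples
    J = jacobian(u)
    H = J.T @ J / sigma**2                     # Gauss-Newton Hessian at u
    S += L.T @ H @ L / m_samp

lam, Psi = np.linalg.eigh(S)
lam, Psi = lam[::-1], Psi[:, ::-1]             # descending eigenvalues
r = int(np.sum(lam > 1.0))                     # truncate where lambda ~ O(1)
```

The truncation threshold λ ≈ 1 marks where the data stop dominating the prior, matching the λ_i/(1 + λ_i) weighting on the previous slide.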

17 Operator-Weighted Proposals
The likelihood-informed subspace spanned by the basis Γ_pr^{1/2} Ψ_r captures the update from prior to posterior.
[Γ_pr^{1/2} Ψ_r, Γ_pr^{1/2} Ψ_⊥] forms a complete orthogonal system w.r.t. the prior, Γ_pr:
u = Γ_pr^{1/2} Ψ_r v_r (constrained by data) + Γ_pr^{1/2} Ψ_⊥ v_⊥ (prior).
Prescribe different scales to v_r and v_⊥:
v_r: smaller time steps, gradients, local geometry, ...
v_⊥: homogeneous Crank–Nicolson.
This leads to operator-weighted proposals.

18 Operator-Weighted Proposals
u′ = Γ_pr^{1/2} A Γ_pr^{-1/2} u − (Γ_pr^{1/2} G Γ_pr^{1/2}) D_u Φ(u; y) + Γ_pr^{1/2} B N(0, I),
where A, B, and G are commuting, bounded, self-adjoint operators.
Given Trace((A² + B² − I)²) < ∞, and other mild technical conditions, we have ν^T ≪ ν (and ν ≪ ν^T). Thus the operator-weighted proposal is well defined in the function-space setting (Cui, Law & Marzouk 2014).

19 Examples
Split the operators: A = A_r + A_⊥, B = B_r + B_⊥, G = G_r + G_⊥.
LI-Langevin:
A_r = Ψ_r D_{Ar} Ψ_r*, B_r = Ψ_r D_{Br} Ψ_r*, G_r = Ψ_r D_{Gr} Ψ_r*,
with D_{Ar} = I_r − Δt_r D_r, D_{Br} = (2 Δt_r D_r)^{1/2}, D_{Gr} = Δt_r D_r;
A_⊥ = a_⊥ (I − Ψ_r Ψ_r*), B_⊥ = b_⊥ (I − Ψ_r Ψ_r*), G_⊥ = 0.
Metropolis-within-Gibbs, alternating on u_r and u_⊥:
u_r update: A = Ψ_r (D_{Ar} − I_r) Ψ_r* + I, B = Ψ_r D_{Br} Ψ_r*, G = Ψ_r D_{Gr} Ψ_r*;
u_⊥ update: A = Ψ_r Ψ_r* + a_⊥ (I − Ψ_r Ψ_r*), B = b_⊥ (I − Ψ_r Ψ_r*), G = 0.

20 Example: Conditioned Diffusion
Path reconstruction of a Brownian-motion-driven SDE:
dp_t = f(p_t) dt + du_t, f(p) = θ p (1 − p²) / (1 + p²), p_0 = 0.
[figure: truth, observations, posterior mean, and quantiles of p_t over time]
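The forward map of this example takes the Brownian increments (the unknown parameter) to the path p_t; a minimal Euler–Maruyama sketch (time grid, θ, and initial condition handling are illustrative assumptions):

```python
import numpy as np

def forward(du, dt, theta=1.0, p0=0.0):
    """Integrate dp_t = f(p_t) dt + du_t by Euler-Maruyama, with
    f(p) = theta * p * (1 - p^2) / (1 + p^2) and given increments du."""
    p = p0
    path = np.empty(du.size + 1)
    path[0] = p
    for k, step in enumerate(du):
        f = theta * p * (1.0 - p**2) / (1.0 + p**2)
        p = p + f * dt + step
        path[k + 1] = p
    return path

rng = np.random.default_rng(3)
n, T = 200, 10.0
dt = T / n
du = np.sqrt(dt) * rng.standard_normal(n)   # Brownian increments: the parameter
path = forward(du, dt)
```

In the Bayesian formulation the increments du play the role of u with a white-noise (Brownian) prior, and the likelihood compares `forward(du, dt)` at the observation times against the noisy data.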

21 Example: Autocorrelations
[figures: (a) trace plot of the likelihood, MGLI-Langevin; (b) trace plot, PCN-RW; (c) autocorrelations for MGLI-Langevin, MGLI-Prior, LI-Langevin, LI-Prior, H-Langevin, PCN-RW; lag-1 autocorrelations of the parameters projected onto the KL basis of the prior and of the components of v, H-Langevin vs. MGLI-Langevin]
H-Langevin: explicit discretization of the Langevin SDE, preconditioned by the Hessian at the MAP.

22 Example: Autocorrelations
Operators built from a single Hessian vs. the integrated Hessian.
[figures: (a) OMF; (b) lag-1 autocorrelation of the components of v, MAP-LIS vs. Adapt-LIS]

23 Example: Elliptic PDE
Recover the transmissivity κ(s) from partial observation of the potential p(s):
−∇·(κ(s) ∇p(s)) = f(s).
[figures: reconstructions at two signal-to-noise ratios, and the truth]

24 Example: Likelihood-Informed Subspace
Leading basis vectors of the likelihood-informed subspace with different grid resolutions.
[figure: basis vectors at indices 1–5, for several resolutions]

25 Example: Autocorrelations
[figures: (a) trace plot of the likelihood, MGLI-Langevin; (b) trace plot, PCN-RW; (c) autocorrelations for MGLI-Langevin, MGLI-Prior, LI-Langevin, LI-Prior, H-Langevin, PCN-RW; lag-1 autocorrelations of the parameters projected onto the KL basis of the prior and of the components of v, at both signal-to-noise ratios, H-Langevin vs. MGLI-Langevin]
H-Langevin: explicit discretization of the Langevin SDE, preconditioned by the Hessian at the MAP.

26 Conclusions
Dimension-independent MCMC using operator-weighted proposals.
Operators are designed by identifying the likelihood-informed directions.
Demonstrated efficiency on numerical examples.
Future work: hyperparameters, optimal operators, parallelization, extensions to local operators, DILI ideas in transport maps.
FastFInS package (contact the authors): FastFInS only needs the forward model and adjoint model. Applications: bigger models.
More info: T. Cui, K. Law, Y. Marzouk, Dimension-independent likelihood-informed MCMC, arXiv preprint.
T. Cui and Y. Marzouk acknowledge the financial support from the DOE Applied Mathematics Program, Awards DE-FG2-8ER2585 and DE-SC9297, as part of the DiaMonD Multifaceted Mathematics Integrated Capability Center. K. Law is a member of the SRI-UQ Center at KAUST.


CS242: Probabilistic Graphical Models Lecture 7B: Markov Chain Monte Carlo & Gibbs Sampling CS242: Probabilistic Graphical Models Lecture 7B: Markov Chain Monte Carlo & Gibbs Sampling Professor Erik Sudderth Brown University Computer Science October 27, 2016 Some figures and materials courtesy

More information

arxiv: v2 [stat.ml] 29 Oct 2018

arxiv: v2 [stat.ml] 29 Oct 2018 A Stein variational Newton method Gianluca Detommaso University of Bath & The Alan Turing Institute gd39@bath.ac.uk Tiangang Cui Monash University Tiangang.Cui@monash.edu arxiv:86.385v [stat.ml] 9 Oct

More information

Bayesian inference for stochastic differential mixed effects models - initial steps

Bayesian inference for stochastic differential mixed effects models - initial steps Bayesian inference for stochastic differential ixed effects odels - initial steps Gavin Whitaker 2nd May 2012 Supervisors: RJB and AG Outline Mixed Effects Stochastic Differential Equations (SDEs) Bayesian

More information

EnKF and Catastrophic filter divergence

EnKF and Catastrophic filter divergence EnKF and Catastrophic filter divergence David Kelly Kody Law Andrew Stuart Mathematics Department University of North Carolina Chapel Hill NC bkelly.com June 4, 2014 SIAM UQ 2014 Savannah. David Kelly

More information

Markov chain Monte Carlo methods for visual tracking

Markov chain Monte Carlo methods for visual tracking Markov chain Monte Carlo methods for visual tracking Ray Luo rluo@cory.eecs.berkeley.edu Department of Electrical Engineering and Computer Sciences University of California, Berkeley Berkeley, CA 94720

More information

Supplement to A Hierarchical Approach for Fitting Curves to Response Time Measurements

Supplement to A Hierarchical Approach for Fitting Curves to Response Time Measurements Supplement to A Hierarchical Approach for Fitting Curves to Response Time Measurements Jeffrey N. Rouder Francis Tuerlinckx Paul L. Speckman Jun Lu & Pablo Gomez May 4 008 1 The Weibull regression model

More information

Zig-Zag Monte Carlo. Delft University of Technology. Joris Bierkens February 7, 2017

Zig-Zag Monte Carlo. Delft University of Technology. Joris Bierkens February 7, 2017 Zig-Zag Monte Carlo Delft University of Technology Joris Bierkens February 7, 2017 Joris Bierkens (TU Delft) Zig-Zag Monte Carlo February 7, 2017 1 / 33 Acknowledgements Collaborators Andrew Duncan Paul

More information

Gaussian Process Approximations of Stochastic Differential Equations

Gaussian Process Approximations of Stochastic Differential Equations Gaussian Process Approximations of Stochastic Differential Equations Cédric Archambeau Dan Cawford Manfred Opper John Shawe-Taylor May, 2006 1 Introduction Some of the most complex models routinely run

More information

wissen leben WWU Münster

wissen leben WWU Münster MÜNSTER Sparsity Constraints in Bayesian Inversion Inverse Days conference in Jyväskylä, Finland. Felix Lucka 19.12.2012 MÜNSTER 2 /41 Sparsity Constraints in Inverse Problems Current trend in high dimensional

More information

Sequential Monte Carlo Methods in High Dimensions

Sequential Monte Carlo Methods in High Dimensions Sequential Monte Carlo Methods in High Dimensions Alexandros Beskos Statistical Science, UCL Oxford, 24th September 2012 Joint work with: Dan Crisan, Ajay Jasra, Nik Kantas, Andrew Stuart Imperial College,

More information

Numerical Methods. Rafał Zdunek Underdetermined problems (2h.) Applications) (FOCUSS, M-FOCUSS,

Numerical Methods. Rafał Zdunek Underdetermined problems (2h.) Applications) (FOCUSS, M-FOCUSS, Numerical Methods Rafał Zdunek Underdetermined problems (h.) (FOCUSS, M-FOCUSS, M Applications) Introduction Solutions to underdetermined linear systems, Morphological constraints, FOCUSS algorithm, M-FOCUSS

More information

Block-Structured Adaptive Mesh Refinement

Block-Structured Adaptive Mesh Refinement Block-Structured Adaptive Mesh Refinement Lecture 2 Incompressible Navier-Stokes Equations Fractional Step Scheme 1-D AMR for classical PDE s hyperbolic elliptic parabolic Accuracy considerations Bell

More information

Ch 4. Linear Models for Classification

Ch 4. Linear Models for Classification Ch 4. Linear Models for Classification Pattern Recognition and Machine Learning, C. M. Bishop, 2006. Department of Computer Science and Engineering Pohang University of Science and echnology 77 Cheongam-ro,

More information

String method for the Cahn-Hilliard dynamics

String method for the Cahn-Hilliard dynamics String method for the Cahn-Hilliard dynamics Tiejun Li School of Mathematical Sciences Peking University tieli@pku.edu.cn Joint work with Wei Zhang and Pingwen Zhang Outline Background Problem Set-up Algorithms

More information

Part 2: Multivariate fmri analysis using a sparsifying spatio-temporal prior

Part 2: Multivariate fmri analysis using a sparsifying spatio-temporal prior Chalmers Machine Learning Summer School Approximate message passing and biomedicine Part 2: Multivariate fmri analysis using a sparsifying spatio-temporal prior Tom Heskes joint work with Marcel van Gerven

More information

(5) Multi-parameter models - Gibbs sampling. ST440/540: Applied Bayesian Analysis

(5) Multi-parameter models - Gibbs sampling. ST440/540: Applied Bayesian Analysis Summarizing a posterior Given the data and prior the posterior is determined Summarizing the posterior gives parameter estimates, intervals, and hypothesis tests Most of these computations are integrals

More information

Outline Lecture 2 2(32)

Outline Lecture 2 2(32) Outline Lecture (3), Lecture Linear Regression and Classification it is our firm belief that an understanding of linear models is essential for understanding nonlinear ones Thomas Schön Division of Automatic

More information

Application of the Ensemble Kalman Filter to History Matching

Application of the Ensemble Kalman Filter to History Matching Application of the Ensemble Kalman Filter to History Matching Presented at Texas A&M, November 16,2010 Outline Philosophy EnKF for Data Assimilation Field History Match Using EnKF with Covariance Localization

More information

Lecture 4: Dynamic models

Lecture 4: Dynamic models linear s Lecture 4: s Hedibert Freitas Lopes The University of Chicago Booth School of Business 5807 South Woodlawn Avenue, Chicago, IL 60637 http://faculty.chicagobooth.edu/hedibert.lopes hlopes@chicagobooth.edu

More information

Numerical Analysis for Statisticians

Numerical Analysis for Statisticians Kenneth Lange Numerical Analysis for Statisticians Springer Contents Preface v 1 Recurrence Relations 1 1.1 Introduction 1 1.2 Binomial CoefRcients 1 1.3 Number of Partitions of a Set 2 1.4 Horner's Method

More information

Sequential Monte Carlo Methods for High-Dimensional Inverse Problems: A case study for the Navier-Stokes equations

Sequential Monte Carlo Methods for High-Dimensional Inverse Problems: A case study for the Navier-Stokes equations SIAM/ASA J. UNCERTAINTY QUANTIFICATION Vol. xx, pp. x c xxxx Society for Industrial and Applied Mathematics x x Sequential Monte Carlo Methods for High-Dimensional Inverse Problems: A case study for the

More information

Methods of Data Assimilation and Comparisons for Lagrangian Data

Methods of Data Assimilation and Comparisons for Lagrangian Data Methods of Data Assimilation and Comparisons for Lagrangian Data Chris Jones, Warwick and UNC-CH Kayo Ide, UCLA Andrew Stuart, Jochen Voss, Warwick Guillaume Vernieres, UNC-CH Amarjit Budiraja, UNC-CH

More information

A short introduction to INLA and R-INLA

A short introduction to INLA and R-INLA A short introduction to INLA and R-INLA Integrated Nested Laplace Approximation Thomas Opitz, BioSP, INRA Avignon Workshop: Theory and practice of INLA and SPDE November 7, 2018 2/21 Plan for this talk

More information

Fully Bayesian Deep Gaussian Processes for Uncertainty Quantification

Fully Bayesian Deep Gaussian Processes for Uncertainty Quantification Fully Bayesian Deep Gaussian Processes for Uncertainty Quantification N. Zabaras 1 S. Atkinson 1 Center for Informatics and Computational Science Department of Aerospace and Mechanical Engineering University

More information

Non-linear least-squares inversion with data-driven

Non-linear least-squares inversion with data-driven Geophys. J. Int. (2000) 142, 000 000 Non-linear least-squares inversion with data-driven Bayesian regularization Tor Erik Rabben and Bjørn Ursin Department of Petroleum Engineering and Applied Geophysics,

More information

The Inversion Problem: solving parameters inversion and assimilation problems

The Inversion Problem: solving parameters inversion and assimilation problems The Inversion Problem: solving parameters inversion and assimilation problems UE Numerical Methods Workshop Romain Brossier romain.brossier@univ-grenoble-alpes.fr ISTerre, Univ. Grenoble Alpes Master 08/09/2016

More information

Gaussian Process Approximations of Stochastic Differential Equations

Gaussian Process Approximations of Stochastic Differential Equations Gaussian Process Approximations of Stochastic Differential Equations Cédric Archambeau Centre for Computational Statistics and Machine Learning University College London c.archambeau@cs.ucl.ac.uk CSML

More information

EE 367 / CS 448I Computational Imaging and Display Notes: Image Deconvolution (lecture 6)

EE 367 / CS 448I Computational Imaging and Display Notes: Image Deconvolution (lecture 6) EE 367 / CS 448I Computational Imaging and Display Notes: Image Deconvolution (lecture 6) Gordon Wetzstein gordon.wetzstein@stanford.edu This document serves as a supplement to the material discussed in

More information

Point spread function reconstruction from the image of a sharp edge

Point spread function reconstruction from the image of a sharp edge DOE/NV/5946--49 Point spread function reconstruction from the image of a sharp edge John Bardsley, Kevin Joyce, Aaron Luttman The University of Montana National Security Technologies LLC Montana Uncertainty

More information

MCMC 2: Lecture 2 Coding and output. Phil O Neill Theo Kypraios School of Mathematical Sciences University of Nottingham

MCMC 2: Lecture 2 Coding and output. Phil O Neill Theo Kypraios School of Mathematical Sciences University of Nottingham MCMC 2: Lecture 2 Coding and output Phil O Neill Theo Kypraios School of Mathematical Sciences University of Nottingham Contents 1. General (Markov) epidemic model 2. Non-Markov epidemic model 3. Debugging

More information