Statistical Estimation of the Parameters of a PDE


PIMS-MITACS 2001, Slide 1: Statistical Estimation of the Parameters of a PDE

Colin Fox, Geoff Nicholls (University of Auckland)

Outline:
- Nomenclature for image recovery
- Statistical model for inverse problems
- Traditional approaches: deconvolution example
- Recovering electrical conductivity via inference

PIMS-MITACS 2001, Slide 2: Image Recovery (nomenclature for Inverse Problems)

[Figure: X-ray tomography; x-rays in, x-rays out.]

Image: a spatially varying quantity of interest, e.g.
- optical reflectance of a scene
- optical or radio brightness of the sky
- sound speed in tissue / ocean / earth
- electrical conductivity of tissue / mud

Recovery: estimate the image from indirect data.

Forward problem (image → data): physical model (PDE); direct computation; well posed; unique.
Inverse problem (data → image): implicit, indirect; ill posed; never unique.

PIMS-MITACS 2001, Slide 3: Conductivity Imaging

Measurement set: electrodes at x_1, x_2, ..., x_E.
Assert currents at the electrodes: j = (j(x_1), j(x_2), ..., j(x_E))ᵀ.
Measure voltages: v = (φ(x_1), φ(x_2), ..., φ(x_E))ᵀ.

The unknown σ(x) is related to the measurements via the Neumann BVP
    ∇·(σ(x)∇φ(x)) = 0,        x ∈ Ω,
    σ(x) ∂φ/∂n(x) = j(x),     x ∈ ∂Ω.

The set of measurements is the current-voltage pairs {j_n, v_n}, n = 1, ..., N.
The inverse problem is to find σ from these measurements (nonlinear).

PIMS-MITACS 2001, Slide 4: Statistical Model of the Inverse Problem

[Block diagram: true image σ → forward map K → ideal data → data loss P → noise-free data → (+ noise n) → actual data v = PKσ + n. The estimate σ̂ is produced by an image recovery algorithm (an approximate inverse of K) that brings in prior knowledge: constraints, regularization, preferences.]

If n ~ N(0, s²), then v ~ N(PKσ, s²).

Given measurements v, the likelihood for σ is
    L_v(σ) ∝ Pr(v | σ) ∝ exp(−‖v − φ(σ)‖² / 2s²).

The posterior distribution for σ conditional on v follows from Bayes' rule,
    Pr(σ | v) = Pr(v | σ) Pr(σ) / Pr(v),
where Pr(σ) is the prior distribution.
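For concreteness, here is a minimal numpy sketch (not from the slides) of this unnormalized log posterior for the linear-Gaussian case, i.e. φ(σ) = PKσ with the simple Gaussian prior used on the next slide; the matrix PK and the scales s and lam are assumed inputs.

```python
import numpy as np

def log_posterior(sigma, v, PK, s, lam):
    """Unnormalized log Pr(sigma | v) for v = PK sigma + n, n ~ N(0, s^2 I),
    with the Gaussian prior Pr(sigma) proportional to exp(-||sigma||^2 / (2 lam^2))."""
    log_lik = -np.sum((v - PK @ sigma) ** 2) / (2 * s ** 2)   # log Pr(v | sigma) + const
    log_prior = -np.sum(sigma ** 2) / (2 * lam ** 2)          # log Pr(sigma) + const
    return log_lik + log_prior
```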

PIMS-MITACS 2001, Slide 5: Solutions = Summary Statistics

All information is contained in the posterior distribution Pr(σ | v).

Traditional solutions are modes:
    σ̂_MLE = arg max L_v(σ) = arg max Pr(v | σ)
    σ̂_MAP = arg max Pr(σ | v) = arg max Pr(v | σ) Pr(σ)

e.g. with a simple Gaussian prior Pr(σ) ∝ exp(−‖σ‖² / 2λ²),
    σ̂_MAP = arg min ‖v − φ(σ)‖² + α‖σ‖²,    α = s²/λ²,
which is Tikhonov regularization (cf. Kalman filtering, Backus-Gilbert).
As α → 0 this tends to the Moore-Penrose inverse; α = 0 gives least squares.

Inferential solutions: answers are expectations over the posterior,
    E[f(σ)] = ∫ f(σ) Pr(σ | v) dσ.

[Figure: posterior density Pr(σ | v) plotted against σ.]
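For the linear forward model of the previous sketch (writing A = PK), the MAP/Tikhonov estimate has a closed form; a minimal sketch, with alpha = s²/λ² as on the slide:

```python
import numpy as np

def map_tikhonov(v, A, alpha):
    """argmin ||v - A sigma||^2 + alpha ||sigma||^2, i.e. the Tikhonov-regularized
    least-squares / MAP estimate: sigma = (A^T A + alpha I)^{-1} A^T v."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ v)

# alpha = s**2 / lam**2 gives the MAP estimate for the Gaussian prior above;
# letting alpha -> 0 approaches the Moore-Penrose (pseudoinverse) solution.
```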

PIMS-MITACS 2001, Slide 6: Traditional Solutions: Fourier Deconvolution

[Figure panels: noisy blurred image; exact inverse; MAP solution.]

PIMS-MITACS 2001, Slide 7: Bayesian Formulation for Conductivity Imaging

Random variables and their values:
    current in Ω:           R (value ρ)
    potential in Ω:         Φ (value φ)
    voltage at electrode:   V (value v)
    current at electrode:   J (value j)
    conductivity:           Σ (value σ)

Posterior:
    Pr{Σ = σ, Φ_n = φ_n, R_n = ρ_n | {J_n, V_n} = {j_n, v_n}}
        ∝ Pr{{J_n, V_n} = {j_n, v_n} | Σ = σ, Φ_n = φ_n, R_n = ρ_n} · Pr{Σ = σ, Φ_n = φ_n, R_n = ρ_n}.

Since R = Σ∇Φ, with φ = Γ_σ(ρ|_∂Ω) and ρ = σ∇φ, the potential is determined by the conductivity and the current, so
    Pr{Σ = σ, Φ_n = φ_n, R_n = ρ_n} = Pr{Σ = σ, R_n = ρ_n}.
We stipulate Pr{Σ = σ} only, usually a MRF.

Likelihood:
    L(σ, φ_n, ρ_n) = Pr{{J_n, V_n} = {j_n, v_n} | Σ = σ, Φ_n = φ_n, R_n = ρ_n}
                   = Pr{{V_n} = {v_n} | Φ_n = φ_n(σ, ρ_n)} · Pr{{J_n} = {j_n} | R_n = ρ_n}.

With i.i.d. errors,
    L(σ, φ_n, ρ_n) = ∏_{n=1}^{N} Pr{V_n = v_n | Φ_n = Γ_σ(ρ_n|_∂Ω)} · Pr{J_n = j_n | R_n = ρ_n}.

With normal noise (say),
    Pr{J = j | R = ρ} = N((ρ(x_1), ρ(x_2), ..., ρ(x_E))ᵀ, s²_ρ).
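A rough sketch of how the factorized likelihood might be evaluated, assuming a hypothetical forward_map(sigma, rho_n) that solves the Neumann BVP and returns the electrode voltages; only the voltage factor is shown, and the current factor Pr{J_n = j_n | R_n = ρ_n} would contribute an analogous Gaussian term.

```python
import numpy as np

def log_likelihood(sigma, currents, voltages, forward_map, s):
    """Sum over the N current patterns of the Gaussian log-density
    log Pr{V_n = v_n | Phi_n = Gamma_sigma(rho_n)} (voltage factor only)."""
    total = 0.0
    for rho_n, v_n in zip(currents, voltages):
        pred = forward_map(sigma, rho_n)                  # electrode voltages from the BVP solve
        total += -np.sum((v_n - pred) ** 2) / (2 * s ** 2)
    return total
```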

PIMS-MITACS 2001, Slide 8: Samples from the Prior

[Figure: samples drawn from the prior distribution over conductivity images.]

PIMS-MITACS 2001, Slide 9: Markov chain Monte Carlo

Monte Carlo integration: if {X_t, t = 1, 2, ..., n} are sampled from Pr(σ | v), then
    E[f(σ)] ≈ (1/n) Σ_{t=1}^{n} f(X_t).

Markov chain: generate {X_t}_{t≥0} as a Markov chain of random variables X_t ∈ Σ_Ω, with a t-step distribution Pr(X_t = σ | X_0 = σ⁽⁰⁾) that tends to Pr(σ | v) as t → ∞.

Metropolis-Hastings algorithm:
(1) Given the state σ_t at time t, generate a candidate state σ′ from a proposal distribution q(· | σ_t).
(2) Accept the candidate with probability
        α(X → Y) = min{1, [Pr(Y | v) q(X | Y)] / [Pr(X | v) q(Y | X)]}.
(3) If accepted, X_{t+1} = σ′; otherwise X_{t+1} = σ_t.
(4) Repeat.

q(· | σ_t) can be any distribution that ensures the chain is irreducible and aperiodic.
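A minimal, generic sketch of the algorithm above (illustrative only, not the authors' implementation); log_post is an assumed unnormalized log posterior log Pr(x | v), and proposal returns a candidate together with the log proposal ratio log q(x | y) − log q(y | x).

```python
import numpy as np

def metropolis_hastings(log_post, proposal, x0, n_steps, rng=None):
    """Generic Metropolis-Hastings sampler returning the chain of states."""
    rng = np.random.default_rng() if rng is None else rng
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        y, log_q_ratio = proposal(x, rng)            # candidate from q(.|x)
        lp_y = log_post(y)
        log_alpha = lp_y - lp + log_q_ratio          # log of the Hastings ratio
        if np.log(rng.uniform()) < log_alpha:        # accept with prob min(1, alpha)
            x, lp = y, lp_y
        samples.append(x)
    return np.array(samples)

# Example: symmetric random-walk proposal (log q ratio = 0) targeting a toy
# 1-D posterior; posterior expectations are then simple sample averages.
rw = lambda x, rng: (x + 0.5 * rng.standard_normal(), 0.0)
chain = metropolis_hastings(lambda x: -0.5 * x ** 2, rw, x0=0.0, n_steps=5000)
print(chain.mean(), chain.var())
```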

PIMS-MITACS 2001, Slide 10: Three-Move Metropolis-Hastings

Choose one of 3 moves with probability ζ_p, p = 1, 2, 3. The move transition probabilities {Pr⁽ᵖ⁾(X_{t+1} = σ_{t+1} | X_t = σ_t)}, p = 1, 2, 3, are each reversible w.r.t. Pr(σ | v). The overall transition probability is
    Pr(X_{t+1} = σ_{t+1} | X_t = σ_t) = Σ_{p=1}^{3} ζ_p Pr⁽ᵖ⁾(X_{t+1} = σ_{t+1} | X_t = σ_t).
If at least one of the moves is irreducible on Σ_Ω, then the equilibrium distribution is Pr(σ | v).

A pixel n is a near-neighbour of pixel m if their lattice distance is less than 8. An update-edge is a pair of near-neighbouring pixels of unequal conductivity; write N(σ) for the set of update-edges and N_m(σ) for those containing pixel m.

Move 1: Flip a pixel. Select a pixel m at random and assign σ_m a new conductivity σ′_m chosen uniformly at random from the other C−1 conductivity values.

Move 2: Flip a pixel near a conductivity boundary. Pick an update-edge at random from N(σ). Pick one of the two pixels in that edge at random, pixel m say. Proceed as in Move 1.

Move 3: Swap conductivities at a pair of pixels. Pick an update-edge at random from N(σ). Set σ′_m = σ_n and σ′_n = σ_m.
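An illustrative sketch of the pixel-flip move (Move 1) and its accept/reject step, assuming a hypothetical log_post function for the unnormalized log posterior (which for this problem requires a forward BVP solve per evaluation). The pixel-flip proposal is symmetric, so the q terms cancel; Moves 2 and 3 would need the full Hastings ratio because the set of update-edges N(σ) changes with σ.

```python
import numpy as np

def flip_pixel(sigma, levels, rng):
    """Move 1: flip one pixel to a different conductivity level (symmetric proposal)."""
    prop = sigma.copy()
    m = tuple(rng.integers(s) for s in sigma.shape)      # random pixel
    others = [c for c in levels if c != sigma[m]]        # the other C-1 values
    prop[m] = rng.choice(others)
    return prop

def mcmc_flip_only(log_post, sigma0, levels, n_steps, rng=None):
    """Metropolis sampler using only the pixel-flip move (sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    sigma, lp = sigma0.copy(), log_post(sigma0)
    for _ in range(n_steps):
        cand = flip_pixel(sigma, levels, rng)
        lp_c = log_post(cand)
        if np.log(rng.uniform()) < lp_c - lp:            # symmetric proposal: q cancels
            sigma, lp = cand, lp_c
    return sigma
```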

PIMS-MITACS 2001, Slide 11: Experiment 1 (discrete variables, three conductivity levels)

[Figure panels A, B1, B2, C, D.]

PIMS-MITACS 2001, Slide 12: Experiment 2 (continuous variables, three conductivity types)

[Figure panels A, B, C, D, E.]

PIMS-MITACS 2001, Slide 13: Experiment 3 (shielding)

[Figure panels A through H.]

PIMS-MITACS 2001, Slide 14: Summary

If you can simulate the forward map, then you can sample from, and calculate expectations over, the posterior, i.e., solve the inverse problem.

Statistical inference provides a unifying framework for inverse problems.