Forward Problems and their Inverse Solutions


Sarah Zedler (1,2)
(1) King Abdullah University of Science and Technology
(2) University of Texas at Austin
February 2013

Outline
1. Forward Problem Example: Weather Forecasting
2. Inverse Problem Example: CT Scan
3. Line Fitting: Least Squares; Sampling Method
4. Summary and Future Work

Weather Forecasting
[Figure: weather forecast, NOAA]

CT Scan (1/4)
[Figure: scan setup and initial guess]

CT Scan (2/4)
[Figure: source, Tarantola, 2005]

CT Scan (3/4)
[Figure: D. Khider]

CT Scan (4/4)
[Figure: L. Stott]

Least Squares "Simple" Case of Line Fit A simple inverse problem is "y=mx". Instead of knowing m and x to solve for y, you solve for either x or m given knowledge of y and m or y and x. What we can easily measure is often not what we want to know. In an inverse problem, we use observations to infer what we want to know.

Least Squares
[Figure: least-squares fit to data with confidence intervals and prediction intervals; M = 2.34 ± 0.09, B = 3.26 ± 1.13]
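A minimal sketch of such a fit in Python, using synthetic data: the talk's actual data set is not given, so the x positions, noise level, and true parameters below are assumptions chosen to mimic the slide.

```python
# Least-squares line fit with 1-sigma parameter uncertainties.
# Data here are synthetic (assumed), generated near the slide's fit values.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 20, 9)                  # assumed sample positions
sigma = 2.0                                # assumed measurement noise
y = 2.34 * x + 3.26 + rng.normal(0.0, sigma, x.size)

# Solve G @ [m, b] ~= y in the least-squares sense.
G = np.column_stack([x, np.ones_like(x)])
(m_hat, b_hat), *_ = np.linalg.lstsq(G, y, rcond=None)

# 1-sigma parameter uncertainties from the covariance sigma^2 (G^T G)^-1.
cov = sigma**2 * np.linalg.inv(G.T @ G)
m_err, b_err = np.sqrt(np.diag(cov))
print(f"M = {m_hat:.2f} ± {m_err:.2f}, B = {b_hat:.2f} ± {b_err:.2f}")
```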

Bayesian Methods
Part 1: Direct Method
Part 2: Sampling Method (MCMC)

Bayesian vs. Least Squares
Least squares: start with an initial guess and converge to a single best-fit solution.
Bayesian: solve for a distribution on the parameters (many solutions).

Bayesian
The initial guess of the parameters is called the prior distribution; it shapes the solution. "I know almost nothing about the parameters" → assume the parameters are uniformly distributed.
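In symbols, this is Bayes' theorem (stated here for completeness; the equation is not on the original slide):

P(m, b | d) \propto P(d | m, b) \, P(m, b)

With a uniform prior P(m, b) = const, the posterior is proportional to the likelihood alone, so the data fully shape the solution.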

Part 1: DIRECT METHOD

Number of slope values: M
Number of intercept values: B
Number of calculations: M × B = 81

Finding the best solutions: define a cost function

2J(m, b) = (g(m) - d)^T C^{-1} (g(m) - d) = \sum_{i=1}^{N} \frac{((m x_i + b) - y_i)^2}{\sigma_i^2}    (1)

where g is the forward model, d the data vector, and C the data covariance. The best solutions are those whose cost is within 2σ of the mean (95% confidence interval).

[Figure: cost function over slope of line (m) and intercept of line (b); color shows cost (no units)]

Finding the best solutions: define a cost function

2J(m, b) = (g(m) - d)^T C^{-1} (g(m) - d) = \sum_{i=1}^{N} \frac{((m x_i + b) - y_i)^2}{\sigma_i^2}    (2)

The best solutions are those whose cost is within 2σ of the mean (95% confidence interval). Then

P(m, b) = e^{-J(m, b)}    (3)

is the posterior distribution on the parameters (central limit theorem). A grid-evaluation sketch of this direct method follows.
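A minimal sketch of the direct method, assuming the same synthetic data as the least-squares sketch above; the grid ranges are assumptions, and the 9 × 9 grid matches the 81 calculations of the first direct-method slide.

```python
# Direct method: evaluate J(m, b) on a grid, exponentiate to get the
# (unnormalized) posterior P(m, b) = exp(-J), then form marginals.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 20, 9)                  # assumed data, as in the sketch above
sigma = 2.0
y = 2.34 * x + 3.26 + rng.normal(0.0, sigma, x.size)

def cost(m, b):
    """J(m, b), where 2J = sum_i ((m*x_i + b - y_i) / sigma_i)^2."""
    return 0.5 * np.sum(((m * x + b - y) / sigma) ** 2)

m_grid = np.linspace(2.0, 3.0, 9)          # M slope values
b_grid = np.linspace(-1.0, 6.0, 9)         # B intercept values; M x B = 81

J = np.array([[cost(m, b) for m in m_grid] for b in b_grid])
post = np.exp(-(J - J.min()))              # shift by the minimum for stability
post /= post.sum()                         # normalize over the grid

# Marginal distributions: sum the grid posterior over the other parameter.
p_m, p_b = post.sum(axis=0), post.sum(axis=1)
print("M ~ %.2f, B ~ %.2f" % ((m_grid * p_m).sum(), (b_grid * p_b).sum()))
```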

[Figure: posterior over slope of line (m) and intercept of line (b); color shows probability density]

Marginal Distributions
[Figure: marginal distributions of M and B; M = 2.33 ± 0.08, B = 3.3 ± 0.86]

Solutions with Prediction Intervals
[Figure: direct-method solutions plotted with the data, confidence intervals, and prediction intervals]

Number of slope values: M
Number of intercept values: B
Number of calculations: M × B = 2.56 × 10^6

Part 2: Sampling Method (MCMC)

Sampling Method: Metropolis-Hastings
The Metropolis-Hastings algorithm is a Markov chain Monte Carlo (MCMC) method:
Step 1: Start with (m0, b0).
Step 2: Step a random distance to (m1, b1).
Step 3: If P(m1, b1) >= P(m0, b0), accept (m1, b1).
Step 4: If not, calculate the transition probability P(m1, b1)/P(m0, b0) and draw a random number ε between 0 and 1. If P(m1, b1)/P(m0, b0) >= ε, accept (m1, b1); otherwise, reject.
Step 5: If (m1, b1) was accepted, repeat Step 2 from (m1, b1); otherwise, repeat Step 2 from (m0, b0).
A runnable sketch of these steps is given below.
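A minimal sketch of the steps above, again on assumed synthetic data; the starting point, proposal width, and chain length are illustrative choices, not values from the talk.

```python
# Metropolis-Hastings sampler for the line-fit posterior P(m, b) = exp(-J).
# Working in log-probability avoids under/overflow when J is large.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 20, 9)                  # assumed data, as in the sketches above
sigma = 2.0
y = 2.34 * x + 3.26 + rng.normal(0.0, sigma, x.size)

def log_p(m, b):
    """log P(m, b) = -J(m, b) for the least-squares cost."""
    return -0.5 * np.sum(((m * x + b - y) / sigma) ** 2)

n_steps, step = 50_000, 0.1
m, b = 2.0, 0.0                            # Step 1: start at (m0, b0)
lp = log_p(m, b)
chain = np.empty((n_steps, 2))
for i in range(n_steps):
    # Step 2: step a random distance to the proposal (m1, b1).
    m1, b1 = m + step * rng.normal(), b + step * rng.normal()
    lp1 = log_p(m1, b1)
    # Steps 3-4: accept if more probable; otherwise accept with
    # probability P(m1, b1)/P(m0, b0), compared against ε ~ U(0, 1).
    if lp1 >= lp or rng.uniform() < np.exp(lp1 - lp):
        m, b, lp = m1, b1, lp1             # Step 5: chain moves to (m1, b1)
    chain[i] = m, b                        # on rejection, the chain stays put
print("M = %.2f ± %.2f, B = %.2f ± %.2f" % (
    chain[:, 0].mean(), chain[:, 0].std(),
    chain[:, 1].mean(), chain[:, 1].std()))
```

Means and standard deviations of the accepted samples reproduce the histogram estimates on the following slide.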

Sampling Method: Metropolis-Hastings
[Figure: Bayesian posterior distribution over slope (m) and intercept (b) from the sampler]

Sampling Method: Metropolis-Hastings
[Figure: histograms of sampled solutions; M = 2.34 ± 0.08, B = 3.3 ± 0.94]

Sampling Method: Metropolis-Hastings
[Figure: direct-method posterior (left) and Metropolis-Hastings posterior (right), side by side over slope (m) and intercept (b)]

Summary
Least squares: maximum-likelihood solution with an error bar.
Direct method: can miss mass in the posterior distribution; computationally expensive.
Sampler methods: don't cover all of parameter space (can miss multiple peaks in the cost function), but are easier on the computer.

Research Applications
Current work:
- Estimation of wind stress under hurricane forcing conditions from the observed ocean response.
- Constraining constant parameters in turbulent mixing models with observations of the ocean state in the tropical Pacific.
Future research:
- The sampler used (Jackson et al., 2004).
- The definition of the cost function (Mu et al., 2004).
- Development of solution smoothers.

Acknowledgements
Thanks to Deborah Khider and Charles Jackson for help with all aspects of putting this talk together.