Hierarchical Bayesian Inversion
1 Hierarchical Bayesian Inversion
Andrew M Stuart, Computing and Mathematical Sciences, Caltech
cw/ S. Agapiou, J. Bardsley and O. Papaspiliopoulos. SIAM/ASA JUQ 2 (2014).
cw/ M. Dunlop and M. Iglesias. arXiv.
Funded by: DARPA, EPSRC, ERC, ONR
2 Orientation
Bayesian inversion for functions requires MCMC on $\mathbb{R}^N$ with $N \gg 1$ (approximating $N = \infty$). Successful Bayesian inversion often requires hierarchical thinking. This talk: design efficient algorithms which confront the interaction between these two issues. This talk: showcase the ideas in use for inversion in a class of piecewise continuous functions with unknown discontinuity sets.
3 Table of Contents
- MCMC in High Dimensions
- Hierarchical Priors
- Centering versus Non-Centering
- Hierarchical Level Set
- Conclusions
5 The Setting
Let $(X, \mathcal{F})$ be an arbitrary measurable space.

Target Measure: $\mathbb{P}$ is a probability measure on $X$ defined via its density with respect to another probability measure $\mathbb{P}_0$ on $X$:
$$\mathbb{P}(du) = Z^{-1} e^{-\Phi(u)}\,\mathbb{P}_0(du), \qquad Z = \int_X e^{-\Phi(u)}\,\mathbb{P}_0(du).$$

AM Stuart. Acta Numerica 19 (2010). (Bayesian inverse problems: $\mathbb{P}_0$ prior, $\mathbb{P}$ posterior.)
M Hairer, AM Stuart and J Voss. Oxford Handbook of Nonlinear Filtering (2011). (Conditioned diffusions: $\mathbb{P}_0$ Wiener measure, $\mathbb{P}$ SDE path-space measure.)
6 Metropolis-Hastings
Let $X = \mathbb{R}^N$ and let $\mathbb{P}$ and $\mathbb{P}_0$ have Lebesgue densities $\pi$ and $\pi_0$. Let $q(\cdot,\cdot)$ be a proposal density for Metropolis-Hastings.

Accept-Reject: the acceptance probability for a proposed move from $u$ to $v$ is
$$a(u,v) = \min\Big\{1, \frac{\pi(v)\,q(v,u)}{\pi(u)\,q(u,v)}\Big\}.$$

WK Hastings. Biometrika (1970).
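As a minimal sketch of the accept-reject rule above, the following implements random-walk Metropolis on $\mathbb{R}^N$; since the Gaussian random-walk proposal is symmetric, the ratio $q(v,u)/q(u,v)$ cancels and only $\pi(v)/\pi(u)$ remains. The function names, step size, and toy target are illustrative, not from the talk.

```python
import numpy as np

def metropolis_hastings(log_pi, u0, n_steps, step=0.5, rng=None):
    """Random-walk Metropolis: symmetric proposal v = u + step * N(0, I),
    so a(u, v) = min{1, pi(v)/pi(u)}."""
    rng = np.random.default_rng(rng)
    u = np.atleast_1d(np.asarray(u0, dtype=float))
    chain = [u.copy()]
    for _ in range(n_steps):
        v = u + step * rng.standard_normal(u.shape)       # propose
        # accept with probability min{1, pi(v)/pi(u)}, computed in log space
        if np.log(rng.uniform()) < log_pi(v) - log_pi(u):
            u = v
        chain.append(u.copy())
    return np.array(chain)

# Toy target: standard Gaussian on R, log pi(u) = -|u|^2/2 up to a constant
chain = metropolis_hastings(lambda u: -0.5 * np.sum(u**2), 0.0, 20000, rng=0)
```

With a well-tuned step the empirical mean and variance of the chain approach those of the target.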
7 Tierney Formulation of Metropolis-Hastings on X
Let $Q(\cdot, dv)$ be a proposal probability kernel for Metropolis-Hastings. Define probability measures $\nu$ and $\nu^T$ on $X \times X$ by
$$\nu(du, dv) = \mathbb{P}(du)\,Q(u, dv), \qquad \nu^T(du, dv) = \mathbb{P}(dv)\,Q(v, du).$$

Accept-Reject: Metropolis-Hastings is well-defined on $X$ if $\nu^T$ has a density with respect to $\nu$. Then the acceptance probability for a proposed move from $u$ to $v$ is
$$a(u,v) = \min\Big\{1, \frac{d\nu^T}{d\nu}(u,v)\Big\}.$$

L. Tierney. Annals Appl. Prob. 8 (1998).
8 Prior-Reversible Proposal
Assume that $\nu_0(du, dv) = \nu_0^T(du, dv)$ (prior-reversible), where
$$\nu_0(du, dv) = \mathbb{P}_0(du)\,Q(u, dv), \qquad \nu_0^T(du, dv) = \mathbb{P}_0(dv)\,Q(v, du).$$

Accept-Reject: then Metropolis-Hastings is well-defined on $X$ and the acceptance probability for a proposed move from $u$ to $v$ is
$$a(u,v) = \min\big\{1, e^{\Phi(u) - \Phi(v)}\big\}.$$

Example 1: $X = \mathbb{R}^N$, $\mathbb{P}_0$ Lebesgue, proposal is random walk (RWM).
Example 2: $X$ Hilbert ($H = \mathbb{R}^\infty$), $\mathbb{P}_0$ Gaussian, proposal is pCN.

SL Cotter, GO Roberts, AM Stuart and D White. Stat. Sci. 28 (2013).
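Example 2 can be sketched in a finite-dimensional discretization: the pCN proposal $v = \sqrt{1-\beta^2}\,u + \beta\,\xi$ with $\xi \sim N(0, C)$ is reversible with respect to the Gaussian prior $N(0, C)$, so the acceptance probability depends on $\Phi$ alone. The code below is a minimal illustration; the step parameter $\beta$, the identity covariance, and the toy $\Phi$ are my choices, not from the slides.

```python
import numpy as np

def pcn(Phi, C_sqrt, N, n_steps, beta=0.2, rng=None):
    """Preconditioned Crank-Nicolson: proposal v = sqrt(1 - beta^2) u + beta xi,
    xi ~ N(0, C). Prior-reversible w.r.t. N(0, C), so
    a(u, v) = min{1, exp(Phi(u) - Phi(v))}."""
    rng = np.random.default_rng(rng)
    u = C_sqrt @ rng.standard_normal(N)      # start from a prior draw
    chain = [u.copy()]
    for _ in range(n_steps):
        xi = C_sqrt @ rng.standard_normal(N)
        v = np.sqrt(1.0 - beta**2) * u + beta * xi
        if np.log(rng.uniform()) < Phi(u) - Phi(v):
            u = v
        chain.append(u.copy())
    return np.array(chain)

# Toy example: prior N(0, I) on R^5 and Phi(u) = |u|^2 / 2,
# so the target is N(0, I/2) (marginal variance 1/2 per coordinate).
chain = pcn(lambda u: 0.5 * np.sum(u**2), np.eye(5), 5, 20000, rng=1)
```

Note that the proposal never evaluates $\Phi$; only the accept-reject step does, which is what makes the method dimension-robust.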
9 Key Theorem: Key Idea
$\mathbb{P}_0$ Gaussian on $H$. Approximate the target measure $\mathbb{P}$ on $H$ by a measure $\mathbb{P}^N$ on $\mathbb{R}^N$.

Theorem 1
- pCN to sample $\mathbb{P}^N$: spectral gap $O(1)$.
- RWM to sample $\mathbb{P}^N$: spectral gap $O(N^{-1/2})$.

M. Hairer, AM Stuart and S. Vollmer. Ann. Appl. Prob. 24 (2014).

Key Idea: MCMC will not degenerate under mesh refinement $\iff$ MCMC is well-defined on $H$ in the sense of Tierney.
11 Gaussian Priors: Covariance Function versus Operator
A centred Gaussian probability measure $\mu_0$ on a Hilbert space $H$ of real-valued functions is characterized, for $u \sim \mu_0$, by its covariance function
$$\mathbb{E}^{\mu_0} u(x)u(y) = c(x,y),$$
or by its covariance operator
$$\mathbb{E}^{\mu_0} u \otimes u = C.$$
12 Whittle-Matérn: Covariance Function
$$c_{\mathrm{WM}}(x, y; \theta) = \sigma^2 \frac{2^{1-\nu}}{\Gamma(\nu)} (\tau|x-y|)^\nu K_\nu(\tau|x-y|).$$
- $\sigma$ is an amplitude scale.
- $\tau$ is an inverse length-scale.
- $\nu$ controls smoothness: draws from Gaussian fields with this covariance have $\nu$ fractional Sobolev and Hölder derivatives.
We use $\theta \in \Theta$ to denote a subset of $\{\nu, \tau, \sigma\}$.

L. Roininen, JMJ Huttunen and S. Lasanen. Inv. Prob. Imag. (2014).
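The covariance function above is straightforward to evaluate numerically with the modified Bessel function $K_\nu$; the sketch below does so as a function of the distance $r = |x-y|$, with a small clip at $r = 0$ to avoid the removable $0 \cdot \infty$ singularity (as $z \to 0$, $z^\nu K_\nu(z) \to 2^{\nu-1}\Gamma(\nu)$, so $c \to \sigma^2$). The function name and defaults are illustrative.

```python
import numpy as np
from scipy.special import kv, gamma

def matern_cov(r, sigma=1.0, tau=1.0, nu=0.5):
    """Whittle-Matern covariance as a function of distance r:
    c(r) = sigma^2 * 2^(1-nu) / Gamma(nu) * (tau r)^nu * K_nu(tau r),
    with c(0) = sigma^2 by continuity."""
    z = np.maximum(tau * np.asarray(r, dtype=float), 1e-10)  # avoid 0 * inf at r = 0
    return sigma**2 * 2**(1 - nu) / gamma(nu) * z**nu * kv(nu, z)
```

A useful sanity check: for $\nu = 1/2$ the Matérn family reduces to the exponential covariance $\sigma^2 e^{-\tau r}$.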
13 Whittle-Matérn: Covariance Operator
The covariance operator on the unbounded domain $\mathbb{R}^d$ is
$$C_{\mathrm{WM}}(\theta) \propto \sigma^2 \tau^{2\nu} (\tau^2 I - \triangle)^{-\nu - \frac{d}{2}}.$$
Thus we may draw $u \sim \mu_0$ by solving the stochastic PDE
$$(\tau^2 I - \triangle)^{\frac{\nu}{2} + \frac{d}{4}} u = \sigma \tau^\nu \xi,$$
where $\xi$ is Gaussian white noise.

F. Lindgren, H. Rue and J. Lindstrom. JRSSB (2011).
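On a periodic 1D grid the operator $(\tau^2 I - \triangle)$ is diagonal in Fourier space with symbol $\tau^2 + k^2$, so the SPDE can be "solved" with FFTs: scale white noise by the square roots of the covariance eigenvalues $\sigma^2 \tau^{2\nu} (\tau^2 + k^2)^{-\nu - d/2}$, $d = 1$. This is a rough illustrative sketch (periodic boundary, discrete normalization chosen for simplicity), not the construction in the cited paper.

```python
import numpy as np

def sample_whittle_matern_1d(n, sigma=1.0, tau=5.0, nu=1.0, L=1.0, rng=None):
    """Draw an approximate stationary Whittle-Matern field on a periodic grid
    of n points by diagonalizing (tau^2 I - Laplacian) in Fourier space."""
    rng = np.random.default_rng(rng)
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)            # Fourier wavenumbers
    lam = sigma**2 * tau**(2 * nu) * (tau**2 + k**2) ** (-nu - 0.5)
    # FFT of real white noise has the Hermitian symmetry a real field needs
    xi_hat = np.fft.fft(rng.standard_normal(n))
    u = np.fft.ifft(np.sqrt(lam) * xi_hat).real             # u ~ C^{1/2} xi
    return u

u = sample_whittle_matern_1d(256, rng=0)
```

Larger $\tau$ gives shorter correlation length; larger $\nu$ gives smoother draws, consistent with the roles of the parameters on the previous slide.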
14 Hierarchical Setting
Let $\mathbb{P}_0(du, d\theta) = \mathbb{P}^{u|\theta}(du; \theta)\,\mathbb{P}^\theta(d\theta)$, with $\mathbb{P}^{u|\theta}(du \,|\, \theta) = N(0, C_{\mathrm{WM}}(\theta))$.

Hierarchical Target Measure: use the data to learn both $u$ and $\theta$:
$$\mathbb{P}(du, d\theta) = Z^{-1} e^{-\Phi(u)}\,\mathbb{P}_0(du, d\theta), \qquad Z = \int_X e^{-\Phi(u)}\,\mathbb{P}_0(du, d\theta).$$
15 Gibbs and Metropolis-within-Gibbs
Gibbs:
- sample $u_{k+1} \sim \mathbb{P}(du \,|\, \theta_k)$;
- sample $\theta_{k+1} \sim \mathbb{P}(d\theta \,|\, u_{k+1})$.

Let $Q(u, dv \,|\, \theta)$ be a Metropolis kernel invariant for $\mathbb{P}(du \,|\, \theta)$, and $Q(\theta, d\phi \,|\, u)$ a Metropolis kernel invariant for $\mathbb{P}(d\theta \,|\, u)$.

Metropolis-within-Gibbs (MwG):
- sample $u_{k+1} \sim Q(u_k, dv \,|\, \theta_k)$;
- sample $\theta_{k+1} \sim Q(\theta_k, d\phi \,|\, u_{k+1})$.

For $N$-independent behaviour, each Metropolis kernel should be well-defined in the sense of Tierney.
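The MwG scheme above can be sketched with the simplest possible ingredients: random-walk Metropolis kernels for each block, alternated so that each step leaves the corresponding conditional (and hence the joint) invariant. The toy target here is a correlated bivariate Gaussian standing in for the $(u, \theta)$ joint; all names and tuning constants are illustrative.

```python
import numpy as np

def mwg(log_joint, u0, theta0, n_steps, step_u=0.5, step_th=0.5, rng=None):
    """Metropolis-within-Gibbs: alternate RWM updates of u (theta fixed)
    and theta (u fixed); each accept-reject targets a conditional of the
    joint, so the joint density is invariant."""
    rng = np.random.default_rng(rng)
    u, th = float(u0), float(theta0)
    chain = np.empty((n_steps, 2))
    for k in range(n_steps):
        v = u + step_u * rng.standard_normal()              # u-block proposal
        if np.log(rng.uniform()) < log_joint(v, th) - log_joint(u, th):
            u = v
        phi = th + step_th * rng.standard_normal()          # theta-block proposal
        if np.log(rng.uniform()) < log_joint(u, phi) - log_joint(u, th):
            th = phi
        chain[k] = u, th
    return chain

# Toy joint: standard bivariate Gaussian with correlation rho = 0.8
rho = 0.8
log_joint = lambda u, th: -(u**2 - 2 * rho * u * th + th**2) / (2 * (1 - rho**2))
chain = mwg(log_joint, 0.0, 0.0, 30000, rng=2)
```

Strong correlation between the blocks slows this sampler down, which previews the centering discussion that follows.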
17 Illustrative Example
Find $u \in H$ from $y \in \mathbb{R}^J$.

Centred Linear Inverse Problem
$$y = Ku + \eta.$$
- Prior: $u \,|\, \theta \sim N(0, \theta^{-1} C_0)$; $\theta \sim \mathbb{P}^\theta$.
- Likelihood: $y \,|\, u, \theta \sim N(Ku, C_1)$.
- Posterior on $(u, \theta)$.

Set $w = \theta^{1/2} u$.

Non-Centred Linear Inverse Problem
- Prior: $w \sim N(0, C_0)$ independent of $\theta \sim \mathbb{P}^\theta$.
- Likelihood: $y \,|\, w, \theta \sim N(\theta^{-1/2} K w, C_1)$.
- Posterior on $(w, \theta)$.

O Papaspiliopoulos, GO Roberts, M Sköld. Stat. Sci. (2007).
Y. Yu and X.-L. Meng. JCGS (2011).
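The reparametrization $w = \theta^{1/2} u$ changes the sampler, not the model: under the prior, $(u, \theta)$ built the centred way and $(u, \theta)$ recovered from the non-centred pair via $u = \theta^{-1/2} w$ have the same joint law. A scalar Monte Carlo sketch (taking $C_0 = 1$ and a Gamma hyperprior of my choosing, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
theta = rng.gamma(4.0, 1.0, size=100_000)   # hyperparameter draws theta ~ P^theta

# Centred: u | theta ~ N(0, theta^{-1}) with C_0 = 1
u_centred = rng.standard_normal(theta.size) / np.sqrt(theta)

# Non-centred: w ~ N(0, 1) independent of theta, then u = theta^{-1/2} w
w = rng.standard_normal(theta.size)
u_noncentred = w / np.sqrt(theta)

# Same marginal law for u: both variances estimate E[1/theta]
print(u_centred.var(), u_noncentred.var())
```

The difference between the two lies entirely in the conditional dependence structure the MCMC sampler sees: in the non-centred form the unknown function ($w$) is a priori independent of $\theta$.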
18 Key Theorem: Key Idea
Approximate the target measure $\mathbb{P}(du, d\theta)$ on $H \times \mathbb{R}$ by a measure $\mathbb{P}^N(du, d\theta)$ on $\mathbb{R}^N \times \mathbb{R}$.

Theorem 2
- Centred method to sample $\mathbb{P}$: $\theta_k = \theta_0$ for all $k \ge 0$ (reducible).
- Centred method to sample $\mathbb{P}^N$: $\mathbb{E}(\theta_{k+1} - \theta_k) = O(N^{-1})$.

S. Agapiou, J. Bardsley, O. Papaspiliopoulos and AM Stuart. SIAM/ASA JUQ 2 (2014).

Key Idea: MwG will not degenerate under mesh refinement $\iff$ MwG is irreducible on $H \times \mathbb{R}$.

G. Roberts and O. Stramer. Biometrika 88 (2001).
19 Numerical Results
Autocorrelation for centred (left) and non-centred (right) chains, for $N = 32$ (black), $N = 512$ (blue) and $N = 8192$ (red).

Key Idea: non-centering leads to mesh-independent mixing.

S. Agapiou, J. Bardsley, O. Papaspiliopoulos and AM Stuart. SIAM/ASA JUQ 2 (2014).
21 Bayesian Level Set Inversion
A piecewise constant function $v$ is defined through thresholding a continuous level-set function $u$. Let $-\infty = c_0 < c_1 < \dots < c_{K-1} < c_K = \infty$. Then
$$v(x) = \sum_{k=1}^K v_k\,\chi_{\{c_{k-1} < u \le c_k\}}(x); \qquad v = F(u).$$
$F : X \to Z$ is the level-set map; $X$ continuous functions, $Z$ piecewise continuous functions.

S. Osher and J. Sethian. J. Comp. Phys. 79 (1988).
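The thresholding map $F$ is a one-liner numerically: bin the values of $u$ against the interior thresholds $c_1, \dots, c_{K-1}$ and look up the phase values $v_k$. A minimal sketch (function name and example values are mine):

```python
import numpy as np

def level_set_map(u, levels, values):
    """Level-set map F: threshold the continuous field u at
    -inf = c_0 < c_1 < ... < c_{K-1} < c_K = inf, returning the piecewise
    constant field with value v_k where c_{k-1} < u <= c_k."""
    u = np.asarray(u, dtype=float)
    # np.digitize with right=True returns k such that levels[k-1] < u <= levels[k]
    idx = np.digitize(u, levels, right=True)
    return np.asarray(values)[idx]

# K = 3 phases with interior thresholds c_1 = -1 and c_2 = 1
u = np.array([-2.0, -0.5, 0.3, 1.7])
v = level_set_map(u, levels=[-1.0, 1.0], values=[10.0, 20.0, 30.0])
print(v)  # [10. 20. 20. 30.]
```

Note the map discards the amplitude of $u$ away from the thresholds, which is one source of the identifiability issues discussed next.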
22 Issues
- $F$ is discontinuous.
- How to impose a length-scale via regularization?
- How to choose the amplitude scales $c_k$ in $F$?
23 Discontinuity of Level Set Map
$$v = F(u) := v^+ \chi_{\{u \ge 0\}}(x) + v^- \chi_{\{u < 0\}}(x).$$
(Figure: $F(\cdot)$ is continuous at one level-set function $u$ and discontinuous at another.) This discontinuity causes problems in classical level set inversion; in the Bayesian formulation it is a probability-zero event.

M. Iglesias, Y. Lu and AM Stuart. Interfaces and Free Boundary Problems, to appear. arXiv.
24 Length-Scale Matters
(Figure demonstrates the role of the length-scale in the level-set function.) This suggests a hierarchical Bayesian method to learn the length-scale.
25 Amplitude Matters
Prior: $\mathbb{P}_0(du, d\tau) = \mathbb{P}^{u|\tau}(du \,|\, \tau)\,\mathbb{P}^\tau(d\tau)$, with $\mathbb{P}(du \,|\, \tau) = N(0, C_{\mathrm{WM}(\sigma,\tau)})$. Recall
$$C_{\mathrm{WM}(\sigma,\tau,\nu)} \propto \sigma^2 \tau^{2\nu} (\tau^2 I - \triangle)^{-\nu - \frac{d}{2}}.$$

Theorem 3: for fixed $(\tau, \nu)$, the family of measures $N(0, C_{\mathrm{WM}(\cdot,\tau,\nu)})$ are mutually singular.

Key Idea: hierarchical MwG algorithms to learn the amplitude will behave poorly under mesh refinement, because they are not well-defined in the sense of Tierney.
26 Hierarchical Priors: Length/Amplitude Coupling
Hence choose $\sigma = \tau^{-\nu}$ and define
$$C_{\tau,\nu} \propto (\tau^2 I - \triangle)^{-\nu - \frac{d}{2}}.$$
Prior: $\mathbb{P}_0(du, d\tau) = \mathbb{P}^{u|\tau}(du \,|\, \tau)\,\mathbb{P}^\tau(d\tau)$, with $\mathbb{P}_0(du \,|\, \tau) = N(0, C_{\tau,\nu})$.

Theorem 4: for fixed $\nu$, the family of measures $N(0, C_{\cdot,\nu})$ are mutually equivalent.

Key Idea: this suggests the need to scale the thresholds in $F$ by $\tau^{-\nu}$. Let $-\infty = c_0 < c_1 < \dots < c_{K-1} < c_K = \infty$ and
$$v(x) = \sum_{k=1}^K v_k\,\chi_{\{c_{k-1} < \tau^\nu u \le c_k\}}(x); \qquad v = F(u, \tau).$$

M. Dunlop, M. Iglesias and A. M. Stuart. arXiv.
27 Fixed Levels (Movie)
The wrong length-scale gives problems.
28 Rescaled Levels (Movie)
Learn the length-scale hierarchically (amplitude coupled). Centred: invert for $(u, \theta)$, where $C_\theta^{-1/2} u = \xi$.
29 Non-Centering and Rescaled Levels (Movie)
Learn the length-scale and regularity hierarchically. Non-centred: invert for $(\xi, \theta)$, where $C_\theta^{-1/2} u = \xi$.
30 Groundwater Flow Application (Movie)
Forward Problem: Darcy Flow
Let $Z^+ := \{v \in Z : \operatorname{essinf}_{x \in D} v > 0\}$. Given $\kappa \in Z^+$, find $y := G(\kappa)$. Here $y_j = l_j(p)$, $V := H_0^1(D)$, $l_j \in V^*$, $j = 1, \dots, J$, $f \in V^*$, and
$$-\nabla \cdot (\kappa \nabla p) = f \ \text{in } D, \qquad p = 0 \ \text{on } \partial D.$$
Let $\eta \in \mathbb{R}^J$ be a realization of an observational noise.

Inverse Problem: given that $\kappa = F(u)$, $u \in X$, and $y \in \mathbb{R}^J$, find $u$ (and hence $\kappa$):
$$y = G(\kappa) + \eta.$$
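A toy version of the forward map $G$ can be built in one dimension: solve $-(\kappa p')' = f$ on $(0,1)$ with $p(0) = p(1) = 0$ by finite differences, evaluating $\kappa$ at cell faces. This is a stand-in sketch for the Darcy solve on the domain $D$ in the slides (discretization choices and names are mine).

```python
import numpy as np

def solve_darcy_1d(kappa, f, n=200):
    """Finite-difference solve of -(kappa p')' = f on (0, 1), p(0) = p(1) = 0.
    kappa and f are callables; kappa is evaluated at cell faces."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)                           # interior nodes
    k_face = kappa(np.linspace(h / 2, 1.0 - h / 2, n + 1))   # faces x_{i +- 1/2}
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = (k_face[i] + k_face[i + 1]) / h**2
        if i > 0:
            A[i, i - 1] = -k_face[i] / h**2
        if i < n - 1:
            A[i, i + 1] = -k_face[i + 1] / h**2
    p = np.linalg.solve(A, f(x))
    return x, p

# Sanity check: kappa = 1, f = 1 has exact solution p(x) = x(1 - x)/2
x, p = solve_darcy_1d(lambda s: np.ones_like(s), lambda s: np.ones_like(s))
```

Observations $y_j = l_j(p)$ can then be taken as, say, point values of the computed pressure, and a piecewise constant $\kappa = F(u)$ plugs directly into the `kappa` argument.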
32 Highlights
1. BAYESIAN INVERSION FOR FUNCTIONS. Well-posed inverse problems: the posterior is Lipschitz in its parameters. Algorithms defined on function space are robust to discretization.
2. HIERARCHICAL BAYESIAN INVERSION. Whittle-Matérn priors: amplitude, length-scale and regularity. Metropolis-within-Gibbs. The role of non-centering.
3. LEVEL SET. Methodology to invert for discontinuous functions. Learning the length-scale is important. Non-centering can help.
Data assimilation as an optimal control problem and applications to UQ Walter Acevedo, Angwenyi David, Jana de Wiljes & Sebastian Reich Universität Potsdam/ University of Reading IPAM, November 13th 2017
More informationApproximate Bayesian Computation: a simulation based approach to inference
Approximate Bayesian Computation: a simulation based approach to inference Richard Wilkinson Simon Tavaré 2 Department of Probability and Statistics University of Sheffield 2 Department of Applied Mathematics
More informationA Geometric Interpretation of the Metropolis Hastings Algorithm
Statistical Science 2, Vol. 6, No., 5 9 A Geometric Interpretation of the Metropolis Hastings Algorithm Louis J. Billera and Persi Diaconis Abstract. The Metropolis Hastings algorithm transforms a given
More information17 : Optimization and Monte Carlo Methods
10-708: Probabilistic Graphical Models Spring 2017 17 : Optimization and Monte Carlo Methods Lecturer: Avinava Dubey Scribes: Neil Spencer, YJ Choe 1 Recap 1.1 Monte Carlo Monte Carlo methods such as rejection
More informationMarkov chain Monte Carlo methods
Markov chain Monte Carlo methods Youssef Marzouk Department of Aeronautics and Astronatics Massachusetts Institute of Technology ymarz@mit.edu 22 June 2015 Marzouk (MIT) IMA Summer School 22 June 2015
More informationBayesian inference of random fields represented with the Karhunen-Loève expansion
Bayesian inference of random fields represented with the Karhunen-Loève expansion Felipe Uribe a,, Iason Papaioannou a, Wolfgang Betz a, Daniel Straub a a Engineering Risk Analysis Group, Technische Universität
More informationMinicourse on: Markov Chain Monte Carlo: Simulation Techniques in Statistics
Minicourse on: Markov Chain Monte Carlo: Simulation Techniques in Statistics Eric Slud, Statistics Program Lecture 1: Metropolis-Hastings Algorithm, plus background in Simulation and Markov Chains. Lecture
More informationBayesian Inference for Clustered Extremes
Newcastle University, Newcastle-upon-Tyne, U.K. lee.fawcett@ncl.ac.uk 20th TIES Conference: Bologna, Italy, July 2009 Structure of this talk 1. Motivation and background 2. Review of existing methods Limitations/difficulties
More informationErgodicity in data assimilation methods
Ergodicity in data assimilation methods David Kelly Andy Majda Xin Tong Courant Institute New York University New York NY www.dtbkelly.com April 15, 2016 ETH Zurich David Kelly (CIMS) Data assimilation
More informationEfficient adaptive covariate modelling for extremes
Efficient adaptive covariate modelling for extremes Slides at www.lancs.ac.uk/ jonathan Matthew Jones, David Randell, Emma Ross, Elena Zanini, Philip Jonathan Copyright of Shell December 218 1 / 23 Structural
More informationMarkov chain Monte Carlo methods in atmospheric remote sensing
1 / 45 Markov chain Monte Carlo methods in atmospheric remote sensing Johanna Tamminen johanna.tamminen@fmi.fi ESA Summer School on Earth System Monitoring and Modeling July 3 Aug 11, 212, Frascati July,
More informationMARKOV CHAIN MONTE CARLO
MARKOV CHAIN MONTE CARLO RYAN WANG Abstract. This paper gives a brief introduction to Markov Chain Monte Carlo methods, which offer a general framework for calculating difficult integrals. We start with
More information