Multilevel Sequential 2 Monte Carlo for Bayesian Inverse Problems


Jonas Latz, Technische Universität München, Fakultät für Mathematik, Lehrstuhl für Numerische Mathematik, jonas.latz@tum.de
November 22nd, 2017. Seminar Stochastics, Statistics and Numerical Analysis, Universität Mannheim

Joint work with Elisabeth Ullmann (Lehrstuhl für Numerische Mathematik, TUM) and Iason Papaioannou (Engineering Risk Analysis, Bau Geo Umwelt, TUM).
Acknowledgments: This work was supported by the Deutsche Forschungsgemeinschaft (DFG) and TU München (TUM) through the International Graduate School of Science and Engineering (IGSSE) at TUM within the project BAYES. The computing resources were provided by the Leibniz-Rechenzentrum (LRZ) der Bayerischen Akademie der Wissenschaften.

Outline: Motivation; Sequential Monte Carlo Samplers; Multilevel Sequential² Monte Carlo; Numerical Experiments; Conclusions.

Motivation
Given an inverse problem: find θ ∈ X such that G(θ) + η = y, (IP)
where G : X → Y is the forward response operator, θ ∈ X is the unknown parameter, η ~ N(0, Γ) is observational noise and y ∈ Y is the observed data.

Motivation
Example: typically, G = O ∘ 𝒢, where 𝒢(θ) is the solution operator of a PDE,
−∇ · (exp(θ)∇p) = f (on D), p = 0 (on ∂D), (PDE)
and O is the observation operator mapping p ↦ (p(x_i) : i = 1, …, N_obs) ∈ Y.
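To make the forward response operator concrete, here is a minimal sketch in a 1D analogue of (PDE) (assumptions: the domain is (0, 1), a uniform finite-difference grid, homogeneous Dirichlet data; the function names and observation indices are illustrative, not part of the talk):

```python
import numpy as np

def forward(theta, f, n=64, obs_idx=(16, 32, 48)):
    """1D analogue of the forward response operator G = O o G: solve
    -(exp(theta) p')' = f on (0,1) with p(0) = p(1) = 0 by finite differences
    and observe p at a few grid points."""
    h = 1.0 / n
    x_mid = (np.arange(n) + 0.5) * h            # cell midpoints for the coefficient
    a = np.exp(theta(x_mid))                    # diffusion coefficient exp(theta)
    A = np.zeros((n - 1, n - 1))                # stiffness matrix, interior nodes
    for i in range(n - 1):
        A[i, i] = (a[i] + a[i + 1]) / h ** 2
        if i > 0:
            A[i, i - 1] = -a[i] / h ** 2
        if i < n - 2:
            A[i, i + 1] = -a[i + 1] / h ** 2
    x_int = np.arange(1, n) * h                 # interior grid points
    p = np.linalg.solve(A, f(x_int))
    return p[np.array(obs_idx) - 1]             # observation operator O

# theta = 0 reduces to -p'' = 1, with exact solution p(x) = x(1-x)/2
obs = forward(lambda x: np.zeros_like(x), lambda x: np.ones_like(x))
```

For θ ≡ 0 the scheme reproduces the exact solution at the nodes, which makes the sketch easy to check.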

Motivation
We approach (IP) the Bayesian way. So assume θ ∈ L²(Ω, A, P; X) with
θ ~ µ₀ := N(m₀, C₀). (Prior)
Find µ^y := P(θ ∈ · | G(θ) + η = y), (BIP)
given by
dµ^y/dµ₀(θ) ∝ L(y | θ) := exp(−½ ‖Γ^{−1/2}(G(θ) − y)‖²_Y). (Bayes' Rule)
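In code, evaluating the unnormalised posterior density from (Bayes' Rule) only requires the Gaussian log-likelihood; a sketch, assuming a finite-dimensional data space with noise covariance matrix Gamma (names are illustrative):

```python
import numpy as np

def log_likelihood(theta, y, G, Gamma):
    """Gaussian log-likelihood from Bayes' Rule:
    log L(y | theta) = -1/2 ||Gamma^{-1/2} (G(theta) - y)||^2 (up to a constant)."""
    r = G(theta) - y
    return float(-0.5 * r @ np.linalg.solve(Gamma, r))

# toy check: scalar model G(theta) = theta, unit noise covariance
ll = log_likelihood(np.array([0.0]), np.array([1.0]), lambda t: t, np.eye(1))
```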

Typical Approach: Importance Sampling
Task: integrate a quantity of interest Q : X → R w.r.t. µ^y.
Idea: consider the identity
E_{µ^y}[Q] = ∫_X Q dµ^y = ∫_X Q (dµ^y/dµ₀) dµ₀ = E_{µ₀}[Q · dµ^y/dµ₀].
Hence, integrals w.r.t. µ^y can be expressed as integrals w.r.t. µ₀, and we can use vanilla Monte Carlo to approximate the integral using µ₀-distributed particles.
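The identity above can be sketched as self-normalised importance sampling with prior particles (a toy 1D Gaussian setting; all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def importance_estimate(Q, log_like, prior_sample, J=5000):
    """Self-normalised importance sampling: approximate E_{mu^y}[Q] with
    mu_0-distributed particles weighted by d mu^y / d mu_0 propto L(y|theta)."""
    theta = prior_sample(J)                 # particles from the prior mu_0
    logw = log_like(theta)                  # unnormalised log-weights
    w = np.exp(logw - logw.max())           # stabilise before exponentiating
    w /= w.sum()                            # self-normalise
    return float(np.sum(w * Q(theta)))

# toy check: prior N(0,1), likelihood exp(-(theta-1)^2/2) => posterior N(1/2, 1/2)
est = importance_estimate(
    Q=lambda t: t,
    log_like=lambda t: -0.5 * (t - 1.0) ** 2,
    prior_sample=lambda J: rng.standard_normal(J),
)
```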


Outline: Motivation; Sequential Monte Carlo Samplers (Sequential Monte Carlo with Tempering; Multilevel Bridging); Multilevel Sequential² Monte Carlo; Numerical Experiments; Conclusions.

Sequential Monte Carlo Samplers
Task: sample from a sequence of measures µ₀, µ₁, …, µ_K, where we can sample from µ₀, and µ_k and µ₀ are equivalent (k ∈ {1, …, K}).
Idea: apply importance sampling sequentially to update µ_k → µ_{k+1}, where µ_k ≈ µ_{k+1}. (Del Moral et al. 2006) [3], [2]

Tempering
Employ a sequence µ₀, …, µ_K, where µ₀ is the prior distribution, µ_K = µ^y is the posterior distribution, and each µ_k is accessible by importance sampling from µ_{k−1} with a small number of samples.
Use a tempering of the likelihood (β_k ∈ [0, 1], β₀ = 0, β_K = 1):
dµ_k/dµ₀(θ) ∝ L(y | θ)^{β_k}. (Neal 2001, Beskos et al. 2015) [6]
The (β_k : k = 1, …, K) can be determined a priori or on the fly.
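The tempering sequence can be sketched via the incremental weights dµ_{k+1}/dµ_k ∝ L^{β_{k+1} − β_k}; a full SMC sampler would add resampling and MCMC move steps between temperatures, which this sketch omits:

```python
import numpy as np

rng = np.random.default_rng(1)

def tempered_weights(log_like, theta, betas):
    """Track the incremental importance weights of a tempering sequence:
    d mu_{k+1} / d mu_k (theta) propto L(y|theta)^(beta_{k+1} - beta_k).
    A full SMC sampler would resample and apply MCMC moves between steps."""
    logw = np.zeros(len(theta))
    ll = log_like(theta)
    for b0, b1 in zip(betas[:-1], betas[1:]):
        logw += (b1 - b0) * ll              # accumulate incremental log-weights
    w = np.exp(logw - logw.max())
    return w / w.sum()

# toy check: prior N(0,1), likelihood exp(-(theta-1)^2/2) => posterior N(1/2, 1/2)
theta = rng.standard_normal(20000)
w = tempered_weights(lambda t: -0.5 * (t - 1.0) ** 2, theta, [0.0, 0.25, 0.5, 1.0])
post_mean = float(np.sum(w * theta))
```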


Outline: Motivation; Sequential Monte Carlo Samplers (Sequential Monte Carlo with Tempering; Multilevel Bridging); Multilevel Sequential² Monte Carlo; Numerical Experiments; Conclusions.

(Multilevel) Bridging
In realistic problems we approximate G_l ≈ G, where l ∈ {1, …, N_L} reflects the discretisation complexity and approximation accuracy of G.
Task: increase the complexity of µ^y_l by updating directly µ^y_l → µ^y_{l+1}.
SMC sampler from (Koutsourelakis 2009, Del Moral et al. 2006) [4]:
dµ_k/dµ₀(θ) ∝ L_l(y | θ)^{1−ζ_k} · L_{l+1}(y | θ)^{ζ_k},
where µ₀ is the underlying prior distribution. The (ζ_k : k = 1, …, K) can be determined a priori or on the fly.
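The bridging update can be sketched the same way, with incremental weights (L_{l+1}/L_l)^{ζ_{k+1} − ζ_k} (again omitting resampling and moves; the inputs are illustrative):

```python
import numpy as np

def bridging_weights(ll_l, ll_lp1, zetas):
    """Bridging between levels l and l+1: the incremental weight of the step
    zeta_k -> zeta_{k+1} is (L_{l+1}(y|theta)/L_l(y|theta))^(zeta_{k+1}-zeta_k),
    so the accumulated log-weight telescopes to zeta_K * (ll_lp1 - ll_l)."""
    logw = np.zeros_like(ll_l)
    for z0, z1 in zip(zetas[:-1], zetas[1:]):
        logw += (z1 - z0) * (ll_lp1 - ll_l)
    w = np.exp(logw - logw.max())
    return w / w.sum()

# three particles with log-likelihoods on the two levels
ll_l = np.array([-0.5, -1.0, -2.0])
ll_lp1 = np.array([-0.6, -0.9, -2.5])
w = bridging_weights(ll_l, ll_lp1, [0.0, 0.3, 1.0])
```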

Figure: Bridging between two distributions µ and µ̃.

Figure: Update paths in the plane of discretisation level l ∈ {1, …, N_L} (horizontal) and inverse temperature β_k (vertical), from the prior µ₀ (β = 0) to the target distribution µ^y (β_K = 1), for SMC and MLB.

However:
Figure: Bridging between two distributions µ and µ̃ with a high discrepancy.

Outline: Motivation; Sequential Monte Carlo Samplers; Multilevel Sequential² Monte Carlo (Basics; Adaptivity); Numerical Experiments; Conclusions.

Multilevel Sequential² Monte Carlo
Basic idea: consider two different update mechanisms at the same time: update the inverse temperature (tempering) or update the level (bridging).
Reference: see (L., Papaioannou, Ullmann 2017; submitted to JCP). [5]

Figure: Update paths in the plane of discretisation level l ∈ {1, …, L} (horizontal) and inverse temperature β_k (vertical), from the prior µ₀ (β = 0) to the target distribution µ^y (β_K = 1), for SMC, MLB and MLS²MC.

Outline: Motivation; Sequential Monte Carlo Samplers; Multilevel Sequential² Monte Carlo (Basics; Adaptivity); Numerical Experiments; Conclusions.

Adaptive Bridging and Tempering
Reminder: we construct the SMC sequences by either tempering,
dµ_k/dµ₀(θ) ∝ L(y | θ)^{β_k},
or multilevel bridging,
dµ_k/dµ₀(θ) ∝ L_l(y | θ)^{1−ζ_k} · L_{l+1}(y | θ)^{ζ_k}.
How do we choose β₁, …, β_K, resp. ζ₁, …, ζ_K?

Adaptive Bridging and Tempering
When applying importance sampling, we lose stochastic accuracy: unweighted particles (Monte Carlo accuracy) vs. weighted particles (loss due to degeneracy of the particles). In terms of sample sizes: unweighted particles (actual sample size) vs. weighted particles (effective sample size). The effective sample size (ESS) depends on β_k and can be computed cheaply.
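The slide does not fix a formula, so as an assumption here is the standard ESS estimator computed from the unnormalised log-weights:

```python
import numpy as np

def ess(logw):
    """Effective sample size ESS = (sum_j w_j)^2 / sum_j w_j^2 of (unnormalised)
    importance weights, computed stably from the log-weights."""
    w = np.exp(logw - np.max(logw))
    return float(w.sum() ** 2 / np.sum(w ** 2))
```

Equal weights give ESS = J; a single dominant weight gives ESS ≈ 1.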

Adaptive Bridging and Tempering
Choose
β_k = argmin_β (ESS(β) − J/(1 + τ²))²,
for some given τ > 0.
Well, but: this introduces a bias into the estimation of the model evidence. Still, adaptive bridging/tempering can be shown to converge to the correct posterior measure. (Beskos et al. 2016) [1]
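Since ESS(β) is in practice monotonically decreasing in β, the argmin can be found by bisection; a sketch (the function names, the synthetic log-likelihood values and the stopping tolerance are illustrative):

```python
import numpy as np

def ess_of_beta(ll, beta_old, beta):
    """ESS of the incremental weights L(y|theta)^(beta - beta_old)."""
    logw = (beta - beta_old) * ll
    w = np.exp(logw - logw.max())
    return w.sum() ** 2 / np.sum(w ** 2)

def next_beta(ll, beta_old, tau, tol=1e-6):
    """Bisect for the largest beta in (beta_old, 1] with ESS(beta) >= J/(1+tau^2),
    a root-finding version of the argmin on the slide (valid when ESS(beta)
    decreases in beta). ll holds log L(y | theta_j) per particle."""
    J = len(ll)
    target = J / (1.0 + tau ** 2)
    if ess_of_beta(ll, beta_old, 1.0) >= target:
        return 1.0                          # we can jump straight to beta = 1
    lo, hi = beta_old, 1.0                  # ESS(lo) >= target > ESS(hi)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if ess_of_beta(ll, beta_old, mid) >= target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

ll = -np.linspace(0.0, 50.0, 1000)          # synthetic log-likelihood values
beta1 = next_beta(ll, 0.0, tau=1.0)         # first adaptive temperature
```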

Adaptive update strategy: Bridging vs. Tempering
Let β_k ∈ (0, 1) be the current inverse temperature and l be the current discretisation level. What do we do next: bridging l → l + 1 or tempering β_k → β_{k+1}?
Strategy: compute ESS(1) of the update l → l + 1 (this measures the similarity of µ^y_l and µ^y_{l+1}).
If ESS(1) < J/(1 + τ²) (i.e. µ^y_l and µ^y_{l+1} are not very similar), update the level l → l + 1 (otherwise increasing β_k would increase the differences of µ^y_l and µ^y_{l+1} even more).
If ESS(1) ≥ J/(1 + τ²) (i.e. µ^y_l and µ^y_{l+1} are very similar), update the inverse temperature β_k → β_{k+1} (otherwise increasing l would increase the cost of the future tempering).
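The decision rule reduces to a single comparison; a sketch (the max_level guard is an assumption motivated by the conclusions, not stated on this slide):

```python
def choose_update(ess_bridge, J, tau, level, max_level):
    """MLS2MC decision between bridging and tempering: bridge to level l+1 if
    ESS(1) of the level update falls below J/(1+tau^2) (the level posteriors
    differ too much), otherwise keep tempering on the current, cheaper level.
    The max_level guard (stop bridging at the finest level) is an assumption."""
    target = J / (1.0 + tau ** 2)
    if level < max_level and ess_bridge < target:
        return "bridge"                     # update level l -> l+1
    return "temper"                         # update inverse temperature

decision_far = choose_update(ess_bridge=100.0, J=1000, tau=1.0, level=1, max_level=5)
decision_near = choose_update(ess_bridge=900.0, J=1000, tau=1.0, level=1, max_level=5)
```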

Outline: Motivation; Sequential Monte Carlo Samplers; Multilevel Sequential² Monte Carlo; Numerical Experiments (Model; Estimating a random field; Estimating the model evidence; Computational Cost); Conclusions.

Model
Consider again
−∇ · (exp(θ)∇p) = f (on D), p = 0 (on ∂D), (PDE)
where D = (0, 1)² and f contains nine sources. The field θ ∈ C¹(D; R) shall be estimated based on N_obs = 25 noisy observations.

Figure: Measurement locations and actual pressure based on the true underlying parameter θ_true.

Prior Random Field
µ₀ = N(0, C₀), where C₀ is a Matérn-type covariance operator with smoothness ν = 1.5 and correlation length λ = 0.65. The random field is discretised with a truncated Karhunen–Loève expansion with N_sto = 10 KL terms.
Figure: Samples of the prior random field µ₀.

Likelihood
The data y was generated using noise η ~ N(0, Id). The noise assumptions for the likelihood were chosen more conservatively:
L(y | θ) ∝ exp(−½ ‖Γ₁^{−1/2}(y − G(θ))‖²), (Ex. 1)
L(y | θ) ∝ exp(−½ ‖Γ₂^{−1/2}(y − G(θ))‖²), (Ex. 2)
with a different assumed noise covariance in each of the two experiments.

Simulation settings
We applied MLS²MC, Tempering and MLB using J ∈ {156, 312, 625, 1250, 2500} particles and τ ∈ {0.5, 1} (i.e. ESS ∈ {0.8J, 0.5J} in each update). The PDE is evaluated with a mesh size of h_l = 2^{−(l+2)} on level l ∈ {1, …, 5}. MLB was not possible given the noise standard deviation: µ^y_1 and µ^y_2 were numerically singular.

Outline: Motivation; Sequential Monte Carlo Samplers; Multilevel Sequential² Monte Carlo; Numerical Experiments (Model; Estimating a random field; Estimating the model evidence; Computational Cost); Conclusions.

Error Measure: Posterior measure
Let µ, ν be probability measures on (R, B(R)). We define the Kolmogorov–Smirnov (KS) distance between µ and ν by
d_KS(µ, ν) = sup_{x ∈ R} |µ((−∞, x]) − ν((−∞, x])|.
As an error measure, we consider the d_KS of (SMC, MLS²MC) and (SMC, MLB), as well as (SMC, SMC) as a reference, for all pairs of simulation results.
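For samples, d_KS can be computed from the empirical CDFs evaluated on the pooled sample points; a sketch, assuming equally weighted particles:

```python
import numpy as np

def ks_distance(x, y):
    """Kolmogorov-Smirnov distance between the empirical CDFs of two samples:
    sup over the pooled sample points of |F_x - F_y|."""
    grid = np.sort(np.concatenate([x, y]))
    Fx = np.searchsorted(np.sort(x), grid, side="right") / len(x)
    Fy = np.searchsorted(np.sort(y), grid, side="right") / len(y)
    return float(np.max(np.abs(Fx - Fy)))
```

Identical samples give distance 0; disjoint samples give distance 1.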

Figure: KS distances between the approximated posterior distributions, given Γ = Id.

Figure: KS distances between the approximated posterior distributions, given Γ = Id.

Outline: Motivation; Sequential Monte Carlo Samplers; Multilevel Sequential² Monte Carlo; Numerical Experiments (Model; Estimating a random field; Estimating the model evidence; Computational Cost); Conclusions.

Model Evidence
The model evidence is the normalising constant of the posterior with respect to the prior:
Z_y = ∫ L(y | θ) dµ₀(θ).
It is in general considered difficult to estimate (with importance sampling, MCMC), but can be determined accurately in any SMC method. The estimator is biased if (β_k : k = 1, …, K) is picked adaptively. The evidence is used in Bayesian model selection.
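In an SMC method the evidence is accumulated as the product, over the tempering steps, of the weighted means of the unnormalised incremental weights; a sketch without resampling and move steps (in which case the product telescopes to plain importance sampling):

```python
import numpy as np

rng = np.random.default_rng(2)

def smc_evidence(ll, betas):
    """Evidence estimate Z_y accumulated over the tempering steps as the
    product of the weighted means of the unnormalised incremental weights.
    Without resampling/moves (as here) the product telescopes to plain
    importance sampling; a full SMC run uses the same bookkeeping."""
    J = len(ll)
    W = np.full(J, 1.0 / J)                 # normalised particle weights
    Z = 1.0
    for b0, b1 in zip(betas[:-1], betas[1:]):
        incr = np.exp((b1 - b0) * ll)       # unnormalised incremental weights
        mean_incr = np.sum(W * incr)
        Z *= mean_incr
        W = W * incr / mean_incr            # re-normalise the weights
    return float(Z)

# Gaussian toy: prior N(0,1), L(y|theta) = exp(-theta^2/2) => Z_y = 1/sqrt(2)
theta = rng.standard_normal(50000)
Z = smc_evidence(-0.5 * theta ** 2, [0.0, 0.3, 0.7, 1.0])
```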

Estimation results: Model Evidence
Figure: Empirical CDFs of the estimated model evidences after 50 runs of MLS²MC, SMC and MLB, for τ = 0.5 and τ = 1. J = 2500, noise covariance Γ = Id.

Estimation results: Model Evidence
Figure: Empirical CDFs of the estimated model evidences after 50 runs of MLS²MC and SMC, for τ = 0.5 and τ = 1. J = 2500, noise covariance Γ = Id.

Outline: Motivation; Sequential Monte Carlo Samplers; Multilevel Sequential² Monte Carlo; Numerical Experiments (Model; Estimating a random field; Estimating the model evidence; Computational Cost); Conclusions.

Computational Cost
We measure computational cost in terms of the theoretical cost of PDE evaluations on the different levels. Assumption: one evaluation of the model G_l requires a computational cost of C_l = 4^{l−5}; hence, C_{N_L} = 1.

Figure: Computational cost of MLS²MC, SMC and MLB, given noise covariance Γ = Id.

Figure: Computational cost of MLS²MC and SMC, given noise covariance Γ = Id.

Computational Cost vs. Accuracy
Figure: Computational cost vs. accuracy of MLS²MC, MLB and SMC, given noise covariance Γ = Id.

Computational Cost vs. Accuracy
Figure: Computational cost vs. accuracy of MLS²MC and SMC, given noise covariance Γ = Id.

Outline: Motivation; Sequential Monte Carlo Samplers; Multilevel Sequential² Monte Carlo; Numerical Experiments; Conclusions.

Conclusions
The presented method is more efficient than single-level SMC, can be used consistently with black-box models, contains an adaptive method to decide whether to update the level or the inverse temperature, and can estimate the model evidence.
Moreover, the presented method works well in high dimensions (tested up to 320 KL terms), can decide adaptively when to stop updating the discretisation level, and does not require any parameter tuning.

References
[1] Beskos, A., Jasra, A., Kantas, N., and Thiery, A. On the convergence of adaptive sequential Monte Carlo methods. Ann. Appl. Probab. 26, 2 (2016).
[2] Beskos, A., Jasra, A., Muzaffer, E. A., and Stuart, A. M. Sequential Monte Carlo methods for Bayesian elliptic inverse problems. Stat. Comput. 25, 4 (2015).
[3] Del Moral, P., Doucet, A., and Jasra, A. Sequential Monte Carlo samplers. J. R. Statist. Soc. B 68, 3 (2006).
[4] Koutsourelakis, P. S. Accurate Uncertainty Quantification using inaccurate Computational Models. SIAM J. Sci. Comput. 31, 5 (2009).
[5] Latz, J., Papaioannou, I., and Ullmann, E. The Multilevel Sequential² Monte Carlo Sampler for Bayesian Inverse Problems. ArXiv e-prints (2017).
[6] Neal, R. M. Annealed importance sampling. Stat. Comp. 11, 2 (2001).


More information

Tutorial on ABC Algorithms

Tutorial on ABC Algorithms Tutorial on ABC Algorithms Dr Chris Drovandi Queensland University of Technology, Australia c.drovandi@qut.edu.au July 3, 2014 Notation Model parameter θ with prior π(θ) Likelihood is f(ý θ) with observed

More information

Graphical Models for Collaborative Filtering

Graphical Models for Collaborative Filtering Graphical Models for Collaborative Filtering Le Song Machine Learning II: Advanced Topics CSE 8803ML, Spring 2012 Sequence modeling HMM, Kalman Filter, etc.: Similarity: the same graphical model topology,

More information

SMC 2 : an efficient algorithm for sequential analysis of state-space models

SMC 2 : an efficient algorithm for sequential analysis of state-space models SMC 2 : an efficient algorithm for sequential analysis of state-space models N. CHOPIN 1, P.E. JACOB 2, & O. PAPASPILIOPOULOS 3 1 ENSAE-CREST 2 CREST & Université Paris Dauphine, 3 Universitat Pompeu Fabra

More information

Notes on pseudo-marginal methods, variational Bayes and ABC

Notes on pseudo-marginal methods, variational Bayes and ABC Notes on pseudo-marginal methods, variational Bayes and ABC Christian Andersson Naesseth October 3, 2016 The Pseudo-Marginal Framework Assume we are interested in sampling from the posterior distribution

More information

L09. PARTICLE FILTERING. NA568 Mobile Robotics: Methods & Algorithms

L09. PARTICLE FILTERING. NA568 Mobile Robotics: Methods & Algorithms L09. PARTICLE FILTERING NA568 Mobile Robotics: Methods & Algorithms Particle Filters Different approach to state estimation Instead of parametric description of state (and uncertainty), use a set of state

More information

Sequential Monte Carlo Methods for High-Dimensional Inverse Problems: A case study for the Navier-Stokes equations

Sequential Monte Carlo Methods for High-Dimensional Inverse Problems: A case study for the Navier-Stokes equations SIAM/ASA J. UNCERTAINTY QUANTIFICATION Vol. xx, pp. x c xxxx Society for Industrial and Applied Mathematics x x Sequential Monte Carlo Methods for High-Dimensional Inverse Problems: A case study for the

More information

Bayesian Methods for Machine Learning

Bayesian Methods for Machine Learning Bayesian Methods for Machine Learning CS 584: Big Data Analytics Material adapted from Radford Neal s tutorial (http://ftp.cs.utoronto.ca/pub/radford/bayes-tut.pdf), Zoubin Ghahramni (http://hunch.net/~coms-4771/zoubin_ghahramani_bayesian_learning.pdf),

More information

On a Data Assimilation Method coupling Kalman Filtering, MCRE Concept and PGD Model Reduction for Real-Time Updating of Structural Mechanics Model

On a Data Assimilation Method coupling Kalman Filtering, MCRE Concept and PGD Model Reduction for Real-Time Updating of Structural Mechanics Model On a Data Assimilation Method coupling, MCRE Concept and PGD Model Reduction for Real-Time Updating of Structural Mechanics Model 2016 SIAM Conference on Uncertainty Quantification Basile Marchand 1, Ludovic

More information

Recent Advances in Bayesian Inference for Inverse Problems

Recent Advances in Bayesian Inference for Inverse Problems Recent Advances in Bayesian Inference for Inverse Problems Felix Lucka University College London, UK f.lucka@ucl.ac.uk Applied Inverse Problems Helsinki, May 25, 2015 Bayesian Inference for Inverse Problems

More information

Pattern Recognition and Machine Learning. Bishop Chapter 2: Probability Distributions

Pattern Recognition and Machine Learning. Bishop Chapter 2: Probability Distributions Pattern Recognition and Machine Learning Chapter 2: Probability Distributions Cécile Amblard Alex Kläser Jakob Verbeek October 11, 27 Probability Distributions: General Density Estimation: given a finite

More information

Bayesian inverse problems with Laplacian noise

Bayesian inverse problems with Laplacian noise Bayesian inverse problems with Laplacian noise Remo Kretschmann Faculty of Mathematics, University of Duisburg-Essen Applied Inverse Problems 2017, M27 Hangzhou, 1 June 2017 1 / 33 Outline 1 Inverse heat

More information

Universität Potsdam Institut für Informatik Lehrstuhl Maschinelles Lernen. Bayesian Learning. Tobias Scheffer, Niels Landwehr

Universität Potsdam Institut für Informatik Lehrstuhl Maschinelles Lernen. Bayesian Learning. Tobias Scheffer, Niels Landwehr Universität Potsdam Institut für Informatik Lehrstuhl Maschinelles Lernen Bayesian Learning Tobias Scheffer, Niels Landwehr Remember: Normal Distribution Distribution over x. Density function with parameters

More information

Sequential Monte Carlo Algorithms for Bayesian Sequential Design

Sequential Monte Carlo Algorithms for Bayesian Sequential Design Sequential Monte Carlo Algorithms for Bayesian Sequential Design Dr Queensland University of Technology c.drovandi@qut.edu.au Collaborators: James McGree, Tony Pettitt, Gentry White Acknowledgements: Australian

More information

Chapter 12 PAWL-Forced Simulated Tempering

Chapter 12 PAWL-Forced Simulated Tempering Chapter 12 PAWL-Forced Simulated Tempering Luke Bornn Abstract In this short note, we show how the parallel adaptive Wang Landau (PAWL) algorithm of Bornn et al. (J Comput Graph Stat, to appear) can be

More information

A Backward Particle Interpretation of Feynman-Kac Formulae

A Backward Particle Interpretation of Feynman-Kac Formulae A Backward Particle Interpretation of Feynman-Kac Formulae P. Del Moral Centre INRIA de Bordeaux - Sud Ouest Workshop on Filtering, Cambridge Univ., June 14-15th 2010 Preprints (with hyperlinks), joint

More information

Multivariate Normal & Wishart

Multivariate Normal & Wishart Multivariate Normal & Wishart Hoff Chapter 7 October 21, 2010 Reading Comprehesion Example Twenty-two children are given a reading comprehsion test before and after receiving a particular instruction method.

More information

Bayesian Inverse problem, Data assimilation and Localization

Bayesian Inverse problem, Data assimilation and Localization Bayesian Inverse problem, Data assimilation and Localization Xin T Tong National University of Singapore ICIP, Singapore 2018 X.Tong Localization 1 / 37 Content What is Bayesian inverse problem? What is

More information

The Kalman Filter ImPr Talk

The Kalman Filter ImPr Talk The Kalman Filter ImPr Talk Ged Ridgway Centre for Medical Image Computing November, 2006 Outline What is the Kalman Filter? State Space Models Kalman Filter Overview Bayesian Updating of Estimates Kalman

More information

Robust MCMC Sampling with Non-Gaussian and Hierarchical Priors

Robust MCMC Sampling with Non-Gaussian and Hierarchical Priors Division of Engineering & Applied Science Robust MCMC Sampling with Non-Gaussian and Hierarchical Priors IPAM, UCLA, November 14, 2017 Matt Dunlop Victor Chen (Caltech) Omiros Papaspiliopoulos (ICREA,

More information

An introduction to Sequential Monte Carlo

An introduction to Sequential Monte Carlo An introduction to Sequential Monte Carlo Thang Bui Jes Frellsen Department of Engineering University of Cambridge Research and Communication Club 6 February 2014 1 Sequential Monte Carlo (SMC) methods

More information

Annealing Between Distributions by Averaging Moments

Annealing Between Distributions by Averaging Moments Annealing Between Distributions by Averaging Moments Chris J. Maddison Dept. of Comp. Sci. University of Toronto Roger Grosse CSAIL MIT Ruslan Salakhutdinov University of Toronto Partition Functions We

More information

Stat 535 C - Statistical Computing & Monte Carlo Methods. Arnaud Doucet.

Stat 535 C - Statistical Computing & Monte Carlo Methods. Arnaud Doucet. Stat 535 C - Statistical Computing & Monte Carlo Methods Arnaud Doucet Email: arnaud@cs.ubc.ca 1 CS students: don t forget to re-register in CS-535D. Even if you just audit this course, please do register.

More information

Patterns of Scalable Bayesian Inference Background (Session 1)

Patterns of Scalable Bayesian Inference Background (Session 1) Patterns of Scalable Bayesian Inference Background (Session 1) Jerónimo Arenas-García Universidad Carlos III de Madrid jeronimo.arenas@gmail.com June 14, 2017 1 / 15 Motivation. Bayesian Learning principles

More information

Generative Models and Stochastic Algorithms for Population Average Estimation and Image Analysis

Generative Models and Stochastic Algorithms for Population Average Estimation and Image Analysis Generative Models and Stochastic Algorithms for Population Average Estimation and Image Analysis Stéphanie Allassonnière CIS, JHU July, 15th 28 Context : Computational Anatomy Context and motivations :

More information

Bayesian estimation of the discrepancy with misspecified parametric models

Bayesian estimation of the discrepancy with misspecified parametric models Bayesian estimation of the discrepancy with misspecified parametric models Pierpaolo De Blasi University of Torino & Collegio Carlo Alberto Bayesian Nonparametrics workshop ICERM, 17-21 September 2012

More information

ML estimation: Random-intercepts logistic model. and z

ML estimation: Random-intercepts logistic model. and z ML estimation: Random-intercepts logistic model log p ij 1 p = x ijβ + υ i with υ i N(0, συ) 2 ij Standardizing the random effect, θ i = υ i /σ υ, yields log p ij 1 p = x ij β + σ υθ i with θ i N(0, 1)

More information

A hierarchical Krylov-Bayes iterative inverse solver for MEG with physiological preconditioning

A hierarchical Krylov-Bayes iterative inverse solver for MEG with physiological preconditioning A hierarchical Krylov-Bayes iterative inverse solver for MEG with physiological preconditioning D Calvetti 1 A Pascarella 3 F Pitolli 2 E Somersalo 1 B Vantaggi 2 1 Case Western Reserve University Department

More information

Infer relationships among three species: Outgroup:

Infer relationships among three species: Outgroup: Infer relationships among three species: Outgroup: Three possible trees (topologies): A C B A B C Model probability 1.0 Prior distribution Data (observations) probability 1.0 Posterior distribution Bayes

More information

Parametric Models. Dr. Shuang LIANG. School of Software Engineering TongJi University Fall, 2012

Parametric Models. Dr. Shuang LIANG. School of Software Engineering TongJi University Fall, 2012 Parametric Models Dr. Shuang LIANG School of Software Engineering TongJi University Fall, 2012 Today s Topics Maximum Likelihood Estimation Bayesian Density Estimation Today s Topics Maximum Likelihood

More information

Introduction to Bayesian methods in inverse problems

Introduction to Bayesian methods in inverse problems Introduction to Bayesian methods in inverse problems Ville Kolehmainen 1 1 Department of Applied Physics, University of Eastern Finland, Kuopio, Finland March 4 2013 Manchester, UK. Contents Introduction

More information

Markov Chain Monte Carlo Methods for Stochastic

Markov Chain Monte Carlo Methods for Stochastic Markov Chain Monte Carlo Methods for Stochastic Optimization i John R. Birge The University of Chicago Booth School of Business Joint work with Nicholas Polson, Chicago Booth. JRBirge U Florida, Nov 2013

More information

Numerical Analysis of Elliptic PDEs with Random Coefficients

Numerical Analysis of Elliptic PDEs with Random Coefficients Numerical Analysis of Elliptic PDEs with Random Coefficients (Lecture I) Robert Scheichl Department of Mathematical Sciences University of Bath Workshop on PDEs with Random Coefficients Weierstrass Institute,

More information

Statistical Inverse Problems and Instrumental Variables

Statistical Inverse Problems and Instrumental Variables Statistical Inverse Problems and Instrumental Variables Thorsten Hohage Institut für Numerische und Angewandte Mathematik University of Göttingen Workshop on Inverse and Partial Information Problems: Methodology

More information

Point spread function reconstruction from the image of a sharp edge

Point spread function reconstruction from the image of a sharp edge DOE/NV/5946--49 Point spread function reconstruction from the image of a sharp edge John Bardsley, Kevin Joyce, Aaron Luttman The University of Montana National Security Technologies LLC Montana Uncertainty

More information

Evolutionary Sequential Monte Carlo samplers for Change-point models

Evolutionary Sequential Monte Carlo samplers for Change-point models Evolutionary Sequential Monte Carlo samplers for Change-point models Arnaud Dufays 1 August 24, 2015 Abstract Sequential Monte Carlo (SMC) methods are widely used for non-linear filtering purposes. Nevertheless

More information

Sequential Importance Sampling for Structural Reliability Analysis

Sequential Importance Sampling for Structural Reliability Analysis Sequential Importance Sampling for Structural Reliability Analysis Iason Papaioannou a, Costas Papadimitriou b, Daniel Straub a a Engineering Risk Analysis Group, Technische Universität München, Arcisstr.

More information

Fractional Imputation in Survey Sampling: A Comparative Review

Fractional Imputation in Survey Sampling: A Comparative Review Fractional Imputation in Survey Sampling: A Comparative Review Shu Yang Jae-Kwang Kim Iowa State University Joint Statistical Meetings, August 2015 Outline Introduction Fractional imputation Features Numerical

More information

Winter 2019 Math 106 Topics in Applied Mathematics. Lecture 8: Importance Sampling

Winter 2019 Math 106 Topics in Applied Mathematics. Lecture 8: Importance Sampling Winter 2019 Math 106 Topics in Applied Mathematics Data-driven Uncertainty Quantification Yoonsang Lee (yoonsang.lee@dartmouth.edu) Lecture 8: Importance Sampling 8.1 Importance Sampling Importance sampling

More information

Besov regularity for operator equations on patchwise smooth manifolds

Besov regularity for operator equations on patchwise smooth manifolds on patchwise smooth manifolds Markus Weimar Philipps-University Marburg Joint work with Stephan Dahlke (PU Marburg) Mecklenburger Workshop Approximationsmethoden und schnelle Algorithmen Hasenwinkel, March

More information

Natural Evolution Strategies for Direct Search

Natural Evolution Strategies for Direct Search Tobias Glasmachers Natural Evolution Strategies for Direct Search 1 Natural Evolution Strategies for Direct Search PGMO-COPI 2014 Recent Advances on Continuous Randomized black-box optimization Thursday

More information

Outline Lecture 2 2(32)

Outline Lecture 2 2(32) Outline Lecture (3), Lecture Linear Regression and Classification it is our firm belief that an understanding of linear models is essential for understanding nonlinear ones Thomas Schön Division of Automatic

More information

Bayesian rules of probability as principles of logic [Cox] Notation: pr(x I) is the probability (or pdf) of x being true given information I

Bayesian rules of probability as principles of logic [Cox] Notation: pr(x I) is the probability (or pdf) of x being true given information I Bayesian rules of probability as principles of logic [Cox] Notation: pr(x I) is the probability (or pdf) of x being true given information I 1 Sum rule: If set {x i } is exhaustive and exclusive, pr(x

More information

Bayesian Semiparametric GARCH Models

Bayesian Semiparametric GARCH Models Bayesian Semiparametric GARCH Models Xibin (Bill) Zhang and Maxwell L. King Department of Econometrics and Business Statistics Faculty of Business and Economics xibin.zhang@monash.edu Quantitative Methods

More information

Bayesian Semiparametric GARCH Models

Bayesian Semiparametric GARCH Models Bayesian Semiparametric GARCH Models Xibin (Bill) Zhang and Maxwell L. King Department of Econometrics and Business Statistics Faculty of Business and Economics xibin.zhang@monash.edu Quantitative Methods

More information

Calibration of Stochastic Volatility Models using Particle Markov Chain Monte Carlo Methods

Calibration of Stochastic Volatility Models using Particle Markov Chain Monte Carlo Methods Calibration of Stochastic Volatility Models using Particle Markov Chain Monte Carlo Methods Jonas Hallgren 1 1 Department of Mathematics KTH Royal Institute of Technology Stockholm, Sweden BFS 2012 June

More information

Pseudo-marginal Metropolis-Hastings: a simple explanation and (partial) review of theory

Pseudo-marginal Metropolis-Hastings: a simple explanation and (partial) review of theory Pseudo-arginal Metropolis-Hastings: a siple explanation and (partial) review of theory Chris Sherlock Motivation Iagine a stochastic process V which arises fro soe distribution with density p(v θ ). Iagine

More information

Spatial Statistics with Image Analysis. Outline. A Statistical Approach. Johan Lindström 1. Lund October 6, 2016

Spatial Statistics with Image Analysis. Outline. A Statistical Approach. Johan Lindström 1. Lund October 6, 2016 Spatial Statistics Spatial Examples More Spatial Statistics with Image Analysis Johan Lindström 1 1 Mathematical Statistics Centre for Mathematical Sciences Lund University Lund October 6, 2016 Johan Lindström

More information

Chapter 3: Maximum-Likelihood & Bayesian Parameter Estimation (part 1)

Chapter 3: Maximum-Likelihood & Bayesian Parameter Estimation (part 1) HW 1 due today Parameter Estimation Biometrics CSE 190 Lecture 7 Today s lecture was on the blackboard. These slides are an alternative presentation of the material. CSE190, Winter10 CSE190, Winter10 Chapter

More information

Transitional Markov Chain Monte Carlo: Observations and Improvements

Transitional Markov Chain Monte Carlo: Observations and Improvements Transitional Markov Chain Monte Carlo: Observations and Improvements Wolfgang Betz, Iason Papaioannou, Daniel Straub Engineering Risk Analysis Group, Technische Universität München, 8333 München, Germany

More information

Learning of state-space models with highly informative observations: a tempered Sequential Monte Carlo solution

Learning of state-space models with highly informative observations: a tempered Sequential Monte Carlo solution Learning of state-space models with highly informative observations: a tempered Sequential Monte Carlo solution Andreas Svensson, Thomas B. Schön, and Fredrik Lindsten Department of Information Technology,

More information

MCMC for big data. Geir Storvik. BigInsight lunch - May Geir Storvik MCMC for big data BigInsight lunch - May / 17

MCMC for big data. Geir Storvik. BigInsight lunch - May Geir Storvik MCMC for big data BigInsight lunch - May / 17 MCMC for big data Geir Storvik BigInsight lunch - May 2 2018 Geir Storvik MCMC for big data BigInsight lunch - May 2 2018 1 / 17 Outline Why ordinary MCMC is not scalable Different approaches for making

More information

Advanced Computational Methods in Statistics: Lecture 5 Sequential Monte Carlo/Particle Filtering

Advanced Computational Methods in Statistics: Lecture 5 Sequential Monte Carlo/Particle Filtering Advanced Computational Methods in Statistics: Lecture 5 Sequential Monte Carlo/Particle Filtering Axel Gandy Department of Mathematics Imperial College London http://www2.imperial.ac.uk/~agandy London

More information

Multimodal Nested Sampling

Multimodal Nested Sampling Multimodal Nested Sampling Farhan Feroz Astrophysics Group, Cavendish Lab, Cambridge Inverse Problems & Cosmology Most obvious example: standard CMB data analysis pipeline But many others: object detection,

More information

Markov Chain Monte Carlo Methods for Stochastic Optimization

Markov Chain Monte Carlo Methods for Stochastic Optimization Markov Chain Monte Carlo Methods for Stochastic Optimization John R. Birge The University of Chicago Booth School of Business Joint work with Nicholas Polson, Chicago Booth. JRBirge U of Toronto, MIE,

More information

Karhunen-Loève Approximation of Random Fields Using Hierarchical Matrix Techniques

Karhunen-Loève Approximation of Random Fields Using Hierarchical Matrix Techniques Institut für Numerische Mathematik und Optimierung Karhunen-Loève Approximation of Random Fields Using Hierarchical Matrix Techniques Oliver Ernst Computational Methods with Applications Harrachov, CR,

More information

Particle Filtering Approaches for Dynamic Stochastic Optimization

Particle Filtering Approaches for Dynamic Stochastic Optimization Particle Filtering Approaches for Dynamic Stochastic Optimization John R. Birge The University of Chicago Booth School of Business Joint work with Nicholas Polson, Chicago Booth. JRBirge I-Sim Workshop,

More information

DRAGON ADVANCED TRAINING COURSE IN ATMOSPHERE REMOTE SENSING. Inversion basics. Erkki Kyrölä Finnish Meteorological Institute

DRAGON ADVANCED TRAINING COURSE IN ATMOSPHERE REMOTE SENSING. Inversion basics. Erkki Kyrölä Finnish Meteorological Institute Inversion basics y = Kx + ε x ˆ = (K T K) 1 K T y Erkki Kyrölä Finnish Meteorological Institute Day 3 Lecture 1 Retrieval techniques - Erkki Kyrölä 1 Contents 1. Introduction: Measurements, models, inversion

More information