Nonlinear Model Reduction for Uncertainty Quantification in Large-Scale Inverse Problems


1 Nonlinear Model Reduction for Uncertainty Quantification in Large-Scale Inverse Problems
Krzysztof Fidkowski, David Galbally*, Karen Willcox* (*MIT)
Computational Aerospace Sciences Seminar, Aerospace Engineering Department, University of Michigan, October 3, 2008

2 Outline
1. Forward and Inverse Problems
2. Statistical Inverse Problem
3. Nonlinear Projection-Based Model Reduction
4. Diffusion Flame Application in 2D and 3D

3 Forward Problem
Given µ, calculate y.
Model: R(u, µ) = 0,  y = y(u)
µ = parameter vector [µ1, µ2]
y = output vector
u = state vector
R(u, µ) = model equations
[Figure: a point in the (µ1, µ2) parameter space maps through the model to an output vector plotted against output index]

4 Example Model
Finite element discretization of a scalar convection-diffusion-reaction equation; the scalar is fuel concentration.
µ = reaction rate parameters
y = average fuel concentrations at cut-planes
u = finite element solution for fuel concentration
[Figure: sample fuel concentration contours]

5 Inverse Problem
Given fuel concentrations, determine the reaction rate parameters: given y, calculate µ.
Model: R(u, µ) = 0,  y = y(u)
Other applications: medical imaging, circuit identification, model fitting, geophysics.
[Figure: the forward diagram reversed, with measured outputs mapping back to a point in the (µ1, µ2) parameter space]

6 Deterministic Inverse Solution
Given measurements ȳ, determine the best value of the parameter vector:

µ* = arg min_µ ‖y(µ) − ȳ‖²    (minimization problem)
subject to R(u; µ) = 0,       (model equations)
y(µ) ≡ y(u(µ)),  µ ∈ D.       (a priori knowledge)

7 Deterministic Inverse Solution (ctd.)
Shortcomings:
- Experimental errors are not included.
- No uncertainty quantification for the best estimate µ*.
- The inverse problem may be ill-posed: there may be no unique solution µ*, and µ* may be sensitive to small perturbations in ȳ.
The practical remedy is some form of regularization, for example:

µ* = arg min_µ ‖y(µ) − ȳ‖² + β‖µ‖²,    β = regularization parameter.
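
A minimal sketch of this regularized least-squares solve in Python. The forward model, measured outputs, and initial guess here are all hypothetical stand-ins; the Tikhonov term is stacked into the residual so the squared norm matches the slide's objective.

    import numpy as np
    from scipy.optimize import least_squares

    def forward(mu):
        # toy stand-in for "solve R(u; mu) = 0, return y(u)"
        return np.array([np.exp(-mu[0]) * mu[1], mu[0] + mu[1] ** 2])

    y_bar = np.array([0.5, 1.2])   # measured outputs (illustrative)
    beta = 1e-3                    # regularization parameter

    def residuals(mu):
        # stacked so that ||r||^2 = ||y(mu) - y_bar||^2 + beta ||mu||^2
        return np.concatenate([forward(mu) - y_bar, np.sqrt(beta) * mu])

    mu_star = least_squares(residuals, x0=np.array([1.0, 1.0])).x
    print("best-fit parameters:", mu_star)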

8 Statistical Inverse Solution
Given ȳ and a measurement-error model ǫ, calculate the posterior PDF of µ.
For example, let ǫ be normally distributed measurement errors, each with standard deviation σ. With this measurement error, the likelihood function is

p(ȳ | µ) ∝ exp[ −(1/(2σ²)) (ȳ − y(µ))^T (ȳ − y(µ)) ]

= the probability of measuring ȳ given µ.

9 Posterior Probability Distribution
Using Bayes' theorem, p(µ | ȳ) = p(ȳ | µ) p(µ) / p(ȳ), so that (assuming a uniform prior p(µ)) the posterior PDF is

p(µ | ȳ) ∝ exp[ −(1/(2σ²)) (ȳ − y(µ))^T (ȳ − y(µ)) ]  if µ ∈ D,  and 0 otherwise.

The posterior PDF is inferred from the measured outputs ȳ and a model for the measurement error, σ, using y(µ). How do we describe p(µ | ȳ)?
[Figure: posterior PDF contours over the domain D in the (µ1, µ2) plane]
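
A minimal sketch of this unnormalized log-posterior in Python, assuming a generic forward model and a box domain D; the function and parameter names are placeholders, not from the talk.

    import numpy as np

    def log_posterior(mu, y_bar, forward, sigma, lower, upper):
        # uniform prior on the box D = [lower, upper]: zero probability outside
        if np.any(mu < lower) or np.any(mu > upper):
            return -np.inf
        r = y_bar - forward(mu)
        # log of exp[-(1/(2 sigma^2)) r^T r], up to an additive constant
        return -0.5 * (r @ r) / sigma**2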

10 MCMC Sampling
Use Markov chain Monte Carlo (MCMC) to sample the posterior PDF: take a random walk in parameter space, generating a sequence µ^1, µ^2, ...
Taking a step given µ = µ^i:
- Pick µ* from a proposal distribution q.
- Accept µ* (i.e. µ^{i+1} = µ*) with probability
  α(µ* | µ) = min{ 1, [p(µ* | ȳ) q(µ | µ*)] / [p(µ | ȳ) q(µ* | µ)] }.
- Otherwise reject it (µ^{i+1} = µ).
[Figure: the proposal density q(µ* | µ) around the current sample, and posterior slices p(µ1 | ȳ) and p(µ2 | ȳ)]
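
A minimal Metropolis-Hastings sketch in Python, assuming a symmetric uniform-box proposal so the q terms cancel in the acceptance ratio (the general ratio above must keep them). It consumes a log-posterior such as the one sketched earlier; all names are illustrative.

    import numpy as np

    def metropolis_hastings(log_post, mu0, n_samples, step, rng=None):
        rng = rng or np.random.default_rng(0)
        chain = [mu0]
        lp = log_post(mu0)
        for _ in range(n_samples - 1):
            # symmetric uniform-box proposal centered at the current sample
            mu_prop = chain[-1] + rng.uniform(-step, step, size=mu0.shape)
            lp_prop = log_post(mu_prop)
            # accept with probability min(1, p(prop)/p(curr))
            if np.log(rng.uniform()) < lp_prop - lp:
                chain.append(mu_prop)
                lp = lp_prop
            else:
                chain.append(chain[-1].copy())
        return np.array(chain)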

11 The Proposal Distribution, q(µ* | µ)
The choice of q(µ* | µ) governs exploration of the parameter space and affects the acceptance probability α.
1. Uniform box: size ∆1 × ∆2 centered at µ, with q(µ* | µ) = q(µ | µ*) = const. Inefficient for anisotropic posteriors.
2. Stretched ellipse:

   q(µ* | µ) ∝ exp[ −(1/(2δ²)) (µ* − µ)^T H(µ) (µ* − µ) ],
   H(µ) = (1/(2σ²)) y_µ(µ)^T y_µ(µ),

   where δ is a step-size parameter and the output Jacobian y_µ = ∂y/∂µ is computed by finite differences.
[Figure: posterior PDF contours with the uniform proposal boundary and the stretched elliptical proposal around the current sample]
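
A sketch of the stretched-Gaussian proposal in Python, assuming a finite-difference output Jacobian and an invertible H(µ); step size h and all names are placeholders. Because H(µ) depends on the current sample, this proposal is not symmetric, so the q terms stay in the acceptance ratio.

    import numpy as np

    def stretched_proposal(mu, forward, sigma, delta, h=1e-6, rng=None):
        rng = rng or np.random.default_rng()
        y0 = forward(mu)
        # dy/dmu by one-sided finite differences, one column per parameter
        J = np.column_stack([(forward(mu + h * e) - y0) / h
                             for e in np.eye(len(mu))])
        H = (J.T @ J) / (2.0 * sigma**2)      # as on the slide
        cov = delta**2 * np.linalg.inv(H)     # stretched covariance
        return rng.multivariate_normal(mu, cov)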

12 Sampling Statistics
N_m = number of MCMC samples. The samples µ^i, i = 1..N_m, are drawn from the posterior probability distribution:
- as N_m → ∞, expect convergence to the actual probability distribution of µ;
- for finite N_m, we can only estimate statistics.
Estimator of a statistical quantity:  f̄ ≈ (1/N_m) Σ_{i=1}^{N_m} g(µ^i)
- mean of µ_j:  g(µ) = µ_j
- variance of µ_j:  g(µ) = (µ_j − µ̄_j)²,  where µ̄_j is the sample mean.
N_m is usually tens of thousands, and each evaluation of the acceptance probability requires a forward run, so the cost becomes prohibitive for large simulations.
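
These estimators are one line each in numpy; the chain here is stand-in data in place of samples from the sampler sketched above.

    import numpy as np

    chain = np.random.default_rng(0).normal(size=(50_000, 2))  # stand-in samples
    mean_mu = chain.mean(axis=0)                      # g(mu) = mu_j
    var_mu = ((chain - mean_mu) ** 2).mean(axis=0)    # g(mu) = (mu_j - mean)^2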

13 Model Reduction
Goals:
- Create a computationally inexpensive emulator of the forward simulation.
- Require accuracy for µ ∈ D.
- Retain the physics of the problem.
- Take nonlinearities into account.
Assumption: for µ ∈ D, the solution u resides in a low-dimensional manifold, i.e. it can be represented well using n ≪ N degrees of freedom.

14 Model Reduction Using Linear Projection
N = number of unknowns in the full system (~ millions)
n = number of unknowns in the reduced system (~ 100)

Full system:  u̇ = Au + Bµ
Approximate u ≈ Φ u_r, where the columns of Φ ∈ R^{N×n} are basis vectors and Φ^T Φ = I.
Multiply the original system by Φ^T to obtain the reduced system:

u̇_r = Φ^T A Φ u_r + Φ^T B µ = A_r u_r + B_r µ

A_r = Φ^T A Φ and B_r = Φ^T B can be precomputed, the system is of size n, and no order-N operations are needed to run the reduced model.
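
A minimal sketch of the offline projection and online evaluation in Python; the operators here are random stand-ins for an actual model, and the basis is an arbitrary orthonormal matrix rather than one built from snapshots.

    import numpy as np

    N, n, p = 2000, 20, 2
    rng = np.random.default_rng(0)
    A = -np.eye(N) + 0.01 * rng.standard_normal((N, N))   # stand-in full operator
    B = rng.standard_normal((N, p))
    Phi, _ = np.linalg.qr(rng.standard_normal((N, n)))    # orthonormal: Phi^T Phi = I

    A_r = Phi.T @ A @ Phi   # n x n, precomputed offline
    B_r = Phi.T @ B         # n x p, precomputed offline

    def reduced_rhs(u_r, mu):
        # online evaluation: no O(N) operations remain
        return A_r @ u_r + B_r @ mu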

15 Model Reduction for Nonlinear Systems
u̇ = f(u, µ),  f(·, µ) = nonlinear function.
Multiplying the original system by Φ^T gives a reduced system:

u̇_r = Φ^T f(u, µ) = Φ^T f(Φ u_r, µ)

There are n unknowns in the reduced system, but we cannot precompute any matrix products because of f, and we still need N nonlinearity evaluations: this will dominate the cost!

16 Nonlinearity Expansion
Key assumption: f(u, µ) resides in a low-dimensional manifold of dimension m ≪ N:

f(u, µ) ≈ Φ_f f_r,  f_r ∈ R^m,

where the columns of Φ_f ∈ R^{N×m} are basis vectors. Substituting into the reduced system:

u̇_r = Φ^T Φ_f f_r    (N not present)

But evaluating f_r directly still involves N:

f_r = (Φ_f)^T f(u, µ)    (order-N dependent)

17 Masked Projection
Compute f_r, the nonlinearity expansion coefficients, approximately:

f(u, µ) ≈ Φ_f f_r
Z f(u, µ) ≈ Z Φ_f f_r
f_r ≈ (Z Φ_f)^{-1} Z f(u, µ)
f(u, µ) ≈ Φ_f (Z Φ_f)^{-1} Z f(u, µ) = Ψ Z f(u, µ),  Ψ ≡ Φ_f (Z Φ_f)^{-1} ∈ R^{N×m}

Z is an m × N mask matrix: mostly zeros, with ones in the columns where f is to be evaluated. Ψ = Φ_f (Z Φ_f)^{-1} can be precomputed, and Z f(u, µ) consists of only m evaluations of the nonlinearity. This is similar to gappy POD [Everson and Sirovich, 1995].
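
A minimal sketch of this gappy-POD-style reconstruction in Python: the mask Z is represented by an index array (row selection), and the test vector is built to lie exactly in span(Φ_f) so the reconstruction should hit machine precision. A random mask is used only for illustration; mask selection is the subject of slide 20.

    import numpy as np

    def masked_reconstruction(Phi_f, mask_idx, f_at_mask):
        # f ~ Phi_f (Z Phi_f)^{-1} Z f; Phi_f[mask_idx, :] realizes Z Phi_f
        Psi = Phi_f @ np.linalg.inv(Phi_f[mask_idx, :])   # N x m, precomputable
        return Psi @ f_at_mask

    rng = np.random.default_rng(1)
    N, m = 500, 5
    Phi_f, _ = np.linalg.qr(rng.standard_normal((N, m)))
    f_true = Phi_f @ rng.standard_normal(m)               # lies in span(Phi_f)
    mask_idx = rng.choice(N, size=m, replace=False)
    f_rec = masked_reconstruction(Phi_f, mask_idx, f_true[mask_idx])
    print(np.max(np.abs(f_rec - f_true)))                 # ~ machine precision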

18 Comparison to Direct Projection
Direct projection: f_r = (Φ_f)^T f(u, µ), which requires all N entries of f(u, µ).
Masked projection: solve (Z Φ_f) f_r = Z f(u, µ), which requires only the m masked entries of f(u, µ).
[Figure: matrix diagrams contrasting the N-row direct projection with the m-row masked system]

19 Reduced Nonlinear System
Using f(u, µ) ≈ Ψ f(Zu, µ), the reduced system becomes

u̇_r = Φ^T f(u, µ) ≈ Φ^T Ψ f(Zu, µ) = E_r f(Zu, µ),  E_r ≡ Φ^T Ψ ∈ R^{n×m}.

Steps:
- Form the Φ and Φ_f basis matrices by, for example, POD on a set of snapshots.
- Choose a mask Z and calculate Ψ = Φ_f (Z Φ_f)^{-1}.
- Calculate E_r = Φ^T Ψ offline.
Each forward solve of the reduced model now involves only m ≪ N nonlinearity evaluations.
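
A sketch of these offline/online steps in Python, assuming a pointwise nonlinearity so that f(Zu, µ) needs only the m masked entries of u; all names are illustrative.

    import numpy as np

    def offline_operators(Phi, Phi_f, mask_idx):
        # Psi = Phi_f (Z Phi_f)^{-1}; E_r = Phi^T Psi, both computed once
        Psi = Phi_f @ np.linalg.inv(Phi_f[mask_idx, :])
        E_r = Phi.T @ Psi
        return E_r, Phi[mask_idx, :]        # Phi[mask_idx, :] realizes Z Phi

    def reduced_rhs(u_r, mu, E_r, Phi_masked, f_pointwise):
        # online: with u = Phi u_r, Z u = Phi_masked u_r, so only m
        # entries of the nonlinearity are ever evaluated
        return E_r @ f_pointwise(Phi_masked @ u_r, mu)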

20 Choosing a Mask, Z
The accuracy of the reduced model depends on Z.
Option 1: Choose Z to minimize cond(Z Φ_f) [Willcox, 2006].
Option 2: Choose Z to minimize the error between the masked projection and the full projection of K snapshots ξ_k^f:

Z* = arg min_Z Σ_{k=1}^{K} ‖ (Φ_f)^T ξ_k^f − (Z Φ_f)^{-1} Z ξ_k^f ‖²₂

BPIM = Best Points Interpolation Method [Nguyen et al., 2007].
Option 3: Choose the (i+1)-st mask point recursively as the index where the error between the (i+1)-st basis vector of Φ_f and its reconstruction using the first i basis vectors is maximum; see the sketch below. EIM = Empirical Interpolation Method [Nguyen et al., 2007].
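
A minimal sketch of the greedy Option-3 selection in Python, applied directly to the columns of Φ_f; this is one common realization of the idea, not necessarily the exact variant used in the talk.

    import numpy as np

    def eim_mask(Phi_f):
        N, m = Phi_f.shape
        # first point: largest-magnitude entry of the first basis vector
        idx = [int(np.argmax(np.abs(Phi_f[:, 0])))]
        for i in range(1, m):
            # reconstruct column i from the first i columns at the chosen points
            c = np.linalg.solve(Phi_f[idx, :i], Phi_f[idx, i])
            residual = Phi_f[:, i] - Phi_f[:, :i] @ c
            # next point: where that reconstruction error is largest
            idx.append(int(np.argmax(np.abs(residual))))
        return np.array(idx)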

21 Convection-Diffusion-Reaction Equations
u = scalar fuel concentration.

∇·(U u) − ∇·(ν ∇u) + f(u, µ) = 0 in Ω,
u = u_D on ∂Ω_D,
∇u · n = 0 on ∂Ω \ ∂Ω_D,

U = velocity (constant), ν = diffusion coefficient (constant).
Nonlinear reaction term: f(u, µ) = A u (c − u) e^{−E/(d−u)},  µ = (ln(A), E).
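
A minimal sketch of this Arrhenius-type reaction term in Python; the constants c and d are placeholder values not given on the slide, and u < d is assumed so the exponent stays finite.

    import numpy as np

    def reaction(u, mu, c=0.2, d=0.24):
        # f(u, mu) = A u (c - u) exp(-E / (d - u)), with mu = (ln A, E);
        # c, d are hypothetical constants; assumes u < d elementwise
        A, E = np.exp(mu[0]), mu[1]
        return A * u * (c - u) * np.exp(-E / (d - u))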

22 Combustor Model
[Figure: combustor geometry with fuel and oxidizer inlets and measurement planes; labeled dimensions of 18 mm, 9 mm, 3 mm, and 1 mm along the x, y, z axes]

23 Finite Element Discretization
2D: Streamline Upwind Petrov-Galerkin (SUPG). 3D: Discontinuous Galerkin (DG).
General discrete form (N unknowns, M nonlinearity evaluations):

R(u; µ) = R_0 + A u + E f(D u, µ) = 0,

where D interpolates u to the M quadrature points and E sums up the nonlinear evaluations.
Reduced model (n unknowns, m nonlinearity evaluations):

R_r + A_r u_r + E_r f(D_r u_r, µ) = 0,
E_r = Φ^T E Ψ ∈ R^{n×m},  D_r = Z D Φ ∈ R^{m×n}.

24 2D Reduced Model Performance
Basis constructed from K = 196 snapshots; Ξ_test = test grid in parameter space.
Average relative error: ε_rel = mean over µ ∈ Ξ_test of ‖y(µ) − y_r(µ)‖ / ‖y(µ)‖.
Online time is quoted relative to the FEM solution.
[Table of ε_rel and online time versus n and m; the numerical values are not recoverable from the source.]
[Figures: ε_rel versus n for m = 10, 15, 25, 40, 50, and a field comparison of the n = 40, m = 50 reduced model with the full-order solution]

25 2D Reduced Model Basis Vectors
Obtained by proper orthogonal decomposition (Karhunen-Loève expansion) of 196 snapshots on a grid in parameter space.
[Figure: first four POD modes]
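
A minimal POD sketch in Python via the SVD of the snapshot matrix; the variable names are illustrative.

    import numpy as np

    def pod_basis(snapshot_matrix, n):
        # POD / Karhunen-Loeve basis: leading n left singular vectors of the
        # N x K snapshot matrix; singular values measure the energy per mode
        U, s, _ = np.linalg.svd(snapshot_matrix, full_matrices=False)
        return U[:, :n], s

    # usage: Phi, s = pod_basis(X, 40) for an N x 196 matrix X of snapshots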

26 2D Mask Points
For m = 15, the mask Z ∈ R^{m×N} contains 15 nonzero entries. These are the points at which the nonlinear term is evaluated. [Galbally 2005]
[Figure: locations of the 15 mask points in the domain]

27 2D Inverse Problem Results
Generated ȳ with the FEM model and σ = 1.5% measurement error.
Goal: determine the PDF of the parameters µ1 = ln(A) and µ2 = E.
Constructed a Markov chain of size N_m = 50,000 with the n = 40, m = 50 reduced model.
[Figures: histograms of µ1 and µ2 comparing ROM and FE samples, and running means of µ1 and µ2 versus MCMC updates]

28 3D Reduced Model Performance
Full-order FEM model: 8.5 million unknowns (13 h CPU time). Reduced model: ~0.1 s CPU time.
Basis constructed from K = 169 snapshots; Ξ_test = test grid in parameter space.
[Figures: ε_rel versus n for m = 10, 15, 25, 40, 50, and a field comparison of the finite element solution (8.5 million unknowns) with the n = 40, m = 50 reduced model (EIM)]

29 3D Reduced Model Basis Vectors
POD of 169 snapshots on a grid in parameter space.
[Figure: first four POD modes]

30 3D Inverse Problem Results
Generated ȳ with the FEM model and σ = 1.5% measurement error.
Goal: determine the PDF of the parameters µ1 = ln(A) and µ2 = E.
Constructed a Markov chain of size N_m = 50,000 with the n = 40, m = 50 reduced model, using a uniform proposal distribution. Acceptance rate = 3.4% (low).
[Figures: histograms of µ1 and µ2, and running means versus MCMC updates]

31 Posterior Anisotropy
The low acceptance rate of the uniform proposal is attributed to anisotropy in the posterior PDF p(µ | ȳ): the posterior is strongly stretched in the (ln(A), E) plane relative to the uniform proposal region.
Improve the acceptance rate via a stretched-Gaussian proposal.
[Figure: elongated posterior contours in the (ln(A), E) plane with the uniform proposal region overlaid]

32 3D Inverse Problem Results, Stretched Proposal
Used δ = 1.5 for the dimensionless step-size parameter, with finite differencing for ∂y/∂µ.
The acceptance rate is now 25%, and only N_m = 5,000 samples were needed for the same statistics.
Note: MCMC runs with the full-order model are prohibitive (13 h CPU per forward solve).
[Figures: histograms of µ1 and µ2, and running means versus MCMC updates]

33 Summary and Conclusions
- Presented a nonlinear model reduction technique in a projection framework, built on previous work in gappy POD, missing point estimation, masked projection, and coefficient-function approximation.
- Applied the reduction to a parameter estimation problem in a Bayesian inference setting; the vast speedup of the reduced model makes such an inverse problem solution possible.
Additional work:
- Model-constrained adaptive sampling to generate snapshots as the number of parameters is increased.
- Quantification of the effect of model reduction errors on statistics of interest.
