Uncertainty Propagation

1 Setting: Uncertainty Propagation

We assume that we have determined distributions for the parameters, e.g., via Bayesian inference, prior experiments, or expert opinion. For example, the HIV model

\dot{T}_1 = \lambda_1 - d_1 T_1 - (1-\varepsilon) k_1 V T_1
\dot{T}_2 = \lambda_2 - d_2 T_2 - (1-f\varepsilon) k_2 V T_2
\dot{T}_1^* = (1-\varepsilon) k_1 V T_1 - \delta T_1^* - m_1 E T_1^*
\dot{T}_2^* = (1-f\varepsilon) k_2 V T_2 - \delta T_2^* - m_2 E T_2^*
\dot{V} = N_T \delta (T_1^* + T_2^*) - c V - [(1-\varepsilon)\rho_1 k_1 T_1 + (1-f\varepsilon)\rho_2 k_2 T_2] V
\dot{E} = \lambda_E + \frac{b_E (T_1^* + T_2^*)}{T_1^* + T_2^* + K_b} E - \frac{d_E (T_1^* + T_2^*)}{T_1^* + T_2^* + K_d} E - \delta_E E

Goal: Construct statistics for quantities of interest, e.g., the expected viral load in an HIV patient, with appropriate uncertainty intervals.

Note: This often involves moderate- to high-dimensional integration,

E[V(t)] = \int_{\mathbb{R}^6} V(t, q)\,\rho(q)\,dq

2 Example: Uncertainty Propagation

Yesterday, we computed mean responses, credible intervals, and prediction intervals for the SIR model

dS/dt = \delta N - \delta S - \gamma k I S,  S(0) = S_0   (Susceptible)
dI/dt = \gamma k I S - (r + \delta) I,  I(0) = I_0   (Infectious)
dR/dt = r I - \delta R,  R(0) = R_0   (Recovered)

Goal: Discuss sampling-based techniques to compute statistics and intervals.

3 Issues: Uncertainty Propagation

Uncertainty propagation and the computation of statistical quantities of interest are much more difficult for PDE models, e.g., the Sinko-Streifer model, where u(t,x) is the number of fish of size x at time t:

\frac{\partial u}{\partial t} + \frac{\partial}{\partial x}\bigl[g(t,x)\,u(t,x)\bigr] = -\mu u
g(t,x_0)\,u(t,x_0) = \int_{x_0}^{x_1} k(t,\xi)\,u(t,\xi)\,d\xi
u(0,x) = \Phi(x)

Random field representation: g(x) = \sum_{j=1}^{p} q_j \phi_j(x)

Quantity of interest: E[u(t,x)] = \int_{\mathbb{R}^p} u(t,x,q)\,\rho(q)\,dq

Issues:
How do we efficiently propagate input uncertainties through models? Surrogate models.
How do we approximately integrate in moderate to high dimensions (large p)? Monte Carlo sampling, sparse grid quadrature.

4 Forward Uncertainty Propagation: Linear Models

Linear models admit analytic mean and variance relations.

Example: Linearly parameterized stress-strain relation

\sigma_i = E e_i + E_2 e_i^3 + \varepsilon_i,  i = 1, ..., n

Model statistics: Let \bar{E}, \bar{E}_2 and var(E), var(E_2) denote the parameter means and variances. Then

E[E e_i + E_2 e_i^3] = \bar{E} e_i + \bar{E}_2 e_i^3
var[E e_i + E_2 e_i^3] = e_i^2 var(E) + e_i^6 var(E_2) + 2 e_i^4 cov(E, E_2)

Response statistics: Assume the measurement errors are uncorrelated with the model response. Then

E[\sigma_i] = \bar{E} e_i + \bar{E}_2 e_i^3
var[\sigma_i] = e_i^2 var(E) + e_i^6 var(E_2) + 2 e_i^4 cov(E, E_2) + var(\varepsilon_i)

Problem: Models are almost always nonlinearly parameterized.
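To make the analytic propagation concrete, the following Python sketch evaluates these mean and variance formulas and checks them against direct sampling. The parameter values (mean_E, var_E, etc.) are illustrative assumptions, not values from the lecture, and the language is Python rather than the MATLAB used in the course tutorials.

```python
import numpy as np

# Illustrative (assumed) statistics for sigma_i = E*e_i + E2*e_i^3 + eps_i
mean_E, mean_E2 = 2.0e2, 5.0e1          # parameter means
var_E, var_E2, cov_EE2 = 4.0, 1.0, 0.5  # parameter variances and covariance
var_eps = 0.25                          # measurement-error variance
e = np.linspace(0.1, 1.0, 5)            # strain values e_i

# Analytic response statistics for the linearly parameterized model
mean_sigma = mean_E * e + mean_E2 * e**3
var_sigma = e**2 * var_E + e**6 * var_E2 + 2 * e**4 * cov_EE2 + var_eps

# Monte Carlo check: sample (E, E2) jointly and eps independently
rng = np.random.default_rng(0)
cov = np.array([[var_E, cov_EE2], [cov_EE2, var_E2]])
M = 200_000
params = rng.multivariate_normal([mean_E, mean_E2], cov, size=M)
eps = rng.normal(0.0, np.sqrt(var_eps), size=(M, e.size))
sigma = params[:, [0]] * e + params[:, [1]] * e**3 + eps

print(np.max(np.abs(sigma.mean(axis=0) - mean_sigma)))  # small for large M
print(np.max(np.abs(sigma.var(axis=0) - var_sigma)))    # small for large M
```

Both printed discrepancies shrink as M grows, which is exactly the analytic shortcut one loses once the model is nonlinearly parameterized.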

5 Forward Uncertainty Propagation: Sampling Methods

Strategy: Randomly sample from the parameter and measurement-error distributions and propagate the samples through the model to quantify the response uncertainty.

Advantages: Applicable to nonlinear models. Parameters can be correlated and non-Gaussian. Straightforward to apply, and the convergence rate is independent of the number of parameters. Can directly incorporate both parameter and measurement uncertainties.

Disadvantages: Very slow convergence rate, O(1/\sqrt{M}), where M is the number of samples: 100-fold more evaluations are required to gain one additional digit of accuracy. This motivates numerical analysis techniques: next talk by C. Webster.
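As a minimal sketch of this strategy for the SIR model, the code below draws parameter samples, propagates each through the ODE, and summarizes the infectious compartment with a mean and percentile band. The sampling distributions, parameter values, and initial conditions are assumed for illustration and are not the course's posted settings.

```python
import numpy as np
from scipy.integrate import solve_ivp

N_pop = 1000.0
S0, I0, R0 = 900.0, 100.0, 0.0
t_eval = np.linspace(0.0, 10.0, 101)

def sir_rhs(t, y, gamma_k, r, delta):
    S, I, R = y
    dS = delta * N_pop - delta * S - gamma_k * I * S
    dI = gamma_k * I * S - (r + delta) * I
    dR = r * I - delta * R
    return [dS, dI, dR]

rng = np.random.default_rng(1)
M = 500                                   # number of parameter samples
# Assumed sampling distributions (illustrative only)
gamma_k = rng.normal(2.0e-3, 1.0e-4, M)   # infection coefficient
r = rng.normal(0.6, 0.05, M)              # recovery rate
delta = rng.normal(0.1, 0.01, M)          # birth/death rate

I_samples = np.empty((M, t_eval.size))
for m in range(M):
    sol = solve_ivp(sir_rhs, (0.0, 10.0), [S0, I0, R0],
                    args=(gamma_k[m], r[m], delta[m]), t_eval=t_eval)
    I_samples[m] = sol.y[1]

mean_I = I_samples.mean(axis=0)                         # mean response
lo, hi = np.percentile(I_samples, [2.5, 97.5], axis=0)  # 95% band
print(mean_I[-1], lo[-1], hi[-1])
```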

6 Uncertainty Propagation

Sampling-Based Approaches:
Quadrature: Monte Carlo, Latin hypercube, Sobol'
Interval definitions and construction
Prediction intervals for the HIV model via the DRAM algorithm

Numerical Analysis-Based Approaches: Next presentation by C. Webster

7 Numerical Quadrature

Motivation: Computation of expected values requires approximation of integrals

E[u(t,x)] = \int_{\mathbb{R}^p} u(t,x,q)\,\rho(q)\,dq

Numerical quadrature:

\int_{\mathbb{R}^p} f(q)\,\rho(q)\,dq \approx \sum_{r=1}^{R} f(q^r)\,w^r

Questions: How do we choose the quadrature points and weights? E.g., Newton-Cotes formulas such as the trapezoid rule

\int_a^b f(q)\,dq \approx \frac{h}{2}\Bigl[f(a) + f(b) + 2\sum_{r=1}^{R-2} f(q^r)\Bigr],  q^r = a + hr,  h = \frac{b-a}{R-1}
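A minimal sketch of this rule in one dimension: approximate E[f(Q)] for a standard normal parameter by applying the composite trapezoid rule to f(q)ρ(q) on a truncated interval. The truncation to [-5, 5] and the test function are assumptions made for illustration.

```python
import numpy as np

# Composite trapezoid rule for E[f(Q)] = int f(q) rho(q) dq,
# with rho the standard normal density, truncated to [a, b].
f = lambda q: q**2
rho = lambda q: np.exp(-0.5 * q**2) / np.sqrt(2 * np.pi)

a, b, R = -5.0, 5.0, 41
q = np.linspace(a, b, R)          # q_r = a + h*r, h = (b - a)/(R - 1)
h = (b - a) / (R - 1)
w = np.full(R, h)                 # trapezoid weights: h/2 at the ends, h inside
w[0] = w[-1] = h / 2

approx = np.sum(f(q) * rho(q) * w)
print(approx)                     # close to E[Q^2] = 1 for Q ~ N(0, 1)
```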

8 Numerical Quadrature

Motivation: Computation of expected values requires approximation of integrals

E[u(t,x)] = \int_{\mathbb{R}^p} u(t,x,q)\,\rho(q)\,dq

Numerical quadrature:

\int_{\mathbb{R}^p} f(q)\,\rho(q)\,dq \approx \sum_{r=1}^{R} f(q^r)\,w^r

Questions: How do we choose the quadrature points and weights? E.g., Newton-Cotes and Gaussian algorithms.

[Figure: quadrature points on [a, b] for Newton-Cotes and Gaussian rules]
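For comparison, a Gaussian-rule sketch for the same expected value: Gauss-Hermite points and weights handle the normal weight function directly, so only a handful of points are needed. The test integrand is again an illustrative assumption.

```python
import numpy as np

# Gauss-Hermite quadrature: nodes/weights for weight exp(-x^2) on (-inf, inf).
# For Q ~ N(0, 1), E[f(Q)] = (1/sqrt(pi)) * sum_i w_i f(sqrt(2) x_i).
f = lambda q: q**2

for R in (2, 4, 8):
    x, w = np.polynomial.hermite.hermgauss(R)
    approx = np.sum(w * f(np.sqrt(2.0) * x)) / np.sqrt(np.pi)
    print(R, approx)
```

Because an R-point Gauss-Hermite rule integrates polynomials of degree up to 2R-1 exactly against the normal weight, the R = 2 rule already returns the exact value 1 here.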

9 Numerical Quadrature

\int_{\mathbb{R}^p} f(q)\,\rho(q)\,dq \approx \sum_{r=1}^{R} f(q^r)\,w^r

Questions: Can we construct nested algorithms to improve efficiency? E.g., employ Clenshaw-Curtis points.

[Figure: Clenshaw-Curtis points by level]
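A small sketch of the nesting property, assuming the standard Clenshaw-Curtis construction x_j = cos(j*pi/n) with n = 2^level points per level: every level-l point reappears at level l+1, so model evaluations can be reused when the rule is refined.

```python
import numpy as np

def clenshaw_curtis_points(level):
    """Clenshaw-Curtis points on [-1, 1]: x_j = cos(j*pi/n), j = 0..n, n = 2^level."""
    n = 2 ** level
    return np.cos(np.arange(n + 1) * np.pi / n)

# Nesting check: the level-l points are a subset of the level-(l+1) points.
for level in range(1, 4):
    coarse = clenshaw_curtis_points(level)
    fine = clenshaw_curtis_points(level + 1)
    nested = all(np.any(np.isclose(c, fine)) for c in coarse)
    print(level, len(coarse), nested)   # nested is True at every level
```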

10 Numerical Quadrature

Questions: How do we reduce the required number of points while maintaining accuracy?

Tensored grids: exponential growth, R = (R_\ell)^p.
Sparse grids: same accuracy with far fewer points.

[Table: sparse-grid versus tensored-grid point counts R for increasing p and R_\ell]

11 Numerical Quadrature

Problem: The accuracy of the methods diminishes as the parameter dimension p increases.

Suppose f \in C^\nu([0,1]^p). Tensor products: take R_\ell points in each dimension, so R = (R_\ell)^p total points.

Quadrature errors:
Newton-Cotes: E \sim O(R_\ell^{-\nu}) = O(R^{-\nu/p})
Gaussian: E \sim O(e^{-\gamma R_\ell}) = O(e^{-\gamma R^{1/p}})
Sparse grid: E \sim O\bigl(R^{-\nu} (\log R)^{(p-1)(\nu+1)}\bigr)

12 Numerical Quadrature

Problem: The accuracy of the methods diminishes as the parameter dimension p increases (error rates as on the previous slide).

Alternative: Monte Carlo quadrature

\int_{\mathbb{R}^p} f(q)\,\rho(q)\,dq \approx \frac{1}{R} \sum_{r=1}^{R} f(q^r),  E \sim \frac{1}{\sqrt{R}}

Advantage: Errors are independent of the dimension p.
Disadvantage: Convergence is very slow!
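The dimension-independence is easy to check empirically. The sketch below integrates a product-form test function (chosen so its exact integral is 1 for every p; the function itself is an assumption for illustration) over [0,1]^p by Monte Carlo and reports the error for several p and R.

```python
import numpy as np

# Monte Carlo quadrature of I = int_{[0,1]^p} f(q) dq, where
# f(q) = prod_j (1 + (q_j - 0.5)/2) has exact integral 1 for every p.
def f(q):
    return np.prod(1.0 + (q - 0.5) / 2.0, axis=-1)

rng = np.random.default_rng(2)
for p in (2, 10, 50):
    for R in (10**3, 10**5):
        q = rng.random((R, p))
        err = abs(f(q).mean() - 1.0)
        print(f"p={p:3d}  R={R:6d}  error={err:.2e}")
# Errors shrink roughly like 1/sqrt(R) and do not blow up with p.
```

By contrast, a tensored grid with even three points per direction would already require 3^50 model evaluations at p = 50.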

13 Numerical Quadrature

Problem, tensor-product error rates, and the Monte Carlo alternative as on the previous two slides.

Conclusion: For high enough dimension p, monkeys throwing darts will beat Gaussian and sparse grid techniques!

14 Monte Carlo Sampling Techniques

Issues: Very low accuracy and slow convergence. Random sampling may not evenly cover the space.

[Figure: samples from a uniform distribution]

15 Monte Carlo Sampling Techniques

Issues: Very low accuracy and slow convergence. Random sampling may not evenly cover the space.

Sobol' sequence: Use a base of two to form successively finer uniform partitions of the unit interval, and reorder the coordinates in each dimension.

[Figures: samples from a uniform distribution; Sobol' points]
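The referenced script rand_points.m is not reproduced here; as a rough Python stand-in, the sketch below compares pseudo-random samples with Sobol' points by counting how many cells of a 16 x 16 partition of the unit square are occupied. The cell-counting diagnostic is an illustrative choice, not the course's plotting code.

```python
import numpy as np
from scipy.stats import qmc

R = 256
rng = np.random.default_rng(3)
uniform_pts = rng.random((R, 2))                       # pseudo-random samples
sobol_pts = qmc.Sobol(d=2, scramble=False).random(R)   # Sobol' sequence

# Crude coverage check: count occupied cells of a 16x16 partition of [0,1]^2.
def occupied_cells(pts, bins=16):
    idx = np.floor(pts * bins).astype(int)
    return len({tuple(row) for row in idx})

print("uniform:", occupied_cells(uniform_pts), "of", 16 * 16)
print("Sobol':", occupied_cells(sobol_pts), "of", 16 * 16)
```

The Sobol' points spread far more evenly (with this construction they occupy essentially every cell), whereas pseudo-random sampling typically leaves dozens of cells empty.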

16 Quasi-Monte Carlo Sampling Techniques

Issues: Very low accuracy and slow convergence. Random sampling may not evenly cover the space.

Monte Carlo: \int_{\mathbb{R}^p} f(q)\,\rho(q)\,dq \approx \frac{1}{R} \sum_{r=1}^{R} f(q^r),  E \sim \frac{1}{\sqrt{R}}

Quasi-Monte Carlo: E \sim O\bigl((\log R)^p / R\bigr)

[Figures: samples from a uniform distribution; Sobol' points]

17 Monte Carlo Sampling Techniques

Example: Use Monte Carlo sampling to approximate the area of a circle inscribed in a square:

\frac{A_c}{A_s} = \frac{\pi r^2}{4 r^2} = \frac{\pi}{4}  \Rightarrow  A_c = \frac{\pi}{4} A_s

Strategy: Randomly sample N points in the square; approximately \frac{\pi}{4} N of them land in the circle. Count the M points in the circle, so \pi \approx \frac{4M}{N}.

Later tutorial: Run the code and see if we observe the convergence rate 1/\sqrt{N}.

Quasi-Monte Carlo: Talk with M. Hyman; SAMSI Program on QMC.
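The tutorial script pi_approx.m is not reproduced here; the following Python sketch implements the same dart-throwing estimate and prints the error for increasing N so the 1/sqrt(N) trend can be eyeballed.

```python
import numpy as np

# Monte Carlo estimate of pi: fraction of points in the unit square that
# fall inside the quarter circle, times 4.
rng = np.random.default_rng(4)
for N in (10**2, 10**4, 10**6):
    pts = rng.random((N, 2))
    M = np.count_nonzero(pts[:, 0]**2 + pts[:, 1]**2 <= 1.0)
    est = 4.0 * M / N
    print(f"N={N:8d}  pi_hat={est:.5f}  error={abs(est - np.pi):.2e}")
# The error decreases roughly like 1/sqrt(N).
```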

18 Confidence, Credible and Prediction Intervals

Note: We now know how to compute the mean response for the QoI. How do we compute appropriate intervals?

SIR model:

dS/dt = \delta N - \delta S - \gamma k I S,  S(0) = S_0   (Susceptible)
dI/dt = \gamma k I S - (r + \delta) I,  I(0) = I_0   (Infectious)
dR/dt = r I - \delta R,  R(0) = R_0   (Recovered)

19 Confidence, Credible and Prediction Intervals

Data: y = [y_1, ..., y_n], iid random observations.

Confidence interval (frequentist): A 100(1-\alpha)% confidence interval for a fixed, unknown parameter q_0 is a random interval [L_c(y), U_c(y)] having probability at least 1-\alpha of covering q_0 under the joint distribution of y.

Credible interval (Bayesian): A 100(1-\alpha)% credible interval is an interval having probability at least 1-\alpha of containing q.

Strategy: Sample from the parameter density \rho_Q(q).

[Figures: 90% confidence intervals; 90% credible interval]

20 Confidence, Credible and Prediction Intervals

Data: y = [y_1, ..., y_n], iid random observations.

Prediction interval: A 100(1-\alpha)% prediction interval for a future observable y_{n+1} is a random interval [L_c(y), U_c(y)] having probability at least 1-\alpha of containing y_{n+1} under the joint distribution of (y, y_{n+1}).

Example: Consider the linear model

y_i = q_0 + q_1 x_i + \varepsilon_i,  i = 1, ..., n

[Figure: prediction interval versus credible interval]
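A minimal sketch of the distinction for this linear model, using assumed (synthetic) posterior samples in place of an actual MCMC chain: the credible band reflects parameter uncertainty in q_0 + q_1 x, while the prediction band additionally folds in the observation error and is therefore wider.

```python
import numpy as np

# Credible vs. prediction intervals for y_i = q0 + q1*x_i + eps_i,
# using assumed (synthetic) posterior samples rather than a real chain.
rng = np.random.default_rng(5)
M = 5000
q0 = rng.normal(1.0, 0.1, M)        # posterior samples of the intercept
q1 = rng.normal(2.0, 0.05, M)       # posterior samples of the slope
sigma = 0.3                         # observation-error standard deviation

x = np.linspace(0.0, 5.0, 50)
model = q0[:, None] + q1[:, None] * x               # M x 50 model responses
pred = model + rng.normal(0.0, sigma, model.shape)  # add observation error

cred_lo, cred_hi = np.percentile(model, [5, 95], axis=0)  # 90% credible band
pred_lo, pred_hi = np.percentile(pred, [5, 95], axis=0)   # 90% prediction band
print(cred_hi[-1] - cred_lo[-1], pred_hi[-1] - pred_lo[-1])  # prediction band is wider
```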

21 Example: HIV Model

Model: the six-compartment HIV model from Slide 1.

Parameter chains and densities: q = [b_E, \delta, d_1, k_2, \lambda_1, K_b]

22 Propagation of Uncertainty: HIV Example

Techniques: Sample from the parameter and observation-error densities to construct the mean response, credible intervals, and prediction intervals for the QoI. Slow convergence rate O(1/\sqrt{M}).

[Figures: parameter densities; samples from the chain; data with credible and prediction intervals; non-Gaussian credible and prediction intervals]

23 Monte Carlo Quadrature: MATLAB Tutorial

Run rand_points.m to observe uniformly sampled and Sobol' points.

Run pi_approx.m with different values of N to see if you observe the convergence rate 1/\sqrt{N}.

SIR model:

dS/dt = \delta N - \delta S - \gamma k I S,  S(0) = S_0   (Susceptible)
dI/dt = \gamma k I S - (r + \delta) I,  I(0) = I_0   (Infectious)
dR/dt = r I - \delta R,  R(0) = R_0   (Recovered)

Note: Run either the posted code for the 4-parameter model or your modification for the 3-parameter model and compute the prediction intervals.
