Assessment of uncertainty in computer experiments: from Universal Kriging to Bayesian Kriging. Céline Helbert, Delphine Dupuy and Laurent Carraro


1 Assessment of uncertainty in computer experiments: from Universal Kriging to Bayesian Kriging. Céline Helbert, Delphine Dupuy and Laurent Carraro

2 Historical context
- Kriging was first introduced in the field of geostatistics (Matheron, 1960).
- Recent use as a response surface for computer experiments (Sacks, 1989; Santner, 2003).
- Prediction and uncertainty on the prediction (Jones, 1998; Oakley, 2004).
- Two different approaches among practitioners:
  - Universal Kriging (UK): parameters are estimated (CV, ML)
  - Bayesian Kriging (BK): parameters are random variables
- Goal: BK allows the interpretation of UK uncertainty as a prediction variance.
- Application of BK: petroleum case study

3 Outline
- Universal Kriging: limits
- Bayesian Kriging: pros and cons
- Case study

4 Universal Kriging

5 Probabilistic context
Assumptions: Y(x) = f(x)^T β + Z(x), where Z is a Gaussian process with E(Z(x)) = 0 and Cov(Z(x), Z(x+h)) = σ² R_θ(h).
n values Y = (y_1, ..., y_n)^T are observed at the points X = (x_1, ..., x_n)^T.
Prediction and uncertainty:

6 Probabilistic context
Assumptions: Y(x) = f(x)^T β + Z(x), where Z is a Gaussian process with E(Z(x)) = 0 and Cov(Z(x), Z(x+h)) = σ² R_θ(h).
n values Y = (y_1, ..., y_n)^T are observed at the points X = (x_1, ..., x_n)^T.
Prediction and uncertainty:
Simple Kriging case (parameters are known):
Y_SK(x_0) = f(x_0)^T β + r_θ(x_0)^T R_θ^{-1} (Y - F β)
σ²_SK(x_0) = σ² (1 - r_θ(x_0)^T R_θ^{-1} r_θ(x_0))

7 Probabilistic context
Assumptions: Y(x) = f(x)^T β + Z(x), where Z is a Gaussian process with E(Z(x)) = 0 and Cov(Z(x), Z(x+h)) = σ² R_θ(h).
n values Y = (y_1, ..., y_n)^T are observed at the points X = (x_1, ..., x_n)^T.
Prediction and uncertainty:
Simple Kriging case (parameters are known):
Y_SK(x_0) = f(x_0)^T β + r_θ(x_0)^T R_θ^{-1} (Y - F β)
σ²_SK(x_0) = σ² (1 - r_θ(x_0)^T R_θ^{-1} r_θ(x_0))
Universal Kriging case (parameters are estimated):
Y_UK(x_0) = f(x_0)^T β̂ + r_θ̂(x_0)^T R_θ̂^{-1} (Y - F β̂)
σ²_UK(x_0) = σ̂² (1 - r_θ̂(x_0)^T R_θ̂^{-1} r_θ̂(x_0) + (f(x_0)^T - r_θ̂(x_0)^T R_θ̂^{-1} F) (F^T R_θ̂^{-1} F)^{-1} (f(x_0)^T - r_θ̂(x_0)^T R_θ̂^{-1} F)^T)
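
The Simple and Universal Kriging formulas above translate almost directly into code. Below is a minimal numpy sketch, assuming a Gaussian correlation R_θ(h) = exp(-Σ_k (h_k/θ_k)²); all function and argument names are illustrative, not the authors' code.

```python
import numpy as np

def corr(X1, X2, theta):
    """Gaussian correlation R_theta(h) = exp(-sum_k (h_k/theta_k)^2) (an assumed choice)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) / theta) ** 2
    return np.exp(-d2.sum(axis=-1))

def simple_kriging(x0, X, Y, F, f0, beta, sigma2, theta):
    """Simple Kriging: beta, sigma2 and theta are known."""
    R = corr(X, X, theta)
    r = corr(X, x0[None, :], theta)[:, 0]                 # r_theta(x0)
    y_sk = f0 @ beta + r @ np.linalg.solve(R, Y - F @ beta)
    s2_sk = sigma2 * (1.0 - r @ np.linalg.solve(R, r))
    return y_sk, s2_sk

def universal_kriging(x0, X, Y, F, f0, sigma2_hat, theta_hat):
    """Universal Kriging: beta is replaced by its generalized least squares estimate."""
    R = corr(X, X, theta_hat)
    r = corr(X, x0[None, :], theta_hat)[:, 0]
    Ri_F = np.linalg.solve(R, F)
    Ri_Y = np.linalg.solve(R, Y)
    Ri_r = np.linalg.solve(R, r)
    beta_hat = np.linalg.solve(F.T @ Ri_F, F.T @ Ri_Y)    # (F' R^-1 F)^-1 F' R^-1 Y
    y_uk = f0 @ beta_hat + r @ np.linalg.solve(R, Y - F @ beta_hat)
    u = f0 - F.T @ Ri_r                                   # f(x0) - F' R^-1 r_theta(x0)
    s2_uk = sigma2_hat * (1.0 - r @ Ri_r + u @ np.linalg.solve(F.T @ Ri_F, u))
    return y_uk, s2_uk
```

For a constant trend f(x) = 1, F is the n x 1 column of ones and f0 = [1].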

8 Simple Kriging: example
E(Y(x)) = 0, Var(Y(x)) = 4, Corr(Y(x), Y(x+h)) = exp(-(h/0.2)²).
[Figure: Simple Kriging prediction (SK) and the data points for this example.]

9 Limits of Universal Kriging
Difficulties due to estimation:
- flat likelihood,
- too few data to estimate the covariance function (variance and ranges),
- sensitivity to the experimental design.
Underestimation of uncertainty: σ²_UK(x_0) does not take into account the uncertainty due to the estimation of the variance σ² and of the range θ.
No probabilistic interpretation:
Y_UK(x_0) = f(x_0)^T β̂ + r_θ̂(x_0)^T R_θ̂^{-1} (Y - F β̂) cannot be read as E(Y(x_0) | Y),
σ²_UK(x_0) = σ̂² (1 - r_θ̂(x_0)^T R_θ̂^{-1} r_θ̂(x_0) + ...) cannot be read as Var(Y(x_0) | Y).

10 Bayesian Kriging

11 Model of Bayesian Kriging
Assumptions: β, θ, σ are random variables; let π be their prior distribution.
Conditionally on the parameters, Y(x) | β, θ, σ = f(x)^T β + Z(x) for x ∈ D, with the same assumptions on Z(x) as before.
n values Y = (y_1, ..., y_n)^T are observed at the points X = (x_1, ..., x_n)^T.
Interpretation: a mixture of Gaussian processes (Y(x) is not Gaussian); the weight of a given process in the mixture depends on its prior π.

12 Equations of Bayesian Kriging
Y(x) | Y, β, θ, σ ~ N( Y_SK(x), σ²_SK(x) )
E(Y(x_0) | Y) = ∫ E(Y(x_0) | Y, β, θ, σ) π(β, θ, σ | Y) dβ dθ dσ
π(β, θ, σ | Y) = L(Y; β, θ, σ) π(β, θ, σ) / π(Y)

13 Equations of Bayesian Kriging
Y(x) | Y, β, θ, σ ~ N( Y_SK(x), σ²_SK(x) )
E(Y(x_0) | Y) = ∫ E(Y(x_0) | Y, β, θ, σ) π(β, θ, σ | Y) dβ dθ dσ
π(β, θ, σ | Y) = L(Y; β, θ, σ) π(β, θ, σ) / π(Y)
Prediction: Y_BK(x_0) = E(Y(x_0) | Y)

14 Equations of Bayesian Kriging
Y(x) | Y, β, θ, σ ~ N( Y_SK(x), σ²_SK(x) )
E(Y(x_0) | Y) = ∫ E(Y(x_0) | Y, β, θ, σ) π(β, θ, σ | Y) dβ dθ dσ
π(β, θ, σ | Y) = L(Y; β, θ, σ) π(β, θ, σ) / π(Y)
Prediction: Y_BK(x_0) = E(Y(x_0) | Y)
Measure of uncertainty: σ²_BK(x_0) = Var(Y(x_0) | Y), obtained by simulation of the distribution of Y(x_0) | Y.
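
In practice the integral over (β, θ, σ) is approximated by Monte Carlo: given posterior draws of the parameters (e.g. from MCMC), Y(x_0) | Y, β, θ, σ is Gaussian with the Simple Kriging moments, so the BK prediction and variance follow from the laws of total expectation and total variance. A hedged sketch, reusing the simple_kriging helper from the earlier snippet; posterior_samples is an assumed iterable of (beta, sigma2, theta) draws.

```python
import numpy as np

def bk_predict(x0, X, Y, F, f0, posterior_samples):
    """Monte Carlo approximation of Y_BK(x0) = E(Y(x0)|Y) and sigma2_BK(x0) = Var(Y(x0)|Y)."""
    means, variances = [], []
    for beta, sigma2, theta in posterior_samples:
        m, v = simple_kriging(x0, X, Y, F, f0, beta, sigma2, theta)
        means.append(m)
        variances.append(v)
    means, variances = np.array(means), np.array(variances)
    y_bk = means.mean()                         # law of total expectation
    s2_bk = variances.mean() + means.var()      # law of total variance: E[Var] + Var[E]
    return y_bk, s2_bk
```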

15 Particular case of prior distribution: Gaussian case for β (θ and σ are constant)
Prior distribution: π(β) = N(μ, λΣ)

16 Particular case of prior distribution: Gaussian case for β (θ and σ are constant)
Prior distribution: π(β) = N(μ, λΣ)
Posterior Gaussian distribution for β:
E(β | Y) = μ + λΣ F^T (F λΣ F^T + σ² R_θ)^{-1} (Y - F μ)
Var(β | Y) = λΣ - λΣ F^T (F λΣ F^T + σ² R_θ)^{-1} F λΣ

17 Particular case of prior distribution: Gaussian case for β (θ and σ are constant)
Prior distribution: π(β) = N(μ, λΣ)
Posterior Gaussian distribution for β (as above)
Posterior Gaussian distribution for Y(x_0):
E(Y(x_0) | Y) = (f(x_0)^T - r_θ(x_0)^T R_θ^{-1} F) (μ + λΣ F^T (F λΣ F^T + σ² R_θ)^{-1} (Y - F μ)) + r_θ(x_0)^T R_θ^{-1} Y
Var(Y(x_0) | Y) = (f(x_0)^T - r_θ(x_0)^T R_θ^{-1} F) (λΣ - λΣ F^T (F λΣ F^T + σ² R_θ)^{-1} F λΣ) (f(x_0)^T - r_θ(x_0)^T R_θ^{-1} F)^T + σ² (1 - r_θ(x_0)^T R_θ^{-1} r_θ(x_0))
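
These closed-form expressions are easy to check numerically. A small numpy sketch under the same assumptions (θ and σ² fixed, prior N(μ, λΣ) on β); lam_Sigma stands for λΣ, and all names are illustrative.

```python
import numpy as np

def beta_posterior(Y, F, R, mu, lam_Sigma, sigma2):
    """Posterior mean and covariance of beta for the conjugate Gaussian prior N(mu, lam_Sigma)."""
    S = F @ lam_Sigma @ F.T + sigma2 * R                  # F (lambda Sigma) F' + sigma^2 R_theta
    K = lam_Sigma @ F.T @ np.linalg.inv(S)
    return mu + K @ (Y - F @ mu), lam_Sigma - K @ F @ lam_Sigma

def y0_posterior(f0, r, R, Y, F, mu, lam_Sigma, sigma2):
    """Posterior mean and variance of Y(x0); f0 = f(x0), r = r_theta(x0)."""
    mean_b, cov_b = beta_posterior(Y, F, R, mu, lam_Sigma, sigma2)
    Ri_r = np.linalg.solve(R, r)
    a = f0 - F.T @ Ri_r                                   # (f(x0)' - r' R^-1 F) as a vector
    mean = a @ mean_b + r @ np.linalg.solve(R, Y)
    var = a @ cov_b @ a + sigma2 * (1.0 - r @ Ri_r)
    return mean, var
```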

18 Particular case of prior distribution: Gaussian case for β (θ and σ are constant)
Prior distribution: π(β) = N(μ, λΣ)
Particular case λ → +∞ (non-informative prior for β):
Posterior Gaussian distribution for β:
E(β | Y) = (F^T R_θ^{-1} F)^{-1} F^T R_θ^{-1} Y = β̂
Var(β | Y) = σ² (F^T R_θ^{-1} F)^{-1} = Var(β̂)
Posterior Gaussian distribution for Y(x_0):
E(Y(x_0) | Y) = Y_UK(x_0) and Var(Y(x_0) | Y) = σ²_UK(x_0) (with θ and σ held fixed, e.g. at their ML estimates).
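
A short derivation, not on the original slide, of why the limit λ → +∞ yields the generalized least squares estimate β̂; it uses the standard identity A F^T (F A F^T + B)^{-1} = (A^{-1} + F^T B^{-1} F)^{-1} F^T B^{-1} with A = λΣ and B = σ² R_θ.

```latex
\begin{align*}
\lambda\Sigma F^T \bigl( F \lambda\Sigma F^T + \sigma^2 R_\theta \bigr)^{-1}
  &= \Bigl( (\lambda\Sigma)^{-1} + \tfrac{1}{\sigma^2} F^T R_\theta^{-1} F \Bigr)^{-1}
     \tfrac{1}{\sigma^2} F^T R_\theta^{-1}
  \;\xrightarrow[\;\lambda \to +\infty\;]{}\;
     \bigl( F^T R_\theta^{-1} F \bigr)^{-1} F^T R_\theta^{-1} , \\
E(\beta \mid Y)
  &\;\longrightarrow\;
     \mu + \bigl( F^T R_\theta^{-1} F \bigr)^{-1} F^T R_\theta^{-1} (Y - F\mu)
   = \bigl( F^T R_\theta^{-1} F \bigr)^{-1} F^T R_\theta^{-1} Y
   = \hat\beta .
\end{align*}
```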

19 Bayesian Kriging: difficulties
- The simulation of the posterior distribution of the parameters can be hard (simulation by a Markov chain Monte Carlo method).
- The choice of the prior is difficult.
  - Case of a flat prior for β, θ and σ: roughly equivalent to maximizing the likelihood. Advantages: the prediction variance takes all sources of uncertainty into account, and the optimization problem disappears.
  - Case of an informative prior: which one? What impact?
IDEA: use a simplified, faster simulation to derive prior information. Example: petroleum field.

20 Application
Simulator: flow simulator 3DSL.
3 inputs on [-1, 1]: lmultkz (permeability), krwmax (relative permeability), lbhp (low bottom hole pressure).
Output: field oil production total (FOPT) after 7000 days.
Problem: uncertainty analysis.
Method: a metamodel and its uncertainty (Bayesian Kriging).

21 3DSL / degraded simulations
Idea: use a faster simulator to get prior information, namely degraded simulations (NODESMAX, DTMAX, DVPMAX, etc.).

22 3DSL / degraded simulations
[Table: correlation coefficient between 3DSL and each degraded simulator (Degraded 1, Degraded 2), overall (ALL) and along each input (lmultkz, krwmax, lbhp); numerical values not recovered.]
Note: calculations carried out on a grid of 1331 = 11^3 points.

23 3DSL / degraded simulations
[Figure: FOPT (of the order of 3.4-3.5 x 10^7) as a function of each input (LMULTKZ, KRWMAX, LBHP) for 3DSL, Degraded 1 and Degraded 2.]

24 4 different strategies
[Table: number of runs and computing time on 3DSL, Degraded 1 and Degraded 2 for the four strategies UK, no-info BK, BK info 1 and BK info 2; most numerical entries not recovered.]
Info 1 = a no-info BK fit on the 4 runs of Degraded 1 provides prior information on the trend, the variance and the correlation.

25 4 different strategies
[Table: comparison of the four strategies (UK, no-info BK, BK info 1, BK info 2); RMSE* and average standard deviation values not recovered; proportion of outside points*: 40%, 19%, 8%, 4%.]
Note: calculations carried out on a grid of 1331 = 11^3 points.
Accuracy of prediction: RMSE = sqrt( (1/N) Σ_{i=1}^{N} (Y(x_i) - Ŷ(x_i))² )
Accuracy of uncertainty: ASD = (1/N) Σ_{i=1}^{N} σ_Ŷ(x_i) and PR = (1/N) Σ_{i=1}^{N} 1{ |Y(x_i) - Ŷ(x_i)| > 2 σ_Ŷ(x_i) }
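
The three criteria translate into a few lines of numpy. A hedged sketch, assuming y_true, y_pred and sd_pred are arrays over the validation grid and that "outside points" means outside the ±2 standard-deviation prediction band (the factor 2 is an assumption here).

```python
import numpy as np

def validation_metrics(y_true, y_pred, sd_pred, k=2.0):
    """RMSE of the predictor, average predictive standard deviation (ASD),
    and proportion of points outside the +/- k*sd prediction band (PR)."""
    err = y_true - y_pred
    rmse = np.sqrt(np.mean(err ** 2))
    asd = np.mean(sd_pred)
    pr = np.mean(np.abs(err) > k * sd_pred)
    return rmse, asd, pr
```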

26 Conclusion
Advantages of Bayesian Kriging:
- In the case of a non-informative prior for β: E(Y(x_0) | Y) = Y_UK(x_0) and Var(Y(x_0) | Y) = σ²_UK(x_0).
- Good estimation of the prediction variance, which takes into account all sources of uncertainty: on β, σ and θ.
Weaknesses of Bayesian Kriging:
- MCMC simulations
- Choice of the prior

27 Universal Kriging: example
θ = 0.2 (known), σ̂² = 0.86 (true σ² = 4), β̂ = 1.09 (true β = 0).
[Figure: SK and UK predictions together with the data points for this example.]

28 Sensitivity to the experimental design
Likelihood optimization: f(θ) = -log(L(θ)).
[Figure: f(θ) for several designs, showing variability in the estimated θ and an identification problem.]
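
To make the optimization concrete, here is a hedged numpy sketch of the concentrated negative log-likelihood f(θ) = -log L(θ), in which β and σ² are profiled out at their closed-form ML values given θ (it reuses the corr helper from the first snippet and is not the authors' code). Evaluating it over a grid of θ values reveals the flat regions and identification problems shown on the slide.

```python
import numpy as np

def neg_log_likelihood(theta, X, Y, F):
    """Concentrated negative log-likelihood f(theta) = -log L(theta), up to an additive constant."""
    n = len(Y)
    R = corr(X, X, theta)                                 # correlation matrix R_theta
    Ri_F = np.linalg.solve(R, F)
    Ri_Y = np.linalg.solve(R, Y)
    beta_hat = np.linalg.solve(F.T @ Ri_F, F.T @ Ri_Y)    # GLS trend estimate given theta
    res = Y - F @ beta_hat
    sigma2_hat = res @ np.linalg.solve(R, res) / n        # ML variance estimate given theta
    _, logdet_R = np.linalg.slogdet(R)
    return 0.5 * (n * np.log(sigma2_hat) + logdet_R + n)
```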

29 Prior and posterior distributions
[Table: prior and posterior means E(· | Y) and standard deviations Std(· | Y) of beta0, beta1, beta2, beta3, sigma, teta1, teta2, teta3, based on 17 points, for the different strategies (Degraded 1, Degraded 2, no info, info 1, info 2); numerical values too corrupted to recover.]

30 Sensitivity to range variation
[Figure: three kriging fits with beta = 1.1, sigma = 0.94; beta = 1.1, sigma = 0.98; and beta = 4.4, sigma = 66, for different values of the range teta (teta values not recovered).]

31 Sensitivity to range variation
[Figure: the same three fits (beta = 1.1, sigma = 0.98; beta = 4.4, sigma = 66; beta = 1.1, sigma = 0.94) together with the likelihood as a function of the scale parameters θ_1 and θ_2.]
