Multi-fidelity co-kriging models

1 Application to sequential design. Loïc Le Gratiet (1,2), Claire Cannamela (3). (1) EDF R&D, Chatou, France; (2) UNS CNRS, Sophia Antipolis, France; (3) CEA, DAM, DIF, F-91297 Arpajon, France. ANR CHORUS, April 3, 2014.

2 Introduction. Context: input parameters x feed a physical system with output g_exp(x), which is approximated by a simulator g(x), itself replaced by a meta-model ĝ(x).

3 Motivations. Objective: replace the output of a code, called g_2(x), by a metamodel, with g_2 : x ∈ Q ⊂ R^d → R. Framework: a coarse version g_1 of g_2 is available: g_2(x) is the high-fidelity code and g_1(x) the low-fidelity code. Principle: build a metamodel of g_2(x) that also integrates observations of the coarse code output: the multi-fidelity co-kriging model.

4-6 Recursive formulation of the model. Multi-fidelity co-kriging model [Kennedy & O'Hagan (2000), Le Gratiet (2013), Le Gratiet (2014)]:

  Z_2(x) = ρ Z̃_1(x) + δ(x),   with Z̃_1(x) ⊥ δ(x),

where Z̃_1(x) = [Z_1(x) | Z_1 = g_1, β_1, σ²_1, θ_1], with g_1 = (g_1(x), x ∈ D_1),
and Z_1(x) ~ GP(f_1^t(x) β_1, σ²_1 r_1(x, x̃; θ_1)),  δ(x) ~ GP(f_δ^t(x) β_δ, σ²_δ r_δ(x, x̃; θ_δ)).

Parameter estimation:
- θ_1, θ_δ, σ²_1, σ²_δ: maximum likelihood method.
- β_1, (β_δ, ρ): analytical posterior distribution (Bayesian inference).

Finally, Z̃_2(x) = [Z_2(x) | Z_2 = g_2, Z_1 = g_1] with g_2 = (g_2(x), x ∈ D_2).
We suppose that D_2 ⊆ D_1 and that θ_1, θ_δ, σ²_1, σ²_δ are known.
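To make the recursive construction concrete, here is a minimal numpy sketch of the two-step fit described above: a kriging model for the low-fidelity code on D_1, then an estimate of ρ and a kriging model for the bias δ on the nested design D_2. Hyperparameters are fixed rather than estimated by maximum likelihood, ρ is obtained by a crude least-squares fit rather than from its posterior distribution, and all names (corr, fit_kriging, fit_cokriging, ...) are illustrative, not taken from the MuFiCokriging package.

```python
# Minimal numpy sketch of the recursive co-kriging fit (illustrative names, fixed
# hyperparameters; the slides estimate theta and sigma^2 by maximum likelihood and
# put posterior distributions on beta and rho).
import numpy as np

def corr(X, Y, theta):
    """Squared-exponential correlation matrix r(x, x'; theta)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) / theta) ** 2
    return np.exp(-0.5 * d2.sum(axis=-1))

def fit_kriging(X, y, theta, nugget=1e-6):
    """Kriging with a constant trend: beta by generalized least squares."""
    R = corr(X, X, theta) + nugget * np.eye(len(X))
    L = np.linalg.cholesky(R)
    solve = lambda b: np.linalg.solve(L.T, np.linalg.solve(L, b))
    ones = np.ones(len(X))
    beta = (ones @ solve(y)) / (ones @ solve(ones))
    sigma2 = ((y - beta) @ solve(y - beta)) / len(X)
    return dict(X=X, y=y, theta=theta, L=L, beta=beta, sigma2=sigma2)

def predict_kriging(model, x):
    """Predictive mean and (simple-kriging style) variance at new points x."""
    r = corr(x, model["X"], model["theta"])
    solve = lambda b: np.linalg.solve(model["L"].T, np.linalg.solve(model["L"], b))
    mean = model["beta"] + r @ solve(model["y"] - model["beta"])
    var = model["sigma2"] * (1.0 - np.sum(r * solve(r.T).T, axis=1))
    return mean, np.maximum(var, 0.0)

def fit_cokriging(D1, g1, D2, g2, theta1, theta_d):
    """Recursive fit: level-1 model on D1, then rho and the bias delta on D2 (D2 in D1)."""
    m1 = fit_kriging(D1, g1, theta1)
    mu1_D2, _ = predict_kriging(m1, D2)                    # level-1 predictions at D2
    rho = float(mu1_D2 @ g2 / (mu1_D2 @ mu1_D2))           # crude least-squares estimate of rho
    m_delta = fit_kriging(D2, g2 - rho * mu1_D2, theta_d)  # kriging model of the bias delta(x)
    return m1, m_delta, rho
```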

7 Predictive distribution. In Universal Co-kriging, the predictive distribution of Z_2(x) is not Gaussian. The predictive mean and variance can be decomposed as:

  μ_{Z_2}(x) = E[Z_2(x) | Z_2 = g_2, Z_1 = g_1] = ρ̂ μ_{Z_1}(x) + μ_δ(x)
  σ²_{Z_2}(x) = var(Z_2(x) | Z_2 = g_2, Z_1 = g_1) = σ̂²_ρ σ²_{Z_1}(x) + σ²_δ(x)

Remarks: in μ_{Z_2}(x), β and ρ are replaced by their posterior means; in σ²_{Z_2}(x), the uncertainty on β and ρ is accounted for through their posterior distributions.
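Continuing the sketch above (reusing fit_kriging, predict_kriging and fit_cokriging), the predictive mean and variance combine exactly as in the decomposition on this slide, except that the plug-in ρ̂² replaces σ̂²_ρ, so the extra variance coming from the posteriors of β and ρ is ignored. The toy data at the end are purely illustrative.

```python
# Continuing the sketch: combine the two kriging predictors as on this slide
# (plug-in rho^2 replaces sigma_rho^2; posterior uncertainty on beta and rho is ignored).
import numpy as np

def predict_cokriging(m1, m_delta, rho, x):
    mu1, var1 = predict_kriging(m1, x)       # mu_{Z_1}(x), sigma^2_{Z_1}(x)
    mud, vard = predict_kriging(m_delta, x)  # mu_delta(x), sigma^2_delta(x)
    return rho * mu1 + mud, rho**2 * var1 + vard

# Purely illustrative 1-D toy data (not from the talk):
rng = np.random.default_rng(0)
D1 = rng.uniform(0.0, 1.0, size=(20, 1))
D2 = D1[:5]                                          # nested design D2 in D1
g1 = np.sin(8.0 * D1[:, 0])                          # cheap code
g2 = 1.8 * np.sin(8.0 * D2[:, 0]) + 0.2 * D2[:, 0]   # expensive code
m1, m_delta, rho = fit_cokriging(D1, g1, D2, g2, theta1=np.array([0.2]), theta_d=np.array([0.4]))
mean2, var2 = predict_cokriging(m1, m_delta, rho, np.linspace(0.0, 1.0, 101)[:, None])
```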

8 Generalizations. Generalizations of the AR(1) model:
- s > 2 levels of code.
- ρ(x) = f_ρ^t(x) β_ρ.
- Bayesian formulation.
- Non-nested experimental design sets (see L. Le Gratiet's thesis, 2013).
Extension of the AR(1) approach (see F. Zertuche): Z_2(x) = ψ(Z_1(x)) + δ(x).
Another Bayesian formulation (see Qian and Wu, 2008): ρ(x) is a Gaussian process and Z_1(x) is assumed to be known.

9-11 Sequential design. Objective: minimize the following generalization error:

  IMSE = ∫_Q σ²_{Z_2}(x) dx = σ̂²_ρ ∫_Q σ²_{Z_1}(x) dx + ∫_Q σ²_δ(x) dx

Sequential strategy: select a new point x_{n+1} such that

  x_{n+1} = argmax_x σ²_{Z_2}(x)

Question: which code level should be simulated?
- What is the contribution of each code level to the model error?  σ²_{Z_2}(x_{n+1}) = σ̂²_ρ σ²_{Z_1}(x_{n+1}) + σ²_δ(x_{n+1})
- What is the computational cost of each code?
- What is the expected reduction of the generalization error?
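A possible discrete version of the sequential step, reusing predict_cokriging and the toy model from the sketch above: the continuous argmax over Q is replaced by a search over a random candidate cloud, which is an assumption of this sketch rather than the slides' exact procedure.

```python
# Discrete stand-in for x_{n+1} = argmax_x sigma^2_{Z_2}(x): search a random candidate
# cloud over Q = [0, 1]^d.
import numpy as np

def next_point(m1, m_delta, rho, d, n_candidates=2000, seed=1):
    candidates = np.random.default_rng(seed).uniform(0.0, 1.0, size=(n_candidates, d))
    _, var2 = predict_cokriging(m1, m_delta, rho, candidates)
    best = int(np.argmax(var2))
    return candidates[best], float(var2[best])

x_new, crit = next_point(m1, m_delta, rho, d=1)
```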

12-13 Code level selection. What is the computational cost of each code? C is the CPU time ratio between g_2(x) and g_1(x): 1 run of g_1(x) and g_2(x) costs as much as C + 1 runs of g_1(x) (i.e. D_2 ⊆ D_1). What is the expected reduction of the error?
- Reduction of the generalization error for Z_1(x): σ̂²_ρ σ²_{Z_1}(x_{n+1}).
- Reduction of the generalization error for the bias δ(x): σ²_δ(x_{n+1}).

14 Illustration of the design criterion. [Figure: design points in the (x_1, x_2) input space.]

15-18 Code level selection. Expected error reduction when running code level 2: σ̂²_ρ σ²_{Z_1}(x_{n+1}) + σ²_δ(x_{n+1}). Expected error reduction when running code level 1: (C + 1) σ̂²_ρ σ²_{Z_1}(x_{n+1}). It is therefore worth simulating g_2(x) if

  (C + 1) σ̂²_ρ σ²_{Z_1}(x_{n+1}) < σ̂²_ρ σ²_{Z_1}(x_{n+1}) + σ²_δ(x_{n+1}),

i.e. if

  σ̂²_ρ σ²_{Z_1}(x_{n+1}) / (σ̂²_ρ σ²_{Z_1}(x_{n+1}) + σ²_δ(x_{n+1})) < 1 / (C + 1).
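The level-selection rule can be sketched as follows, again reusing the kriging helpers defined earlier; the point estimate ρ̂² stands in for σ̂²_ρ, and C is the user-supplied CPU time ratio (the value below is an arbitrary example).

```python
# Level-selection rule of this slide: simulate g_2 at x_{n+1} only if
# rho^2 sigma_1^2 / (rho^2 sigma_1^2 + sigma_delta^2) < 1 / (C + 1).
def choose_level(m1, m_delta, rho, x_new, C):
    _, var1 = predict_kriging(m1, x_new[None, :])
    _, vard = predict_kriging(m_delta, x_new[None, :])
    ratio = rho**2 * var1[0] / (rho**2 * var1[0] + vard[0] + 1e-12)  # guard against 0/0
    return 2 if ratio < 1.0 / (C + 1.0) else 1

level = choose_level(m1, m_delta, rho, x_new, C=10.0)  # C = 10 is an illustrative value
```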

19-20 New strategy. Problem: the criterion x_{n+1} = argmax_x σ²_{Z_2}(x) does not take the real prediction error into account. New strategy: to take the true prediction error into account, we consider the following criterion (adjusted variance):

  x_{n+1} = argmax_x [ σ̂²_ρ σ²_{Z_1,UK}(x) (1 + Σ_{i=1..n_1} (ε²_{LOO,1}(x^{(1)}_i) / σ²_{LOO,1}(x^{(1)}_i)) 1_{x ∈ V_{i,1}})
                      + σ²_{δ,UK}(x) (1 + Σ_{i=1..n_2} (ε²_{LOO,δ}(x^{(2)}_i) / σ²_{LOO,δ}(x^{(2)}_i)) 1_{x ∈ V_{i,2}}) ]

where V_{i,j} is the Voronoi cell associated with x^{(j)}_i, j = 1, 2, i = 1, ..., n_j.
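A brute-force sketch of the adjusted variance, reusing fit_kriging and predict_kriging from above: the leave-one-out quantities are obtained by refitting the model without each design point (the Technometrics paper cited at the end uses fast cross-validation formulas instead), and the Voronoi cell containing x is identified through its nearest design point.

```python
# Brute-force sketch of the adjusted-variance criterion of this slide.
import numpy as np

def loo_ratios(X, y, theta):
    """epsilon_LOO^2 / sigma_LOO^2 at every design point, by refitting without that point."""
    ratios = np.empty(len(X))
    for i in range(len(X)):
        keep = np.arange(len(X)) != i
        mu, var = predict_kriging(fit_kriging(X[keep], y[keep], theta), X[i:i + 1])
        ratios[i] = (y[i] - mu[0]) ** 2 / max(var[0], 1e-12)
    return ratios

def nearest_cell(x, X):
    """Index of the design point whose Voronoi cell contains each row of x."""
    return np.argmin(((x[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1), axis=1)

def adjusted_variance(m1, m_delta, rho, x):
    _, var1 = predict_kriging(m1, x)
    _, vard = predict_kriging(m_delta, x)
    r1 = loo_ratios(m1["X"], m1["y"], m1["theta"])[nearest_cell(x, m1["X"])]
    rd = loo_ratios(m_delta["X"], m_delta["y"], m_delta["theta"])[nearest_cell(x, m_delta["X"])]
    return rho**2 * var1 * (1.0 + r1) + vard * (1.0 + rd)
```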

21 Illustration of the new criterion. [Figure: one-dimensional example, f(x) plotted against x.]

22 Academic example: the Shubert function

  g(x) = (Σ_i i cos((i + 1) x_1 + i)) (Σ_i i cos((i + 1) x_2 + i))

[Figures: design points obtained with the maximum-variance criterion and with the Kleijnen criterion, in the (x_1, x_2) plane.]

Kleijnen, J. P. C. and van Beers, W. C. M. (2004), Application-driven sequential designs for simulation experiments: Kriging metamodelling. Journal of the Operational Research Society.
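For reference, the test function of this example can be coded as below; the summation bounds were lost in the transcription, so the standard Shubert definition with i = 1, ..., 5 is assumed here.

```python
# Shubert test function (assuming the standard bounds i = 1, ..., 5).
import numpy as np

def shubert(x):
    """g(x) = (sum_i i cos((i+1) x_1 + i)) * (sum_i i cos((i+1) x_2 + i))."""
    x = np.atleast_2d(x)
    i = np.arange(1, 6)
    s1 = np.sum(i * np.cos((i + 1) * x[:, [0]] + i), axis=1)
    s2 = np.sum(i * np.cos((i + 1) * x[:, [1]] + i), axis=1)
    return s1 * s2
```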

23 Academic example: the Shubert function (continued). [Figures: design points obtained with the adjusted-variance criterion, and IMSE versus number of observations for the adjMMSE, IMSE and Kleijnen criteria.]

24-26 Application: spherical tank under internal pressure. High-fidelity code: g_2(x) is the von Mises stress at point 1, 2 or 3 provided by a finite element code, with

  x = (P, R_int, T_shell, T_cap, E_shell, E_cap, σ_y,shell, σ_y,cap)  (d = 8)

P: internal pressure. R_int: tank internal radius. T_shell: tank thickness. T_cap: cap thickness. E_shell: tank Young's modulus. E_cap: cap Young's modulus. σ_y,shell: tank yield stress. σ_y,cap: cap yield stress.

Low-fidelity code: g_1 is the 1D approximation of g_2 (perfectly spherical tank):

  g_1(x) = 3 (R_int + T_shell)³ P / (2 ((R_int + T_shell)³ − R_int³))

[Figure: cross-section of the tank showing the cap, the shell and points 1, 2 and 3.]

The multi-fidelity co-kriging model is built with n_1 low-fidelity and n_2 high-fidelity observations; the model efficiency Q_2, estimated on a test set of high-fidelity runs, is about 86%.
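The low-fidelity code is cheap enough to write down directly; a sketch follows, with purely illustrative input values (only P, R_int and T_shell enter this 1D approximation).

```python
# 1D thick-sphere approximation used as the low-fidelity code g_1; the other five
# inputs only affect the finite-element code g_2. Numerical values are illustrative.
def g1_sphere(P, R_int, T_shell):
    """g_1(x) = 3 (R_int + T_shell)^3 P / (2 ((R_int + T_shell)^3 - R_int^3))."""
    R_ext = R_int + T_shell
    return 3.0 * R_ext**3 * P / (2.0 * (R_ext**3 - R_int**3))

print(g1_sphere(P=80.0, R_int=1.5, T_shell=0.05))
```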

27 Results. [Figures: normalized RMSE versus CPU time for responses 1, 2 and 3, comparing sequential kriging and sequential co-kriging, together with the proportion of accurate (high-fidelity) code runs for each response.]

28 References.
Kennedy, M. C. & O'Hagan, A. (2000), Predicting the output from a complex computer code when fast approximations are available. Biometrika, 87.
Le Gratiet, L. (2013), Bayesian analysis of hierarchical multifidelity codes. SIAM/ASA Journal on Uncertainty Quantification, 1(1).
Le Gratiet, L. & Garnier, J. (2014), Recursive co-kriging model for design of computer experiments with multiple levels of fidelity with an application to hydrodynamics. International Journal for Uncertainty Quantification.
Bates, R. A., Buck, R., Riccomagno, E. & Wynn, H. (1996), Experimental design of computer experiments for large systems. Journal of the Royal Statistical Society B.
Kleijnen, J. & van Beers, W. (2004), Application-driven sequential design for simulation experiments: Kriging metamodeling. Journal of the Operational Research Society.
Le Gratiet, L. & Cannamela, C. (2014), Kriging-based sequential design strategies using fast cross-validation techniques with extensions to multi-fidelity computer codes. To appear in Technometrics.
R CRAN package: MuFiCokriging.
