Multi-fidelity sensitivity analysis

1 Multi-fidelity sensitivity analysis
Loic Le Gratiet (1,2), Claire Cannamela (2) & Bertrand Iooss (3)
(1) Université Denis Diderot, Paris, France
(2) CEA, DAM, DIF, F-91297 Arpajon, France
(3) EDF R&D, 6 quai Watier, Chatou, France
7th International Conference on Sensitivity Analysis of Model Output (SAMO), July 4th, 2013

2 Introduction
- Perform a global sensitivity analysis of a time-consuming numerical model output by computing variance-based measures of the model input parameters: the so-called Sobol indices.
- Surrogate the code output with a metamodel in order to estimate the Sobol indices.
- Focus on a multi-fidelity co-kriging metamodel.
- Propose a Sobol index estimator that accounts for both the metamodel error and the sampling error introduced during the estimation.

3 Outline
1. Multi-fidelity co-kriging model
2. Multi-fidelity sensitivity analysis
3. Applications

4 Motivations
Objective: replace the output of a code, denoted $g_2(x)$, by a metamodel, where $g_2 : x \in Q \subset \mathbb{R}^d \to \mathbb{R}$.
Framework: a coarse (low-fidelity) version $g_1$ of $g_2$ is available.
Principle: build a metamodel of $g_2(x)$ that also integrates observations of the coarse code output.

5 Recursive formulation of the model
Multi-fidelity co-kriging model [Kennedy & O'Hagan (2000), Le Gratiet (2012)]:
$$Z_2(x) = \rho\,\tilde{Z}_1(x) + \delta(x), \qquad \tilde{Z}_1(x) \perp \delta(x),$$
where $\tilde{Z}_1(x) = [Z_1(x) \mid Z_1 = g_1, \beta_1, \sigma_1^2, \theta_1]$, with $g_1 = g_1(x),\ x \in D_1$, and
$$Z_1(x) \sim \mathrm{GP}\big(f_1^t(x)\beta_1,\ \sigma_1^2 r_1(x, x'; \theta_1)\big), \qquad \delta(x) \sim \mathrm{GP}\big(f_\delta^t(x)\beta_\delta,\ \sigma_\delta^2 r_\delta(x, x'; \theta_\delta)\big).$$
Parameter estimation:
- $\theta_1, \theta_\delta, \sigma_1^2, \sigma_\delta^2$: maximum likelihood method;
- $\beta_1, \beta_\delta, \rho$: analytical posterior distribution (Bayesian inference).
Finally, $\tilde{Z}_2(x) = [Z_2(x) \mid Z_2 = g_2, Z_1 = g_1]$ with $g_2 = g_2(x),\ x \in D_2$.
We suppose that $D_2 \subseteq D_1$ and that $\theta_1, \theta_\delta, \sigma_1^2, \sigma_\delta^2$ are known.
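To make the recursive construction concrete, here is a minimal Python sketch of the two-level model, assuming scikit-learn's GaussianProcessRegressor as the kriging engine. The toy codes g1/g2, the nested designs and the plug-in least-squares estimate of rho are illustrative simplifications of the Bayesian treatment described on the slide.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Toy fine/coarse codes (hypothetical stand-ins for g_2 and g_1).
g2 = lambda x: np.sin(3 * x) + 0.3 * x          # fine code
g1 = lambda x: np.sin(3 * x)                    # coarse code

# Nested designs D_2 subset of D_1.
D1 = np.linspace(0, 1, 15)[:, None]
D2 = D1[::3]

kern = Matern(nu=2.5)

# Level 1: kriging model of the coarse code g_1 on D_1.
gp1 = GaussianProcessRegressor(kernel=kern, normalize_y=True).fit(D1, g1(D1).ravel())

# Level 2: plug-in estimate of rho by least squares of g_2(D_2) on the
# level-1 predictions, then a kriging model of the discrepancy delta.
mu1_D2 = gp1.predict(D2)
y2 = g2(D2).ravel()
rho = np.sum(mu1_D2 * y2) / np.sum(mu1_D2 ** 2)
gp_delta = GaussianProcessRegressor(kernel=kern, normalize_y=True).fit(D2, y2 - rho * mu1_D2)

def cokriging_predict(x):
    """Predictive mean and variance of the two-level co-kriging model."""
    m1, s1 = gp1.predict(x, return_std=True)
    md, sd = gp_delta.predict(x, return_std=True)
    return rho * m1 + md, rho ** 2 * s1 ** 2 + sd ** 2

x_test = np.linspace(0, 1, 5)[:, None]
print(cokriging_predict(x_test))
```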

8 Predictive distribution
In universal co-kriging, the predictive distribution of $\tilde{Z}_2(x)$ is not Gaussian. The predictive mean and variance can be decomposed as:
$$\mu_{Z_2}(x) = \mathbb{E}[Z_2(x) \mid Z_2 = g_2, Z_1 = g_1] = \hat{\rho}\,\mu_{Z_1}(x) + \mu_\delta(x),$$
$$\sigma^2_{Z_2}(x) = \mathrm{var}(Z_2(x) \mid Z_2 = g_2, Z_1 = g_1) = \hat{\rho}^2\,\sigma^2_{Z_1}(x) + \sigma^2_\delta(x).$$
Remarks:
- in $\mu_{Z_2}(x)$: $\beta$ and $\rho$ are replaced by their posterior means;
- in $\sigma^2_{Z_2}(x)$: the uncertainty coming from the posterior distributions of $\beta$ and $\rho$ is taken into account.

9 Outline
1. Multi-fidelity co-kriging model
2. Multi-fidelity sensitivity analysis
3. Applications

10 Sobol index definition
Context: the input is a random vector $X = (X^{d_1}, X^{d_2})$ defined on $(\Omega_X, \mathcal{F}_X, \mathbb{P}_X)$, with probability measure $\mu = \mu^{d_1} \otimes \mu^{d_2}$, $d = d_1 + d_2$. We replace the code output $g_2(x)$ with a multi-fidelity co-kriging model $\tilde{Z}_2(x)$ defined on $(\Omega_Z, \mathcal{F}_Z, \mathbb{P}_Z)$.
Objective: we are interested in the closed Sobol indices defined by
$$S^{d_1} = \frac{V^{d_1}}{V} = \frac{\mathrm{var}_{X^{d_1}}\big(\mathbb{E}_X[g_2(X) \mid X^{d_1}]\big)}{\mathrm{var}_X(g_2(X))}.$$
Classical plug-in estimation: replace $g_2(x)$ by $\mu_{Z_2}(x)$.
(+) computationally cheap; (−) does not account for the metamodel error.
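A minimal sketch of the classical plug-in approach, using a pick-freeze design; the stand-in mean function and the uniform input distribution below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def closed_sobol_pickfreeze(predict_mean, d, idx, m=10_000):
    """Plug-in estimate of the closed Sobol index of the inputs listed in
    `idx`, using the pick-freeze scheme: the columns in `idx` are shared
    between the two input samples, the others are redrawn independently."""
    X = rng.uniform(size=(m, d))
    X_tilde = rng.uniform(size=(m, d))
    X_tilde[:, idx] = X[:, idx]                 # freeze the group of interest
    y, y_tilde = predict_mean(X), predict_mean(X_tilde)
    mean2 = np.mean(0.5 * (y + y_tilde)) ** 2
    return (np.mean(y * y_tilde) - mean2) / (np.mean(y ** 2) - mean2)

# Example with an analytic stand-in for the metamodel mean.
mean_fun = lambda X: X[:, 0] + 0.5 * X[:, 1] * X[:, 2]
print(closed_sobol_pickfreeze(mean_fun, d=3, idx=[0]))
```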

12 The metamodel error
Another approach [Oakley & O'Hagan (2004), Marrel et al. (2009)]: replace $g_2(x)$ by $\tilde{Z}_2(x)$ and consider, on $(\Omega_Z, \mathcal{F}_Z, \mathbb{P}_Z)$, the random index
$$S^{d_1}_{Z_2} = \frac{\mathrm{var}_{X^{d_1}}\big(\mathbb{E}_X[\tilde{Z}_2(X) \mid X^{d_1}]\big)}{\mathrm{var}_X(\tilde{Z}_2(X))}.$$
They suggest the following quantity:
$$\tilde{S}^{d_1}_{Z_2} = \frac{\mathbb{E}_Z\big[\mathrm{var}_{X^{d_1}}\big(\mathbb{E}_X[\tilde{Z}_2(X) \mid X^{d_1}]\big)\big]}{\mathbb{E}_Z\big[\mathrm{var}_X(\tilde{Z}_2(X))\big]},$$
and they evaluate its uncertainty with
$$\tilde{\Sigma}^2_{Z_2} = \frac{\mathrm{var}_Z\big(\mathrm{var}_{X^{d_1}}\big(\mathbb{E}_X[\tilde{Z}_2(X) \mid X^{d_1}]\big)\big)}{\mathbb{E}_Z\big[\mathrm{var}_X(\tilde{Z}_2(X))\big]^2}.$$
(+) accounts for the metamodel error; (−) computationally expensive (numerical integrations); (−) $\tilde{S}^{d_1}_{Z_2}$ is not the expectation of $S^{d_1}_{Z_2}$ and $\tilde{\Sigma}^2_{Z_2}$ is different from the variance of $S^{d_1}_{Z_2}$.

13 Sobol index estimation - objective
Recall: $S^{d_1}_{Z_2}$ is a random variable,
$$S^{d_1}_{Z_2} = \frac{\mathrm{var}_{X^{d_1}}\big(\mathbb{E}_X[\tilde{Z}_2(X) \mid X^{d_1}]\big)}{\mathrm{var}_X(\tilde{Z}_2(X))}.$$
Objective: we want to build
- an unbiased Monte Carlo estimator $\hat{S}^{d_1}_{Z_2,m}$ of $S^{d_1}_{Z_2}$, where $m$ is the number of Monte Carlo particles;
- an estimator of the variance of $\hat{S}^{d_1}_{Z_2,m}$.

14 Sobol index estimation - sampling
Generate a 2m-sample $x = (x_i, \tilde{x}_i)_{i=1,\dots,m}$ of the random vector $(X, \tilde{X})$ [Sobol (1993)] such that
$$X = (X^{d_1}, X^{d_2}), \qquad \tilde{X} = (X^{d_1}, \tilde{X}^{d_2}), \qquad X^{d_2} \perp \tilde{X}^{d_2}.$$
For $k = 1, \dots, N_Z$:
1. Sample a realization $z_{2,k}(x)$ of $\tilde{Z}_2(x)$ at the points in $x$.
2. Compute $\hat{S}^{d_1}_{Z_2,m,k,1}$ from $z_{2,k}(x)$ with the estimator of [Janon et al. (2012)]:
$$\hat{S}^{d_1}_{Z_2,m,k,1} = \frac{\frac{1}{m}\sum_{i=1}^{m} z_{2,k}(x_i)\,z_{2,k}(\tilde{x}_i) - \left(\frac{1}{2m}\sum_{i=1}^{m}\big(z_{2,k}(x_i) + z_{2,k}(\tilde{x}_i)\big)\right)^2}{\frac{1}{m}\sum_{i=1}^{m} z_{2,k}(x_i)^2 - \left(\frac{1}{2m}\sum_{i=1}^{m}\big(z_{2,k}(x_i) + z_{2,k}(\tilde{x}_i)\big)\right)^2}.$$
3. For $l = 2, \dots, B$: sample with replacement $x^{B_l}$ from $x$ and compute $\hat{S}^{d_1}_{Z_2,m,k,l}$ from $z_{2,k}(x^{B_l})$.
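A sketch of steps 1-3, assuming a hypothetical sample_paths(points, n_z) helper that returns realizations of the conditional co-kriging process at the given points.

```python
import numpy as np

rng = np.random.default_rng(1)

def janon_estimator(y, y_tilde):
    """Closed Sobol index estimator of Janon et al. (2012)."""
    mean2 = np.mean(0.5 * (y + y_tilde)) ** 2
    return (np.mean(y * y_tilde) - mean2) / (np.mean(y ** 2) - mean2)

def sobol_realizations(sample_paths, X, X_tilde, n_z=300, n_boot=150):
    """Steps 1-3 of the slide: for each of the n_z realizations of the
    conditional co-kriging process, compute the Janon estimator on the
    original pick-freeze sample and on n_boot - 1 bootstrap resamples.
    `sample_paths(points, n_z)` is assumed to return an array of shape
    (n_z, len(points)) of process realizations."""
    m = X.shape[0]
    pts = np.vstack([X, X_tilde])              # evaluate each path once on the 2m points
    paths = sample_paths(pts, n_z)
    S = np.empty((n_z, n_boot))
    for k in range(n_z):
        y, y_tilde = paths[k, :m], paths[k, m:]
        S[k, 0] = janon_estimator(y, y_tilde)
        for l in range(1, n_boot):
            b = rng.integers(0, m, size=m)     # bootstrap with replacement
            S[k, l] = janon_estimator(y[b], y_tilde[b])
    return S
```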

18 Sobol index estimation - metamodel and MC errors
The estimator of $S^{d_1}_{Z_2}$: $\hat{S}^{d_1}_{Z_2,m}$ is defined as the empirical mean of the sample $\big(\hat{S}^{d_1}_{Z_2,m,k,l}\big)_{k=1,\dots,N_Z;\ l=1,\dots,B}$:
$$\hat{S}^{d_1}_{Z_2,m} = \frac{1}{N_Z B} \sum_{k=1}^{N_Z} \sum_{l=1}^{B} \hat{S}^{d_1}_{Z_2,m,k,l}.$$
The variance of the estimator $\hat{S}^{d_1}_{Z_2,m}$: $\hat{\Sigma}^2$ is estimated from the empirical variance of the same sample:
$$\hat{\Sigma}^2 = \frac{1}{N_Z B - 1} \sum_{k=1}^{N_Z} \sum_{l=1}^{B} \Big(\hat{S}^{d_1}_{Z_2,m,k,l} - \hat{S}^{d_1}_{Z_2,m}\Big)^2.$$
It integrates both the metamodeling error and the Monte Carlo integration error.
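Continuing the sketch above, the point estimate and its total variance are just the empirical mean and variance over the (N_Z x B) table of index realizations.

```python
import numpy as np

def aggregate(S):
    """S has shape (n_z, n_boot); returns the point estimate and the total
    variance, which mixes the metamodel and Monte Carlo errors."""
    s_hat = S.mean()
    sigma2_hat = S.var(ddof=1)
    return s_hat, sigma2_hat
```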

19 Sobol index estimation - minimal number of MC particles
Variance due to the metamodeling: $\mathrm{var}_Z\big(\mathbb{E}_X\big[\hat{S}^{d_1}_{Z_2,m} \mid \tilde{Z}_2(x)\big]\big)$ can be estimated by
$$\hat{\sigma}^2_{Z_2} = \frac{1}{B} \sum_{l=1}^{B} \left[ \frac{1}{N_Z - 1} \sum_{k=1}^{N_Z} \Big(\hat{S}^{d_1}_{Z_2,m,k,l} - \hat{S}^{d_1}_{Z_2,m,\cdot,l}\Big)^2 \right].$$
Variance due to the Monte Carlo integrations: $\mathrm{var}_X\big(\mathbb{E}_Z\big[\hat{S}^{d_1}_{Z_2,m} \mid (X_i, \tilde{X}_i)_{i=1,\dots,m}\big]\big)$ can be estimated by
$$\hat{\sigma}^2_{MC} = \frac{1}{N_Z} \sum_{k=1}^{N_Z} \left[ \frac{1}{B - 1} \sum_{l=1}^{B} \Big(\hat{S}^{d_1}_{Z_2,m,k,l} - \hat{S}^{d_1}_{Z_2,m,k,\cdot}\Big)^2 \right],$$
where $\hat{S}^{d_1}_{Z_2,m,\cdot,l}$ and $\hat{S}^{d_1}_{Z_2,m,k,\cdot}$ denote the means over $k$ and over $l$, respectively.
Determining the minimal number of Monte Carlo particles $m$: given the co-kriging model $\tilde{Z}_2(x)$, the minimal value of $m$ is the one such that $\hat{\sigma}^2_{MC} \le \hat{\sigma}^2_{Z_2}$.
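The two variance components can be read off the same table of index realizations; a minimal sketch:

```python
import numpy as np

def variance_decomposition(S):
    """Split the variance of the index estimate into a metamodel part
    (spread across GP realizations k, bootstrap sample l fixed) and a
    Monte Carlo part (spread across bootstrap samples l, realization k
    fixed), following the two formulas on the slide."""
    sigma2_meta = np.mean(S.var(axis=0, ddof=1))   # average over l of the variance over k
    sigma2_mc = np.mean(S.var(axis=1, ddof=1))     # average over k of the variance over l
    return sigma2_meta, sigma2_mc

# The Monte Carlo sample size m is deemed large enough once
# sigma2_mc <= sigma2_meta (the MC error no longer dominates).
```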

21 Outline
1. Multi-fidelity co-kriging model
2. Multi-fidelity sensitivity analysis
3. Applications

22 Ishigami function - kriging case
First-order Sobol index of the first input parameter as the size $n$ of the learning sample increases.
Learning sample: space-filling design. Covariance structure: tensorised 5/2-Matérn kernel. $m = N_Z = 500$, $B = 300$.
[Figure: estimated index $S_1$ versus $n$, showing the sensitivity index mean, the 95% metamodel confidence intervals, the 95% MC + metamodel confidence intervals, and the theoretical Sobol index.]
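For reference, a sketch of the Ishigami test function and the analytical first-order index of X1, assuming the usual constants a = 7 and b = 0.1 (the slide does not state which constants were used).

```python
import numpy as np

def ishigami(X, a=7.0, b=0.1):
    """Ishigami test function on [-pi, pi]^3 (a, b are the common defaults)."""
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    return np.sin(x1) + a * np.sin(x2) ** 2 + b * x3 ** 4 * np.sin(x1)

def ishigami_first_order_S1(a=7.0, b=0.1):
    """Analytical first-order Sobol index of X1."""
    v1 = 0.5 * (1 + b * np.pi ** 4 / 5) ** 2
    v_total = a ** 2 / 8 + b * np.pi ** 4 / 5 + b ** 2 * np.pi ** 8 / 18 + 0.5
    return v1 / v_total

print(ishigami_first_order_S1())   # ~0.314
```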

23 Spherical tank under internal pressure - cokriging case
Fine code: $g_2(x)$ is the Von Mises stress at point 1 provided by a finite-element code, with $x = (P, R_{int}, T_{shell}, T_{cap}, E_{shell}, E_{cap}, \sigma_{y,shell}, \sigma_{y,cap})$ ($d = 8$):
- $P$: internal pressure;
- $R_{int}$: internal radius of the shell;
- $T_{shell}$: thickness of the shell;
- $T_{cap}$: thickness of the cap;
- $E_{shell}$: Young's modulus of the shell material;
- $E_{cap}$: Young's modulus of the cap material;
- $\sigma_{y,shell}$: yield stress of the shell material;
- $\sigma_{y,cap}$: yield stress of the cap material.
Coarse code: $g_1$ is the 1D simplification corresponding to a perfectly spherical tank,
$$g_1(x) = \frac{3\,(R_{int} + T_{shell})^3}{2\,\big((R_{int} + T_{shell})^3 - R_{int}^3\big)}\, P.$$
[Figure: sketch of the tank geometry showing the shell, the cap and the observation points 0 and 1.]
Multi-fidelity co-kriging model: built with $n_1 = 100$ runs of the coarse code and only $n_2 = 20$ runs of the fine code. We estimate the efficiency $E_{ff}$ of the metamodel of $g_2(x)$ by the coefficient of determination computed on an external set of 7000 points: $E_{ff} \approx 86\%$.
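A direct implementation of the coarse code formula; the numerical values in the example call are made up for illustration.

```python
def g1_coarse(P, R_int, T_shell):
    """Coarse code: Von Mises stress in a perfectly spherical tank under
    internal pressure P (the 1D simplification given on the slide)."""
    R_ext = R_int + T_shell
    return 3.0 * R_ext ** 3 / (2.0 * (R_ext ** 3 - R_int ** 3)) * P

# Illustrative call with hypothetical values for the pressure, the inner
# radius and the shell thickness.
print(g1_coarse(P=50.0, R_int=1000.0, T_shell=20.0))
```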

24 Spherical tank under internal pressure - results
Kriging-based sensitivity analysis of the coarse code output $g_1(x)$.
Learning sample: space-filling design, $n_1 = 100$. Covariance structure: tensorised 5/2-Matérn kernel. $m = N_Z = 300$, $B = 150$. $\sum_{i=1}^{8} S_i \approx 97\%$.
[Figure: first-order Sobol indices of the eight inputs ($R_{int}$, $T_{shell}$, $T_{cap}$, $E_{shell}$, $\sigma_{y,shell}$, $E_{cap}$, $\sigma_{y,cap}$, $P$), showing the sensitivity index means, the 95% metamodel confidence intervals and the 95% MC + metamodel confidence intervals.]

25 Spherical tank under internal pressure - results
Cokriging-based sensitivity analysis of the fine code output $g_2(x)$.
Learning sample: space-filling designs, $n_1 = 100$, $n_2 = 20$. Covariance structures: two tensorised 5/2-Matérn kernels. $m = N_Z = 300$, $B = 150$. $\sum_{i=1}^{8} S_i \approx 96.7\%$.
[Figure: first-order Sobol indices of the eight inputs, showing the sensitivity index means, the 95% metamodel confidence intervals and the 95% MC + metamodel confidence intervals.]

26 Conclusion
- We have proposed a Sobol index estimator which uses realizations of a predictive process (built from a multi-fidelity cokriging model) at the points of a large sample; the realizations are obtained by kriging conditioning [Chilès & Delfiner (1999)].
- In the computation of its variance, we distinguish between the metamodel and the Monte Carlo variance contributions.
- We can also determine the minimal number of particles $m$ such that the Monte Carlo variance no longer dominates. This value of $m$ is different for each Sobol index estimate.

27 Bibliography
- Sobol, I. M. (1993). Sensitivity estimates for nonlinear mathematical models. Mathematical Modeling and Computational Experiments, 1.
- Kennedy, M. C. & O'Hagan, A. (2000). Predicting the output from a complex computer code when fast approximations are available. Biometrika, 87.
- Le Gratiet, L. & Garnier, J. (2012). Recursive co-kriging model for design of computer experiments with multiple levels of fidelity with an application to hydrodynamics. Submitted to International Journal for Uncertainty Quantification.
- Oakley, J. E. & O'Hagan, A. (2004). Probabilistic sensitivity analysis of complex models: a Bayesian approach. Journal of the Royal Statistical Society B, 66(3).
- Marrel, A., Iooss, B., Laurent, B. & Roustant, O. (2009). Calculations of Sobol indices for the Gaussian process metamodel. Reliability Engineering and System Safety, 94.
- Janon, A., Klein, T., Lagnoux, A., Nodet, M. & Prieur, C. (2013). Asymptotic normality and efficiency of two Sobol index estimators. To appear in ESAIM: Probability and Statistics.
- Chilès, J.-P. & Delfiner, P. (1999). Geostatistics: modeling spatial uncertainty. Wiley Series in Probability and Statistics.
- Le Gratiet, L., Cannamela, C. & Iooss, B. (2013). A Bayesian approach for global sensitivity analysis of (multi-fidelity) computer codes. Submitted to SIAM/ASA Journal on Uncertainty Quantification.
- R CRAN package.

28 Sampling from the predictive distribution
Two issues:
1. How to sample from $\tilde{Z}_2(x) = [Z_2(x) \mid Z_2 = g_2, Z_1 = g_1]$?
2. How to deal with a large $m$?
Sampling from $\tilde{Z}_2(x)$: it can be written as
$$\tilde{Z}_2(x) = \rho\,\tilde{Z}_1(x) + \tilde{\delta}_{\rho,\beta_\delta}(x),$$
where $\tilde{Z}_1(x)$ and $\tilde{\delta}_{\rho,\beta_\delta}(x) = [\tilde{\delta}(x) \mid \rho, \beta_\delta]$ are Gaussian processes and $(\rho, \beta_\delta)$ has a known distribution.
1. Sample a realization $z_{1,k}(x)$ of $\tilde{Z}_1(x)$.
2. Sample a realization $(\rho^k, \beta_\delta^k)$ of $(\rho, \beta_\delta)$.
3. Sample a realization $\delta^k(x)$ of $\tilde{\delta}_{\rho^k, \beta_\delta^k}(x)$.
4. Set $z_{2,k}(x) = \rho^k z_{1,k}(x) + \delta^k(x)$.
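A sketch of the four sampling steps, with the three elementary samplers assumed to be provided (their interfaces below are hypothetical).

```python
def sample_cokriging_path(x, sample_z1, sample_rho_beta, sample_delta):
    """One realization of the conditional co-kriging process, following the
    four steps on the slide. The three samplers are assumed to be provided:
    - sample_z1(x): one path of the conditional level-1 process,
    - sample_rho_beta(): one draw of (rho, beta_delta) from its posterior,
    - sample_delta(x, rho, beta): one path of the conditional discrepancy."""
    z1 = sample_z1(x)                       # step 1
    rho, beta = sample_rho_beta()           # step 2
    delta = sample_delta(x, rho, beta)      # step 3
    return rho * z1 + delta                 # step 4
```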

30 Sampling from the predictive distribution
Sampling with a large $m$: kriging conditioning result [Chilès & Delfiner (1999)]: we can sample from the conditional distributions of $\tilde{Z}_1(x)$ and $\tilde{\delta}(x)$ by sampling from the unconditioned distributions and then applying a linear transformation to them. Efficient algorithms can be used to compute realizations of the unconditioned processes:
- stationary kernel: spectral decomposition;
- tensorised covariance kernel: Karhunen-Loève decomposition, which allows new points to be added to $x$ sequentially without re-estimating the decomposition.
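A sketch of kriging conditioning for a zero-mean Gaussian process: unconditional paths are drawn jointly on the prediction and observation points and then corrected with the kriging weights so that each path interpolates the data. For brevity, this sketch uses a joint Cholesky factorisation for the unconditional draws instead of the spectral or Karhunen-Loève decompositions mentioned on the slide.

```python
import numpy as np

rng = np.random.default_rng(3)

def conditional_paths(K_xx, K_xo, K_oo, y_obs, n_paths, rng=rng, jitter=1e-10):
    """Conditional GP simulation by kriging conditioning. K_xx, K_xo, K_oo
    are the covariance blocks (prediction/prediction, prediction/observation,
    observation/observation); a zero prior mean is assumed for simplicity."""
    n_x, n_o = K_xx.shape[0], K_oo.shape[0]
    # Joint covariance of (prediction points, observation points).
    K_joint = np.block([[K_xx, K_xo], [K_xo.T, K_oo]])
    L = np.linalg.cholesky(K_joint + jitter * np.eye(n_x + n_o))
    W = L @ rng.standard_normal((n_x + n_o, n_paths))      # unconditional paths
    w_x, w_o = W[:n_x], W[n_x:]
    weights = np.linalg.solve(K_oo + jitter * np.eye(n_o), K_xo.T)  # kriging weights
    mean = weights.T @ y_obs                                # kriging mean at prediction points
    # Correct the unconditional paths so that they interpolate y_obs.
    return mean[:, None] + w_x - weights.T @ w_o
```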
