Uncertainty Quantification in Computational Models
Habib N. Najm, hnnajm@sandia.gov
Sandia National Laboratories, Livermore, CA, USA
Computational Infrastructure for Geodynamics (CIG) Webinar, UC Davis, CA, Feb 11, 2016
SNL Najm UQ in Computations 1 / 42
Acknowledgement
B.J. Debusschere, R.D. Berry, K. Sargsyan, C. Safta, K. Chowdhary, M. Khalil, Sandia National Laboratories, CA
R.G. Ghanem, U. South. California, Los Angeles, CA
O.M. Knio, Duke Univ., Durham, NC
O.P. Le Maître, CNRS, Paris, France
Y.M. Marzouk, Mass. Inst. of Tech., Cambridge, MA
This work was supported by:
- DOE Office of Advanced Scientific Computing Research (ASCR), Scientific Discovery through Advanced Computing (SciDAC)
- DOE ASCR Applied Mathematics program
- DOE Office of Basic Energy Sciences, Div. of Chem. Sci., Geosci., & Biosci.
Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under contract DE-AC04-94-AL85000.
Uncertainty Quantification and Computational Science
[Diagram: Inputs (Parameters λ) → Computational Model → Outputs (Predictions y = f(λ))]
Forward problem
Uncertainty Quantification and Computational Science
[Diagram: Parameters λ → Computational Model y = f(λ) → Predictions; Measurement Model z = g(λ) → Data]
Inverse & Forward problems
Uncertainty Quantification and Computational Science
[Diagram: Parameters λ → Computational Model y = f(λ) → Predictions; Measurement Model z = g(λ) → Data d; z compared with d]
Inverse & Forward UQ
Uncertainty Quantification and Computational Science
[Diagram: y = {f_1(λ), f_2(λ), …, f_M(λ)}; Computational Model y = f(λ) → Predictions; Measurement Model z = g(λ) → Data d]
Inverse & Forward UQ
Model validation & comparison, Hypothesis testing
Outline
1 Introduction
2 Forward UQ - Polynomial Chaos
3 Inverse Problem - Bayesian Inference
4 Closure
Forward propagation of parametric uncertainty
Forward model: y = f(λ)
Local sensitivity analysis (SA) and error propagation: δy = (df/dλ)|_{λ0} δλ
This is ok for:
- small uncertainty
- low degree of non-linearity in f(λ)
Non-probabilistic methods: fuzzy logic, evidence theory (Dempster-Shafer), interval math
Probabilistic methods: this is our focus
Probabilistic Forward UQ
y = f(λ)
Represent uncertain quantities using probability theory
Random sampling, MC, QMC:
- Generate random samples {λ_i}, i = 1, …, N, from the PDF of λ, p(λ)
- Bin the corresponding {y_i} to construct p(y)
- Not feasible for computationally expensive f(λ): slow convergence of MC/QMC methods; very large N required for reliable estimates
Build a cheap surrogate for f(λ), then use MC:
- Collocation interpolants
- Regression fitting
- Galerkin methods: Polynomial Chaos (PC)
Intrusive and non-intrusive PC methods
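The sampling route above can be sketched in a few lines. The forward model `f` and the input PDF here are stand-ins for illustration, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(lam):
    """Stand-in forward model y = f(lambda)."""
    return np.exp(lam)

# Draw N samples of lambda from its PDF (standard normal here, as an example),
# push them through the model, and bin the outputs to estimate p(y)
N = 100_000
lam = rng.standard_normal(N)
y = f(lam)

mean_mc = float(y.mean())                           # converges as O(N^{-1/2})
hist, edges = np.histogram(y, bins=50, density=True)
```

The O(N^{-1/2}) convergence of `mean_mc` is exactly why this becomes infeasible when each evaluation of f is an expensive simulation, motivating the surrogate methods that follow.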
Probabilistic Forward UQ & Polynomial Chaos
Representation of Random Variables
With y = f(λ) and λ a random variable, estimate the RV y
Can describe a RV in terms of its density, moments, characteristic function, or as a function on a probability space
Constraining the analysis to RVs with finite variance:
- Represent the RV as a spectral expansion in terms of orthogonal functions of standard RVs: the Polynomial Chaos Expansion
- Enables the use of available functional analysis methods for forward UQ
Polynomial Chaos Expansion (PCE)
Model uncertain quantities as random variables (RVs)
Given a germ ξ(ω) = {ξ_1, …, ξ_n}, a set of i.i.d. RVs, where p(ξ) is uniquely determined by its moments
An RV in L²(Ω, S(ξ), P) can be written as a PCE:
u(x, t, ω) = f(x, t, ξ) ≈ Σ_{k=0}^P u_k(x, t) Ψ_k(ξ(ω))
- u_k(x, t) are mode strengths
- Ψ_k() are multivariate functions orthogonal w.r.t. p(ξ)
Orthogonality
By construction, the functions Ψ_k() are orthogonal with respect to the density of ξ:
u_k(x, t) = ⟨u Ψ_k⟩ / ⟨Ψ_k²⟩ = (1/⟨Ψ_k²⟩) ∫ u(x, t; λ(ξ)) Ψ_k(ξ) p_ξ(ξ) dξ
Examples: Hermite polynomials with Gaussian basis; Legendre polynomials with Uniform basis; …
Global versus Local PC methods: adaptive domain decomposition of the support of ξ
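As an illustration of the projection formula (my own sketch, not part of the slides), the Wiener-Hermite coefficients of the lognormal RV u = exp(ξ) used on the next slides can be computed by Gauss-Hermite quadrature. NumPy's `hermite_e` module provides the probabilists' polynomials He_k, for which ⟨He_k²⟩ = k!:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, exp

# Probabilists' Gauss-Hermite rule: nodes/weights for the weight exp(-xi^2/2);
# dividing by sqrt(2*pi) turns the rule into an expectation over N(0,1)
nodes, weights = hermegauss(20)
weights = weights / np.sqrt(2.0 * np.pi)

def u(xi):
    return np.exp(xi)          # lognormal RV as a function of the Gaussian germ

P = 5
coeffs = []
for k in range(P + 1):
    c = np.zeros(k + 1)
    c[k] = 1.0                 # coefficient vector selecting He_k
    psi_k = hermeval(nodes, c)
    num = np.sum(weights * u(nodes) * psi_k)   # <u Psi_k>
    den = float(factorial(k))                  # <Psi_k^2> = k! for He_k
    coeffs.append(num / den)

# Exact coefficients are e^{1/2}/k!, since exp(xi) = e^{1/2} sum_k He_k(xi)/k!
```

The quadrature recovers the known closed-form coefficients to high accuracy; the same projection loop, with the model replacing `u`, is the basis of the non-intrusive methods discussed later.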
PC Illustration: WH PCE for a Lognormal RV
Wiener-Hermite PCE constructed for a Lognormal RV; PCE-sampled PDF superposed on true PDF
Order = 1: u ≈ Σ_{k=0}^P u_k Ψ_k(ξ) = u_0 + u_1 ξ
[Plot: exact PDF vs. PCE PDF]
PC Illustration: WH PCE for a Lognormal RV
Order = 2: u ≈ u_0 + u_1 ξ + u_2 (ξ² − 1)
[Plot: exact PDF vs. PCE PDF]
PC Illustration: WH PCE for a Lognormal RV
Order = 3: u ≈ u_0 + u_1 ξ + u_2 (ξ² − 1) + u_3 (ξ³ − 3ξ)
[Plot: exact PDF vs. PCE PDF]
PC Illustration: WH PCE for a Lognormal RV
Order = 4: u ≈ u_0 + u_1 ξ + u_2 (ξ² − 1) + u_3 (ξ³ − 3ξ) + u_4 (ξ⁴ − 6ξ² + 3)
[Plot: exact PDF vs. PCE PDF]
PC Illustration: WH PCE for a Lognormal RV
Order = 5: u ≈ u_0 + u_1 ξ + u_2 (ξ² − 1) + u_3 (ξ³ − 3ξ) + u_4 (ξ⁴ − 6ξ² + 3) + u_5 (ξ⁵ − 10ξ³ + 15ξ)
[Plot: exact PDF vs. PCE PDF]
Random Fields
A random variable is a function on an event space Ω; no dependence on other coordinates, e.g. space or time
A random field is a function on a product space Ω × D, e.g. sea surface temperature T_SS(z, ω), z = (x, t)
It is a more complex object than a random variable: a combination of an infinite number of random variables
In many physical systems, uncertain field quantities described by random fields are smooth, i.e. they have an underlying low-dimensional structure due to large correlation length-scales
Random Fields KLE
Smooth random fields can be represented with a small number of stochastic degrees of freedom
A random field M(x, ω) with
- a mean function µ(x)
- a continuous covariance function C(x_1, x_2) = ⟨[M(x_1, ω) − µ(x_1)][M(x_2, ω) − µ(x_2)]⟩
can be represented with the Karhunen-Loeve Expansion (KLE):
M(x, ω) = µ(x) + Σ_{i=1}^∞ √λ_i η_i(ω) φ_i(x)
where λ_i and φ_i(x) are the eigenvalues and eigenfunctions of the covariance function C(·, ·), and η_i are uncorrelated zero-mean unit-variance RVs
KLE representation of random fields using PC
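A minimal discrete-KLE sketch (my own illustration, assuming a 1D grid and a squared-exponential covariance like the one on the next slide): eigendecompose the covariance matrix, truncate by captured variance, and sample a realization:

```python
import numpy as np

# 1D grid and squared-exponential covariance C(x1, x2) = exp(-(x1 - x2)^2 / delta^2)
x = np.linspace(0.0, 1.0, 200)
delta = 0.3
C = np.exp(-((x[:, None] - x[None, :]) ** 2) / delta**2)

# Discrete KL: eigenpairs of the covariance matrix, sorted by decreasing eigenvalue
evals, evecs = np.linalg.eigh(C)
evals, evecs = evals[::-1], evecs[:, ::-1]

# Truncate once 99% of the variance is captured, then sample one realization
m = int(np.searchsorted(np.cumsum(evals) / evals.sum(), 0.99)) + 1
rng = np.random.default_rng(1)
eta = rng.standard_normal(m)                     # uncorrelated N(0,1) modes
M = evecs[:, :m] @ (np.sqrt(evals[:m]) * eta)    # zero-mean field realization
```

With a covariance length of this order, a handful of modes captures nearly all the variance, which is the low-dimensional structure the slide refers to.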
RF Illustration: KL of 2D Gaussian Process
2D Gaussian Process with covariance: C(x_1, x_2) = exp(−|x_1 − x_2|²/δ²)
[Figure: realizations for covariance lengths δ = 0.1 to 0.5]
Realizations become smoother as the covariance length δ increases
RF Illustration: 2D KL - Modes for δ = 0.1
[Figure: KL eigenmodes φ_1, φ_2, φ_3, φ_4]
RF Illustration: 2D KL - eigenvalue spectrum, δ = 0.1
[Figure: eigenvalue magnitude vs. eigenvalue number for the three δ values; field reconstructions with 4, 16, 32, and 64 KL terms]
RF Illustration: 2D KL - eigenvalue spectrum, intermediate δ
[Figure: eigenvalue magnitude vs. eigenvalue number for the three δ values; field reconstructions with 4, 16, 32, and 64 KL terms]
RF Illustration: 2D KL - eigenvalue spectrum, δ = 0.5
[Figure: eigenvalue magnitude vs. eigenvalue number for the three δ values; field reconstructions with 4, 16, 32, and 64 KL terms]
Essential Use of PC in UQ
Strategy:
- Represent model parameters/solution as random variables
- Construct PCEs for uncertain parameters
- Evaluate PCEs for model outputs
Advantages: computational efficiency; utility:
- Moments: E(u) = u_0, var(u) = Σ_{k=1}^P u_k² ⟨Ψ_k²⟩, …
- Global sensitivities: fractional variances, Sobol indices
- Surrogate for forward model
Requirement: RVs in L², i.e. with finite variance, on (Ω, S(ξ), P)
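A small check of the moment formulas (illustration only, using the order-5 lognormal Wiener-Hermite PCE from earlier, whose exact coefficients are u_k = e^{1/2}/k!):

```python
import numpy as np
from math import factorial, exp

# Order-5 Wiener-Hermite PCE of the lognormal RV exp(xi): u_k = e^{1/2} / k!
P = 5
u = np.array([exp(0.5) / factorial(k) for k in range(P + 1)])
norms = np.array([float(factorial(k)) for k in range(P + 1)])   # <Psi_k^2> = k!

mean_pce = float(u[0])                              # E(u) = u_0
var_pce = float(np.sum(u[1:] ** 2 * norms[1:]))     # var(u) = sum u_k^2 <Psi_k^2>
```

The mean is exact by construction, and the truncated variance is already within a fraction of a percent of the exact lognormal variance e(e − 1); no sampling is needed once the coefficients are known.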
Intrusive PC UQ: A direct non-sampling method
Given model equations: M(u(x, t); λ) = 0
Express uncertain parameters/variables using PCEs:
u = Σ_{k=0}^P u_k Ψ_k;  λ = Σ_{k=0}^P λ_k Ψ_k
Substitute in model equations; apply Galerkin projection
New set of equations: G(U(x, t), Λ) = 0, with U = [u_0, …, u_P]^T, Λ = [λ_0, …, λ_P]^T
Solving this deterministic system once provides the full specification of uncertain model outputs
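The Galerkin machinery can be sketched for the simplest nonlinear operation, a product of two 1D Hermite PCEs. The triple products ⟨Ψ_iΨ_jΨ_k⟩ have a closed form for probabilists' Hermite polynomials (a generic sketch, not the speaker's code):

```python
import numpy as np
from math import factorial

def triple(i, j, k):
    """<He_i He_j He_k> for probabilists' Hermite polynomials (1D, Gaussian weight)."""
    s = i + j + k
    if s % 2 == 1 or 2 * max(i, j, k) > s:   # parity and triangle conditions
        return 0.0
    h = s // 2
    return (factorial(i) * factorial(j) * factorial(k)
            / (factorial(h - i) * factorial(h - j) * factorial(h - k)))

def galerkin_product(u, v):
    """PCE coefficients of w = u*v, projected back onto the order-P basis."""
    P = len(u) - 1
    w = np.zeros(P + 1)
    for k in range(P + 1):
        acc = sum(u[i] * v[j] * triple(i, j, k)
                  for i in range(P + 1) for j in range(P + 1))
        w[k] = acc / factorial(k)            # divide by <Psi_k^2> = k!
    return w

lam = np.array([2.0, 0.5, 0.0])              # lambda = 2 + 0.5 xi
lam_sq = galerkin_product(lam, lam)          # PCE of lambda^2
```

Here λ² is exactly order 2, so the projection is exact: λ² = 4.25 + 2ξ + 0.25(ξ² − 1). For general nonlinearities the projection truncates, which is one source of the consistency and stability questions noted on the next slide.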
Laminar 2D Channel Flow with Uncertain Viscosity
Incompressible flow
Viscosity PCE: ν = ν_0 + ν_1 ξ
Streamwise velocity: v = Σ_{i=0}^P v_i Ψ_i
- v_0: mean; v_i: i-th order mode
- σ² = Σ_{i=1}^P v_i² ⟨Ψ_i²⟩
[Figure: fields v_0, v_1, v_2, v_3, and σ]
(Le Maître et al., J. Comput. Phys., 2001)
Intrusive PC UQ Pros/Cons
Cons:
- Reformulation of governing equations: new discretizations, new numerical solution methods; Consistency, Convergence, Stability questions
- Global vs. multi-element local PC constructions
- New solvers and model codes; new preconditioners
Pros:
- Opportunities for automated code transformation
- Tailored solvers can deliver superior performance
Non-intrusive PC UQ: Sampling-based
Relies on black-box utilization of the computational model
Evaluate projection integrals numerically: for any quantity of interest φ(x, t; λ) ≈ Σ_{k=0}^P φ_k(x, t) Ψ_k(ξ),
φ_k(x, t) = (1/⟨Ψ_k²⟩) ∫ φ(x, t; λ(ξ)) Ψ_k(ξ) p_ξ(ξ) dξ,  k = 0, …, P
Integrals can be evaluated using:
- A variety of (Quasi) Monte Carlo methods: slow convergence; independent of dimensionality
- Quadrature/Sparse-Quadrature methods: fast convergence; depends on dimensionality
PC and High-Dimensionality
Dimensionality n of the PC basis: ξ = {ξ_1, …, ξ_n}; n = number of uncertain parameters
P + 1 = (n + p)!/(n! p!) grows fast with n
Impacts:
- Size of intrusive PC system
- High-dimensional projection integrals: large number of non-intrusive samples
- Sparse quadrature methods
[Figures: Clenshaw-Curtis sparse grids, levels 3 and 5]
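The growth of the basis with dimension is easy to tabulate (a trivial helper, added for illustration):

```python
from math import comb

def pc_basis_size(n, p):
    """Number of PCE terms P + 1 = (n + p)! / (n! p!) for n dimensions, total order p."""
    return comb(n + p, p)

# The basis grows quickly with the stochastic dimension n at fixed order p = 3:
sizes = {n: pc_basis_size(n, 3) for n in (1, 3, 10, 20)}
```

At order 3, going from 1 to 20 uncertain parameters inflates the basis from 4 terms to 1771, which drives both the intrusive system size and the non-intrusive sample counts mentioned above.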
UQ in LES computations: turbulent bluff-body flame
with M. Khalil, G. Lacaze, & J. Oefelein, Sandia Nat. Labs
CH4-H2 jet, air coflow, 3D flow, Re = 9500, LES subgrid modeling
12×10^6 mesh cells, 1024 cores; 3 days run time, 2×10^5 time steps
3 uncertain parameters (C_s, Pr_t, Sc_t); 2nd-order PC, 25 sparse-quadrature points
[Figures: burner geometry (fuel jet 108 m/s, 3.6 mm; ceramic bluff-body, 50 mm; air co-flow 35 m/s; 150 mm); main-effect sensitivity indices of C_s, Pr_t, Sc_t for mean and RMS axial velocity on the centerline vs. axial location]
(J. Oefelein & G. Lacaze, SNL)
UQ in Ocean Modeling: Gulf of Mexico
A. Alexanderian, J. Winokur, I. Sraj, O.M. Knio, Duke Univ.; A. Srinivasan, M. Iskandarani, Univ. Miami; W.C. Thacker, NOAA
Hurricane Ivan, Sep. 2004; HYCOM ocean model (hycom.org)
Predicted Mixed Layer Depth (MLD)
Four uncertain parameters, i.i.d. Uniform: subgrid mixing & wind drag parameters
385 sparse quadrature samples
(Alexanderian et al., Winokur et al., Comput. Geosci., 2012, 2013)
Inverse UQ: Estimation of Uncertain Parameters
Forward UQ requires specification of uncertain inputs
Probabilistic setting: require joint PDF on input space
Statistical inference: an inverse problem
Bayesian setting:
- Given data: PDF on uncertain inputs can be estimated using Bayes formula (Bayesian Inference)
- Given constraints: PDF on uncertain inputs can be estimated using the Maximum Entropy principle (MaxEnt Methods)
Bayes formula for Parameter Inference
Data model (fit model + noise model): y = f(λ) + g(ɛ)
Bayes formula: p(λ, y) = p(λ|y) p(y) = p(y|λ) p(λ), so
p(λ|y) = p(y|λ) p(λ) / p(y)
Posterior = Likelihood × Prior / Evidence
- Prior p(λ): knowledge of λ prior to data
- Likelihood p(y|λ): forward model and measurement noise
- Posterior p(λ|y): combines information from prior and data
- Evidence p(y): normalizing constant for present context
The Prior
Prior p(λ) comes from: physical constraints, prior data, prior knowledge
The prior can be uninformative
It can be chosen to impose regularization
Unknown aspects of the prior can be added to the rest of the parameters as hyperparameters
The choice of prior can be crucial when there is little information in the data relative to the number of degrees of freedom in the inference problem
When there is sufficient information in the data, the data can overrule the prior
Construction of the Likelihood p(y|λ)
Where does probability enter the mapping λ → y in p(y|λ)? Through a presumed error model
Example:
- Model: y_m = g(λ); Data: y = g(λ) + ɛ
- Error between data and model prediction: ɛ = y − g(λ)
- Model this error as a random variable
Example: error is due to instrument measurement noise; instrument has Gaussian errors, with no bias: ɛ ~ N(0, σ²)
Construction of the Likelihood p(y|λ), cont'd
For any given λ, this implies y|λ, σ ~ N(g(λ), σ²), or
p(y|λ, σ) = (1/(√(2π) σ)) exp(−(y − g(λ))² / (2σ²))
Given N measurements (y_1, …, y_N), and presuming independent identically distributed (iid) noise,
y_i = g(λ) + ɛ_i,  ɛ_i ~ N(0, σ²):
L(λ) = p(y_1, …, y_N | λ, σ) = Π_{i=1}^N p(y_i | λ, σ)
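The iid Gaussian likelihood above is usually evaluated in log form for numerical stability; a minimal sketch (the model `g` is whatever forward map is being calibrated):

```python
import numpy as np

def log_likelihood(lam, y, sigma, g):
    """log p(y_1..y_N | lam, sigma) for y_i = g(lam) + eps_i, eps_i ~ N(0, sigma^2) iid."""
    r = y - g(lam)                           # residuals between data and prediction
    N = len(y)
    return (-0.5 * N * np.log(2.0 * np.pi * sigma**2)
            - 0.5 * np.sum(r**2) / sigma**2)
```

Working with the log avoids underflow when N is large, since the product of N Gaussian densities becomes a sum of log terms.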
Likelihood Modeling
This is frequently the core modeling challenge
- Error model: a statistical model for the discrepancy between the forward model and the data
- Composition of the error model with the forward model
Error model composed of discrepancy between:
- data and the truth (data error)
- model prediction and the truth (model error)
Mean bias and correlated/uncorrelated noise structure
Hierarchical Bayes modeling, and dependence trees: p(φ, θ|D) = p(φ|θ, D) p(θ|D)
Choice of observable: constraint on Quantity of Interest?
Exploring the Posterior
Given any sample λ, the un-normalized posterior probability can be easily computed:
p(λ|y) ∝ p(y|λ) p(λ)
Explore the posterior with Markov Chain Monte Carlo (MCMC)
Metropolis-Hastings algorithm: random walk with proposal PDF & rejection rules
Computationally intensive, O(10^5) samples; each sample requires an evaluation of the forward model: surrogate models
Evaluate moments/marginals from the MCMC statistics
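A random-walk Metropolis-Hastings sketch for a one-parameter problem (synthetic data and a flat prior; my own toy setup, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic data: y_i = g(lam) + eps_i with g(lam) = lam and iid Gaussian noise
lam_true, sigma, N = 1.5, 0.3, 50
y = lam_true + sigma * rng.standard_normal(N)

def log_post(lam):
    """Unnormalized log posterior: Gaussian likelihood times a flat prior on (-10, 10)."""
    if not -10.0 < lam < 10.0:
        return -np.inf
    return -0.5 * np.sum((y - lam) ** 2) / sigma**2

# Random-walk Metropolis-Hastings: propose, accept with probability
# min(1, posterior ratio), otherwise keep the current state
lam, chain = 0.0, []
for _ in range(20_000):
    prop = lam + 0.2 * rng.standard_normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(lam):
        lam = prop
    chain.append(lam)

post = np.array(chain[5_000:])   # discard burn-in before computing statistics
```

Each iteration calls `log_post`, i.e. the forward model, which is why surrogate models matter when a single model evaluation is expensive.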
Bayesian inference illustration: noise uncertainty
Data: y = 2x² − 3x + 5 + ɛ, ɛ ~ N(0, σ²), σ = {0.1, 0.5, …}
Fit model: y = ax² + bx + c
[Figure: marginal posterior density p(a, c) for each noise level]
Bayesian inference: High Dimensionality Challenge
Judgement on local/global posterior peaks is difficult: multiple chains; tempering
Choosing a good starting point is very important: an initial optimization strategy is useful, albeit not trivial
Choosing good MCMC proposals, and attaining good mixing:
- Likelihood-informed: Markov jump in those dimensions informed by data; sample from prior in the complement of dimensions
- Adaptive: proposal learning from MCMC samples
- Log-Posterior Hessian: local Gaussian approximation
- Adaptive, Geometric, Langevin MCMC
- Dimension-independent proposal design: good MCMC performance in high dimensions
Literature: A. Stuart, M. Girolami, K. Law, T. Cui, Y. Marzouk (Law 2014; Cui et al., 2014, 2015; Cotter et al., 2013)
Bayesian inference: Model Error Challenge
Quantifying model error, as distinct from data noise, is important for assessing confidence in model validity
Conventional statistical methods for representation of model error have shortcomings when applied to physical models
New methods are under development for model error, such that:
- physical constraints are satisfied
- disambiguation of model error and data noise is feasible
- calibrated model-error terms adequately impact all model outputs of interest
- uncertainties in predictions from the calibrated model reflect the range of discrepancy from the truth
Embed model error in submodel components where approximations exist (K. Sargsyan et al., 2015)
Quadratic-fit: Classical Bayesian likelihood
[Figures for N = 20, 50, 200: truth function g(x), observations {y_i} with noise, mean pushed-forward posterior E_PF[f], ±1σ pushed-forward posterior V_PF[f]]
With additional data, predictive uncertainty around the wrong model is indefinitely reducible
Predictive uncertainty is not indicative of discrepancy from the truth
Quadratic-fit: ModErr MargGauss
[Figures for N = 20, 50, 200: truth function g(x), observations {y_i} with noise, mean pushed-forward posterior E_PF[f], ±1σ pushed-forward posterior V_PF[f], ±1σ pushed-forward posterior model-error term E_α[V_ξ[f]]]
With additional data, predictive uncertainty due to data noise is reducible
Predictive uncertainty due to model error is not reducible
Quadratic-fit: ModErr MargGauss: Variance
[Figure: pushed-forward variance V_PF[f], model error E_α[V_ξ[f]], and data error V_α[E_ξ[f]] versus N = 10^1 to 10^6]
Calibrating a quadratic f(x) w.r.t. g(x) = 6 + x² + 0.5(x + 1)^3.5
Model Evidence and Complexity
Let M = {M_1, M_2, …} be a set of models of interest
Parameter estimation from data is conditioned on the model:
p(θ|D, M_k) = p(D|θ, M_k) π(θ|M_k) / p(D|M_k)
Evidence (marginal likelihood) for M_k: p(D|M_k) = ∫ p(D|θ, M_k) π(θ|M_k) dθ
Model evidence is useful for model selection: choose the model with maximum evidence
Compromise between fitting the data and model complexity:
- Optimal complexity: Occam's razor principle
- Avoid overfitting
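For a toy conjugate problem (one Gaussian datum with a Gaussian prior on a location parameter; my example, not from the slides), the evidence integral can be checked against its closed form:

```python
import numpy as np

def gauss(x, m, s2):
    """Normal density N(x; m, s2)."""
    return np.exp(-0.5 * (x - m) ** 2 / s2) / np.sqrt(2.0 * np.pi * s2)

# One datum y for the model y = c + noise, with prior c ~ N(0, tau2)
y, sigma2, tau2 = 1.2, 0.5, 2.0
c = np.linspace(-10.0, 10.0, 2001)

# p(D | M) = int p(D | c, M) pi(c | M) dc, here by a simple Riemann sum
integrand = gauss(y, c, sigma2) * gauss(c, 0.0, tau2)
evidence = float(np.sum(integrand) * (c[1] - c[0]))

# Conjugate closed form: the evidence is N(y; 0, sigma2 + tau2)
exact = float(gauss(y, 0.0, sigma2 + tau2))
```

A diffuse prior (large `tau2`) lowers the evidence for the same fit quality, which is exactly the built-in complexity penalty the slide invokes.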
Too much model complexity leads to overfitting
Data model: y_i = x_i³ + x_i² − 6 + ɛ_i, ɛ_i ~ N(0, s), i = 1, …, N
Bayesian regression with Legendre PCE fit models, order 1-10: y_m = Σ_{k=0}^P c_k ψ_k(x)
Uniform priors π(c_k), k = 0, …, P
Fitted model pushed-forward posterior versus the data
[Figures: fitted model, noisy data, and true function, for fit orders 1 through 10]
Evidence and Cross-Validation Error
Model evidence peaks at the true polynomial order of 3
Cross-validation error is likewise minimal at order 3
Models with optimal complexity are robust to cross-validation
[Figure: log(evidence) and log(validation error) versus fit order 1-10]
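The cross-validation side can be sketched with ordinary least-squares polynomial fits standing in for the Bayesian regression (same cubic data model as the slide, my own noise level and sample sizes):

```python
import numpy as np

rng = np.random.default_rng(3)

def truth(x):
    return x**3 + x**2 - 6.0      # the slide's data-generating cubic

# Training and held-out validation sets with additive Gaussian noise
x_tr = rng.uniform(-1.0, 1.0, 40)
y_tr = truth(x_tr) + 0.1 * rng.standard_normal(40)
x_va = rng.uniform(-1.0, 1.0, 200)
y_va = truth(x_va) + 0.1 * rng.standard_normal(200)

# Least-squares polynomial fits of increasing order; RMS error on held-out data
val_err = {}
for order in range(1, 11):
    coef = np.polyfit(x_tr, y_tr, order)
    r = y_va - np.polyval(coef, x_va)
    val_err[order] = float(np.sqrt(np.mean(r**2)))
```

Validation error drops sharply up to order 3 and then stops improving, since higher orders only chase the noise; this mirrors the evidence curve on the slide.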
Closure
Probabilistic UQ framework
- Polynomial Chaos representation of random variables
Forward UQ
- Intrusive and non-intrusive forward PC UQ methods
Inverse UQ
- Parameter estimation via Bayesian inference
- Model error
- Model complexity
Challenges: high dimensionality; intrusive Galerkin stability; nonlinearity; time dynamics; model error