Risk-Informed Safety Margin Characterization
Uncertainty Quantification and Validation Using RAVEN
A. Alfonsi, C. Rabiti
North Carolina State University, Raleigh, 06/28/2017
https://lwrs.inl.gov

Assumptions
- Numerical error is negligible (mostly reducible)
- Verification has been performed (the code is bug free)
- All uncertain parameters are accessible (eases sampling needs)
- All uncertain parameter distributions are known (otherwise a Bayesian approach is needed)

Validation Process
- Select a number of experiments
- Perform uncertainty propagation for all experiments (UQ)
- Compare simulation with experiments (validation versus experiments)
- Determine the uncertainties in the target prediction (extrapolation)
RAVEN's current capabilities cover:
- Uncertainty Quantification (mature)
- Validation vs. experiments (initial)
- Extrapolation (planned)

UQ

UQ Process
Goal: determine the Probability Distribution Function (PDF) of the Figures of Merit (FOMs).
- Select the right sampler, based on the number of variables and the nonlinearity of the model
- Sample the model according to the chosen sampler and distributions
- Analyze the FOM dispersion (mean, sigma, etc.)
A minimal sketch of this loop is given below.
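As a concrete illustration (not RAVEN itself), the following Python sketch propagates assumed input distributions through a toy analytic model standing in for the simulation code and then summarizes the FOM dispersion; the model and distributions are hypothetical.

```python
# A minimal forward-UQ sketch (not RAVEN itself): propagate assumed input
# distributions through a toy analytic model standing in for the simulation
# code, then characterize the dispersion of the FOM.
import numpy as np

rng = np.random.default_rng(42)

def model(x1, x2):
    # Hypothetical figure of merit in place of an external code run.
    return np.exp(-x1) + 0.5 * x2 ** 2

n = 10_000
x1 = rng.normal(loc=1.0, scale=0.1, size=n)     # assumed Normal input
x2 = rng.uniform(low=0.0, high=2.0, size=n)     # assumed Uniform input

fom = model(x1, x2)
print(f"mean = {fom.mean():.4f}, sigma = {fom.std(ddof=1):.4f}")
```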

Choice of the Sampling Strategy (decision diagram)
- High number of variables or high nonlinearity: random and quasi-random samplers, followed by statistical post-processing of the samples.
- Medium/low number of variables and low nonlinearity: surrogate-building samplers; if possible, build a surrogate and extract the statistical moments directly from it.

Sampling Strategies
RAVEN supports many forward samplers:
- Monte Carlo
- Grids: equally spaced in probability, equally spaced in value, or mixed (probability, custom, value); custom (user-provided values/probabilities)
- Stratified (LHS type): equally spaced in probability, equally spaced in value, or mixed (probability, custom, value); custom (user-provided values/probabilities)
- Generalized stochastic collocation polynomial chaos
A different sampling strategy can be associated with each variable separately. A sketch contrasting Monte Carlo and stratified sampling follows.
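The sketch below contrasts plain Monte Carlo with stratified (LHS-type) sampling for two assumed input distributions using scipy; it illustrates the two strategies generically and is not RAVEN's sampler code.

```python
# Sketch contrasting Monte Carlo and stratified (LHS-type) sampling of two
# uncertain inputs; the distributions and sample size are illustrative only.
import numpy as np
from scipy import stats
from scipy.stats import qmc

n = 256
rng = np.random.default_rng(0)

# Plain Monte Carlo: independent random draws from each input distribution.
mc = np.column_stack([
    stats.norm(loc=1.0, scale=0.1).rvs(size=n, random_state=rng),
    stats.uniform(loc=0.0, scale=2.0).rvs(size=n, random_state=rng),
])

# Stratified (Latin hypercube): equal-probability strata per variable,
# mapped to physical values through each distribution's inverse CDF.
u = qmc.LatinHypercube(d=2, seed=0).random(n)
lhs = np.column_stack([
    stats.norm(loc=1.0, scale=0.1).ppf(u[:, 0]),
    stats.uniform(loc=0.0, scale=2.0).ppf(u[:, 1]),
])

print("MC means :", mc.mean(axis=0))
print("LHS means:", lhs.mean(axis=0))
```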

Sampling Strategies (cont.)
Factorial designs:
- General full factorial (grid)
- 2-level fractional factorial
- Plackett-Burman
Response surface designs:
- Box-Behnken
- Central composite

Standard Distributions (CROW)
Most commonly used 1D distributions and whether a truncated form is available:
  Bernoulli    no      Laplace     yes
  Beta         yes     Logistic    yes
  Binomial     no      Lognormal   yes
  Categorical  no      Normal      yes
  Custom1D     no      Poisson     no
  Exponential  yes     Triangular  yes
  Gamma        yes     Uniform     yes
  Geometric    no      Weibull     yes
A sketch of sampling a truncated distribution follows.
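The following sketch shows one way to draw from a truncated Normal with scipy; the parameters are hypothetical and this is a generic illustration, not the CROW truncation interface itself.

```python
# Sketch of drawing from a truncated Normal on [lo, hi] with scipy; this is a
# generic illustration, not the CROW library's own truncation interface.
import numpy as np
from scipy.stats import truncnorm

mu, sigma, lo, hi = 900.0, 50.0, 800.0, 1000.0    # hypothetical input in kelvin
a, b = (lo - mu) / sigma, (hi - mu) / sigma       # bounds in standard-deviation units
samples = truncnorm(a, b, loc=mu, scale=sigma).rvs(size=5000, random_state=1)
print(samples.min(), samples.max(), samples.mean())
```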

N-Dimensional Distributions (CROW)
Custom N-dimensional distributions can be imported from file:
- N-dimensional splines on Cartesian grids
- Inverse-weight interpolation
- Microsphere interpolation
Sampling of N-dimensional distributions is performed by unbiased random inversion (a 1D illustration is sketched below).
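A one-dimensional illustration of sampling by CDF inversion from tabulated data is sketched below; the tabulated density is assumed for the example, and the N-dimensional case additionally relies on the interpolation schemes listed above.

```python
# 1D illustration of unbiased sampling by CDF inversion from tabulated data;
# the N-dimensional case is not reproduced here.
import numpy as np

rng = np.random.default_rng(3)
x_grid = np.linspace(0.0, 10.0, 101)            # tabulated support
pdf = np.exp(-0.5 * (x_grid - 4.0) ** 2)        # tabulated (unnormalized) density
cdf = np.cumsum(pdf)
cdf = (cdf - cdf[0]) / (cdf[-1] - cdf[0])       # normalize to a proper CDF on the grid

u = rng.random(10_000)                          # uniform variates
samples = np.interp(u, cdf, x_grid)             # invert: u -> x such that CDF(x) = u
print(samples.mean(), samples.std())
```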

Available Surrogate Models
- Nearest neighbor (KD-tree based)
- Support vector machine: polynomial kernel, Gaussian kernels
- Radial basis functions
- Microsphere
- Inverse weight
- N-dimensional spline
- Gaussian process
- Polynomial (stochastic and not)
- Linear regressors
- Ensemble models
- Many more (raven.inl.gov)
A Gaussian-process example is sketched below.
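As an example of one entry in this list, the sketch below trains a Gaussian-process surrogate on a handful of runs of a hypothetical model using scikit-learn; it does not reproduce RAVEN's ROM interface.

```python
# Sketch of one surrogate from the list above: a Gaussian-process ROM trained
# on a few runs of a hypothetical model, using scikit-learn (assumed available).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(7)
X_train = rng.uniform(0.0, 2.0, size=(40, 2))                   # sampled inputs
y_train = np.exp(-X_train[:, 0]) + 0.5 * X_train[:, 1] ** 2     # toy model outputs

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(X_train, y_train)

X_new = rng.uniform(0.0, 2.0, size=(5, 2))
y_pred, y_std = gp.predict(X_new, return_std=True)              # prediction + uncertainty
print(y_pred, y_std)
```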

A Road Map for Collocation Methods (diagram)
Assumptions: ~tens of parameters; polynomial assumption (not linear but continuous); looking for the FOM distribution; black-box approach.
Building blocks: a probability-weighted Legendre (modal) expansion of the response in the original space, generalized into SCgPC and HDMR (decomposing the variance), with adaptive variants of both for optimizing resources.

Stochastic Collocation Generalized Polynomial Chaos (SCgPC) and Sobolev-Index-Based Methods
Usually weighted by the probability.
- Full SCgPC (~10): a priori knowledge of the degree of the function is imposed
- Full generalized Sobolev decomposition (~10): a priori knowledge of the degree of the function is imposed
- Sparse grid (~10): known separability is required
- Adaptive SCgPC (~100): separability and almost-linearity improve performance
- Adaptive generalized Sobolev decomposition (~100): separability and almost-linearity improve performance
A one-dimensional Legendre-expansion sketch follows.
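A minimal one-dimensional sketch of the underlying idea, a probability-weighted Legendre expansion built on Gauss-Legendre collocation points for a single uniform input, is given below; the response function is hypothetical and the multidimensional/sparse-grid machinery of SCgPC and HDMR is not reproduced.

```python
# Minimal 1D sketch of a probability-weighted Legendre (gPC) expansion for a
# single uniform input on [-1, 1], built on Gauss-Legendre collocation points.
import numpy as np
from numpy.polynomial import legendre

def f(x):
    return np.exp(x)                 # hypothetical response of interest

order = 6
nodes, weights = legendre.leggauss(order + 1)    # collocation points and weights

# c_k = (2k + 1)/2 * integral of f(x) P_k(x) dx over [-1, 1]
coeffs = np.array([
    (2 * k + 1) / 2.0
    * np.sum(weights * f(nodes) * legendre.legval(nodes, np.eye(order + 1)[k]))
    for k in range(order + 1)
])

# For a U(-1, 1) input: mean = c_0, variance = sum_{k>=1} c_k^2 / (2k + 1).
mean = coeffs[0]
var = np.sum(coeffs[1:] ** 2 / (2.0 * np.arange(1, order + 1) + 1.0))
print(mean, var)   # compare with sinh(1) ~ 1.1752 and sinh(2)/2 - sinh(1)^2 ~ 0.4323
```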

Grid Filling

Statistical Post-Processing
Statistical characterization of the output (uncertainty propagation):
- Mean
- Sigma
- Skewness (asymmetry)
- Kurtosis (more/less peaked than a standard normal)
Input/output relationships (ranking/sensitivity):
- Correlation matrix
- Covariance matrix
- Sensitivity matrix (multidimensional linear regression)
- Normalized sensitivity matrix (% change of the response per % change of the input)
A post-processing sketch follows.
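The sketch below computes these quantities for a synthetic input/output sample with numpy and scipy; the data and the linear toy model are assumed for illustration only.

```python
# Sketch of the statistical post-processing step: output moments plus a
# linear-regression sensitivity matrix; the inputs and FOM are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
X = rng.normal(size=(5000, 2))                                    # sampled inputs
y = 3.0 * X[:, 0] - 1.5 * X[:, 1] + 0.1 * rng.normal(size=5000)   # toy FOM

print("mean     ", y.mean())
print("sigma    ", y.std(ddof=1))
print("skewness ", stats.skew(y))
print("kurtosis ", stats.kurtosis(y))          # excess kurtosis (0 for a normal)
print("cov(X, y)", np.cov(X.T, y)[-1, :-1])    # covariance of the FOM with each input

# Sensitivity matrix from a multidimensional linear regression (least squares).
A = np.column_stack([np.ones(y.size), X])
sensitivities = np.linalg.lstsq(A, y, rcond=None)[0][1:]
print("sensitivities", sensitivities)
```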

Example: BISON + RAVEN
(Figures: power history, power spike, geometry.)
Reference: J. Cogliati, J. Chen, J. Patel, D. Mandelli, D. Maljovec, A. Alfonsi, P. Talbot, C. Wang, C. Rabiti, "Time-Dependent Data Mining in RAVEN," INL/EXT-16-39860.

Uncertainty Input

Figures of Merit

Uncertainty on the FOMs
7000 Monte Carlo runs; 128 BISON simulations run simultaneously, each using 16 MPI processes (2048 cores in use at a time).

Contribution to Dispersion: Covariance

Validation

A Probabilistic Reading of Experimental Data
Experimental data consist of input variables (initial and boundary conditions) and experimental measurements (the Figures of Merit, FOMs). Uncertainty affects both inputs and readings:
- Probability distribution of the input space
- Probability distribution of the experimental readings (FOMs)
(Figures: example input and FOM probability distributions.)

...Then the Comparison
(Diagram: the probabilistic input feeds both the model and the experiment; the probabilistic code output is then compared against the probabilistic experimental reading.)

The Process
(Diagram.) Input distribution -> sampling strategy -> reconstruction of the FOM distribution (binning / density function estimators, grid-based reconstruction, ROMs) -> analytical expression of the FOMs' CDF -> comparative metrics.

Reconstruction of the FOM Distribution
The goal is to obtain a numerical representation of the probability density function of a set of points in the output space. The least distorting representation is generated by binning (a histogram). The number of bins and their boundaries should be chosen to regularize the function without altering its meaning.
Binning rules for the number of bins k given n samples:
- Square root: k = ceil(sqrt(n))
- Sturges' formula: k = ceil(log2(n)) + 1
Both rules are applied in the sketch below.
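The two rules are easy to apply directly, as in the following sketch on a synthetic FOM sample of the same size as in the next slide.

```python
# Sketch of the square-root and Sturges rules for choosing the number of bins,
# applied to a synthetic FOM sample.
import numpy as np

rng = np.random.default_rng(5)
fom = rng.normal(loc=0.196, scale=0.002, size=8300)   # synthetic FOM sample

k_sqrt = int(np.ceil(np.sqrt(fom.size)))              # square-root rule
k_sturges = int(np.ceil(np.log2(fom.size))) + 1       # Sturges' formula

counts, edges = np.histogram(fom, bins=k_sturges)
print(k_sqrt, k_sturges, counts.sum())
```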

Next Step: How Do We Compare the Simulation to the Experiment?
Mean and sigma are not enough to compare the model output distributions to the experimental reading distributions. The metric should be more comprehensive and consider the whole PDF:
- Minkowski L1 metric
- Probability Distribution Function area metric
- Distance Probability Distribution Function
A sketch of the first metric is given below.
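The sketch below computes the L1 area between the empirical CDFs of a model-output sample and an experimental-reading sample; both samples are synthetic stand-ins for the hot-leg temperature data shown later.

```python
# Sketch of the Minkowski L1 (area) metric between the empirical CDFs of a
# model-output sample and an experimental-reading sample; data are synthetic.
import numpy as np

rng = np.random.default_rng(9)
model_T = rng.normal(409.0, 2.5, size=8300)   # e.g. simulated hot-leg temperature [K]
exp_T = rng.normal(410.0, 2.0, size=2000)     # e.g. measured hot-leg temperature [K]

grid = np.linspace(min(model_T.min(), exp_T.min()),
                   max(model_T.max(), exp_T.max()), 2000)
F_model = np.searchsorted(np.sort(model_T), grid, side="right") / model_T.size
F_exp = np.searchsorted(np.sort(exp_T), grid, side="right") / exp_T.size

# Integrate |F_model - F_exp| over the grid (lower value = better agreement).
area = np.sum(np.abs(F_model - F_exp)[:-1] * np.diff(grid))
print(f"L1 area metric = {area:.3f} K")
```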

Validation Objectives (Figures of Merit)
FOMs: mass flow rate, cold-leg temperature, hot-leg temperature.
Code: RELAP-7 (2014).

Uncertainties on Readings
The uncertainties on the figures of merit are tied to the type of measurement and are not specific to a particular detector or location:
- Pressure: ± 0.1 MPa (primary pressure)
- Power: ± 1 kW (core power)
- Mass flow: ± 0.033 kg/s (primary mass flow rate)
- Fluid temperature: ± 2 K (hot- and cold-leg temperature)

Binning of the Generated Data
Number of samples: 8300; optimized number of bins: 15.
  Bin midpoint   Bin count
  0.187921       1
  0.188897       4
  0.189874       17
  0.19085        63
  0.191826       205
  0.192802       518
  0.193778       938
  0.194755       1419
  0.195731       1638
  0.196707       1629
  0.197683       1042
  0.198659       541
  0.199636       213
  0.200612       64
  0.201588       13

Minkowski L1 Metric
(Figure: CDFs of the RELAP-7 output and of the experiment vs. hot-leg temperature [K].)
A = 3.029 K. A lower value means better agreement.

Probability Distribution Function Area Metric
(Figure: PDFs of the RELAP-7 output and of the experiment vs. hot-leg temperature [K].)
I = 0.2704 = 27%. A higher percentage means higher agreement.

Distance Probability Distribution Function
(Figure: PDF of the distance z between model and experiment, z [kg/s].)
µ(d) = 0.0662, σ(d) = 0.0325.

Use of Voronoi Tessellation
RAVEN possesses several sampling strategies. Different sampling strategies generate different fillings of the input space and different weight-point associations. The Voronoi tessellation provides a common statistical representation.
(Diagram: samplers such as Grid and LHS all feed a common Voronoi representation used for statistical post-processing.)

Voronoi Tessellation of the Response Space
(Figure: Voronoi tessellation of the sample points in the response space.)
For each point a weight is computed in the probability space. These weights are used to construct the variate distribution in the response space and, if applied to the input space, can be used to compute the statistical moments without ad-hoc strategies that depend on the sampling methodology. A sketch of this weighting follows.
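The sketch below illustrates the idea with scipy: samples in the CDF (probability) space are weighted by the area of their Voronoi cells inside the unit square (boundary cells are made bounded by mirroring the points across the square's edges), and the weights are then used for a weighted moment. It illustrates the weighting concept only and is not RAVEN's implementation; the response values are hypothetical.

```python
# Sketch of Voronoi-based reliability weights in the CDF (probability) space:
# each sample's weight is the area of its Voronoi cell inside the unit square.
# Boundary cells are bounded by mirroring the points across the square's edges.
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

rng = np.random.default_rng(13)
pts = rng.random((200, 2))               # samples mapped to the CDF space [0, 1]^2

# Mirror the point cloud across the four edges of the unit square.
mirrored = np.vstack([pts,
                      pts * [1, -1], pts * [1, -1] + [0, 2],
                      pts * [-1, 1], pts * [-1, 1] + [2, 0]])
vor = Voronoi(mirrored)

weights = np.empty(len(pts))
for i in range(len(pts)):                # the first len(pts) sites are the originals
    region = vor.regions[vor.point_region[i]]
    weights[i] = ConvexHull(vor.vertices[region]).volume   # 2-D "volume" = area
weights /= weights.sum()                 # reliability weights summing to one

fom = np.exp(-pts[:, 0]) + 0.5 * pts[:, 1] ** 2      # toy response values
print("weighted mean =", np.dot(weights, fom))
```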

Voronoi Tessellation Advantages
The methodology has been applied to generalize the validation methodologies previously implemented in RAVEN, demonstrating its validity. The generality of the approach provides the following advantages:
- The validation metrics do not depend on the employed sampling strategy
- If the reliability weights are computed in the CDF space, there is no need to create ad-hoc weight-generation strategies for future sampling methods
- The approach can be used for the approximate computation of joint probability functions (crucial for the computation of correlation and covariance matrices when the targeted variables are weighted differently)

Conclusions
- Distribution modeling and sampling strategies have a good degree of maturity
- Static comparison between model output and experimental distributions is feasible but at an early stage
- Developments are needed in: time-dependent comparison, and accounting for input/output correlations in the experimental data
- Extrapolation is a new field that will hopefully see growing capabilities in the next years

Thank you. Questions?

Sustaining National Nuclear Assets. https://lwrs.inl.gov