Uncertainty modeling for robust verifiable design
Arnold Neumaier, University of Vienna, Vienna, Austria

Safety

Safety studies in structural engineering are supposed to guard against failure in all reasonable situations encountered during the lifetime of a structure. What is reasonable is inferred on the basis of knowledge about the past performance of structures and their building blocks.

The hope is that the future will be sufficiently like the past, so that the unknown future of a (planned or existing) structure can be assessed based on this past information. Traditionally, the past information is summarized in the form of either probability distributions (stochastic model), worst-case bounds (interval model), or parameterized families of bounds (fuzzy set model).

Design

A general design problem involves the following variables:
- θ: design vector (wanted, controllable, low-dimensional)
- x: state vector (uncontrollable, high-dimensional)
- y: response vector (determined by θ and x)
- c: cost vector
- ε: failure probability

x contains variables for uncertain material parameters (length, cross section, elastic modulus, etc.), uncertain model parameters, etc.

Model of the design context

- $F_d(c, \theta) \in \mathbf{F}_d$: design restrictions
- $x \in X$: uncertain state information
- $E(\theta, x, y) = 0$: model equations ($\dim E = \dim y$)
- $F_s(c, y) \in \mathbf{F}_s$: response requirements
- $q(c, \varepsilon) \ge \underline{q}$: target quality constraint

Notation: $F \in \mathbf{F} = [\underline{F}, \overline{F}]$ means $\underline{F}_i \le F_i \le \overline{F}_i$ for all $i$ (this includes equality constraints and one-sided inequalities).

Most real-life design problems fall into this general framework. Cost-dependent constraints allow soft formulations!

Failure probability

The failure probability
$$\varepsilon := 1 - \Pr(F_s(c, y) \in \mathbf{F}_s \mid E(\theta, x, y) = 0)$$
is determined by the model only if the complete joint distribution of all state vector variables $x_i$ is available. This is never the case in real situations once $x$ contains more than a few variables.

People often substitute default independence or normality assumptions where in fact information is completely missing. In particular, tails of the joint distribution cannot be estimated reliably without an excessive amount of data. This may lead to drastically wrong assessments, since probabilistic design problems usually depend sensitively on these tails.

Theorem. The best possible bound for the probability that an $n$-dimensional random vector $x$ with uncorrelated components of mean zero and variance 1 has length at least $r$ is given by the generalized Chebyshev inequality
$$\Pr(\|x\|_2 \ge r) \le \min(1, n/r^2).$$
Although common practice, representing such a vector by a Gaussian distribution while the true distribution is in fact unknown gives far too optimistic results.

[Figure: failure probability $\Pr(\|x\|_2 \ge r)$ as a function of $r$ for $n = 1, 3, 10, 30, 100, 300$; each panel compares the Gaussian tail with the worst-case bound, on a log scale from $10^0$ down to $10^{-6}$.]
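The contrast shown in the figure is easy to reproduce numerically. Below is a minimal sketch (not from the slides): it assumes the Gaussian model takes $x$ standard normal, so that $\|x\|_2^2$ is $\chi^2_n$-distributed, and compares this tail with the worst-case Chebyshev bound from the theorem.

```python
# Gaussian tail vs. best possible (Chebyshev) bound for Pr(||x||_2 >= r).
# Assumption: under the Gaussian model, x is standard normal, so ||x||_2^2
# is chi-square with n degrees of freedom. Without distributional
# assumptions, only min(1, n/r^2) is valid.
from scipy.stats import chi2

def gaussian_tail(r: float, n: int) -> float:
    return chi2.sf(r**2, df=n)     # chi-square survival function at r^2

def worst_case(r: float, n: int) -> float:
    return min(1.0, n / r**2)      # generalized Chebyshev inequality

for n in (1, 3, 10, 30):
    r = 3.0 * n**0.5               # a "3 sigma"-like radius
    print(f"n={n:3d}  Gaussian={gaussian_tail(r, n):.1e}  "
          f"worst case={worst_case(r, n):.1e}")
```

The Gaussian tail shrinks rapidly with $n$ while the worst-case bound stays at $1/9$, which is exactly the over-optimism the theorem warns about.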

Even if the complete distribution of x were known (or simply assumed), computing the failure probability ε accurately is very difficult, and nearly impossible in high dimensions. Traditional approximations such as FORM and SORM often work well, but may fail without warning.

Quotes from the Bible (Contemporary English Version) We make our own plans, but the LORD decides where we will go. (Proverbs 16:9) We make our own decisions, but the LORD alone determines what happens. (Proverbs 16:33) We may throw the dice, but the LORD determines how they fall. (Proverbs 16:33, New Living Translation) We may make a lot of plans, but the LORD will do what he has decided. (Proverbs 19:21) How can we know what will happen to us when the LORD alone decides? (Proverbs 20:24)

Probabilistic design goal

Maximize the design quality:
$$\max\ q(c, \varepsilon) \quad \text{s.t.}\quad F_d(c, \theta) \in \mathbf{F}_d,\quad \Pr(F_s(c, y) \in \mathbf{F}_s \mid E(\theta, x, y) = 0) = 1 - \varepsilon,$$
...and check whether the requested quality $q \ge q_0$ was obtained. This is a stochastic program with an expensive, low-accuracy (hence noisy) objective function, often multimodal! SNOBFIT: www.mat.univie.ac.at/~neum/software/snobfit/

Robust, verifiable design replaces the inaccessible probabilistic model input by deterministic assumptions that can be justified reasonably well by a limited amount of data. In particular, it avoids using information which in fact is not reliably available. Within such a deterministic framework, a rigorous worst case analysis becomes possible. The results are (in principle) verifiable.

In the verifiable design methodology, probabilistic methods are replaced by techniques of global optimization. This provides guarantees for the model behavior that are as reliable as the information in the model and the description of the uncertain state. Currently the best global solvers are BARON and OQNLP:
http://archimedes.scs.uiuc.edu/baron/baron.html
http://www.opttek.com/products/gams.html

Clouds allow the representation of incomplete stochastic information in a clearly understandable and computationally attractive way. They describe the rough shapes of typical samples of various sizes, without fixing the details of the distribution. The use of clouds permits a worst case analysis without losing track of important probabilistic information. All computed probabilities, and hence the resulting designs, are safeguarded against perturbations due to unmodelled (and unavailable) information.

[Figure: a two-dimensional cloud, drawn as nested level regions for α = 20%, 40%, 60%, 80%, 100%.]

The special case of interest for large-scale models is a confocal cloud, defined by a continuous potential $V$ which assigns to each scenario $x$ a potential value $V(x)$ defining the shape of the cloud, and by a lower probability $\underline{\alpha}(U)$ and an upper probability $\overline{\alpha}(U)$ defining the fuzzy boundary of the cloud, such that, for all $U$,
$$\underline{\alpha}(U) \le \Pr(V(x) < U) \le \Pr(V(x) \le U) \le \overline{\alpha}(U).$$
$\underline{\alpha}$ and $\overline{\alpha}$ must be strictly increasing continuous functions of $U$ mapping the range of $V$ to $[0, 1]$. This corresponds to the cloud with level functions $\underline{x}(\xi) = \underline{\alpha}(V(\xi))$ and $\overline{x}(\xi) = \overline{\alpha}(V(\xi))$.

α-cuts

For a given failure probability $\varepsilon$ and $\alpha = 1 - \varepsilon$, the so-called α-cut describes an inner region $C_\alpha$ of α-relevant scenarios with $V(x) < U_\varepsilon$, and a (generally larger) region $\overline{C}_\alpha$ of α-reasonable scenarios with $V(x) < \overline{U}_\varepsilon$, where
$$\overline{\alpha}(U_\varepsilon) = 1 - \varepsilon, \qquad \underline{\alpha}(\overline{U}_\varepsilon) = 1 - \varepsilon.$$
The conditions defining the cloud guarantee that $U_\varepsilon \le \overline{U}_\varepsilon$, and that there is a region $C$ with $C_\alpha \subseteq C \subseteq \overline{C}_\alpha$ containing a fraction $\alpha$ of all scenarios considered possible.
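To make the α-cut construction concrete, here is a minimal sketch of an empirical confocal cloud. The sampled distribution, the potential, and the CDF-widening margin δ are illustrative assumptions, not the slides' construction (the next paragraph notes that building appropriate clouds is a research topic):

```python
# A minimal sketch of alpha-cuts for an empirical cloud: widen the empirical
# CDF of V(x) by a margin delta to get the lower/upper probabilities, then
# solve alpha_upper(U_eps) = 1 - eps and alpha_lower(U_eps_bar) = 1 - eps.
# Distribution, potential, and delta are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((10_000, 3))          # sampled scenarios
V = np.sum(x**2, axis=1)                      # elliptical potential V(x) = ||x||^2

delta = 0.02                                  # assumed widening of the empirical CDF
eps = 0.05                                    # admissible failure probability

# With alpha_upper(U) = ecdf(U) + delta and alpha_lower(U) = ecdf(U) - delta:
U_eps     = np.quantile(V, 1 - eps - delta)              # cut of the alpha-relevant region
U_eps_bar = np.quantile(V, min(1.0, 1 - eps + delta))    # cut of the alpha-reasonable region

print(f"U_eps = {U_eps:.2f} <= U_eps_bar = {U_eps_bar:.2f}")
```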

The potential determines the shape of the cloud. In particular,
$$V(x) = \max_k |x_k - \mu_k| / r_k$$
defines rectangular clouds (a sort of fuzzy boxes), and
$$V(x) = \|Ax - b\|_2^2$$
defines elliptical clouds (a sort of fuzzy ellipsoids). The construction of appropriate clouds from qualitative or quantitative evidence is a current research topic. Elliptical clouds are probably appropriate in many applications. However, current numerical experience is restricted to rectangular clouds.
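Both potentials are one-liners; the parameters μ, r, A, b below are placeholders that would in practice be fitted to data or expert knowledge:

```python
# The two potential shapes named above; mu, r, A, b are placeholder
# parameters, not values from the slides.
import numpy as np

def rectangular_potential(x, mu, r):
    """V(x) = max_k |x_k - mu_k| / r_k; level sets are boxes ("fuzzy boxes")."""
    return np.max(np.abs(x - mu) / r)

def elliptical_potential(x, A, b):
    """V(x) = ||A x - b||_2^2; level sets are ellipsoids ("fuzzy ellipsoids")."""
    d = A @ x - b
    return float(d @ d)

x = np.array([1.0, 2.0, 0.5])
print(rectangular_potential(x, mu=np.zeros(3), r=np.ones(3)))   # 2.0
print(elliptical_potential(x, A=np.eye(3), b=np.zeros(3)))      # 5.25
```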

In robust verifiable design, one considers a design safe (at the given admissible failure probability $\varepsilon = 1 - \alpha$) if all α-reasonable scenarios satisfy the response requirements. One considers a design unsafe if some α-relevant scenario violates the response requirements. In between there is a grey zone of borderline cases, where more detailed statistical information would be required to fully analyze the safety.

In the simplest case, the safety requirement is given by a single constraint $\Pr(s(x) \le 0) \le \varepsilon$. In this case, the design analysis in terms of a given cloud can be based on the solution of two constraint satisfaction problems involving the safety margin $s(x)$ and the potential $V(x)$.

$$(\text{CSP1})\quad V(x) \le \overline{U}_\varepsilon,\ s(x) \le 0 \qquad\qquad (\text{CSP2})\quad V(x) \le U_\varepsilon,\ s(x) \le 0$$

                      (CSP2) solvable    (CSP2) inconsistent
(CSP1) solvable       unsafe             borderline
(CSP1) inconsistent   (cannot occur)     safe

The combination "(CSP1) inconsistent, (CSP2) solvable" cannot occur, since $U_\varepsilon \le \overline{U}_\varepsilon$. Note that even in the borderline case, useful information for the designer is available: the solution of (CSP1) may in this case be considered as a weakly unsafe scenario.
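The following heuristic sketch runs this decision table for a fixed design, assuming failure means $s(x) \le 0$. A local solver can only exhibit failing scenarios; certifying that a CSP is inconsistent (the "safe" verdict) requires a complete global solver such as BARON. The potential, margin, and cut levels are illustrative assumptions:

```python
# Heuristic (CSP1)/(CSP2) safety decision for a fixed design. V, s, and the
# cut levels are illustrative assumptions, not values from the slides.
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

V = lambda x: x @ x                   # elliptical potential (assumed)
s = lambda x: 3.3 - np.sum(x)         # safety margin (assumed): failure iff s <= 0
U_eps, U_eps_bar = 3.0, 4.5           # alpha-cut levels from the cloud (assumed)

def failing_scenario(U):
    """Search for x with V(x) <= U and s(x) <= 0 by minimizing s over the cut."""
    res = minimize(s, np.zeros(3),
                   constraints=[NonlinearConstraint(V, -np.inf, U)])
    return res.x if (res.success and s(res.x) <= 1e-8) else None

if failing_scenario(U_eps) is not None:         # (CSP2) solvable
    print("unsafe: an alpha-relevant scenario fails")
elif failing_scenario(U_eps_bar) is not None:   # only (CSP1) solvable
    print("borderline: weakly unsafe scenario found")
else:
    print("safe (heuristically; verification needs global optimization)")
```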

A positive decision about safety guarantees that the failure probability is below ε for all possible distributions compatible with the cloud. Moreover, for any unsafe or borderline case, there is at least one distribution compatible with the cloud for which the failure probability is not below ε.

Note that a single unsafe scenario is considered sufficient ground for rejecting the design. This has no analogue in probabilistic safety design, but is important in a number of applications: The curiosity of children often makes the most dangerous scenarios the most attractive ones for them. If the workspace of a robot contains hypersurfaces of singularities, crossing them is dangerous even if they occupy only a negligible part of the volume; the crossing probability may be very large even if the probability of being there is tiny.

As in any worst case analysis, the design resulting from modeling with clouds will be marginally safe for some probability distribution compatible with the information in the cloud, and oversafe for many other probability distributions compatible with the cloud. This appears to be the appropriate design goal in all situations where safety is critical.

Monte Carlo simulation

In many cases, simulation models for generating reasonable scenarios are available. These can be used in Monte Carlo studies to estimate failure probabilities. The resulting failure probabilities are usually not very robust under variations of the simulation model.

Moreover, Monte Carlo simulations typically require a large number of scenarios, especially when tiny failure probabilities are required. The evaluation of a design on a single scenario is typically already expensive (the solution of a finite element problem), which makes such Monte Carlo studies very expensive.
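To quantify the cost: a crude Monte Carlo estimator of ε needs on the order of 100/ε scenario evaluations for roughly 10% relative accuracy. In the sketch below, a cheap analytic limit state stands in for an expensive finite element evaluation; its threshold is an assumption chosen so that ε ≈ 10⁻⁴:

```python
# Crude Monte Carlo estimation of a tiny failure probability. The analytic
# limit state g stands in for an expensive finite-element evaluation; with
# eps ~ 1e-4, about 10^6 evaluations give only ~10% relative error.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000                          # ~100 / eps scenarios

x = rng.standard_normal((n, 2))        # sampled scenarios
g = 5.26 - x.sum(axis=1)               # failure iff g(x) <= 0 (assumed model)
eps_hat = np.mean(g <= 0.0)

rel_err = 1.0 / np.sqrt(n * eps_hat)   # coefficient of variation ~ 1/sqrt(n*eps)
print(f"eps_hat = {eps_hat:.2e}, relative error ~ {rel_err:.0%}")
```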

Simulation in deterministic design

In robust verifiable design, a detailed simulation model can be used to create a large set of representative samples, from which a robust cloud is estimated. This is the only (and relatively cheap) part involving Monte Carlo techniques, and it can be done independently of the evaluation of the design quality or later design optimization. (Alternatively, a suitable cloud could be determined based on other information, including expert opinion, aggregated information on subsets of variables, etc.)

For design optimization, only the information contained in the cloud is used. This gives robustness to the design, since it takes account of alternative distributions deviating from the simulation model while being consistent with the derived cloud. There are also computational advantages since the deterministic evaluation of the quality of a design often needs many fewer scenarios than the determination of the failure probability by Monte Carlo simulation.

Deterministic design optimization

Maximize the design quality:
$$\max\ q(c, \varepsilon) \quad \text{s.t.}\quad F_d(c, \theta) \in \mathbf{F}_d, \qquad E(\theta, x, y) = 0,\ V(x) \le \overline{U}_\varepsilon \ \Rightarrow\ F_s(c, y) \in \mathbf{F}_s.$$
The implication constraint can be written as
$$E(\theta, x, y) = 0,\ V(x) \le \overline{U}_\varepsilon,\ F_s(c, y) \notin \mathbf{F}_s \quad \text{impossible}.$$
Thus we have a bilevel optimization problem in which the inner problem involves the negative solution of a CSP (a certificate of its inconsistency). Solution techniques for such problems are a topic of current research.
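To make the bilevel structure explicit, the implication can be rewritten as a worst-case violation constraint. The reformulation below, and the violation measure viol, are standard semi-infinite-programming notation rather than the slides' own:

```latex
% Outer problem: optimize the design. Inner problem: search the
% alpha-reasonable region for the worst violation of the response
% requirements; the design is feasible iff the worst violation is <= 0.
\begin{aligned}
\max_{c,\theta,\varepsilon}\quad & q(c,\varepsilon)\\
\text{s.t.}\quad & F_d(c,\theta)\in\mathbf{F}_d,\\
& \max_{x,y}\ \bigl\{\operatorname{viol}(F_s(c,y),\mathbf{F}_s)
   \;:\; E(\theta,x,y)=0,\ V(x)\le\overline{U}_\varepsilon\bigr\}\ \le\ 0,
\end{aligned}
\qquad
\operatorname{viol}(F,[\underline{F},\overline{F}])
 := \max_i \max(\underline{F}_i-F_i,\ F_i-\overline{F}_i).
```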

Conclusions

- Clouds provide a new class of imprecise probability models for robust design.
- They have precise and intuitive semantics, based on accessible information only.
- A poor fit of clouds makes the estimates more conservative, but not wrong.
- Large-scale problems are tractable via global optimization.
- Optimal design with clouds leads to bilevel optimization problems.

BARON (complete global optimization): http://archimedes.scs.uiuc.edu/baron/baron.html
SNOBFIT (optimizes noisy, expensive functions): www.mat.univie.ac.at/~neum/software/snobfit/
Clouds (codes limited probabilistic information): www.mat.univie.ac.at/~neum/papers.html#cloud
Surprise (codes qualitative information): www.mat.univie.ac.at/~neum/papers.html#fuzzy
The present slides (robust, verifiable design): www.mat.univie.ac.at/~neum/ms/uncslides.pdf