Risk Elicitation in Complex Systems: Application to Spacecraft Re-entry


Risk Elicitation in Complex Systems: Application to Spacecraft Re-entry
Simon Wilson 1, Cristina De Persis 1, Irene Huertas 2, Guillermo Ortega 2
1 School of Computer Science and Statistics, Trinity College Dublin
2 European Space Research and Technology Center, ESA-Noordwijk
20th May 2016

The motivation

The motivation
An estimated 5400 tonnes of material has survived re-entry from orbit over the last 40 years; no reported casualties; more than 50 debris objects recovered and documented.

The motivation Surviving fragments pose risk to people and property; Greatest risk is probably the regulatory effect on the industry if there were a fatality; Sophisticated deterministic models of re-entry exist, based on finite element approaches: No attempt to discuss uncertainties; Models fail in cases of a highly energetic break-up event; Number of re-entries, controlled and not controlled, is increasing: Seen as a way to control the space junk problem.

The general research question
Implement and evaluate a statistical risk assessment model that can derive the probability of the top event (explosion), based on a combination of expert opinion and (sparse) data. The challenges: there is (and will only ever be) limited data; expert opinion is diverse (no one is an expert on everything); access to experts is time-limited; there may be large variations in the conditions surrounding the event.

Stage 1: Model

Fault tree, top event: Explosion (OR of):
- Chemical reaction propellant + air, caused by slow release of propellant (OR): valve leakage, tank destruction, pipe rupture;
- Chemical reaction between hypergolic propellants, caused by simultaneous release of hypergolic propellants, via sudden release of propellant (OR): burst of a pressure vessel;
- Burst of battery cells, caused by exothermal chemical reactions (OR): overpressure, short-circuit, overcharge, overdischarge, corrosion.

Probabilistic fault tree
Build a fault tree of events that lead to failure; assign a probability θ_1j to each elementary event j = 1, ..., N. Under an assumption of independence, this implies the probability of intermediate and top events, e.g.

Prob(Slow release of propellant) = 1 − (1 − Prob(Valve leakage)) (1 − Prob(Tank destruction)) (1 − Prob(Pipe rupture)).

The model is parameterised by the elementary event probabilities θ_1j only.
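The OR-gate computation above can be sketched in a few lines; the event names and probabilities below are illustrative placeholders, not elicited values from the talk.

```python
def or_gate(probs):
    """P(at least one of several independent events occurs)."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

def and_gate(probs):
    """P(all of several independent events occur)."""
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all

# Slow release of propellant = OR(valve leakage, tank destruction, pipe rupture);
# the theta values here are made up for illustration.
theta = {"valve_leakage": 0.01, "tank_destruction": 0.005, "pipe_rupture": 0.02}
p_slow_release = or_gate(theta.values())
```

Because the tree is built from such gates, the probability of any intermediate or top event follows by composing these two functions up the tree.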

Stage 2: Elicitation
Need a prior on the probability θ_1j of each elementary event; group these events by expert (one expert per group of events); everything is discussed with respect to nominal conditions. Elicitation is a time-consuming and difficult process, so: ask each expert to specify a probability distribution for only one of the elementary events in their group.

The beta distribution

p(θ) = Γ(α + β) / (Γ(α) Γ(β)) · θ^(α−1) (1 − θ)^(β−1),   0 ≤ θ ≤ 1.

Elicit values of α and β. [Plot: beta densities p(θ | opinions) for θ in (0, 1).]

Pairwise comparisons of event probabilities
We use an idea from the analytic hierarchy process (AHP) to rank the θ_1j. Experts are asked to specify, based on their knowledge and experience, whether the occurrence of an event is equally (= 1), moderately more (= 3), strongly more (= 5), very strongly more (= 7) or absolutely more (= 9) probable than another. AHP maps these comparisons to a weight w_j for each event. Is this better than just using the raw comparison as a weight?
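As a sketch of the AHP step, the standard principal-eigenvector method turns a reciprocal pairwise-comparison matrix on the 1-9 scale into normalised weights; the 3x3 matrix below is invented for illustration, not taken from the talk.

```python
import numpy as np

def ahp_weights(A):
    """Normalised principal eigenvector of a reciprocal comparison matrix."""
    vals, vecs = np.linalg.eig(np.asarray(A, dtype=float))
    k = np.argmax(vals.real)           # the Perron root is real and dominant
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

# A[i][j] = how much more probable event i is than event j (1, 3, 5, 7, 9 scale)
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]
w = ahp_weights(A)                     # weights sum to 1
```

The eigenvector is preferred over raw comparisons because it averages out inconsistency across the whole matrix rather than trusting any single pairwise judgement.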

Mapping AHP weights to prior distributions
We have one elicited beta prior, say for event j: p(θ_1j); we have a weight w_j for each event; w_j > w_j′ means event j is more likely than event j′, so θ_1j >_st θ_1j′ (stochastically larger). We take a high prior probability interval for θ_1j, say (θ_L, θ_U) with P(θ_L < θ_1j < θ_U) = 0.95, and create the equivalent interval for each other θ_1j′:

(w_j′ / w_j) θ_L < θ_1j′ < min{ 1, (w_j′ / w_j) θ_U }.

Identify a beta distribution with these 95% probability limits.
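The last step, finding a beta distribution with given 95% limits, has no closed form. Here is a crude stdlib-only sketch: fix the mean at the interval midpoint and grid-search the concentration using Monte Carlo quantiles. A real implementation would instead use scipy.stats.beta.ppf with a proper root-finder; all numbers are illustrative.

```python
import random

def fit_beta_to_interval(theta_l, theta_u, n=20000, seed=1):
    """Rough (alpha, beta) whose central 95% interval matches (theta_l, theta_u)."""
    random.seed(seed)
    m = 0.5 * (theta_l + theta_u)          # crude location estimate
    best = None
    for kappa in (2, 5, 10, 20, 50, 100, 200, 500, 1000):
        a, b = m * kappa, (1.0 - m) * kappa
        xs = sorted(random.betavariate(a, b) for _ in range(n))
        lo, hi = xs[int(0.025 * n)], xs[int(0.975 * n)]
        err = abs(lo - theta_l) + abs(hi - theta_u)
        if best is None or err < best[0]:
            best = (err, a, b)
    return best[1], best[2]

alpha, beta = fit_beta_to_interval(0.05, 0.30)
```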

Stage 3: Prediction
A prior for each elementary event is assessed; the logic of the fault tree gives P(top event), or of any intermediate event, as a function of the θ_1j; the prior distribution of the top event is derived by simulation: simulate sets of θ_1j from the prior; for each set, derive P(top event) from the fault tree logic.
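The simulation step, sketched with a hypothetical one-gate tree and invented beta priors:

```python
import random

random.seed(0)
# hypothetical (alpha, beta) priors for three elementary events
priors = {"valve": (1, 20), "tank": (1, 50), "pipe": (1, 30)}

def p_top(theta):
    """Top event as an OR over the elementary events (independence assumed)."""
    p_none = 1.0
    for q in theta.values():
        p_none *= (1.0 - q)
    return 1.0 - p_none

# simulate sets of theta_1j from the prior, push each through the tree logic
samples = [p_top({k: random.betavariate(a, b) for k, (a, b) in priors.items()})
           for _ in range(5000)]
prior_mean = sum(samples) / len(samples)   # summary of the induced prior
```

The list `samples` is a Monte Carlo draw from the prior distribution of P(top event); a histogram of it is exactly the kind of prior-predictive plot shown later in the worked example.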

Stage 4: Updating with Data
From a particular re-entry: whether the top event occurred, did not occur or was unobserved; similarly for any intermediate event; similarly for any elementary event. The likelihood is then the probability of observing all of what we observed, in terms of the θ_1j:

Likelihood = ∏_{observed events} Likelihood of observation of event.

Two cases: (1) data are observed under nominal conditions; (2) data are observed under other (non-nominal) conditions. Principle: we always do inference for the θ_1j under nominal conditions.

Nominal Likelihood
The fault tree gives a causal relationship between events; the likelihood of an event is conditional on its parent events. We have a two-stage procedure for determining the likelihood:
1. Work up the fault tree to the top event and logically deduce whether any unobserved events must have occurred or not: e.g. if an unobserved event is an OR, and one parent is observed to have occurred, then it must have occurred;
2. Work up the fault tree to the top event and evaluate the likelihood term for each observed node in the tree: e.g. if an observed event is an OR, and one parent is also observed to have occurred, then it does not contribute to the likelihood.

Nominal Likelihood
Decision logic, by relationship to parent events:

OR event:
- At least one parent observed to have occurred? YES: Likelihood (1), P(event occurs) = 1. NO:
- All parents observed and all did not occur? YES: Likelihood (2), P(event occurs) = 0. NO: Likelihood (3).

AND event:
- All parents observed and all occurred? YES: Likelihood (1). NO:
- At least one parent observed not to have occurred? YES: Likelihood (2). NO: Likelihood (4).

where

Likelihood (3): P(event occurs) = 1 − ∏_{j: unobserved or not deduced parent} (1 − P(event j occurs));

Likelihood (4): P(event occurs) = ∏_{j: unobserved or not deduced parent} P(event j occurs).
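The OR-gate branch of this logic might be coded as follows; `obs` holds observed (or logically deduced) parent outcomes, `theta` the occurrence probabilities, and all names are illustrative.

```python
def or_likelihood(event_occurred, parents, obs, theta):
    """Likelihood term for one observed OR node given its parent events."""
    if any(obs.get(p) is True for p in parents):
        # Likelihood (1): an occurred parent implies the OR event occurs
        return 1.0 if event_occurred else 0.0
    if parents and all(obs.get(p) is False for p in parents):
        # Likelihood (2): no parent occurred, so the OR event cannot occur
        return 0.0 if event_occurred else 1.0
    # Likelihood (3): only unobserved / not-deduced parents remain free
    p_none = 1.0
    for p in parents:
        if p not in obs:
            p_none *= (1.0 - theta[p])
    p_occur = 1.0 - p_none
    return p_occur if event_occurred else 1.0 - p_occur

theta = {"A": 0.1, "B": 0.2}
lik = or_likelihood(True, ["A", "B"], {}, theta)   # 1 - 0.9*0.8 = 0.28
```

An AND node would be the mirror image, with Likelihood (4) multiplying the free parents' occurrence probabilities.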

Non-nominal Likelihood
We map data from non-nominal cases to a likelihood under the nominal case: we elicit a relative risk for each elementary event probability, comparing this case with the nominal case, and the θ_1j are weighted in the likelihood accordingly. Example: an expert weights the chance of elementary event j as moderately more likely (relative risk 3) in the nominal case than in the case in question; each θ_1j in the likelihood is then replaced by θ_1j^3. Intuition: seeing this happen once is like seeing it happen 3 times under nominal conditions. This also permits a prediction of P(top event) under non-nominal conditions.

Posterior Not a binomial likelihood so beta prior not conjugate; We use importance sampling to generate samples from posterior of the θ 1j ; For high dimensional situations can use MCMC; Can obtain samples of posterior of P(top event) from samples of the θ 1j as before.
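A minimal importance-sampling sketch for this non-conjugate update: the beta prior serves as the proposal, each draw is weighted by the likelihood, and posterior summaries are self-normalised. The prior and the likelihood (one re-entry with no explosion through a three-event OR gate, as in the worked example) are illustrative assumptions.

```python
import random

random.seed(42)
a, b = 1.0, 9.0                # hypothetical beta prior for each theta_1j

def likelihood(thetas):
    """P(top OR event did not occur) = prod_j (1 - theta_1j)."""
    p = 1.0
    for t in thetas:
        p *= (1.0 - t)
    return p

draws, weights = [], []
for _ in range(10000):
    thetas = [random.betavariate(a, b) for _ in range(3)]
    draws.append(thetas)
    weights.append(likelihood(thetas))   # prior draws weighted by likelihood

z = sum(weights)                          # self-normalising constant
post_mean = sum(w * th[0] for w, th in zip(weights, draws)) / z
prior_mean = a / (a + b)
# observing "no explosion" pulls the posterior mean of theta below the prior mean
```

Reweighting the same draws by P(top event) for each set gives the posterior of P(top event) at no extra sampling cost.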

The entire procedure

INITIALISE MODEL: construct the fault tree; prior elicitation of the elementary event probabilities in the nominal case.
FOR EACH OCCURRENCE: is the next event under nominal conditions? If not, elicit the relative risk of the elementary events. Observe data; update the posterior distribution of the elementary event probabilities; predict Prob(top event).

Worked example fault tree

Worked example data
AHP to get initial priors; then data on 3 re-entries:
1. Nominal conditions, explosion not observed;
2. Nominal conditions, observe explosion, A_3 and A_4;
3. Non-nominal conditions, observe explosion and B_2; relative risk for all C_j is 1/5.
The likelihoods are:

∏_j (1 − θ_1j),   θ_{1,A3} ∏_j (1 − θ_{1,Dj}),   1 − ∏_j (1 − θ_{1,Cj}^5).

Worked example prior and posteriors for probabilities of elementary events

Worked example prior and posterior for probability of top event

Conclusion
A procedure for producing a risk probability in a reasonably complex system. It tackles the challenges of doing this when: data are sparse; many experts, and/or too much expert elicitation, would be needed for the prior; access to experts' time is limited; conditions around the event are important. Open issues: all the usual ones to do with AHP (recall Fabrizio's tutorial); sensitivity analysis; fully probabilistic fault trees.