Efficient Aircraft Design Using Bayesian History Matching. Institute for Risk and Uncertainty


1 Efficient Aircraft Design Using Bayesian History Matching. F.A. DiazDelaO, A. Garbuno, Z. Gong. Institute for Risk and Uncertainty, University of Liverpool. DiPaRT 2017, November 22, 2017.

2 Outline
1. THE PROBLEM
2. HISTORY MATCHING
3. SUBSET SIMULATION
4. SUS-BASED SAMPLING

3 THE PROBLEM
The problem was posed during the First Study Group with Industry at the Institute for Risk and Uncertainty at Liverpool. Modern aircraft must operate within strict performance and regulatory limits. Quickly ascertaining options from the vast, uncertain design space is key to accelerating the concept design process.
Aim: To determine, visualise and act on uncertainties that propagate through the design process, in order to comply with performance and regulatory limits.
Objectives:
- Perform robust design to narrow the set of possible configurations.
- Discover the parameters that strongly contribute to variations in measures of aptness.
- Manage key parameters to drive reliably towards desired properties and behaviours.

4 THE PROBLEM Measures of Aptness
Input                          Units   L. Bound   U. Bound
Fan Pressure Ratio (FPR)       N/A
Overall Pressure Ratio (OPR)   N/A
Bypass Ratio (B)               N/A     6          8
SLS Thrust (ST)                lb      26,000     32,000
Wing Area (WA)                 ft^2    1,300      1,400
Wing Aspect Ratio (WAR)        N/A

5 THE PROBLEM Figures of Merit
Output                     Units
Flyover Noise              EPNLdB
Sideline Noise             EPNLdB
Cruise Fuel Consumption    lb
Emissions (NOx)            lb

6-8 THE PROBLEM (figures)

9 HISTORY MATCHING
History Matching aims to identify a subset of the input space $\mathcal{X}^* \subseteq \mathcal{X}$ for which the evaluation of a simulator $\eta : \mathcal{X} \to \mathbb{R}$ gives an acceptable match to observed data.
It is an iterative process that starts by sampling from the input space $\mathcal{X}$ and applying an implausibility measure. Cutoffs are imposed in order to obtain successive non-implausible sets $\mathcal{X}^* \subseteq \dots \subseteq \mathcal{X}_2 \subseteq \mathcal{X}_1$. If the simulator is expensive, an emulator can be employed.
A subtle difference from calibration: calibration results in a posterior distribution over the input space, whilst History Matching might conclude that the set of acceptable matches is empty.
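The wave structure described above can be summarised in a short sketch. This is a minimal illustration rather than the authors' implementation: `sample_candidates` and `implausibility` are hypothetical placeholders for a sampler of the current non-implausible region and the implausibility measure defined on the later slides.

```python
import numpy as np

def history_matching_wave(sample_candidates, implausibility, cutoff=3.0, n_samples=1000):
    """One History Matching wave: draw candidates, score their implausibility,
    and keep only the non-implausible points (I(x) <= cutoff)."""
    candidates = sample_candidates(n_samples)                    # (n_samples, d) array from the current region
    scores = np.array([implausibility(x) for x in candidates])
    return candidates[scores <= cutoff]                          # seeds the next, smaller wave
```

Successive waves would call this with samplers restricted to the previous wave's output, shrinking the non-implausible set until it stabilises or becomes empty.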

10 HISTORY MATCHING Types of Uncertainty
- Observational uncertainty: experimental error, subject to instrument accuracy.
- Model discrepancy: however carefully the model is built, there will always be a difference between the real system and the simulator. Models are always wrong.
- Code uncertainty: for any choice of inputs, the output is only known once the model is run; however, the simulator can be computationally expensive.
- Ensemble variability: the simulator can be stochastic.

11 HISTORY MATCHING HM Workflow (Andrianakis et al., 2015) (figure)

12 HISTORY MATCHING Non-Implausible Sampling
Method 1: Acceptance-rejection. Draw samples uniformly from the input space and reject the implausible ones. Easy to implement, but becomes inefficient as the non-implausible space shrinks in later waves (see the sketch below).
Method 2: Perturbation (Andrianakis et al., 2015). Draw samples from a multivariate normal centred at non-implausible seeds. Needs tuning of the variance to produce good samples.
Method 3: Implausibility-Driven Evolutionary Monte Carlo (Williamson and Vernon, 2013). Samples uniformly from the non-implausible space. Mixing and efficiency depend on choosing a suitable implausibility ladder.
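A sketch of Method 1, assuming a hyper-rectangular input space given by lower and upper bounds (as in the table on slide 4); the `implausibility` argument is again a placeholder for the measure defined on the next slides.

```python
import numpy as np

def rejection_sample_nonimplausible(lower, upper, implausibility,
                                    n_keep, cutoff=3.0, batch=10_000, rng=None):
    """Draw uniform candidates over the box [lower, upper] and keep those with
    I(x) <= cutoff. Becomes very inefficient once the non-implausible region is
    a tiny fraction of the box, which motivates the other methods."""
    rng = np.random.default_rng() if rng is None else rng
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    kept = []
    while len(kept) < n_keep:
        x = rng.uniform(lower, upper, size=(batch, lower.size))
        accept = np.apply_along_axis(implausibility, 1, x) <= cutoff
        kept.extend(x[accept])
    return np.array(kept[:n_keep])
```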

13 HISTORY MATCHING The Analogy
Given an observation z, History Matching defines the following measure of implausibility:
$$I(x) = \frac{\left| z - \operatorname{E}[\eta(x)] \right|}{\left[ V_o + V_c(x) + V_s + V_m \right]^{1/2}}$$
At each iteration, the non-implausible set can be defined as:
$$\Pi = \{ x \in \mathcal{X} : I(x) \leq 3 \}$$
There is a natural analogy that connects History Matching and reliability analysis: the set $\Pi$ can be regarded as a failure domain.

14 HISTORY MATCHING The Analogy (continued)
Since the non-implausible set $\Pi$ can be regarded as a failure domain: use Subset Simulation!
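A direct transcription of the implausibility measure and the set $\Pi$, assuming an emulator interface that returns its predictive mean and the code-uncertainty variance $V_c(x)$ at a point; the remaining variance terms are supplied as constants. The names below are illustrative, not from the slides.

```python
import numpy as np

def implausibility(x, z, emulator_predict, V_o, V_s, V_m):
    """I(x) = |z - E[eta(x)]| / sqrt(V_o + V_c(x) + V_s + V_m), where the emulator
    supplies the predictive mean E[eta(x)] and the code-uncertainty variance V_c(x)."""
    mean, V_c = emulator_predict(x)
    return abs(z - mean) / np.sqrt(V_o + V_c + V_s + V_m)

def non_implausible(x, z, emulator_predict, V_o, V_s, V_m, cutoff=3.0):
    """Membership test for the set Pi = {x : I(x) <= 3}."""
    return implausibility(x, z, emulator_predict, V_o, V_s, V_m) <= cutoff
```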

15 SUBSET SIMULATION The Engineering Reliability Problem
Let $G : \mathbb{R}^d \to \mathbb{R}$ be a system performance function. Aim: to estimate the probability of failure, i.e. the probability of demand exceeding the capacity of the system.
Let $y^*$ be a critical value such that the system fails if $y = G(x_1, \dots, x_d) > y^*$. The failure domain F can thus be defined as:
$$F = \{ x : G(x) > y^* \}$$
The engineering reliability problem can be formulated as computing the probability of failure:
$$p_F = P(x \in F) = \int_F \pi(x)\, dx$$
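For contrast with Subset Simulation, here is a crude Monte Carlo estimator of $p_F$; `sample_prior` and `G` stand in for $\pi(x)$ and the performance function and are not from the slides.

```python
import numpy as np

def crude_monte_carlo_pf(sample_prior, G, y_star, n=100_000):
    """Estimate p_F = P(G(x) > y*) by crude Monte Carlo. A usable estimate needs
    on the order of 10/p_F samples, which is why rare events call for SuS."""
    x = sample_prior(n)                              # n samples from pi(x), shape (n, d)
    exceed = np.array([G(xi) for xi in x]) > y_star
    return exceed.mean()
```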

16 SUBSET SIMULATION Subset Simulation
Developed by Au and Beck (2001) to simulate rare events and estimate small probabilities of failure. The idea is to decompose a rare event F into a sequence of progressively less rare events:
$$F = F_m \subset F_{m-1} \subset \dots \subset F_1$$
where $F_1$ is a relatively frequent event. Given the above sequence of events, the small probability P(F) of the rare event can be represented as a product of larger probabilities:
$$P(F) = P(F_m) = P(F_1)\, P(F_2 \mid F_1) \cdots P(F_m \mid F_{m-1})$$

17 SUBSET SIMULATION Subset Simulation
Subset Simulation explores the input space $\mathcal{X}$ by generating a relatively small number of i.i.d. samples $x_0^{(1)}, \dots, x_0^{(n)} \sim \pi(x)$ and computing the corresponding system responses $y_0^{(1)}, \dots, y_0^{(n)}$.

18 SUBSET SIMULATION Subset Simulation
Let $p \in (0, 1)$ be such that $np \in \mathbb{N}$. With the responses relabelled in decreasing order, define the first intermediate failure domain as:
$$F_1 = \left\{ x : G(x) > y_1^* = \frac{y_0^{(np)} + y_0^{(np+1)}}{2} \right\}$$

19 SUBSET SIMULATION Subset Simulation
By construction, $x_0^{(1)}, \dots, x_0^{(np)} \in F_1$, whilst $x_0^{(np+1)}, \dots, x_0^{(n)} \notin F_1$. Thus, the Monte Carlo estimate of the probability of $F_1$ is given by
$$P(F_1) \approx \frac{1}{n} \sum_{i=1}^{n} I_{F_1}\!\left(x_0^{(i)}\right) = p$$
$F_1$ provides a rough approximation to the failure domain F. Since $F \subset F_1$, the failure probability can be written as:
$$p_F = P(F_1)\, P(F \mid F_1)$$
In the next stage, instead of sampling in the whole input space, SuS populates $F_1$.
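The construction above can be written as a small helper; since the same midpoint rule is reused at every level, it is written generically. Sorting the responses in decreasing order is what makes exactly np samples land in $F_1$. The function name is illustrative.

```python
import numpy as np

def sus_level(x, y, p=0.1):
    """Compute the intermediate threshold (midpoint rule above) and the np seed
    samples; the same construction is reused at every Subset Simulation level."""
    n = len(y)
    n_seeds = int(p * n)                                          # assumes p*n is an integer
    order = np.argsort(y)[::-1]                                   # responses in decreasing order
    x_sorted, y_sorted = x[order], y[order]
    y_level = 0.5 * (y_sorted[n_seeds - 1] + y_sorted[n_seeds])   # midpoint of np-th and (np+1)-th response
    seeds = x_sorted[:n_seeds]                                    # satisfy G(x) > y_level by construction
    return y_level, seeds
```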

20 SUBSET SIMULATION Subset Simulation
We start with $x_0^{(1)}, \dots, x_0^{(np)} \sim \pi(x \mid F_1)$ and need to draw the remaining $n - np$ samples from $\pi(x \mid F_1)$. This is done with an MCMC scheme.
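A simplified sketch of this conditional-sampling step. Au and Beck's Subset Simulation uses the component-wise modified Metropolis-Hastings algorithm; the stand-in below is a plain random-walk Metropolis move that targets $\pi(x)$ and rejects candidates leaving $F_1$, which is enough to show the idea. `pi_pdf` is an assumed density function for $\pi$.

```python
import numpy as np

def mcmc_populate_level(seeds, G, y_level, pi_pdf, n_total, step=0.5, rng=None):
    """Grow the seed population to n_total samples from pi(x | F) with
    F = {x : G(x) > y_level}, running one short Metropolis chain per seed."""
    rng = np.random.default_rng() if rng is None else rng
    per_chain = int(np.ceil(n_total / len(seeds)))
    samples = []
    for current in seeds:
        current = current.copy()
        for _ in range(per_chain):
            candidate = current + step * rng.standard_normal(current.shape)
            # Metropolis ratio for the prior, then reject moves leaving the intermediate failure domain
            if rng.random() < min(1.0, pi_pdf(candidate) / pi_pdf(current)) and G(candidate) > y_level:
                current = candidate
            samples.append(current.copy())
    return np.array(samples[:n_total])
```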

21 SUBSET SIMULATION Subset Simulation
Define the second intermediate failure domain as:
$$F_2 = \left\{ x : G(x) > y_2^* = \frac{y_1^{(np)} + y_1^{(np+1)}}{2} \right\}$$
The procedure is repeated until the intermediate threshold reaches the critical value $y^*$.
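Putting the levels together: a compact Subset Simulation driver in the spirit of Au and Beck (2001), reusing the `sus_level` and `mcmc_populate_level` sketches above. None of these names come from the slides; this is an illustrative sketch, not the authors' code.

```python
import numpy as np

def subset_simulation(sample_prior, pi_pdf, G, y_star, n=1000, p=0.1, max_levels=20, rng=None):
    """Estimate p_F = P(G(x) > y*) as a product of conditional probabilities,
    raising the intermediate threshold level by level until y* is reached."""
    rng = np.random.default_rng() if rng is None else rng
    x = sample_prior(n)
    y = np.array([G(xi) for xi in x])
    p_F = 1.0
    for _ in range(max_levels):
        y_level, seeds = sus_level(x, y, p)             # threshold and seeds for this level
        if y_level >= y_star:                           # final level: count samples in F itself
            return p_F * np.mean(y > y_star)
        p_F *= p                                        # P(F_k | F_{k-1}) ~ p by construction
        x = mcmc_populate_level(seeds, G, y_level, pi_pdf, n, rng=rng)
        y = np.array([G(xi) for xi in x])
    return p_F
```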

22 SUS-BASED SAMPLING 2-D Model (Williamson and Vernon, 2013) (figures)

23 SUS-BASED SAMPLING 10-D Model (Surjanovic and Bingham, 2016)
$$W = 0.036\, S_w^{0.758}\, W_{fw}^{0.0035} \left( \frac{A}{\cos^2 \Lambda} \right)^{0.6} q^{0.006}\, \lambda^{0.04} \left( \frac{100\, t_c}{\cos \Lambda} \right)^{-0.3} (N_z W_{dg})^{0.49} + S_w W_p$$
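An implementation of the wing weight function as published in the Surjanovic and Bingham test-function library; the exponents follow that source, and the sweep angle is assumed to be supplied in radians.

```python
import numpy as np

def wing_weight(Sw, Wfw, A, Lam, q, lam, tc, Nz, Wdg, Wp):
    """Wing weight (lb). Inputs: wing area Sw, fuel weight Wfw, aspect ratio A,
    quarter-chord sweep Lam (radians), dynamic pressure q, taper ratio lam,
    thickness-to-chord ratio tc, ultimate load factor Nz, design gross weight Wdg,
    paint weight Wp."""
    return (0.036 * Sw**0.758 * Wfw**0.0035 * (A / np.cos(Lam)**2)**0.6
            * q**0.006 * lam**0.04 * (100.0 * tc / np.cos(Lam))**(-0.3)
            * (Nz * Wdg)**0.49 + Sw * Wp)
```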

24 SUS-BASED SAMPLING 10-D Model (figures)

25 SUS-BASED SAMPLING Airbus: Single Output, NOx = 240 ± 10 lb (figures)

26 SUS-BASED SAMPLING Airbus: Multiple Output (figures)

27 SUS-BASED SAMPLING Further Improvements
We propose a fully probabilistic approach to History Matching:
$$I(x) = \frac{\left| z - \eta(x) \right|}{\left[ V_o + V_c(x) + V_s + V_m \right]^{1/2}}$$
where $\eta(x) \sim \mathcal{GP}(m(x), \sigma^2(x))$ is the Gaussian process emulator for the simulator output. The non-implausible space is described by statements of the form $P\{ I(x) \leq 3 \}$.
Use measures such as entropy for active learning:
$$H_z(\eta(x)) = -\int_{z - k\,\sigma(x)}^{z + k\,\sigma(x)} f(y) \ln f(y)\, dy$$
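A sketch of how these two quantities could be evaluated. With $\eta(x) \sim \mathcal{N}(m(x), \sigma^2(x))$ and the variance terms treated as constants, $I(x) \leq 3$ becomes an interval condition on $\eta(x)$, so $P\{I(x) \leq 3\}$ is a difference of normal CDFs; the entropy integral is approximated by quadrature. `gp_predict` is an assumed interface returning the GP predictive mean and standard deviation, not part of the authors' code.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import trapezoid

def prob_non_implausible(x, z, gp_predict, V_o, V_s, V_m, V_c=0.0, cutoff=3.0):
    """P{I(x) <= cutoff} under eta(x) ~ N(m(x), sigma^2(x)): an interval probability.
    V_c is kept as an optional constant to mirror the denominator on the slide."""
    m, sigma = gp_predict(x)                            # GP predictive mean and std at x
    half_width = cutoff * np.sqrt(V_o + V_c + V_s + V_m)  # |z - eta(x)| must not exceed this
    return norm.cdf((z + half_width - m) / sigma) - norm.cdf((z - half_width - m) / sigma)

def truncated_entropy(x, z, gp_predict, k=3.0, n_grid=401):
    """Approximate H_z(eta(x)) = -int_{z-k sigma}^{z+k sigma} f(y) ln f(y) dy numerically."""
    m, sigma = gp_predict(x)
    y = np.linspace(z - k * sigma, z + k * sigma, n_grid)
    f = norm.pdf(y, loc=m, scale=sigma)
    return -trapezoid(f * np.log(f), y)
```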

28 SUS-BASED SAMPLING Conclusions
1. There is a natural analogy between the robust design problem and the calibration of numerical models.
2. History Matching is a form of calibration that finds a subset (possibly empty) of the input space that provides a match between the output and observed data.
3. There is also a natural analogy between the non-implausible space and a failure region, for which Subset Simulation is a suitable solution.

29 SUS-BASED SAMPLING References
1. Z. T. Gong, F. A. DiazDelaO, M. Beer (2016) Bayesian Model Calibration Using Subset Simulation. European Safety and Reliability Conference, Glasgow, UK.
2. A. Garbuno-Inigo, F. A. DiazDelaO, K. Zuev (2017) Full Probabilistic History Matching. Under review.
3. Andrianakis et al. (2015) Bayesian History Matching of Complex Infectious Disease Models Using Emulation. PLOS Computational Biology, 11(1).
4. S. K. Au and J. Beck (2001) Estimation of Small Failure Probabilities in High Dimensions by Subset Simulation. Probabilistic Engineering Mechanics, 16(4), 263-277.

30 SUS-BASED SAMPLING Thank you! Acknowledgments to Sanjiv Sharma and Simon Coggon at Airbus, and Arturo Molina-Cristobal at Cranfield.
