Using Markov Chain Monte Carlo to Solve your Toughest Problems in Aerospace

Slide 1: Using Markov Chain Monte Carlo to Solve your Toughest Problems in Aerospace
AIAA Annual Technology Symposium, 30 April 2010
Mark A. Powell, Attwater Consulting

Slide 2: Introduction
- Review of Monte Carlo methods
- Introduce Markov Chain Monte Carlo (MCMC) methods
- Examples of aerospace problems not solvable without MCMC
- Many details and resources are in the backup slides; contact me for this presentation or other papers

Slide 3: Review of Monte Carlo
What are Monte Carlo methods?
- Numerical methods to approximate solutions for definite integrals
- Used when no closed-form analytical solution is available
- Very useful for calculating probabilities for complex problems
Requirements:
- Ability to randomly sample the integrand
- Computer time

Slide 4: A Simple Example
Suppose we want to evaluate this integral: P(-2 <= X <= 1) for X ~ N(0,1), i.e. the integral of the standard normal density from -2 to 1.
- We might be able to look it up or use a tool
- We might recognize the integrand and use probability tables
- Or, recognize it and use Monte Carlo to approximate it:
  - Obtain N samples from a N(0,1) using a built-in sampler
  - Count the number between -2 and 1 and divide by N
Here is the R code to do this with N = 5000:
  > x <- rnorm(5000, 0, 1)   # get the 5000 standard normal samples
  > cnt <- 0                 # zero our counter
  > for (i in 1:5000) {      # count up those inside the interval
  +   if (x[i] >= -2 && x[i] <= 1) cnt <- cnt + 1 }
  > cnt/5000
Our approximation error is small; not bad!
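
A more idiomatic way to do the same count (a sketch of mine, not from the original slides) is to vectorize the comparison; the exact value from the normal CDF provides a check on the approximation:

  x <- rnorm(5000, 0, 1)    # 5000 standard normal samples
  mean(x >= -2 & x <= 1)    # Monte Carlo estimate of the integral
  pnorm(1) - pnorm(-2)      # exact value, about 0.8186, for comparison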

Slide 5: Monte Carlo Catch(es)
- Must be able to sample the integrand
  - For most problems, built-in samplers do not exist in any tool
  - Most often, assumptions are made to allow use of built-in samplers: Danger, Will Robinson!
- How many samples are needed?
  - Depends on the needed precision
  - Depends on the needed number of significant digits
  - Depends on repeatability of the answer with the needed precision and significant digits (convergence)
  - Can only be determined empirically; it is problem unique

Slide 6
Suppose your problem does not have a closed-form analytical solution, no built-in samplers exist for your integrand, and you do not want to assume anything. You can use Markov Chain Monte Carlo:
- Does not require built-in samplers
- Does not require questionable assumptions
MCMC was developed about 20 years ago:
- Used extensively in Europe for biostatistics
- Hardly used at all in aerospace, even today
- Can enable quick and easy solutions to the toughest problems

Slide 7: Some Remarkable Capabilities with MCMC
Markov chains can be produced that cover the domain for ANY integrand:
- Named and recognized integrands
- Unrecognized analytical integrands
- Complex arbitrary integrands
- Even improper integrands
- Even joint integrands with more than 10,000 correlated (dependent) variables (multivariate)
Marginal integrands for each joint dependent variable are immediately available.
Full joint and conditional integrands of complex functions of the dependent variables are immediately available.

Slide 8: However, Even MCMC has a Few Catches
- Markov chains for MCMC must be tuned by hand (and eye)
  - Need burn-in for stationarity
  - Need adjustment for sampling stability
- The solutions are quite simple:
  - Ignore early samples for burn-in
  - Visually determine whether stability is achieved
- The basic algorithm used to generate the Markov chains, Metropolis-Hastings, is unbelievably simple
  - Easy to tune, using the sample acceptance ratio and visual inspection of the samples
- As you may have guessed, there are no general-purpose tools that can do MCMC for arbitrary problems; coding is needed. Fortunately, it is easy!

Slide 9: MCMC Algorithms
- The primary algorithm is called Metropolis-Hastings (M-H)
  - Very general
  - Very simple
- Gibbs sampling is a special M-H algorithm with fast convergence to steady state
  - Must have explicit, analytical, and recognizable conditional models
  - BUGS (Bayesian inference Using Gibbs Sampling) software is free and proven
  - Very few problems in engineering can use Gibbs sampling

Slide 10: MCMC Summary
- MCMC methods do not require recognized integrands
- MCMC methods allow solutions to engineering and decision problems previously unsolvable as stated
- MCMC methods allow elimination of assumptions
- MCMC algorithms, tuning, and extensions are: embarrassingly easy!

Slide 11: Examples of Aerospace Problems Solvable Only with Markov Chain Monte Carlo

Slide 12: Astronaut Bone Fracture Risk
- On-orbit astronaut bone fractures could have severe consequences
  - To the astronaut
  - To the mission
- Very low probability event: no astronaut has ever broken a bone during a mission in history
- Example risk questions:
  - What is the risk of bone fracture for long Mars missions?
  - How much will the risk increase if International Space Station missions extend from 180 to 365 days?
(Paper published at the IEEE Aerospace Conference, March 2010; contact me if you would like a copy.)

Slide 13: The Available Information
- 977 astronaut missions of varying lengths (as of May 2005)
- No events observed: no bones broken
- Did observe 977 mission lengths without a broken bone (the DATA)

Slide 14: Risk Results Parameterized for Mission Duration
- Plotted various assurance levels as a function of mission duration
- For Mars missions of 270 days, we can be 95% certain that the risk of fracture during the mission is < 3%, based on the information available
- Quantified result consistent with intuition!

Slide 15: The ISS Mission Extension Question
Bar legend:
- Left side at the 5th percentile
- Right side at the 95th percentile
- Color density proportional to probability density
- Black bar at the median (50th percentile)
The decision is now much easier.

Slide 16: Example: US Coast Guard C-130 Cockpit Cooling Turbine
- The cooling turbine provides cooling and pressurization to the C-130 crew
- Failure in service: loss of cooling but, more important, loss of cabin pressurization
  - Smoke, loud; crew must secure; mission compromised
- Costs: replacement $30,000; refurbishment $500

Slide 17: The C-130 PM Problem
- 60:1 ratio, replacement : maintain
- Only had five failure data: 463, 538, 1652, 1673, and 2462 flight hours
- Only had one survivor datum: 96 flight hours
- What PM interval to select? USCG decision makers were paralyzed
(Paper published at the IEEE Aerospace Conference, March 2010; contact me if you would like a copy.)

Slide 18: Figure of Merit
- USCG had no idea what the failure-in-service mode was costing them for the fleet of C-130s
- Needed to evaluate cost savings for the fleet of C-130s using a preventative maintenance interval, parameterized as a function of candidate PM intervals
- The integrand for this problem was rather nasty; used MCMC rather liberally

Slide 19: Results
- Used MCMC to sample the joint uncertainty model for the Weibull parameters based solely on the data: pd(η, β | data)
- Non-intuitive, porkchop-looking distribution
- Lots of outliers; good!
- Correlations not describable using any recognized models
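
As a concrete illustration of how such a joint sample can be produced (my sketch, not the author's actual code), here is a Metropolis-Hastings loop in R for pd(η, β | data) built from the failure and survivor times on slide 17. The flat (ignorance) prior over η > 0, β > 0, the starting point, and the proposal step sizes are assumptions made for illustration only.

  failures  <- c(463, 538, 1652, 1673, 2462)   # flight hours to failure (slide 17)
  survivors <- c(96)                           # censored flight hours without failure

  log_post <- function(eta, beta) {            # log of pd(eta, beta | data); flat prior assumed
    if (eta <= 0 || beta <= 0) return(-Inf)
    sum(dweibull(failures, shape = beta, scale = eta, log = TRUE)) +
      sum(pweibull(survivors, shape = beta, scale = eta,
                   lower.tail = FALSE, log.p = TRUE))
  }

  n_samp     <- 20000
  weib_chain <- matrix(NA, n_samp, 2, dimnames = list(NULL, c("eta", "beta")))
  theta      <- c(1500, 1.5)     # assumed starting point near the scale of the data
  step       <- c(600, 0.6)      # assumed proposal half-widths; tune per slide 33
  n_accept   <- 0

  set.seed(1)
  for (i in 1:n_samp) {
    prop  <- theta + runif(2, -step, step)     # uniform random-walk proposal
    alpha <- exp(log_post(prop[1], prop[2]) - log_post(theta[1], theta[2]))
    if (runif(1) < alpha) { theta <- prop; n_accept <- n_accept + 1 }
    weib_chain[i, ] <- theta
  }

  n_accept / n_samp                       # acceptance ratio; aim for roughly 30-60%
  weib_burned <- weib_chain[-(1:2000), ]  # discard burn-in samples (slide 34)
  plot(weib_burned, pch = ".")            # the joint (eta, beta) scatterplot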

Slide 20: Cost Savings Risks Using a PM Interval, Parameterized
- Full distributions per flight hour, per bird, CS(tPM), based solely on the data, parameterized as a function of the PM interval in flight hours
- Plotted only the 5th percentile, most likely, and 95th percentile cost savings risks
- Peaks all between 100 and 450 flight hours
- At tPM = 250 hours, we are 95% certain, based on the data, that the USCG can SAVE at least $17 per flight hour per bird

Slide 21: Example: ISS O2 Sensor Drift
- Problem: Space Station oxygen sensor measurement accuracy is observed to drift with time
  - If the measured O2 is in error by more than ±6 mmHg within 270 days since calibration, it could kill an astronaut
  - Already compensating for pressure variations in measurement accuracy (successful)
- Proposed solution options: test for drift rates and compensate for drift; OR redesign the O2 sensor and ship it up to ISS
- Questions:
  - What is the existing risk of sensor accuracy drift beyond acceptable limits? Nobody knew!
  - What is the risk after the proposed drift compensation? Nobody knew!

Slide 22: O2 Sensor Test Data
[Figure: drift of the CSA-O2 sensors during long-life evaluation (data are pressure corrected); accuracy (mmHg) vs. days since calibration, with linear drift fits for sensors 1039, 1031, 1026, 1014, and 1037 and the 270-day point marked.]

Slide 23: Drift-Corrected O2 Sensor Data
[Figure: drift time-corrected CSA-O2 sensors during long-life evaluation (data are pressure corrected); accuracy (mmHg) vs. days since calibration, with the 270-day point marked.]

Slide 24: Problem Setup
- Use a Gaussian model for measurement errors
- The mean of the errors appears to drift with time; model this as μ = μ0 + μdot·t
- We are now uncertain about μ0, μdot, and σ²: errors ~ N(μ0 + μdot·t, σ²)
- MCMC code: relatively straightforward
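
A hedged sketch of what such MCMC code might look like for the slide 24 model, errors ~ N(μ0 + μdot·t, σ²). The data vectors below are placeholders (the actual CSA-O2 test data appear only graphically on slides 22-23), and the flat prior, starting values, and proposal step sizes are my assumptions for illustration.

  t_days <- c(30, 90, 150, 210, 270)     # placeholder: days since calibration
  err    <- c(0.5, 1.8, 2.9, 4.4, 5.6)   # placeholder: measured accuracy errors (mmHg)

  log_post <- function(mu0, mudot, sigma) {
    if (sigma <= 0) return(-Inf)         # flat prior restricted to sigma > 0 (assumed)
    sum(dnorm(err, mean = mu0 + mudot * t_days, sd = sigma, log = TRUE))
  }

  n_samp <- 20000
  chain  <- matrix(NA, n_samp, 3, dimnames = list(NULL, c("mu0", "mudot", "sigma")))
  theta  <- c(0, 0.02, 1)                # assumed starting values
  step   <- c(0.5, 0.005, 0.3)           # assumed proposal half-widths; tune per slide 33

  set.seed(2)
  for (i in 1:n_samp) {
    prop  <- theta + runif(3, -step, step)
    alpha <- exp(log_post(prop[1], prop[2], prop[3]) -
                 log_post(theta[1], theta[2], theta[3]))
    if (runif(1) < alpha) theta <- prop
    chain[i, ] <- theta
  }

  # Risk of exceeding the +/-6 mmHg limit at 270 days, evaluated at each posterior sample:
  burned <- chain[-(1:2000), ]
  mu270  <- burned[, "mu0"] + burned[, "mudot"] * 270
  risk   <- 1 - (pnorm(6, mu270, burned[, "sigma"]) - pnorm(-6, mu270, burned[, "sigma"]))
  quantile(risk, c(0.05, 0.5, 0.95))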

Slide 25: Before and After Drift Correction Risk Results
[Figure: risk results before and after drift correction, shown on linear and logarithmic scales.]

Slide 26: O2 Sensor Risk Summary
- Without drift compensation, the risk of exceeding accuracy limits at 270 days is 36-46% (with 90% certainty)
- With drift compensation, we are 95% sure the risk of exceeding accuracy limits at 270 days is < 1.5%
- Additional O2 level compensation could reduce the risk further

Slide 27: Summary
- Markov Chain Monte Carlo allows accurate solutions for the tough problems faced in aerospace
- MCMC requires coding, but the coding is short and easy
- Contact me for more information

Slide 28: Backup Slides with More Details

Slide 29: Heuristic Guidance for Numbers of Monte Carlo Samples Needed
- Always use N = 10^n where n is an integer
- Try a few runs at low values of n to bound the precision
- Experiment; when you achieve 3 consecutive MC runs that agree to m+1 significant digits, your MC answer is very likely good to m significant digits
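
A small sketch of this heuristic (mine, not from the slides), applied to the slide 4 example P(-2 <= X <= 1) for X ~ N(0,1): run three estimates at each N = 10^n and watch how many significant digits agree across runs.

  est_once <- function(N) { x <- rnorm(N); mean(x >= -2 & x <= 1) }  # slide 4 estimator
  set.seed(3)
  for (n in 3:6) {
    ests <- replicate(3, est_once(10^n))   # three independent runs at N = 10^n
    cat("N = 10^", n, ":  ", paste(format(round(ests, 5)), collapse = "  "), "\n", sep = "")
  }
  # When three consecutive runs agree to m+1 significant digits, the answer is
  # very likely good to m significant digits (the slide's rule of thumb).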

Slide 30: The Metropolis-Hastings Algorithm
- To start, formulate the model (density) pd(Θ) and select a proposal step size dΘ
- Start with any legal value: Θ_i = Θ_1
- Repeat this loop to get new samples:
  - Propose a new sample: Θ_{i+1} = Θ_i + ΔΘ, where ΔΘ ~ U(-dΘ, dΘ), a simple uniform sample
  - Calculate the ratio (α) of the proposed model density to the previous model density: α = pd(Θ_{i+1} | data) / pd(Θ_i | data)
  - Obtain a sample u from a uniform distribution: u ~ U(0, 1)
  - If u < α, accept the proposed sample as Θ_{i+1}; else set the new sample to the previous one: Θ_{i+1} = Θ_i
  - Maintain a count of the accepted proposed samples
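
The loop above translates almost line-for-line into R. This is my sketch for a scalar parameter; the example target density (a Gamma standing in for pd(Θ | data)) and the step size are assumptions for illustration only.

  pd <- function(theta) dgamma(theta, shape = 3, rate = 1/500)  # assumed stand-in for pd(theta | data)

  mh <- function(pd, theta1, d_theta, n_samp) {
    theta    <- numeric(n_samp)
    theta[1] <- theta1                                 # start with any legal value
    accepted <- 0
    for (i in 1:(n_samp - 1)) {
      prop  <- theta[i] + runif(1, -d_theta, d_theta)  # uniform proposal step
      alpha <- pd(prop) / pd(theta[i])                 # ratio of model densities
      if (runif(1) < alpha) {                          # accept the proposal
        theta[i + 1] <- prop
        accepted     <- accepted + 1
      } else {
        theta[i + 1] <- theta[i]                       # otherwise keep the previous sample
      }
    }
    list(chain = theta, acceptance_ratio = accepted / (n_samp - 1))
  }

  set.seed(4)
  out <- mh(pd, theta1 = 1000, d_theta = 1500, n_samp = 10000)
  out$acceptance_ratio          # tune d_theta until this is roughly 0.3-0.6 (slide 33)
  plot(out$chain, type = "l")   # inspect burn-in and stationarity by eye (slide 34)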

Slide 31: M-H Algorithm Notes
- Θ can be a vector of parameters: Θ = (θ1, θ2, θ3, ..., θn)
  - Can propose an entire new vector and then test for acceptance of the new vector (dΘ and ΔΘ are vectors), or
  - Can add an internal loop to propose each new parameter and test it for acceptance to get a new sample vector
- Each sample of the vector Θ is a sample of the joint model
- Can use other distributions for the proposal besides U(-dΘ, dΘ)
- Scatterplots of θj vs. θk, j ≠ k, show correlations or marginal joint densities
- The set of θi samples provides the marginal model for the i-th parameter without doing the integral pd(θi | data) = ∫ pd(Θ | data) dθ_{j≠i}
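
The inner-loop (parameter-by-parameter) variant noted above can be sketched as a reusable function. This is my illustration; log_post, the starting vector, and the step sizes in the commented usage are assumed, reusing the slide 19 sketch.

  mh_componentwise <- function(log_post, theta0, step, n_samp) {
    p      <- length(theta0)
    chain  <- matrix(NA, n_samp, p)
    theta  <- theta0
    accept <- numeric(p)                       # per-parameter acceptance counts
    for (i in 1:n_samp) {
      for (j in 1:p) {                         # inner loop over parameters
        prop    <- theta
        prop[j] <- theta[j] + runif(1, -step[j], step[j])
        alpha   <- exp(log_post(prop) - log_post(theta))
        if (runif(1) < alpha) { theta <- prop; accept[j] <- accept[j] + 1 }
      }
      chain[i, ] <- theta                      # one sample of the joint model
    }
    list(chain = chain, acceptance_ratio = accept / n_samp)
  }

  # Example usage with the Weibull log_post from the slide 19 sketch:
  # out <- mh_componentwise(function(th) log_post(th[1], th[2]),
  #                         theta0 = c(1500, 1.5), step = c(600, 0.6), n_samp = 10000)
  # pairs(out$chain)   # pairwise scatterplots (correlations) of slide 31
  # out$chain[, 1]     # marginal samples for the first parameter, no integral needed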

Slide 32: Some M-H Heuristics
- Use short test chains (~1,000 points) to tune
- Means are good starting points for quicker burn-in
- Two standard deviations are good first proposal step sizes
- Parameter-by-parameter (inner loop) acceptance testing is easier to tune than vector-at-once acceptance testing
- Beware: changes to the proposal step size for one parameter may change the acceptance ratios for other parameters
- Improper models can show stationarity in short chains, track off to non-stationarity, then come back to stationarity

Slide 33: Tuning the M-H Algorithm
- Subjective judgment is usually all that is needed
- Need to collect data on the number of proposed samples accepted (for the vector, or for each parameter)
- Acceptance ratios should be within the 30-60% range; if not, adjust the proposal step size
  - If the ratio is too high, samples will show tracking: increase the step size
  - If the ratio is too low, samples will get stuck: decrease the step size
- Must look at burn-in: convergence to a stationary Markov chain
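
A minimal sketch of this tuning rule in R (mine, not the author's). The slide gives only the direction of the adjustment; the factor of 2 is an assumption.

  tune_step <- function(step, acceptance_ratio) {
    if (acceptance_ratio > 0.6) return(step * 2)   # too high: samples track, increase step
    if (acceptance_ratio < 0.3) return(step / 2)   # too low: samples get stuck, decrease step
    step                                           # about right: keep the step size
  }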

Slide 34: M-H Convergence to Stationarity
- Scatterplot the marginal samples to identify the point at which the Markov chain becomes stationary, and use only the samples beyond this point; this is called burn-in
[Figure: example trace that converges before 1,000 samples; the rightmost 9,000 samples are retained.]

Slide 35: MCMC Tuning by Acceptance Ratio and Eye
[Figure: three example traces. Tracking, acceptance ratio 0.9: increase the step size. Stuck, acceptance ratio 0.1: decrease the step size. About right, acceptance ratio 0.53: suitable step size.]

Slide 36: Gibbs Sampling Algorithm
- Recall the requirement to have recognizable conditional models; obviously, you cannot have improper conditional models
- Changes to the M-H algorithm:
  - Use the parameter-by-parameter inner-loop algorithm
  - No proposal step size dΘ is needed
  - To obtain a sample vector of the joint model, loop for i = 1, ..., n, proposing each new sample θi from its conditional model and accepting it; no need to test
  - No need to worry about the acceptance ratio or tuning
  - Burn-in is still required
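
A hedged sketch of the Gibbs loop for a simple case where the conditional models are recognizable: y ~ N(μ, σ²) with the standard noninformative prior p(μ, σ²) ∝ 1/σ², so that μ | σ², y is normal and σ² | μ, y is inverse-gamma. The data vector is a placeholder of mine. Each parameter is drawn directly from its full conditional, so there is no proposal step size and no acceptance test.

  y    <- c(4.1, 5.3, 4.7, 6.0, 5.5, 4.9)   # placeholder data
  n    <- length(y)
  ybar <- mean(y)

  n_samp <- 10000
  chain  <- matrix(NA, n_samp, 2, dimnames = list(NULL, c("mu", "sigma2")))
  mu     <- ybar          # starting values
  sigma2 <- var(y)

  set.seed(5)
  for (i in 1:n_samp) {
    mu     <- rnorm(1, mean = ybar, sd = sqrt(sigma2 / n))              # draw mu | sigma2, y
    sigma2 <- 1 / rgamma(1, shape = n / 2, rate = sum((y - mu)^2) / 2)  # draw sigma2 | mu, y
    chain[i, ] <- c(mu, sigma2)
  }
  burned <- chain[-(1:1000), ]  # burn-in is still required (slide 36)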

Slide 37: Gibbs Sampling Advantages
- Obviously, a faster algorithm than pure Metropolis-Hastings
- Can combine the Gibbs and M-H algorithms: when you know the conditional models for some parameters, use them, and use a standard M-H step for those you don't

Slide 38: MCMC Extensions
- As will be seen in later modules, MCMC can be used to predict distributions of future observations
- By evaluating complex functions of the MCMC samples, you obtain uncertainty models for those functions
- Example: obtained joint samples of the Weibull parameters η and β based on the observed data using MCMC; samples of reliability at 250 hours are obtained by simply evaluating the reliability equation at the MCMC samples
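
In R this extension is a one-liner (my sketch): evaluate the Weibull reliability function R(t) = exp(-(t/η)^β) at every joint sample. Here weib_burned is the post-burn-in (η, β) chain from the sketch after slide 19.

  R250 <- exp(-(250 / weib_burned[, "eta"])^weib_burned[, "beta"])  # Weibull reliability at t = 250 hours
  quantile(R250, c(0.05, 0.50, 0.95))                               # 5th, median, and 95th quantiles
  hist(R250, breaks = 50)                                           # the full uncertainty model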

Slide 39: Minimum Cost Test Plan Example
- Automaker test requirements: 95% reliability at 100,000 miles with 90% confidence; cheapest possible test
- Test cost function: [not reproduced in this transcription]
- Test plan problem: find the number of vehicles n that must survive to what mileage T to produce 90% confidence that the vehicle has 95% reliability
- Basics:
  - Failures modeled with a Weibull model with β = 3
  - To meet 95% reliability, η ≥ 269,141 miles
  - We need the n and T that will produce 90% confidence from our test results (none of the n vehicles fail by mileage T) that η ≥ 269,141 miles, at minimum cost

Slide 40: The Classical Test Plan
- Equation from Nelson relates n and T for a 90% confidence level and 95% reliability
- Dodson's solution uses graphical optimization for cost: n = 5, T = 207,840 miles, C = $401,368
[Figure: C vs. n]
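
For reference, the standard zero-failure (success-run) Weibull demonstration relation, T = η·(-ln(1-C)/n)^(1/β), reproduces the quoted T = 207,840 miles at n = 5; I am assuming this is the form of the Nelson equation the slide refers to, and the cost function behind the $401,368 figure is not reproduced here.

  eta_req <- 269141; beta <- 3; conf <- 0.90
  T_req <- function(n) eta_req * (-log(1 - conf) / n)^(1 / beta)  # zero-failure demonstration mileage
  round(T_req(5))    # about 207,840 miles, matching the classical plan
  round(T_req(1:8))  # required mileage for other fleet sizes n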

Slide 41: Just a Couple of Minor Problems
- What is the probability that, if the vehicles meet the 95% reliability requirement, the test will be successful? I.e., the probability that all five vehicles survive to 207,840 miles. Simple solution: compute it directly (a worked check follows below); it is about 10% (see slide 44)
- What do we do if one or more of the vehicles fail during the test?
  - Dodson recommended replacing it with another vehicle, recalculating the test duration, and continuing
  - What does this do to our test plan cost?
- Is this a good test plan?
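
A worked check of that probability under the Weibull assumptions already stated (β = 3, η at exactly the required 269,141 miles). This reproduces the roughly 10% P(Success | R = 0.95) shown for the classical plan in the comparison on slide 44.

  eta_req <- 269141; beta <- 3
  R_T <- exp(-(207840 / eta_req)^beta)  # per-vehicle probability of surviving to T = 207,840 miles
  R_T^5                                 # probability that all five survive: about 0.10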

Slide 42: The Second Problem: Unexpected Failures
- The Nelson equation generalizes to include the one failure; the classical statistics change!
- New maximum test duration: 134,634 miles
- New maximum test cost: $315,213
- New test plan (break one): if all five survive to 127,534 miles, C = $285,213
[Figures: T vs. failure mileage; C vs. failure mileage]

Slide 43: The PRA/MCMC Test Plan
- Use an ignorance prior: the worst-case scenario
- Use MCMC to find n and T at the 90% assurance level, i.e., P(η ≥ 269,141 miles | n survivors to T miles) ≥ 0.90
- Use the Dodson graph to find the optimum-cost test plan:
  - n = 2: T = 107,192 miles, C = $195,429
  - n = 3: T = 93,641 miles, C = $195,759

Slide 44: A Little Comparison

                           Classical Test Plan    PRA/MCMC Test Plan
  n                        5                      3
  T (miles)                207,840                93,641
  C                        $401,368               $195,759
  P(Success | R = 0.95)    10%                    88.13%

Evaluating the classical test plan with PRA and MCMC (using an ignorance prior), we find that the automaker pays $205,609 for an unneeded (unrequired) extra 9.2% probability (assurance), given a successful test, that R(100,000 miles) ≥ 95%.

Slide 45: The PRA/MCMC Plan with a Failure
- Only an 11.87% probability of a failure occurring
- Can use PRA/MCMC for the case of a failure occurring
- Test costs increase (this makes sense), comparable to the original classical test plan
- Probability of test success given R(100,000 miles) = 0.95 approaches unity; this also makes sense
- Expected test cost is still reasonable
- More test planning options may be considered:

  Number of      Zero Failures   One Failure   Zero or One Failure
  Prototypes     Cost            Cost          Expected Cost
  1              $215,...        $713,...      $244,...
  2              $195,...        $517,...      $222,...
  3              $195,...        $465,...      $219,...
  4              $203,...        $441,...      $225,...
  5              $214,...        $431,...      $236,...

  (The trailing digits of the cost figures and the P(Pass Test) values were not captured in the transcription.)

Slide 46: Example Synopsis
- The test plan derived using classical procedures has some problems:
  - Overconservative: an extra 9.2% assurance
  - Too expensive: more than twice the PRA/MCMC plan cost
  - Too unlikely to succeed, even if the requirement is met
  - Overall screwy: consider the effects of one failure
- The test plan using PRA/MCMC procedures is much more reasonable:
  - Realistic, even with an ignorance prior (worst case)
  - Likely to succeed if the requirement is met
  - More intuitive, all the way around
  - Allows much more flexible test planning and execution

Slide 47: References
- Anderson, Theodore Wilbur, An Introduction to Multivariate Statistical Analysis, 2nd Edition. John Wiley & Sons, Inc., New York.
- Berger, James O., Statistical Decision Theory and Bayesian Analysis, Second Edition. Springer-Verlag, New York.
- Bernardo, Jose M., and Smith, Adrian F. M., Bayesian Theory. John Wiley & Sons, Ltd., 2001.
- Box, George E. P., and Tiao, George C., Bayesian Inference in Statistical Analysis. John Wiley & Sons, Inc., New York.
- Clemen, Robert T., and Reilly, Terence, Making Hard Decisions. Duxbury, Pacific Grove, CA.
- Daniels, Jesse, Werner, Paul W., and Bahill, A. Terry, Quantitative Methods for Tradeoff Analyses. Systems Engineering, Volume 4, John Wiley & Sons, Inc., New York.
- Gamerman, Dani, Markov Chain Monte Carlo. Chapman & Hall, London.
- Gelman, Andrew, Carlin, John B., Stern, Hal S., and Rubin, Donald B., Bayesian Data Analysis. Chapman and Hall/CRC, Boca Raton, Florida.
- Gilks, W. R., Richardson, S., and Spiegelhalter, D. J., Markov Chain Monte Carlo in Practice. Chapman & Hall, Boca Raton, Florida.
- Hammond, John S., Keeney, Ralph L., and Raiffa, Howard, Smart Choices, A Practical Guide to Making Better Decisions. Harvard Business School Press, Boston.
- Jefferys, William H., and Berger, James O., Ockham's Razor and Bayesian Analysis. American Scientist, Volume 80, Research Triangle Park, NC.
- Jeffreys, Harold, Theory of Probability. Oxford University Press, Oxford.
- Raiffa, Howard, and Schlaifer, Robert, Applied Statistical Decision Theory. John Wiley & Sons, Inc., New York.
- Robert, Christian P., The Bayesian Choice. Springer-Verlag, New York.
- Robert, Christian P., and Casella, George, Monte Carlo Statistical Methods. Springer-Verlag, New York.
- Schmitt, Samuel A., Measuring Uncertainty, An Elementary Introduction to Bayesian Statistics. Addison-Wesley Publishing Company, Inc., Philippines.
- Sivia, D. S., Data Analysis, A Bayesian Tutorial. Oxford University Press, Oxford.
- Venables, William N., and Ripley, Brian D., Modern Applied Statistics with S-Plus. Springer-Verlag, New York.
- Williams, David, Probability with Martingales. Cambridge University Press, Cambridge.
