System reliability using the survival signature


1 System reliability using the survival signature. Frank Coolen. GDRR, Ireland, 8-10 July 2013.

2 Joint work with: Tahani Coolen-Maturi (Durham University), Ahmad Aboalkhair (Mansoura University, Egypt), Abdullah Al-nefaiee (PhD student).

3 System reliability: structure function. System with m components: state vector x = (x_1, x_2, ..., x_m), with x_i = 1 if the i-th component functions and x_i = 0 if not. The structure function \phi(x) = 1 if the system functions with state x and \phi(x) = 0 if not. Assume: \phi(x) is non-decreasing in each component of x ('coherent system') and \phi(0) = 0, \phi(1) = 1. (This assumption can be deleted.)
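
To make the definition concrete, here is a minimal Python sketch (not from the talk) of a structure function for a hypothetical three-component system, component 1 in series with the parallel pair (2, 3), together with a brute-force check of coherence:

```python
from itertools import product

def phi(x):
    """Structure function of a hypothetical 3-component system:
    component 1 in series with the parallel pair (2, 3)."""
    x1, x2, x3 = x
    return int(x1 == 1 and (x2 == 1 or x3 == 1))

# Numerical check of coherence: phi is non-decreasing in every component,
# phi(0, 0, 0) = 0 and phi(1, 1, 1) = 1.
assert phi((0, 0, 0)) == 0 and phi((1, 1, 1)) == 1
for x in product([0, 1], repeat=3):
    for i in range(3):
        if x[i] == 0:
            y = list(x)
            y[i] = 1
            assert phi(tuple(y)) >= phi(x)
```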

4 System signature. T_S > 0: random failure time of the system. T_{j:m}: j-th order statistic of the m random component failure times, for j = 1, ..., m, with T_{1:m} \le T_{2:m} \le ... \le T_{m:m}. Assume: component failure times are independent and identically distributed with CDF F(t) (exchangeability can be assumed instead). The system's signature is the m-vector q with j-th component q_j = P(T_S = T_{j:m}), so q_j is the probability that the system failure occurs at the moment of the j-th component failure.

5 The signature provides a qualitative description of the system structure that can be used in reliability quantification:

P(T_S > t) = \sum_{j=1}^{m} q_j P(T_{j:m} > t)

with

P(T_{j:m} > t) = \sum_{r=m-j+1}^{m} \binom{m}{r} [1 - F(t)]^r [F(t)]^{m-r}

The system structure is fully taken into account through the signature and is separated from the information about the random failure times of the components.
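
A small Python sketch of these two formulas, assuming a known signature vector q and a known component CDF; the example system and the exponential failure time distribution are illustrative choices, not from the talk:

```python
import math
from math import comb

def order_stat_survival(j, m, F_t):
    """P(T_{j:m} > t) for m iid components, given the CDF value F_t = F(t)."""
    return sum(comb(m, r) * (1 - F_t) ** r * F_t ** (m - r)
               for r in range(m - j + 1, m + 1))

def system_survival_from_signature(q, F_t):
    """P(T_S > t) = sum_{j=1}^m q_j P(T_{j:m} > t), with q = (q_1, ..., q_m)."""
    m = len(q)
    return sum(q[j - 1] * order_stat_survival(j, m, F_t)
               for j in range(1, m + 1))

# Illustration: the series-parallel system sketched earlier has signature
# q = (1/3, 2/3, 0); take exponential(rate 0.5) component failure times.
t = 1.0
F_t = 1 - math.exp(-0.5 * t)
via_signature = system_survival_from_signature([1/3, 2/3, 0], F_t)
direct = (1 - F_t) * (1 - F_t ** 2)   # P(comp 1 works) * P(comp 2 or 3 works)
assert abs(via_signature - direct) < 1e-12
```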

6 Consider two systems, each with m components, and all failure times of the 2m components assumed to be iid. Let the signature of system A be q^a and of system B be q^b, and let their failure times be T^a and T^b. If

\sum_{j=r}^{m} q^a_j \ge \sum_{j=r}^{m} q^b_j  for all r = 1, ..., m,

then

P(T^a > t) \ge P(T^b > t)  for all t > 0.

7 Straightforward to extend the signature to systems with more components (of the same type) and the same system failure time. Two systems with different types of components (one type per system) can be compared via

P(T^a < T^b) = \sum_{i=1}^{m_a} \sum_{j=1}^{m_b} q^a_i q^b_j P(T^a_{i:m_a} < T^b_{j:m_b})

with

P(T^a_{i:m_a} < T^b_{j:m_b}) = \int_0^{\infty} f^a_i(t) P(T^b_{j:m_b} > t) dt

where f^a_i(t) is the PDF of T^a_{i:m_a}, which, with PDF f_a(t) for the failure time of components in system A, equals

f^a_i(t) = f_a(t) \sum_{r_a = m_a - i + 1}^{m_a} \binom{m_a}{r_a} [1 - F_a(t)]^{r_a - 1} [F_a(t)]^{m_a - r_a - 1} [r_a - m_a(1 - F_a(t))]
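
A sketch of this comparison in Python, assuming the two signatures and component failure time distributions are given. For the order-statistic density it uses the standard closed form m!/((i-1)!(m-i)!) f(t) F(t)^{i-1} [1-F(t)]^{m-i}, which is equivalent to the expression above; the exponential distributions and the second signature are illustrative assumptions.

```python
import math
from math import comb, factorial
from scipy.integrate import quad

def ord_surv(j, m, F):
    """P(T_{j:m} > t), given the CDF value F = F(t)."""
    return sum(comb(m, r) * (1 - F) ** r * F ** (m - r)
               for r in range(m - j + 1, m + 1))

def ord_pdf(i, m, f, F):
    """Density of the i-th order statistic of m iid lifetimes at t,
    given f = f(t) and F = F(t)."""
    return (factorial(m) / (factorial(i - 1) * factorial(m - i))
            * f * F ** (i - 1) * (1 - F) ** (m - i))

def prob_a_before_b(q_a, q_b, f_a, F_a, F_b):
    """P(T^a < T^b) for two independent systems with signatures q_a, q_b."""
    m_a, m_b = len(q_a), len(q_b)
    total = 0.0
    for i in range(1, m_a + 1):
        for j in range(1, m_b + 1):
            integrand = lambda t, i=i, j=j: (
                ord_pdf(i, m_a, f_a(t), F_a(t)) * ord_surv(j, m_b, F_b(t)))
            value, _ = quad(integrand, 0.0, math.inf)
            total += q_a[i - 1] * q_b[j - 1] * value
    return total

# Hypothetical inputs: exponential component lifetimes with rates 1.0 (A) and 0.5 (B).
f_a = lambda t: math.exp(-t)
F_a = lambda t: 1 - math.exp(-t)
F_b = lambda t: 1 - math.exp(-0.5 * t)
print(prob_a_before_b([1/3, 2/3, 0], [0, 1, 0], f_a, F_a, F_b))
```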

8 Computing the signature is difficult for most practical systems, but it only needs to be derived once. One can work with bounds on the signature ('imprecise signature'), which can be used to determine whether further computation is needed for a specific inference. Unfortunately, generalizing the signature to systems with multiple types of components (nearly all real-world systems) requires (many) probabilities for orderings of order statistics from different distributions to be computed, which is very difficult.

9 Survival signature. For systems with one type of component: let \Phi(l), for l = 1, ..., m, denote the probability that the system functions given that precisely l of its components function. For coherent systems, \Phi(l) is an increasing function of l. There are \binom{m}{l} state vectors x with \sum_{i=1}^{m} x_i = l; let S_l be the set of these state vectors. Due to the iid assumption for the failure times of the m components,

\Phi(l) = \binom{m}{l}^{-1} \sum_{x \in S_l} \phi(x)

We call \Phi(l) the (system) survival signature.
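
A brute-force Python sketch of this definition, enumerating all state vectors with exactly l functioning components (fine for small m; the helper names are illustrative):

```python
from itertools import combinations
from math import comb

def survival_signature(phi, m):
    """Return [Phi(0), ..., Phi(m)] for a structure function phi on m components,
    averaging phi over all state vectors with exactly l working components."""
    Phi = []
    for l in range(m + 1):
        total = 0
        for working in combinations(range(m), l):
            x = tuple(1 if i in working else 0 for i in range(m))
            total += phi(x)
        Phi.append(total / comb(m, l))
    return Phi

# For the hypothetical series-parallel phi sketched earlier:
# survival_signature(phi, 3) == [0.0, 0.0, 2/3, 1.0]
```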

10 Let C_t \in {0, 1, ..., m} denote the number of components in the system that function at time t > 0. If the probability distribution of the component failure time has CDF F(t), then for l \in {0, 1, ..., m}

P(C_t = l) = \binom{m}{l} [F(t)]^{m-l} [1 - F(t)]^l

and

P(T_S > t) = \sum_{l=0}^{m} \Phi(l) P(C_t = l)

The survival signature is easily derived from the signature,

\Phi(l) = \sum_{j=m-l+1}^{m} q_j

so it adopts all the nice properties of signatures.
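
Both relations on this slide are one-liners; a sketch assuming Phi is stored as a list indexed by l and q as a list (q_1, ..., q_m):

```python
from math import comb

def system_survival_from_Phi(Phi, F_t):
    """P(T_S > t) = sum_l Phi(l) C(m,l) F(t)^(m-l) (1-F(t))^l, with F_t = F(t)."""
    m = len(Phi) - 1
    return sum(Phi[l] * comb(m, l) * F_t ** (m - l) * (1 - F_t) ** l
               for l in range(m + 1))

def Phi_from_signature(q):
    """Phi(l) = sum_{j=m-l+1}^{m} q_j for l = 0, ..., m."""
    m = len(q)
    return [sum(q[j - 1] for j in range(m - l + 1, m + 1)) for l in range(m + 1)]

# Phi_from_signature([1/3, 2/3, 0]) == [0, 0, 2/3, 1], matching the brute-force
# enumeration above, and system_survival_from_Phi then reproduces the
# signature-based survival probability.
```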

11 Multiple types of components. Let \Phi(l_1, l_2, ..., l_K), for l_k = 0, 1, ..., m_k, denote the probability that the system functions given that precisely l_k of its components of type k function, for each k \in {1, 2, ..., K}. There are \binom{m_k}{l_k} state vectors x^k with precisely l_k of their m_k components x^k_i = 1, so with \sum_{i=1}^{m_k} x^k_i = l_k. Let S_{l_1,...,l_K} denote the set of all state vectors for the whole system for which \sum_{i=1}^{m_k} x^k_i = l_k, for k = 1, 2, ..., K. Due to the iid assumption for the failure times of the m_k components of type k,

\Phi(l_1, ..., l_K) = \left[ \prod_{k=1}^{K} \binom{m_k}{l_k}^{-1} \right] \sum_{x \in S_{l_1,...,l_K}} \phi(x)
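
A brute-force sketch for multiple types, assuming the full structure function phi and a list giving each component's type (0-based); everything here is an illustrative helper, not an existing API:

```python
from itertools import combinations, product
from math import comb

def survival_signature_multi(phi, type_of):
    """Return a dict mapping (l_1, ..., l_K) to Phi(l_1, ..., l_K);
    type_of[i] in {0, ..., K-1} is the type of component i."""
    m = len(type_of)
    K = max(type_of) + 1
    groups = [[i for i in range(m) if type_of[i] == k] for k in range(K)]
    Phi = {}
    for ls in product(*[range(len(g) + 1) for g in groups]):
        count = 1
        for k in range(K):
            count *= comb(len(groups[k]), ls[k])
        total = 0
        # Enumerate every way of choosing which l_k components of each type work.
        for chosen in product(*[combinations(groups[k], ls[k]) for k in range(K)]):
            working = set().union(*chosen)
            x = tuple(1 if i in working else 0 for i in range(m))
            total += phi(x)
        Phi[ls] = total / count
    return Phi
```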

12 Let C^k_t \in {0, 1, ..., m_k} denote the number of components of type k in the system that function at time t > 0. If the probability distribution for the failure time of components of type k is known and has CDF F_k(t), then the probability that the system functions at time t > 0 is

P(T_S > t) = \sum_{l_1=0}^{m_1} ... \sum_{l_K=0}^{m_K} \Phi(l_1, ..., l_K) P\left( \bigcap_{k=1}^{K} \{C^k_t = l_k\} \right)

= \sum_{l_1=0}^{m_1} ... \sum_{l_K=0}^{m_K} \Phi(l_1, ..., l_K) \prod_{k=1}^{K} \binom{m_k}{l_k} [F_k(t)]^{m_k - l_k} [1 - F_k(t)]^{l_k}
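
And a sketch of the corresponding survival probability, assuming Phi is the dict produced by the previous sketch and F_list holds one CDF per type (illustrative inputs):

```python
from itertools import product
from math import comb

def system_survival_multi(Phi, m_list, F_list, t):
    """P(T_S > t) for K independent component types with CDFs F_list[k]."""
    K = len(m_list)
    F_t = [F(t) for F in F_list]
    total = 0.0
    for ls in product(*[range(mk + 1) for mk in m_list]):
        weight = 1.0
        for k in range(K):
            weight *= (comb(m_list[k], ls[k])
                       * F_t[k] ** (m_list[k] - ls[k])
                       * (1 - F_t[k]) ** ls[k])
        total += Phi[ls] * weight
    return total
```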

13 Example. Figure: System with 2 types of components.

14 Table: Survival signature \Phi(l_1, l_2) of this system, tabulated for all combinations of l_1 and l_2 (values omitted).

15 Comments on the survival signature. Computation of the survival signature can be a complicated task, particularly for real systems and networks. But it is only required once, and one can work with bounds based on partial knowledge or computation (and can check whether further computation is required for a specific reliability aim). Louis Aslett (Trinity College Dublin) has created an R function for it! Package: ReliabilityTheory (Tools for structural reliability analysis); function: computeSystemSurvivalSignature.

16 Recent results. Combining the survival signatures of two subsystems in a series or parallel configuration, where subsystem 1 has m^1_k and subsystem 2 has m^2_k components of type k, with m_k = m^1_k + m^2_k in total. For series:

\Phi_S(l_1, ..., l_K) = \sum_{l^1_1=0}^{l_1} ... \sum_{l^1_K=0}^{l_K} \Phi_1(l^1_1, ..., l^1_K) \Phi_2(l_1 - l^1_1, ..., l_K - l^1_K) \prod_{k=1}^{K} \binom{m^1_k}{l^1_k} \binom{m^2_k}{l_k - l^1_k} \binom{m_k}{l_k}^{-1}

Effect of component replacement (singling out the component considered for replacement as an additional argument of the survival signature):

\Phi(l_1, ..., l_{k-1}, l_k, l_{k+1}, ..., l_K) = \frac{l_k}{m_k} \Phi(l_1, ..., l_{k-1}, l_k - 1, l_{k+1}, ..., l_K, 1) + \frac{m_k - l_k}{m_k} \Phi(l_1, ..., l_{k-1}, l_k, l_{k+1}, ..., l_K, 0)
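
A Python sketch of the series-combination formula as reconstructed above (the dict-based representation of the survival signatures is an illustrative choice):

```python
from itertools import product
from math import comb

def combine_series(Phi1, Phi2, m1, m2):
    """Survival signature of two independent subsystems in series, given their
    survival signatures (dicts keyed by (l_1, ..., l_K)) and per-type component
    counts m1[k], m2[k]."""
    K = len(m1)
    m = [m1[k] + m2[k] for k in range(K)]
    Phi = {}
    for ls in product(*[range(mk + 1) for mk in m]):
        total = 0.0
        # Split the l_k working components of type k over the two subsystems.
        for l1 in product(*[range(min(ls[k], m1[k]) + 1) for k in range(K)]):
            l2 = tuple(ls[k] - l1[k] for k in range(K))
            if any(l2[k] > m2[k] for k in range(K)):
                continue
            weight = 1.0
            for k in range(K):
                weight *= (comb(m1[k], l1[k]) * comb(m2[k], l2[k])
                           / comb(m[k], ls[k]))
            total += Phi1[l1] * Phi2[l2] * weight
        Phi[ls] = total
    return Phi
```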

17 Imprecise probability. Uncertainty quantification is mostly done with precise probabilities: for an event A, a single (classical, precise) probability P(A), typically satisfying Kolmogorov's axioms. Classical probability requires a high level of precision and consistency of information, and is often too restrictive to carefully represent the multi-dimensional nature of uncertainty. Lower and upper probabilities: \underline{P}(A) and \overline{P}(A) respectively, with 0 \le \underline{P}(A) \le \overline{P}(A) \le 1 and \underline{P}(A^c) = 1 - \overline{P}(A) (conjugacy).

18 Walley (1991, 'imprecise probability') uses a subjective interpretation; Weichselberger (2001, 'interval probability') generalizes Kolmogorov's axioms without an explicit interpretation. Introduction to Imprecise Probabilities (Wiley, late 2013). \underline{P}(A) reflects the evidence in favour of A, and 1 - \overline{P}(A) reflects the evidence against A. We attempt to base inference on few model assumptions, not sufficiently strong to lead to precise probabilities. The lower and upper probabilities are then the maximum lower and minimum upper bounds, respectively, over all precise probabilities that are in agreement with the assumptions made and the data-based inferences following from these assumptions.

19 Nonparametric Predictive Inference (NPI): a nonparametric frequentist statistical approach. Predictive: inference for one or more future observation(s). Depends on Hill's assumption A_{(n)} (Hill 1968).

20 NPI for Bernoulli random quantities. Consider n + m exchangeable Bernoulli trials, each a success or failure. Y^l_j: number of successes in trials j to l. Let R_t = {r_1, ..., r_t}, with 1 \le t \le m + 1 and 0 \le r_1 < r_2 < ... < r_t \le m, and define \binom{s + r_0}{s} = 0. For s \in {0, ..., n} (Coolen, 1998),

\overline{P}(Y^{n+m}_{n+1} \in R_t | Y^n_1 = s) = \binom{n+m}{n}^{-1} \sum_{j=1}^{t} \left[ \binom{s + r_j}{s} - \binom{s + r_{j-1}}{s} \right] \binom{n - s + m - r_j}{n - s}

The lower probability is derived via the conjugacy property.
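
A direct Python transcription of this formula, with the lower probability obtained via conjugacy (function names and the numerical example are illustrative):

```python
from math import comb

def npi_upper(R, n, s, m):
    """Upper probability that the number of successes in m future trials lies in
    R, given s successes in n observed trials (Coolen, 1998)."""
    total = 0
    prev = 0  # convention: C(s + r_0, s) = 0
    for r in sorted(R):
        total += (comb(s + r, s) - prev) * comb(n - s + m - r, n - s)
        prev = comb(s + r, s)
    return total / comb(n + m, n)

def npi_lower(R, n, s, m):
    """Lower probability via conjugacy: P_low(R) = 1 - P_up(complement of R)."""
    return 1 - npi_upper(set(range(m + 1)) - set(R), n, s, m)

# Example: after 3 successes in 5 trials, bounds for 'at least 2 successes in
# 3 future trials':
R = {2, 3}
print(npi_lower(R, 5, 3, 3), npi_upper(R, 5, 3, 3))
```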

21 Combining the survival signature with NPI. System with one type of component: C_t is the number, out of the m future components (the components in the system), that function at time t. In a test of n components, exchangeable with these m components, s_t functioned at time t. Then

\underline{P}(T_S > t) = \sum_{l=0}^{m} \Phi(l) \underline{D}(C_t = l)

where

\underline{D}(C_t = l) = \overline{P}(C_t \le l) - \overline{P}(C_t \le l - 1) = \binom{n+m}{n}^{-1} \binom{s_t - 1 + l}{s_t - 1} \binom{n - s_t + m - l}{n - s_t}
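
A sketch of this lower bound in Python, using the reconstructed closed form for D; it assumes at least one of the n tested components still functions at t (s_t >= 1), since the binomial expression is written with s_t - 1, and the numerical example is illustrative:

```python
from math import comb

def D_lower(l, n, s_t, m):
    """Probability mass assigned to {C_t = l} in the NPI lower system survival
    probability; assumes 1 <= s_t <= n."""
    return (comb(s_t - 1 + l, s_t - 1) * comb(n - s_t + m - l, n - s_t)
            / comb(n + m, n))

def npi_lower_system_survival(Phi, n, s_t):
    """P_low(T_S > t) = sum_l Phi(l) D_lower(l), with Phi indexed by l = 0..m."""
    m = len(Phi) - 1
    return sum(Phi[l] * D_lower(l, n, s_t, m) for l in range(m + 1))

# Example: the hypothetical Phi = [0, 0, 2/3, 1] from before, with n = 5 tested
# components of which s_t = 4 still function at time t:
print(npi_lower_system_survival([0, 0, 2/3, 1], 5, 4))
```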

22 System with K types of components (independent):

\underline{P}(T_S > t) = \sum_{l_1=0}^{m_1} ... \sum_{l_K=0}^{m_K} \Phi(l_1, ..., l_K) \prod_{k=1}^{K} \underline{D}(C^k_t = l_k)

where

\underline{D}(C^k_t = l_k) = \overline{P}(C^k_t \le l_k) - \overline{P}(C^k_t \le l_k - 1) = \binom{n_k + m_k}{n_k}^{-1} \binom{s^k_t - 1 + l_k}{s^k_t - 1} \binom{n_k - s^k_t + m_k - l_k}{n_k - s^k_t}

The corresponding upper probabilities are found similarly, using instead

\overline{D}(C^k_t = l_k) = \underline{P}(C^k_t \le l_k) - \underline{P}(C^k_t \le l_k - 1) = \binom{n_k + m_k}{n_k}^{-1} \binom{s^k_t + l_k}{s^k_t} \binom{n_k - s^k_t + m_k - l_k - 1}{n_k - s^k_t - 1}
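
The same lower-bound computation for K independent types, as a self-contained sketch (again assuming s^k_t >= 1 for each type; Phi is a dict keyed by (l_1, ..., l_K)):

```python
from itertools import product
from math import comb

def npi_lower_system_survival_multi(Phi, n_list, s_list, m_list):
    """NPI lower probability that the system functions at time t, where for each
    type k, s_list[k] of the n_list[k] tested components still function at t."""
    K = len(m_list)
    total = 0.0
    for ls in product(*[range(mk + 1) for mk in m_list]):
        weight = 1.0
        for k in range(K):
            n, s, m, l = n_list[k], s_list[k], m_list[k], ls[k]
            weight *= (comb(s - 1 + l, s - 1) * comb(n - s + m - l, n - s)
                       / comb(n + m, n))
        total += Phi[ls] * weight
    return total
```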

23 Example. Figure: System with 2 types of components.

24 Assume n_1 = n_2 = 2 components of each type are tested, with the ordering of their failure times t^2_1 < t^1_1 < t^2_2 < t^1_2 (t^k_j denotes the j-th ordered failure time of the components of type k). Table: \underline{P}(T_S > t) and \overline{P}(T_S > t) for t in each of the intervals [0, t^2_1), (t^2_1, t^1_1), (t^1_1, t^2_2), (t^2_2, t^1_2), (t^1_2, \infty) (values omitted).

25 While for the test failure time ordering t^1_1 < t^1_2 < t^2_1 < t^2_2: Table: \underline{P}(T_S > t) and \overline{P}(T_S > t) for t in each of the intervals [0, t^1_1), (t^1_1, t^1_2), (t^1_2, t^2_1), (t^2_1, t^2_2), (t^2_2, \infty) (values omitted).

26 Challenges. Computation of survival signatures: link to fault trees, binary decision diagrams, etc. Learning the survival signature from system-level data (Louis Aslett has presented this for the signature). Use for reliability of (large) networks. Applications!

27 References: Signatures. Samaniego FJ (2007). System Signatures and their Applications in Engineering Reliability. Springer. Eryilmaz S (2010). Review of recent advances in reliability of consecutive k-out-of-n and related systems. Journal of Risk and Reliability.

28 Imprecise Probability and Reliability. Coolen FPA, Troffaes MC, Augustin T (2011). Imprecise probability. In: International Encyclopedia of Statistical Science, M Lovric (Ed.). Springer. Coolen FPA, Utkin LV (2011). Imprecise reliability. In: International Encyclopedia of Statistical Science, M Lovric (Ed.). Springer.

29 NPI. Coolen FPA (2011). Nonparametric predictive inference. In: International Encyclopedia of Statistical Science, M Lovric (Ed.). Springer. Coolen FPA (1998). Low structure imprecise predictive inference for Bayes' problem. Statistics & Probability Letters. Hill BM (1968). Posterior distribution of percentiles: Bayes' theorem for sampling from a population. Journal of the American Statistical Association.

30 NPI with Signatures. Coolen FPA, Al-nefaiee AH (2012). Nonparametric predictive inference for failure times of systems with exchangeable components. Journal of Risk and Reliability. Al-nefaiee AH, Coolen FPA. Nonparametric predictive inference for system failure time based on bounds for the signature. Journal of Risk and Reliability, to appear.

31 Survival Signatures. Coolen FPA, Coolen-Maturi T (2012). On generalizing the signature to systems with multiple types of components. In: Complex Systems and Dependability, W Zamojski et al. (Eds.). Springer. Coolen FPA, Coolen-Maturi T, Al-nefaiee AH, Aboalkhair AM (2013). Recent advances in system reliability using the survival signature. In: Proceedings of the 20th Advances in Risk and Reliability Technology Symposium 2013, L Jackson, J Andrews (Eds.).
