Theoretical and Computational Approaches to Dempster-Shafer (DS) Inference
1 Theoretical and Computational Approaches to Dempster-Shafer (DS) Inference

Chuanhai Liu
Department of Statistics, Purdue University

Joint work with A. P. Dempster

Dec. 11, 2006
2 What is DS Analysis (DSA)? State Space Model (SSM)

Reference: Dempster (2006, IJAR).

The problem: X ~ Poisson(L) with L ≥ 0 and X = 0, 1, ...

The SSM for the Poisson DS model: SSM = {(L, X) : L ≥ 0, X = 0, 1, ...}

December 11, 2006. Copyright © 2006 by Chuanhai Liu.
3 What is DSA? DS Model (DSM)

The Poisson DSM on the SSM (X, L) is defined by associated (a-)random sets based on a standard Poisson a-process, which can be visualized as the arrival times V_1 < V_2 < ... < V_7 along the L axis. A typical a-random subset is the union of intervals at levels X = 0, 1, 2, ...
4 What is DSA? DS Calculus (DSC)

Combining with a fixed L yields a standard Poisson inference for X. Combining with an observed X yields an a-random inference for L.

A-random sets for inference about L: given an observed count X, the a-random sets for inference about L are intervals of the form [V_X, V_{X+1}], where V_X ~ Gamma(X, 1) and, independently, the gap V_{X+1} − V_X ~ Gamma(1, 1) = Exp(1), so that V_{X+1} ~ Gamma(X + 1, 1).

DSC in symbols: (SSM + DSM) + DSC = DSA, with DSM = DSM_1 ⊕ DSM_2 ⊕ ... ⊕ DSM_n.
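The interval construction above is easy to simulate. A minimal sketch (not from the talk; the count X = 4 and the number of draws are made up) that samples [V_X, V_{X+1}] as sums of independent Exp(1) gaps:

```python
import math
import random

def a_random_interval(x, rng):
    """Draw one a-random interval [V_x, V_{x+1}] for L given an observed count x."""
    # V_x is a sum of x independent Exp(1) gaps, hence Gamma(x, 1); V_0 = 0.
    v_x = sum(-math.log(1.0 - rng.random()) for _ in range(x))
    gap = -math.log(1.0 - rng.random())   # V_{x+1} - V_x ~ Exp(1)
    return v_x, v_x + gap                 # (Gamma(x, 1), Gamma(x+1, 1))

rng = random.Random(0)
x = 4
draws = [a_random_interval(x, rng) for _ in range(20000)]
mean_lower = sum(lo for lo, hi in draws) / len(draws)
mean_upper = sum(hi for lo, hi in draws) / len(draws)
print(mean_lower, mean_upper)
```

Since the endpoints are Gamma(X, 1) and Gamma(X + 1, 1), the two sample means should sit close to X and X + 1.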
5 What is DSA? DS Output

Every assertion that you represent as meaningfully true or false has an associated (p, q, r), where p is your probability for the assertion, q is the probability against, and r is your probability of "don't know".

Example: suppose X = 0. Then the assertion {0 ≤ L ≤ 3} has (p, q, r) = (0.95, 0, 0.05).
6 What is DSA? DS Output: Two-cdfs

Taking (p, q, r) to be functions of K in the assertion {0 ≤ L ≤ K} leads to two cdfs. One is the probability p(K) for {0 ≤ L ≤ K}. The other is the plausibility s(K) = p(K) + r(K) of {0 ≤ L ≤ K}. The difference between the two cdfs at K, r(K), is the posterior probability of "don't know".

Example: inference about L with the observed Poisson count X = 4. [Figure: the two cdfs, probability p(K) and plausibility s(K), plotted against K for the observed count X = 4.]
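For integer shapes the Gamma cdfs have a closed form, so the two cdfs can be computed directly. A sketch (K = 10 is an arbitrary illustration point; X = 4 matches the slide's example):

```python
import math

def erlang_cdf(k_shape, x):
    """P(Gamma(k_shape, 1) <= x) for integer shape; shape 0 is the point mass at 0."""
    if x < 0:
        return 0.0
    return 1.0 - math.exp(-x) * sum(x**j / math.factorial(j) for j in range(k_shape))

def two_cdfs(x_obs, k):
    p = erlang_cdf(x_obs + 1, k)   # probability: whole interval [V_X, V_{X+1}] inside [0, K]
    s = erlang_cdf(x_obs, k)       # plausibility: interval touches [0, K]
    return p, s, s - p             # r(K) = s(K) - p(K) is "don't know"

p, s, r = two_cdfs(4, 10.0)
print(round(p, 5), round(s, 5), round(r, 5))
```

The gap r(K) between the two curves is exactly the vertical distance seen in the slide's figure.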
7 Elements of DS calculus: DS concept of DSM

To specify a DSM over a given SSM, you specify a mass distribution over the subsets of the SSM, later referred to in mathematically more conventional terms as an a-probability measure over a sample space of a-random subsets A ⊆ S. Mass function: m(A) for all A ⊆ S.

Example: for the Poisson model,
    m([a, b]) = f_{V_X}(a) · f_{V_{X+1} − V_X}(b − a),   0 ≤ a ≤ b,
where f_{V_X}(·) and f_{V_{X+1} − V_X}(·) denote the density functions of the a-random variables V_X and V_{X+1} − V_X, respectively.
8 Elements of DS calculus: DS concept of DSM (continued)

The mass m(A) associated with each assertion A ⊆ S is an atom in the sense that it cannot be further broken down into pieces assigned to subsets of A. Logically, however, the probability p(A) representing the uncertainty that you unambiguously assign to A differs from m(A), because p(A) accumulates masses from all the assertions that imply A. In symbols,
    p(A) = Σ_{B ⊆ A} m(B),   with m(∅) = 0.

Example: for the Poisson model with X = 0,
    p({0 ≤ L ≤ 3}) = P(V_{X+1} ≤ 3) = pgamma(3, X + 1) ≈ 0.95
    q({0 ≤ L ≤ 3}) = P(V_X > 3) = 0
    r({0 ≤ L ≤ 3}) = 1 − p({0 ≤ L ≤ 3}) − q({0 ≤ L ≤ 3}) ≈ 0.05
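The slide's numbers are a one-line check, using the fact that with X = 0 the interval is [V_0, V_1] = [0, Exp(1)]:

```python
import math

# Assertion {0 <= L <= 3} with observed X = 0.
p = 1.0 - math.exp(-3.0)   # P(V_1 <= 3) = pgamma(3, X + 1) with X = 0
q = 0.0                    # V_0 = 0, so P(V_0 > 3) = 0
r = 1.0 - p - q
print(round(p, 4), q, round(r, 4))   # approximately (0.95, 0, 0.05)
```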
9 Elements of DS calculus: DS combination

DS extension: input component DSMs are usually defined on a margin, and must be extended to logically equivalent DSMs before combination is defined.

DS combination: to combine DS-independent DSMs on a common SSM, intersect sets, multiply probabilities, cumulate, and renormalize away conflict.

Commonality: the (p, q, r) probabilities associated with any DSM are, from a mathematical perspective, set functions. Another set function,
    c(A) = Σ_{B ⊇ A} m(B),
called commonality, is technically important because it multiplies under DS combination.
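The "intersect, multiply, renormalize" recipe and the product rule for commonalities can be illustrated on a small finite frame (a sketch with made-up masses; the talk's Poisson DSM is continuous, but the calculus is the same):

```python
def combine(m1, m2):
    """Dempster combination of two mass functions keyed by frozenset."""
    raw, conflict = {}, 0.0
    for s1, p1 in m1.items():
        for s2, p2 in m2.items():
            cap = s1 & s2                    # intersect sets
            if cap:
                raw[cap] = raw.get(cap, 0.0) + p1 * p2   # multiply, cumulate
            else:
                conflict += p1 * p2          # mass landing on the empty set
    z = 1.0 - conflict                       # renormalize away conflict
    return {s: p / z for s, p in raw.items()}, conflict

def commonality(m, a):
    """c(A) = sum of m(B) over all B containing A."""
    return sum(p for b, p in m.items() if a <= b)

m1 = {frozenset('ab'): 0.6, frozenset('bc'): 0.4}
m2 = {frozenset('b'): 0.5, frozenset('ac'): 0.3, frozenset('abc'): 0.2}
m12, k = combine(m1, m2)

# Commonalities multiply: c12(A) = c1(A) * c2(A) / (1 - conflict).
a = frozenset('a')
print(commonality(m12, a), commonality(m1, a) * commonality(m2, a) / (1.0 - k))
```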
10 Poisson models and analyses: Multiple Poisson counts

Assume n DS-independent variables X_1, X_2, ..., X_n that represent counts, all occurring at the same rate L per unit time but having differing known periods τ_1, τ_2, ..., τ_n. This situation requires a DSM that combines n independent Poisson counts over the state space (L, X_1, X_2, ..., X_n). You create this DSM mathematically by starting from the marginal pairs (L, X_i) with 2-dimensional Poisson DSMs having rates τ_i L, then extending each of these to the full (n+1)-dimensional SSM, and finally performing DS combination on the full state space.
11 Poisson models and analyses: The role of join trees

Inference about L requires DS combination of 2n input DSMs, consisting of the n observations X_i and the n Poisson DSMs associated with the pairs (L, X_i). Computation can be carried out in a brute-force way involving extension, combination, and marginalization on the full SSM, or, equivalently and much more economically, by message passing through a mathematical structure called a join tree. [Figure: the join tree for the multiple Poisson inference of Section 3.2, with the node L linked to the nodes (L, X_1), ..., (L, X_n), each of which is linked to its observation node X_i; the arrows on the edges indicate the directions of inward propagation.]
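What the inward propagation delivers for L can be sketched by brute-force Monte Carlo (counts and exposures here are made up): draw each count's a-random interval [V_{X_i}/τ_i, V_{X_i+1}/τ_i], intersect across counts, and discard conflicting (empty) draws:

```python
import math
import random

def erlang_draw(k, rng):
    """Gamma(k, 1) as a sum of k independent Exp(1) gaps."""
    return sum(-math.log(1.0 - rng.random()) for _ in range(k))

def combined_draws(counts, taus, n_draws, seed=0):
    """Monte Carlo draws of the combined a-random interval for the common rate L."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n_draws):
        los, his = [], []
        for x, tau in zip(counts, taus):
            v = erlang_draw(x, rng)
            los.append(v / tau)
            his.append((v - math.log(1.0 - rng.random())) / tau)
        lo, hi = max(los), min(his)        # intersect the n intervals
        if lo <= hi:                       # renormalize away conflict
            kept.append((lo, hi))
    return kept

counts, taus = [3, 7, 5], [1.0, 2.0, 1.5]  # hypothetical counts and exposures
kept = combined_draws(counts, taus, 5000)

K = 4.0
p = sum(hi <= K for lo, hi in kept) / len(kept)   # probability of {L <= K}
s = sum(lo <= K for lo, hi in kept) / len(kept)   # plausibility of {L <= K}
print(len(kept), round(p, 3), round(s, 3))
```

The join-tree message passing computes the same marginal for L without ever materializing the full (n+1)-dimensional SSM.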
12 Poisson models and analyses: Predicting a new count

The join tree for predicting a new count given one or more observed counts. [Figure: X_1 → (L, X_1) → L → (L, X_2) → X_2, with arrows indicating the direction of information flow for prediction of X_2.]
13 A model from physics: Case (1), the problem

Consider an interesting statistical problem in particle physics about detecting the Higgs particle. A particular non-negative quantity, denoted by B, is to be estimated with confidence bounds. Two specific models were proposed by a group of physicists and statisticians for study (Heinrich, 2006).

Case (1): an experiment yields three counts X_1, X_2, and X_3 that are independent Poissons with means L_1, L_2, and L_3, respectively, where L_1 = BU + E, L_2 = aU, and L_3 = bE, with a and b known constants coming from the experimental set-up, and U and E unknowns that need to be finessed to get at B.
14 A model from physics: Case (1), DS inference

The three a-random intervals for L_1, L_2, and L_3 are
    [V_{X_1}, V_{X_1+1}], [V_{X_2}, V_{X_2+1}], and [V_{X_3}, V_{X_3+1}],
where V_{X_1}, V_{X_2}, V_{X_3} and the gaps V_{X_1+1} − V_{X_1}, V_{X_2+1} − V_{X_2}, V_{X_3+1} − V_{X_3} are independent Gamma random variables. This leads to the marginal a-random interval, with the unknowns E and U effectively integrated out, for inference about B:

    a · max{0, V_{X_1} − (1/b) V_{X_3+1}} / V_{X_2+1}  ≤  B  ≤  a · (V_{X_1+1} − (1/b) V_{X_3}) / V_{X_2}    (1)

subject to the non-conflict constraint V_{X_1+1} ≥ (1/b) V_{X_3}. The two cdfs for inference about B can be obtained numerically by making use of Beta distributions.
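Display (1) can be sampled directly. A sketch (the counts X_1, X_2, X_3 and constants a, b below are made-up illustration values, not the talk's data): draw the six Gamma variables, reject draws violating the non-conflict constraint, and keep the interval endpoints for B.

```python
import math
import random

def gamma_int(k, rng):
    """Gamma(k, 1) draw as a sum of k Exp(1) gaps."""
    return sum(-math.log(1.0 - rng.random()) for _ in range(k))

def b_interval_draw(x1, x2, x3, a, b, rng):
    """One a-random interval for B from display (1), or None on conflict."""
    v1 = gamma_int(x1, rng); v1p = v1 - math.log(1.0 - rng.random())
    v2 = gamma_int(x2, rng); v2p = v2 - math.log(1.0 - rng.random())
    v3 = gamma_int(x3, rng); v3p = v3 - math.log(1.0 - rng.random())
    if v1p < v3 / b:                       # non-conflict constraint fails
        return None
    lo = a * max(0.0, v1 - v3p / b) / v2p  # lower endpoint of (1)
    hi = a * (v1p - v3 / b) / v2           # upper endpoint of (1)
    return lo, hi

rng = random.Random(1)
x1, x2, x3, a, b = 6, 9, 4, 2.0, 1.5       # hypothetical data and constants
draws = [b_interval_draw(x1, x2, x3, a, b, rng) for _ in range(5000)]
draws = [d for d in draws if d is not None]
print(len(draws), round(sum(lo for lo, _ in draws) / len(draws), 3))
```

The empirical distributions of the two endpoints give the two cdfs for B; the slides obtain the same cdfs numerically via Beta distributions.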
15 A model from physics: Case (1), example

An artificial data set with counts X_1, X_2, X_3 and constants a, b. [Figure: the a-cdfs and a-likelihood function of the parameter of interest, B.]
16 DS inference: a-likelihood function

The degree of "don't know" r(B), computed as the vertical distance between the two a-cdfs, is of the type "plausibility of singleton", or equivalently "commonality of singleton". So in particular this function r(B) is a kind of marginal likelihood function of B. We call r(B) the a-likelihood function of B.

The basic reasons for thinking of r(B) as likelihood are: (1) r(B) is the ordinary likelihood in simple parametric inference situations; (2) these r(B)s do, in the rigorous DS theory, multiply across independent replicates; and (3) if one brought in a prior on B and used r(B) as likelihood in the usual Bayesian way, we would have a rigorous DS inference for B.
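Reason (1) can be checked exactly for the plain Poisson model of the earlier slides: the singleton plausibility r(L) = P(V_X ≤ L ≤ V_{X+1}) = F_{Gamma(X,1)}(L) − F_{Gamma(X+1,1)}(L) collapses algebraically to the ordinary Poisson likelihood e^{−L} L^X / X!. A small numerical check:

```python
import math

def erlang_cdf(k, x):
    """P(Gamma(k, 1) <= x) for integer shape k >= 0."""
    return 1.0 - math.exp(-x) * sum(x**j / math.factorial(j) for j in range(k))

for x_obs in range(6):
    for lam in (0.5, 1.0, 2.5, 7.0):
        r = erlang_cdf(x_obs, lam) - erlang_cdf(x_obs + 1, lam)   # singleton plausibility
        lik = math.exp(-lam) * lam**x_obs / math.factorial(x_obs) # Poisson likelihood
        assert abs(r - lik) < 1e-12
print("a-likelihood matches the ordinary Poisson likelihood")
```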
17 A model from physics: Case (2)

Case (2): there are 10 replicates of the experiment in case (1), with a common B but different U_j, E_j, a_j, and b_j for j = 1, ..., 10. 70,000 copies of the data sets of case (2) were simulated for study. We refer to these 70,000 data sets as replicates and to the 10 repeats within each of the 70,000 data sets as subreplicates. In what follows, we denote by n the number of replicates and by n_l the number of subreplicates for l = 1, ..., n.

Simulated data for study: it turns out that the prior distributions of the unreported Us, Es, and Bs in the simulated data of 70,000 replicates can be easily ascertained by an exploratory data analysis with graphical techniques and the method of moments.

A more realistic situation: no prior information is available about the unknown Us, Es, and B. The parameter of interest is of course still B.
18 A model from physics: Case (2) (continued)

The formal DSM for case (2) with n_l subreplicates involves 3·n_l intervals (for L_1^(j), L_2^(j), and L_3^(j), j = 1, ..., n_l). Given that a common B is shared by all the n_l subreplicates, the projected DS a-random interval for inference about the common B is formed by the intersection of the n_l separate a-random intervals of the type (1). The a-random interval for inference about the common value of B can be obtained by MCMC methods.

a-likelihood function: the n_l subreplicates are DS-combined simply by multiplying the individual a-likelihoods r_j(B), that is, r(B) = Π_{j=1}^{n_l} r_j(B), where r_j(B) is the a-likelihood of B given the j-th subreplicate.
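The product rule can be sketched with plain Poisson subreplicates rather than the Higgs model (counts below are made up). Since each r_j is then the ordinary Poisson likelihood, the product r(B) = Π_j r_j(B) peaks at the pooled estimate, the mean of the counts:

```python
import math

def a_likelihood(x_obs, lam):
    """Singleton plausibility for one Poisson count; equals the ordinary likelihood."""
    return math.exp(-lam) * lam**x_obs / math.factorial(x_obs)

counts = [4, 2, 5]                      # hypothetical subreplicate counts
grid = [k / 1000 for k in range(1, 10001)]

# DS-combine the subreplicates by multiplying their a-likelihoods on a grid.
combined = [math.prod(a_likelihood(x, lam) for x in counts) for lam in grid]
lam_hat = grid[max(range(len(grid)), key=combined.__getitem__)]
print(lam_hat)   # the product peaks at mean(counts)
```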
19 A model from physics: Case (2), example

An artificial data set of subreplicates with counts X_1, X_2, X_3 and constants a, b. [Figure: inference about B; the a-likelihood function of ln B and its normal approximation, plotted as densities.]
20 Multiple significance testing: The data

The data: the numbers of faults in 32 rolls of textile fabric (Bissell, 1972; see also Cox and Snell, 1981, and Gelman et al., 1995). [Figure: number of faults plotted against roll length in meters. The green line was fitted by "eye" to the "good" data points; the vertical green line segments are the (1/32) predictive intervals (PIs); the red dots, at (371, 14), (735, 17), (895, 28), and (905, 23), are the points lying outside the 1/32 PIs.]
21 Multiple significance testing: A statistical model

Data structure: the plot indicates that it is reasonable to assume that, for each i = 1, ..., 32, the number of faults in roll i follows a Poisson distribution with mean τ_i L_i, where τ_i is the known roll length. One could imagine that (1) most of the rolls were manufactured with the underlying product line under control, and (2) a small unknown number of rolls were manufactured with the product line out of control.

A statistical model: for i = 1, ..., 32, let X_i be the number of faults in roll i and assume that the X_i are independent with X_i ~ Poisson(τ_i L_i). Most of the L_i are the same and equal to a common unknown L_0, except for a small unknown number, denoted by K, of L_i that are different from (greater than) L_0. We refer to the L_i that are different from L_0 as outliers.
22 Multiple significance testing: a-random intervals

[Figure: the individual a-random intervals [V_{X_i}/τ_i, V_{X_i+1}/τ_i], with the outlier rates λ_{o1} and λ_{o2} lying above the common rate L_0.]

The non-conflict a-random interval for L_0:
    [Z_l, Z_u] = [the (K + 1)-th largest V_{X_i}/τ_i,  min_i V_{X_i+1}/τ_i].
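The rule above is deterministic given one joint draw of the interval endpoints. A sketch with made-up endpoint values, showing how allowing K outliers removes the conflict that a single out-of-control roll would otherwise cause:

```python
def l0_interval(lowers, uppers, k):
    """Return (Z_l, Z_u) and whether the a-random sets are non-conflicting."""
    z_l = sorted(lowers, reverse=True)[k]   # (K + 1)-th largest lower endpoint
    z_u = min(uppers)                       # smallest upper endpoint
    return z_l, z_u, z_l <= z_u

lowers = [1.0, 2.0, 10.0, 3.0]   # V_{X_i}/tau_i, with one clear outlier at 10.0
uppers = [4.0, 5.0, 12.0, 6.0]   # V_{X_i+1}/tau_i
print(l0_interval(lowers, uppers, 0))   # K = 0: conflict, the outlier dominates
print(l0_interval(lowers, uppers, 1))   # K = 1: non-conflicting interval for L_0
```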
23 Multiple significance testing: Inference about K

The basic idea: assume K ≤ K_max and compute (p, q, r)s for the sequence of assertions {K = k} for k = K_max, K_max − 1, ..., 1, i.e.,
    (p_k, q_k, r_k) = (P({K = k}), 0, 1 − P({K = k})),
assuming K ∈ {0, ..., k} for each k = 1, ..., K_max.

A frequency coverage problem: p_k → 1 as n → ∞, e.g., for K = 0 and k = 1 with K_max = 1.
24 Multiple significance testing: Inference about K, logit p_k

[Figure: plot of logit(p_k) for the observed data and of logit(p_k) for 10 simulated data sets without outliers.]
25 Multiple significance testing: Inference about K, ratios of minimum deviations

Let Z_u be the upper end point of the a-random interval for L_0, i.e., Z_u = min_i V_{X_i+1}/τ_i. Let V_l^(K) be the K-th largest of the n lower end points of the a-random intervals. [Figure: histograms of the ratios R = V_l^(K)/Z_u for K = 1, ..., 8.]
26 Multiple significance testing: Identification of outliers

Example: taking K = 5 leads to the posterior probabilities of {L_i > L_0} for i = 1, ..., n. [Figure: P(X_i is an outlier) plotted against X_i/τ_i.]
27 Multiple significance testing: An alternative DSM

DS model (with latent variables); this model is under investigation. The SSM for each i consists of six variables I, X, Y, Z, L, and M, where Y ~ Poisson(L), Z ~ Poisson(M) with M > M_0 > 0, I = 0 or 1, and X = Y if I = 0, X = Y + Z if I = 1. [Figure: panels (a) I = 0 and (b) I = 1; the small circles in the (Z, Y) plane represent the possible points at I = 0 and I = 1 when the observed X is fixed at 5.]
More informationAlgorithms for Uncertainty Quantification
Algorithms for Uncertainty Quantification Tobias Neckel, Ionuț-Gabriel Farcaș Lehrstuhl Informatik V Summer Semester 2017 Lecture 2: Repetition of probability theory and statistics Example: coin flip Example
More informationDempster-Shafer Theory and Statistical Inference with Weak Beliefs
Submitted to the Statistical Science Dempster-Shafer Theory and Statistical Inference with Weak Beliefs Ryan Martin, Jianchun Zhang, and Chuanhai Liu Indiana University-Purdue University Indianapolis and
More informationICML Scalable Bayesian Inference on Point processes. with Gaussian Processes. Yves-Laurent Kom Samo & Stephen Roberts
ICML 2015 Scalable Nonparametric Bayesian Inference on Point Processes with Gaussian Processes Machine Learning Research Group and Oxford-Man Institute University of Oxford July 8, 2015 Point Processes
More informationReasoning with Uncertainty
Reasoning with Uncertainty Representing Uncertainty Manfred Huber 2005 1 Reasoning with Uncertainty The goal of reasoning is usually to: Determine the state of the world Determine what actions to take
More informationUndirected Graphical Models
Undirected Graphical Models 1 Conditional Independence Graphs Let G = (V, E) be an undirected graph with vertex set V and edge set E, and let A, B, and C be subsets of vertices. We say that C separates
More informationProbability and Stochastic Processes
Probability and Stochastic Processes A Friendly Introduction Electrical and Computer Engineers Third Edition Roy D. Yates Rutgers, The State University of New Jersey David J. Goodman New York University
More informationBayesian Approach 2. CSC412 Probabilistic Learning & Reasoning
CSC412 Probabilistic Learning & Reasoning Lecture 12: Bayesian Parameter Estimation February 27, 2006 Sam Roweis Bayesian Approach 2 The Bayesian programme (after Rev. Thomas Bayes) treats all unnown quantities
More informationProbability. Paul Schrimpf. January 23, Definitions 2. 2 Properties 3
Probability Paul Schrimpf January 23, 2018 Contents 1 Definitions 2 2 Properties 3 3 Random variables 4 3.1 Discrete........................................... 4 3.2 Continuous.........................................
More informationFundamentals. CS 281A: Statistical Learning Theory. Yangqing Jia. August, Based on tutorial slides by Lester Mackey and Ariel Kleiner
Fundamentals CS 281A: Statistical Learning Theory Yangqing Jia Based on tutorial slides by Lester Mackey and Ariel Kleiner August, 2011 Outline 1 Probability 2 Statistics 3 Linear Algebra 4 Optimization
More informationUncertainty and Rules
Uncertainty and Rules We have already seen that expert systems can operate within the realm of uncertainty. There are several sources of uncertainty in rules: Uncertainty related to individual rules Uncertainty
More informationBasic Sampling Methods
Basic Sampling Methods Sargur Srihari srihari@cedar.buffalo.edu 1 1. Motivation Topics Intractability in ML How sampling can help 2. Ancestral Sampling Using BNs 3. Transforming a Uniform Distribution
More informationA union of Bayesian, frequentist and fiducial inferences by confidence distribution and artificial data sampling
A union of Bayesian, frequentist and fiducial inferences by confidence distribution and artificial data sampling Min-ge Xie Department of Statistics, Rutgers University Workshop on Higher-Order Asymptotics
More informationStatistics Boot Camp. Dr. Stephanie Lane Institute for Defense Analyses DATAWorks 2018
Statistics Boot Camp Dr. Stephanie Lane Institute for Defense Analyses DATAWorks 2018 March 21, 2018 Outline of boot camp Summarizing and simplifying data Point and interval estimation Foundations of statistical
More informationMATHEMATICS (MATH) Calendar
MATHEMATICS (MATH) This is a list of the Mathematics (MATH) courses available at KPU. For information about transfer of credit amongst institutions in B.C. and to see how individual courses transfer, go
More informationInfer relationships among three species: Outgroup:
Infer relationships among three species: Outgroup: Three possible trees (topologies): A C B A B C Model probability 1.0 Prior distribution Data (observations) probability 1.0 Posterior distribution Bayes
More informationarxiv:cs/ v2 [cs.ai] 29 Nov 2006
Belief Conditioning Rules arxiv:cs/0607005v2 [cs.ai] 29 Nov 2006 Florentin Smarandache Department of Mathematics, University of New Mexico, Gallup, NM 87301, U.S.A. smarand@unm.edu Jean Dezert ONERA, 29
More informationPart I. C. M. Bishop PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 8: GRAPHICAL MODELS
Part I C. M. Bishop PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 8: GRAPHICAL MODELS Probabilistic Graphical Models Graphical representation of a probabilistic model Each variable corresponds to a
More informationProbabilistic Graphical Models. Guest Lecture by Narges Razavian Machine Learning Class April
Probabilistic Graphical Models Guest Lecture by Narges Razavian Machine Learning Class April 14 2017 Today What is probabilistic graphical model and why it is useful? Bayesian Networks Basic Inference
More informationProf. Dr. Lars Schmidt-Thieme, L. B. Marinho, K. Buza Information Systems and Machine Learning Lab (ISMLL), University of Hildesheim, Germany, Course
Course on Bayesian Networks, winter term 2007 0/31 Bayesian Networks Bayesian Networks I. Bayesian Networks / 1. Probabilistic Independence and Separation in Graphs Prof. Dr. Lars Schmidt-Thieme, L. B.
More informationReview of Probabilities and Basic Statistics
Alex Smola Barnabas Poczos TA: Ina Fiterau 4 th year PhD student MLD Review of Probabilities and Basic Statistics 10-701 Recitations 1/25/2013 Recitation 1: Statistics Intro 1 Overview Introduction to
More information13 : Variational Inference: Loopy Belief Propagation and Mean Field
10-708: Probabilistic Graphical Models 10-708, Spring 2012 13 : Variational Inference: Loopy Belief Propagation and Mean Field Lecturer: Eric P. Xing Scribes: Peter Schulam and William Wang 1 Introduction
More informationIntroduction to Bayesian Statistics with WinBUGS Part 4 Priors and Hierarchical Models
Introduction to Bayesian Statistics with WinBUGS Part 4 Priors and Hierarchical Models Matthew S. Johnson New York ASA Chapter Workshop CUNY Graduate Center New York, NY hspace1in December 17, 2009 December
More informationStatistical Tools and Techniques for Solar Astronomers
Statistical Tools and Techniques for Solar Astronomers Alexander W Blocker Nathan Stein SolarStat 2012 Outline Outline 1 Introduction & Objectives 2 Statistical issues with astronomical data 3 Example:
More informationMidterm Examination. STA 215: Statistical Inference. Due Wednesday, 2006 Mar 8, 1:15 pm
Midterm Examination STA 215: Statistical Inference Due Wednesday, 2006 Mar 8, 1:15 pm This is an open-book take-home examination. You may work on it during any consecutive 24-hour period you like; please
More informationA Bayesian Method for Guessing the Extreme Values in a Data Set
A Bayesian Method for Guessing the Extreme Values in a Data Set Mingxi Wu University of Florida May, 2008 Mingxi Wu (University of Florida) May, 2008 1 / 74 Outline Problem Definition Example Applications
More informationLectures on Statistical Data Analysis
Lectures on Statistical Data Analysis London Postgraduate Lectures on Particle Physics; University of London MSci course PH4515 Glen Cowan Physics Department Royal Holloway, University of London g.cowan@rhul.ac.uk
More information