Uncertain Inference and Artificial Intelligence


1 March 3. Prepared for a Purdue Machine Learning Seminar.

2 Acknowledgement
Prof. A. P. Dempster, for intensive collaborations on the Dempster-Shafer theory.
Jianchun Zhang, Ryan Martin, Duncan Ermini Leaf, Zouyi Zhang, Huiping Xu, Jing-Shiang Hwang, Jun Xie, and Hyokun Yun, for collaborations on a variety of IM research projects.
NSF, for support of a joint project with Jun Xie on large-scale multinomial inference and its applications in genome-wide association studies.

3 References
Martin, R. and Liu, C. (2011) and the references therein. A possible textbook (Liu and Martin, 2012+, Reasoning with Uncertainty) having the following features:
- A prior-free and valid probabilistic inference system, which is promising for serious applications of statistics.
- Fully developed valid probabilistic inferential methods for textbook problems.
- A large collection of applications to modern, challenging, and large-scale statistical problems.
- Deeper understanding of existing schools of thought and their strengths and weaknesses.
- Satisfactory solutions to well-known benchmark problems, including Stein's paradox and the Behrens-Fisher problem.
- A direct attack on the source of uncertainty, which makes learning and teaching easier and more enjoyable.

4 Abstract
It is difficult, perhaps, to believe that artificial intelligence can be made intelligent enough without a valid probabilistic inferential system as a critical module. After a brief review of existing schools of thought on uncertain inference, we introduce a valid probabilistic inferential framework termed inferential models (IMs). With several simple and benchmark examples, we discuss potential applications of IMs in artificial intelligence in general and machine learning in particular.

5 What is it? An answer from the web
Artificial Intelligence (AI) is the area of computer science focusing on creating machines that can engage in behaviors that humans consider intelligent. The ability to create intelligent machines has intrigued humans since ancient times, and today, with the advent of the computer and 50 years of research into AI programming techniques, the dream of smart machines is becoming a reality. Researchers are creating systems that can mimic human thought, understand speech, beat the best human chess player, and accomplish countless other feats never before possible.

6 Is the answer precise? If not, blame Google's machine learning algorithms.

7 What is it? An answer from the web
Machine learning has been central to AI research from the beginning. Unsupervised learning is the ability to find patterns in a stream of input. Supervised learning includes both classification and numerical regression. Classification is used to determine what category something belongs in, after seeing a number of examples of things from several categories. Regression takes a set of numerical input/output examples and attempts to discover a continuous function that would generate the outputs from the inputs. In reinforcement learning the agent is rewarded for good responses and punished for bad ones. These can be analyzed in terms of decision theory, using concepts like utility. The mathematical analysis of machine learning algorithms and their performance is a branch of theoretical computer science known as computational learning theory.

8 The inference problem
Input:
1. Data: x, the observed value of an observable quantity X ∈ 𝒳.
2. Assertion: A, a statement about θ ∈ Θ, the unknown quantities.
3. Association between X and θ; for example, x is a sample from the population characterized by the cdf F_θ(·).
Output:
1. Probabilistic uncertainty assessments on the truth or falsity of A given X = x.
2. Plausible regions for θ and its functions.

9 Uncertain inference is critical to AI. No?

10 One (simple) kind of uncertain inference: probability models
A probability model has a meaningful/valid probability distribution assumed to be adequate for everything. In particular, θ has a valid marginal distribution that can be operated on via the usual probability calculus to derive valid, e.g., marginal and conditional posterior distributions.
Subjective Bayesian: philosophically, every Bayesian is subjective. Bayes was not Bayesian.
What's wrong? Nothing is wrong: you make the decision, and (you or your clients) should take the consequences.

11 Statistical models
In what follows, we consider the cases where you don't have valid distributions for everything, which we refer to as statistical models. Here θ is taken to be unknown.

12 Objective Bayesian: a personal view
The idea can be viewed as using "magic" priors to approximate (ideal) frequentist results. Remarks:
- Assertion-specific priors: certain priors can work for certain assertions on θ.
- Large-sample theory: it really concerns the case where uncertainty goes away; think about both normality and vanishing variances in very-high-dimensional problems.
- Robust Bayesian: worst-case-scenario thinking ultimately leads the Bayesian to a non-Bayesian school.

13 Existing schools of thought
- Bayes: for it to work, it really requires valid priors.
- Fiducial: it is very interesting. It is wrong (but better than Bayes[?]).
- Dempster-Shafer: as an extension of both Bayes and fiducial, it requires valid independent individual components that are probabilistically meaningful; for example, individual components specified with fiducial probabilities.
- Frequentist: starting from specified rules and criteria, it invites a guess-and-check approach to uncertain inference. If so, is it very appealing? For example, 24+ methods for 2×2 tables, and penalty-based methods.

14 Remarks
These existing methods are useful. Yet all these schools of thought fail for many benchmark examples, such as the many-normal-means, Behrens-Fisher, and constrained-parameter problems. Thinking outside the box may be necessary for new generations.

15 The likelihood insufficiency principle
Likelihood alone is not sufficient for probabilistic inference. An unobserved but predictable quantity, called the auxiliary (a-)variable, must be introduced for predictive/probabilistic inference.
Remark: Bayes makes θ predictable. Is it credible/valid?

16 The No Validity, No Probability principle?
Notation: denote by P_x(A) the probability for the truth of A given the observed data x.
Definition (validity). An inferential framework is said to be valid if, for every A ⊆ Θ, P_X(A), as a function of X, is stochastically no larger than Unif(0,1) under the falsity of A, i.e., under the truth of A^c, the negation of A.
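To see what the definition buys, here is a small Monte Carlo check (my own sketch, borrowing the N(θ, 1) example introduced two slides below, where pl_x(θ0) = 2(1 − Φ(|x − θ0|))): for the assertion A = {θ : θ ≠ θ0}, the evidence e_X(A) = 1 − pl_X(θ0) is exactly Unif(0,1) when A is false, i.e., when θ = θ0.

```python
# Monte Carlo check of validity for A = {theta != theta0} under the
# N(theta, 1) example of a later slide (my own illustration).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
theta0 = 0.0
X = rng.normal(theta0, 1.0, size=100_000)     # data generated with A false
pl = 2 * (1 - norm.cdf(np.abs(X - theta0)))   # pl_X(theta0)
e = 1 - pl                                    # e_X(A) for A = {theta != theta0}
print((e >= 0.95).mean())                     # ~0.05, as for a Unif(0,1) variable
```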

17 The Inferential Model (IM) framework
An IM is valid and consists of three steps:
- Association step: associate X and θ with an a-variable z ~ π_z to obtain the mapping Θ_X(z) ⊆ Θ consisting of the candidate values of θ given X and z.
- Prediction step: predict z with a credible predictive random set (PRS) S, i.e., P(S ∋ z) ~ Unif(0,1) for z ~ π_z.
- Combination step: combine x and S to obtain Θ_x(S) = ∪_{z ∈ S} Θ_x(z), and compute the evidence e_x(A) = P(Θ_x(S) ⊆ A) and e_x(A^c) = P(Θ_x(S) ⊆ A^c), with pl_x(A) = 1 − e_x(A^c) called the plausibility.

18 Example: X ~ N(θ, 1)
A-step: X = θ + z, where z ~ N(0,1).
P-step: S = [−|Z|, |Z|], where Z ~ N(0,1).
C-step: compute e_x(A) and e_x(A^c) from Θ_x(S) = [x − |Z|, x + |Z|].
Figure: plausibility of the assertion A = {θ : θ = θ0}, indexed by θ0, for the observed x. Note that e_x(θ0) = 0.
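Concretely (a minimal sketch of my own; the observed value x below is hypothetical, since the slide's value is not recoverable), the plausibility of the point assertion {θ0} works out to pl_x(θ0) = P(|Z| ≥ |x − θ0|) = 2(1 − Φ(|x − θ0|)):

```python
# Plausibility curve for the N(theta, 1) IM example (sketch; x is hypothetical).
import numpy as np
from scipy.stats import norm

def plausibility(theta0, x):
    """pl_x(theta0) = P(theta0 in [x - |Z|, x + |Z|]) = 2(1 - Phi(|x - theta0|))."""
    return 2 * (1 - norm.cdf(abs(x - theta0)))

x = 1.4                                   # hypothetical observed value
grid = np.linspace(x - 4, x + 4, 201)
pl = [plausibility(t, x) for t in grid]
print(plausibility(x, x))                 # the curve peaks at 1 when theta0 = x
```

The evidence e_x({θ0}), by contrast, is identically 0, because the random interval Θ_x(S) is never a single point.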

19 Example: X ~ Binomial(n, θ)
This is a homework problem for Stat 598D.

20 Efficiency
See the Stat 598D lecture notes on Statistical Inference. Let b(z) be a continuous function and define
S = {z : b(z) ≤ b(Z)}, where Z ~ π_z.
Then P(S ∋ z) ~ Unif(0,1) for z ~ π_z. We can use this result to construct credible PRSs.
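A quick simulation (my own check, taking b(z) = |z| and π_z = N(0,1), so that S reduces to the interval [−|Z|, |Z|] used earlier) confirms the uniformity claim:

```python
# Check that P(S ∋ z) ~ Unif(0,1) for S = {z : b(z) <= b(Z)} with b(z) = |z|.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
z = rng.normal(size=100_000)                 # z ~ pi_z = N(0, 1)
cover = 2 * (1 - norm.cdf(np.abs(z)))        # P(S ∋ z) = P(|Z| >= |z|)
print(np.histogram(cover, bins=10, range=(0, 1))[0] / len(z))  # all ~0.10
```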

21 Combining information: conditional IMs
Example (a textbook example). Consider the association model X_i = θ + z_i (z_i iid N(0,1), i = 1,…,n). Write X̄ = θ + z̄ and X_i − X̄ = z_i − z̄ (i = 1,…,n). Predict z̄ conditional on the observed a-quantities {z_i − z̄}_{i=1}^n. This leads to the simplified conditional IM:
A-step: X̄ = θ + u/√n, where u ~ N(0,1).
P-step: S = [−|U|, |U|], where U ~ N(0,1).
C-step: Θ_x(S) = [X̄ − |U|/√n, X̄ + |U|/√n].
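In this conditional IM the plausibility of a point θ0 reduces to 2(1 − Φ(√n |X̄ − θ0|)), so the 95% plausibility interval is X̄ ± z_{0.975}/√n. A small sketch (my own, with hypothetical data and known unit variance):

```python
# Conditional-IM plausibility interval for theta with known variance 1 (sketch).
import numpy as np
from scipy.stats import norm

def plausibility(theta0, x):
    n, xbar = len(x), np.mean(x)
    return 2 * (1 - norm.cdf(np.sqrt(n) * abs(xbar - theta0)))

x = np.array([1.2, 0.8, 1.5, 0.9, 1.1])      # hypothetical data
half = norm.ppf(0.975) / np.sqrt(len(x))     # half-width of the 95% interval
print(f"[{x.mean() - half:.3f}, {x.mean() + half:.3f}]")
```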

22 Efficient inference: marginal IMs
Example (another textbook example). Consider the association model X_i = η + σ z_i (z_i iid N(0,1), i = 1,…,n). Let θ = (η, σ²) ∈ Θ = R × R⁺ and write X̄ = η + σ z̄, s_x² = σ² s_z², and (X_i − X̄)/s_x = (z_i − z̄)/s_z. Predict z̄ and s_z² conditional on the observed a-quantities (z_i − z̄)/s_z. This leads to the simplified conditional IM:
A-step: X̄ = η + (s_x/√n) u and s_x² = σ² s_z², where u ~ t_{n−1} and s_z² ~ χ²_{n−1}.
P-step: S = [−|U|, |U|] × [0, ∞), where U ~ t_{n−1}.
C-step: Θ_x(S) = [X̄ − |U| s_x/√n, X̄ + |U| s_x/√n] × [0, ∞).
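The C-step above reproduces the familiar t interval for η, now read as a 95% plausibility interval. A sketch (my own, with hypothetical data):

```python
# Marginal-IM 95% plausibility interval for eta with unknown variance (sketch).
import numpy as np
from scipy.stats import t

x = np.array([5.1, 4.7, 5.6, 4.9, 5.3, 5.0])   # hypothetical data
n, xbar, sx = len(x), x.mean(), x.std(ddof=1)
u = t.ppf(0.975, df=n - 1)                     # 97.5% point of t_{n-1}
print(f"[{xbar - u * sx / np.sqrt(n):.3f}, {xbar + u * sx / np.sqrt(n):.3f}]")
```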

23 Model selection via AI (or by an AS, an Artificial Statistician)?
Consider choosing a model from a collection of models, including, e.g., a normal model for simplicity (and efficiency) and a non-parametric model for robustness. See Jianchun Zhang's PhD thesis for an IM-based method.

24 2×2 tables
Example (kidney stone treatment, Steven et al. (1994)).

Table 1. Small stones              Table 2. Large stones
Treatment  Success  Failure        Treatment  Success  Failure
A          81       6              A          192      71
B          234      36             B          55       25

For making an intelligent decision, there are (at least) two things to consider.
Prediction: condition on the stone type.
Estimation: combine data if possible. Thus, check the homogeneity of each of the two tables below.

Table 3. Treatment A               Table 4. Treatment B
Stone type  Success  Failure       Stone type  Success  Failure
Small       81       6             Small       234      36
Large       192      71            Large       55       25
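The counts above are the standard kidney-stone dataset (the entries missing from the scanned slide are assumed from that source). A short calculation (my own) shows the paradox the tables illustrate: treatment A has the higher success rate within each stone type, yet treatment B wins in the pooled table:

```python
# Success rates per stone type and pooled, showing Simpson's paradox:
# A beats B within each stone type, but B beats A after pooling.
counts = {("A", "small"): (81, 6),   ("A", "large"): (192, 71),
          ("B", "small"): (234, 36), ("B", "large"): (55, 25)}

for trt in ("A", "B"):
    parts, tot_s, tot_f = [], 0, 0
    for size in ("small", "large"):
        s, f = counts[(trt, size)]
        parts.append(f"{size}: {s / (s + f):.0%}")
        tot_s, tot_f = tot_s + s, tot_f + f
    print(trt, ", ".join(parts), f"pooled: {tot_s / (tot_s + tot_f):.0%}")
# A small: 93%, large: 73%, pooled: 78%;  B small: 87%, large: 69%, pooled: 83%
```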

25 Evidence for and against homogeneity of treatments
For each of Table 3 and Table 4, compute:
1. the evidence e(homogeneous),
2. the plausibility pl(homogeneous), and
3. a 95% plausibility interval for the odds ratio.
Remarks.
1. Simpson's paradox is related more to wrong statistical analysis, i.e., modeling, than to the inferential method(?). How can this be done in AI?
2. Some relevant statistical thoughts: increase the precision of prediction via conditioning, and increase the precision of estimation via pooling. Can some basics like these be integrated into AI?
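The IM plausibility interval for the odds ratio requires the machinery above; as a rough stand-in for readers without it, here is the familiar Wald interval for the log odds ratio of Table 3 (my own illustration, not the slides' method):

```python
# Wald interval for the log odds ratio of Table 3 (stand-in, not the IM interval).
import numpy as np
from scipy.stats import norm

a, b, c, d = 81, 6, 192, 71               # Table 3: (success, failure) by stone size
log_or = np.log((a * d) / (b * c))        # log odds ratio, small vs. large stones
se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
z = norm.ppf(0.975)
print(f"log OR = {log_or:.2f}, 95%: [{log_or - z * se:.2f}, {log_or + z * se:.2f}]")
# The interval excludes 0: homogeneity across stone types is implausible.
```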

26 Numerical results
Figure: plausibilities for the log odds ratios of Tables 3 and 4, showing that pooling makes no sense in this example.

27 Comparing two normal means with unknown variances
This is a common textbook, controversial, and practically useful example (Bayes and fiducial do not work well); see Martin, Hwang, and Liu (2010b).

28 Many-normal-means
The association model: X_i = µ_i + z_i (z_i iid N(0,1), i = 1,…,n). The problem of interest is to infer µ = (µ_1,…,µ_n). A very important example for understanding inference (Bayes and fiducial do not work); see Martin, Hwang, and Liu (2010b).

29 Many-normal-means
The usual model for the observables X_1,…,X_n:
µ_i iid N(θ, σ²) (i = 1,…,n) and X_i | µ ind N(µ_i, s_i²) (i = 1,…,n),
with known positive s_1²,…,s_n², where µ = (µ_1,…,µ_n) and (θ, σ²) ∈ R × R⁺ are unknown. Here we are interested in inference about σ². Since there is rarely meaningful prior knowledge in practice, there has been tremendous interest in choosing Bayesian priors.

30 Many-normal-means
The sampling model for the observable quantities is
X_i ind N(θ, σ² + s_i²) (i = 1,…,n).
For simplicity, to motivate ideas, consider the case with known θ = 0, that is,
X_i ind N(0, σ² + s_i²) (i = 1,…,n).
An association model is given by
Σ_{i=1}^n X_i²/(σ² + s_i²) = V
and
[Σ_{i=1}^n X_i²/(σ² + s_i²)]^{−1/2} (X_1/√(σ² + s_1²), …, X_n/√(σ² + s_n²)) = U,
where V ~ χ²_n and U ~ Unif(O_n).

31 Many-normal-means
Specify the predictive random set, which predicts v alone:
S = {(v, u) : |F_n(v) − 0.5| ≤ |F_n(V) − 0.5|},
where F_n is the χ²_n cdf and V ~ χ²_n. This is a constrained-parameter inference problem.
Remark. Validity is not a problem, but efficient inference is not straightforward. It requires considering generalized conditional IMs, a challenging topic under investigation!
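For the known-θ case, the PRS above uses only the χ²_n component V, and the plausibility of a candidate value σ0² comes out as pl(σ0²) = 1 − 2|F_n(v0) − 0.5|, where v0 = Σ x_i²/(σ0² + s_i²). A sketch (my own, with hypothetical data):

```python
# Plausibility of sigma^2 = sigma2_0 in the known-theta case (sketch).
import numpy as np
from scipy.stats import chi2

def plausibility_sigma2(sigma2_0, x, s2):
    v0 = np.sum(x ** 2 / (sigma2_0 + s2))   # observed value of V
    F = chi2.cdf(v0, df=len(x))             # F_n(v0)
    return 1 - 2 * abs(F - 0.5)

x = np.array([0.5, -1.2, 2.1, 0.3, -0.8])   # hypothetical observations
s2 = np.array([0.2, 0.5, 0.3, 0.4, 0.25])   # known sampling variances
for sig2 in (0.0, 0.5, 1.0, 2.0):           # note the constraint sigma^2 >= 0
    print(sig2, round(plausibility_sigma2(sig2, x, s2), 3))
```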
