Bayesian Statistics. State University of New York at Buffalo. From the SelectedWorks of Joseph Lucke. Joseph F. Lucke


2 Bayesian Statistics Joseph F. Lucke Research Institute for Addictions State University of New York at Buffalo Original Presentation: July 29, 2009 Latest Revision: October 19, 2011

3 What is Bayesian Analysis? Bayesian Statistical Theory (BST) is a radically different paradigm for statistics. BST is not just another class of statistical models, like structural equation models or multilevel models; BST can analyze any statistical model. Even though the numbers may be the same as in classical theory, the interpretation is different. Issues that trouble classical statistics (e.g., multiple comparisons, sample-size adequacy, interpretation of p-values) disappear in BST.

4 Why Consider Bayesian Analyses? Pragmatic reasons: solve more statistical problems; implement more realistic models; less concern with sample-size issues. Philosophical reasons: conceptual difficulties with classical statistics; closer integration of probability theory with statistical methods; a unified approach to statistics.

5 Interpretations of Probability Probability, and hence statistics, is not a monolithic discipline. Probability theory began in the mid-17th century and has always had ambiguous interpretations: Mathematical: not interpreted, a branch of measure theory. Objective: relative frequencies. Subjective: logic of opinion.

6 Historical Chronology Ambiguous: Pascal (1654), Bayes (1763). Subjective: Laplace (1814). Dormancy. Objective: Venn (1888), von Mises (1964). Mathematical: Kolmogorov (1933). Subjective: Ramsey (1926), de Finetti (1937).

7 Bayes? (1702-1761)

8 Laplace (1749-1827)

9 Ramsey (1903-1930)

10 de Finetti (1906-1985)

11 Current Schools of Statistics Classical: Neyman-Pearson, Wald, Lehmann; significance, power, decision between hypotheses. Likelihood: Fisher, Royall; accept/reject hypothesis, p-value. Bayesian: Jeffreys, DeGroot; prior and posterior distributions, Bayes factor.

12 Comparison of Bayes and Neyman-Pearson Theories

Feature              Bayes                Neyman-Pearson
Content              Beliefs              Acts
Unifying principle   Coherence            Inductive behavior
Probability          Subjective           Objective
Repeated events      Exchangeability      Independence
Data                 Fixed                Random
Parameters           Random               Fixed, unknown
Inference            Bayes's theorem      Estimation
Interval estimate    Fixed (credible)     Random (confidence)
Hypothesis testing   Bayes factor         Significance, power

13 Comments on Probability "Probability is only the expression of our ignorance of the true causes." (Laplace, 1814) "There are no such things as objective chances... Chances must be defined by degrees of belief." (Ramsey, 1931) "PROBABILITY DOES NOT EXIST!" (de Finetti, 1972)

16 Subjective Probability Probability is the logic of judgment or opinion. Your opinion can be represented as a set of subjectively fair bets on an event. Events may be unique; no repetition is required. Coherence principle: avoid sets of bets that entail a guaranteed loss. Ramsey-de Finetti Theorem (1931): coherent judgments must satisfy the probability axioms.

17 Coherence Suppose I post odds against each Republican candidate for the 2012 presidential nomination and am willing to take bets at those odds.

Odds against: Romney 1:1, Perry 2:1, Cain 8:1.

An opponent stakes $9 on Romney, $6 on Perry, and $2 on Cain. Payoffs to the opponent ($):

Outcome        Romney bet   Perry bet   Cain bet   Net
Romney wins    +9           -6          -2         +1
Perry wins     -9           +12         -2         +1
Cain wins      -9           -6          +16        +1

No matter what the outcome, I lose $1: my judgments are incoherent. The corresponding probabilities sum to 1/2 + 1/3 + 1/9 = 17/18 < 1.
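The Dutch book on this slide can be verified in a few lines. A minimal sketch (candidate names, odds, and stakes from the slide; the sign convention assumes the payoff table shows the opponent's winnings):

```python
from fractions import Fraction

# Implied probabilities from the quoted odds against each candidate:
# 1:1 -> 1/2, 2:1 -> 1/3, 8:1 -> 1/9; they sum to 17/18 < 1.
implied = {"Romney": Fraction(1, 2), "Perry": Fraction(1, 3), "Cain": Fraction(1, 9)}
print(sum(implied.values()))          # 17/18

# The opponent stakes $9 on Romney (1:1), $6 on Perry (2:1), $2 on Cain (8:1).
stakes = {"Romney": 9, "Perry": 6, "Cain": 2}
odds = {"Romney": 1, "Perry": 2, "Cain": 8}

# My payoff for each possible winner: I pay odds * stake on the winning
# bet and collect the stakes of the two losing bets.
for winner in stakes:
    my_payoff = -odds[winner] * stakes[winner] + sum(
        s for c, s in stakes.items() if c != winner
    )
    print(winner, my_payoff)          # -1 in every case: a sure loss
```

Because the implied probabilities sum to less than 1, the opponent can always choose stakes that lock in my loss, which is exactly what the coherence principle rules out.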

26 Exchangeability: Finite case The concept of exchangeability is the subjectivist's equivalent of random sampling. Given an urn of 10 (in general N) variously colored balls (events), the collection is exchangeable if any sample of size n < 10 (n < N) has the same distribution as any other sample of size n. The drawing of any single ball has the same information as the drawing of any other single ball; the drawing of any pair of balls has the same information as the drawing of any other pair; and the drawing of any n-tuple of balls has the same information as the drawing of any other n-tuple.

27 Exchangeability: Infinite case An infinite collection of events is infinitely exchangeable if any finite sample, no matter how large, is exchangeable. De Finetti's Representation Theorem (1937): if a (potentially) infinite collection is infinitely exchangeable, then the collection can be modeled as if it consisted of independent events conditional on some parameter. Infinite exchangeability can be approximated by (finite) exchangeability.

28 Exchangeability: Examples If I flip a coin, consider each set of flips as equally informative as any other set of flips, and am willing to consider any number of flips, no matter how large, then the flips of the coin can be represented as if they were independent given the coin's tendency to come up heads. Arbuthnot (1710), in 82 years (1629 to 1710) of annual birth records, observed 484,382 male births out of 938,223 total births. If this sample is considered infinitely exchangeable, then the births can be considered independent with a male birth rate of 51.63%.
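Arbuthnot's figure can be checked directly; under exchangeability the births behave like independent Bernoulli trials with a common rate, estimated by the observed proportion:

```python
# 82 years of London birth records (Arbuthnot, 1710): the observed
# proportion estimates the common male-birth rate.
males, total = 484_382, 938_223
rate = males / total
print(f"{rate:.4f}")   # 0.5163, i.e. 51.63%
```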

29 Implications for Statistics 1. Use all of probability theory. 2. Parameters can be considered uncertain, with a (subjective) probability distribution. 3. Parameter uncertainty is reduced by observations via Bayes's Theorem.

30 Bayes's Theorem
1. Let θ be a parameter controlling a model.
2. Let Pr(θ) be the prior probability of θ.
3. Let x be an observation.
4. Let Pr(x | θ) be the data-generating mechanism describing how the observations probabilistically arise from the model controlled by the parameter.
5. Let Pr(x) be the prior predictive distribution.
6. Let Pr(θ | x) be the posterior distribution of the parameter given the data x.
7. Then Bayes's Theorem is: Pr(θ | x) = Pr(x | θ) Pr(θ) / Pr(x).

31 Bayes's Theorem is Trivial Bayes's Theorem is a trivial theorem in modern probability theory. Proof: Pr(θ | x) Pr(x) = Pr(θ, x) = Pr(x | θ) Pr(θ). Interpretation: the probability of two events happening is the probability of the first event times the probability of the second event given that the first has happened. Whether or not you can use Bayes's Theorem for statistical inference is determined by your interpretation of probability as a logic of opinion or as relative frequency.

32 Bayes's Theorem, Example
1. Assume there are two urns, each with 100 balls.
2. Urn 1 has 90 black balls and 10 white.
3. Urn 2 has 30 black balls and 70 white.
4. So Pr(black ball) = θ, with θ = .90 (Urn 1) or θ = .30 (Urn 2).
5. Let the prior probability of θ be Pr(θ = .90) = .1 and Pr(θ = .30) = .9.
6. Assume we draw a black ball.
7. What is the probability that the ball came from Urn 1 (θ = .9) versus Urn 2 (θ = .3)?

33 Bayes's Theorem, Example continued
1. Let x = 1 mean the ball is black and x = 0 mean the ball is white.
2. The data-generating mechanism is Bernoulli: θ^x (1 − θ)^(1−x).
3. The prior predictive probability is Pr(x = 1) = .36.
4. Then
Pr(θ = .9 | x = 1) = Pr(x = 1 | θ = .9) Pr(θ = .9) / Pr(x = 1) = .9 × .1 / .36 = .25
Pr(θ = .3 | x = 1) = Pr(x = 1 | θ = .3) Pr(θ = .3) / Pr(x = 1) = .3 × .9 / .36 = .75.

34 Predictive Distributions Bayesian analysis uses predictive probabilities for observations. A predictive probability of an outcome is its probability weighted by the prior probabilities of the models generating the outcome. The prior predictive probability for x = 1 is
Pr(x = 1) = Pr(x = 1 | θ = .9) Pr(θ = .9) + Pr(x = 1 | θ = .3) Pr(θ = .3) = .9 × .1 + .3 × .9 = .36.

35 Predictive Distributions, continued More important than the prior predictive probability is the posterior predictive probability of a new, not-yet-observed outcome. The posterior predictive probability of a new outcome is its probability weighted by the posterior probabilities of the models generating the outcome. The posterior predictive probability for a new x̂ = 1 is
Pr(x̂ = 1) = Pr(x̂ = 1 | θ = .9) Pr(θ = .9 | x = 1) + Pr(x̂ = 1 | θ = .3) Pr(θ = .3 | x = 1) = .9 × .25 + .3 × .75 = .45.
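The whole urn example, prior predictive, posterior, and posterior predictive, fits in a few lines. A sketch using the numbers from the slides:

```python
# Two-urn example: theta = Pr(black) is .9 (Urn 1) or .3 (Urn 2),
# with prior Pr(theta = .9) = .1 and Pr(theta = .3) = .9.
prior = {0.9: 0.1, 0.3: 0.9}

# Prior predictive probability of drawing a black ball (x = 1): 0.36.
prior_pred = sum(theta * p for theta, p in prior.items())

# Posterior over theta after observing a black ball (Bayes's theorem):
# {0.9: 0.25, 0.3: 0.75}.
posterior = {theta: theta * p / prior_pred for theta, p in prior.items()}

# Posterior predictive probability that the next draw is black: 0.45.
post_pred = sum(theta * p for theta, p in posterior.items())

print(round(prior_pred, 2), round(posterior[0.9], 2), round(post_pred, 2))
```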

36 Sources of Priors: 1 of 4. One of the biggest problems in using Bayesian statistics is the requirement of a prior probability for the parameter(s). Personal: what you believe in your heart of hearts; often requires a tedious process of elicitation; usually difficult to incorporate into a statistical model. Expert: determined by a knowledgeable expert; my priors are the expert's priors.

37 Sources of Priors: 2 of 4. Scientific consensus: reflects the beliefs of the scientific community; tempered to allow some credibility for alternative hypotheses. Adversarial priors: choose a prior favoring the adversary's position; multiple priors; used in sensitivity analyses.

38 Sources of Priors: 3 of 4. Previous data: previous results can be summarized as a prior for the current study; prior studies are often heterogeneous and only partially relevant; prior studies do not provide summaries as posterior distributions; use discounted priors based on the posteriors of previous results. Theory: determined by scientific theory; possibly only partially determined.

39 Sources of Priors: 4 of 4. Conjugate: technically simple but flexible; do not require substantial computation; the posterior distribution is in the same class as the prior. Reference: minimal influence on the data; try to express ignorance regarding the parameters; usually conjugate priors.
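The conjugate case can be made concrete with the beta-binomial pair: a beta prior plus binomial data yields a beta posterior by simple parameter arithmetic, with no integration. A sketch (the counts are hypothetical, for illustration only):

```python
def beta_update(a, b, successes, n):
    """Posterior beta parameters from a beta(a, b) prior + binomial data."""
    return a + successes, b + n - successes

# Flat reference-style prior beta(1, 1), then 7 successes in 10 trials
# (hypothetical counts).
print(beta_update(1, 1, 7, 10))   # (8, 4)
```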

40 Swamping of Priors Dogmatic prior: little or no uncertainty; all prior probability concentrated on a very small interval or a single point. Open-minded prior: a moderate or large amount of uncertainty; probability not concentrated. Sufficient data will swamp an open-minded prior. Bayesian Central Limit Theorem: in most cases, with sufficient data, the posterior distribution of a parameter will be approximately normal.

41 Posterior Analyses The posterior distribution contains all the information regarding the impact of the observed data on the model parameters. Analyses and summaries of the posterior are used to convey the impact of the data on the model parameters.

42 Posterior Analyses: Location The first summary usually refers to location: the posterior mean, median, or mode.

43 Posterior Analyses: Spread Extremely important is a measure of uncertainty regarding the parameter value, usually supplied by the 95% (or other) credible interval. The credible interval is fixed and the parameter is random; in contrast, a confidence interval is random and the parameter is fixed. A (random) parameter has probability .95 of falling within a (fixed) 95% credible interval. A (random) 95% confidence interval has probability .95 of covering the (fixed) unknown parameter.

44 Bayesian Analysis: Binary Outcome Arbuthnot (1710) investigated the birth rates of males and females in London. He was particularly concerned whether the male birth rate exceeded .5. Here we will conduct a Bayesian analysis. Let θ denote the male birth rate; note that 0 ≤ θ ≤ 1. Let the prior for θ be beta(1, 1). This means all birth rates are equally likely a priori. Not realistic, but it replicates the classical analysis.

45 Binary Outcomes, cont'd

Stage     Males   Females   E(θ)   95% BCI        Pr(θ > .50)
prior     (1)     (1)       .500   (.025, .975)   .50
1 day                              (.339, .694)   .57
2 days                             (.386, .649)   .61
1 week                             (.455, .596)   .77
1 month                            (.491, .562)   .93
1 year                             (.517, .537)   1.00

46 [Figure: posterior densities for the proportion of male births at the prior, 1 day, 1 week, and 1 month stages. x-axis: proportion male births.]

47 Bayesian Analysis: Comments Sequential sampling: because the observations were considered exchangeable, Arbuthnot could update his priors from day to day, using the previous day's posterior as the subsequent day's prior. Stopping rule: he could stop sampling whenever he wanted. Multiple inferences: the inferences are based solely on the observations obtained; no correction is needed for having made multiple inferences.
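The sequential-sampling point can be sketched with the conjugate beta update: each day's posterior becomes the next day's prior, and the result matches a single batch update of the totals, which is why the stopping rule does not affect the inference. The daily counts below are hypothetical:

```python
# Hypothetical daily (male, female) birth counts, for illustration only.
days = [(16, 14), (15, 13), (18, 15)]

a, b = 1, 1                      # flat beta(1, 1) prior
for males, females in days:      # yesterday's posterior = today's prior
    a, b = a + males, b + females
print(a, b)                      # 50 43

# The same posterior results from one batch update of the totals.
total_m = sum(m for m, _ in days)
total_f = sum(f for _, f in days)
print((a, b) == (1 + total_m, 1 + total_f))   # True
```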

48 Binary Outcomes, Informative Prior Now consider an informative prior, beta(50, 50).

Stage     Males   Females   E(θ)   95% BCI        Pr(θ > .50)
prior     (50)    (50)      .500   (.403, .600)   .50
1 day                              (.417, .590)   .54
2 days                             (.427, .586)   .56
1 week                             (.460, .575)   .72
1 month                            (.490, .557)   .92
1 year                             (.517, .537)   1.00

49 [Figure: posterior densities for the proportion of male births under the informative prior at the prior, 1 day, 1 week, and 1 month stages. x-axis: proportion male births.]

50 Classical Hypothesis Testing: PTCA vs Stent RCT of percutaneous transluminal coronary angioplasty (PTCA) versus provisional stenting (Stent) for increasing survival (Savage, 1997). Expected survival propensity for PTCA is .70. Want the Stent condition to increase survival to at least .75. δ = .05, α = .05, 1 − β = .80. Required sample size: 986 per group.
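The classical sample-size figure can be approximated with the usual normal-approximation formula for comparing two proportions (my reconstruction; the slide does not show the formula it used):

```python
from statistics import NormalDist

# Normal-approximation sample size per group for a one-sided test of
# p1 = .70 vs p2 = .75 with alpha = .05 and power = .80.
z = NormalDist().inv_cdf
alpha, power = 0.05, 0.80
p1, p2 = 0.70, 0.75
delta = p2 - p1

n = (z(1 - alpha) + z(power)) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / delta ** 2
print(round(n))   # about 983; close to the slide's 986, which presumably
                  # used a slightly different variance convention
```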

51 Data Group sample sizes and survival proportions for PTCA and Stent. Classical test of proportions: χ²(1) = 0.80, p = .19, one-sided. The actual sample size is only 11% of the required size. What can we conclude? No difference? Unable to detect any difference? Maybe the study should not have been conducted in the first place.

52 Bayesian Hypothesis Testing: PTCA vs Stent Same RCT of percutaneous transluminal coronary angioplasty (PTCA) versus provisional stenting (Stent) for increasing survival (Savage, 1997). Encode my information prior to observing the data: no difference between groups; expected survival propensity for either group is .70; 95% BCI = (.40, .93) for each group.

53 [Figure: prior density for both groups; mean = .70, 95% BCI = (.40, .93). x-axis: survival propensity.]

54 Data Group sample sizes and survival proportions for PTCA and Stent. Pr(Surv | Stent) = .82, 95% BCI = (.75, .89). Pr(Surv | PTCA) = .77, 95% BCI = (.69, .84).

55 [Figure: posterior densities. Stent: mean .82, BCI (.75, .89); PTCA: mean .77, BCI (.69, .84). x-axis: survival propensity.]

56 Bayesian Hypothesis Testing: Difference between PTCA and Stent We are interested in the difference between the PTCA and Stent effects. Mathematical fact: the difference between two beta distributions is approximately normal. If more precision is required, one can always simulate the distribution.
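The simulation route mentioned here is easy to sketch. The beta parameters below are hypothetical stand-ins chosen only to roughly match the reported posterior means (.82 and .77); the study's actual counts are not shown on the slides:

```python
import random

random.seed(1)
n_draws = 100_000

# Hypothetical beta posteriors for the two survival propensities.
stent = [random.betavariate(82, 18) for _ in range(n_draws)]
ptca = [random.betavariate(77, 23) for _ in range(n_draws)]

# Monte Carlo posterior for the difference delta = Stent - PTCA.
delta = sorted(s - p for s, p in zip(stent, ptca))
mean = sum(delta) / n_draws
lo, hi = delta[int(0.025 * n_draws)], delta[int(0.975 * n_draws)]
print(f"mean {mean:.3f}, 95% BCI ({lo:.3f}, {hi:.3f})")
```

With larger (real) sample sizes the simulated interval would tighten toward the slide's reported (-.05, .15).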

57 [Figure: prior for the difference between PTCA and Stent; prior mean = .00, prior 95% BCI = (-.38, .38). x-axis: survival difference.]

58 [Figure: prior and posterior for the difference between PTCA and Stent. Posterior mean = .05, posterior 95% BCI = (-.05, .15); prior mean = .00, prior 95% BCI = (-.38, .38). x-axis: survival difference.]

59 Bayesian Hypothesis Testing 1 Let δ denote the difference between the Stent and PTCA propensities of survival. Consider: Stent superiority: δ > 0. Stent inferiority: δ ≤ 0.

60 [Figure: number line for δ partitioned into Inferior, Equivocal, and Superior regions.]

61 [Figure: posterior for the difference between PTCA and Stent, with Pr(inferior) = .15 and Pr(superior) = .85. x-axis: survival difference.]

62 Bayesian Hypothesis Testing 2 Let δ denote the difference between the Stent and PTCA propensities of survival. Consider:
Stent superiority: .05 < δ
Stent-PTCA equivalence: -.05 ≤ δ ≤ .05
Stent inferiority: δ < -.05
Stent non-inferiority: δ ≥ -.05
Stent non-superiority: δ ≤ .05
Equivocal results: -.10 ≤ δ ≤ .10
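Given the approximately normal posterior for δ shown earlier, the probability of each decision region reduces to normal CDF calls. The mean .05 comes from the slides; the standard deviation .051 is backed out from the reported 95% BCI of roughly (-.05, .15), so treat it as an assumption:

```python
from statistics import NormalDist

# Normal approximation to the posterior of delta = Stent - PTCA.
post = NormalDist(mu=0.05, sigma=0.051)   # sigma backed out from the BCI

p_inferior = post.cdf(-0.05)                   # Pr(delta < -.05)
p_equiv = post.cdf(0.05) - post.cdf(-0.05)     # Pr(-.05 <= delta <= .05)
p_superior = 1 - post.cdf(0.05)                # Pr(delta > .05)

# These land close to the probabilities reported on the slides.
print(round(p_inferior, 2), round(p_equiv, 2), round(p_superior, 2))
```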

63 [Figure: number line for δ partitioned into Not Superior, Inferior, Equivocal, Equal, Superior, and Not Inferior regions.]

64 [Figure: posterior for the survival difference with Pr(equivalent) = .45, Pr(inferior) = .02, and Pr(superior) = .53. x-axis: survival difference.]

65 Modern Bayesian Analysis Before 1980, Bayesian analyses were severely limited to a few classes of models; the computational complexity of posterior distributions precluded any serious analyses. Bayesian textbooks would start with the theory of Bayesian analysis but end with the practice of classical analysis. After 1980, Bayesian analyses are no longer so limited. A procedure called Markov chain Monte Carlo (MCMC) has liberated Bayesian analysis from its computational complexity. MCMC is implemented in the free software WinBUGS and OpenBUGS.

66 How does MCMC work? MCMC is a simulation technique. It takes each parameter of a model in turn, fixing the data (which are always the same) and the remaining parameters, and generates a sampling distribution for that parameter. It then moves to the next parameter and generates a sampling distribution for that one, and so on through all the parameters. Then it starts over with the first updated parameter and repeats the process. After a few thousand iterations, the distributions of all the parameters converge. From these distributions, summaries of the parameters can be obtained.
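The parameter-by-parameter cycling described above is the Gibbs scheme used by BUGS; the core accept/reject mechanic is easiest to see in a one-parameter random-walk Metropolis sampler. A minimal sketch for a binomial proportion with a flat prior and hypothetical data (7 successes in 10 trials):

```python
import math
import random

# Minimal random-walk Metropolis sampler for the posterior of a binomial
# proportion theta under a flat prior, given hypothetical data.
random.seed(0)
y, n = 7, 10                                     # hypothetical counts

def log_post(theta):
    """Log posterior (up to a constant): binomial likelihood, flat prior."""
    if not 0 < theta < 1:
        return -math.inf                         # outside the support
    return y * math.log(theta) + (n - y) * math.log(1 - theta)

theta = 0.5
draws = []
for _ in range(20_000):
    prop = theta + random.gauss(0, 0.1)          # propose a nearby value
    if math.log(random.random()) < log_post(prop) - log_post(theta):
        theta = prop                             # accept the proposal
    draws.append(theta)                          # else keep current value

burned = draws[2_000:]                           # discard burn-in
print(sum(burned) / len(burned))                 # near the exact mean 8/12
```

The draws approximate the beta(8, 4) posterior that conjugacy gives exactly, which makes this toy case a useful correctness check for the sampler.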

67 Thanks to Michael West (2004). ~mw/abs04/lecture_slides/4.stats_regression.pdf


More information

Bayesian Estimation An Informal Introduction

Bayesian Estimation An Informal Introduction Mary Parker, Bayesian Estimation An Informal Introduction page 1 of 8 Bayesian Estimation An Informal Introduction Example: I take a coin out of my pocket and I want to estimate the probability of heads

More information

Origins of Probability Theory

Origins of Probability Theory 1 16.584: INTRODUCTION Theory and Tools of Probability required to analyze and design systems subject to uncertain outcomes/unpredictability/randomness. Such systems more generally referred to as Experiments.

More information

Introduction to Bayesian Inference

Introduction to Bayesian Inference Introduction to Bayesian Inference p. 1/2 Introduction to Bayesian Inference September 15th, 2010 Reading: Hoff Chapter 1-2 Introduction to Bayesian Inference p. 2/2 Probability: Measurement of Uncertainty

More information

the unification of statistics its uses in practice and its role in Objective Bayesian Analysis:

the unification of statistics its uses in practice and its role in Objective Bayesian Analysis: Objective Bayesian Analysis: its uses in practice and its role in the unification of statistics James O. Berger Duke University and the Statistical and Applied Mathematical Sciences Institute Allen T.

More information

Why Try Bayesian Methods? (Lecture 5)

Why Try Bayesian Methods? (Lecture 5) Why Try Bayesian Methods? (Lecture 5) Tom Loredo Dept. of Astronomy, Cornell University http://www.astro.cornell.edu/staff/loredo/bayes/ p.1/28 Today s Lecture Problems you avoid Ambiguity in what is random

More information

CS 361: Probability & Statistics

CS 361: Probability & Statistics March 14, 2018 CS 361: Probability & Statistics Inference The prior From Bayes rule, we know that we can express our function of interest as Likelihood Prior Posterior The right hand side contains the

More information

Randomized Algorithms

Randomized Algorithms Randomized Algorithms Prof. Tapio Elomaa tapio.elomaa@tut.fi Course Basics A new 4 credit unit course Part of Theoretical Computer Science courses at the Department of Mathematics There will be 4 hours

More information

The Laplace Rule of Succession Under A General Prior

The Laplace Rule of Succession Under A General Prior 1 The Laplace Rule of Succession Under A General Prior Kalyan Raman University of Michigan in Flint School of Management Flint, MI 48502 May 2000 ------------------------------------------------------------------------------------------------

More information

Statistics of Small Signals

Statistics of Small Signals Statistics of Small Signals Gary Feldman Harvard University NEPPSR August 17, 2005 Statistics of Small Signals In 1998, Bob Cousins and I were working on the NOMAD neutrino oscillation experiment and we

More information

Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com

Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com 1 School of Oriental and African Studies September 2015 Department of Economics Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com Gujarati D. Basic Econometrics, Appendix

More information

A.I. in health informatics lecture 2 clinical reasoning & probabilistic inference, I. kevin small & byron wallace

A.I. in health informatics lecture 2 clinical reasoning & probabilistic inference, I. kevin small & byron wallace A.I. in health informatics lecture 2 clinical reasoning & probabilistic inference, I kevin small & byron wallace today a review of probability random variables, maximum likelihood, etc. crucial for clinical

More information

2011 Pearson Education, Inc

2011 Pearson Education, Inc Statistics for Business and Economics Chapter 3 Probability Contents 1. Events, Sample Spaces, and Probability 2. Unions and Intersections 3. Complementary Events 4. The Additive Rule and Mutually Exclusive

More information

The Bayesian Paradigm

The Bayesian Paradigm Stat 200 The Bayesian Paradigm Friday March 2nd The Bayesian Paradigm can be seen in some ways as an extra step in the modelling world just as parametric modelling is. We have seen how we could use probabilistic

More information

Outline. Binomial, Multinomial, Normal, Beta, Dirichlet. Posterior mean, MAP, credible interval, posterior distribution

Outline. Binomial, Multinomial, Normal, Beta, Dirichlet. Posterior mean, MAP, credible interval, posterior distribution Outline A short review on Bayesian analysis. Binomial, Multinomial, Normal, Beta, Dirichlet Posterior mean, MAP, credible interval, posterior distribution Gibbs sampling Revisit the Gaussian mixture model

More information

3.2 Intoduction to probability 3.3 Probability rules. Sections 3.2 and 3.3. Elementary Statistics for the Biological and Life Sciences (Stat 205)

3.2 Intoduction to probability 3.3 Probability rules. Sections 3.2 and 3.3. Elementary Statistics for the Biological and Life Sciences (Stat 205) 3.2 Intoduction to probability Sections 3.2 and 3.3 Elementary Statistics for the Biological and Life Sciences (Stat 205) 1 / 47 Probability 3.2 Intoduction to probability The probability of an event E

More information

Inference for a Population Proportion

Inference for a Population Proportion Al Nosedal. University of Toronto. November 11, 2015 Statistical inference is drawing conclusions about an entire population based on data in a sample drawn from that population. From both frequentist

More information

HST.582J / 6.555J / J Biomedical Signal and Image Processing Spring 2007

HST.582J / 6.555J / J Biomedical Signal and Image Processing Spring 2007 MIT OpenCourseWare http://ocw.mit.edu HST.582J / 6.555J / 16.456J Biomedical Signal and Image Processing Spring 2007 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.

More information

PART I INTRODUCTION The meaning of probability Basic definitions for frequentist statistics and Bayesian inference Bayesian inference Combinatorics

PART I INTRODUCTION The meaning of probability Basic definitions for frequentist statistics and Bayesian inference Bayesian inference Combinatorics Table of Preface page xi PART I INTRODUCTION 1 1 The meaning of probability 3 1.1 Classical definition of probability 3 1.2 Statistical definition of probability 9 1.3 Bayesian understanding of probability

More information

A Problem Involving Games. Paccioli s Solution. Problems for Paccioli: Small Samples. n / (n + m) m / (n + m)

A Problem Involving Games. Paccioli s Solution. Problems for Paccioli: Small Samples. n / (n + m) m / (n + m) Class #10: Introduction to Probability Theory Artificial Intelligence (CS 452/552): M. Allen, 27 Sept. 17 A Problem Involving Games } Two players put money in on a game of chance } First one to certain

More information

Bayesian Inference and MCMC

Bayesian Inference and MCMC Bayesian Inference and MCMC Aryan Arbabi Partly based on MCMC slides from CSC412 Fall 2018 1 / 18 Bayesian Inference - Motivation Consider we have a data set D = {x 1,..., x n }. E.g each x i can be the

More information

Bayesian Inference. STA 121: Regression Analysis Artin Armagan

Bayesian Inference. STA 121: Regression Analysis Artin Armagan Bayesian Inference STA 121: Regression Analysis Artin Armagan Bayes Rule...s! Reverend Thomas Bayes Posterior Prior p(θ y) = p(y θ)p(θ)/p(y) Likelihood - Sampling Distribution Normalizing Constant: p(y

More information

Probability Review Lecturer: Ji Liu Thank Jerry Zhu for sharing his slides

Probability Review Lecturer: Ji Liu Thank Jerry Zhu for sharing his slides Probability Review Lecturer: Ji Liu Thank Jerry Zhu for sharing his slides slide 1 Inference with Bayes rule: Example In a bag there are two envelopes one has a red ball (worth $100) and a black ball one

More information

1. Introduction and non-bayesian inference

1. Introduction and non-bayesian inference 1. Introduction and non-bayesian inference Objective Introduce the different objective and subjective interpretations of probability. Examine the various non-bayesian treatments of statistical inference

More information

1 A simple example. A short introduction to Bayesian statistics, part I Math 217 Probability and Statistics Prof. D.

1 A simple example. A short introduction to Bayesian statistics, part I Math 217 Probability and Statistics Prof. D. probabilities, we ll use Bayes formula. We can easily compute the reverse probabilities A short introduction to Bayesian statistics, part I Math 17 Probability and Statistics Prof. D. Joyce, Fall 014 I

More information

Computational Cognitive Science

Computational Cognitive Science Computational Cognitive Science Lecture 9: Bayesian Estimation Chris Lucas (Slides adapted from Frank Keller s) School of Informatics University of Edinburgh clucas2@inf.ed.ac.uk 17 October, 2017 1 / 28

More information

My talk concerns estimating a fixed but unknown, continuously valued parameter, linked to data by a statistical model. I focus on contrasting

My talk concerns estimating a fixed but unknown, continuously valued parameter, linked to data by a statistical model. I focus on contrasting 1 My talk concerns estimating a fixed but unknown, continuously valued parameter, linked to data by a statistical model. I focus on contrasting Subjective and Objective Bayesian parameter estimation methods

More information

De Finetti s ultimate failure. Krzysztof Burdzy University of Washington

De Finetti s ultimate failure. Krzysztof Burdzy University of Washington De Finetti s ultimate failure Krzysztof Burdzy University of Washington Does philosophy matter? Global temperatures will rise by 1 degree in 20 years with probability 80%. Reading suggestions Probability

More information

ORF 245 Fundamentals of Statistics Chapter 9 Hypothesis Testing

ORF 245 Fundamentals of Statistics Chapter 9 Hypothesis Testing ORF 245 Fundamentals of Statistics Chapter 9 Hypothesis Testing Robert Vanderbei Fall 2014 Slides last edited on November 24, 2014 http://www.princeton.edu/ rvdb Coin Tossing Example Consider two coins.

More information

Bayesian Methods for Machine Learning

Bayesian Methods for Machine Learning Bayesian Methods for Machine Learning CS 584: Big Data Analytics Material adapted from Radford Neal s tutorial (http://ftp.cs.utoronto.ca/pub/radford/bayes-tut.pdf), Zoubin Ghahramni (http://hunch.net/~coms-4771/zoubin_ghahramani_bayesian_learning.pdf),

More information

Testing Simple Hypotheses R.L. Wolpert Institute of Statistics and Decision Sciences Duke University, Box Durham, NC 27708, USA

Testing Simple Hypotheses R.L. Wolpert Institute of Statistics and Decision Sciences Duke University, Box Durham, NC 27708, USA Testing Simple Hypotheses R.L. Wolpert Institute of Statistics and Decision Sciences Duke University, Box 90251 Durham, NC 27708, USA Summary: Pre-experimental Frequentist error probabilities do not summarize

More information

Bayesian Regression Linear and Logistic Regression

Bayesian Regression Linear and Logistic Regression When we want more than point estimates Bayesian Regression Linear and Logistic Regression Nicole Beckage Ordinary Least Squares Regression and Lasso Regression return only point estimates But what if we

More information

Basic notions of probability theory

Basic notions of probability theory Basic notions of probability theory Contents o Boolean Logic o Definitions of probability o Probability laws Objectives of This Lecture What do we intend for probability in the context of RAM and risk

More information

Some slides from Carlos Guestrin, Luke Zettlemoyer & K Gajos 2

Some slides from Carlos Guestrin, Luke Zettlemoyer & K Gajos 2 Logistics CSE 446: Point Estimation Winter 2012 PS2 out shortly Dan Weld Some slides from Carlos Guestrin, Luke Zettlemoyer & K Gajos 2 Last Time Random variables, distributions Marginal, joint & conditional

More information

Probability Review - Bayes Introduction

Probability Review - Bayes Introduction Probability Review - Bayes Introduction Statistics 220 Spring 2005 Copyright c 2005 by Mark E. Irwin Advantages of Bayesian Analysis Answers the questions that researchers are usually interested in, What

More information

Lecture 5: Bayes pt. 1

Lecture 5: Bayes pt. 1 Lecture 5: Bayes pt. 1 D. Jason Koskinen koskinen@nbi.ku.dk Photo by Howard Jackman University of Copenhagen Advanced Methods in Applied Statistics Feb - Apr 2016 Niels Bohr Institute 2 Bayes Probabilities

More information

Some Basic Concepts of Probability and Information Theory: Pt. 2

Some Basic Concepts of Probability and Information Theory: Pt. 2 Some Basic Concepts of Probability and Information Theory: Pt. 2 PHYS 476Q - Southern Illinois University January 22, 2018 PHYS 476Q - Southern Illinois University Some Basic Concepts of Probability and

More information

Case Study in the Use of Bayesian Hierarchical Modeling and Simulation for Design and Analysis of a Clinical Trial

Case Study in the Use of Bayesian Hierarchical Modeling and Simulation for Design and Analysis of a Clinical Trial Case Study in the Use of Bayesian Hierarchical Modeling and Simulation for Design and Analysis of a Clinical Trial William R. Gillespie Pharsight Corporation Cary, North Carolina, USA PAGE 2003 Verona,

More information

Bayesian Computation

Bayesian Computation Bayesian Computation CAS Centennial Celebration and Annual Meeting New York, NY November 10, 2014 Brian M. Hartman, PhD ASA Assistant Professor of Actuarial Science University of Connecticut CAS Antitrust

More information

Models of Reputation with Bayesian Updating

Models of Reputation with Bayesian Updating Models of Reputation with Bayesian Updating Jia Chen 1 The Tariff Game (Downs and Rocke 1996) 1.1 Basic Setting Two states, A and B, are setting the tariffs for trade. The basic setting of the game resembles

More information

The Bayesian Choice. Christian P. Robert. From Decision-Theoretic Foundations to Computational Implementation. Second Edition.

The Bayesian Choice. Christian P. Robert. From Decision-Theoretic Foundations to Computational Implementation. Second Edition. Christian P. Robert The Bayesian Choice From Decision-Theoretic Foundations to Computational Implementation Second Edition With 23 Illustrations ^Springer" Contents Preface to the Second Edition Preface

More information

Basic notions of probability theory

Basic notions of probability theory Basic notions of probability theory Contents o Boolean Logic o Definitions of probability o Probability laws Why a Lecture on Probability? Lecture 1, Slide 22: Basic Definitions Definitions: experiment,

More information

STAT:5100 (22S:193) Statistical Inference I

STAT:5100 (22S:193) Statistical Inference I STAT:5100 (22S:193) Statistical Inference I Week 3 Luke Tierney University of Iowa Fall 2015 Luke Tierney (U Iowa) STAT:5100 (22S:193) Statistical Inference I Fall 2015 1 Recap Matching problem Generalized

More information

Probability Rules. MATH 130, Elements of Statistics I. J. Robert Buchanan. Fall Department of Mathematics

Probability Rules. MATH 130, Elements of Statistics I. J. Robert Buchanan. Fall Department of Mathematics Probability Rules MATH 130, Elements of Statistics I J. Robert Buchanan Department of Mathematics Fall 2018 Introduction Probability is a measure of the likelihood of the occurrence of a certain behavior

More information

The Fundamental Principle of Data Science

The Fundamental Principle of Data Science The Fundamental Principle of Data Science Harry Crane Department of Statistics Rutgers May 7, 2018 Web : www.harrycrane.com Project : researchers.one Contact : @HarryDCrane Harry Crane (Rutgers) Foundations

More information

Deciding, Estimating, Computing, Checking

Deciding, Estimating, Computing, Checking Deciding, Estimating, Computing, Checking How are Bayesian posteriors used, computed and validated? Fundamentalist Bayes: The posterior is ALL knowledge you have about the state Use in decision making:

More information

Deciding, Estimating, Computing, Checking. How are Bayesian posteriors used, computed and validated?

Deciding, Estimating, Computing, Checking. How are Bayesian posteriors used, computed and validated? Deciding, Estimating, Computing, Checking How are Bayesian posteriors used, computed and validated? Fundamentalist Bayes: The posterior is ALL knowledge you have about the state Use in decision making:

More information

Mathematical foundations of Econometrics

Mathematical foundations of Econometrics Mathematical foundations of Econometrics G.Gioldasis, UniFe & prof. A.Musolesi, UniFe March 13, 2016.Gioldasis, UniFe & prof. A.Musolesi, UniFe Mathematical foundations of Econometrics March 13, 2016 1

More information

Journeys of an Accidental Statistician

Journeys of an Accidental Statistician Journeys of an Accidental Statistician A partially anecdotal account of A Unified Approach to the Classical Statistical Analysis of Small Signals, GJF and Robert D. Cousins, Phys. Rev. D 57, 3873 (1998)

More information

Principles of Bayesian Inference

Principles of Bayesian Inference Principles of Bayesian Inference Sudipto Banerjee University of Minnesota July 20th, 2008 1 Bayesian Principles Classical statistics: model parameters are fixed and unknown. A Bayesian thinks of parameters

More information

Discrete Mathematics and Probability Theory Spring 2014 Anant Sahai Note 10

Discrete Mathematics and Probability Theory Spring 2014 Anant Sahai Note 10 EECS 70 Discrete Mathematics and Probability Theory Spring 2014 Anant Sahai Note 10 Introduction to Basic Discrete Probability In the last note we considered the probabilistic experiment where we flipped

More information

Imprecise Probability

Imprecise Probability Imprecise Probability Alexander Karlsson University of Skövde School of Humanities and Informatics alexander.karlsson@his.se 6th October 2006 0 D W 0 L 0 Introduction The term imprecise probability refers

More information

MATH MW Elementary Probability Course Notes Part I: Models and Counting

MATH MW Elementary Probability Course Notes Part I: Models and Counting MATH 2030 3.00MW Elementary Probability Course Notes Part I: Models and Counting Tom Salisbury salt@yorku.ca York University Winter 2010 Introduction [Jan 5] Probability: the mathematics used for Statistics

More information

Class 26: review for final exam 18.05, Spring 2014

Class 26: review for final exam 18.05, Spring 2014 Probability Class 26: review for final eam 8.05, Spring 204 Counting Sets Inclusion-eclusion principle Rule of product (multiplication rule) Permutation and combinations Basics Outcome, sample space, event

More information

Quantitative Understanding in Biology 1.7 Bayesian Methods

Quantitative Understanding in Biology 1.7 Bayesian Methods Quantitative Understanding in Biology 1.7 Bayesian Methods Jason Banfelder October 25th, 2018 1 Introduction So far, most of the methods we ve looked at fall under the heading of classical, or frequentist

More information