Bayesian Inference. Chapter 1. Introduction and basic concepts

M. Concepción Ausín
Department of Statistics, Universidad Carlos III de Madrid
Master in Business Administration and Quantitative Methods / Master in Mathematical Engineering

Objective

The aim of this course is to introduce the modern approach to Bayesian statistics, emphasizing the computational aspects and the differences between the classical and Bayesian approaches. The course includes Bayesian solutions to real problems.

[Figure: portrait of Thomas Bayes]

Recommended reading

More references

Box, G.E.P. and Tiao, G.C. (1992). Bayesian Inference in Statistical Analysis. Wiley.
Robert, C.P. and Casella, G. (2004). Monte Carlo Statistical Methods. Springer.
Bernardo, J.M. and Smith, A.F.M. (1994). Bayesian Theory. Wiley.
Gamerman, D. and Lopes, H.F. (2006). Markov Chain Monte Carlo: Stochastic Simulation for Bayesian Inference. Chapman & Hall. http://www.dme.ufrj.br/mcmc
Gelman, A., Carlin, J.B., Stern, H.S. and Rubin, D.B. (2004). Bayesian Data Analysis. Chapman & Hall.
Lee, P.M. (2004). Bayesian Statistics: An Introduction. Wiley.
Robert, C.P. and Casella, G. (2010). Introducing Monte Carlo Methods with R. Springer.

Introduction

Bayesian statistics uses the laws of probability to quantify the degree of belief in a hypothesis. There are two different views of probability:

Frequentist or classical approach: the probability of an event is the limit of its relative frequency in an infinite series of repetitions of the same experiment.

Bayesian approach: probability describes our state of knowledge, based only on the available information.

Bayes' theorem is the basic formula for updating our beliefs about a hypothesis once additional information has been observed. In the Bayesian approach, the parameters of a distribution are treated as random variables, while the data are considered fixed.

Introduction

Given a prior probability for a hypothesis and the observed information, Bayes' rule is used to obtain the posterior probability, which is the probability of the hypothesis conditional on the observed evidence. In Bayesian statistics, unlike the classical approach, inference is always based on probability theory.

The main criticism of Bayesian inference is the often subjective choice of the prior probabilities.

In practice, posterior probabilities can be obtained analytically only in simple problems. Realistic, complex problems usually require computationally intensive methods based on numerical approximation or simulation. In particular, Markov chain Monte Carlo (MCMC) methods have been extensively developed over the last decades.
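As a preview of how such simulation methods work, here is a minimal sketch of the random-walk Metropolis algorithm, the simplest MCMC method. The target below is an arbitrary stand-in (a standard normal, chosen purely for illustration); in a real problem it would be the unnormalized posterior of the model at hand.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_unnorm_posterior(theta):
    """Log of an unnormalized target; a standard normal stands in here."""
    return -0.5 * theta**2

# Random-walk Metropolis: propose a local move, accept with prob min(1, ratio).
# Only the *ratio* of target values is needed, so normalizing constants cancel.
theta, draws = 0.0, []
for _ in range(10_000):
    proposal = theta + rng.normal(scale=1.0)
    log_ratio = log_unnorm_posterior(proposal) - log_unnorm_posterior(theta)
    if np.log(rng.uniform()) < log_ratio:
        theta = proposal
    draws.append(theta)

print(np.mean(draws), np.std(draws))   # approx 0 and 1 for this target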

Basic concepts

Given a hypothesis, H, Bayes' theorem is used to update our beliefs about it once the data have been observed:

$$\Pr(H \mid \text{Data}) = \frac{\Pr(\text{Data} \mid H)\,\Pr(H)}{\Pr(\text{Data})}$$

where:

Pr(H) is our prior belief, or prior probability, of the hypothesis.
Pr(H | Data) is the posterior probability of H once the data have been observed.
Pr(Data | H) is the likelihood of the observed data given that H is true.
Pr(Data) is the marginal likelihood of the observed data, irrespective of whether H is true or not.
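A quick numerical illustration of the theorem, with entirely hypothetical numbers: a diagnostic test with 99% sensitivity and a 5% false-positive rate, applied to a condition with 1% prevalence.

```python
# Hypothetical numbers, used only to illustrate Bayes' theorem.
prior = 0.01            # Pr(H): prior probability of the hypothesis
likelihood = 0.99       # Pr(Data | H): positive test given the condition
false_positive = 0.05   # Pr(Data | not H): positive test without it

# Marginal likelihood Pr(Data): total probability of observing the data
marginal = likelihood * prior + false_positive * (1 - prior)

# Posterior probability Pr(H | Data) via Bayes' theorem
posterior = likelihood * prior / marginal
print(f"Pr(H | Data) = {posterior:.3f}")   # approx 0.167
```

Even with an accurate test, the low prior keeps the posterior modest, which is exactly the kind of belief updating the theorem formalizes.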

Main steps in Bayesian inference

Suppose now that we are interested in making inference about a parameter, θ, given a data sample, $x = \{x_1, \ldots, x_n\}$. Under the Bayesian approach, θ is not a fixed value but a random variable. The main steps are then:

1. Select a prior distribution for θ, denoted by π(θ), which should reflect our beliefs about θ before observing the data.

2. Collect some data, x, assume a statistical model for them given θ, and obtain the likelihood, f(x | θ).

3. Using Bayes' rule, update our prior beliefs about θ by calculating the posterior distribution:

$$\pi(\theta \mid x) = \frac{f(x \mid \theta)\,\pi(\theta)}{f(x)}$$
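The three steps can be illustrated with a minimal sketch for a binomial proportion under a Beta prior, using made-up data. For this conjugate pair (a preview of Chapter 2) the posterior is available in closed form, Beta(a + k, b + n − k).

```python
from scipy import stats

# Step 1: prior for theta (success probability): Beta(a, b); a flat Beta(1, 1) here
a, b = 1.0, 1.0

# Step 2: data and likelihood: n Bernoulli trials, k successes (hypothetical numbers)
n, k = 20, 14

# Step 3: Bayes' rule; for this conjugate pair the posterior is Beta(a + k, b + n - k)
posterior = stats.beta(a + k, b + n - k)
print("Posterior mean:", posterior.mean())   # approx 0.682
```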

Main steps in Bayesian inference

The marginal likelihood,

$$f(x) = \int f(x \mid \theta)\,\pi(\theta)\,d\theta,$$

is a normalizing constant which ensures that the posterior distribution of θ integrates to one; it does not depend on θ. Since this constant provides no additional information about the shape of the posterior distribution, the posterior is usually expressed as

$$\pi(\theta \mid x) \propto f(x \mid \theta)\,\pi(\theta),$$

which means that the posterior distribution is proportional to the likelihood times the prior distribution.

Because the posterior is a full probability distribution for θ, interval estimates become direct probability statements: a 95% credible interval for θ is simply an interval (a, b) such that the posterior probability that θ lies in (a, b) is equal to 0.95.
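Continuing the Beta-Binomial sketch above, where the hypothetical data gave a Beta(15, 7) posterior, a 95% equal-tailed credible interval can be read directly from the posterior quantiles:

```python
from scipy import stats

# Posterior from the earlier sketch: Beta(15, 7)
posterior = stats.beta(15, 7)

# Equal-tailed 95% credible interval: 2.5% of posterior mass on each side
low, high = posterior.ppf(0.025), posterior.ppf(0.975)
print(f"95% credible interval: ({low:.3f}, {high:.3f})")

# Check: the posterior probability that theta lies in (low, high) is 0.95
print(posterior.cdf(high) - posterior.cdf(low))
```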

To sum up...

Frequentist:
- Probability can model frequency: how often an event occurs.
- We assume that experiments are repeatable.
- The parameters of a distribution are fixed.
- The data are random: they are drawn from a fixed distribution.
- "I have 95% confidence that the population mean is between −2.2 and +0.3."
- "If H0 is true, we would only see such extreme results 3.2% of the time, so we can reject H0 at the 5% level."

Bayesian:
- Probability can model our subjective beliefs.
- Experiments need not be repeatable.
- The parameters of a distribution are random variables.
- The data are the only thing that is fixed and immutable.
- "There is a 95% probability that the population mean is between 12 and 13."
- "The probability of H0 is 0.26%."

(Summary from Brian Potetz's web page.)
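To make the interpretive contrast concrete, the sketch below computes both kinds of interval for the same hypothetical binomial data used earlier; the numbers are illustrative, not from the slides. The confidence interval is a statement about the long-run behaviour of the procedure, while the credible interval is a direct probability statement about θ itself.

```python
import numpy as np
from scipy import stats

# Hypothetical data: 20 Bernoulli trials, 14 successes (as in the earlier sketch)
n, k = 20, 14
p_hat = k / n

# Frequentist: a 95% Wald confidence interval for the proportion
se = np.sqrt(p_hat * (1 - p_hat) / n)
ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)

# Bayesian: a 95% credible interval from the Beta(1 + k, 1 + n - k) posterior
posterior = stats.beta(1 + k, 1 + n - k)
cred = (posterior.ppf(0.025), posterior.ppf(0.975))

print("Confidence interval:", ci)    # about the procedure's repeated-sampling behaviour
print("Credible interval:  ", cred)  # about theta itself, given the data
```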

In the next chapter, we will introduce simple models where the posterior distribution can be obtained analytically by assuming certain families of prior distributions, known as conjugate priors.