Week 4 Spring Lecture 7. Empirical Bayes, hierarchical Bayes and random effects


Empirical Bayes and Hierarchical Bayes

Review: Let $X \sim N(\theta, \Sigma)$ and consider a normal prior $\theta \sim N(\mu, \Gamma)$. Then the posterior distribution of $\theta$ is
$$\theta \mid X \sim N\!\big(\mu + \Gamma(\Gamma+\Sigma)^{-1}(X-\mu),\; (\Gamma^{-1}+\Sigma^{-1})^{-1}\big).$$

Empirical Bayes: Let $\Sigma = I$, $\mu = 0$ and $\Gamma = \gamma I$. Then the posterior mean of $\theta$ is
$$\Big(1 - \frac{1}{1+\gamma}\Big)X = \frac{\gamma}{1+\gamma}\,X.$$
Since $E\|X\|^2 = p(1+\gamma)$, we may estimate $1/(1+\gamma)$ by $(p-2)/\|X\|^2$ to yield the James-Stein estimator
$$\hat\theta_{JS} = \Big(1 - \frac{p-2}{\|X\|^2}\Big)X.$$
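To see the empirical Bayes calibration in action, here is a minimal simulation sketch (not part of the original notes; the dimension, prior variance and seed are arbitrary choices): it draws $\theta \sim N(0, \gamma I)$ and $X \sim N(\theta, I)$, forms the plug-in shrinkage factor $(p-2)/\|X\|^2$, and compares the loss of $X$ itself with that of the James-Stein estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
p, gamma = 50, 2.0                                 # dimension and prior variance (arbitrary)

theta = rng.normal(0.0, np.sqrt(gamma), size=p)    # theta ~ N(0, gamma I)
x = theta + rng.normal(size=p)                     # X | theta ~ N(theta, I)

# Empirical Bayes: estimate the shrinkage factor 1/(1+gamma) by (p-2)/||X||^2
shrink = (p - 2) / np.sum(x**2)
theta_js = (1.0 - shrink) * x                      # James-Stein estimator

print("loss of X itself    :", np.sum((x - theta)**2))
print("loss of James-Stein :", np.sum((theta_js - theta)**2))
print("oracle shrinkage 1/(1+gamma) =", 1 / (1 + gamma), "  EB estimate =", shrink)
```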

Hierarchical Bayes: Let $\Sigma = I$, $\Gamma = \gamma I$ and $\gamma \sim H$. It is easy to see that the harmonic prior satisfies
$$g(\theta) = \frac{1}{\|\theta\|^{p-2}} \;\propto\; \int_0^\infty v^{(p-4)/2}\, e^{-v\|\theta\|^2/2}\, dv \;\propto\; \int_0^\infty \varphi_{\gamma I}(\theta)\, d\gamma,$$
so the harmonic prior corresponds to the flat (Lebesgue) mixing density $h(\gamma) \propto 1$ on $(0,\infty)$.

Theorem. Let $h(\gamma) \propto (1+\gamma)^{a-2}$ (so $a = 2$ gives the harmonic prior). Then the generalized Bayes estimators
(i) exist if $a < 1 + p/2$;
(ii) are admissible if $p \ge 3$ and $3 - p/2 \le a$;
(iii) are minimax if they exist, $p \ge 3$ and $3 - p/2 \le a$.

Proof of the theorem:
(i) It is easy to see that the marginal $\int_0^\infty (1+\gamma)^{a-2-p/2}\, e^{-\|x\|^2/(2(1+\gamma))}\, d\gamma$ is finite if and only if $2(a-1) < p$, i.e. $a < 1 + p/2$.
(ii) We have
$$g(\theta) \;\propto\; \int_0^\infty (1+\gamma)^{a-2}\, \gamma^{-p/2}\, e^{-\|\theta\|^2/(2\gamma)}\, d\gamma, \qquad \text{so}\qquad \|\theta\|^{\,p+2-2a}\, g(\theta) \to c \ \text{ as } \|\theta\| \to \infty.$$
It can be shown that the growth condition
$$\int_{\|\theta\|>1} g(\theta)\, \|\theta\|^{-2}\, (\log\|\theta\|)^{-2}\, d\theta < \infty$$
holds, and that the flatness condition holds when $3 - p/2 \le a$ (left to you).
(iii) The generalized Bayes estimator takes the form $\delta(x) = \big(1 - r(\|x\|^2)/\|x\|^2\big)x$, and it can be shown that $r(\|x\|^2)$ is nondecreasing with
$$0 \;\le\; r(\|x\|^2) \;=\; (p+2-2a) - \frac{2\, e^{-\|x\|^2/2}}{\int_0^1 v^{\,p/2-a}\, e^{-v\|x\|^2/2}\, dv} \;\le\; p+2-2a.$$
From Baranchik (1970) the estimator is minimax if $r$ is nondecreasing and $0 \le r(\|x\|^2) \le 2(p-2)$, which holds when $p+2-2a \le 2(p-2)$, i.e. $3 - p/2 \le a$.

Let us just look at the case $a = 2$; the general case follows similarly. Write $g(\theta) = \|\theta\|^{-(p-2)}$; then the marginal density of $X$ is
$$m(x) \;\propto\; \int \varphi_I(x-\theta)\, g(\theta)\, d\theta \;\propto\; \int_0^\infty \varphi_{(1+\gamma)I}(x)\, d\gamma \;\propto\; \int_0^1 v^{(p-4)/2}\, e^{-v\|x\|^2/2}\, dv$$
(substituting $v = 1/(1+\gamma)$), thus
$$\frac{\nabla m(x)}{m(x)} = -x\, \frac{\int_0^1 v^{(p-2)/2}\, e^{-v\|x\|^2/2}\, dv}{\int_0^1 v^{(p-4)/2}\, e^{-v\|x\|^2/2}\, dv} = -\frac{x}{\|x\|^2}\left[(p-2) - \frac{2\, e^{-\|x\|^2/2}}{\int_0^1 v^{(p-4)/2}\, e^{-v\|x\|^2/2}\, dv}\right],$$
and hence $\delta(x) = x + \nabla \log m(x) = \big(1 - r(\|x\|^2)/\|x\|^2\big)x$ with $0 \le r(\|x\|^2) \le p-2 \le 2(p-2)$.

Remark. When $p \ge 5$ there exist proper Bayes minimax estimators, since $h$ is the density of a finite measure for $a < 1$, and $3 - p/2 < 1$ exactly when $p \ge 5$. For $p = 3, 4$ there are no proper Bayes minimax estimators, from Brown (1971).
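As a sanity check on the Baranchik bound, the following sketch (my own, not from the notes; $p$ and the grid of $s$ values are arbitrary) evaluates $r(s)$ numerically in the harmonic-prior case $a = 2$ and confirms that it rises from 0 toward $p-2$, staying below $2(p-2)$.

```python
import numpy as np
from scipy.integrate import quad

def r_harmonic(s, p):
    """r(s) under the harmonic prior (a = 2):
    r(s) = (p - 2) - 2*exp(-s/2) / int_0^1 v^{(p-4)/2} exp(-v*s/2) dv."""
    denom, _ = quad(lambda v: v**((p - 4) / 2) * np.exp(-v * s / 2), 0.0, 1.0)
    return (p - 2) - 2.0 * np.exp(-s / 2) / denom

p = 5
for s in [0.1, 1.0, 5.0, 25.0, 100.0]:
    print(f"s = {s:6.1f}   r(s) = {r_harmonic(s, p):8.4f}   bound 2(p-2) = {2 * (p - 2)}")
# r(s) increases from near 0 toward p - 2 and never exceeds 2(p - 2),
# which is what Baranchik's minimaxity condition requires.
```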

Hierarchical Bayes vs Empirical Bayes. These two forms of analysis are closely related. The hierarchical formulation
$$X \sim N(\theta, I), \qquad \theta \sim N(0, \gamma I)$$
is common to both of them. The empirical Bayes method uses the data to produce some heuristic estimator of $\gamma$. Hierarchical Bayes methods treat the hierarchical parameter, $\gamma$, in a Bayesian fashion.

There is an additional heuristic connection between the two methodologies. Note that the hierarchical Bayes estimator can be written as
$$E(\theta \mid X) = E\big(E(\theta \mid X, \gamma) \,\big|\, X\big).$$
The inner expectation on the right-hand side of the equation can be considered to be an estimator such as the one that appears in the empirical Bayes derivation. Hence the hierarchical Bayes estimator, $\hat\theta^{\mathrm{hier}}$ say, is the mean of these estimators with respect to the Bayesian conditional distribution of $\gamma$ given $X$: write $\hat\theta^{\mathrm{hier}} = E\big(\hat\theta_\gamma \mid X\big)$ with $\hat\theta_\gamma = E(\theta \mid X, \gamma)$. In this way the hierarchical Bayes estimator can also be viewed as an empirical Bayes estimator.
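The iterated-expectation identity can be made concrete with a small numerical sketch (my construction, not from the notes; it assumes the flat mixing density $h(\gamma) \propto 1$ from the harmonic-prior example and truncates the $\gamma$ grid at 100): the hierarchical Bayes estimate is computed as the posterior-weighted average of the known-$\gamma$ conditional estimators, and compared with the empirical Bayes (James-Stein) plug-in.

```python
import numpy as np

rng = np.random.default_rng(1)
p, gamma_true = 20, 3.0                            # arbitrary choices
theta = rng.normal(0.0, np.sqrt(gamma_true), size=p)
x = theta + rng.normal(size=p)

# Conditional (known-gamma) Bayes estimator: E(theta | X, gamma) = gamma/(1+gamma) * X
def theta_hat(gamma, x):
    return gamma / (1.0 + gamma) * x

# Posterior over gamma on a grid, under a flat prior h(gamma) ~ 1:
# marginally X | gamma ~ N(0, (1+gamma) I), so
# p(gamma | X) is proportional to (1+gamma)^(-p/2) * exp(-||X||^2 / (2(1+gamma)))
gammas = np.linspace(1e-3, 100.0, 20000)
log_post = -0.5 * p * np.log1p(gammas) - np.sum(x**2) / (2.0 * (1.0 + gammas))
w = np.exp(log_post - log_post.max())
w /= w.sum()

# Hierarchical Bayes estimate = posterior average of the conditional estimators
theta_hier = (w[:, None] * theta_hat(gammas[:, None], x[None, :])).sum(axis=0)

# Empirical Bayes (James-Stein) plug-in for comparison
theta_js = (1.0 - (p - 2) / np.sum(x**2)) * x
print("||hier - JS|| =", np.linalg.norm(theta_hier - theta_js))
```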

Lecture 8. Empirical Bayes, hierarchical Bayes and random effects (cont.)

Robbins (1956): An empirical Bayes approach to statistics, Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, Jerzy Neyman, ed., vol. 1, Berkeley, California: University of California Press, 1956, pp. 157-163.

Selected writings of Robbins:
1. What is Mathematics? An elementary approach to ideas and methods, with Richard Courant, London: Oxford University Press, 1941.
2. A stochastic approximation method, with Sutton Monro, Annals of Mathematical Statistics, 22, 1951, pp. 400-407.
3. Robbins (1956) above.
4. Asymptotically subminimax solutions of compound statistical decision problems, Proceedings of the Second Berkeley Symposium on Mathematical Statistics and Probability, 1951, pp. 131-148.

Example. Observe $X_i \sim \mathrm{Poisson}(\lambda_i)$,
$$P(X_i = x_i \mid \lambda_i) = \frac{e^{-\lambda_i}\, \lambda_i^{x_i}}{x_i!}.$$
We want to estimate the unknown parameter $\lambda_i$. Assume that $\lambda_1, \lambda_2, \ldots, \lambda_n$ are iid with distribution $G$. The (generalized) Bayes estimator with respect to squared error loss is
$$\delta_G(x_i) = \frac{\int \lambda\, e^{-\lambda} \lambda^{x_i}/x_i!\; dG(\lambda)}{\int e^{-\lambda} \lambda^{x_i}/x_i!\; dG(\lambda)} = \frac{(x_i+1)\, f_G(x_i+1)}{f_G(x_i)},$$
where $f_G$ is the marginal distribution of $X_i$. We know that for every fixed $x_i$,
$$\frac{\#\{j \le n : X_j = x_i+1\}}{n} \approx f_G(x_i+1), \qquad \frac{\#\{j \le n : X_j = x_i\}}{n} \approx f_G(x_i),$$
so $f_G$ may be replaced by the empirical frequencies of the observed counts.

Example (read Efron, 2003, Ann. Statist.): applications to the missing species problem; estimating Shakespeare's vocabulary.

Extension to exponential families. Let $X_i \sim f(x \mid \theta_i) = \exp(\theta_i x - \psi(\theta_i))\, h(x)$ and $\theta_i \sim G$. Then
$$E(\theta_i \mid x) = \frac{\int \theta\, e^{\theta x - \psi(\theta)} h(x)\, dG(\theta)}{\int e^{\theta x - \psi(\theta)} h(x)\, dG(\theta)} = \frac{\frac{d}{dx}\big(f_G(x)/h(x)\big)}{f_G(x)/h(x)},$$
where $f_G$ is the marginal density of $X_i$.
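Here is a brief simulation sketch of Robbins' rule (entirely illustrative; the Gamma mixing distribution, sample size and seed are my choices, not from the notes): the $\lambda_i$ are drawn from a Gamma distribution $G$, Poisson counts are observed, and the estimator $(x_i+1)\,\#\{X_j = x_i+1\}/\#\{X_j = x_i\}$ is compared with the true Bayes rule for that $G$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
lam = rng.gamma(shape=2.0, scale=1.5, size=n)    # lambda_i iid ~ G (a Gamma choice, arbitrary)
x = rng.poisson(lam)                             # X_i | lambda_i ~ Poisson(lambda_i)

counts = np.bincount(x, minlength=x.max() + 2)   # empirical frequencies of each count value

# Robbins' empirical Bayes rule: delta(x) = (x+1) * #{X_j = x+1} / #{X_j = x}
def robbins(xi):
    return (xi + 1) * counts[xi + 1] / counts[xi]

for xi in range(6):
    # For a Gamma(shape=2, scale=1.5) prior the true Bayes rule is (xi + 2) * 1.5 / 2.5
    bayes = (xi + 2.0) * 1.5 / 2.5
    print(f"x = {xi}:  Robbins = {robbins(xi):6.3f}   true Bayes = {bayes:6.3f}")
```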

The question now is how to estimate the marginal $f_G$ (equivalently, $G$).

Connection to compound decision theory. Robbins (1951): Asymptotically subminimax solutions of compound statistical decision problems, Proceedings of the Second Berkeley Symposium on Mathematical Statistics and Probability, 1951, pp. 131-148.

Let $X_i \sim f(x \mid \theta_i)$ and write the compound risk
$$R(\theta, \delta) = \frac{1}{n} \sum_{i=1}^n E\, L\big(\theta_i, \delta_i(X)\big).$$
For separable decision rules of the form $\delta_i(X) = t(X_i)$, the compound risk is equal to the average risk
$$R(\theta, \delta) = \int\!\!\int L(\theta, t(x))\, f(x \mid \theta)\, dx\; dG_n(\theta),$$
where $G_n(A) = \frac{1}{n}\sum_{i=1}^n 1\{\theta_i \in A\}$ is the empirical distribution of $\theta_1, \ldots, \theta_n$. Robbins' proposal is to seek asymptotically minimax procedures satisfying
$$R(\theta, \delta) = R(\theta, \delta_{G_n}) + o(1) \quad \text{as } n \to \infty,$$
where $\delta_{G_n}$ denotes the separable Bayes rule with respect to $G_n$.
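To illustrate the last display, here is a rough simulation (my own construction, with an arbitrary Gamma choice for the $\theta_i$) in the Poisson compound problem: the oracle separable rule $\delta_{G_n}$ is the Bayes rule under the empirical distribution $G_n$ of the true $\lambda_i$, and Robbins' rule, which never sees the $\lambda_i$, attains nearly the same compound risk when $n$ is large.

```python
import math
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
lam = rng.gamma(2.0, 1.5, size=n)              # the unknown lambda_1, ..., lambda_n
x = rng.poisson(lam)                           # observed counts

kmax = int(x.max())
freq = np.bincount(x, minlength=kmax + 2).astype(float)

# Marginal pmf f_{G_n}(k) under the empirical distribution G_n of the lambda_i
fGn = np.array([np.mean(np.exp(-lam) * lam**k / math.factorial(k))
                for k in range(kmax + 2)])

# Oracle separable Bayes rule t_{G_n}(k) = (k+1) f_{G_n}(k+1) / f_{G_n}(k)
oracle_rule = np.array([(k + 1) * fGn[k + 1] / fGn[k] for k in range(kmax + 1)])
# Robbins' rule replaces f_{G_n} by the observed frequencies
robbins_rule = np.array([(k + 1) * freq[k + 1] / max(freq[k], 1.0)
                         for k in range(kmax + 1)])

print("compound risk of oracle rule  :", np.mean((oracle_rule[x] - lam) ** 2))
print("compound risk of Robbins' rule:", np.mean((robbins_rule[x] - lam) ** 2))
```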
