A Frequentist Assessment of Bayesian Inclusion Probabilities
A Frequentist Assessment of Bayesian Inclusion Probabilities
Department of Statistical Sciences and Operations Research
October 13, 2008
Outline
1 Quantitative Traits: Genetic Map; The Model
2 Bayesian Model Averaging: Determining the Marginals; Inclusion Probabilities; Restricted Model Space
3 Theorem
4 Performance
Quantitative Traits
Biologists are often interested in which genes control a quantitative trait. Examples:
- Height
- Weight
- Yield
- Cotyledon opening angle
Quantitative Trait Loci
Genetic map of Arabidopsis thaliana (BAY-0 x SHAHDARA cross): five chromosomes (I-V) carrying microsatellite markers (T1G11, MSAT2-5, NGA172, F21M12, ATHCHIB, NGA248 and other MSAT/NGA markers), with positions in cM.
The Linear Model
The quantitative trait for individual i can be modeled with a first-order linear model:

Y_i = β_0 + Σ_{j=1}^{p} β_j X_ij I(X_ij ∈ M_k) + ε_i,

where I(X_ij ∈ M_k) = 1 if X_ij is in model M_k and zero otherwise, and

X_ij = 1 if locus j comes from Parent A; X_ij = 0 if locus j comes from Parent B.

With p = 38 loci there are 2^p = 2^38 possible first-order linear models.
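A minimal sketch of this data-generating setup (the sample size, the effect loci and the effect sizes are invented purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes (hypothetical): n lines genotyped at p = 38 loci.
n, p = 160, 38
X = rng.integers(0, 2, size=(n, p))      # 1 = allele from Parent A, 0 = Parent B

# A model M_k selects a subset of loci; here loci 3 and 17 (arbitrary) have effects.
beta = np.zeros(p)
beta[[3, 17]] = [2.0, 1.0]
beta0 = 10.0

Y = beta0 + X @ beta + rng.normal(0.0, 2.0, size=n)   # eps_i ~ N(0, sigma = 2)
```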
Typical approaches to determining these locations include:
- ANOVA
- Step-wise selection
- Forward selection
- Mallows' C_p
Inclusion Probabilities
An alternative to standard likelihood or Bayes factor inferences can be achieved via inclusion probabilities:

P(β_j ≠ 0 | D) = P(Locus j | D).

These give us the probability that locus j is important regardless of the model, and they depend only on the data.
Bayesian Model Averaging
For a model space M containing M models, the posterior probability of each model M_c ∈ M given a set of data D is:

P(M_c | D) = P(D | M_c) P(M_c) / Σ_{k=1}^{M} P(D | M_k) P(M_k),

where P(D | M_c) = ∫ P(D | θ_c, M_c) P(θ_c | M_c) dθ_c and θ_c is the parameter vector for model M_c.
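This normalization is best done on the log scale to avoid underflow; a small sketch (the log marginal values below are made up):

```python
import numpy as np

def posterior_model_probs(log_marginals, log_priors=None):
    """P(M_c | D) from log P(D | M_c) and log P(M_c), normalized with the
    log-sum-exp trick so tiny marginals do not underflow to zero."""
    lm = np.asarray(log_marginals, dtype=float)
    lp = np.zeros_like(lm) if log_priors is None else np.asarray(log_priors, float)
    w = lm + lp
    w = np.exp(w - w.max())          # shift by the max before exponentiating
    return w / w.sum()

# Hypothetical log marginals for three competing models:
probs = posterior_model_probs([-104.2, -101.7, -109.3])
```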
Determining the Marginals
Determining P(D | M_c) can be done in various ways:
- AIC, BIC or DIC based approximations
- Laplace approximation
- Numerical integration
- Exact solution
Determining the Marginals
For the linear model with a Normal-Inv-χ² prior distribution the exact marginal is given by:

P(D | μ_c, V_c, ν, λ, X_c, M_c) = [ Γ((ν+n)/2) (νλ)^{ν/2} / ( π^{n/2} Γ(ν/2) |I + X_c V_c X_c'|^{1/2} ) ] × [ λν + (Y − X_c μ_c)' (I + X_c V_c X_c')^{−1} (Y − X_c μ_c) ]^{−(ν+n)/2}.   (1)
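A direct transcription of (1) on the log scale (a sketch, not the authors' code; argument names mirror the slide's prior quantities):

```python
import numpy as np
from math import lgamma, log, pi

def log_marginal(Y, Xc, mu_c, V_c, nu, lam):
    """Log of the exact marginal (1) for the linear model with a
    Normal-Inv-chi^2(nu, lam) prior on (beta, sigma^2)."""
    Y = np.asarray(Y, dtype=float)
    n = len(Y)
    S = np.eye(n) + Xc @ V_c @ Xc.T               # I + X_c V_c X_c'
    _, logdetS = np.linalg.slogdet(S)
    r = Y - Xc @ mu_c
    quad = lam * nu + r @ np.linalg.solve(S, r)   # lambda*nu + r' S^{-1} r
    return (lgamma((nu + n) / 2) - lgamma(nu / 2)
            + (nu / 2) * log(nu * lam)
            - (n / 2) * log(pi)
            - 0.5 * logdetS
            - ((nu + n) / 2) * log(quad))
```

As a sanity check, with n = 1, a zero design and unit prior scales the marginal reduces to a standard Cauchy density, 1/π at the origin.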
Stochastic Search
To search through the model space a stochastic search can be employed. The probability of moving from model M_c to model M_t is given by the Metropolis-Hastings acceptance ratio:

α = min{ 1, [ P(M_t) P(D | M_t) q(M_c | M_t) ] / [ P(M_c) P(D | M_c) q(M_t | M_c) ] },   (2)

where q is a proposal distribution.
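A sketch of the stochastic search with a symmetric flip-one-locus proposal, under which the q terms and a uniform model prior cancel in (2). The `log_marg` interface is hypothetical; any of the marginal approximations above would work:

```python
import numpy as np

def stochastic_search(p, log_marg, n_steps=20_000, seed=0):
    """Metropolis-Hastings walk over first-order models, returning Monte
    Carlo estimates of the inclusion probabilities P(beta_j != 0 | D).

    log_marg maps a boolean inclusion vector to log P(D | M).  The proposal
    flips one randomly chosen locus in or out; it is symmetric, so with a
    uniform model prior the acceptance ratio is the marginal ratio alone.
    """
    rng = np.random.default_rng(seed)
    current = np.zeros(p, dtype=bool)
    cur_lm = log_marg(current)
    visits = np.zeros(p)                 # running inclusion counts
    for _ in range(n_steps):
        prop = current.copy()
        j = rng.integers(p)
        prop[j] = not prop[j]
        prop_lm = log_marg(prop)
        if np.log(rng.uniform()) < prop_lm - cur_lm:   # alpha = min(1, ratio)
            current, cur_lm = prop, prop_lm
        visits += current
    return visits / n_steps

# Toy marginal that rewards locus 0 and penalizes model size:
incl = stochastic_search(4, lambda m: 5.0 * m[0] - m.sum())
```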
Inclusion Probabilities: Part Deux
In this framework the inclusion probabilities can be calculated via:

P(β_j ≠ 0 | D) = Σ_{k=1}^{M} P(β_j ≠ 0 | D, M_k) P(M_k | D)
             = Σ_{k=1}^{M} I(X_j ∈ M_k) P(M_k | D).
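For a model space small enough to enumerate, this averaging sum can be computed exactly (toy sizes and a uniform posterior here, purely for illustration; the real p = 38 problem needs the stochastic search):

```python
from itertools import combinations
import numpy as np

p, r = 5, 2
# All models with between 1 and r loci:
models = [frozenset(c) for k in range(1, r + 1) for c in combinations(range(p), k)]

# Hypothetical posterior model probabilities (uniform for illustration):
post = np.full(len(models), 1.0 / len(models))

# P(beta_j != 0 | D) = sum_k I(X_j in M_k) P(M_k | D)
incl = np.array([sum(w for M, w in zip(models, post) if j in M) for j in range(p)])
```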
Restricted Model Space
In the case where p > n, a restricted model space can be employed by restricting the models to at most r loci at a time. This allows:
- All loci to be considered.
- Enough degrees of freedom for model fitting.
- A smaller model space.
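The reduction in model-space size is easy to quantify (assuming the restricted space contains all models with at most r loci):

```python
from math import comb

p = 38
full = 2 ** p                                           # unrestricted space
for r in (5, 10, 15):
    restricted = sum(comb(p, i) for i in range(r + 1))  # models with <= r loci
    print(f"r = {r:2d}: {restricted:,} models ({restricted / full:.2%} of full space)")
```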
Effect of Restriction
For the Arabidopsis thaliana data. Table: inclusion probabilities P(β_j ≠ 0 | D) for the highest-probability loci (ATHCHIB, F21M12 and three MSAT markers) with restrictions r = 5, 10, 15 and r = p = 38. The restriction suppresses the larger inclusion probabilities.
Theorem
Under H_0: β_j = 0 for all j = 1, ..., p, with restriction r < p and P(M_k) uniform for all M_k ∈ M,

P(β_j ≠ 0 | D) ≈ [ Σ_{i=1}^{r} C(p−1, i−1) ] / [ Σ_{i=1}^{r} C(p, i) ],   (3)

where C(a, b) denotes the binomial coefficient.
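Equation (3) is cheap to evaluate; a sketch under my reading of the bound (a uniform prior over models containing at most r of the p loci, so the ratio counts models that include locus j):

```python
from math import comb

def null_inclusion_prob(p, r):
    """Equation (3): approximate inclusion probability under H0 with a
    uniform prior over the restricted model space."""
    num = sum(comb(p - 1, i - 1) for i in range(1, r + 1))  # models containing j
    den = sum(comb(p, i) for i in range(1, r + 1))          # all models
    return num / den
```

With r = p the ratio is 2^{p-1} / (2^p − 1) ≈ 1/2, matching the corollary; for the talk's p = 38, r = 5 it is roughly 0.127.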
Corollary
Under H_0: β_j = 0 for all j = 1, ..., p, with r = p and P(M_k) uniform for all M_k ∈ M,

P(β_j ≠ 0 | D) ≈ 1/2.

This is consistent with expectations.
Upper Cut-off Values
Let q = P(β_j ≠ 0 | D) under H_0: β_j = 0. A simple upper cut-off value for a single locus is

q + z_{1−α} √( q(1−q) / n_s ),

or, for p loci, the upper cut-off can be found from the maximum order statistic q_(max), with density

g(q_(max)) = p Φ(q_(max))^{p−1} φ(q_(max)),

where g is the sampling distribution for q under H_0. For simplicity the normal distribution can be used as an approximation. Note: a Bayesian approach using a Beta distribution could be employed.
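Both cut-offs can be computed in a few lines (a sketch assuming the normal approximation; z = 1.645 corresponds to α = 0.05):

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def single_locus_cutoff(q, n_s, z=1.645):
    """q + z_{1-alpha} * sqrt(q(1-q)/n_s); z = 1.645 for alpha = 0.05."""
    return q + z * sqrt(q * (1.0 - q) / n_s)

def max_order_cutoff(q, sd, p, alpha=0.05):
    """1 - alpha quantile of the maximum of p iid N(q, sd^2) draws,
    found by bisection on the max's CDF, Phi((x - q)/sd)^p."""
    lo, hi = q, q + 10.0 * sd
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if norm_cdf((mid - q) / sd) ** p < 1.0 - alpha:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

With p = 1 the order-statistic cut-off reduces to the usual one-sided normal quantile, which is a quick check on the bisection.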
Simulation Study
To evaluate the performance of the algorithm a simulation study was conducted:
- Theoretical vs. empirical cut-off values
- Power for effect sizes 0, 1, 2, 4 and 8
- Restriction sizes 5, 10 and 15
Simulation Study
To generate the simulation data, the loci data from the Arabidopsis thaliana study were used for X_i and the response was generated using:

Y_i = β_0 + δ X_ij + ε_i,   ε_i ~ N(0, σ²) with σ = 2,

where:
- δ = 0, 1, 2, 4 and 8
- X_j was randomly selected from the 38 loci
- P(β_j ≠ 0 | D) is based on 100,000 stochastic steps
Theoretical vs. Empirical
Table: empirical and theoretical α = 0.05 upper cut-off values for a single locus at restrictions r = 5, 10 and 15. Empirical values are based on 300 simulations. Note: at r = 15 the normal distribution assumption is invalid.
Power
Table: power of the theoretical upper cut-off values for a single locus, for effect sizes 0, 1, 2, 4 and 8 and restrictions r = 5, 10 and 15. Note: in the simulated data σ = 2.
Arabidopsis thaliana data
Table: inclusion probabilities for the highest-probability loci, with significance under the proposed individual cut-off values for r = 5 and r = 10.

Locus     Significant (r = 5)   Significant (r = 10)
ATHCHIB   Y                     Y
F21M12    Y                     Y
MSAT      Y                     Y
MSAT      N                     N
MSAT      Y                     N
Conclusions
- Gives frequentists a method to evaluate inclusion probabilities.
- Shows how to deal with the effect of restricted model spaces.
- The results are consistent with expectations.
- The simulation study shows the power of the method.
- The Arabidopsis thaliana analysis shows how to apply the method.
Future Work
- How do restricted model spaces affect epistasis models?
- Can we determine a better sampling distribution for P(β_j ≠ 0 | D)?
- A blind stochastic search is expensive. Can we form a more intelligent search method?
- Can we form a more intelligent restriction method?
Contact Info
Thank you.
Edward L. Boone
Department of Statistical Sciences and Operations Research
Richmond, Virginia
More informationThe linear model is the most fundamental of all serious statistical models encompassing:
Linear Regression Models: A Bayesian perspective Ingredients of a linear model include an n 1 response vector y = (y 1,..., y n ) T and an n p design matrix (e.g. including regressors) X = [x 1,..., x
More informationIntroduction to Systems Analysis and Decision Making Prepared by: Jakub Tomczak
Introduction to Systems Analysis and Decision Making Prepared by: Jakub Tomczak 1 Introduction. Random variables During the course we are interested in reasoning about considered phenomenon. In other words,
More informationProbabilistic Graphical Networks: Definitions and Basic Results
This document gives a cursory overview of Probabilistic Graphical Networks. The material has been gleaned from different sources. I make no claim to original authorship of this material. Bayesian Graphical
More informationThe Metropolis-Hastings Algorithm. June 8, 2012
The Metropolis-Hastings Algorithm June 8, 22 The Plan. Understand what a simulated distribution is 2. Understand why the Metropolis-Hastings algorithm works 3. Learn how to apply the Metropolis-Hastings
More informationBayesian Inference and the Symbolic Dynamics of Deterministic Chaos. Christopher C. Strelioff 1,2 Dr. James P. Crutchfield 2
How Random Bayesian Inference and the Symbolic Dynamics of Deterministic Chaos Christopher C. Strelioff 1,2 Dr. James P. Crutchfield 2 1 Center for Complex Systems Research and Department of Physics University
More informationPreliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com
1 School of Oriental and African Studies September 2015 Department of Economics Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com Gujarati D. Basic Econometrics, Appendix
More informationEfficient Forecasting of Volcanic Ash Clouds. Roger P Denlinger Hans F Schwaiger US Geological Survey
Efficient Forecasting of Volcanic Ash Clouds Roger P Denlinger Hans F Schwaiger US Geological Survey Two basic questions addressed in this talk: 1. How does uncertainty affect forecasts of volcanic ash
More informationVariational Scoring of Graphical Model Structures
Variational Scoring of Graphical Model Structures Matthew J. Beal Work with Zoubin Ghahramani & Carl Rasmussen, Toronto. 15th September 2003 Overview Bayesian model selection Approximations using Variational
More informationA6523 Signal Modeling, Statistical Inference and Data Mining in Astrophysics Spring
Lecture 9 A6523 Signal Modeling, Statistical Inference and Data Mining in Astrophysics Spring 2015 http://www.astro.cornell.edu/~cordes/a6523 Applications: Comparison of Frequentist and Bayesian inference
More informationscrna-seq Differential expression analysis methods Olga Dethlefsen NBIS, National Bioinformatics Infrastructure Sweden October 2017
scrna-seq Differential expression analysis methods Olga Dethlefsen NBIS, National Bioinformatics Infrastructure Sweden October 2017 Olga (NBIS) scrna-seq de October 2017 1 / 34 Outline Introduction: what
More informationComputational statistics
Computational statistics Markov Chain Monte Carlo methods Thierry Denœux March 2017 Thierry Denœux Computational statistics March 2017 1 / 71 Contents of this chapter When a target density f can be evaluated
More informationBAYESIAN METHODS FOR VARIABLE SELECTION WITH APPLICATIONS TO HIGH-DIMENSIONAL DATA
BAYESIAN METHODS FOR VARIABLE SELECTION WITH APPLICATIONS TO HIGH-DIMENSIONAL DATA Intro: Course Outline and Brief Intro to Marina Vannucci Rice University, USA PASI-CIMAT 04/28-30/2010 Marina Vannucci
More informationNested Sampling. Brendon J. Brewer. brewer/ Department of Statistics The University of Auckland
Department of Statistics The University of Auckland https://www.stat.auckland.ac.nz/ brewer/ is a Monte Carlo method (not necessarily MCMC) that was introduced by John Skilling in 2004. It is very popular
More informationRegression, Ridge Regression, Lasso
Regression, Ridge Regression, Lasso Fabio G. Cozman - fgcozman@usp.br October 2, 2018 A general definition Regression studies the relationship between a response variable Y and covariates X 1,..., X n.
More informationMarkov Chain Monte Carlo methods
Markov Chain Monte Carlo methods Tomas McKelvey and Lennart Svensson Signal Processing Group Department of Signals and Systems Chalmers University of Technology, Sweden November 26, 2012 Today s learning
More informationSome Curiosities Arising in Objective Bayesian Analysis
. Some Curiosities Arising in Objective Bayesian Analysis Jim Berger Duke University Statistical and Applied Mathematical Institute Yale University May 15, 2009 1 Three vignettes related to John s work
More informationBayesian Regression (1/31/13)
STA613/CBB540: Statistical methods in computational biology Bayesian Regression (1/31/13) Lecturer: Barbara Engelhardt Scribe: Amanda Lea 1 Bayesian Paradigm Bayesian methods ask: given that I have observed
More informationVariational Bayesian Logistic Regression
Variational Bayesian Logistic Regression Sargur N. University at Buffalo, State University of New York USA Topics in Linear Models for Classification Overview 1. Discriminant Functions 2. Probabilistic
More information