Supplement to A Hierarchical Approach for Fitting Curves to Response Time Measurements
Jeffrey N. Rouder, Francis Tuerlinckx, Paul L. Speckman, Jun Lu, & Pablo Gomez

May

1 The Weibull regression model

The Weibull regression model for RT is given as

    T_ij | ψ_i, λ_ij, β_i ~ Weibull(ψ_i, λ_ij, β_i),      (1)
    log λ_ij = α_i + θ_i c_j + γ_j,                       (2)
    α_i | α_0, σ²_α ~ iid Normal(α_0, σ²_α),              (3)
    θ_i | θ_0, σ²_θ ~ iid Normal(θ_0, σ²_θ),              (4)
    γ_j | σ²_γ ~ iid Normal(0, σ²_γ).                     (5)

The probability density function of the Weibull is

    f(t; ψ, λ, β) = λβ(t − ψ)^(β−1) exp(−λ(t − ψ)^β),   t ≥ ψ.

Priors are needed for (ψ_i, η_1, η_2, α_0, θ_0, σ²_α, σ²_θ, σ²_γ); parameters η_1 and η_2 are those of the gamma prior on the shape parameters β_i. We have previously provided the following priors and have shown that they provide for good estimation across a wide range of experimental outcomes (Lu, 2004; Rouder et al., 2005):

    ψ_i ~ Uniform(0, min_j T_ij),
    η_1 ~ Gamma(c_1, d_1),
    η_2 ~ Gamma(c_2, d_2),

with values of c_1 = d_1 = 2.0, c_2 = 2, and d_2 = .04. The density of the gamma distribution is

    f(x; η_1, η_2) = η_2^(η_1) x^(η_1 − 1) exp(−η_2 x) / Γ(η_1),

where Γ is the gamma function (Abramowitz & Stegun, 1965). Flat priors are convenient and appropriate for location parameters (α_0, θ_0); inverse gamma
priors are convenient and appropriate for variance parameters (σ²_α, σ²_γ, σ²_θ). The inverse gamma has probability density function

    f(x; a, b) = (b^a / Γ(a)) x^(−a−1) exp(−b/x).

Parameters a and b are set to values of .1. With this choice, the resulting prior coarsely approximates the Jeffreys noninformative prior on variance (Jeffreys, 1961).

The target of analysis in Bayesian statistics is the derivation of the marginal posterior distribution of each parameter. Often these marginal posteriors cannot be derived as closed-form expressions. We follow the common approach of deriving closed-form expressions for the full conditional posterior distributions and sampling these with the Markov chain Monte Carlo (MCMC) technique (Gelfand & Smith, 1990). It is convenient to refer to vectors of parameters, and these will be denoted with boldface type; e.g., α = (α_1, α_2, ..., α_I). The following full conditional distributions are well-known random variables:

    α_0 | α, σ²_α; T ~ Normal( Σ_i α_i / I, σ²_α / I ),
    θ_0 | θ, σ²_θ; T ~ Normal( Σ_i θ_i / I, σ²_θ / I ),
    σ²_α | α, α_0; T ~ Inverse Gamma( a + I/2, b + Σ_i (α_i − α_0)² / 2 ),
    σ²_γ | γ; T ~ Inverse Gamma( a + J/2, b + Σ_j γ_j² / 2 ),
    σ²_θ | θ, θ_0; T ~ Inverse Gamma( a + I/2, b + Σ_i (θ_i − θ_0)² / 2 ),
    η_2 | β, η_1; T ~ Gamma( Iη_1 + c_2, d_2 + Σ_i β_i ).

The derivation of the first five full conditionals is standard (see Rouder & Lu, 2005); the derivation of the last one is provided in Rouder et al. (2003). The remaining full conditional posteriors have closed-form probability density functions but are not well-known random variables. These are conveniently expressed as the logarithms of density functions, up to an additive constant:

    log f(ψ_i | α_i, γ, θ_i, β_i; T) = (β_i − 1) Σ_{j∈J_i} log(T_ij − ψ_i)
        − Σ_{j∈J_i} exp(α_i + γ_j + θ_i c_j)(T_ij − ψ_i)^(β_i),   ψ_i < min_j T_ij,

    log f(α_i | ψ_i, γ, θ_i, β_i, α_0, σ²_α; T) = J_i α_i
        − Σ_{j∈J_i} exp(α_i + γ_j + θ_i c_j)(T_ij − ψ_i)^(β_i) − (α_i − α_0)² / (2σ²_α),
    log f(γ_j | ψ, α, θ, β, σ²_γ; T) = I_j γ_j
        − Σ_{i∈I_j} exp(α_i + γ_j + θ_i c_j)(T_ij − ψ_i)^(β_i) − γ_j² / (2σ²_γ),

    log f(θ_i | ψ_i, α_i, γ, β_i, θ_0, σ²_θ; T) = θ_i Σ_{j∈J_i} c_j
        − Σ_{j∈J_i} exp(α_i + γ_j + θ_i c_j)(T_ij − ψ_i)^(β_i) − (θ_i − θ_0)² / (2σ²_θ),

    log f(β_i | ψ_i, α_i, γ, θ_i, η_1, η_2; T) = (J_i + η_1 − 1) log β_i
        + (β_i − 1) Σ_{j∈J_i} log(T_ij − ψ_i)
        − exp(α_i) Σ_{j∈J_i} exp(γ_j + θ_i c_j)(T_ij − ψ_i)^(β_i) − β_i η_2,

    log f(η_1 | β, η_2; T) = η_1 ( I log η_2 − d_1 + Σ_i log β_i ) − I log Γ(η_1) + (c_1 − 1) log η_1,

where I_j is the set of participants that was presented item j and J_i is the set of items presented to participant i (in the leading terms above, I_j and J_i also denote the sizes of these sets). The derivations of these log posterior densities are straightforward.

Full conditional posteriors that are distributed as normals, inverse gammas, and gammas are conveniently sampled in many packages (we used the GNU Scientific Library for C). The remaining full conditionals are sampled with the Metropolis-Hastings algorithm (Gelman et al., 2004), with the exception of parameter ψ. Efficient sampling of ψ_i is possible with rejection sampling (Tanner, 1996); for these shift parameters, a shifted rightward-facing exponential serves as a suitable candidate distribution. While sampling is convenient in this framework, excessive autocorrelation of the MCMC chains results. To mitigate this autocorrelation, we implemented a series of decorrelating Metropolis steps (Graves, Speckman, & Sun, submitted; Liu & Sabatti, 2000). A sample of the resulting chains is shown in Figure 1. There is some autocorrelation in the chains, though it is not excessive. To mitigate this remaining autocorrelation, we used a large number of iterations in application (50,000 iterations following burn-in).

2 Derivation of Expected Values

We derive the expected value of T_ij for any participant observing an item with covariate value c_j, that is, E(T_ij | ψ_i, α_i, θ_i, β_i):

    E(T_ij | ψ_i, α_i, θ_i, β_i) = E_{γ_j} E(T_ij | ψ_i, α_i, θ_i, β_i, γ_j)
        = E_{γ_j}[ ψ_i + Γ(1 + 1/β_i) e^(−(α_i + γ_j + θ_i c_j)/β_i) ]
        = ψ_i + Γ(1 + 1/β_i) e^(−(α_i + θ_i c_j)/β_i) E_{γ_j}[ e^(−γ_j/β_i) ].

The factor E_{γ_j}[ e^(−γ_j/β_i) ] is the moment generating function of γ_j.
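The text below evaluates this factor from the moment generating function of a normal, E[exp(−γ/β)] = exp(σ²_γ / (2β²)). As a sanity check, this identity is easy to confirm by Monte Carlo; the sketch below is illustrative only, and the values of σ_γ and β are arbitrary:

```python
import math
import random

# Monte Carlo check that E[exp(-gamma/beta)] = exp(sd**2 / (2*beta**2))
# for gamma ~ Normal(0, sd**2).  The values of sd_gamma and beta are arbitrary.
rng = random.Random(11)
sd_gamma, beta = 0.4, 2.0

n = 200_000
mc = sum(math.exp(-rng.gauss(0.0, sd_gamma) / beta) for _ in range(n)) / n
exact = math.exp(sd_gamma ** 2 / (2.0 * beta ** 2))

# The two quantities agree to within Monte Carlo error.
print(abs(mc - exact) < 0.01)
```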
Random variable γ_j is distributed as a normal, and its moment generating function is
[Figure 1. MCMC chain values for selected parameters. Top to bottom panels show samples for the shape (β), WF effect (θ), and variance (σ²_γ) parameters, respectively; each panel plots parameter value against iteration. These chains are typical of the other parameters. As can be seen, mixing is not ideal, but posterior quantities may nonetheless be estimated with long chains.]

well known (e.g., Hogg & Craig, 1978):

    E_{γ_j}[ exp(−γ_j/β_i) ] = exp( σ²_γ / (2β_i²) ).

Substituting in this equality and rearranging terms yields

    E(T_ij | ψ_i, α_i, θ_i, β_i) = ψ_i + K_i exp( −θ_i c_j / β_i ),      (6)

where K_i is the constant

    K_i = Γ( (β_i + 1)/β_i ) exp( −α_i/β_i + σ²_γ/(2β_i²) ).

3 Robustness to Misspecification

The hierarchical Weibull regression model assumes that RT is distributed as a Weibull. The Weibull is a sufficiently simple distribution that this assumption may be assessed. To do so, Rouder et al. (2005) recommend transforming the data:

    Q_ij = (T_ij − ψ̂_i)^(β̂_i) λ̂_ij.      (7)

With this transformation, the Q_ij are mutually independent, and each follows a standard exponential distribution (rate of 1.0). Consequently, the Weibull assumption may be assessed by plotting the distribution of the Q_ij and comparing it to a standard
exponential.

[Figure 2. Misspecification of the Weibull distribution. Left: Histogram of the transformed data Q_ij; the line is the standard exponential density. Middle: Quantile-quantile plot of the transformed data against standard exponential quantiles. Right: A comparison of the inverse Gaussian (dashed) and Weibull (solid) distributions; the horizontal axis is time (ms).]

The left panel of Figure 2 shows the histogram of Q_ij for the data from Gomez, Ratcliff, and Perea's (2007) Experiment 1; the line is the density of the standard exponential. The misfit is obvious: the data have more mass in the left tail than the Weibull. The center panel shows a quantile-quantile plot of Q_ij against the standard exponential. This plot reveals that the data also have a heavier right tail than the Weibull model. The right panel shows a comparison of an inverse Gaussian distribution¹ (dashed line) and a Weibull distribution (solid line). The inverse Gaussian has heavier left and right tails than the Weibull and is therefore a more realistic distribution. Although the inverse Gaussian is more realistic, it is more difficult to fit; unfortunately, small deviations in the data have big effects on parameter estimates (Rouder & Speckman, 2004).

¹ The probability density function of the (shifted) inverse Gaussian distribution is

    f(x; ψ, ξ, φ) = ( ξ / (2π) )^(1/2) (x − ψ)^(−3/2) exp( −ξ((x − ψ) − φ)² / (2φ²(x − ψ)) ),   x > ψ.

Given the evidence for misspecification, it is desirable to test whether the discrimination of functional forms is robust in the hierarchical Weibull model. We explored how well the Weibull regression model recovers the functional form when (simulated) data are distributed as an inverse Gaussian. To make the simulations realistic, we used the design matrix from the Gomez et al. experiment; that is, each simulated participant corresponded to a real participant, and each simulated item corresponded to a real item. We simulated two data sets: one following a power function (the logarithm of the inverse Gaussian scale
was linear in log-WF) and one following an exponential function (the logarithm of the inverse Gaussian scale was linear in WF). The true inverse Gaussian parameters were chosen such that the resulting means and variances of the simulated data matched those estimated from Gomez et al.'s data. Each of the two inverse Gaussian simulated data sets was analyzed with two hierarchical Weibull models: one in which WF served as the covariate and a second in which log-WF served as the covariate. Plots of the item residuals (γ_j) as a function of the covariate are shown in Figure 3. The top row shows the analyses when the inverse Gaussian log-scale follows WF (an exponential function); the bottom row shows the same when the inverse Gaussian log-scale follows log-WF (a power function). The left column is for the Weibull analysis with a WF covariate; the right column is the same with a log-WF covariate. The item residuals γ_j are flat when the covariate matches the generating function and curved otherwise. The Weibull model therefore appears to recover the appropriate functional form even when the data are from an inverse Gaussian.
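The diagnostic transformation in Equation 7 is easy to exercise on synthetic data: when RTs are truly generated from a shifted Weibull, the transformed values are standard exponential. A minimal self-check, with the true parameters standing in for the estimates ψ̂, β̂, and λ̂ (all numerical values below are invented for illustration):

```python
import math
import random

rng = random.Random(5)
psi, lam, beta = 0.3, 11.0, 2.0   # arbitrary shift, rate, and shape

# Simulate shifted-Weibull RTs by inversion of the CDF
# F(t) = 1 - exp(-lam*(t - psi)**beta), then apply Eq. 7 with the true
# parameters standing in for the estimates.
n = 100_000
Q = []
for _ in range(n):
    t = psi + (-math.log(1.0 - rng.random()) / lam) ** (1.0 / beta)
    Q.append(lam * (t - psi) ** beta)

# A standard exponential has mean 1 and variance 1.
mean_Q = sum(Q) / n
var_Q = sum((q - mean_Q) ** 2 for q in Q) / n
print(round(mean_Q, 2), round(var_Q, 2))
```

Under misspecification (e.g., inverse Gaussian data), the distribution of the Q_ij departs from the standard exponential, which is what Figure 2 displays for the real data.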
[Figure 3. Item residuals (γ_j) as a function of the covariate for data simulated from an inverse Gaussian. Top and bottom rows: the true relationship follows an exponential and a power function, respectively. Left and right columns: Weibull analysis with covariates linear and logarithmic in word frequency, respectively. Lines are nonparametric smooths. The Weibull successfully recovers the parametric relationship even when the distribution is misspecified.]
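As a closing illustration of the machinery in Section 1, the nonconjugate full conditionals are sampled with random-walk Metropolis-Hastings. The sketch below updates one participant's α_i using the log conditional given earlier; the data, tuning constant, and parameter values are all invented, and this is a schematic rather than the paper's actual implementation:

```python
import math
import random

def log_f_alpha(a, psi_i, gamma, theta_i, beta_i, alpha0, var_alpha, T_i, c):
    # log f(alpha_i | ...) up to an additive constant (Section 1):
    #   J_i*alpha_i - sum_j exp(alpha_i + gamma_j + theta_i*c_j)*(T_ij - psi_i)**beta_i
    #              - (alpha_i - alpha0)**2 / (2*var_alpha)
    s = sum(math.exp(a + gamma[j] + theta_i * c[j]) * (T_i[j] - psi_i) ** beta_i
            for j in range(len(T_i)))
    return len(T_i) * a - s - (a - alpha0) ** 2 / (2.0 * var_alpha)

def metropolis_step(cur, logf, step, rng):
    # One random-walk Metropolis-Hastings update with a symmetric normal proposal.
    prop = cur + rng.gauss(0.0, step)
    accept = math.log(1.0 - rng.random()) < logf(prop) - logf(cur)
    return prop if accept else cur

# Invented data for a single participant (five items).
rng = random.Random(3)
T_i = [0.55, 0.61, 0.48, 0.72, 0.66]   # response times (s)
c = [0.0, 0.5, 1.0, 1.5, 2.0]          # item covariate values
gamma = [0.0] * 5                      # item effects held fixed here
logf = lambda a: log_f_alpha(a, 0.3, gamma, 0.2, 2.0, 2.0, 0.5, T_i, c)

chain = [2.0]
for _ in range(2000):
    chain.append(metropolis_step(chain[-1], logf, 0.5, rng))
print(round(sum(chain[500:]) / len(chain[500:]), 1))  # posterior mean of alpha_i
```

In the full sampler, a step like this is applied to each α_i, γ_j, θ_i, β_i, and η_1 in turn, interleaved with the direct draws from the conjugate conditionals.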
References

[1] M. Abramowitz and I. A. Stegun. Handbook of Mathematical Functions: with Formulas, Graphs, and Mathematical Tables. Dover, New York, 1965.

[2] A. Gelfand and A. F. M. Smith. Sampling-based approaches to calculating marginal densities. Journal of the American Statistical Association, 85:398-409, 1990.

[3] A. Gelman, J. B. Carlin, H. S. Stern, and D. B. Rubin. Bayesian Data Analysis (2nd edition). Chapman and Hall, London, 2004.

[4] P. Gomez, R. Ratcliff, and M. Perea. Diffusion model of the go/no-go task. Journal of Experimental Psychology: General, 136, 2007.

[5] T. L. Graves, P. L. Speckman, and D. Sun. Improved mixing in MCMC algorithms for linear models. Submitted.

[6] R. V. Hogg and A. T. Craig. Introduction to Mathematical Statistics. Macmillan, New York, 1978.

[7] H. Jeffreys. Theory of Probability (3rd edition). Oxford University Press, New York, 1961.

[8] J. S. Liu and C. Sabatti. Generalised Gibbs sampler and multigrid Monte Carlo for Bayesian computation. Biometrika, 87, 2000.

[9] J. Lu. Bayesian Hierarchical Models and Applications in Psychology Research. Unpublished doctoral dissertation, University of Missouri-Columbia, 2004.

[10] J. N. Rouder and J. Lu. An introduction to Bayesian hierarchical models with an application in the theory of signal detection. Psychonomic Bulletin and Review, 12, 2005.

[11] J. N. Rouder, J. Lu, P. L. Speckman, D. Sun, and Y. Jiang. A hierarchical model for estimating response time distributions. Psychonomic Bulletin and Review, 12, 2005.

[12] J. N. Rouder and P. L. Speckman. An evaluation of the Vincentizing method of forming group-level response time distributions. Psychonomic Bulletin and Review, 11, 2004.

[13] J. N. Rouder, D. Sun, P. L. Speckman, J. Lu, and D. Zhou. A hierarchical Bayesian statistical framework for response time distributions. Psychometrika, 68, 2003.

[14] J. N. Rouder, F. Tuerlinckx, P. L. Speckman, J. Lu, and P. Gomez. A hierarchical approach for fitting curves to response time measurements. Submitted.
9 [15] Martin A. Tanner. Tools for statistical inference: Methods for the exploration of posterior distributions and likelihood functions. Springer Berlin
More informationBayesian Inference for DSGE Models. Lawrence J. Christiano
Bayesian Inference for DSGE Models Lawrence J. Christiano Outline State space-observer form. convenient for model estimation and many other things. Bayesian inference Bayes rule. Monte Carlo integation.
More informationAccounting for Complex Sample Designs via Mixture Models
Accounting for Complex Sample Designs via Finite Normal Mixture Models 1 1 University of Michigan School of Public Health August 2009 Talk Outline 1 2 Accommodating Sampling Weights in Mixture Models 3
More informationPart 8: GLMs and Hierarchical LMs and GLMs
Part 8: GLMs and Hierarchical LMs and GLMs 1 Example: Song sparrow reproductive success Arcese et al., (1992) provide data on a sample from a population of 52 female song sparrows studied over the course
More informationPractical Bayesian Quantile Regression. Keming Yu University of Plymouth, UK
Practical Bayesian Quantile Regression Keming Yu University of Plymouth, UK (kyu@plymouth.ac.uk) A brief summary of some recent work of us (Keming Yu, Rana Moyeed and Julian Stander). Summary We develops
More informationDynamic models. Dependent data The AR(p) model The MA(q) model Hidden Markov models. 6 Dynamic models
6 Dependent data The AR(p) model The MA(q) model Hidden Markov models Dependent data Dependent data Huge portion of real-life data involving dependent datapoints Example (Capture-recapture) capture histories
More informationPrior Distributions for Variance Parameters in Hierarchical Models. Andrew Gelman. EERI Research Paper Series No 6/2004
EERI Economics and Econometrics Research Institute Prior Distributions for Variance Parameters in Hierarchical Models Andrew Gelman EERI Research Paper Series No 6/2004 Copyright 2004 by Andrew Gelman
More informationReducing The Computational Cost of Bayesian Indoor Positioning Systems
Reducing The Computational Cost of Bayesian Indoor Positioning Systems Konstantinos Kleisouris, Richard P. Martin Computer Science Department Rutgers University WINLAB Research Review May 15 th, 2006 Motivation
More informationChoosing the Summary Statistics and the Acceptance Rate in Approximate Bayesian Computation
Choosing the Summary Statistics and the Acceptance Rate in Approximate Bayesian Computation COMPSTAT 2010 Revised version; August 13, 2010 Michael G.B. Blum 1 Laboratoire TIMC-IMAG, CNRS, UJF Grenoble
More informationA Bayesian Approach to Phylogenetics
A Bayesian Approach to Phylogenetics Niklas Wahlberg Based largely on slides by Paul Lewis (www.eeb.uconn.edu) An Introduction to Bayesian Phylogenetics Bayesian inference in general Markov chain Monte
More informationMarkov Chain Monte Carlo
1 Motivation 1.1 Bayesian Learning Markov Chain Monte Carlo Yale Chang In Bayesian learning, given data X, we make assumptions on the generative process of X by introducing hidden variables Z: p(z): prior
More informationBayesian inference for factor scores
Bayesian inference for factor scores Murray Aitkin and Irit Aitkin School of Mathematics and Statistics University of Newcastle UK October, 3 Abstract Bayesian inference for the parameters of the factor
More informationDeblurring Jupiter (sampling in GLIP faster than regularized inversion) Colin Fox Richard A. Norton, J.
Deblurring Jupiter (sampling in GLIP faster than regularized inversion) Colin Fox fox@physics.otago.ac.nz Richard A. Norton, J. Andrés Christen Topics... Backstory (?) Sampling in linear-gaussian hierarchical
More informationF denotes cumulative density. denotes probability density function; (.)
BAYESIAN ANALYSIS: FOREWORDS Notation. System means the real thing and a model is an assumed mathematical form for the system.. he probability model class M contains the set of the all admissible models
More informationBayesian time series classification
Bayesian time series classification Peter Sykacek Department of Engineering Science University of Oxford Oxford, OX 3PJ, UK psyk@robots.ox.ac.uk Stephen Roberts Department of Engineering Science University
More informationAdvances and Applications in Perfect Sampling
and Applications in Perfect Sampling Ph.D. Dissertation Defense Ulrike Schneider advisor: Jem Corcoran May 8, 2003 Department of Applied Mathematics University of Colorado Outline Introduction (1) MCMC
More informationStable Limit Laws for Marginal Probabilities from MCMC Streams: Acceleration of Convergence
Stable Limit Laws for Marginal Probabilities from MCMC Streams: Acceleration of Convergence Robert L. Wolpert Institute of Statistics and Decision Sciences Duke University, Durham NC 778-5 - Revised April,
More informationMetropolis-Hastings Algorithm
Strength of the Gibbs sampler Metropolis-Hastings Algorithm Easy algorithm to think about. Exploits the factorization properties of the joint probability distribution. No difficult choices to be made to
More informationLabor-Supply Shifts and Economic Fluctuations. Technical Appendix
Labor-Supply Shifts and Economic Fluctuations Technical Appendix Yongsung Chang Department of Economics University of Pennsylvania Frank Schorfheide Department of Economics University of Pennsylvania January
More informationSpatial Statistics Chapter 4 Basics of Bayesian Inference and Computation
Spatial Statistics Chapter 4 Basics of Bayesian Inference and Computation So far we have discussed types of spatial data, some basic modeling frameworks and exploratory techniques. We have not discussed
More informationBayesian Modeling of Accelerated Life Tests with Random Effects
Bayesian Modeling of Accelerated Life Tests with Random Effects Ramón V. León Avery J. Ashby Jayanth Thyagarajan Joint Statistical Meeting August, 00 Toronto, Canada Abstract We show how to use Bayesian
More informationNonparametric Drift Estimation for Stochastic Differential Equations
Nonparametric Drift Estimation for Stochastic Differential Equations Gareth Roberts 1 Department of Statistics University of Warwick Brazilian Bayesian meeting, March 2010 Joint work with O. Papaspiliopoulos,
More informationSession 3A: Markov chain Monte Carlo (MCMC)
Session 3A: Markov chain Monte Carlo (MCMC) John Geweke Bayesian Econometrics and its Applications August 15, 2012 ohn Geweke Bayesian Econometrics and its Session Applications 3A: Markov () chain Monte
More information