XXV ENCONTRO BRASILEIRO DE ECONOMETRIA Porto Seguro - BA, 2003 REVISITING DISTRIBUTED LAG MODELS THROUGH A BAYESIAN PERSPECTIVE
1 Revisiting Distributed Lag Models through a Bayesian Perspective
Romy R. Ravines, Alexandra M. Schmidt and Helio S. Migon
Instituto de Matemática, Universidade Federal do Rio de Janeiro.
This work is part of the first author's PhD studies, supervised by H. S. Migon and A. M. Schmidt. R. R. Ravines is grateful to CAPES for the financial support during her PhD studies.
2 Outline
Review some distributed lag models.
Write them as a particular class of Bayesian dynamic models: the transfer function models.
Perform inference following the Bayesian paradigm, making use of Markov chain Monte Carlo (MCMC) methods. Computation is carried out with the software WinBUGS.
An example: the consumption function is analyzed using two different distributed lag models, Koyck's and Solow's.
3 Distributed-lag Models
4 Distributed-lag Models
The general form of a linear distributed-lag model is
Y_t = \sum_{i=0}^{\infty} \beta_i X_{t-i} + \epsilon_t, (1)
where any change in X_t will affect E[Y_t] in all later periods. The term \beta_i in (1) is the i-th reaction coefficient, and it is usually assumed that \lim_{i \to \infty} \beta_i = 0 and \sum_{i=0}^{\infty} \beta_i = \beta < \infty.
One important aspect to be considered is the number of parameters involved in these distributed lag models. In order to be parsimonious, it is assumed that the coefficients of the lagged variables are functionally related.
6 The Koyck Distributed Lags
\beta_i = \alpha \lambda^i, \forall i, with 0 < \lambda < 1. (2)
Then,
Y_t = \alpha X_t + \alpha \lambda X_{t-1} + \alpha \lambda^2 X_{t-2} + \dots + \epsilon_t. (3)
The Solow Distributed Lags
\beta_i = \alpha \binom{r+i-1}{i} (1-\lambda)^r \lambda^i, 0 < \lambda < 1, \forall i, r > 0. (4)
The Almon Distributed Lags
\beta_i = \alpha_0 + \alpha_1 i + \alpha_2 i^2 + \dots + \alpha_p i^p = \sum_{k=0}^{p} \alpha_k i^k. (5)
9 Figure 1: Examples of different distributed lag models: (a) Koyck, (b) Solow, (c) Almon.
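For concreteness, the three weight schemes can be evaluated directly. The following Python sketch (ours, with illustrative function names; the presentation itself uses WinBUGS) computes the reaction coefficients in (2), (4) and (5):

```python
from math import comb

def koyck_weights(alpha, lam, m):
    # Koyck, eq. (2): beta_i = alpha * lambda**i
    return [alpha * lam**i for i in range(m + 1)]

def solow_weights(alpha, lam, r, m):
    # Solow, eq. (4): beta_i = alpha * C(r+i-1, i) * (1-lambda)**r * lambda**i
    return [alpha * comb(r + i - 1, i) * (1 - lam)**r * lam**i
            for i in range(m + 1)]

def almon_weights(coeffs, m):
    # Almon, eq. (5): beta_i = sum_k alpha_k * i**k (polynomial in the lag i)
    return [sum(a * i**k for k, a in enumerate(coeffs)) for i in range(m + 1)]

kw = koyck_weights(0.9, 0.6, 20)     # geometric decay
sw = solow_weights(0.9, 0.6, 1, 20)  # r = 1 gives a rescaled Koyck scheme
aw = almon_weights([1.0, -0.1], 3)   # linear polynomial in the lag
```

With r = 1 the Solow weights equal (1 − λ) times the Koyck weights, so the Koyck scheme is the special case r = 1 of the Solow family.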
10 Bayesian Dynamic Models
11 Bayesian Dynamic Models
The general dynamic model is defined by the quadruple \{F(\cdot), G(\cdot), V, W\}_t. F(\cdot) and G(\cdot) are general smooth functions defining the mean of the response variable and the evolution of the state parameters; V and W represent the variances. For each t, the univariate Dynamic Linear Model (DLM) is
Observation equation: Y_t = F_t' \theta_t + \epsilon_t, \epsilon_t \sim N(0, V_t)
System equation: \theta_t = G_t \theta_{t-1} + \omega_t, \omega_t \sim N(0, W_t) (6)
Initial information: (\theta_0 | D_0) \sim N[m_0, C_0]
Extensions to non-linear (unknown G) and non-normal Bayesian models are easily introduced.
The distributed lag models can be seen as particular cases of the DLM. Hence, they can be expressed in the form of (6).
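As an illustration (ours, not from the slides), the pair of equations in (6) can be forward-simulated in the scalar case, holding F, G, V and W fixed over time:

```python
import random

def simulate_dlm(F, G, V, W, m0, T, seed=0):
    """Forward-simulate a scalar DLM:
    Y_t     = F * theta_t     + eps_t,  eps_t ~ N(0, V)  (observation)
    theta_t = G * theta_{t-1} + w_t,    w_t   ~ N(0, W)  (system)
    starting from theta_0 = m0."""
    rng = random.Random(seed)
    theta, thetas, y = m0, [], []
    for _ in range(T):
        theta = G * theta + rng.gauss(0.0, W ** 0.5)    # system equation
        thetas.append(theta)
        y.append(F * theta + rng.gauss(0.0, V ** 0.5))  # observation equation
    return thetas, y

thetas, y = simulate_dlm(F=1.0, G=0.95, V=0.5, W=0.1, m0=0.0, T=100)
```

Setting V = W = 0 collapses the simulation to the deterministic skeleton Y_t = F G^t m_0, a quick sanity check on the recursion.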
13 Form-free transfer functions
E(Y_t | \theta_t) = \sum_{i=0}^{m} \beta_i X_{t-i} = \beta_0 X_t + \beta_1 X_{t-1} + \dots + \beta_m X_{t-m}. (7)
The transfer response function of X is \beta_i X for i = 0, 1, \dots, m, and 0 for i > m.
Functional-form transfer functions
Y_t = F' \theta_t + \epsilon_t (8a)
\theta_t = G \theta_{t-1} + \psi_t X_t + \partial\theta_t (8b)
\psi_t = \psi_{t-1} + \partial\psi_t (8c)
The transfer function model (8) can be written in the standard DLM form:
Y_t = \tilde{F}' \tilde{\theta}_t + \epsilon_t, \epsilon_t \sim N(0, \sigma_\epsilon^2)
\tilde{\theta}_t = \tilde{G}_t \tilde{\theta}_{t-1} + \omega_t, \omega_t \sim N(0, \sigma_\omega^2 I). (9)
15 Bayesian Dynamic Models and Distributed Lag Models
16 Bayesian Dynamic Models and Distributed Lag Models
The Koyck distributed lag model can be rewritten as
Y_t = E_t + \epsilon_t (10a)
E_t = \lambda E_{t-1} + \alpha X_t (10b)
where E_t = \alpha X_t + \alpha \lambda X_{t-1} + \alpha \lambda^2 X_{t-2} + \dots and 0 < \lambda < 1. In this model, the transfer response function of X is simply \alpha \lambda^i X. Here, in the representation (8), we have n = 1, \theta_t = E_t (the effect variable), \psi_t = \alpha (the current effect, for all t), F = 1, G = \lambda, and the noise term \partial\theta_t is assumed to be zero.
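The equivalence between the recursion (10b) and the expanded sum for E_t can be checked numerically; a small Python sketch (illustrative, with pre-sample effects set to zero):

```python
def effect_recursive(x, lam, alpha):
    # E_t = lam * E_{t-1} + alpha * X_t, eq. (10b), with E_{-1} = 0
    E, prev = [], 0.0
    for xt in x:
        prev = lam * prev + alpha * xt
        E.append(prev)
    return E

def effect_expanded(x, lam, alpha, t):
    # E_t = alpha*X_t + alpha*lam*X_{t-1} + ..., truncated at lag t
    return sum(alpha * lam**i * x[t - i] for i in range(t + 1))

x = [1.0, 2.0, 1.5, 3.0, 2.5]
E = effect_recursive(x, lam=0.6, alpha=0.8)
# Both routes give the same effect variable at every t.
```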
17 Bayesian Dynamic Models and Distributed Lag Models
The Solow distributed lag model can be expressed in the form of (9). In this case we have (1 - \lambda L)^r E_t = \alpha (1 - \lambda)^r X_t. An evolution equation can be assigned for \lambda and r.
Also, the Almon model can be rewritten as a form-free transfer function (7), since it is a regression on a fixed and finite number of lagged variables. Using (5) to define the coefficients of the lagged variables, the transfer response function of X_{t-i} in the mean of Y_t is \sum_{k=0}^{p} \alpha_k i^k X_{t-i} for i = 0, 1, \dots, m, and 0 for i > m.
19 Inference Procedure
20 Inference Procedure
From Bayes' theorem: \pi(\theta | x) \propto l(x | \theta) \pi(\theta).
Bayesian inference has experienced great development since the early 1990s, due to the introduction of Markov chain Monte Carlo (MCMC) methods. One of the most popular and easiest to implement is the Gibbs sampler. For highly structured (complex) models, we usually have to write our own program.
This task has been simplified since the introduction of the software BUGS (Bayesian inference Using Gibbs Sampling). BUGS was developed in the MRC Biostatistics Unit and is available free of charge.
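As a minimal illustration of the Gibbs sampler (a toy normal model of our own choosing, not one of the presentation's models), the sketch below alternates draws from the two full conditionals of a mean and a precision under conjugate priors:

```python
import random

def gibbs_normal(data, n_iter=2000, seed=1):
    """Two-block Gibbs sampler for (mu, tau) in X_i ~ N(mu, 1/tau),
    with priors mu ~ N(0, 10^2) and tau ~ Gamma(0.1, 0.1)."""
    rng = random.Random(seed)
    n, xbar = len(data), sum(data) / len(data)
    mu, tau, draws = 0.0, 1.0, []
    for _ in range(n_iter):
        # Full conditional of mu given tau: normal
        prec = n * tau + 1.0 / 100.0
        mu = rng.gauss(n * tau * xbar / prec, prec ** -0.5)
        # Full conditional of tau given mu: gamma
        rate = 0.1 + 0.5 * sum((xi - mu) ** 2 for xi in data)
        tau = rng.gammavariate(0.1 + n / 2.0, 1.0 / rate)  # scale = 1/rate
        draws.append((mu, tau))
    return draws

data = [4.1, 3.8, 5.2, 4.7, 4.0, 4.4, 4.9, 3.6]
draws = gibbs_normal(data)
post_mu = sum(m for m, _ in draws[500:]) / len(draws[500:])  # discard burn-in
```

This is exactly what BUGS automates: it derives the full conditionals from the model specification and cycles through them.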
22 Application: Consumption Function Estimation
23 Application: Consumption Function Estimation
Zellner and Geisel (1970) proposed this consumption function:
Y_t = k X_t^* + \epsilon_t (11)
where, for the t-th period, t = 1, 2, \dots, T, Y_t is measured real consumption, X_t^* is normal real income, k is a parameter whose value is unknown, and \epsilon_t is an error term, or transitory consumption. Assuming that normal income satisfies
X_t^* - X_{t-1}^* = (1 - \lambda)(X_t - X_{t-1}^*),
where the parameter \lambda is such that 0 < \lambda < 1, the model becomes:
Y_t = k(1 - \lambda)(X_t + \lambda X_{t-1} + \lambda^2 X_{t-2} + \dots) + \epsilon_t (12)
24 The transfer function model is:
Y_t = E_t + \epsilon_t
E_t = \lambda E_{t-1} + \psi X_t (13)
where \psi = k(1 - \lambda) and E_t = \psi X_t + \lambda \psi X_{t-1} + \lambda^2 \psi X_{t-2} + \dots
Several assumptions can be made about the serial correlation properties of the residual \epsilon_t in (13):
Model I: \epsilon_t \sim_{ind} N(0, \sigma^2);
Model II: \epsilon_t = \rho \epsilon_{t-1} + \nu_t, with \nu_t \sim_{ind} N(0, \sigma^2).
A priori we set \lambda \sim Be(1, 1), k \sim Be(1, 1), \sigma^2 \sim IG(0.0001, 0.0001) and, for Model II, \rho \sim N(0, 100).
Similarly to Zellner and Geisel (1970), we use the U.S. quarterly price-deflated, seasonally adjusted data on personal disposable income and personal consumption expenditure, 1947.I–1960.IV.
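A short Python sketch (ours, not the authors' code) generating artificial data from the transfer function model (13) under Model II's AR(1) errors, useful for checking an estimation routine against known parameter values:

```python
import random

def simulate_consumption(x, k, lam, rho, sigma, seed=2):
    """Simulate Y_t = E_t + eps_t with E_t = lam*E_{t-1} + psi*X_t,
    psi = k*(1 - lam), and AR(1) errors eps_t = rho*eps_{t-1} + nu_t."""
    rng = random.Random(seed)
    psi = k * (1.0 - lam)
    E, eps, y = 0.0, 0.0, []
    for xt in x:
        E = lam * E + psi * xt                    # transfer recursion (13)
        eps = rho * eps + rng.gauss(0.0, sigma)   # Model II error
        y.append(E + eps)
    return y

income = [100.0 + t for t in range(60)]
y = simulate_consumption(income, k=0.9, lam=0.7, rho=0.3, sigma=1.0)
```

Setting rho = 0 recovers Model I's independent errors as a special case.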
28 Example of the code used in WinBUGS
model { # (Y = consumption, X = income)
  for (t in 2:T) {
    mean.y[t] <- lambda * Y[t-1] + k * (1 - lambda) * X[t]
    Y[t] ~ dnorm(mean.y[t], tau.v)
  }
  # Priors
  lambda ~ dbeta(1, 1)
  k ~ dbeta(1, 1)
  tau.v ~ dgamma(0.0001, 0.0001)
  var.v <- 1 / tau.v
}
29 Figure 2: The WinBUGS interface.
30 Table 1: Posterior summaries associated with the parameters in Model I: mean, median and standard deviation of k and \lambda for each approach. ZG = Zellner & Geisel; KT = Koyck's transformation; TF = transfer function.
Figure 3: Posterior densities of (a) k and (b) \lambda in Model I, using TF.
31 Table 2: Posterior summaries associated with the parameters in Model II: mean, median and standard deviation of k, \lambda and \rho for each approach. ZG = Zellner & Geisel; KT = Koyck's transformation; TF = transfer function.
Figure 4: Posterior densities of (a) k and (b) \lambda in Model II, using TF.
32 Comparing Koyck's models
33 Comparing Koyck's models
The Deviance Information Criterion (DIC) can be used to assess model complexity and to compare different models. The DIC is given by
DIC = \bar{D} + p_D = D(\bar{\theta}) + 2 p_D. (14)
WinBUGS version 1.4 computes the DIC automatically. The model with the smallest DIC is considered to be the best one.
Table 3: Model comparison through the DIC: deviance and DIC for the Koyck transformation and the transfer function versions of Models I and II.
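The quantities in (14) can be sketched directly from MCMC output. The Python example below (ours; it uses an i.i.d. normal likelihood and made-up posterior draws purely for illustration, whereas WinBUGS reports these quantities automatically) computes \bar{D}, p_D and the DIC:

```python
from math import log, pi

def deviance(data, mu, tau):
    # D(theta) = -2 * log-likelihood of i.i.d. N(mu, 1/tau) observations
    n = len(data)
    ss = sum((x - mu) ** 2 for x in data)
    return -2.0 * (0.5 * n * (log(tau) - log(2.0 * pi)) - 0.5 * tau * ss)

def dic(data, posterior_draws):
    """DIC = Dbar + pD, with pD = Dbar - D(posterior means), eq. (14)."""
    devs = [deviance(data, mu, tau) for mu, tau in posterior_draws]
    dbar = sum(devs) / len(devs)                 # posterior mean deviance
    mu_hat = sum(m for m, _ in posterior_draws) / len(posterior_draws)
    tau_hat = sum(t for _, t in posterior_draws) / len(posterior_draws)
    pd = dbar - deviance(data, mu_hat, tau_hat)  # effective number of parameters
    return dbar + pd, pd

data = [4.1, 3.8, 5.2, 4.7]
posterior_draws = [(4.3, 3.5), (4.5, 3.0), (4.2, 4.0)]  # stand-in MCMC output
dic_value, pd = dic(data, posterior_draws)
```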
35 We use the same dataset to estimate some particular Solow models.
First, we fitted three Solow models with r fixed at 1, 2 and 3 in (4), assuming \epsilon_t \sim_{ind} N(0, \sigma^2) in (11).
Then, we allowed r to vary in the discrete set \{1, 2, 3\}. We used a categorical prior distribution for r, that is, r \sim Categorical(p) with p = (0.2, 0.3, 0.5), and started the chain with r^{(0)} = 3.
Table 4: Posterior summaries (mean, median and standard deviation of k and \lambda) and DIC associated with the Solow models, for r = 1, r = 2, r = 3 and r \in \{1, 2, 3\}.
38 Figure 5: Posterior distributions of (a) k, (b) \lambda, and the posterior probabilities (c) Prob(r = 1 | y), (d) Prob(r = 2 | y) and (e) Prob(r = 3 | y), in the Solow model with a categorical prior for r.
39 Discussion and Future work
40 Discussion and Future work
We discussed here the inference of distributed lag models under a Bayesian framework.
Inference for distributed lag models can be performed using MCMC algorithms, implemented in the software WinBUGS.
More complex models can be analyzed under the Bayesian framework: Migon (1985), Alves, Gamerman and Ferreira (2003).
An important theoretical issue, which is currently under research, is how to choose the form of the transfer function.
More informationStatistical Inference for MIC Data
Statistical Inference for MIC Data Prepared for: Nancy de With The British Columbia Ministry of Agriculture and Lands Submitted by: Huanhuan Wu Carl Schwarz Tom Loughin hwu1@sfu.ca cschwarz@stat.sfu.ca
More informationProbabilities for climate projections
Probabilities for climate projections Claudia Tebaldi, Reinhard Furrer Linda Mearns, Doug Nychka National Center for Atmospheric Research Richard Smith - UNC-Chapel Hill Steve Sain - CU-Denver Statistical
More informationTime-series small area estimation for unemployment based on a rotating panel survey
Discussion Paper Time-series small area estimation for unemployment based on a rotating panel survey The views expressed in this paper are those of the author and do not necessarily relect the policies
More informationLinear Regression. Data Model. β, σ 2. Process Model. ,V β. ,s 2. s 1. Parameter Model
Regression: Part II Linear Regression y~n X, 2 X Y Data Model β, σ 2 Process Model Β 0,V β s 1,s 2 Parameter Model Assumptions of Linear Model Homoskedasticity No error in X variables Error in Y variables
More informationMultivariate Normal & Wishart
Multivariate Normal & Wishart Hoff Chapter 7 October 21, 2010 Reading Comprehesion Example Twenty-two children are given a reading comprehsion test before and after receiving a particular instruction method.
More informationStat 451 Lecture Notes Markov Chain Monte Carlo. Ryan Martin UIC
Stat 451 Lecture Notes 07 12 Markov Chain Monte Carlo Ryan Martin UIC www.math.uic.edu/~rgmartin 1 Based on Chapters 8 9 in Givens & Hoeting, Chapters 25 27 in Lange 2 Updated: April 4, 2016 1 / 42 Outline
More informationLecture 13 Fundamentals of Bayesian Inference
Lecture 13 Fundamentals of Bayesian Inference Dennis Sun Stats 253 August 11, 2014 Outline of Lecture 1 Bayesian Models 2 Modeling Correlations Using Bayes 3 The Universal Algorithm 4 BUGS 5 Wrapping Up
More informationMarkov chain Monte Carlo
1 / 26 Markov chain Monte Carlo Timothy Hanson 1 and Alejandro Jara 2 1 Division of Biostatistics, University of Minnesota, USA 2 Department of Statistics, Universidad de Concepción, Chile IAP-Workshop
More informationConstruction of an Informative Hierarchical Prior Distribution: Application to Electricity Load Forecasting
Construction of an Informative Hierarchical Prior Distribution: Application to Electricity Load Forecasting Anne Philippe Laboratoire de Mathématiques Jean Leray Université de Nantes Workshop EDF-INRIA,
More informationBeyond MCMC in fitting complex Bayesian models: The INLA method
Beyond MCMC in fitting complex Bayesian models: The INLA method Valeska Andreozzi Centre of Statistics and Applications of Lisbon University (valeska.andreozzi at fc.ul.pt) European Congress of Epidemiology
More informationST440/540: Applied Bayesian Statistics. (9) Model selection and goodness-of-fit checks
(9) Model selection and goodness-of-fit checks Objectives In this module we will study methods for model comparisons and checking for model adequacy For model comparisons there are a finite number of candidate
More informationComputer intensive statistical methods
Lecture 11 Markov Chain Monte Carlo cont. October 6, 2015 Jonas Wallin jonwal@chalmers.se Chalmers, Gothenburg university The two stage Gibbs sampler If the conditional distributions are easy to sample
More informationItem Parameter Calibration of LSAT Items Using MCMC Approximation of Bayes Posterior Distributions
R U T C O R R E S E A R C H R E P O R T Item Parameter Calibration of LSAT Items Using MCMC Approximation of Bayes Posterior Distributions Douglas H. Jones a Mikhail Nediak b RRR 7-2, February, 2! " ##$%#&
More informationApproximate Bayesian Computation: a simulation based approach to inference
Approximate Bayesian Computation: a simulation based approach to inference Richard Wilkinson Simon Tavaré 2 Department of Probability and Statistics University of Sheffield 2 Department of Applied Mathematics
More informationBayesian Estimation of DSGE Models 1 Chapter 3: A Crash Course in Bayesian Inference
1 The views expressed in this paper are those of the authors and do not necessarily reflect the views of the Federal Reserve Board of Governors or the Federal Reserve System. Bayesian Estimation of DSGE
More informationEstimating and Accounting for the Output Gap with Large Bayesian Vector Autoregressions
Estimating and Accounting for the Output Gap with Large Bayesian Vector Autoregressions James Morley 1 Benjamin Wong 2 1 University of Sydney 2 Reserve Bank of New Zealand The view do not necessarily represent
More informationIntroduction to Machine Learning
Introduction to Machine Learning Brown University CSCI 1950-F, Spring 2012 Prof. Erik Sudderth Lecture 25: Markov Chain Monte Carlo (MCMC) Course Review and Advanced Topics Many figures courtesy Kevin
More informationDAG models and Markov Chain Monte Carlo methods a short overview
DAG models and Markov Chain Monte Carlo methods a short overview Søren Højsgaard Institute of Genetics and Biotechnology University of Aarhus August 18, 2008 Printed: August 18, 2008 File: DAGMC-Lecture.tex
More informationBayesian Inference and MCMC
Bayesian Inference and MCMC Aryan Arbabi Partly based on MCMC slides from CSC412 Fall 2018 1 / 18 Bayesian Inference - Motivation Consider we have a data set D = {x 1,..., x n }. E.g each x i can be the
More information1.1. VARs, Wold representations and their limits
1. Shocks Nr. 1 1.1. VARs, Wold representations and their limits A brief review of VARs. Assume a true model, in MA form: X = A 0 e + A 1 e( 1) + A 2 e( 2) +...; E(ee ) = I = (A 0 + A 1 L + A 2 L 2 +...)
More informationSequential Bayesian Updating
BS2 Statistical Inference, Lectures 14 and 15, Hilary Term 2009 May 28, 2009 We consider data arriving sequentially X 1,..., X n,... and wish to update inference on an unknown parameter θ online. In a
More informationAdvances and Applications in Perfect Sampling
and Applications in Perfect Sampling Ph.D. Dissertation Defense Ulrike Schneider advisor: Jem Corcoran May 8, 2003 Department of Applied Mathematics University of Colorado Outline Introduction (1) MCMC
More informationThe Metropolis-Hastings Algorithm. June 8, 2012
The Metropolis-Hastings Algorithm June 8, 22 The Plan. Understand what a simulated distribution is 2. Understand why the Metropolis-Hastings algorithm works 3. Learn how to apply the Metropolis-Hastings
More informationBayesian model selection for computer model validation via mixture model estimation
Bayesian model selection for computer model validation via mixture model estimation Kaniav Kamary ATER, CNAM Joint work with É. Parent, P. Barbillon, M. Keller and N. Bousquet Outline Computer model validation
More informationPrinciples of Bayesian Inference
Principles of Bayesian Inference Sudipto Banerjee University of Minnesota July 20th, 2008 1 Bayesian Principles Classical statistics: model parameters are fixed and unknown. A Bayesian thinks of parameters
More informationPrinciples of Bayesian Inference
Principles of Bayesian Inference Sudipto Banerjee and Andrew O. Finley 2 Biostatistics, School of Public Health, University of Minnesota, Minneapolis, Minnesota, U.S.A. 2 Department of Forestry & Department
More information