ESTIMATION of a DSGE MODEL

1 ESTIMATION of a DSGE MODEL
Paris, October
STÉPHANE ADJEMIAN (stephane.adjemian@ens.fr)
UNIVERSITÉ DU MAINE & CEPREMAP

2 Bayesian approach
A bridge between calibration and maximum likelihood.
Uncertainty and a priori knowledge about the model and its parameters are described by prior probabilities.
Confrontation with the data, through the likelihood function, leads to a revision of these probabilities (the posterior probabilities).
Testing and model comparison are done by comparing posterior probabilities (over models).

3 Prior density
The prior density describes a priori beliefs, before observing the data:
$p(\theta_A \mid A)$
$A$ stands for a specific model and $\theta_A$ represents the parameters of model $A$.
$p(\cdot)$: normal, gamma, shifted gamma, inverse gamma, beta, generalized beta, uniform, ...
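
To fix ideas, here is a minimal sketch (not from the slides) of how such a joint prior can be coded with scipy.stats; the distribution choices and hyperparameter values below are purely illustrative assumptions, not the priors used later in the deck.

import numpy as np
from scipy import stats

# Illustrative prior choices (hypothetical hyperparameters).
priors = {
    "rho":    stats.beta(a=2.0, b=2.0),           # persistence, support (0, 1)
    "gampie": stats.norm(loc=1.5, scale=0.25),    # Taylor-rule inflation response
    "sig_e":  stats.invgamma(a=3.0, scale=0.02),  # shock standard deviation
}

# Joint log prior density at a candidate parameter vector (independent priors).
theta = {"rho": 0.8, "gampie": 1.6, "sig_e": 0.01}
log_prior = sum(priors[name].logpdf(theta[name]) for name in priors)
print(log_prior)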

4 Likelihood function
The likelihood describes the density of the observed data, given the model and its parameters:
$\mathcal{L}(\theta_A \mid Y_T, A) \equiv p(Y_T \mid \theta_A, A)$
where $Y_T$ collects the observations up to period $T$. In our case, the likelihood is recursive:
$p(Y_T \mid \theta_A, A) = p(y_0 \mid \theta_A, A) \prod_{t=1}^{T} p(y_t \mid Y_{t-1}, \theta_A, A)$
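
A minimal sketch of this prediction-error factorization, for a Gaussian AR(1) that is not part of the slides: the log-likelihood is the sum of the log conditional densities $\log p(y_t \mid Y_{t-1})$, plus the density of the initial observation.

import numpy as np

def ar1_loglik(y, rho, sigma2):
    # p(y_0): unconditional distribution N(0, sigma2 / (1 - rho^2))
    var0 = sigma2 / (1.0 - rho**2)
    ll = -0.5 * (np.log(2 * np.pi * var0) + y[0] ** 2 / var0)
    # p(y_t | y_{t-1}) = N(rho * y_{t-1}, sigma2), accumulated over t = 1,...,T
    for t in range(1, len(y)):
        resid = y[t] - rho * y[t - 1]
        ll += -0.5 * (np.log(2 * np.pi * sigma2) + resid ** 2 / sigma2)
    return ll

rng = np.random.default_rng(0)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.7 * y[t - 1] + rng.standard_normal()
print(ar1_loglik(y, 0.7, 1.0))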

5 Bayes theorem
We have a prior density $p(\theta)$ and the likelihood $p(Y_T \mid \theta)$. We are interested in $p(\theta \mid Y_T)$, the posterior density. Using Bayes' theorem twice, we obtain the density of the parameters knowing the data. We have
$p(\theta \mid Y_T) = \frac{p(\theta; Y_T)}{p(Y_T)}$
and also
$p(Y_T \mid \theta) = \frac{p(\theta; Y_T)}{p(\theta)} \;\Longrightarrow\; p(\theta; Y_T) = p(Y_T \mid \theta)\, p(\theta)$

6 Posterior kernel and density
By substitution, we get the posterior density:
$p(\theta_A \mid Y_T, A) = \frac{p(Y_T \mid \theta_A, A)\, p(\theta_A \mid A)}{p(Y_T \mid A)}$
where $p(Y_T \mid A)$ is the marginal density of the data conditional on the model:
$p(Y_T \mid A) = \int_{\Theta_A} p(\theta_A; Y_T \mid A)\, \mathrm{d}\theta_A$
The posterior kernel (the un-normalized posterior density):
$p(\theta_A \mid Y_T, A) \propto p(Y_T \mid \theta_A, A)\, p(\theta_A \mid A) \equiv \mathcal{K}(\theta_A \mid Y_T, A)$
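
A minimal numerical illustration (not in the slides) of these objects for a one-parameter normal-normal example: on a grid, the kernel is the prior times the likelihood, the marginal density is its integral, and dividing by that integral gives the normalized posterior. The observations and the prior below are made up.

import numpy as np
from scipy import stats

y = np.array([0.8, 1.3, 0.5, 1.1])              # made-up observations
grid = np.linspace(-2.0, 4.0, 2001)             # grid for the parameter theta

kernel = stats.norm(0.0, 1.0).pdf(grid)         # prior p(theta)
for obs in y:
    kernel *= stats.norm(grid, 1.0).pdf(obs)    # times likelihood p(y_t | theta)

marginal = kernel.sum() * (grid[1] - grid[0])   # p(Y_T), crude quadrature
posterior = kernel / marginal                   # normalized posterior density
print(marginal, grid[np.argmax(posterior)])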

7 A simple example (I)
Data generating process: $y_t = \mu + \varepsilon_t$, $t = 1, \dots, T$, where $\varepsilon_t \sim \mathcal{N}(0, 1)$ is a Gaussian white noise.
The likelihood is given by
$p(Y_T \mid \mu) = (2\pi)^{-\frac{T}{2}}\, e^{-\frac{1}{2}\sum_{t=1}^{T}(y_t - \mu)^2}$
and the maximum likelihood estimator is the sample mean:
$\widehat{\mu}_{ML,T} = \frac{1}{T}\sum_{t=1}^{T} y_t \equiv \overline{y}$

8 A simple example (II)
Note that the variance of this estimator is a simple function of the sample size: $\mathbb{V}[\widehat{\mu}_{ML,T}] = \frac{1}{T}$.
Let our prior be a Gaussian distribution with expectation $\mu_0$ and variance $\sigma^2_\mu$.
The posterior density is defined, up to a constant, by:
$p(\mu \mid Y_T) \propto (2\pi\sigma^2_\mu)^{-\frac{1}{2}}\, e^{-\frac{1}{2}\frac{(\mu - \mu_0)^2}{\sigma^2_\mu}} \times (2\pi)^{-\frac{T}{2}}\, e^{-\frac{1}{2}\sum_{t=1}^{T}(y_t - \mu)^2}$

9 A simple example (III)
... or, equivalently,
$p(\mu \mid Y_T) \propto e^{-\frac{1}{2}\frac{(\mu - \mathbb{E}[\mu])^2}{\mathbb{V}[\mu]}}$
so the posterior distribution is Gaussian, with
$\mathbb{V}[\mu] = \frac{1}{(1/T)^{-1} + \sigma_\mu^{-2}}$
and
$\mathbb{E}[\mu] = \frac{(1/T)^{-1}\,\widehat{\mu}_{ML,T} + \sigma_\mu^{-2}\,\mu_0}{(1/T)^{-1} + \sigma_\mu^{-2}}$

10 A simple example (The bridge, IV)
$\mathbb{E}[\mu] = \frac{(1/T)^{-1}\,\widehat{\mu}_{ML,T} + \sigma_\mu^{-2}\,\mu_0}{(1/T)^{-1} + \sigma_\mu^{-2}}$
The posterior mean is a convex combination of the prior mean and the ML estimate.
If $\sigma^2_\mu \to \infty$ (no prior information), then $\mathbb{E}[\mu] \to \widehat{\mu}_{ML,T}$.
If $\sigma^2_\mu \to 0$ (calibration), then $\mathbb{E}[\mu] \to \mu_0$.
Things are not so simple if the model is non-linear in the estimated parameters... In general we have to follow a simulation-based approach to obtain the posterior distributions and moments.
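
A minimal numerical check of these formulas, assuming the DGP of the example ($y_t = \mu + \varepsilon_t$, $\varepsilon_t \sim \mathcal{N}(0,1)$) and a $\mathcal{N}(\mu_0, \sigma^2_\mu)$ prior; the prior hyperparameters are illustrative.

import numpy as np

rng = np.random.default_rng(1)
T, mu_true = 50, 1.0
y = mu_true + rng.standard_normal(T)

mu_ml = y.mean()                                    # ML estimate, variance 1/T
mu0, sig2_mu = 0.0, 0.5                             # illustrative prior mean and variance

post_var = 1.0 / (T + 1.0 / sig2_mu)                # V[mu]
post_mean = post_var * (T * mu_ml + mu0 / sig2_mu)  # E[mu]
print(mu_ml, post_mean, post_var)

# As sig2_mu grows, post_mean approaches mu_ml; as sig2_mu shrinks to zero,
# post_mean approaches the prior mean mu0 (the calibration limit).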

11 Model comparison (I)
Suppose we have two models, A and B (with associated vectors of deep parameters $\theta_A$ and $\theta_B$), estimated on the same sample $Y_T$.
For each model $I = A, B$ we can evaluate, at least theoretically, the marginal density of the data conditional on the model:
$p(Y_T \mid I) = \int_{\Theta_I} p(\theta_I \mid I)\, p(Y_T \mid \theta_I, I)\, \mathrm{d}\theta_I$
by integrating out the deep parameters $\theta_I$ from the posterior kernel.
$p(Y_T \mid I)$ measures the fit of model $I$.

12 [Figure: the marginal data densities $p(Y_T \mid A)$ and $p(Y_T \mid B)$ of models A and B, plotted against $Y_T$.]

13 Model comparison (II)
Suppose we have a prior distribution over models: $p(A)$ and $p(B)$.
Again using Bayes' theorem, we can compute the posterior distribution over models:
$p(I \mid Y_T) = \frac{p(I)\, p(Y_T \mid I)}{\sum_{I=A,B} p(I)\, p(Y_T \mid I)}$
This formula is easily generalized to a collection of $N$ models.
Posterior odds ratio:
$\frac{p(A \mid Y_T)}{p(B \mid Y_T)} = \frac{p(A)}{p(B)} \times \frac{p(Y_T \mid A)}{p(Y_T \mid B)}$
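
A minimal sketch of this computation, assuming the log marginal data densities have already been estimated (the numerical values below are made up); working in logs avoids underflow when the densities are very small.

import numpy as np

log_mdd = {"A": -1234.2, "B": -1238.7}   # log p(Y_T | I), illustrative values
prior   = {"A": 0.5, "B": 0.5}           # prior model probabilities p(I)

logs = np.array([np.log(prior[m]) + log_mdd[m] for m in log_mdd])
logs -= logs.max()                        # rescale before exponentiating
post = np.exp(logs) / np.exp(logs).sum()  # posterior model probabilities
print(dict(zip(log_mdd, post)))

# Posterior odds ratio of A against B:
odds = np.exp((np.log(prior["A"]) + log_mdd["A"])
              - (np.log(prior["B"]) + log_mdd["B"]))
print(odds)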

14 In practice...
We may be interested in:
estimating some posterior moments;
estimating the marginal density of the data in order to compare models.
We can do this by hand for linear models (for instance VAR models) by carefully choosing the priors, but it is impossible for DSGE models. We therefore use a simulation-based approach.

15 Metropolis algorithm (I)
1. Choose a starting point $\theta^{\circ}$ and run a loop over steps 2-4.
2. Draw a proposal $\theta^{*}$ from a jumping distribution
$J(\theta^{*} \mid \theta^{t-1}) = \mathcal{N}(\theta^{t-1}, c\,\Sigma_m)$
3. Compute the acceptance ratio
$r = \frac{p(\theta^{*} \mid Y_T)}{p(\theta^{t-1} \mid Y_T)} = \frac{\mathcal{K}(\theta^{*} \mid Y_T)}{\mathcal{K}(\theta^{t-1} \mid Y_T)}$
4. Finally, set
$\theta^{t} = \begin{cases} \theta^{*} & \text{with probability } \min(r, 1) \\ \theta^{t-1} & \text{otherwise.} \end{cases}$
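
A minimal random-walk Metropolis sketch following these four steps. The target below is an arbitrary stand-in log kernel (a bivariate Gaussian); in a DSGE application the kernel would be the prior times the Kalman-filter likelihood. Note that here the scale factor c multiplies the proposal standard deviations, so the proposal covariance is c^2 * Sigma_m (the convention of Dynare's mh_jscale), whereas the slide writes the covariance as c * Sigma_m.

import numpy as np

def log_kernel(theta):
    # Stand-in for log K(theta | Y_T); replace with log prior + log likelihood.
    return -0.5 * np.sum(theta ** 2)

def metropolis(theta0, Sigma_m, c=0.5, ndraws=5000, seed=0):
    rng = np.random.default_rng(seed)
    chol = np.linalg.cholesky(c**2 * Sigma_m)      # proposal covariance c^2 * Sigma_m
    draws = np.empty((ndraws, theta0.size))
    theta, logk = theta0, log_kernel(theta0)
    for t in range(ndraws):
        prop = theta + chol @ rng.standard_normal(theta0.size)   # jumping distribution
        logk_prop = log_kernel(prop)
        if np.log(rng.uniform()) < logk_prop - logk:  # accept with probability min(r, 1)
            theta, logk = prop, logk_prop
        draws[t] = theta
    return draws

draws = metropolis(np.zeros(2), np.eye(2))
print(draws.mean(axis=0))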

16 [Figure: the posterior kernel, with the initial draw $\theta^{\circ}$ and its kernel value $\mathcal{K}(\theta^{\circ})$.]

17 [Figure: a proposal $\theta^{*}$ is accepted, so $\theta^{1} = \theta^{*}$ and $\mathcal{K}(\theta^{1}) = \mathcal{K}(\theta^{*})$.]

18 [Figure: a new proposal $\theta^{*}$ whose kernel value $\mathcal{K}(\theta^{*})$ is lower than $\mathcal{K}(\theta^{1})$, so it is accepted only with probability $r$.]

19 Metropolis algorithm (II)
How should we choose the scale factor $c$ (the variance of the jumping distribution)? The acceptance rate should be strictly positive but not too large.
How many draws? Convergence has to be assessed... Run parallel Markov chains: pooled moments have to be close to within-chain moments.
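
A minimal sketch of such a pooled-versus-within comparison across parallel chains for one scalar parameter; this is a simplified diagnostic in the spirit of Gelman and Rubin, not Dynare's exact implementation.

import numpy as np

def pooled_within_ratio(chains):
    # chains: array of shape (n_chains, n_draws) for one scalar parameter
    chains = np.asarray(chains)
    within = chains.var(axis=1, ddof=1).mean()   # average within-chain variance
    pooled = chains.reshape(-1).var(ddof=1)      # variance of the pooled draws
    return pooled / within                       # should approach 1 at convergence

rng = np.random.default_rng(0)
fake_chains = rng.standard_normal((4, 10000))    # four well-mixed artificial chains
print(pooled_within_ratio(fake_chains))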

20 Approximation of the marginal density
By definition we have:
$p(Y_T \mid A) = \int_{\Theta_A} p(Y_T \mid \theta_A, A)\, p(\theta_A \mid A)\, \mathrm{d}\theta_A$
By assuming that the posterior density is Gaussian (Laplace approximation), we have the following estimator:
$\widehat{p}(Y_T \mid A) = (2\pi)^{\frac{k}{2}}\, \left|\Sigma_{\theta^m_A}\right|^{\frac{1}{2}}\, p(Y_T \mid \theta^m_A, A)\, p(\theta^m_A \mid A)$
where $\theta^m_A$ is the posterior mode.
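
A minimal sketch of this Laplace approximation, computed in logs and assuming the posterior mode, the value of the log posterior kernel at the mode, and the Hessian of the log kernel at the mode are already available (for instance from a numerical optimizer).

import numpy as np

def laplace_log_mdd(log_kernel_at_mode, hessian_at_mode):
    # log p(Y_T | A) ~ (k/2) log(2 pi) + (1/2) log|Sigma| + log K(theta_m | Y_T),
    # where Sigma is the inverse of minus the Hessian of the log posterior kernel.
    k = hessian_at_mode.shape[0]
    Sigma = np.linalg.inv(-hessian_at_mode)
    _, logdet = np.linalg.slogdet(Sigma)
    return 0.5 * k * np.log(2 * np.pi) + 0.5 * logdet + log_kernel_at_mode

# Sanity check: for the kernel exp(-0.5 * theta'theta) in k = 2 dimensions the
# exact integral is (2 pi)^{k/2}, and the Laplace approximation is exact.
H = -np.eye(2)                 # Hessian of the log kernel at the mode theta_m = 0
print(laplace_log_mdd(0.0, H), np.log(2 * np.pi))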

21 Harmonic mean (I)
Note that
$\mathbb{E}\left[\left.\frac{f(\theta_A)}{p(\theta_A \mid A)\, p(Y_T \mid \theta_A, A)} \right| Y_T, A\right] = \int_{\Theta_A} \frac{f(\theta_A)}{p(\theta_A \mid A)\, p(Y_T \mid \theta_A, A)}\, p(\theta_A \mid Y_T, A)\, \mathrm{d}\theta_A$
where $f$ is a density function. Using the definition of the posterior density, the right-hand side may be rewritten as
$\int_{\Theta_A} \frac{f(\theta_A)}{p(\theta_A \mid A)\, p(Y_T \mid \theta_A, A)}\; \frac{p(\theta_A \mid A)\, p(Y_T \mid \theta_A, A)}{\int_{\Theta_A} p(\theta_A \mid A)\, p(Y_T \mid \theta_A, A)\, \mathrm{d}\theta_A}\, \mathrm{d}\theta_A$
Finally, we have
$\mathbb{E}\left[\left.\frac{f(\theta_A)}{p(\theta_A \mid A)\, p(Y_T \mid \theta_A, A)} \right| Y_T, A\right] = \frac{\int_{\Theta_A} f(\theta)\, \mathrm{d}\theta}{\int_{\Theta_A} p(\theta_A \mid A)\, p(Y_T \mid \theta_A, A)\, \mathrm{d}\theta_A}$
where the numerator equals one because $f$ is a density.

22 Harmonic mean (II)
So that
$p(Y_T \mid A) = \mathbb{E}\left[\left.\frac{f(\theta_A)}{p(\theta_A \mid A)\, p(Y_T \mid \theta_A, A)} \right| Y_T, A\right]^{-1}$
This suggests the following estimator of the marginal density:
$\widehat{p}(Y_T \mid A) = \left[\frac{1}{B}\sum_{b=1}^{B} \frac{f\big(\theta_A^{(b)}\big)}{p\big(\theta_A^{(b)} \mid A\big)\, p\big(Y_T \mid \theta_A^{(b)}, A\big)}\right]^{-1}$
where each drawn vector $\theta_A^{(b)}$ comes from the Metropolis-Hastings Monte Carlo.

23 Harmonic mean (III)
The preceding proof holds if we replace $f(\theta)$ by 1 (the simple harmonic mean estimator), but this estimator may have a huge variance.
The density $f(\theta)$ may be interpreted as a weighting function: we want to give less importance to extreme values of $\theta$.
Geweke (1999) suggests using a truncated Gaussian density (the modified harmonic mean estimator).
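
A minimal sketch of a modified harmonic mean estimator in the spirit of Geweke (1999): f is a Gaussian density fitted to the posterior draws and truncated to a high-probability region. It takes the MCMC draws and the corresponding log posterior kernel values (log prior plus log likelihood) as inputs; the test at the bottom uses a case where the true marginal density is known to equal one.

import numpy as np
from scipy import stats

def modified_harmonic_mean(draws, log_kernel_vals, tau=0.9):
    mean = draws.mean(axis=0)
    cov = np.cov(draws, rowvar=False)
    fitted = stats.multivariate_normal(mean, cov)
    # Keep only draws inside the tau-probability ellipsoid of the fitted Gaussian.
    dev = draws - mean
    d2 = np.einsum("ij,jk,ik->i", dev, np.linalg.inv(cov), dev)
    inside = d2 <= stats.chi2(draws.shape[1]).ppf(tau)
    # log of f(theta_b) / [p(theta_b) p(Y_T | theta_b)]; zero weight outside the region.
    log_w = np.where(inside, fitted.logpdf(draws) - np.log(tau) - log_kernel_vals, -np.inf)
    # log p(Y_T) = -log of the average of exp(log_w), computed stably.
    m = log_w[np.isfinite(log_w)].max()
    return -(m + np.log(np.exp(log_w - m).sum() / len(log_w)))

rng = np.random.default_rng(0)
draws = rng.standard_normal((20000, 2))   # pretend posterior draws
log_kernel_vals = stats.multivariate_normal(np.zeros(2), np.eye(2)).logpdf(draws)
print(modified_harmonic_mean(draws, log_kernel_vals))   # close to log(1) = 0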

24 DSGE model (I)
Any DSGE model may be represented as follows:
$\mathbb{E}_t\left[f_\theta(y_{t+1}, y_t, y_{t-1}, \epsilon_t)\right] = 0$
with
$\mathbb{E}_t[\epsilon_{t+1}] = 0$ and $\mathbb{E}_t[\epsilon_{t+1}\epsilon_{t+1}'] = \Sigma_\epsilon$
where $y$ is the vector of endogenous variables, $\epsilon$ is the vector of exogenous stochastic shocks, and $\theta$ is the vector of deep parameters.

25 DSGE model (II, solution)
We linearize the model around a deterministic steady state $\bar{y}(\theta)$ (such that $f_\theta(\bar{y}, \bar{y}, \bar{y}, 0) = 0$).
Solving the linearized version of this model (with Dynare, for instance), we get a state-space reduced-form representation:
$y_t = M\bar{y}(\theta) + M\hat{y}_t + \eta_t$
$\hat{y}_t = g_y(\theta)\,\hat{y}_{t-1} + g_u(\theta)\,\epsilon_t$
with $\mathbb{E}[\epsilon_t\epsilon_t'] = \Sigma_\epsilon$ and $\mathbb{E}[\eta_t\eta_t'] = V$.
The reduced form is non-linear in the deep parameters...

26 DSGE model (III, Kalman filter)
... but it is linear in the endogenous and exogenous variables, so the likelihood may be evaluated with a linear prediction-error algorithm.
Kalman filter recursion, for $t = 1, \dots, T$:
$v_t = y_t - \bar{y}(\theta) - M\hat{y}_t$
$F_t = M P_t M' + V$
$K_t = g_y(\theta)\, P_t M' F_t^{-1}$
$\hat{y}_{t+1} = g_y(\theta)\,\hat{y}_t + K_t v_t$
$P_{t+1} = g_y(\theta)\, P_t\, \big(g_y(\theta) - K_t M\big)' + g_u(\theta)\,\Sigma_\epsilon\, g_u(\theta)'$
with initial conditions $\hat{y}_1$ and $P_1$.
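
A minimal sketch of this prediction-error evaluation of the log-likelihood for a generic linear state space y_t = Z s_t + e_t, s_t = G s_{t-1} + R u_t, where Z, G, R, Q and H play the roles of M, g_y, g_u, Sigma_epsilon and V on the slide (the constant term is omitted); the example at the bottom is a made-up AR(1) observed with noise.

import numpy as np

def kalman_loglik(Y, Z, G, R, Q, H, s0, P0):
    # Y: (T, n) observations; Q = var(u_t); H = var(e_t)
    T, n = Y.shape
    s, P, ll = s0, P0, 0.0
    for t in range(T):
        v = Y[t] - Z @ s                          # prediction error v_t
        F = Z @ P @ Z.T + H                       # its variance F_t
        Finv = np.linalg.inv(F)
        ll += -0.5 * (n * np.log(2 * np.pi)
                      + np.linalg.slogdet(F)[1] + v @ Finv @ v)
        K = G @ P @ Z.T @ Finv                    # Kalman gain K_t
        s = G @ s + K @ v                         # state prediction s_{t+1}
        P = G @ P @ (G - K @ Z).T + R @ Q @ R.T   # its covariance P_{t+1}
    return ll

rng = np.random.default_rng(0)
T, rho = 200, 0.9
state = np.zeros(T)
for t in range(1, T):
    state[t] = rho * state[t - 1] + rng.standard_normal()
Y = (state + 0.5 * rng.standard_normal(T)).reshape(-1, 1)
print(kalman_loglik(Y, np.eye(1), np.array([[rho]]), np.eye(1),
                    np.eye(1), 0.25 * np.eye(1), np.zeros(1), np.eye(1)))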

27 DSGE model (IV, likelihood & posterior)
We have to estimate $\theta$, $\Sigma_\epsilon$ and $V$.
The log-likelihood is defined by:
$\log \mathcal{L}(\theta \mid Y_T) = -\frac{Tk}{2}\log 2\pi - \frac{1}{2}\sum_{t=1}^{T}\log |F_t| - \frac{1}{2}\sum_{t=1}^{T} v_t' F_t^{-1} v_t$
where the vector $\theta$ contains the $k$ parameters to be estimated ($\theta$, $\mathrm{vech}(\Sigma_\epsilon)$ and $\mathrm{vech}(V)$).
The log posterior kernel is then defined by:
$\log \mathcal{K}(\theta \mid Y_T) = \log \mathcal{L}(\theta \mid Y_T) + \log p(\theta)$

28 Rabanal & Rubio-Ramirez 2001 (I)
New Keynesian models. Common equations:
$y_t = \mathbb{E}_t y_{t+1} - \sigma\left(r_t - \mathbb{E}_t \Delta p_{t+1} + \mathbb{E}_t g_{t+1} - g_t\right)$
$y_t = a_t + (1 - \delta)\, n_t$
$mc_t = w_t - p_t + n_t - y_t$
$mrs_t = \frac{1}{\sigma}\, y_t + \gamma\, n_t - g_t$
$r_t = \rho_r r_{t-1} + (1 - \rho_r)\left[\gamma_\pi \Delta p_t + \gamma_y y_t\right] + z_t$
$w_t - p_t = w_{t-1} - p_{t-1} + \Delta w_t - \Delta p_t$
$a_t$, $g_t$ follow AR(1) processes; $z_t$, $\lambda_t$ are Gaussian white noises.

29 Rabanal & Rubio-Ramirez 2001 (II)
Baseline sticky prices model (BSP):
$\Delta p_t = \beta\, \mathbb{E}_t\left[\Delta p_{t+1}\right] + \kappa_p\, (mc_t + \lambda_t)$
$w_t - p_t = mrs_t$
Sticky prices & price indexation (INDP):
$\Delta p_t = \gamma_b\, \Delta p_{t-1} + \gamma_f\, \mathbb{E}_t\left[\Delta p_{t+1}\right] + \kappa_p\, (mc_t + \lambda_t)$
$w_t - p_t = mrs_t$
Sticky prices & wages (EHL):
$\Delta p_t = \beta\, \mathbb{E}_t\left[\Delta p_{t+1}\right] + \kappa_p\, (mc_t + \lambda_t)$
$\Delta w_t = \beta\, \mathbb{E}_t\left[\Delta w_{t+1}\right] + \kappa_w\left[mrs_t - (w_t - p_t)\right]$
Sticky prices & wages + wage indexation (INDW):
$\Delta w_t - \alpha\, \Delta p_{t-1} = \beta\, \mathbb{E}_t\left[\Delta w_{t+1}\right] - \alpha\beta\, \Delta p_t + \kappa_w\left[mrs_t - (w_t - p_t)\right]$

30 DYNARE (I)
var a g mc mrs n pie r rw winf y;
varexo e_a e_g e_lam e_ms;
parameters invsig delta ... ;

model(linear);
y = y(+1) - (1/invsig)*(r - pie(+1) + g(+1) - g);
y = a + (1-delta)*n;
mc = rw + n - y;
...
end;

31 DYNARE (II)
estimated_params;
stderr e_a, uniform_pdf, , , 0, 1;
stderr e_lam, uniform_pdf, , , 0, 1;
...
rho, uniform_pdf, , , 0, 1;
gampie, normal_pdf, 1.5, 0.25;
...
end;

varobs pie r y rw;

estimation(datafile=dataraba, first_obs=10, ..., mh_jscale=0.5);

32 Posterior mode

33 Posterior mode
[Figure: diagnostic panels (Interval, m2, m3) for the parameters omega, rhoa and rhog.]

34 Posterior distributions (I)

35 Posterior distributions (II)
[Figure: posterior density panels for SE_e_a, SE_e_g, SE_e_ms, SE_e_lam, invsig, gam, rho, gampie and gamy.]

36 Posterior distributions (III)
[Figure: posterior density panels for omega, rhoa, rhog and thetabig.]
