Paul Karapanagiotidis ECO4060
The way forward
1) Motivate why Markov-chain Monte Carlo (MCMC) is useful for econometric modeling
2) Introduce MCMC: the Metropolis-Hastings (MH) algorithm
3) Discuss recent improvements to the MH algorithm: Hamiltonian Monte Carlo (HMC)
4) Discuss possible future research: applying MCMC to stochastic volatility modeling
Econometric Modeling
What is the connection between MCMC and econometric models? (i.e., why do we even care?)
Modern econometric models often involve:
- A large number of parameters
- Complicated non-linear relationships, often with no analytic closed-form solutions
- Unobserved or latent variables that may require simulation
MCMC is useful because:
- We can now estimate the parameters of these otherwise intractable models
- It makes a Bayesian approach easy to implement, which often simplifies problems
- Larger estimation problems can be broken up into smaller parts
Econometric Modeling
What are some examples of useful applications?
1) High-dimensional multivariate GARCH models
- BEKK specification (Engle & Kroner, 1995)
2) Stochastic volatility (SV) models (Taylor, 1994)
- Volatility is unobserved
3) Dynamic stochastic general equilibrium (DSGE) models
- A Bayesian approach facilitates estimation (Fernandez-V., 2009)
An Example
What exactly is MCMC (Markov chain Monte Carlo)?
Example: classic Monte Carlo integration of an expectation,
E[g(X)] = ∫ g(x) f(x) dx ≈ (1/N) Σ_{i=1}^{N} g(x_i),  x_i ~ f(x)
Classic MC integration of the expectation requires being able to sample from f(x)!
MCMC resolves this issue: it provides a general framework for sampling from (essentially) any distribution.
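As a quick illustration, classic MC integration approximates E[g(X)] by averaging g over draws from f(x). A minimal Python sketch, where the N(0,1) density for f and g(x) = x² (so E[g(X)] = 1) are assumptions chosen for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)

# Classic Monte Carlo: approximate E[g(X)] by averaging g over draws from f.
# Here f is the standard normal density and g(x) = x^2, so E[g(X)] = 1 exactly.
n = 100_000
draws = rng.standard_normal(n)       # this step REQUIRES sampling from f!
estimate = np.mean(draws ** 2)

print(estimate)  # close to 1.0
```

The whole construction hinges on the line that draws from f; when f is intractable, this is exactly the step MCMC replaces.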
Metropolis-Hastings algorithm
Paper by Chib & Greenberg (1995). One of the most general MCMC methods.
- Transition kernel (continuous states): P(x, dy) = q(x, y) α(x, y) dy + r(x) δ_x(dy)
- Invariant distribution π(x) satisfies: π(dy) = ∫ P(x, dy) π(x) dx
- Detailed balance condition: π(x) q(x, y) α(x, y) = π(y) q(y, x) α(y, x)
We accept the proposed value y into the Markov chain with probability:
α(x, y) = min{ [π(y) q(y, x)] / [π(x) q(x, y)], 1 }
Question: how do we propose values from q(x, y) that are usually accepted?
Traditional proposal mechanism
We typically use a random walk: y = x + ε, with ε drawn from a symmetric distribution.
Problems:
- The random-walk proposal typically produces a highly correlated Markov chain
- High correlation makes inference less precise
- When f(x) is a very complicated distribution, the random walk performs even worse: it gets stuck in local regions of the support
A lack of good proposal mechanisms is the main challenge facing the M-H algorithm. Can we find a more intelligent way of proposing Markov-chain draws?
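The random-walk M-H recipe above can be sketched in a few lines. This is an illustrative toy, not any paper's implementation; the N(0,1) target and unit step size are assumptions, and because the proposal is symmetric the Hastings ratio reduces to a ratio of target densities:

```python
import numpy as np

def log_target(x):
    # Unnormalized log-density of the target; here a standard normal.
    return -0.5 * x ** 2

def rw_metropolis(n_draws, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = 0.0
    chain = np.empty(n_draws)
    for i in range(n_draws):
        proposal = x + step * rng.standard_normal()  # symmetric RW proposal
        # Symmetric q(x,y) cancels, leaving the density ratio pi(y)/pi(x).
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal           # accept the move
        chain[i] = x               # on rejection, repeat the current draw
    return chain

chain = rw_metropolis(50_000)
print(chain.mean(), chain.var())   # roughly 0 and 1 for the N(0,1) target
```

Plotting the autocorrelation of `chain` makes the slide's complaint concrete: successive draws are far from independent, and shrinking `step` to raise the acceptance rate only makes the correlation worse.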
Hamiltonian Monte Carlo
Paper by Radford Neal (2010).
Example: bivariate normal r.v. x with parameters θ = {μ, Σ}.
[Figure: surface plot of the density f(x|θ)]
Hamiltonian Monte Carlo
- p = x denotes the current position of the object; m its momentum; M a mass matrix
- U(p) = -log f(p|θ) is the potential energy as a function of position
- K(m) = m′M⁻¹m / 2 is the kinetic energy as a function of momentum
Laws of motion:
dp/dt = ∂H/∂m = M⁻¹m,  dm/dt = -∂H/∂p = -∂U/∂p
- Boltzmann distribution of energy in a system: P(p, m) ∝ exp(-H(p, m))
- H(p, m) = U(p) + K(m) is the Hamiltonian energy of the system (from statistical mechanics)
- The position marginal exp(-U(p)) ∝ f(p|θ) is the target distribution we wish to sample from
Hamiltonian Monte Carlo
The total energy H(p, m) should stay constant as we move through (p, m) space!
In theory HMC could intelligently traverse the entire support of the distribution in one step, so draws would be nearly independent!
Metropolis-Hastings draws are now accepted into the Markov chain with probability:
min{ exp(H(p, m) - H(p*, m*)), 1 }
i.e., a ratio of Boltzmann distributions.
Hamiltonian Monte Carlo
Problems:
- Computers can't deal with continuous variables - we need to discretize the laws of motion (e.g., the leapfrog method)
- H(p, m) will often grow without bound if stability conditions are not met
- The proposed momentum doesn't consider the current position - the algorithm could still be made smarter
- Only really works for smooth distributions
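The leapfrog discretization of the laws of motion can be sketched as follows. This is a minimal 1-D illustration with a standard normal target; the step size, number of leapfrog steps, and unit mass matrix M = 1 are all assumptions for the demo:

```python
import numpy as np

def grad_U(x):
    # U(x) = -log f(x) = x^2/2 for the N(0,1) target, so dU/dx = x.
    return x

def hmc_step(x, rng, eps=0.1, n_leap=20):
    m = rng.standard_normal()            # fresh momentum draw, M = 1
    x_new, m_new = x, m
    # Leapfrog: half momentum step, alternating full steps, half momentum step.
    m_new -= 0.5 * eps * grad_U(x_new)
    for _ in range(n_leap - 1):
        x_new += eps * m_new             # full step in position
        m_new -= eps * grad_U(x_new)     # full step in momentum
    x_new += eps * m_new
    m_new -= 0.5 * eps * grad_U(x_new)
    # Accept with probability min(1, exp(H_old - H_new)): the ratio of
    # Boltzmann distributions, correcting the discretization error.
    h_old = 0.5 * x ** 2 + 0.5 * m ** 2
    h_new = 0.5 * x_new ** 2 + 0.5 * m_new ** 2
    return x_new if np.log(rng.uniform()) < h_old - h_new else x

rng = np.random.default_rng(1)
chain = np.empty(20_000)
x = 0.0
for i in range(chain.size):
    x = hmc_step(x, rng)
    chain[i] = x
print(chain.mean(), chain.var())         # roughly 0 and 1
```

With exact dynamics H would be conserved and every proposal accepted; the accept/reject step exists purely because the leapfrog integrator only conserves H approximately, which is why unstable step sizes (where H blows up) kill the acceptance rate.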
Hamiltonian Monte Carlo
Maheu & Burda (2010): augmented HMC and the high-dimensional BEKK GARCH model.
Improvements to HMC:
- The fundamental idea is to adapt M(x) according to the local curvature of f(x|θ)
- The mass matrix M(x) is now a function of the current state of the r.v.
- However, this makes H(p, m) non-separable, so we now require a local symmetry condition
- Therefore, the authors derive a new discretization that:
- ensures the local symmetry condition
- prevents the approximation error from growing without bound
So we've got some nice improvements here - the application to BEKK GARCH shows great potential!
Looking forward towards a research proposal
Possible future research areas:
- High-frequency data
- The stochastic volatility (SV) model is probably preferred to ARCH
- Model volatility with a realized volatility estimator
- Volatility spillover effects
Benefits of MCMC in this case:
- MCMC on the SV model automatically provides optimal non-linear smoothed state inference - no need for a separate particle filter or Kalman filter!
- Higher-precision samples from HMC imply better inference
- The Bayesian approach lets us turn the problem into an integral and break it up into its constituent conditional distributions