Assessing Regime Uncertainty Through Reversible Jump McMC


Assessing Regime Uncertainty Through Reversible Jump McMC August 14, 2008

1. Introduction: Background; Research Question
2. The RJMcMC Method: McMC; RJMcMC Algorithm; Dependent Proposals; Independent Proposals
3. Results: Simulation Studies; S&P 500 Data
4. Conclusion: Future Work; Works Cited

Background Thank you to the Committee on Knowledge Extension Research of the Society of Actuaries and the Department of Statistics at Texas A&M University for the funding that enabled me to come and present here.

Background In regime-switching models, a Markov process switches between K states at random. The distribution of the current state (ρ_t) depends only on the previous state (ρ_{t-1}); this is known as the Markov property. The state (or regime) at time t determines the distribution of the random variable X_t.

Background Often, the different regimes will be from the same family of distributions but have different parameter values. In the RSLN model, each of the regimes follows a lognormal distribution. The RSLN model allows us to model various situations (e.g., two regimes, one when the economy is good and the other when it is poor). [Hardy 2003]
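
To make the setup concrete, here is a minimal sketch of simulating a two-regime regime-switching series of log-returns. The parameter values and the helper name simulate_rsln are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def simulate_rsln(n, mu, sigma, p12, p21, seed=0):
    """Simulate n log-returns from a two-regime RSLN-style model.

    mu and sigma are length-2 arrays of regime means and standard
    deviations of the log-returns; p12 and p21 are the switching
    probabilities between regimes.
    """
    rng = np.random.default_rng(seed)
    trans = np.array([[1 - p12, p12],
                      [p21, 1 - p21]])  # Markov transition matrix
    states = np.empty(n, dtype=int)
    states[0] = 0
    for t in range(1, n):
        # Markov property: the next state depends only on the current state
        states[t] = rng.choice(2, p=trans[states[t - 1]])
    # Given the regime, log-returns are normal (so gross returns are lognormal)
    returns = rng.normal(mu[states], sigma[states])
    return states, returns

# Illustrative parameter values only (not fitted to any data)
states, returns = simulate_rsln(750,
                                mu=np.array([0.01, -0.02]),
                                sigma=np.array([0.03, 0.08]),
                                p12=0.3, p21=0.7)
```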

Research Question How do we determine the optimal number of regimes?

Research Question Currently there are many options for determining which model is best (AIC, BIC, adjusted R²) or whether one model is significantly better than another (full vs. reduced F-tests). Wouldn't it be great if we could obtain the probability that a certain model is the best?

Research Question We can get those probabilities using a method first developed in [Green 1995] called reversible jump Markov chain Monte Carlo (RJMcMC). Advantages of having probabilities:
- Ease of explanation
- Ability to be incorporated into simulations

McMC When trying to find the distribution of the parameters in a model, we can use Bayes' rule: P(θ | y) = f(y | θ) π(θ) / P(y). It seems rather harmless, but let's expand it a bit.

McMC P(y) = ∫ f(y | θ) π(θ) dθ, so f(θ | y) = f(y | θ) π(θ) / ∫ f(y | θ) π(θ) dθ. Often, the integral in the denominator does not have a closed form.

McMC Luckily, Markov chain Monte Carlo (McMC) methods were developed to allow you to draw samples from the posterior distributions of the parameters. When you get enough draws from a parameter's distribution, you will have a pretty good idea of what that distribution looks like. McMC only works with likelihoods that have a fixed number of parameters; reversible jump McMC expands the methodology to work with likelihoods of varying dimension.
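
For intuition, here is a minimal fixed-dimension sketch (not the RJMcMC sampler from the talk): a random-walk Metropolis sampler only needs the posterior up to the unknown constant P(y), since P(y) cancels in the acceptance ratio.

```python
import numpy as np

def metropolis(log_post, theta0, n_iter=10000, step=0.1, seed=0):
    """Random-walk Metropolis sampler.

    log_post(theta) evaluates log f(y | theta) + log pi(theta);
    the normalizing constant P(y) cancels in the acceptance ratio.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    current_lp = log_post(theta)
    draws = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        proposal = theta + step * rng.standard_normal(theta.size)
        prop_lp = log_post(proposal)
        # Accept with probability min(1, posterior ratio)
        if np.log(rng.uniform()) < prop_lp - current_lp:
            theta, current_lp = proposal, prop_lp
        draws[i] = theta
    return draws
```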

RJMcMC Algorithm Here are the basic steps of the RJMcMC algorithm [Waagepetersen and Sorensen, 2001]:
1. Select a starting value X_1 = (M_1, Z_1), where M_i is the model index at iteration i and Z_i is the vector of parameter values with length n_{M_i}.
2. Generate a proposal value X_p.
3. Satisfy certain conditions.
4. Calculate the acceptance probability.
5. If accepted, X_2 = X_p; otherwise X_2 = X_1.
6. Repeat steps 2-5.
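
A structural sketch of that loop might look as follows; propose and log_accept_prob are placeholders for the model-specific pieces described on the following slides, not functions defined in the talk.

```python
import numpy as np

def rjmcmc(x1, propose, log_accept_prob, n_iter=10000, seed=0):
    """Structural sketch of the RJMcMC loop.

    x1 = (m1, z1) is the starting model index and parameter vector.
    propose(x) returns a candidate (m_p, z_p) plus any auxiliary
    quantities needed by log_accept_prob, which evaluates log(alpha).
    """
    rng = np.random.default_rng(seed)
    x = x1
    chain = [x]
    for _ in range(n_iter):
        x_prop, aux = propose(x)                     # step 2: proposal
        log_alpha = log_accept_prob(x, x_prop, aux)  # step 4: acceptance probability
        if np.log(rng.uniform()) < log_alpha:        # step 5: accept or reject
            x = x_prop
        chain.append(x)                              # step 6: repeat
    return chain
```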

RJMcMC Algorithm Select a starting value While this may seem like a difficult task, you can use other estimation methods (MLE, MOM) to find suitable starting values. You can also use information from other experts. Luckily, you don't even need to be that close, because the algorithm will eventually bring you into acceptable values.

RJMcMC Algorithm Proposal value You then generate X_p = (m_p, z_p). z_p is generated by applying a deterministic mapping to the previous z and to a random component U. We can express it as z_p = g_{m m_p}(z, U), where U is a random vector on R^{n_{m m_p}}, n_{m m_p} ≥ 1, which has density q_{m m_p}(z, ·) on R^{n_{m m_p}}, and g_{m m_p}: R^{n_m + n_{m m_p}} → R^{n_{m_p}} is a deterministic mapping.

RJMcMC Algorithm Condition 1: Reversibility The condition of reversibility is:
P(M_n = m, Z_n ∈ A_m, M_{n+1} = m_p, Z_{n+1} ∈ B_{m_p}) = P(M_n = m_p, Z_n ∈ B_{m_p}, M_{n+1} = m, Z_{n+1} ∈ A_m)
for all m, m_p ∈ {1, ..., I} and all subsets A_m of C_m and B_{m_p} of C_{m_p}, respectively.

RJMcMC Algorithm Condition 2: Dimension Matching The other crucial condition follows from the previous condition: n_m + n_{m m_p} = n_{m_p} + n_{m_p m}. This ensures that f_m(z) q_{m m_p}(z, u) and f_{m_p}(z_p) q_{m_p m}(z_p, u_p) are joint densities on spaces of equal dimension.
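
As a concrete illustration (assuming the parameter counts used later in this talk): the one-regime lognormal model has n_1 = 2 parameters (µ, σ) and the two-regime RSLN model has n_2 = 6 (µ_1, σ_1, µ_2, σ_2, p_12, p_21). A jump from model 1 to model 2 can therefore draw a 4-dimensional random vector u and use none on the way back, giving n_1 + n_{12} = 2 + 4 = 6 + 0 = n_2 + n_{21}.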

RJMcMC Algorithm Acceptance Probability
α_{m m_p} = min( 1, [p_{m_p} f_{m_p}(z_p) p_{m_p m} q_{m_p m}(z_p, u_p)] / [p_m f_m(z) p_{m m_p} q_{m m_p}(z, u)] · | ∂ g_{m m_p}(z, u) / ∂(z, u) | )

Dependent Proposals To propose new values in other models, we could let the new parameters depend on the current parameters of the current model. We could use moment matching or a similar method [Brooks et al 2003] to find an appropriate proposal. Unfortunately, the effectiveness of the sampler is highly dependent on the proposal function, especially in regime switching cases. Also, the inverse functions and Jacobian can get rather complicated.

Dependent Proposals Advantages of dependent proposals:
- Computationally quick
- Computationally sound
Disadvantages:
- Results are highly dependent upon the choice of function (in RS situations, the function is not obvious)
- The Jacobian and inverses can be very difficult to compute

Independent Proposals Instead, we can use the fact that MLEs are asymptotically multivariate normal [Gelman et al 2004] and generate draws independent of the current value of the parameters as follows:
1. Optimize the likelihood with respect to the parameters.
2. Use the parameter estimates as the mean vector for the proposals.
3. Use the inverse of the Hessian matrix as the covariance matrix for the proposals.
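
A minimal sketch of constructing such an independent proposal is below; it assumes a generic negative log-likelihood nll for one candidate model, and the use of scipy plus the names nll_two_regime and theta_start are illustrative assumptions, not part of the talk.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

def build_independent_proposal(nll, theta_init):
    """Build a multivariate normal proposal from the MLE and Hessian.

    nll(theta) is the negative log-likelihood of one candidate model.
    Returns a frozen multivariate normal centred at the MLE whose
    covariance is the inverse Hessian of nll at the optimum.
    """
    # Step 1: optimize the likelihood with respect to the parameters
    fit = minimize(nll, theta_init, method="BFGS")
    # Step 2: use the parameter estimates as the proposal mean
    mean = fit.x
    # Step 3: use the inverse Hessian as the proposal covariance
    # (BFGS returns an approximation to the inverse Hessian directly)
    cov = fit.hess_inv
    return multivariate_normal(mean=mean, cov=cov)

# Draws are independent of the chain's current parameter values,
# so the Jacobian term in the acceptance probability equals 1.
# proposal = build_independent_proposal(nll_two_regime, theta_start)
# z_p = proposal.rvs()
```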

Independent Proposals Advantages of independent proposals:
- Easy to set up (fewer errors)
- Jacobian equals 1
- No inverses to compute
- Don't need extra information about the relationship between the parameters of different models
Disadvantages:
- Numerical optimization can be unstable, especially with a large number of parameters
- Computationally intensive

Simulation Studies Does it actually work? To answer that question we ran two simulation studies and then applied it to real data.

Simulation Studies Simulations for µ with sample sizes of 75 and 750. [Figure: probability of one regime (0 to 1) plotted against µ; µ_1 varies, with σ_1 = 1, p_12 = 0.3, µ_2 = 4, σ_2 = 1, p_21 = 0.7.]

Simulation Studies Simulations for σ with sample sizes of 75 and 750. [Figure: probability of one regime (0 to 1) plotted against σ; σ_1 varies, with µ_1 = 4, p_12 = 0.3, µ_2 = 4, σ_2 = 1, p_21 = 0.7.]

S&P 500 Data Now let's look at total return data from the S&P 500 index from January 1991 to March 2008. When we used the RJMcMC method, it returned a probability of 1 that the best model is the two-regime RSLN model. (Remember that this is only compared to the one-regime RSLN model.)

Future Work To improve this project, I plan to:
- Explore other numerical optimization methods for the three-regime case
- Include other models (ARCH, GARCH, SV, etc.)
- Try some other methods besides RJMcMC ([Chib et al 2001] or [Phillips and Smith 1996])

Works Cited Some interesting papers:
Green, P. J. (1995). Reversible Jump Markov Chain Monte Carlo Computation and Bayesian Model Determination. Biometrika, 82(4), 711-732.
Waagepetersen, R. and Sorensen, D. (2001). A Tutorial on Reversible Jump MCMC with a View toward Applications in QTL Mapping. International Statistical Review, 69(1), 49-61.