Bayesian Dynamic Linear Modelling for Complex Computer Models


Fei Liu, Liang Zhang, Mike West

Abstract

Computer models may have functional outputs. Without loss of generality, we assume that a single computer run generates a function of time. For complex computer models, Bayarri et al. (2005) treat time as an additional computer-model input parameter and apply the Gaussian response surface approximation (GaSP) method with a Kronecker-product correlation matrix on the augmented input space. This approach, however, is applicable only when there are few time points. In this paper, we consider the Bayesian dynamic linear model (West and Harrison, 1997) as an alternative approach when there are many time points. Our method also allows forecasting into the future.

Keywords: Computer model; Bayesian dynamic linear model; Gaussian stochastic process; Bayesian analysis; Forward filtering and backward sampling; MCMC.

1 Introduction

Computer models can be represented as deterministic functions of their associated parameters. There are generally two types of parameters: (a) calibration parameters $u$ are

only associated with the computer codes; they may be uncertain physical properties. (b) Unknown parameters $x$ are associated with both the computer models and the field experiments; they are characteristics of the real experiments. For simplicity, we use $x$ to represent $(x, u)$, so that the computer model can be written as a function of $x$, $y(x)$. On the other hand, exercising the code is very time consuming for complex computer models. Consequently, the function $y(x)$ is evaluated only at selected locations $x_i$, $i = 1, \ldots, n$.

In this paper, we focus on computer models with functional outputs. We assume that the computer model outputs are functions of time $t$, $t = 1, \ldots, T$, and write such output as $y(x, t)$. This type of computer model has been studied in Bayarri et al. (2002) and Bayarri et al. (2005). The SAVE model in Bayarri et al. (2005) applies the GaSP method on the augmented space $(x, t)$ by assuming a separable correlation in $x$ and $t$: the computer model outputs are realizations from a Gaussian stochastic process defined on the $(x, t)$ space, i.e.,

$$y(\cdot, \cdot) \sim \mathrm{GP}\Big(\mu, \tfrac{1}{\lambda_M}\, \mathrm{Corr}(\cdot, \cdot)\Big), \qquad (1)$$

where

$$\mathrm{Corr}\big(y(x, t), y(x', t')\big) = \exp\Big(-\sum_i \beta_i |x_i - x'_i|^{\alpha_i}\Big)\, \exp\big(-\beta_t |t - t'|^{\alpha_t}\big).$$

We use $y(x)$ to denote the functional output of a single computer run with input $x$, with elements $y(x)_j = y(x, t_j)$, $j = 1, \ldots, T$. The likelihood in SAVE is then

$$\big(y(x_1)', \ldots, y(x_n)'\big)' \sim N\Big(\mu \mathbf{1}, \tfrac{1}{\lambda_M}\, \Sigma_1 \otimes \Sigma_2\Big),$$

where $(\Sigma_1)_{k,l} = \exp\big(-\sum_i \beta_i |x_{ki} - x_{li}|^{\alpha_i}\big)$ and $(\Sigma_2)_{k,l} = \exp\big(-\beta_t |t_k - t_l|^{\alpha_t}\big)$.
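To make the separable structure concrete, here is a minimal Python sketch (all numerical values are hypothetical illustrations, not settings from the paper) that builds $\Sigma_1$, $\Sigma_2$, and the Kronecker-product covariance of equation (1):

```python
import numpy as np

def power_exp_corr(d, beta, alpha):
    """Power-exponential correlation exp(-beta * |d|^alpha) over a distance array."""
    return np.exp(-beta * np.abs(d) ** alpha)

# Hypothetical design: n = 8 scalar inputs and T = 50 time points.
n, T = 8, 50
x = np.linspace(0.0, 1.0, n)          # input locations x_1, ..., x_n
t = np.arange(1.0, T + 1)             # time grid t = 1, ..., T
beta_x, alpha_x = 1.6, 2.0            # spatial correlation parameters
beta_t, alpha_t = 0.5, 1.5            # temporal correlation parameters

Sigma1 = power_exp_corr(x[:, None] - x[None, :], beta_x, alpha_x)   # n x n
Sigma2 = power_exp_corr(t[:, None] - t[None, :], beta_t, alpha_t)   # T x T

lambda_M = 4.0                        # GaSP precision (hypothetical)
# Covariance of the stacked outputs (y(x_1)', ..., y(x_n)')' under SAVE:
Cov = np.kron(Sigma1, Sigma2) / lambda_M     # (nT) x (nT): what SAVE must invert
```

The Kronecker structure means SAVE only ever needs $\Sigma_1^{-1}$ and $\Sigma_2^{-1}$ separately, but $\Sigma_2$ alone is already $T \times T$; that is the bottleneck discussed next.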

To implement the SAVE model, one needs to invert the matrices $\Sigma_1$ and $\Sigma_2$, where $\Sigma_1$ is $n \times n$ and $\Sigma_2$ is $T \times T$. In the context of complex computer models, inverting $\Sigma_1$ is feasible because $n$ is generally small, but the dimension of $\Sigma_2$ may be too large for inversion to be practical. Bayarri et al. (2005) therefore also consider a basis expansion method (SAVE2), i.e.,

$$y(x, \cdot) = \sum_{i=1}^{I} w_i(x)\, \phi_i,$$

where $\{\phi_i\}$ is a basis library; they use a wavelet basis in their application. The coefficients are then modelled as independent spatial processes, $w_i(\cdot) \sim \mathrm{GP}\big(\mu_i, \tfrac{1}{\lambda_M^i}\, \mathrm{Corr}_i(\cdot, \cdot)\big)$. SAVE2 can give predictions with confidence bounds for the computer model output at any value of $x$ by spatial interpolation. However, it can only handle computer models with a fixed time grid $t = 1, \ldots, T$. Some applications of the computer model may require forecasting into the future, weather forecasting models for instance. In this paper, we model the computer model code by dynamic linear models (DLMs), so as to capture the temporal structure in the data.

The paper is organized as follows. We first introduce our DLM and make connections with the SAVE model in Section 2. In Section 3, we give the likelihood and specify the prior distributions for the unknown parameters of the DLM. Section 4 discusses the MCMC method for drawing from the posterior distributions of the unknown quantities, and also gives the spatial interpolation for the computer model at arbitrary locations in the $x$ space. The method is applied to an example data set in Section 5.

2 The DLM for the computer model outputs

For a single computer model run at $x$, we use the time-varying autoregressive model (TVAR; West and Harrison, 1997) to model its temporal structure:

$$y(x, t) = \sum_{j=1}^{p} \phi_{t,j}\, y(x, t-j) + \epsilon_t(x). \qquad (2)$$

The computer model runs are correlated by assuming a Gaussian stochastic process for the innovations $\epsilon_t(x)$ in equation (2), i.e.,

$$\epsilon_t(\cdot) \sim \mathrm{GP}\big(0, v_t\, \mathrm{Corr}_t(\cdot, \cdot)\big), \qquad (3)$$

where we assume that $\mathrm{Corr}_t(\cdot, \cdot) = \mathrm{Corr}(\cdot, \cdot)$ is the same for all $t$, and we use a separable power exponential function for the innovation correlation, i.e.,

$$\mathrm{Corr}(x, x') = \exp\Big(-\sum_i \beta_i |x_i - x'_i|^{\alpha_i}\Big).$$

The model in equation (2) can be connected with the SAVE model given in equation (1) in an approximate sense. Consider the likelihood for the SAVE model in equation (1), and let $y_t = \big(y(x_1, t), \ldots, y(x_n, t)\big)'$. We write the likelihood in equation (1) as a product of conditional likelihoods,

$$L(y_T, y_{T-1}, \ldots, y_1 \mid \Theta) = \Big[\prod_{t=p+1}^{T} L(y_t \mid y_{t-1}, \ldots, y_1, \Theta)\Big]\, L(y_p, y_{p-1}, \ldots, y_1 \mid \Theta). \qquad (4)$$

Next, at any time $t$, we approximate the conditional likelihood as

$$L(y_t \mid y_{t-1}, \ldots, y_1, \Theta) \approx L(y_t \mid y_{t-1}, \ldots, y_{t-p}, \Theta). \qquad (5)$$

Let $\rho(k, l) = \exp(-\beta_t |k - l|^{\alpha_t})$, $\rho_{t,t-1:t-p} = \big(\rho(t, t-1), \ldots, \rho(t, t-p)\big)'$, and $(\tilde{\Sigma}_2)_{k,l} = \rho(k, l)$ for $k, l = t-1, \ldots, t-p$. The conditional likelihoods in equation (5) are multivariate normal with mean vectors

$$E(y_t \mid y_{t-1}, \ldots, y_{t-p}, \Theta) = \big(\rho_{t,t-1:t-p}'\, \tilde{\Sigma}_2^{-1} \otimes I_{n \times n}\big)\, \big(y_{t-1}', \ldots, y_{t-p}'\big)'.$$

This implies the autoregressive term in equation (2), with

$$y^M(x, t) \approx \rho_{t,t-1:t-p}'\, \tilde{\Sigma}_2^{-1}\, \big(y^M(x, t-1), \ldots, y^M(x, t-p)\big)'.$$

We assume $\mathrm{Corr}_t(\cdot, \cdot) = \mathrm{Corr}(\cdot, \cdot)$ in equation (3) because the covariance matrix of the conditional likelihood $L(y_t \mid y_{t-1}, \ldots, y_{t-p}, \Theta)$ is time-independent. To see this, write

$$\mathrm{Cov}(y_t \mid y_{t-1}, \ldots, y_{t-p}, \Theta) = \frac{1}{\lambda_M}\Big(\Sigma_1 - \big(\rho_{t,t-1:t-p}' \otimes \Sigma_1\big)\big(\tilde{\Sigma}_2 \otimes \Sigma_1\big)^{-1}\big(\rho_{t,t-1:t-p} \otimes \Sigma_1\big)\Big) = \frac{1}{\lambda_M}\big(1 - \rho_{t,t-1:t-p}'\, \tilde{\Sigma}_2^{-1}\, \rho_{t,t-1:t-p}\big)\, \Sigma_1,$$

which does not depend on $t$ for $t > p$, since $\rho(k, l)$ depends only on $|k - l|$.

Finally, since the functional outputs of computer models are usually temporally inhomogeneous, we adapt our model to such inhomogeneity by allowing time-varying autoregressive coefficients and time-varying innovation variances in equation (2).
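Because $\rho(k, l)$ depends only on $|k - l|$, the implied AR coefficients are time-invariant; a minimal sketch (hypothetical parameter values) computes them and the innovation-variance factor:

```python
import numpy as np

def implied_ar(beta_t, alpha_t, p):
    """AR coefficients Sigma2_tilde^{-1} rho and the variance factor
    1 - rho' Sigma2_tilde^{-1} rho implied by the power-exponential time
    correlation, per the approximation in equation (5)."""
    lags = np.arange(1.0, p + 1)
    rho = np.exp(-beta_t * lags ** alpha_t)             # Corr(y_t, y_{t-j}), j = 1..p
    Sigma2 = np.exp(-beta_t * np.abs(lags[:, None] - lags[None, :]) ** alpha_t)
    phi = np.linalg.solve(Sigma2, rho)                  # coefficients on y_{t-1..t-p}
    var_factor = 1.0 - rho @ phi                        # multiplies Sigma_1 / lambda_M
    return phi, var_factor

phi, vf = implied_ar(beta_t=0.5, alpha_t=1.5, p=5)      # hypothetical values
```

The TVAR generalization of equation (2) relaxes exactly this time-invariance restriction.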

3 Likelihood and the prior distributions

3.1 The multivariate DLM representation

We can write the likelihood in matrix form, i.e.,

$$\begin{pmatrix} y(x_1, t) \\ y(x_2, t) \\ \vdots \\ y(x_n, t) \end{pmatrix} = \begin{pmatrix} y(x_1, t-1) & y(x_1, t-2) & \cdots & y(x_1, t-p) \\ y(x_2, t-1) & y(x_2, t-2) & \cdots & y(x_2, t-p) \\ \vdots & & & \vdots \\ y(x_n, t-1) & y(x_n, t-2) & \cdots & y(x_n, t-p) \end{pmatrix} \begin{pmatrix} \phi_{t,1} \\ \phi_{t,2} \\ \vdots \\ \phi_{t,p} \end{pmatrix} + \begin{pmatrix} \epsilon_t(x_1) \\ \epsilon_t(x_2) \\ \vdots \\ \epsilon_t(x_n) \end{pmatrix}. \qquad (6)$$

We model the TVAR coefficient vector $\Phi_t = (\phi_{t,1}, \phi_{t,2}, \ldots, \phi_{t,p})'$ as a random walk,

$$\Phi_t = \Phi_{t-1} + w_t, \qquad w_t \sim N(0, W_t).$$

Let $G_t$ be the identity matrix of size $p$, $V_t = v_t \Sigma_1$, and let $F_t$ be the $n \times p$ matrix of lagged outputs on the right-hand side of equation (6). Then the likelihood has the form of a multivariate DLM (West and Harrison, 1997),

$$\{F_t, G_t, V_t, W_t\}_{t=1}^{T}.$$
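A small helper, under the assumption that the design outputs are stored in a $T \times n$ array (names are illustrative), makes the regression matrix $F_t$ explicit:

```python
import numpy as np

def regression_matrix(Y, t, p):
    """F_t of the DLM {F_t, G_t, V_t, W_t}: the n x p matrix whose i-th row is
    (y(x_i, t-1), ..., y(x_i, t-p)).  Y is T x n with 0-based time index, t >= p."""
    return np.column_stack([Y[t - j] for j in range(1, p + 1)])
```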

3.2 The prior distributions

Let $D_t$ denote the data up to time $t$. We sequentially specify the prior distributions for $v_t$ and $W_t$ through two discount factors $\delta_1$ and $\delta_2$:

$$v_t^{-1} \mid D_{t-1} \sim G\big(\delta_1 n_{t-1}/2,\; \delta_1 d_{t-1}/2\big), \qquad W_t \mid D_{t-1} = \frac{1 - \delta_2}{\delta_2}\, C_{t-1},$$

where $C_{t-1} = \mathrm{Cov}(\Phi_{t-1} \mid D_{t-1})$ is specified recursively as in Appendix A. The values of $n_0$, $d_0$, and $C_0$ are prespecified. Finally, for the spatial correlation parameters $\alpha = \{\alpha_i\}$ and $\beta = \{\beta_i\}$, we use the Jeffreys-rule prior discussed in Berger et al. (2001) and Paulo (2005),

$$\pi(\alpha, \beta) \propto |I(\alpha, \beta)|^{1/2},$$

where $I(\alpha, \beta)$ is the Fisher information matrix, whose entries involve traces of products of the form $\Sigma_1^{-1}\, \partial \Sigma_1 / \partial \theta$ with $\Sigma_1 = \Sigma_1(\alpha, \beta)$.
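The exact reference prior carries additional terms from integrating out the mean and variance (see Berger et al., 2001, and Paulo, 2005); the sketch below evaluates only the correlation-parameter block of the Fisher information for a one-dimensional input, purely as an illustration:

```python
import numpy as np

def jeffreys_rule_logprior(x, beta, alpha):
    """Log of a Jeffreys-rule prior |I(alpha, beta)|^{1/2} for the correlation
    parameters of a 1-d power-exponential GP, using the standard entries
    I_ij = tr(S^{-1} dS/dtheta_i S^{-1} dS/dtheta_j) / 2.  A sketch only: the
    exact prior of Berger et al. (2001) and Paulo (2005) has extra terms from
    the mean and variance parameters."""
    d = np.abs(x[:, None] - x[None, :])
    S = np.exp(-beta * d ** alpha)
    # Analytic derivatives of exp(-beta * d^alpha); d^alpha * log(d) -> 0 as d -> 0.
    logd = np.where(d > 0, np.log(np.where(d > 0, d, 1.0)), 0.0)
    dS_dbeta = -(d ** alpha) * S
    dS_dalpha = -beta * (d ** alpha) * logd * S
    Sinv = np.linalg.inv(S)
    U, V = Sinv @ dS_dbeta, Sinv @ dS_dalpha
    I = 0.5 * np.array([[np.trace(U @ U), np.trace(U @ V)],
                        [np.trace(V @ U), np.trace(V @ V)]])
    _, logdet = np.linalg.slogdet(I)
    return 0.5 * logdet      # log pi(alpha, beta), up to an additive constant

# e.g. logp = jeffreys_rule_logprior(np.linspace(0, 1, 8), beta=1.6, alpha=1.9)
```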

4 MCMC method for the multivariate DLM

We use Markov chain Monte Carlo (MCMC) to draw samples from the posterior distribution $\pi\big(\{v_1, \ldots, v_T\}, \{\Phi_1, \ldots, \Phi_T\}, \{\alpha, \beta\} \mid D_T\big)$. The algorithm is as follows. At the $i$-th iteration:

1. Sample $\{\alpha^{(i)}, \beta^{(i)}\} \mid D_T, \{v_1^{(i-1)}, \ldots, v_T^{(i-1)}\}, \{\Phi_1^{(i-1)}, \ldots, \Phi_T^{(i-1)}\}$ by the Metropolis-Hastings algorithm.

2. Sample $\{v_1^{(i)}, \ldots, v_T^{(i)}\}, \{\Phi_1^{(i)}, \ldots, \Phi_T^{(i)}\} \mid D_T, \{\alpha^{(i)}, \beta^{(i)}\}$ as follows:

2.1. Sample $\{v_1^{(i)}, \ldots, v_T^{(i)}\} \mid D_T, \{\alpha^{(i)}, \beta^{(i)}\}$, as discussed in Section 4.1.

2.2. Sample $\{\Phi_1^{(i)}, \ldots, \Phi_T^{(i)}\} \mid D_T, \{v_1^{(i)}, \ldots, v_T^{(i)}\}, \{\alpha^{(i)}, \beta^{(i)}\}$, as in Section 4.2.

4.1 Sampling the variances

The algorithm to update the variances $\{v_1^{(i)}, \ldots, v_T^{(i)}\} \mid D_T, \{\alpha^{(i)}, \beta^{(i)}\}$ is:

1. Run the forward filter with $\{v_1, \ldots, v_T\}$ unknown, as discussed in Appendix B.

2. Sample $v_T^{-1,(i)} \mid D_T, \{\alpha^{(i)}, \beta^{(i)}\} \sim G(n_T/2, d_T/2)$.

3. Sample $v_t$, $t = T-1, \ldots, 1$, recursively as

$$v_t^{-1} = \delta_1 v_{t+1}^{-1} + G\big((1 - \delta_1)\, n_t/2,\; d_t/2\big).$$
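A sketch of this backward recursion in Python (array names are illustrative; numpy's gamma sampler takes a shape and a scale, so rate $d/2$ becomes scale $2/d$):

```python
import numpy as np

def backward_sample_variances(n_t, d_t, delta1, rng):
    """Backward sampler for {v_1, ..., v_T} given the forward-filter summaries
    (n_t, d_t) of Appendix B, following the recursion of Section 4.1."""
    T = len(n_t)
    prec = np.empty(T)                                  # prec[t] = sampled v_{t+1}^{-1}
    prec[-1] = rng.gamma(n_t[-1] / 2.0, 2.0 / d_t[-1])  # v_T^{-1} ~ G(n_T/2, d_T/2)
    for t in range(T - 2, -1, -1):
        # v_t^{-1} = delta1 * v_{t+1}^{-1} + G((1 - delta1) n_t / 2, d_t / 2)
        prec[t] = delta1 * prec[t + 1] + rng.gamma((1.0 - delta1) * n_t[t] / 2.0,
                                                   2.0 / d_t[t])
    return 1.0 / prec

# rng = np.random.default_rng(0); v = backward_sample_variances(n_t, d_t, 0.95, rng)
```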

4.2 Sampling the TVAR coefficients

The algorithm to draw from $\pi\big(\{\Phi_1, \ldots, \Phi_T\} \mid D_T, \{v_1, \ldots, v_T\}, \{\alpha, \beta\}\big)$ is:

1. Run the forward filter conditional on $\{v_1, \ldots, v_T\}$, as discussed in Appendix A.

2. Sample $\Phi_T \mid D_T, \{v_1, \ldots, v_T\} \sim \mathrm{MVN}(m_T, C_T)$.

3. Sample $\Phi_t$, $t = T-1, \ldots, 1$, recursively from

$$\Phi_t \mid D_T, \Phi_{t+1}, \{v_1, \ldots, v_T\} \sim \mathrm{MVN}\big((1 - \delta_2)\, m_t + \delta_2\, \Phi_{t+1},\; (1 - \delta_2)\, C_t\big).$$

4.3 Spatial interpolation

We predict the output of the computer model at a new input value by spatial interpolation. Suppose $x^*$ is the new, unexercised input value. Let $e_t(x_i) = y_t(x_i) - \sum_j y_{t-j}(x_i)\, \phi_{t,j}$ and $\rho_x(x^*, x_{1:n}) = \big(\mathrm{Corr}(x^*, x_1), \ldots, \mathrm{Corr}(x^*, x_n)\big)'$. Then

$$y_t(x^*) \mid \{y_{t-1}(x^*), \ldots, y_{t-p}(x^*)\}, \text{Data}, \{v_1, \ldots, v_T\}, \{\alpha, \beta\} \sim N\big(\mu_t(x^*), \sigma_t^2(x^*)\big),$$

where

$$\mu_t(x^*) = \sum_j y_{t-j}(x^*)\, \phi_{t,j} + \rho_x(x^*, x_{1:n})'\, \Sigma_1^{-1}\, \big(e_t(x_1), \ldots, e_t(x_n)\big)'$$

and

$$\sigma_t^2(x^*) = v_t\, \big(1 - \rho_x(x^*, x_{1:n})'\, \Sigma_1^{-1}\, \rho_x(x^*, x_{1:n})\big).$$

As with all computer model emulators, the DLM approach returns the computer model output exactly when predicting at exercised input values: if $x^* \in \{x_1, \ldots, x_n\}$, then $\mu_t(x^*) = y_t(x^*)$ and $\sigma_t^2(x^*) = 0$.
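A sketch of one interpolation step, conditional on a single posterior draw (the array layout and names are assumptions for illustration):

```python
import numpy as np

def interpolate_step(y_hist, Y, t, Phi_t, v_t, Sigma1_inv, rho_x):
    """One step of the Section 4.3 interpolation at a new input x*.
    y_hist: (y_{t-1}(x*), ..., y_{t-p}(x*)) from earlier steps of the recursion;
    Y: T x n design outputs (0-based time); Phi_t: (phi_{t,1}, ..., phi_{t,p});
    rho_x: (Corr(x*, x_1), ..., Corr(x*, x_n))."""
    p = len(Phi_t)
    F_t = np.column_stack([Y[t - j] for j in range(1, p + 1)])
    e_t = Y[t] - F_t @ Phi_t          # residuals e_t(x_i) at the design points
    mu = y_hist @ Phi_t + rho_x @ Sigma1_inv @ e_t
    var = v_t * (1.0 - rho_x @ Sigma1_inv @ rho_x)
    return mu, var                    # y_t(x*) ~ N(mu, var)
```

At a design point, `rho_x` is a row of $\Sigma_1$, so `rho_x @ Sigma1_inv` is a coordinate vector; the mean then reduces to the observed $y_t(x)$ and the variance to zero, recovering the interpolation property noted above.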

5 An example

5.1 The data

Figure 1 gives an example of the functional outputs of computer models. Each time series is associated with an $x$ value, shown to the left of the series; the $x$ values are the computer model inputs. The series at $x = 0.5$ (in red) is obtained from a real physical experiment and is observed at $T = 3000$ time points; we denote it by $y_t(0.5) = y(0.5, t)$, $t = 1, \ldots, T$. Given $y_t(0.5)$ and its TVAR(20) fit $\{\phi_{t,j}, v_t\}$, we simulate the data for $x = 0.25, \ldots, 0.75$ with $\alpha = 2$ and $\beta = 1.6$ fixed. The details are given in Appendix C.

Figure 1: The simulated computer model data at various input values.

5.2 MCMC results

Conditional on $\{\alpha^{(i)}, \beta^{(i)}\}$, the algorithm of Section 4 samples $\{v_1^{(i)}, \ldots, v_T^{(i)}\}$ and $\{\Phi_1^{(i)}, \ldots, \Phi_T^{(i)}\}$ exactly, so we do not need to update them in every iteration. In particular, we update $\{v_1^{(i)}, \ldots, v_T^{(i)}\}$ and $\{\Phi_1^{(i)}, \ldots, \Phi_T^{(i)}\}$ after every 200 iterations of sampling $\{\alpha^{(i)}, \beta^{(i)}\}$ by the Metropolis-Hastings algorithm. We fix $\{\alpha_i\}$ at 2 for the example data set. For the other unknowns, starting the MCMC from the true parameter values, we obtained $N = 2000$ samples, of which the first 1000 are treated as burn-in and discarded in all posterior inferences. Figure 2 gives the trace plot, the prior distribution (up to a normalizing constant), the posterior distribution, and the autocorrelation function for $\beta$. To compare the prior and the posterior for $\beta$, we highlight in red the prior density on the interval $(1, 2)$, within which the posterior draws are concentrated.

Suppose $\{\phi_{t,j}^{(i)}\}$ is the $i$-th retained MCMC draw of the TVAR coefficients $\{\phi_{t,j}\}$, where $i = 1, \ldots, N$, $t = 1, \ldots, T$, and $j = 1, \ldots, 20$. We calculate the posterior mean of $\phi_{t,j}$ as

$$\hat{\phi}_{t,j} = \frac{1}{N} \sum_{i} \phi_{t,j}^{(i)}.$$

The point-wise posterior means of the TVAR coefficients are shown in the left panel of Figure 3; the right panel shows $\{\hat{v}_t, t = 1, \ldots, T\}$, the point-wise posterior means of $\{v_t\}$.

5.3 Spatial interpolation

One direct application of the multivariate DLM, as discussed in Section 4.3, is prediction of the computer model output at inputs other than the design points. In Figure 4, we give our prediction of the dynamic computer model output at input value $x = 0.5$. We also compare the true outputs and our predictions over the time intervals $(1100, 1300)$ and $(2700, 2900)$, where the data exhibit interesting features.

Figure 2: Upper-left: trace plot of the MCMC samples for $\beta$; upper-right: autocorrelation function of the MCMC samples for $\beta$; lower-left: posterior distribution of $\beta$; lower-right: prior density of $\beta$.

5.4 Wavelength and modulus decomposition

We can decompose the process $\{y_t\}$ as

$$y_t = \sum_{l=1}^{c} z_{t,l} + \sum_{l=1}^{r} x_{t,l},$$

where the latent processes $\{z_{t,l}\}$ are TVAR processes of lag 1 and the $\{x_{t,l}\}$ are stochastically time-varying damped harmonic components, each associated with moduli (damping parameters) $\{a_{t,l}\}$ and wavelengths (periods) $\{\lambda_{t,l}\}$ (West and Harrison, 1997).

Figure 3: Left: posterior means of the TVAR coefficients $\{\phi_{t,j}\}$; right: posterior means of the time-varying variances $\{v_t\}$.

Figure 4: Posterior predictive curve (green), true computer model output (red), and 90% point-wise predictive intervals for spatial interpolation at input value $x = 0.5$.

Such a decomposition can help in understanding the physical meaning of the computer model outputs. In Figure 5, we show the decomposition of the posterior mean of the process $\{y_t(0.5)\}$. In Figure 6, we show the moduli and the wavelengths of the first 5 components as functions of $t$.

Figure 5: The true computer model output data $\{y_t(0.5)\}$ (bottom), the posterior mean of $\{y_t(0.5)\}$ (second from bottom), and the decomposition of the posterior mean (the remaining curves are the first to third components, from bottom to top).
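A sketch of how such a decomposition is typically extracted (standard TVAR practice, not code from the paper): the moduli and wavelengths at time $t$ come from the eigenvalues of the companion matrix of the coefficients $\phi_{t,1}, \ldots, \phi_{t,p}$.

```python
import numpy as np

def tvar_moduli_wavelengths(phi_t):
    """Moduli a_{t,l} and wavelengths lambda_{t,l} of the quasi-periodic TVAR
    components at time t, via the eigenvalues of the companion matrix (the
    standard decomposition; see West and Harrison, 1997)."""
    p = len(phi_t)
    G = np.zeros((p, p))
    G[0, :] = phi_t                      # first row holds the AR coefficients
    G[1:, :-1] = np.eye(p - 1)           # sub-diagonal identity
    eig = np.linalg.eigvals(G)
    omega = np.angle(eig)
    keep = omega > 1e-10                 # one member of each complex-conjugate pair
    # Real eigenvalues correspond to the TVAR(1)-type components z_{t,l}.
    return np.abs(eig[keep]), 2.0 * np.pi / omega[keep]

# moduli, wavelengths = tvar_moduli_wavelengths(phi_hat[t])  # phi_hat: posterior means
```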

Figure 6: Left: the wavelength decompositions; right: the modulus decompositions.

A Forward filtering with known variances

We briefly review the forward filtering algorithm with known variances for the multivariate DLM; for more details, see Chapter 16 of West and Harrison (1997). Starting with $m_0$ and $C_0$:

a. Posterior at $t-1$: $\Phi_{t-1} \mid D_{t-1} \sim N(m_{t-1}, C_{t-1})$.

b. Prior at $t$: $\Phi_t \mid D_{t-1} \sim N(a_t, R_t)$, with $a_t = m_{t-1}$ and $R_t = C_{t-1}/\delta_2$.

c. One-step forecast: $y_t \mid D_{t-1} \sim N(f_t, Q_t)$, with $f_t = F_t a_t = F_t m_{t-1}$ and $Q_t = F_t C_{t-1} F_t' / \delta_2 + v_t \Sigma_1$.

d. Posterior at $t$: $\Phi_t \mid D_t \sim N(m_t, C_t)$, with $m_t = a_t + A_t e_t$ and $C_t = R_t - A_t Q_t A_t'$, where

$$A_t = R_t F_t' Q_t^{-1} \qquad \text{and} \qquad e_t = y_t - f_t.$$

B Forward filtering with unknown variances

We now describe the forward filtering algorithm with unknown variances for the multivariate DLM. Starting with $m_0, C_0, s_0, n_0$:

a. Posterior at $t-1$: $\Phi_{t-1} \mid D_{t-1} \sim T_{n_{t-1}}(m_{t-1}, C_{t-1})$.

b. Prior at $t$: $\Phi_t \mid D_{t-1} \sim T_{\delta_1 n_{t-1}}(a_t, R_t)$, with $a_t = m_{t-1}$ and $R_t = C_{t-1}/\delta_2$.

c. One-step forecast: $y_t \mid D_{t-1} \sim T_{\delta_1 n_{t-1}}(f_t, Q_t)$, with $f_t = F_t a_t = F_t m_{t-1}$ and $Q_t = F_t C_{t-1} F_t' / \delta_2 + s_{t-1} \Sigma_1$.

d. Posterior at $t$: $\Phi_t \mid D_t \sim T_{n_t}(m_t, C_t)$ and $v_t^{-1} \mid D_t \sim G(n_t/2, d_t/2)$, with

$$A_t = R_t F_t' Q_t^{-1} = C_{t-1} F_t' Q_t^{-1} / \delta_2,$$

where $m_t = m_{t-1} + A_t e_t$, $e_t = y_t - F_t m_{t-1}$, $C_t = \frac{s_t}{s_{t-1}}\big(C_{t-1}/\delta_2 - A_t Q_t A_t'\big)$, and

$$n_t = \delta_1 n_{t-1} + n; \qquad d_t = \delta_1 d_{t-1} + s_{t-1}\, e_t' Q_t^{-1} e_t. \qquad (7)$$
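The following sketch implements steps (a)-(d) directly (array layout as assumed earlier; a minimal, unoptimized version):

```python
import numpy as np

def forward_filter_unknown_v(Y, p, Sigma1, delta1, delta2, m0, C0, s0, n0):
    """Forward filter of Appendix B.  Y is T x n (0-based time); returns the
    per-time summaries (m_t, C_t, n_t, d_t) needed for backward sampling."""
    T, n = Y.shape
    m, C, s, nt, dt = m0, C0, s0, n0, n0 * s0        # d_0 = n_0 * s_0
    out = []
    for t in range(p, T):
        F = np.column_stack([Y[t - j] for j in range(1, p + 1)])   # n x p
        a, R = m, C / delta2                                       # prior at t
        e = Y[t] - F @ a                                           # forecast error e_t
        Q = F @ R @ F.T + s * Sigma1                               # forecast scale Q_t
        A = R @ F.T @ np.linalg.inv(Q)                             # adaptive gain A_t
        n_new = delta1 * nt + n
        d_new = delta1 * dt + s * (e @ np.linalg.solve(Q, e))
        s_new = d_new / n_new
        m = a + A @ e
        C = (s_new / s) * (R - A @ Q @ A.T)
        s, nt, dt = s_new, n_new, d_new
        out.append((m, C, nt, dt))
    return out
```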

Now we derive the relationships in equation (7). At time $t$, the prior for $v_t^{-1}$ is

$$v_t^{-1} \mid D_{t-1} \sim G\big(\delta_1 n_{t-1}/2,\; \delta_1 d_{t-1}/2\big),$$

and the likelihood is

$$e_t \mid D_{t-1}, v_t \sim N\big(0,\; (v_t / s_{t-1})\, Q_t\big),$$

where $Q_t$ is as defined above, with the estimate $s_{t-1}$ standing in for $v_t$. Therefore, the posterior distribution for $v_t^{-1}$ satisfies

$$\pi(v_t^{-1} \mid D_t) \propto \Big|\frac{v_t}{s_{t-1}} Q_t\Big|^{-1/2} \exp\Big(-\frac{s_{t-1}}{2 v_t}\, e_t' Q_t^{-1} e_t\Big)\, \big(v_t^{-1}\big)^{\delta_1 n_{t-1}/2 - 1} \exp\big(-\delta_1 d_{t-1}\, v_t^{-1}/2\big).$$

This implies that

$$v_t^{-1} \mid D_t \sim G\big((n + \delta_1 n_{t-1})/2,\; (\delta_1 d_{t-1} + s_{t-1}\, e_t' Q_t^{-1} e_t)/2\big).$$

In other words,

$$n_t = \delta_1 n_{t-1} + n; \qquad d_t = \delta_1 d_{t-1} + s_{t-1}\, e_t' Q_t^{-1} e_t; \qquad s_t = d_t / n_t.$$

C Data simulation

Suppose we have functional data $y_t(x^*)$, $t = 1, \ldots, T$. Given $\alpha$, $\beta$, $\{\Phi_t\}$, and $\{v_t\}$, we simulate the data at the design inputs as

$$\begin{pmatrix} y_t(x_1) \\ y_t(x_2) \\ \vdots \\ y_t(x_n) \end{pmatrix} \,\Big|\, \{y_{t-j}(x_i)\}, \{\Phi_t\}, y_t(x^*) \sim \mathrm{MVN}\left( \begin{pmatrix} \mu_t(x_1) \\ \mu_t(x_2) \\ \vdots \\ \mu_t(x_n) \end{pmatrix},\; v_t\, \tilde{\Sigma} \right),$$

with

$$\mu_t(x_i) = \sum_j \phi_{t,j}\, y_{t-j}(x_i) + \mathrm{Corr}(x_i, x^*)\Big(y_t(x^*) - \sum_j \phi_{t,j}\, y_{t-j}(x^*)\Big)$$

and

$$\tilde{\Sigma} = \Sigma_1 - \rho_x(x^*; x_{1:n})\, \rho_x(x^*; x_{1:n})',$$

where $\Sigma_1$ is the $n \times n$ matrix with $(i, j)$ element $(\Sigma_1)_{i,j} = \mathrm{Corr}(x_i, x_j)$, and $\rho_x(x^*; x_{1:n})$ is the $n \times 1$ vector with $i$-th element $\rho_x(x^*; x_{1:n})_i = \mathrm{Corr}(x^*, x_i)$.
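A sketch of this simulation (the initialization of the first $p$ values is an assumption on our part; the appendix does not spell it out):

```python
import numpy as np

def simulate_design_runs(y_star, Phi, v, Sigma1, rho_x, rng):
    """Appendix C simulation: given the observed series y_star = {y_t(x*)} and a
    TVAR(p) fit {Phi_t, v_t}, draw correlated series at the n design inputs.
    Phi is T x p; rho_x[i] = Corr(x*, x_i); Sigma1 is the n x n design correlation."""
    T, p, n = len(y_star), Phi.shape[1], len(rho_x)
    cond_corr = Sigma1 - np.outer(rho_x, rho_x)     # Sigma_tilde = Sigma1 - rho rho'
    Y = np.zeros((T, n))
    Y[:p] = y_star[:p, None]                        # start-up values (assumption)
    for t in range(p, T):
        lags = np.stack([Y[t - j] for j in range(1, p + 1)])         # p x n
        lags_star = np.array([y_star[t - j] for j in range(1, p + 1)])
        innov_star = y_star[t] - Phi[t] @ lags_star  # innovation at x*
        mu = Phi[t] @ lags + rho_x * innov_star      # conditional mean, length n
        Y[t] = rng.multivariate_normal(mu, v[t] * cond_corr)
    return Y
```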

References

Bayarri, M., Berger, J., Garcia-Donato, G., Liu, F., Palomo, J., Paulo, R., Sacks, J., Walsh, D., Cafeo, J., and Parthasarathy, R. (2005). Computer model validation with functional outputs. NISS Technical Report.

Bayarri, M., Berger, J., Higdon, D., Kennedy, M., Kottas, A., Paulo, R., Sacks, J., Cafeo, J., Cavendish, J., Lin, C., and Tu, J. (2002). A framework for validation of computer models. In D. Pace and S. Stevenson (eds.), Proceedings of the Workshop on Foundations for V&V in the 21st Century. Society for Modeling and Simulation International.

Berger, J., De Oliveira, V., and Sansó, B. (2001). Objective Bayesian analysis of spatially correlated data. Journal of the American Statistical Association, 96, 1361-1374.

Paulo, R. (2005). Default priors for Gaussian processes. Annals of Statistics, 33, 556-582.

West, M. and Harrison, P. (1997). Bayesian Forecasting and Dynamic Models, 2nd ed. Springer, New York.
