State-Space Models. Initialization, Estimation and Smoothing of the Kalman Filter
- Josephine Elliott
1 State-Space Models: Initialization, Estimation and Smoothing of the Kalman Filter
2 Initialization of the Kalman Filter

The Kalman filter shows how to update past predictors and the corresponding prediction error variances when a new observation becomes available, that is, how to move from time (t-1) to time t. Application of the Kalman filter therefore requires an initial predictor $\hat{\beta}_0$ and the corresponding covariance matrix,

$P_0 = E[(\hat{\beta}_0 - \beta_0)(\hat{\beta}_0 - \beta_0)']$.

Several efficient procedures have been proposed in the literature, but since, under certain regularity conditions and for sufficiently long series, the current predictors do not depend on the initialization process, simple procedures can usually be applied.
3 Initialization procedure

A simple initialization procedure is as follows:
1. Initialize nonstationary components $\beta_{0,j}$ of the state vector by any value (say, $\beta_{0,j} = 0$) and a very large variance, $P_{0,j} = E(\hat{\beta}_{0,j} - \beta_{0,j})^2 = M$ (large relative to the magnitude of the series).
2. Initialize stationary components by their (unconditional) mean and variance.

The use of 1 corresponds to the use of a noninformative (diffuse) prior distribution for the corresponding parameters under a Bayesian paradigm.
4 Example of Initialization

Consider the BSM, but suppose that the observed series $\{y_t,\ t = 1, 2, \ldots\}$ consists of survey estimators and hence is subject to sampling errors: $y_t = Y_t + e_t$, where $Y_t$ defines the corresponding population value and $e_t$ is the sampling error, assumed to follow an AR(2) process. The state-space model is therefore:

Observation equation: $y_t = L_t + S_t + e_t + \varepsilon_t$
Transition equations:
$L_t = L_{t-1} + R_{t-1} + \eta_{1t}$; $R_t = R_{t-1} + \eta_{2t}$;
$S_t = -S_{t-1} - S_{t-2} - S_{t-3} + \eta_{3t}$ (or trigonometric model);
$e_t = \phi_1 e_{t-1} + \phi_2 e_{t-2} + \eta_{4t}$.
5 State-space representation of the extended BSM (quarterly series)

Let $\beta_t = (L_t, R_t, S_t, S_{t-1}, S_{t-2}, e_t, e_{t-1})'$.

Observation equation: $y_t = (1, 0, 1, 0, 0, 1, 0)\,\beta_t + \varepsilon_t$

Transition equation: $\beta_t = T\beta_{t-1} + \eta_t$, with

$T = \begin{pmatrix} 1 & 1 & 0 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & -1 & -1 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & \phi_1 & \phi_2 \\ 0 & 0 & 0 & 0 & 0 & 1 & 0 \end{pmatrix}$,  $\eta_t = (\eta_{1t}, \eta_{2t}, \eta_{3t}, 0, 0, \eta_{4t}, 0)'$,

$Q = \mathrm{Cov}(\eta_t) = \mathrm{diag}(q_1, q_2, q_3, 0, 0, q_4, 0)$.
6 Possible initialization for the extended BSM

$\beta_t = (L_t, R_t, S_t, S_{t-1}, S_{t-2}, e_t, e_{t-1})'$. Except for $e_t$ and $e_{t-1}$, the other 5 components of $\beta_t$ are nonstationary. Also, as we know by now,

$E(e_t) = 0$, $\mathrm{Var}(e_t) = V_e = \dfrac{q_4}{1 - \rho_1\phi_1 - \rho_2\phi_2}$.

Hence: $\hat{\beta}_0 = (0, 0, 0, 0, 0, 0, 0)'$ and $P_0 = \mathrm{diag}(M, M, M, M, M, V_e, V_e)$, where M is a large number.
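The matrices of the extended BSM and the initialization just described can be set up directly in code. This is a minimal NumPy sketch; the values of `phi1`, `phi2`, `q4` and `M` are illustrative, not taken from the slides.

```python
import numpy as np

# Illustrative hyper-parameter values (not from the slides)
phi1, phi2, q4, M = 0.5, 0.2, 1.0, 1e7

# Transition matrix T of the quarterly extended BSM (slide 5)
T = np.array([
    [1, 1,  0,  0,  0,    0,    0],   # L_t = L_{t-1} + R_{t-1} + eta_1t
    [0, 1,  0,  0,  0,    0,    0],   # R_t = R_{t-1} + eta_2t
    [0, 0, -1, -1, -1,    0,    0],   # S_t = -S_{t-1} - S_{t-2} - S_{t-3} + eta_3t
    [0, 0,  1,  0,  0,    0,    0],
    [0, 0,  0,  1,  0,    0,    0],
    [0, 0,  0,  0,  0, phi1, phi2],   # e_t: AR(2) sampling error
    [0, 0,  0,  0,  0,    1,    0],
], dtype=float)
X = np.array([1, 0, 1, 0, 0, 1, 0], dtype=float)   # observation row vector

# Stationary variance of the AR(2) component: V_e = q4 / (1 - rho1*phi1 - rho2*phi2)
rho1 = phi1 / (1 - phi2)          # first autocorrelation of an AR(2)
rho2 = phi1 * rho1 + phi2         # second autocorrelation
Ve = q4 / (1 - rho1 * phi1 - rho2 * phi2)

# Diffuse start for the 5 nonstationary components,
# stationary moments for e_t, e_{t-1}
beta0 = np.zeros(7)
P0 = np.diag([M, M, M, M, M, Ve, Ve])
```

Iterating the AR(2) variance recursion to its fixed point reproduces the closed-form $V_e$, which is a quick way to check the autocorrelation formulas.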
7 Estimation of hyper-parameters

So far we assumed that the matrices $X_t$, $T$, $\Sigma_t$ and $Q$ are known. In practice, the elements of $\Sigma_t$ and $Q$ are usually unknown, and possibly also some of the elements of the transition matrix $T$. (In the previous example, $T$ contains $\phi_1$ and $\phi_2$.) Denote by $\lambda$ the vector of all the unknown parameters, often referred to as hyper-parameters. Under the classical (frequentist) approach, these parameters are estimated from the observed series using Maximum Likelihood (ML) or other standard techniques. Unlike in classical statistics, the observations $\{Y_t,\ t = 1, \ldots, N\}$ are not independent, and hence the likelihood is not a simple product of the densities of the individual observations.
8 Prediction Error Decomposition

Denote $Y_0 = \beta_0$ and let $Y_{(t)} = (Y_t, Y_{t-1}, \ldots, Y_0)$ denote the vector of observations up to and including time t. The following decomposition follows from repeated application of Bayes' theorem, where $f(\cdot)$ denotes the corresponding probability density functions (pdf):

$f[Y_{(N)}] = f(Y_N, Y_{N-1}, \ldots, Y_1, Y_0)$
$= f[Y_N \mid Y_{(N-1)}]\, f[Y_{(N-1)}]$
$= f[Y_N \mid Y_{(N-1)}]\, f[Y_{N-1} \mid Y_{(N-2)}]\, f[Y_{(N-2)}]$
$= \cdots = \Big\{\prod_{t=2}^{N} f[Y_t \mid Y_{(t-1)}]\Big\}\, f[Y_1 \mid \beta_0]\, f(\beta_0)$   $(Y_{(0)} = \beta_0)$.

Assuming that $\varepsilon_t$ and $\eta_t$ are normal, and likewise for $\beta_0$, each conditional distribution $f[Y_t \mid Y_{(t-1)}]$ is also normal:

$Y_t \mid Y_{(t-1)} \sim N(\hat{Y}_{t|t-1}, F_t)$,  $F_t = X_t P_{t|t-1} X_t' + \Sigma_t = \mathrm{Var}(Y_t - \hat{Y}_{t|t-1})$.
9 Estimation of hyper-parameters (cont.)

$f(Y_N, \ldots, Y_1, Y_0) = \Big\{\prod_{t=2}^{N} f[Y_t \mid Y_{(t-1)}]\Big\}\, f[Y_1 \mid \beta_0]\, f(\beta_0)$;  $Y_t \mid Y_{(t-1)} \sim N(\hat{Y}_{t|t-1}, F_t)$,  $F_t = X_t P_{t|t-1} X_t' + \Sigma_t = \mathrm{Var}(Y_t - \hat{Y}_{t|t-1}) = \mathrm{Var}(e_t)$, where $e_t$ is the innovation.

The log-likelihood therefore has the form

$\log[L(\lambda; Y_{(N)})] = -\dfrac{nN}{2}\log(2\pi) - \dfrac{1}{2}\sum_{t=1}^{N}\log|F_t| - \dfrac{1}{2}\sum_{t=1}^{N} e_t' F_t^{-1} e_t + \log f(\beta_0)$   [$n = \dim(Y_t)$].

We need to specify $f(\beta_0)$, which amounts to the initialization of the Kalman filter.

Possibilities:
1. Fix $\beta_0$ and add the unknown elements of $\beta_0$ to the vector $\lambda$. This means that we drop the term $\log f(\beta_0)$ from the likelihood. If $\beta_0$ is of high dimension [$p = \dim(\beta_0)$ is large], we end up with many more unknown parameters.
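The prediction error decomposition is what makes the likelihood computable: one pass of the Kalman filter produces the innovations $e_t$ and their variances $F_t$. A minimal sketch for the scalar random-walk-plus-noise (local level) model follows; the function and argument names are illustrative. For $N = 2$ the result can be checked against the joint Gaussian density of the observations computed directly.

```python
import numpy as np

def loglik_local_level(y, q, s2, m0, p0):
    """Log-likelihood of the local level model via the prediction
    error decomposition (one Kalman filter pass)."""
    ll, m, p = 0.0, m0, p0
    for yt in y:
        mp, pp = m, p + q                 # predict (T = 1): beta_{t|t-1}, P_{t|t-1}
        e = yt - mp                       # innovation e_t
        F = pp + s2                       # innovation variance F_t = P_{t|t-1} + Sigma
        ll += -0.5 * (np.log(2 * np.pi) + np.log(F) + e * e / F)
        k = pp / F                        # Kalman gain
        m, p = mp + k * e, pp - k * pp    # update: beta_{t|t}, P_t
    return ll
```

Because the model is linear and Gaussian, the sum of these conditional log-densities equals the log of the joint density of $(y_1, \ldots, y_N)$, which gives an easy correctness check.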
10 Initialization of the likelihood (cont.)

2. Regress $Y_1, \ldots, Y_m$ as a function of $\beta_m$ to obtain $\hat{\beta}_m$ and $P_m$, and base the likelihood on $\prod_{t=m+1}^{N} f[Y_t \mid Y_{(t-1)}]$. The value of m is set to the smallest value that yields an estimator with a proper covariance matrix $P_m$.
Example: If $n > p$, we can write $\hat{\beta}_1 = (X_1'\Sigma^{-1}X_1)^{-1} X_1'\Sigma^{-1} Y_1$, $P_1 = (X_1'\Sigma^{-1}X_1)^{-1}$ ($\Sigma$ is unknown), and base the likelihood on $\prod_{t=2}^{N} f[Y_t \mid Y_{(t-1)}]$. If $n \le p$ we need to sacrifice more observations.
3. Use the initialization procedure discussed before (and assume a distribution for $\beta_0$). Here again we need to sacrifice the first m observations until we get an estimator $\hat{\beta}_m$ with a proper covariance matrix $P_m$.
11 Maximum likelihood estimation (cont.)

Having established the log-likelihood, it is then maximized with respect to the elements contained in $\lambda$ (the hyper-parameters). The maximization process also yields the asymptotic covariance matrix of the estimators via the inverse information matrix.
12 Example: MLE for AR(1) models

Consider the AR(1) model $Z_t = \phi Z_{t-1} + \varepsilon_t$, where $Z_t = (Y_t - \mu)$.

$E(Z_t \mid Z_{t-1}) = \phi Z_{t-1}$, $\mathrm{Var}(Z_t \mid Z_{t-1}) = \sigma^2 = \mathrm{Var}(\varepsilon_t)$. We assume that $\varepsilon_t \sim N(0, \sigma^2)$.

$f(Z_1, \ldots, Z_N) = \Big\{\prod_{t=2}^{N} f(Z_t \mid Z_{t-1})\Big\} f(Z_1) = (2\pi\sigma^2)^{-(N-1)/2} \exp\Big[-\sum_{t=2}^{N} \dfrac{(Z_t - \phi Z_{t-1})^2}{2\sigma^2}\Big] f(Z_1)$.

Hence,

$\log(L) = -\dfrac{N-1}{2}\log(2\pi) - \dfrac{N-1}{2}\log(\sigma^2) - \sum_{t=2}^{N} \dfrac{(Z_t - \phi Z_{t-1})^2}{2\sigma^2} + \log[f(Z_1)]$.

We need to specify $f(Z_1)$.
13 Specification of $f(Z_1)$ for the AR(1) model

Possibilities:
1. Fix $Z_1$ and maximize the likelihood ignoring the last term ($f(Z_1) = \text{const.}$).
2. Assume $Z_1 \sim N\big[0, \sigma^2/(1-\phi^2)\big]$ and add

$\log[f(Z_1)] = -\dfrac{1}{2}\log(2\pi\sigma_Z^2) - \dfrac{Z_1^2}{2\sigma_Z^2}$, where $\sigma_Z^2 = \sigma^2/(1-\phi^2)$,

to the likelihood.

If we substitute $(Y_t - \mu)$ for $Z_t$ everywhere in the likelihood, we can also maximize with respect to $\mu$.
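Option 2 gives the exact (unconditional) AR(1) likelihood. A short sketch, with `ar1_loglik` a hypothetical helper name: since the stationary joint distribution of $(Z_1, \ldots, Z_N)$ is Gaussian with covariances $\sigma_Z^2 \phi^{|i-j|}$, the factorized likelihood can be verified against the joint density computed directly.

```python
import numpy as np

def ar1_loglik(z, phi, s2):
    """Exact AR(1) log-likelihood: Gaussian conditional terms plus the
    stationary density of the first observation (option 2 above)."""
    z = np.asarray(z, dtype=float)
    n = len(z)
    resid = z[1:] - phi * z[:-1]          # Z_t - phi * Z_{t-1}
    ll = -0.5 * (n - 1) * np.log(2 * np.pi * s2) - resid @ resid / (2 * s2)
    s2z = s2 / (1 - phi ** 2)             # Var(Z_1) = sigma^2 / (1 - phi^2)
    ll += -0.5 * np.log(2 * np.pi * s2z) - z[0] ** 2 / (2 * s2z)
    return ll
```

Maximizing this function over $(\phi, \sigma^2)$, with $(Y_t - \mu)$ substituted for $Z_t$ if $\mu$ is also unknown, gives the exact MLE of the slide.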
14 Smoothing

We have considered so far the prediction of the state vectors (i.e., the computation of $\hat{\beta}_{t+k|t}$, $k = 0, 1, 2, \ldots$) and the updating of the current state vector as new data become available, i.e., the computation of $\hat{\beta}_{t|t} = \hat{\beta}_t$. Having new data $Y_t$, we may want to modify the previous predictors $\hat{\beta}_{t-1}, \hat{\beta}_{t-2}, \ldots$, which is known as smoothing. For example, modify the estimates of the trend and seasonal effects produced in a previous year, which in X-11 terminology is called revision. Note: for the last time point N, $\hat{\beta}_{N|N} = \hat{\beta}_N$.
15 Smoothing of the previous state vector

We would like to modify (smooth) the estimator $\hat{\beta}_{t-1}$, computed from the observations $Y_1, \ldots, Y_{t-1}$, when a new observation $Y_t$ becomes available. We again have two independent predictors for $\beta_{t-1}$:

$\hat{\beta}_{t-1} = \beta_{t-1} + u_{t-1}^*$;   $E(u_{t-1}^* u_{t-1}^{*\prime}) = P_{t-1}$
$Y_t = X_t\beta_t + \varepsilon_t = X_t T\beta_{t-1} + X_t\eta_t + \varepsilon_t = X_t T\beta_{t-1} + v_t^*$;   $E(v_t^* v_t^{*\prime}) = X_t Q X_t' + \Sigma_t$

$u_{t-1}^*$ and $v_t^*$ are independent, given $\beta_{t-1}$.
16 Smoothing of previous state vector (cont.)

Set up the regression model

$\begin{pmatrix} \hat{\beta}_{t-1} \\ Y_t \end{pmatrix} = \begin{pmatrix} I \\ X_t T \end{pmatrix}\beta_{t-1} + \begin{pmatrix} u_{t-1}^* \\ v_t^* \end{pmatrix}$, where $\mathrm{Var}\begin{pmatrix} u_{t-1}^* \\ v_t^* \end{pmatrix} = V = \begin{pmatrix} P_{t-1} & 0 \\ 0 & X_t Q X_t' + \Sigma_t \end{pmatrix}$.

The generalized least squares (GLS) predictor is

$\hat{\beta}_{t-1|t} = \Big([I, T'X_t']\, V^{-1} \begin{pmatrix} I \\ X_t T \end{pmatrix}\Big)^{-1} [I, T'X_t']\, V^{-1} \begin{pmatrix} \hat{\beta}_{t-1} \\ Y_t \end{pmatrix} = \hat{\beta}_{t-1} + P_{t-1} T' P_{t|t-1}^{-1} (\hat{\beta}_t - \hat{\beta}_{t|t-1})$,

where $P_{t|t-1} = T P_{t-1} T' + Q$. The predictor $\hat{\beta}_{t-1}$ is updated based on the prediction error $(\hat{\beta}_t - T\hat{\beta}_{t-1}) = (\hat{\beta}_t - \hat{\beta}_{t|t-1})$.
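The equivalence between the stacked GLS predictor and the Kalman-style update form can be checked numerically. The sketch below uses an arbitrary 2-state, 1-observation system; all numeric values are illustrative.

```python
import numpy as np

# Illustrative system: 2 states, 1 observation
X = np.array([[1.0, 0.0]])
T = np.array([[1.0, 1.0], [0.0, 1.0]])
Q = np.diag([0.3, 0.1])
Sig = np.array([[0.5]])
beta_prev = np.array([0.2, -0.1])              # beta_hat_{t-1}
P_prev = np.array([[0.6, 0.1], [0.1, 0.4]])    # P_{t-1}
y = np.array([0.7])                            # new observation Y_t

# GLS on the stacked regression [beta_hat_{t-1}; Y_t] = [I; X T] beta_{t-1} + noise
C = np.vstack([np.eye(2), X @ T])
V = np.zeros((3, 3))
V[:2, :2] = P_prev
V[2:, 2:] = X @ Q @ X.T + Sig
Vi = np.linalg.inv(V)
w = np.concatenate([beta_prev, y])
beta_gls = np.linalg.solve(C.T @ Vi @ C, C.T @ Vi @ w)

# Kalman-style form: beta_hat_{t-1} + P_{t-1} T' P_{t|t-1}^{-1} (beta_hat_t - beta_hat_{t|t-1})
Pstar = T @ P_prev @ T.T + Q                   # P_{t|t-1}
pred = T @ beta_prev                           # beta_hat_{t|t-1}
Finv = np.linalg.inv(X @ Pstar @ X.T + Sig)    # F_t^{-1}
beta_upd = pred + Pstar @ X.T @ Finv @ (y - X @ pred)   # beta_hat_t (updated)
beta_sm = beta_prev + P_prev @ T.T @ np.linalg.solve(Pstar, beta_upd - pred)
```

The two expressions agree to machine precision, which is exactly the content of the GLS derivation on this slide.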
17 Prediction Error Variance

$P_{t-1|t} = E[(\hat{\beta}_{t-1|t} - \beta_{t-1})(\hat{\beta}_{t-1|t} - \beta_{t-1})'] = P_{t-1} - P_{t-1} T' P_{t|t-1}^{-1} (P_{t|t-1} - P_t) P_{t|t-1}^{-1} T P_{t-1} \le P_{t-1}$.

$P_{t-1|t} \le P_{t-1}$ means that $\mathrm{Var}(a'\hat{\beta}_{t-1|t}) \le \mathrm{Var}(a'\hat{\beta}_{t-1})$ for all vectors $a \ne 0$. As expected, smoothing reduces the variance of the predictors.

Recursive smoothing algorithm:

$\hat{\beta}_{t-1|N} = \hat{\beta}_{t-1} + P_{t-1} T' P_{t|t-1}^{-1} (\hat{\beta}_{t|N} - \hat{\beta}_{t|t-1})$
$P_{t-1|N} = E[(\hat{\beta}_{t-1|N} - \beta_{t-1})(\hat{\beta}_{t-1|N} - \beta_{t-1})'] = P_{t-1} - P_{t-1} T' P_{t|t-1}^{-1} (P_{t|t-1} - P_{t|N}) P_{t|t-1}^{-1} T P_{t-1}$.

The algorithm starts with $\hat{\beta}_{N|N} = \hat{\beta}_N$, $P_{N|N} = P_N$.
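The fixed-interval recursion is easy to implement once the filtered and one-step-ahead quantities from the forward pass are stored. A sketch for the scalar random-walk-plus-noise model; the function name and arguments are illustrative.

```python
import numpy as np

def local_level_smooth(y, q, s2, m0, p0):
    """Kalman filter forward pass followed by the recursive
    (fixed-interval) smoother above, scalar case with T = 1."""
    y = np.asarray(y, dtype=float)
    N = len(y)
    mf, pf = np.empty(N), np.empty(N)     # filtered: beta_hat_t, P_t
    mp, pp = np.empty(N), np.empty(N)     # predicted: beta_hat_{t|t-1}, P_{t|t-1}
    m, p = m0, p0
    for t in range(N):
        mp[t], pp[t] = m, p + q           # predict
        k = pp[t] / (pp[t] + s2)          # Kalman gain
        m = mp[t] + k * (y[t] - mp[t])    # update mean
        p = (1 - k) * pp[t]               # update variance
        mf[t], pf[t] = m, p
    ms, ps = mf.copy(), pf.copy()         # start: beta_hat_{N|N} = beta_hat_N
    for t in range(N - 2, -1, -1):        # backward recursion
        A = pf[t] / pp[t + 1]             # P_t T' P_{t+1|t}^{-1}
        ms[t] = mf[t] + A * (ms[t + 1] - mp[t + 1])
        ps[t] = pf[t] - A * (pp[t + 1] - ps[t + 1]) * A
    return ms, ps, mf, pf
```

Running it on a few points lets one verify the variance claim of this slide: the smoothed variances never exceed the filtered ones, and at the last time point the two coincide.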
18 Example: Random walk + noise

$y_t = \theta_t + \varepsilon_t$, $\mathrm{Var}(\varepsilon_t) = \sigma^2$   ($X_t = 1$)
$\theta_t = \theta_{t-1} + \eta_t$, $\mathrm{Var}(\eta_t) = q$   ($T = 1$)

For this model, $P_{t+1|t} = p_t + q$, so the smoothing weight is $p_t/(p_t + q)$ and

$\hat{\theta}_{t|N} = \Big(1 - \dfrac{p_t}{p_t + q}\Big)\hat{\theta}_t + \dfrac{p_t}{p_t + q}\,\hat{\theta}_{t+1|N}$.

As $t \to \infty$, $p_t \to \text{const} = p$, $p/(p + q) \to c$, and

$\hat{\theta}_{t|N} = (1 - c)\hat{\theta}_t + c\,\hat{\theta}_{t+1|N} = \hat{\theta}_t + c(\hat{\theta}_{t+1|N} - \hat{\theta}_t)$.

(See exercise.)
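The claim that $p_t$ converges can be checked by iterating the filter's variance recursion (predict: add $q$; update: multiply by $\sigma^2/(p^* + \sigma^2)$) until it reaches its fixed point; in this scalar case the fixed point also solves a quadratic in closed form. The values of `q` and `s2` are illustrative.

```python
import numpy as np

q, s2 = 0.4, 1.0          # illustrative signal and noise variances
p = 1.0                   # arbitrary starting value for the updated variance
for _ in range(200):
    pstar = p + q                     # prediction variance p*_{t+1} = p_t + q
    p = pstar * s2 / (pstar + s2)     # updated variance p_{t+1}
c = p / (p + q)                       # limiting smoothing weight
```

At the fixed point, $p$ solves $p^2 + pq - q\sigma^2 = 0$, i.e. $p = \big(-q + \sqrt{q^2 + 4q\sigma^2}\big)/2$; with these illustrative values $c \approx 0.54$, so roughly half the revision weight goes to the future smoothed value.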
More informationVehicle Arrival Models : Headway
Chaper 12 Vehicle Arrival Models : Headway 12.1 Inroducion Modelling arrival of vehicle a secion of road is an imporan sep in raffic flow modelling. I has imporan applicaion in raffic flow simulaion where
More informationZápadočeská Univerzita v Plzni, Czech Republic and Groupe ESIEE Paris, France
ADAPTIVE SIGNAL PROCESSING USING MAXIMUM ENTROPY ON THE MEAN METHOD AND MONTE CARLO ANALYSIS Pavla Holejšovsá, Ing. *), Z. Peroua, Ing. **), J.-F. Bercher, Prof. Assis. ***) Západočesá Univerzia v Plzni,
More informationSolutions to Odd Number Exercises in Chapter 6
1 Soluions o Odd Number Exercises in 6.1 R y eˆ 1.7151 y 6.3 From eˆ ( T K) ˆ R 1 1 SST SST SST (1 R ) 55.36(1.7911) we have, ˆ 6.414 T K ( ) 6.5 y ye ye y e 1 1 Consider he erms e and xe b b x e y e b
More informationTemporal probability models. Chapter 15, Sections 1 5 1
Temporal probabiliy models Chaper 15, Secions 1 5 Chaper 15, Secions 1 5 1 Ouline Time and uncerainy Inerence: ilering, predicion, smoohing Hidden Markov models Kalman ilers (a brie menion) Dynamic Bayesian
More informationThe General Linear Test in the Ridge Regression
ommunicaions for Saisical Applicaions Mehods 2014, Vol. 21, No. 4, 297 307 DOI: hp://dx.doi.org/10.5351/sam.2014.21.4.297 Prin ISSN 2287-7843 / Online ISSN 2383-4757 The General Linear Tes in he Ridge
More informationBias in Conditional and Unconditional Fixed Effects Logit Estimation: a Correction * Tom Coupé
Bias in Condiional and Uncondiional Fixed Effecs Logi Esimaion: a Correcion * Tom Coupé Economics Educaion and Research Consorium, Naional Universiy of Kyiv Mohyla Academy Address: Vul Voloska 10, 04070
More informationHidden Markov Models
Hidden Markov Models Probabilisic reasoning over ime So far, we ve mosly deal wih episodic environmens Excepions: games wih muliple moves, planning In paricular, he Bayesian neworks we ve seen so far describe
More informationMANY FACET, COMMON LATENT TRAIT POLYTOMOUS IRT MODEL AND EM ALGORITHM. Dimitar Atanasov
Pliska Sud. Mah. Bulgar. 20 (2011), 5 12 STUDIA MATHEMATICA BULGARICA MANY FACET, COMMON LATENT TRAIT POLYTOMOUS IRT MODEL AND EM ALGORITHM Dimiar Aanasov There are many areas of assessmen where he level
More information