
Four telling examples of Kalman Filters

Example 1: Signal plus noise

Measurement of a bandpass signal, center frequency .2 rad/sec, buried in highpass noise. The task is to dig out the quadrature part of the signal while rejecting the noise as well as possible.

State model, with quadrature, in-phase and disturbance states:

    [ x^q_{k+1} ]   [ .882  .876   0  ] [ x^q_k ]   [ w^q_k ]
    [ x^i_{k+1} ] = [ .876  .882   0  ] [ x^i_k ] + [ w^i_k ]
    [ x^d_{k+1} ]   [  0     0    .8  ] [ x^d_k ]   [ w^d_k ]

    y_k = C x_k + v_k,

    Q = diag( 5 ),    R = .

Steady-state predictor covariance and gain:

    Σ = [ .8337  .246   .7548
          .246   .2336  .225
          .7548  .225   5.753 ],    M = ( .2388  .949  .5732 )^T,

and the one-step predictor is

    ˆx_{k+1|k} = (A + MC) ˆx_{k|k} + M y_k.
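As a companion to this example, the following is a minimal sketch (Python/NumPy, not from the original slides) of how a steady-state predictor covariance Σ and gain can be computed by iterating the Riccati recursion for a generic model x_{k+1} = A x_k + w_k, y_k = C x_k + v_k. All numerical values below (pole radius, center frequency, the measurement row C, Q, R) are illustrative assumptions, not the lecture's numbers, and the gain convention here is ˆx_{k+1|k} = (A - KC) ˆx_{k|k-1} + K y_k.

```python
import numpy as np

def steady_state_predictor(A, C, Q, R, iters=2000):
    """Iterate the one-step-predictor Riccati recursion
       Sigma <- A Sigma A' - A Sigma C'(C Sigma C' + R)^-1 C Sigma A' + Q
    and return the (approximately) steady-state covariance Sigma and the
    predictor gain K = A Sigma C'(C Sigma C' + R)^-1."""
    n = A.shape[0]
    Sigma = np.eye(n)
    for _ in range(iters):
        S = C @ Sigma @ C.T + R                  # innovation covariance
        K = A @ Sigma @ C.T @ np.linalg.inv(S)   # predictor gain
        Sigma = A @ Sigma @ A.T - K @ C @ Sigma @ A.T + Q
    return Sigma, K

# Illustrative placeholder model: a lightly damped oscillator (the signal) plus
# a first-order colored-noise state; every value here is assumed for the sketch.
w, r = 1.2, 0.99                                  # assumed center frequency and pole radius
rot = r * np.array([[np.cos(w), -np.sin(w)],
                    [np.sin(w),  np.cos(w)]])
A = np.block([[rot, np.zeros((2, 1))],
              [np.zeros((1, 2)), np.array([[0.8]])]])
C = np.array([[0.0, 1.0, 1.0]])                   # assumed: measure in-phase signal + noise state
Q = np.diag([1e-5, 1e-5, 1.0])                    # assumed relative signal/noise powers
R = np.array([[0.1]])                             # assumed measurement noise power

Sigma, K = steady_state_predictor(A, C, Q, R)
print("steady-state covariance:\n", Sigma)
print("predictor gain:\n", K)
```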

(Figure: Bode magnitude plots of the noise-to-state transfer functions; curves: in-phase, quadrature, noise; magnitude versus discrete frequency, rad/sec.)

(Figure: Magnitude plots of signal, noise and KF spectra; curves: state spectrum, Kalman filter magnitude, noise spectrum; magnitude versus discrete frequency, rad/sec. Q = diag( 5 ).)

(Figure: Magnitude plots of signal, noise and KF spectra; curves: state spectrum, noise spectrum, Kalman filter magnitude; magnitude versus discrete frequency, rad/sec. Q = diag( ).)

Kalman Filter Operation

The Kalman Filter produces a standard filter in the Digital Signal Processing sense. By picking out state elements, it is possible to access estimates of specific signal components. Specification of the following is key to the operation of the Kalman Filter:
- a signal state model (the things we want),
- a noise state model (the things we want to reject),
- the relative power of the signals: Q is like the signal power, R is like the noise power.
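To make the "standard DSP filter" point concrete, here is a small sketch (Python/NumPy; it reuses the steady_state_predictor helper and placeholder model from the sketch above, so every numerical value remains an assumption) that evaluates the frequency response from the measurement y_k to one element of the state estimate for the time-invariant predictor ˆx_{k+1|k} = (A - KC) ˆx_{k|k-1} + K y_k. The magnitude of this response is the kind of curve labelled "Kalman filter magnitude" in the spectra plots above.

```python
import numpy as np

def predictor_frequency_response(A, C, K, state_index, n_freq=512):
    """Frequency response H(e^{jw}) = e_i' (e^{jw} I - (A - K C))^{-1} K
    from the measurement to the chosen state estimate of the steady-state
    Kalman predictor, evaluated on [0, pi]."""
    n = A.shape[0]
    Acl = A - K @ C                              # predictor dynamics matrix
    freqs = np.linspace(0.0, np.pi, n_freq)
    H = np.empty(n_freq, dtype=complex)
    for i, w in enumerate(freqs):
        resolvent = np.linalg.inv(np.exp(1j * w) * np.eye(n) - Acl)
        H[i] = (resolvent @ K)[state_index, 0]
    return freqs, H

# Example (with A, C and the gain K from the previous sketch):
# freqs, H = predictor_frequency_response(A, C, K, state_index=0)
# np.abs(H) is the filter's magnitude response versus discrete frequency.
```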

Telling example 2: estimation of the mean

Consider the following signal:

    y_k = γ + v_k,

where v_k is white noise, zero mean, variance 1, and we want to estimate γ, the mean of y_k, using these measurements.

Obvious answer:

    ˆγ_k = 1/(k+1) \sum_{i=0}^{k} y_i,

the sample mean. The optimal mean estimator is the sample mean.

Kalman filter version: set up a state model so that x_k = γ,

    x_{k+1} = x_k + w_k,
    y_k = x_k + v_k,

    A = 1, C = 1, Q = 0, R = 1.

A = 1, C = 1, Q = 0, R = 1.

The Riccati recursion

    Σ_{k+1|k} = A_k Σ_{k|k} A_k^T - A_k Σ_{k|k} C_k^T (C_k Σ_{k|k} C_k^T + R_k)^{-1} C_k Σ_{k|k} A_k^T + Q_k

becomes

    Σ_{k+1|k} = Σ_{k|k} - Σ_{k|k}^2 (Σ_{k|k} + 1)^{-1} = Σ_{k|k} / (Σ_{k|k} + 1),

so that

    Σ_{k|k} = Σ_0 / (1 + k Σ_0),    K_k = Σ_0 / (1 + (k+1) Σ_0),

and

    ˆγ_{k+1|k} = (1 - Σ_0/(1 + (k+1)Σ_0)) ˆγ_{k|k-1} + Σ_0/(1 + (k+1)Σ_0) y_k
               = (1 + kΣ_0)/(1 + (k+1)Σ_0) ˆγ_{k|k-1} + Σ_0/(1 + (k+1)Σ_0) y_k
               = 1/(1 + (k+1)Σ_0) ˆγ_0 + Σ_0/(1 + (k+1)Σ_0) \sum_{i=0}^{k} y_i.
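A quick numerical check of this reduction (a minimal Python/NumPy sketch, not part of the original slides; the true mean, sample size and seed are arbitrary): run the scalar recursion with a very large prior covariance Σ_0 and compare against the running sample mean.

```python
import numpy as np

rng = np.random.default_rng(0)
gamma = 2.5                                   # assumed true mean, for illustration
y = gamma + rng.normal(0.0, 1.0, size=200)    # y_k = gamma + v_k with R = 1

Sigma = 1e9                                   # huge Sigma_0, approximating Sigma_0 = infinity
gamma_hat = 0.0
kf_estimates = []
for yk in y:
    K = Sigma / (Sigma + 1.0)                 # gain for A = C = 1, Q = 0, R = 1
    gamma_hat = gamma_hat + K * (yk - gamma_hat)
    Sigma = Sigma - K * Sigma                 # = Sigma / (Sigma + 1)
    kf_estimates.append(gamma_hat)

sample_mean = np.cumsum(y) / np.arange(1, len(y) + 1)
print(np.max(np.abs(np.array(kf_estimates) - sample_mean)))   # essentially zero
```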

Sample Mean Estimator

Points to note:
- Kalman knows better than you about the optimal estimator: it is the same as the sample mean if Σ_0 = ∞.
- The propagation of the covariance is critical to understanding the optimal estimator.
- The time-varying filter is optimal, BUT the stationary filter has K_k = 0. The stationary filter is not even asymptotically stable.
- AHA! The filtering problem has [A, Q^{1/2}] with uncontrollable modes on the unit circle, so no stabilizing solution exists.
- The Kalman Filter is nothing to be afraid of, although it can misbehave if we do not know what we are doing.

Telling Example 3: Least Squares Regression

Example: AutoRegressive exogenous input modeling,

    y_k = -a_1 y_{k-1} - a_2 y_{k-2} - ... - a_n y_{k-n} + b_1 u_{k-1} + ... + b_n u_{k-n} + e_k,

or

    A(z) y_k = z^{-1} B(z) u_k + e_k,

the ARX model. We measure {y_k, u_k} and we want to estimate the parameter vector

    θ^T = ( a_1  a_2  ...  a_n  b_1  ...  b_n ).

State-space model for the parameters and measurements:

    θ_{k+1} = θ_k,
    y_k = φ_k^T θ_k + e_k,

where e_k is white noise and the regressor is

    φ_k^T = ( -y_{k-1}  -y_{k-2}  ...  -y_{k-n}  u_{k-1}  ...  u_{k-n} ).

This is in the time-varying Kalman Filtering formulation.

Kalman filter to estimate the state θ_k:

    θ_{k+1} = θ_k,
    y_k = φ_k^T θ_k + e_k,

    ˆθ_{k+1|k} = ˆθ_{k|k} + K_k [y_k - φ_k^T ˆθ_{k|k}],
    K_k = P_k φ_k (φ_k^T P_k φ_k + R)^{-1},
    P_{k+1} = P_k - P_k φ_k (φ_k^T P_k φ_k + R)^{-1} φ_k^T P_k.

Rewriting the recursions a little (matrix inversion lemma):

    P_{k+1}^{-1} = P_k^{-1} + φ_k R^{-1} φ_k^T = P_0^{-1} + \sum_{i=0}^{k} φ_i R^{-1} φ_i^T,
    K_k = P_{k+1} φ_k R^{-1}.

This recursion minimizes the criterion

    min_θ lim_{N→∞} (1/N) \sum_{i=1}^{N} E[y_i - φ_i^T θ]^2.
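The recursion above is implemented directly in the following minimal Python/NumPy sketch (not from the slides; the simulated first-order ARX system, its parameter values and the noise level are illustrative assumptions), which estimates θ = (a_1, b_1) from input-output data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an assumed first-order ARX system: y_k = -a1 y_{k-1} + b1 u_{k-1} + e_k
a1, b1, R = -0.7, 0.5, 0.01             # illustrative true parameters and noise variance
N = 500
u = rng.normal(size=N)                  # white input: persistently exciting
y = np.zeros(N)
for k in range(1, N):
    y[k] = -a1 * y[k - 1] + b1 * u[k - 1] + rng.normal(scale=np.sqrt(R))

# RLS = Kalman filter for the constant-parameter model theta_{k+1} = theta_k
theta_hat = np.zeros(2)                 # estimate of (a1, b1)
P = 1e3 * np.eye(2)                     # large P_0: little prior information
for k in range(1, N):
    phi = np.array([-y[k - 1], u[k - 1]])        # regressor phi_k
    K = P @ phi / (phi @ P @ phi + R)            # gain K_k
    theta_hat = theta_hat + K * (y[k] - phi @ theta_hat)
    P = P - np.outer(K, phi @ P)                 # covariance update P_{k+1}

print("estimated (a1, b1):", theta_hat, " true:", (a1, b1))
```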

Telling example 3: Recursive Least Squares

The previous KF is the Recursive Least Squares parameter estimation algorithm of system identification for fitting ARX models. The covariance P_k satisfies

    P_k^{-1} = P_0^{-1} + \sum_{i=0}^{k-1} φ_i R^{-1} φ_i^T.

It suffers from the same problems as the mean estimator. If the regressor sequence φ_i is stationary with full-rank covariance, then P_k → 0 and K_k → 0. This, of course, is the correct answer for minimizing the one-step-ahead prediction error.

The property that the φ_i have full-rank covariance is known as persistence of excitation. Without it, there is not enough information in y_k to let us converge to the correct θ.

Telling example 4: Differentiation of a Signal

In longwall coal-mining, roof collapse or gas-out of the floor is a safety and productivity problem: people and machines can be buried behind the collapse. Suppose we are given a signal s_t of the roof strain, as measured by instrumented breaker line supports, and we wish to generate an estimate of the strain rate which is not too noisy.

Telling example 4: Differentiation of a Signal

Set up a fictitious state-space model containing what we want:

    x^1_{k+1} = x^1_k + w_k,
    x^2_{k+1} = x^2_k + x^1_k,
    s_k = x^2_k + v_k.

Here we have posited a model which writes the measured signal s_k (which is the only one that really exists) as the noisy measurement of x^2_k, which in turn is the integral of x^1_k. AHA! Estimating x^1_k is the same as differentiating s_k.

    A = ( 1 0 ; 1 1 ),    C = ( 0 1 ),    Q = ( 1 0 ; 0 0 ) q,    R = r.
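For concreteness, here is a minimal Python/NumPy sketch (not from the slides; the q and r values and the test signal are illustrative assumptions) that runs the Kalman filter on this two-state model and returns the filtered derivative estimate x^1_{k|k}.

```python
import numpy as np

def kf_differentiator(s, q, r):
    """Time-varying Kalman filter for the two-state model
       x1_{k+1} = x1_k + w_k,  x2_{k+1} = x2_k + x1_k,  s_k = x2_k + v_k,
    returning filtered estimates of x1 (the derivative of s)."""
    A = np.array([[1.0, 0.0],
                  [1.0, 1.0]])
    C = np.array([0.0, 1.0])             # measure the integrated state x2
    Q = np.array([[q, 0.0],
                  [0.0, 0.0]])           # process noise drives x1 only
    x = np.zeros(2)                      # state estimate
    P = np.eye(2)                        # estimate covariance
    rate = np.empty(len(s))
    for k, sk in enumerate(s):
        # Measurement update with innovation variance C P C' + r
        S = C @ P @ C + r
        K = P @ C / S                    # Kalman gain (2-vector)
        x = x + K * (sk - C @ x)
        P = P - np.outer(K, C @ P)
        rate[k] = x[0]                   # derivative estimate x1_{k|k}
        # Time update
        x = A @ x
        P = A @ P @ A.T + Q
    return rate

# Illustrative use: differentiate a noisy ramp-plus-sine signal.
t = np.arange(1000)
s = 0.01 * t + np.sin(0.02 * t) + 0.1 * np.random.default_rng(2).normal(size=1000)
rate_smooth = kf_differentiator(s, q=1e-6, r=0.01)   # small q/r: heavy smoothing
rate_fast   = kf_differentiator(s, q=1e-2, r=0.01)   # larger q/r: faster, noisier
```

Increasing q relative to r widens the differentiator's bandwidth (faster but noisier rate estimates), while decreasing it gives heavier smoothing, which is the Q/R tuning knob referred to in the figure below.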

(Figure: Frequency response magnitude plots of the KF differentiator for several settings of Q and R; magnitude versus discrete frequency, rad/sec.)

Differentiator performance is tuned using the Q/R knob.