Extended Kalman Filter Derivation
Example application: frequency tracking


Nonlinear State Space Model

Consider the nonlinear state-space model of a random process

    x_{n+1} = f_n(x_n) + g_n(x_n) u_n
    y_n = h_n(x_n) + v_n

where

    \langle u_n, u_m \rangle = Q_n \delta_{n,m}
    \langle v_n, v_m \rangle = R_n \delta_{n,m}
    \langle x_0, x_0 \rangle = \Pi_0

This assumes no interaction between the process and measurement noise,

    \langle u_n, v_m \rangle = 0

Nonlinear Functions

f_n(x_n), g_n(x_n), and h_n(x_n) are multivariate, time-varying functions:

    f_n(x_n) = \begin{bmatrix} f_{n,1}(x_n) \\ f_{n,2}(x_n) \\ \vdots \\ f_{n,l}(x_n) \end{bmatrix}
    \quad
    g_n(x_n) = \begin{bmatrix} g_{n,11}(x_n) & \cdots & g_{n,1m}(x_n) \\ g_{n,21}(x_n) & \cdots & g_{n,2m}(x_n) \\ \vdots & & \vdots \\ g_{n,l1}(x_n) & \cdots & g_{n,lm}(x_n) \end{bmatrix}
    \quad
    h_n(x_n) = \begin{bmatrix} h_{n,1}(x_n) \\ h_{n,2}(x_n) \\ \vdots \\ h_{n,p}(x_n) \end{bmatrix}

Taylor Series Approximations

Now consider a first-order Taylor series expansion of f_n(x) and h_n(x), and a zero-order expansion of g_n(x), about \hat{x}_n:

    f_n(x_n) \approx f_n(\hat{x}_n) + F_n (x_n - \hat{x}_n)
    h_n(x_n) \approx h_n(\hat{x}_n) + H_n (x_n - \hat{x}_n)
    g_n(x_n) \approx g_n(\hat{x}_n)

where the matrices F_n and H_n are the Jacobians of f_n(x) and h_n(x) evaluated at \hat{x}_n:

    F_n = \left. \frac{\partial f_n(x)}{\partial x} \right|_{x=\hat{x}_n} \ (l \times l)
    \qquad
    G_n = g_n(\hat{x}_n) \ (l \times m)
    \qquad
    H_n = \left. \frac{\partial h_n(x)}{\partial x} \right|_{x=\hat{x}_n} \ (p \times l)

J. McNames, Portland State University, ECE 539/639, Extended Kalman Filter, Ver..2
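The role of the Jacobian in the expansion can be checked numerically: an affine approximation built from a finite-difference Jacobian should match the true function to second order in the state residual. A minimal scalar sketch in Python; the function f(x) = sin(x) and the linearization point are arbitrary illustrative choices, not from the lecture:

```python
import math

# A stand-in scalar state function f(x) and its Jacobian (here just a
# derivative), computed by central finite differences.
def f(x):
    return math.sin(x)

def jacobian(func, xh, eps=1e-6):
    # Central finite-difference approximation of d(func)/dx at xh
    return (func(xh + eps) - func(xh - eps)) / (2*eps)

xh = 0.5                 # linearization point, i.e. the estimate x-hat
F = jacobian(f, xh)      # plays the role of F_n

def f_lin(x):
    # First-order Taylor approximation of f about xh
    return f(xh) + F*(x - xh)

# The affine approximation error shrinks quadratically in the residual
for dx in (0.1, 0.01):
    print(dx, abs(f(xh + dx) - f_lin(xh + dx)))
```

This quadratic decay of the approximation error is why, as discussed below, the EKF behaves best when the state estimate is already close to the true state.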

Taylor Series Approximations: Linearization Point

The Taylor series expansions could be taken about a different estimate of x_n for each of the three nonlinear functions:

- \hat{x}_{n|n} is usually used for f_n(\cdot) and g_n(\cdot)
- \hat{x}_{n|n-1} is usually used for h_n(\cdot)
- The best possible estimate should be used in each case to minimize the error in the Taylor series approximations
- Note that, even if the estimates \hat{x}_n and \hat{y}_n are unbiased, the estimates \hat{x}_{n+1|n} and \hat{y}_{n|n-1} will be biased

Jacobians

In expanded form, the Jacobians are given by

    \frac{\partial f_n(x)}{\partial x} = \begin{bmatrix} \frac{\partial f_{n,1}(x)}{\partial x(1)} & \frac{\partial f_{n,1}(x)}{\partial x(2)} & \cdots \\ \frac{\partial f_{n,2}(x)}{\partial x(1)} & \frac{\partial f_{n,2}(x)}{\partial x(2)} & \cdots \\ \vdots & \vdots & \\ \frac{\partial f_{n,l}(x)}{\partial x(1)} & \frac{\partial f_{n,l}(x)}{\partial x(2)} & \cdots \end{bmatrix}
    \qquad
    \frac{\partial h_n(x)}{\partial x} = \begin{bmatrix} \frac{\partial h_{n,1}(x)}{\partial x(1)} & \frac{\partial h_{n,1}(x)}{\partial x(2)} & \cdots \\ \frac{\partial h_{n,2}(x)}{\partial x(1)} & \frac{\partial h_{n,2}(x)}{\partial x(2)} & \cdots \\ \vdots & \vdots & \\ \frac{\partial h_{n,p}(x)}{\partial x(1)} & \frac{\partial h_{n,p}(x)}{\partial x(2)} & \cdots \end{bmatrix}

Affine State Space Approximation

    x_{n+1} \approx f_n(\hat{x}_n) + F_n (x_n - \hat{x}_n) + G_n u_n
            = F_n x_n + \underbrace{(f_n(\hat{x}_n) - F_n \hat{x}_n)}_{\text{requires estimation}} + G_n u_n

    y_n \approx h_n(\hat{x}_n) + H_n (x_n - \hat{x}_n) + v_n
        = H_n x_n + \underbrace{(h_n(\hat{x}_n) - H_n \hat{x}_n)}_{\text{requires estimation}} + v_n

- The performance of the extended Kalman filter depends most critically on the accuracy of the Taylor series approximations
- These are most accurate when the state residual (x_n - \hat{x}_n) is minimized
- The EKF simply uses the best estimates that are available at the time that the residuals must be calculated

Accounting for the Nonzero Mean

    x_{n+1} \approx f_n(\hat{x}_n) + F_n (x_n - \hat{x}_n) + G_n u_n
    y_n \approx h_n(\hat{x}_n) + H_n (x_n - \hat{x}_n) + v_n

The Taylor series approximation results in an affine model. There are two key differences between this model and the linear model:

- x_n and y_n have nonzero means
- The coefficients are now random variables, not constants

The typical approach to converting this to our linear model is to subtract the means from both sides.

Redefining the Normal Equations

The optimal estimate of the state is now defined as the optimal affine transformation of the observations,

    \hat{x}_{n+1|n} \triangleq K_o y_n + k = K_o (y_n - \mathrm{E}[y_n]) + \mathrm{E}[x_{n+1}]

In terms of the normal equations we have

    \hat{x}_{n+1|n} = \langle x_{n+1}, \mathrm{col}\{1, y_0, \ldots, y_n\} \rangle \, \|\mathrm{col}\{1, y_0, \ldots, y_n\}\|^{-2} \, \mathrm{col}\{1, y_0, \ldots, y_n\}
                    = \langle x_{n+1}, \mathrm{col}\{1, e_0, \ldots, e_n\} \rangle \, \|\mathrm{col}\{1, e_0, \ldots, e_n\}\|^{-2} \, \mathrm{col}\{1, e_0, \ldots, e_n\}
                    = \mu_{n+1} + \sum_{k=0}^{n} \langle x_{n+1}, e_k \rangle R_{e,k}^{-1} e_k

where the first innovation is now defined as e_0 \triangleq y_0 - \hat{y}_0 = y_0 - \mathrm{E}[y_0], and all of the innovations have zero mean by definition.

Centering the Output Equation

    \mathrm{E}[y_n] = \mathrm{E}[h_n(\hat{x}_n)] + H_n (\mathrm{E}[x_n] - \mathrm{E}[\hat{x}_n]) + \mathrm{E}[v_n]
                    = \mathrm{E}[h_n(\hat{x}_n)] + H_n (\mathrm{E}[x_n] - \mathrm{E}[\hat{x}_n])

Now if we assume that \hat{x}_n is nearly deterministic (in other words, nearly perfect), then

    \mathrm{E}[\hat{x}_n] \approx \hat{x}_n
    \mathrm{E}[h_n(\hat{x}_n)] \approx h_n(\hat{x}_n)

and this simplifies to

    \mathrm{E}[y_n] \approx h_n(\hat{x}_n) + H_n (\mathrm{E}[x_n] - \hat{x}_n)

Then the centered output is given by

    y_{c,n} \triangleq y_n - \mathrm{E}[y_n]
            \approx [h_n(\hat{x}_n) + H_n (x_n - \hat{x}_n) + v_n] - [h_n(\hat{x}_n) + H_n (\mathrm{E}[x_n] - \hat{x}_n)]
            = H_n (x_n - \mathrm{E}[x_n]) + v_n
            = H_n x_{c,n} + v_n

State Mean

The expected value of x_{n+1} is

    \mathrm{E}[x_{n+1}] = F_n \mathrm{E}[x_n] + \mathrm{E}[f_n(\hat{x}_n)] - F_n \mathrm{E}[\hat{x}_n] + G_n \mathrm{E}[u_n]
    \mu_{n+1} = F_n \mu_n + \mathrm{E}[f_n(\hat{x}_n)] - F_n \mathrm{E}[\hat{x}_n]
              = F_n (\mu_n - \mathrm{E}[\hat{x}_n]) + \mathrm{E}[f_n(\hat{x}_n)]

Thus we obtain a recursive update equation for the state mean. Again, if we assume that \hat{x}_n is nearly deterministic,

    \mathrm{E}[\hat{x}_n] \approx \hat{x}_n
    \mathrm{E}[f_n(\hat{x}_n)] \approx f_n(\hat{x}_n)

the mean recursion then simplifies to

    \mu_{n+1} = F_n (\mu_n - \hat{x}_n) + f_n(\hat{x}_n)

Centering the State

    \mu_{n+1} = F_n (\mu_n - \hat{x}_n) + f_n(\hat{x}_n)

    x_{c,n+1} \triangleq x_{n+1} - \mu_{n+1}
              = (F_n x_n + f_n(\hat{x}_n) - F_n \hat{x}_n + G_n u_n) - (F_n \mu_n + f_n(\hat{x}_n) - F_n \hat{x}_n)
              = F_n (x_n - \mu_n) + G_n u_n
              = F_n x_{c,n} + G_n u_n

Thus our complete linearized state-space model becomes

    x_{c,n+1} = F_n x_{c,n} + G_n u_n
    y_{c,n} = H_n x_{c,n} + v_n

Linearization Points

    x_{n+1} \approx f_n(\hat{x}_n) + F_n (x_n - \hat{x}_n) + G_n u_n
    y_n \approx h_n(\hat{x}_n) + H_n (x_n - \hat{x}_n) + v_n

- The best estimate of x_n for the state prediction is \hat{x}_n = \hat{x}_{n|n}
- This cannot be used for the output estimate because the innovation e_n = y_n - \hat{y}_{n|n-1} is required to calculate \hat{x}_{n|n}
- Thus \hat{x}_n = \hat{x}_{n|n-1} is used for the output linearization point
- These estimates result in several simplifications

Estimating the State

    x_{c,n+1} = F_n x_{c,n} + G_n u_n
    y_{c,n} = H_n x_{c,n} + v_n

The KF recursions will give us the optimal estimated state for the linearized model, \hat{x}_{c,n|n} and \hat{x}_{c,n|n-1}. Conversion to the state estimates requires estimation of the state mean:

    \hat{x}_{n|n} = \hat{x}_{c,n|n} + \mu_n
    \hat{x}_{n|n-1} = \hat{x}_{c,n|n-1} + \mu_n

It would be convenient if we could estimate the state directly.

Predicted State Recursions

    \mu_{n+1} = F_n (\mu_n - \hat{x}_{n|n}) + f_n(\hat{x}_{n|n})

    \hat{x}_{n+1|n} = \hat{x}_{c,n+1|n} + \mu_{n+1}
                    = F_n \hat{x}_{c,n|n} + [F_n (\mu_n - \hat{x}_{n|n}) + f_n(\hat{x}_{n|n})]
                    = F_n (\hat{x}_{c,n|n} + \mu_n) - F_n \hat{x}_{n|n} + f_n(\hat{x}_{n|n})
                    = F_n \hat{x}_{n|n} - F_n \hat{x}_{n|n} + f_n(\hat{x}_{n|n})
                    = f_n(\hat{x}_{n|n})

Thus the time update can be obtained directly in terms of the estimated state without having to estimate the mean:

    \hat{x}_{n+1|n} = f_n(\hat{x}_{n|n})

Filtered State Recursions

    \hat{x}_{n|n} = \hat{x}_{c,n|n} + \mu_n
                  = (\hat{x}_{c,n|n-1} + K_{f,n} e_n) + \mu_n
                  = (\hat{x}_{c,n|n-1} + \mu_n) + K_{f,n} e_n
                  = \hat{x}_{n|n-1} + K_{f,n} e_n

Thus the measurement update can also be obtained directly in terms of the estimated state without having to estimate the state mean.
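The cancellation in the predicted state recursion (the mean terms drop out, leaving the prediction equal to f applied to the filtered estimate) can be verified numerically. A scalar sketch in Python; the function f and all numbers are arbitrary stand-ins, not from the lecture:

```python
import math

# Scalar numeric check that the centered-state prediction collapses
# to f(xh_filt), i.e. to f_n evaluated at the filtered estimate.
def f(x):
    return 0.9*x + 0.2*math.sin(x)   # arbitrary smooth state function

xh_filt = 0.7                        # filtered estimate x-hat_{n|n}
mu      = 0.2                        # state mean mu_n (assumed known)
xc_filt = xh_filt - mu               # centered filtered state
F = 0.9 + 0.2*math.cos(xh_filt)      # Jacobian of f at xh_filt

mu_next = F*(mu - xh_filt) + f(xh_filt)   # mean recursion mu_{n+1}
xc_next = F*xc_filt                       # centered prediction
xh_pred = xc_next + mu_next               # uncentered prediction

print(xh_pred, f(xh_filt))           # agree, as the derivation shows
```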

Estimating the Output

The affine model for the output is

    y_n \approx h_n(\hat{x}_{n|n-1}) + H_n (x_n - \hat{x}_{n|n-1}) + v_n

Then the best affine estimate of y_n given \mathcal{L}\{1, y_0, \ldots, y_{n-1}\} is given by

    \hat{y}_{n|n-1} = \hat{h}_n(\hat{x}_{n|n-1}) + H_n (\hat{x}_{n|n-1} - \hat{x}_{n|n-1})
                    = \hat{h}_n(\hat{x}_{n|n-1})
                    \approx h_n(\hat{x}_{n|n-1})

As before, the last approximation assumes that \hat{x}_{n|n-1} is nearly deterministic, so that \hat{h}_n(\hat{x}_{n|n-1}) \approx h_n(\hat{x}_{n|n-1}).

Innovations

It would be convenient if we could express the innovations directly in terms of the observations, rather than the centered observations. This would also eliminate the need to estimate E[y_n]:

    e_n \triangleq y_{c,n} - \hat{y}_{c,n|n-1}
        = (y_n - \mathrm{E}[y_n]) - (\hat{y}_{n|n-1} - \mathrm{E}[y_n])
        = y_n - \hat{y}_{n|n-1}
        \approx y_n - h_n(\hat{x}_{n|n-1})

Linearization Summary

- The Kalman filter is optimal for the standard state space model: it minimizes the mean square error over all linear estimators
- The EKF is only "approximately optimal" (a nearly meaningless statement)
- Three approximations are involved:
  - First- and zero-order Taylor series approximations
  - \hat{x}_{n|n-1} is assumed unbiased in the output approximation
  - \hat{x}_{n|n} and \hat{x}_{n|n-1} are assumed to be nearly deterministic
- It is not clear to me how critical each of these assumptions is

Extended Kalman Filter (unsimplified)

    \hat{x}_{0|-1} = \bar{x}_0
    P_{0|-1} = \Pi_0

    e_n = y_n - h_n(\hat{x}_{n|n-1})
    R_{e,n} = R_n + H_n P_{n|n-1} H_n^*
    K_{f,n} = P_{n|n-1} H_n^* R_{e,n}^{-1}
    \hat{x}_{n|n} = \hat{x}_{n|n-1} + K_{f,n} e_n
    \hat{x}_{n+1|n} = f_n(\hat{x}_{n|n})
    P_{n|n} = P_{n|n-1} - K_{f,n} R_{e,n} K_{f,n}^*
    P_{n+1|n} = F_n P_{n|n} F_n^* + G_n Q_n G_n^*
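As a numeric sanity check of the measurement-update equations, here is one update step for a made-up 2-state, scalar-output model in pure Python. H plays the role of the Jacobian H_n evaluated at the predicted state; the two equivalent covariance forms (the K R_e K^T form here and the (I - K_f H) P form derived below) are compared:

```python
# One EKF measurement update for a 2-state model with a scalar output.
# All numbers are arbitrary illustrative choices.
P = [[0.5, 0.1],
     [0.1, 0.3]]          # P_{n|n-1}, predicted state error covariance
H = [1.0, 0.2]            # H_n (1x2), linearized output matrix
R = 0.1                   # R_n, measurement noise variance

# R_{e,n} = R_n + H P H^T (scalar here)
PHt = [P[0][0]*H[0] + P[0][1]*H[1],
       P[1][0]*H[0] + P[1][1]*H[1]]        # P H^T
Re = R + H[0]*PHt[0] + H[1]*PHt[1]

# K_{f,n} = P H^T R_{e,n}^{-1}
K = [PHt[0]/Re, PHt[1]/Re]

# P_{n|n} = P_{n|n-1} - K R_{e,n} K^T
Pf = [[P[i][j] - K[i]*Re*K[j] for j in range(2)] for i in range(2)]

# Equivalent form: P_{n|n} = (I - K H) P_{n|n-1}
Pf2 = [[P[i][j] - K[i]*(H[0]*P[0][j] + H[1]*P[1][j]) for j in range(2)]
       for i in range(2)]
```

Both forms agree, and the diagonal of P_{n|n} never exceeds that of P_{n|n-1}: the update can only reduce the state uncertainty.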

Simplifying the EKF: Filtered Gain

There are a number of ways in which these nine equations can be simplified. Substituting the innovation covariance into the gain,

    K_{f,n} = P_{n|n-1} H_n^* (H_n P_{n|n-1} H_n^* + R_n)^{-1}

    R_{e,n} K_{f,n}^* = R_{e,n} (R_{e,n}^{-1} H_n P_{n|n-1}) = H_n P_{n|n-1}

so the filtered error covariance simplifies to

    P_{n|n} = P_{n|n-1} - K_{f,n} R_{e,n} K_{f,n}^*
            = P_{n|n-1} - K_{f,n} H_n P_{n|n-1}
            = (I - K_{f,n} H_n) P_{n|n-1}

Extended Kalman Filter (simplified)

The (Schmidt) extended Kalman filter can now be initialized with

    \hat{x}_{0|-1} = \bar{x}_0
    P_{0|-1} = \Pi_0

and the recursive algorithm can be expressed as

    K_{f,n} = P_{n|n-1} H_n^* (H_n P_{n|n-1} H_n^* + R_n)^{-1}
    \hat{x}_{n|n} = \hat{x}_{n|n-1} + K_{f,n} (y_n - h_n(\hat{x}_{n|n-1}))
    \hat{x}_{n+1|n} = f_n(\hat{x}_{n|n})
    P_{n|n} = (I - K_{f,n} H_n) P_{n|n-1}
    P_{n+1|n} = F_n P_{n|n} F_n^* + G_n Q_n G_n^*

Extended Kalman Filter (complete)

A more complete statement of the algorithm, including the linearization steps, is as follows:

    \hat{x}_{0|-1} = \bar{x}_0
    P_{0|-1} = \Pi_0

    H_n = \left. \frac{\partial h_n(x)}{\partial x} \right|_{x = \hat{x}_{n|n-1}}
    K_{f,n} = P_{n|n-1} H_n^* (H_n P_{n|n-1} H_n^* + R_n)^{-1}
    \hat{x}_{n|n} = \hat{x}_{n|n-1} + K_{f,n} (y_n - h_n(\hat{x}_{n|n-1}))
    F_n = \left. \frac{\partial f_n(x)}{\partial x} \right|_{x = \hat{x}_{n|n}}
    G_n = g_n(\hat{x}_{n|n})
    \hat{x}_{n+1|n} = f_n(\hat{x}_{n|n})
    P_{n|n} = (I - K_{f,n} H_n) P_{n|n-1}
    P_{n+1|n} = F_n P_{n|n} F_n^* + G_n Q_n G_n^*

where \partial/\partial x denotes the Jacobian operator.
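A minimal end-to-end sketch of the simplified recursions, in Python, on a toy scalar model (all model choices here are illustrative, not from the lecture). With a linear output, H_n = 1, so only f_n needs linearizing:

```python
import math, random

# Toy scalar model: x_{n+1} = 0.9 x_n + 0.2 sin(x_n) + u_n, y_n = x_n + v_n
random.seed(0)

def f(x):  return 0.9*x + 0.2*math.sin(x)
def F(x):  return 0.9 + 0.2*math.cos(x)   # Jacobian of f at x

q, r = 0.01, 0.1          # process / measurement noise variances
N = 300

# Simulate the true state and noisy measurements
x = 1.0
xs, ys = [], []
for n in range(N):
    x = f(x) + random.gauss(0.0, math.sqrt(q))
    xs.append(x)
    ys.append(x + random.gauss(0.0, math.sqrt(r)))

# EKF recursions (simplified form), with H_n = 1
xh, P = 0.0, 1.0          # initial predicted state and covariance
xf = []                   # filtered estimates
for y in ys:
    K  = P / (P + r)              # K = P H (H P H + R)^{-1}
    xh = xh + K*(y - xh)          # measurement update (h(x) = x here)
    P  = (1 - K)*P                # P_{n|n} = (I - K H) P_{n|n-1}
    xf.append(xh)
    Fn = F(xh)
    xh = f(xh)                    # time update: x-hat_{n+1|n} = f(x-hat_{n|n})
    P  = Fn*P*Fn + q              # P_{n+1|n} = F P F + Q

mse_meas = sum((y - x)**2 for y, x in zip(ys, xs)) / N
mse_ekf  = sum((e - x)**2 for e, x in zip(xf, xs)) / N
print(mse_meas, mse_ekf)   # the filtered estimates should be less noisy
```

Because the output is linear here, the only EKF-specific step is re-evaluating the Jacobian F_n at each new filtered estimate before the covariance time update.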

EKF Discussion

- The EKF can only be expected to perform well for nonlinear functions that can be accurately approximated locally about \hat{x}_n as linear functions
- This excludes functions with discontinuities and discontinuous slopes; only smooth functions should be used
- There is a positive feedback mechanism in the Taylor series approximations:
  - If the estimate \hat{x}_n is good, then the approximation will generally be accurate and the next estimate will be good (possibly better)
  - If the estimate is poor, then the next estimate will be poor (possibly worse)
- If the EKF starts to lose track, the approximation error in the Taylor series may exacerbate this problem in subsequent estimates

Example 1: Frequency Tracking

Develop a state space model to track the frequency of a quasi-periodic sinusoid measured from a noisy sensor. Consider the following continuous-time state space model:

    y(t) = a \sin(2\pi \bar{f} t + \theta(t)) + v(t)
    \dot{\theta}(t) = \omega(t)
    \dot{\omega}(t) = u(t)
    f(t) = \bar{f} + \frac{\omega(t)}{2\pi}

Example 1: Continuous- to Discrete-Time Conversion

Using a coarse CT-to-DT conversion, we obtain the DT state space model

    y(n) = a \sin(2\pi T_s \bar{f} n + \theta(n)) + v(n)
    \theta(n+1) = \theta(n) + T_s \omega(n)
    \omega(n+1) = \omega(n) + u(n)
    f(n) = \bar{f} + \frac{\omega(n)}{2\pi}

Note that the state update equation is linear:

    \begin{bmatrix} \theta(n+1) \\ \omega(n+1) \end{bmatrix} = \begin{bmatrix} 1 & T_s \\ 0 & 1 \end{bmatrix} \begin{bmatrix} \theta(n) \\ \omega(n) \end{bmatrix} + \begin{bmatrix} 0 \\ 1 \end{bmatrix} u(n)

    x_{n+1} = F x_n + u_n

where

    \langle u_n, u_k \rangle = \begin{bmatrix} 0 & 0 \\ 0 & \sigma_u^2 \end{bmatrix} T_s \, \delta_{k,n}

Example 1: Output Linearization

    y(n) = a \sin(2\pi T_s \bar{f} n + \theta(n)) + v(n)

So we have

    h_n(x_n) = a \sin(2\pi T_s \bar{f} n + \theta(n)) = a \sin(2\pi T_s \bar{f} n + x_n(1))

and

    H_n \triangleq \frac{\partial h_n(x)}{\partial x} = \begin{bmatrix} a \cos(2\pi T_s \bar{f} n + x_n(1)) & 0 \end{bmatrix}
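The pieces of this example can be written out directly. A Python sketch, with arbitrary illustrative constants; the finite-difference check at the end confirms the analytic Jacobian H_n:

```python
import math

# Model pieces from the example; x = (theta, omega), and the constants
# a, fbar, fs are illustrative choices, not from the lecture.
a    = 1.0          # sinusoid amplitude
fbar = 0.1          # a priori mean frequency f-bar (Hz)
fs   = 1.0          # sample rate (Hz)
Ts   = 1.0/fs       # sampling interval

def h(x, n):
    # h_n(x_n) = a sin(2 pi Ts fbar n + x(1))
    return a*math.sin(2*math.pi*Ts*fbar*n + x[0])

def H(x, n):
    # H_n = [a cos(2 pi Ts fbar n + x(1)), 0]
    return [a*math.cos(2*math.pi*Ts*fbar*n + x[0]), 0.0]

def f(x, u=0.0):
    # Linear state update: [theta; omega] <- [[1, Ts],[0, 1]] x + [0; 1] u
    return [x[0] + Ts*x[1], x[1] + u]

def freq(x):
    # f(n) = fbar + omega(n)/(2 pi)
    return fbar + x[1]/(2*math.pi)

# Sanity check: the analytic Jacobian H_n matches a finite difference
n, x = 7, [0.3, 0.02]
eps = 1e-6
fd = (h([x[0]+eps, x[1]], n) - h([x[0]-eps, x[1]], n)) / (2*eps)
print(H(x, n)[0], fd)    # these agree
```

Since the state update is exactly linear, only the output equation contributes linearization error in this example, and H_n must be re-evaluated at the predicted state every sample.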

Example 1: Chirp Frequency Tracking

[Figures, slides 29-34: spectrograms of a chirp signal with the EKF instantaneous-frequency estimate overlaid, one panel per process-noise setting. The legible panel titles are q=5.e-7, q=5.e-6, and q=5.e-5; the remaining titles are garbled in this transcription. Frequency axis 0-0.5, time axis 0-45.]

Example : MATLAB Code function [fi,fv,yh] = KalmanFrequencyTracker(y,fsa,fmna,nva,aa,pfa); KalmanFilterFrequencyTracker: Tracks instantaneous frequency [fi,yh] = KalmanFrequencyTracker(y,fs,fmn,nv,a,pf); y Input signal. fs Sample rate (Hz). Default = Hz. fmn A priori mean frequency nv Noise variances of [fi y x]. Default = [..]. a Amplitude of the frequency component. Default = see manuscript. pf Plot flag: =none (default), =screen, 2=current figure. fi Instantaneous frequency (fi) estimates. yh Smoothed estimate of the input signal. Uses the following model a harmonic estimate of the signal with time-varying amplitude and instantaneous frequency. Example: Generate the parametric spectrogram of an intracranial pressure signal using a Blackman-Harris window that is 45 s in duration. load Tremor.mat; N = length(x); fss = 75; New sample rate (Hz) Ns = ceil(n*fss/fs); Number of samples sti = floor((fss/fs)*(si-))+; Indices of spikes ks = :Ns; Sample index ts = (ks-.5)/fss; Time index xs = zeros(ns,); Allocate memory for spike train xs(sti) = ; Create spike train KalmanFrequencyTracker(xs,fss); S. Kim, J. McNames, "Tracking tremor frequency in spike trains using the extended Kalman filter," Proceedings of the 27th Annual International Conference of the Engineering in Medicine and Biology Society, September 25. Version. JM See also Lowpass, ParametricSpectrogram, and KernelFilter. Error Checking if nargin<2, help KalmanFrequencyTracker; return; Process Function Arguments fs = ; if exist( fsa, var ) & ~isempty(fsa), fs = fsa; fmean = 2; if exist( fmna, var ) & ~isempty(fmna), fmean = fmna; a =.*std(y); Amplitude of the frequency component if exist( aa, var ) & ~isempty(aa), a = aa; nv = [.*var(y) var(y).]; Noise Variances if exist( nva, var ) & ~isempty(nva), nv = nva; J. McNames Portland State University ECE 539/639 Extended Kalman Filter Ver..2 37 J. 
McNames Portland State University ECE 539/639 Extended Kalman Filter Ver..2 38 pf = ; if nargout==, pf = ; if exist( pfa ) & ~isempty(pfa), pf = pfa; Default - no plotting Plot if no output arguments Fs = cell(ny,); Res = cell(ny,); es = cell(ny,); xhps = cell(ny,); Pps = cell(ny+,); Kalman Filter Parameter Definitions nx = 2; Dimension of the state vector Preprocessing my = mean(y); Save the mean value of y y = y(:) - mean(y); Make sure y is a column vector ny = length(y); No. samples Ts = /fs; Sampling interval (sec) wmean = fmean/(2*pi); Mean frequency Q = Ts*diag([;nv()]); Process noise covariance matrix for discrete-time process model R = nv(2); Measurement noise covariance matrix for discrete-time process model P = nv(3)*eye(nx,nx); Initial predicted error covariance matrix Memory Allocation fi = zeros(ny,); fv = zeros(ny,); yh = zeros(ny,); Hs = cell(ny,); -------------------------------------------------------------------- Variable Initialization -------------------------------------------------------------------- xhp = zeros(nx,); Initial value of predicted state at time i= Pp = P; Predicted state error covariance Pps{} = Pp; xhps{} = xhp; -------------------------------------------------------------------- Kalman Filter Recursions -------------------------------------------------------------------- for n=:ny, H = [a*cos(wmean*n*ts+xhp()),]; Re = R + H*Pp*H ; Innovation covariance matrix Kf = Pp*H *inv(re); Filtered estimate Kalman gain yhp = a*sin(wmean*n*ts+xhp()); Predicted estimate of the output e = y(n) - yhp; Innovation xhm = xhp + Kf*e; Measurement update estimate of the state F = [ Ts; ]; Linearized state transition matrix xhp = [xhm()+ts*xhm(2);xhm(2)]; Predicted state estimates Pm = Pp - Kf*Re*Kf ; Measurement state error covariance Pp = F*Pm*F + Q; Predicted state error covariance ---------------------------------------------------------------- Store Variables for Smoothing ---------------------------------------------------------------- J. 
McNames Portland State University ECE 539/639 Extended Kalman Filter Ver..2 39 J. McNames Portland State University ECE 539/639 Extended Kalman Filter Ver..2 4

Hs{n} = H; Fs{n} = F; Res{n} = Re; es{n} = e; xhps{n+} = xhp; Pps{n+} = Pp; Process Return Arguments if nargout==, clear( fi, yh ); ---------------------------------------------------------------- Store Variables of Interest ---------------------------------------------------------------- fi(n) = (xhm(2)+wmean)/(2*pi); Instantaneous frequency estimate fv(n) = Pm(2,2)/((2*pi).^2); Estimated error variance of estimate Post-processing yh = yh + my; Plotting if pf>=, fmean = wmean/(2*pi); ds = floor(fs/(4*fmean)); Tx = ny/fs; NonparametricSpectrogram(decimate(y,ds),fs/ds,/fmean,[],[],[],pf); hold on; k = :ny; t = (k-.5)/fs; h = plot(t,fi, k,t,fi, w ); set(h(), LineWidth,2.5); set(h(2), LineWidth,.5); hold off; J. McNames Portland State University ECE 539/639 Extended Kalman Filter Ver..2 4 J. McNames Portland State University ECE 539/639 Extended Kalman Filter Ver..2 42