Data assimilation with and without a model
1 Data assimilation with and without a model. Tim Sauer, George Mason University. Parameter Estimation and UQ, U. Pittsburgh, Mar. 5, 2017. Partially supported by NSF.
2 Most of this work is due to: Tyrus Berry, Postdoc, GMU; Franz Hamilton, Postdoc, NC State.
3 OUTLINE OF TALK
1. EnKF and applications: parameters, network edge detection
2. Problem: unknown noise covariances. Remedy: adaptive filtering
3. Problem: model error. Remedy: multimodel DA
4. Problem: model unknown. Remedy: Kalman-Takens filter
4 DATA ASSIMILATION
x_k = f(x_{k-1}) + η_k,  η_k ~ N(0, Q)
y_k = h(x_k) + ν_k,  ν_k ~ N(0, R)
Main Problem. Given the model above plus observations y_k:
Filtering: estimate the current state p(x_k | y_1, ..., y_k)
Forecasting: estimate a future state p(x_{k+l} | y_1, ..., y_k)
Smoothing: estimate a past state p(x_{k-l} | y_1, ..., y_k)
Parameter estimation
We'll apply the ensemble Kalman filter (EnKF) to attempt to achieve these goals.
5 DATA ASSIMILATION
x_k = f(x_{k-1}) + η_k,  η_k ~ N(0, Q)
y_k = h(x_k) + ν_k,  ν_k ~ N(0, R)
Possible obstructions:
- The observations y_k mix system noise η_k with observation noise ν_k
- Model error
- Q and R may be unknown
- Known model with unknown parameters
- Wrong model, even with best-fit parameters
- Have a model for some, but not all, of the variables
6 EXAMPLE 1. LORENZ 96
dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F
[Figure: time series of the Lorenz-96 variables]
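As a concrete reference, the Lorenz-96 system above can be integrated with a standard fourth-order Runge-Kutta step. A minimal sketch; the step size dt = 0.05, forcing F = 8, and 40-variable dimension follow the setup used later in the talk:

```python
import numpy as np

def lorenz96_rhs(x, F=8.0):
    """Right-hand side dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F (cyclic indices)."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt, F=8.0):
    """One fourth-order Runge-Kutta step."""
    k1 = lorenz96_rhs(x, F)
    k2 = lorenz96_rhs(x + 0.5 * dt * k1, F)
    k3 = lorenz96_rhs(x + 0.5 * dt * k2, F)
    k4 = lorenz96_rhs(x + dt * k3, F)
    return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

x = 8.0 * np.ones(40)
x[0] += 0.01            # small perturbation to leave the unstable fixed point
for _ in range(1000):   # spin up past transients onto the attractor
    x = rk4_step(x, 0.05)
```

Note that x = F (constant in every component) is a fixed point of the dynamics, which is why the perturbation is needed to reach chaotic behavior.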
7 EXAMPLE 2. MEA RECORDINGS
8 TWO STEPS TO FIND p(x_k | y_1, ..., y_k)
Assume we have p(x_{k-1} | y_1, ..., y_{k-1}).
Forecast step: find p(x_k | y_1, ..., y_{k-1})
Assimilation step: perform a Bayesian update,
p(x_k | y_1, ..., y_k) ∝ p(x_k | y_1, ..., y_{k-1}) p(y_k | x_k, y_1, ..., y_{k-1})
(posterior ∝ prior × likelihood)
9 BEST POSSIBLE SCENARIO
x_k = f(x_{k-1}) + η_k,  η_k ~ N(0, Q)
y_k = h(x_k) + ν_k,  ν_k ~ N(0, R)
f and h are linear, all parameters known:
x_k = F_{k-1} x_{k-1} + η_k,  η_k ~ N(0, Q)
y_k = H_k x_k + ν_k,  ν_k ~ N(0, R)
10 KALMAN FILTER
Assume linear dynamics/observations and additive Gaussian noise:
x_k = F_{k-1} x_{k-1} + η_k,  η_k ~ N(0, Q)
y_k = H_k x_k + ν_k,  ν_k ~ N(0, R)
Observability condition for linear systems: the stacked matrix
O_l = [ H_k ; H_{k+1} F_k ; ... ; H_{k+l+1} F_{k+l} ... F_k ]
must be full rank for some l. Then the KF is guaranteed to work.
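For a time-invariant pair (F, H), the observability condition can be checked numerically by stacking H, HF, ..., HF^l and testing the rank. A sketch; the constant-velocity example below is illustrative, not from the talk:

```python
import numpy as np

def observability_matrix(F, H, l):
    """Stack H, HF, HF^2, ..., HF^l for a time-invariant linear system."""
    rows = [H @ np.linalg.matrix_power(F, j) for j in range(l + 1)]
    return np.vstack(rows)

# Observing only the position of a constant-velocity model: the pair (F, H)
# is still observable because velocity enters the observations after one step.
F = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # position-velocity dynamics
H = np.array([[1.0, 0.0]])   # observe position only
O = observability_matrix(F, H, l=1)
print(np.linalg.matrix_rank(O))  # prints 2: full rank, so the KF can recover the state
```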
11 KALMAN FILTER
Assume linear dynamics/observations and additive Gaussian noise:
x_k = F_{k-1} x_{k-1} + η_k,  η_k ~ N(0, Q)
y_k = H_k x_k + ν_k,  ν_k ~ N(0, R)
Assume the current estimate is Gaussian: p(x_{k-1} | y_1, ..., y_{k-1}) = N(x^a_{k-1}, P^a_{k-1})
Forecast: linear combinations of Gaussians.
Prior: p(x_k | y_1, ..., y_{k-1}) = N(x^f_k, P^f_k), where
x^f_k = F_{k-1} x^a_{k-1}
P^f_k = F_{k-1} P^a_{k-1} F^T_{k-1} + Q
Likelihood: p(y_k | x_k, y_1, ..., y_{k-1}) = N(y^f_k, P^y_k), where
y^f_k = H_k x^f_k
P^y_k = H_k P^f_k H^T_k + R
12 KALMAN FILTER
Prior: p(x_k | y_1, ..., y_{k-1}) = N(x^f_k, P^f_k), with x^f_k = F_{k-1} x^a_{k-1} and P^f_k = F_{k-1} P^a_{k-1} F^T_{k-1} + Q
Likelihood: p(y_k | x_k, y_1, ..., y_{k-1}) = N(y^f_k, P^y_k), with y^f_k = H_k x^f_k and P^y_k = H_k P^f_k H^T_k + R
Assimilation: product of Gaussians (complete the square),
p(x_k | y_1, ..., y_k) ∝ N(x^f_k, P^f_k) N(y^f_k, P^y_k) = N(x^a_k, P^a_k)
Define the Kalman gain K_k = P^f_k H^T_k (P^y_k)^{-1}; then
x^a_k = x^f_k + K_k (y_k - y^f_k)
P^a_k = (I - K_k H_k) P^f_k
13 KALMAN FILTER SUMMARY
Forecast: x^f_k = F_{k-1} x^a_{k-1}
Covariance update: P^f_k = F_{k-1} P^a_{k-1} F^T_{k-1} + Q_{k-1},  P^y_k = H_k P^f_k H^T_k + R_{k-1}
Kalman gain: K_k = P^f_k H^T_k (P^y_k)^{-1}
Innovation: ε_k = y_k - y^f_k = y_k - H_k x^f_k
Assimilation step: x^a_k = x^f_k + K_k ε_k,  P^a_k = (I - K_k H_k) P^f_k
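The summary above translates directly into code. A minimal sketch of one forecast/assimilation cycle, applied to a hypothetical scalar random walk; the matrices F, H, Q, R below are illustrative, not from the talk:

```python
import numpy as np

def kalman_step(xa, Pa, y, F, H, Q, R):
    """One forecast + assimilation cycle of the linear Kalman filter."""
    # Forecast
    xf = F @ xa
    Pf = F @ Pa @ F.T + Q
    # Innovation and its covariance
    eps = y - H @ xf
    Py = H @ Pf @ H.T + R
    # Kalman gain and assimilation
    K = Pf @ H.T @ np.linalg.inv(Py)
    xa_new = xf + K @ eps
    Pa_new = (np.eye(len(xa)) - K @ H) @ Pf
    return xa_new, Pa_new

# Track a scalar random walk from noisy direct observations.
rng = np.random.default_rng(0)
F = np.array([[1.0]]); H = np.array([[1.0]])
Q = np.array([[0.01]]); R = np.array([[0.25]])
x_true = np.zeros(1); xa = np.zeros(1); Pa = np.eye(1)
errs = []
for _ in range(500):
    x_true = F @ x_true + rng.normal(0, 0.1, 1)   # process noise variance 0.01
    y = H @ x_true + rng.normal(0, 0.5, 1)        # observation noise variance 0.25
    xa, Pa = kalman_step(xa, Pa, y, F, H, Q, R)
    errs.append((xa - x_true)[0] ** 2)
rmse = np.sqrt(np.mean(errs))   # well below the observation noise level of 0.5
```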
14 WHAT ABOUT NONLINEAR SYSTEMS?
Consider a system of the form
x_{k+1} = f(x_k) + η_{k+1},  η_{k+1} ~ N(0, Q)
y_{k+1} = h(x_{k+1}) + ν_{k+1},  ν_{k+1} ~ N(0, R)
More complicated observability condition (Lie derivatives).
Extended Kalman filter (EKF): linearize, F_k = Df(x̂^a_k) and H_k = Dh(x̂^f_k).
Problem: the state estimate x̂^a_k may not be well localized.
Solution: ensemble Kalman filter (EnKF).
15 ENSEMBLE KALMAN FILTER (ENKF)
[Figure: ensemble of sigma points propagated through F]
Generate an ensemble {x^i_t} with the current statistics (use a matrix square root; the x^i_t are sigma points on the semimajor axes), then propagate each member:
x^f_t = (1/2n) Σ_i F(x^i_t)
P^f_xx = 1/(2n-1) Σ_i (F(x^i_t) - x^f_t)(F(x^i_t) - x^f_t)^T + Q
16 ENSEMBLE KALMAN FILTER (ENKF)
Calculate y^i_t = h(F(x^i_t)) and set y^f_t = (1/2n) Σ_i y^i_t.
P_yy = (2n-1)^{-1} Σ_i (y^i_t - y^f_t)(y^i_t - y^f_t)^T + R
P_xy = (2n-1)^{-1} Σ_i (F(x^i_t) - x^f_t)(y^i_t - y^f_t)^T
K = P_xy P_yy^{-1}
x^a_{t+1} = x^f_t + K(y_t - y^f_t)  and  P^a_xx = P^f_xx - K P_yy K^T
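A minimal perturbed-observation variant of the EnKF update above. This is a sketch: the talk's sigma-point/matrix-square-root ensemble is generated differently, but the sample covariances and gain are computed in the same spirit, and the demo system at the bottom is hypothetical:

```python
import numpy as np

def enkf_step(ensemble, y, f, h, Q, R, rng):
    """One cycle of a perturbed-observation ensemble Kalman filter.

    ensemble: (N, n) array of analysis states from the previous step.
    """
    N = ensemble.shape[0]
    # Forecast each member and add model noise
    Xf = np.array([f(x) for x in ensemble])
    Xf += rng.multivariate_normal(np.zeros(Q.shape[0]), Q, size=N)
    Yf = np.array([h(x) for x in Xf])
    xf_mean, yf_mean = Xf.mean(axis=0), Yf.mean(axis=0)
    Xp, Yp = Xf - xf_mean, Yf - yf_mean
    # Sample covariances and Kalman gain
    Pxy = Xp.T @ Yp / (N - 1)
    Pyy = Yp.T @ Yp / (N - 1) + R
    K = Pxy @ np.linalg.inv(Pyy)
    # Assimilate perturbed observations
    y_pert = y + rng.multivariate_normal(np.zeros(R.shape[0]), R, size=N)
    return Xf + (y_pert - Yf) @ K.T

# Demo on a hypothetical scalar AR(1) system with identity observation.
rng = np.random.default_rng(1)
f = lambda x: 0.9 * x
h = lambda x: x
Q, R = np.array([[0.01]]), np.array([[0.1]])
ens = rng.normal(5.0, 1.0, size=(50, 1))   # badly initialized ensemble
x_true = np.array([0.0])
for _ in range(30):
    x_true = f(x_true) + rng.normal(0, 0.1, 1)
    y = h(x_true) + rng.normal(0, np.sqrt(0.1), 1)
    ens = enkf_step(ens, y, f, h, Q, R, rng)
```

After a few dozen cycles the ensemble mean tracks the true state despite the poor initialization.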
17 PARAMETER ESTIMATION
When the model has parameters p: x_{k+1} = f(x_k, p) + η_{k+1}
Can augment the state: x̃_k = [x_k, p_k]
Introduce trivial dynamics for p:
x_{k+1} = f(x_k, p_k) + η_{k+1}
p_{k+1} = p_k + η^p_{k+1}
Need to tune the covariance of η^p_{k+1}.
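The augmentation trick can be sketched as a wrapper around the model; the scalar example below is hypothetical:

```python
import numpy as np

def augment(f, n_params, param_noise_std):
    """Wrap a parameterized model f(x, p) as dynamics on the augmented
    state z = [x, p], with a random-walk ("trivial") evolution for p."""
    rng = np.random.default_rng()
    def f_aug(z):
        x, p = z[:-n_params], z[-n_params:]
        x_next = f(x, p)
        p_next = p + param_noise_std * rng.standard_normal(n_params)
        return np.concatenate([x_next, p_next])
    return f_aug

# Hypothetical scalar model x_{k+1} = a * x_k with unknown parameter a.
f = lambda x, p: p * x
f_aug = augment(f, n_params=1, param_noise_std=0.0)
z = np.array([2.0, 0.5])   # state 2.0, parameter guess 0.5
z = f_aug(z)               # z is now [1.0, 0.5]: state evolved, parameter held
```

In a filter, param_noise_std is the tuning knob the slide mentions: too small and the parameter estimate freezes, too large and it wanders.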
18 EXAMPLE OF PARAMETER ESTIMATION
Consider the Hodgkin-Huxley neuron model, expanded to a network of n equations:
V̇_i = -g_Na m_i^3 h_i (V_i - E_Na) - g_K n_i^4 (V_i - E_K) - g_L (V_i - E_L) + I + Σ_{j≠i} Γ_HH(V_j) V_j
ṁ_i = a_m(V_i)(1 - m_i) - b_m(V_i) m_i
ḣ_i = a_h(V_i)(1 - h_i) - b_h(V_i) h_i
ṅ_i = a_n(V_i)(1 - n_i) - b_n(V_i) n_i
Γ_HH(V_j) = β_ij / (1 + e^{-10(V_j + 40)})
Only observe the voltages V_i; recover the hidden variables and the connection parameters β_ij.
19 EXAMPLE OF PARAMETER ESTIMATION
Can even track connections as they turn on and off (grey dashes); the variance estimate provides a statistical test (black dashes).
20 LINK DETECTION FROM NETWORKS OF MODEL NEURONS
Network of Hindmarsh-Rose neurons, modeled by Hindmarsh-Rose.
Network of Hodgkin-Huxley neurons, modeled by Hindmarsh-Rose.
Sensitivity = true positive rate; specificity = true negative rate.
21 LINK DETECTION FROM MEA RECORDINGS
23 ENKF: WRONG Q AND R
Simple example with full observation and diagonal noise covariances.
[Figure: RMSE vs. the variance used in R_k and in Q_k. Red indicates the RMSE of the unfiltered observations; black is the RMSE of the optimal filter (true covariances known).]
24 ENKF: WRONG Q AND R
Standard Kalman update:
P^f_k = F_{k-1} P^a_{k-1} F^T_{k-1} + Q_{k-1}
P^y_k = H_k P^f_k H^T_k + R_{k-1}
K_k = P^f_k H^T_k (P^y_k)^{-1}
P^a_k = (I - K_k H_k) P^f_k
ε_k = y_k - y^f_k = y_k - H_k x^f_k
x^a_k = x^f_k + K_k ε_k
[Figure: RMSE vs. the variance used in R_k and in Q_k]
25 ADAPTIVE FILTER: ESTIMATING Q AND R
Innovations contain information about Q and R:
ε_k = y_k - y^f_k = h(x_k) + ν_k - h(x^f_k)
    = h(f(x_{k-1}) + η_k) - h(f(x^a_{k-1})) + ν_k
    ≈ H_k F_{k-1} (x_{k-1} - x^a_{k-1}) + H_k η_k + ν_k
IDEA: use the innovations to produce samples of Q and R:
E[ε_k ε_k^T] ≈ H P^f H^T + R
E[ε_{k+1} ε_k^T] ≈ H F P^e H^T - H F K E[ε_k ε_k^T]
P^e ≈ F P^a F^T + Q
In the linear case this is rigorous and was first solved by Mehra in 1970.
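The first innovation relation, and the whiteness of innovations under the optimal gain, can be checked numerically for a scalar linear system. An illustrative sketch, not the talk's implementation; the system parameters are chosen arbitrarily:

```python
import numpy as np

# Scalar system x_k = f x_{k-1} + eta, y_k = x_k + nu (H = 1). Run a
# steady-state Kalman filter and check the innovation statistics:
#   E[eps_k^2]            ~= Pf + R   (innovation variance)
#   E[eps_{k+1} * eps_k]  ~= 0        (innovations are white at the optimal gain)
rng = np.random.default_rng(0)
f, Q, R = 0.9, 0.04, 0.25

# Steady-state forecast variance: fixed point of Pf = f^2 * Pf*R/(Pf+R) + Q
Pf = Q
for _ in range(200):
    Pf = f**2 * (Pf * R / (Pf + R)) + Q
K = Pf / (Pf + R)      # optimal steady-state Kalman gain

x, xa = 0.0, 0.0
eps = []
for _ in range(200000):
    x = f * x + rng.normal(0, np.sqrt(Q))
    y = x + rng.normal(0, np.sqrt(R))
    xf = f * xa
    e = y - xf
    eps.append(e)
    xa = xf + K * e
eps = np.array(eps)
var_eps = np.mean(eps**2)            # should be close to Pf + R
lag1 = np.mean(eps[1:] * eps[:-1])   # should be close to 0
```

With a suboptimal gain, lag1 would be nonzero, which is exactly the information the adaptive filter exploits to correct Q and R.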
26 ADAPTIVE FILTER: ESTIMATING Q AND R
To find Q and R we estimate H_k and F_{k-1} from the ensemble and invert the equations
E[ε_k ε_k^T] ≈ H P^f H^T + R
E[ε_{k+1} ε_k^T] ≈ H F P^e H^T - H F K E[ε_k ε_k^T]
This gives the following empirical estimates of Q_k and R_k:
P^e_k = (H_{k+1} F_k)^{-1} (ε_{k+1} ε_k^T + H_{k+1} F_k K_k ε_k ε_k^T) H_k^{-T}
Q^e_k = P^e_k - F_{k-1} P^a_{k-1} F^T_{k-1}
R^e_k = ε_k ε_k^T - H_k P^f_k H^T_k
Here P^e_k is an empirical estimate of the background covariance.
27 ADAPTIVE ENKF
We combine the estimates of Q and R with a moving average.
Original Kalman equations:
P^f_k = F_{k-1} P^a_{k-1} F^T_{k-1} + Q_{k-1}
P^y_k = H_k P^f_k H^T_k + R_{k-1}
K_k = P^f_k H^T_k (P^y_k)^{-1}
P^a_k = (I - K_k H_k) P^f_k
ε_k = y_k - y^f_k
x^a_k = x^f_k + K_k ε_k
Our additional update:
P^e_{k-1} = F^{-1}_{k-1} H^{-1}_k ε_k ε^T_{k-1} H^{-T}_{k-1} + K_{k-1} ε_{k-1} ε^T_{k-1} H^{-T}_{k-1}
Q^e_{k-1} = P^e_{k-1} - F_{k-2} P^a_{k-2} F^T_{k-2}
R^e_{k-1} = ε_{k-1} ε^T_{k-1} - H_{k-1} P^f_{k-1} H^T_{k-1}
Q_k = Q_{k-1} + (Q^e_{k-1} - Q_{k-1})/τ
R_k = R_{k-1} + (R^e_{k-1} - R_{k-1})/τ
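The moving-average update for Q_k and R_k is a stochastic approximation: noisy one-step empirical estimates are blended into the running value with window length τ. A scalar sketch; the noise model for the empirical samples below is illustrative, standing in for the Q^e_k produced by the filter:

```python
import numpy as np

def ma_update(current, empirical, tau):
    """Exponential moving average: blend a noisy one-step estimate into the
    running covariance estimate; larger tau means a longer effective window."""
    return current + (empirical - current) / tau

rng = np.random.default_rng(0)
true_Q = 2.0
Q = 0.1                                  # poor initial guess
for _ in range(5000):
    Qe = true_Q + rng.normal(0, 0.5)     # noisy empirical sample (stand-in for Q^e_k)
    Q = ma_update(Q, Qe, tau=100.0)
# Q has now converged near true_Q, with residual noise suppressed by tau
```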
28 ADAPTIVE FILTER: APPLICATION TO LORENZ-96
We apply the adaptive EnKF to the 40-dimensional Lorenz-96 model integrated over a time step Δt = 0.05:
dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F
We augment the model with Gaussian white noise:
x_k = f(x_{k-1}) + η_k,  η_k ~ N(0, Q)
y_k = h(x_k) + ν_k,  ν_k ~ N(0, R)
The adaptive EnKF uses F = 8. We will consider model error where the true F_i ~ N(8, 16).
29 RECOVERING Q AND R, PERFECT MODEL
[Figure: true covariance, initial guess, final estimate, and difference for Q; RMSE of Q_k vs. filter step; RMSE (bottom right) for the initial-guess covariances (red), the true Q and R (black), and the adaptive filter (blue).]
30 COMPENSATING FOR MODEL ERROR
The adaptive filter compensates for errors in the forcing F_i.
[Figure: true covariance, initial guess, final estimate, and difference for Q and R; relative change of (Q^f)_ii by site number and model error in F_i; RMSE (bottom right) for the initial-guess covariances (red), the perfect model (black), and the adaptive filter (blue).]
31 ESTIMATING UNMODELED DYNAMICS
Multiple-models approach: run m subfilters in parallel,
ẇ = F(w) + η_t
[y, ..., y, S]^T = H(w) + ν_t
where w = [x^1, ..., x^m, c^1, ..., c^m, d],
F(w) = [f(x^1, p^1), ..., f(x^m, p^m), 0, ..., 0],
H(w) = [h(x^1), ..., h(x^m), Σ_{i,j} c^i_j x^i_j + d].
Learn the coefficients c^i_j and d from training data S; then the unmodeled quantity S can be predicted.
32 SELECTING THE ASSIMILATION MODEL
Assume a generic spiking neuron model (Hindmarsh-Rose):
V̇ = y - aV^3 + bV^2 - z + I
ẏ = c - dV^2 - y
ż = τ(s(V + 1) - z)
V is the neuron potential, y the fast-scale dynamics, z the slow-scale dynamics.
Goal: use multiple parameterized Hindmarsh-Rose models to reconstruct unmodeled neuronal quantities while assimilating the recorded neuron potential.
J. Hindmarsh and R. Rose, Proc. Roy. Soc. 221, 87 (1984).
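The Hindmarsh-Rose system above can be simulated directly. A sketch using commonly quoted parameter values (a = 1, b = 3, c = 1, d = 5, s = 4 and a small τ); the talk does not specify its parameter choices, so these are assumptions:

```python
import numpy as np

def hindmarsh_rose(state, a=1.0, b=3.0, c=1.0, d=5.0, s=4.0, tau=0.006, I=2.0):
    """Right-hand side of the Hindmarsh-Rose model as written on the slide."""
    V, y, z = state
    dV = y - a * V**3 + b * V**2 - z + I
    dy = c - d * V**2 - y
    dz = tau * (s * (V + 1.0) - z)
    return np.array([dV, dy, dz])

def rk4(state, dt, rhs):
    """One fourth-order Runge-Kutta step."""
    k1 = rhs(state)
    k2 = rhs(state + 0.5 * dt * k1)
    k3 = rhs(state + 0.5 * dt * k2)
    k4 = rhs(state + dt * k3)
    return state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([-1.0, 0.0, 2.0])
for _ in range(5000):          # integrate to t = 50; trajectory stays bounded
    state = rk4(state, 0.01, hindmarsh_rose)
```

The small τ makes z a slow variable, which is what produces the bursting behavior that makes this a useful generic assimilation model.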
33 RECONSTRUCTING UNMODELED GATING VARIABLES
Observing the Hodgkin-Huxley voltage, reconstruct the unmodeled gating variables (assimilation model is Hindmarsh-Rose).
A. Hodgkin and A. Huxley, J. Physiology 117, 500 (1952).
[Figure: actual, observed, and predicted traces]
36 RECONSTRUCTING UNMODELED IONIC DYNAMICS
Observing the seizure voltage, reconstruct the unmodeled potassium and sodium dynamics (assimilation model is Hindmarsh-Rose).
J. Cressman, G. Ullah, J. Ziburkus, S. Schiff, and E. Barreto, Journal of Comp. Neuroscience 26, 159 (2009).
[Figure: actual, observed, and predicted traces]
39 RECONSTRUCTING EXTRACELLULAR POTASSIUM FROM AN In Vitro NETWORK
We want to track extracellular potassium dynamics in a network, but measurements are difficult and spatially limited. Extracellular potassium is an unmodeled variable.
40 RECONSTRUCTING EXTRACELLULAR POTASSIUM FROM AN In Vitro NETWORK
The local extracellular potassium in an MEA network can be reconstructed and predicted using our approach (assimilation model is Hindmarsh-Rose).
[Figure: actual, observed, and predicted traces]
43 KALMAN-TAKENS FILTER: THROWING OUT THE MODEL...
Starting with historical observations {y_0, ..., y_n}:
- Form Takens delay-embedding state vectors x_i = (y_i, y_{i-1}, ..., y_{i-d})
- Build an EnKF: apply an analog forecast to each ensemble member
- Use the observation function H x_i = y_i
- Crucial to estimate Q and R
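The delay-embedding construction and the analog forecast step can be sketched as follows. The k-nearest-neighbor average is one simple analog forecast; the talk's nonparametric predictor may differ, and the sine series used for the check is purely illustrative:

```python
import numpy as np

def delay_embed(y, d):
    """Takens delay-coordinate vectors x_i = (y_i, y_{i-1}, ..., y_{i-d})."""
    n = len(y)
    return np.array([y[i - d:i + 1][::-1] for i in range(d, n)])

def analog_forecast(x, library, next_vals, k=5):
    """Nonparametric one-step forecast: average the successors of the k
    nearest neighbors of x in the historical library of delay vectors."""
    dists = np.linalg.norm(library - x, axis=1)
    idx = np.argsort(dists)[:k]
    return next_vals[idx].mean()

# Toy check on a noiseless sine series: the forecast should continue the wave.
t = np.arange(0, 2000)
y = np.sin(0.1 * t)
X = delay_embed(y, d=4)              # delay vectors x_4, ..., x_1999
library, next_vals = X[:-1], y[5:]   # the successor of x_i is y_{i+1}
pred = analog_forecast(X[-1], library, next_vals, k=3)
```

In the Kalman-Takens filter this analog forecast replaces the model f in the EnKF forecast step, with the Q and R estimation of the adaptive filter handling the resulting forecast uncertainty.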
44 NO MODEL: KALMAN-TAKENS FILTER
Reasons why this might not work:
- Nonparametric methods that depend on local reconstructions tend to fail in the presence of observation noise: Takens' theorem assumes no observation noise, so neighbors in reconstruction space are wrong.
- Kalman filters depend on knowledge of the model to advance the dynamics: the filter equations need the dynamics to advance the background.
45 ENSEMBLE KALMAN FILTERING (ENKF)
At the kth step:
- Form an ensemble of state vectors centered at the best guess x^+_{k-1}, with spread given by the estimated covariance P^+_{k-1}
- Apply the model f to this ensemble and observe with the function h. Form the prior state estimate x^-_k and the predicted observation y^-_k
- Construct the resulting covariance matrices P^xy_k and P^y_k
- Update the state and covariance estimates with the observation y_k:
K_k = P^xy_k (P^y_k)^{-1}
P^+_k = P^-_k - P^xy_k (P^y_k)^{-1} P^yx_k
x^+_k = x^-_k + K_k (y_k - y^-_k)
47 KALMAN-TAKENS FILTER
At the kth step:
- Form an ensemble of reconstructed states centered at the delay-coordinate vector x^+_{k-1}, with spread given by the estimated covariance P^+_{k-1}
- Apply a nonparametric prediction and observe with the function h. Estimate the prior state x^-_k and observation y^-_k
- Construct the resulting covariance matrices P^xy_k and P^y_k
- Update the state and covariance estimates with the observation y_k:
K_k = P^xy_k (P^y_k)^{-1}
P^+_k = P^-_k - P^xy_k (P^y_k)^{-1} P^yx_k
x^+_k = x^-_k + K_k (y_k - y^-_k)
48 KALMAN-TAKENS FILTER: LORENZ
[Figure: x time series, full model vs. Kalman-Takens]
49 KALMAN-TAKENS FILTER
Comparison of Kalman-Takens (red) with the full model (blue).
[Figure: RMSE vs. percent noise]
50 KALMAN-TAKENS FILTER: LORENZ 96
[Figure: x_1 time series, full model vs. Kalman-Takens]
51 KALMAN-TAKENS FILTER: MODEL ERROR
[Figures: Lorenz 63 and Lorenz 96]
52 KALMAN-TAKENS FILTER: LORENZ
[Figure: time series]
53 KALMAN-TAKENS FILTER: FORECAST ERROR
El Niño index. Training data: Jan through Dec; forecasts after.
[Figure: RMSE vs. forecast steps (months)]
54 SUMMARY
EnKF is a useful data assimilation technique for neurodynamics and other types of data:
- Parameter estimation
- Adaptive estimation of Q and R is helpful when they are unknown
Difficulties: model error, unmodeled variables, no model.
Remedies: multimodel data assimilation, Kalman-Takens filter.
55 REFERENCES
T. Berry, T. Sauer, Adaptive ensemble Kalman filtering of nonlinear systems. Tellus A 65 (2013).
F. Hamilton, T. Berry, N. Peixoto, T. Sauer, Real-time tracking of neuronal network structure using data assimilation. Phys. Rev. E 88 (2013).
F. Hamilton, J. Cressman, N. Peixoto, T. Sauer, Reconstructing neural dynamics using data assimilation with multiple models. Europhys. Lett. 107 (2014).
F. Hamilton, T. Berry, T. Sauer, Ensemble Kalman filtering without a model. Phys. Rev. X 6 (2016).
F. Hamilton, T. Berry, T. Sauer, Kalman-Takens filtering in the presence of dynamical noise. To appear, Eur. Phys. J.
More informationSmoothers: Types and Benchmarks
Smoothers: Types and Benchmarks Patrick N. Raanes Oxford University, NERSC 8th International EnKF Workshop May 27, 2013 Chris Farmer, Irene Moroz Laurent Bertino NERSC Geir Evensen Abstract Talk builds
More informationEM-algorithm for Training of State-space Models with Application to Time Series Prediction
EM-algorithm for Training of State-space Models with Application to Time Series Prediction Elia Liitiäinen, Nima Reyhani and Amaury Lendasse Helsinki University of Technology - Neural Networks Research
More informationBayesian Statistics and Data Assimilation. Jonathan Stroud. Department of Statistics The George Washington University
Bayesian Statistics and Data Assimilation Jonathan Stroud Department of Statistics The George Washington University 1 Outline Motivation Bayesian Statistics Parameter Estimation in Data Assimilation Combined
More informationEnKF and Catastrophic filter divergence
EnKF and Catastrophic filter divergence David Kelly Kody Law Andrew Stuart Mathematics Department University of North Carolina Chapel Hill NC bkelly.com June 4, 2014 SIAM UQ 2014 Savannah. David Kelly
More informationSingle neuron models. L. Pezard Aix-Marseille University
Single neuron models L. Pezard Aix-Marseille University Biophysics Biological neuron Biophysics Ionic currents Passive properties Active properties Typology of models Compartmental models Differential
More information4 Derivations of the Discrete-Time Kalman Filter
Technion Israel Institute of Technology, Department of Electrical Engineering Estimation and Identification in Dynamical Systems (048825) Lecture Notes, Fall 2009, Prof N Shimkin 4 Derivations of the Discrete-Time
More informationCorrecting Observation Model Error. in Data Assimilation
1 Correcting Observation Model Error 2 in Data Assimilation 3 Franz Hamilton, Tyrus Berry and Timothy Sauer 4 George Mason University, Fairfax, VA 223, USA 5 6 7 8 9 North Carolina State University, Department
More informationImplicit sampling for particle filters. Alexandre Chorin, Mathias Morzfeld, Xuemin Tu, Ethan Atkins
0/20 Implicit sampling for particle filters Alexandre Chorin, Mathias Morzfeld, Xuemin Tu, Ethan Atkins University of California at Berkeley 2/20 Example: Try to find people in a boat in the middle of
More informationStability of Ensemble Kalman Filters
Stability of Ensemble Kalman Filters Idrissa S. Amour, Zubeda Mussa, Alexander Bibov, Antti Solonen, John Bardsley, Heikki Haario and Tuomo Kauranne Lappeenranta University of Technology University of
More informationPractical Aspects of Ensemble-based Kalman Filters
Practical Aspects of Ensemble-based Kalman Filters Lars Nerger Alfred Wegener Institute for Polar and Marine Research Bremerhaven, Germany and Bremen Supercomputing Competence Center BremHLR Bremen, Germany
More informationTSRT14: Sensor Fusion Lecture 8
TSRT14: Sensor Fusion Lecture 8 Particle filter theory Marginalized particle filter Gustaf Hendeby gustaf.hendeby@liu.se TSRT14 Lecture 8 Gustaf Hendeby Spring 2018 1 / 25 Le 8: particle filter theory,
More informationConvergence of the Ensemble Kalman Filter in Hilbert Space
Convergence of the Ensemble Kalman Filter in Hilbert Space Jan Mandel Center for Computational Mathematics Department of Mathematical and Statistical Sciences University of Colorado Denver Parts based
More informationState-space Model. Eduardo Rossi University of Pavia. November Rossi State-space Model Fin. Econometrics / 53
State-space Model Eduardo Rossi University of Pavia November 2014 Rossi State-space Model Fin. Econometrics - 2014 1 / 53 Outline 1 Motivation 2 Introduction 3 The Kalman filter 4 Forecast errors 5 State
More informationEnsemble forecasting and flow-dependent estimates of initial uncertainty. Martin Leutbecher
Ensemble forecasting and flow-dependent estimates of initial uncertainty Martin Leutbecher acknowledgements: Roberto Buizza, Lars Isaksen Flow-dependent aspects of data assimilation, ECMWF 11 13 June 2007
More informationChimera states in networks of biological neurons and coupled damped pendulums
in neural models in networks of pendulum-like elements in networks of biological neurons and coupled damped pendulums J. Hizanidis 1, V. Kanas 2, A. Bezerianos 3, and T. Bountis 4 1 National Center for
More informationFrom neuronal oscillations to complexity
1/39 The Fourth International Workshop on Advanced Computation for Engineering Applications (ACEA 2008) MACIS 2 Al-Balqa Applied University, Salt, Jordan Corson Nathalie, Aziz Alaoui M.A. University of
More informationOverfitting, Bias / Variance Analysis
Overfitting, Bias / Variance Analysis Professor Ameet Talwalkar Professor Ameet Talwalkar CS260 Machine Learning Algorithms February 8, 207 / 40 Outline Administration 2 Review of last lecture 3 Basic
More informationGaussian Filtering Strategies for Nonlinear Systems
Gaussian Filtering Strategies for Nonlinear Systems Canonical Nonlinear Filtering Problem ~u m+1 = ~ f (~u m )+~ m+1 ~v m+1 = ~g(~u m+1 )+~ o m+1 I ~ f and ~g are nonlinear & deterministic I Noise/Errors
More informationState Estimation using Moving Horizon Estimation and Particle Filtering
State Estimation using Moving Horizon Estimation and Particle Filtering James B. Rawlings Department of Chemical and Biological Engineering UW Math Probability Seminar Spring 2009 Rawlings MHE & PF 1 /
More informationState Estimation Based on Nested Particle Filters
Cleveland State University EngagedScholarship@CSU ETD Archive 2013 State Estimation Based on Nested Particle Filters Swathi Srinivasan Cleveland State University How does access to this work benefit you?
More informationRobot Localization and Kalman Filters
Robot Localization and Kalman Filters Rudy Negenborn rudy@negenborn.net August 26, 2003 Outline Robot Localization Probabilistic Localization Kalman Filters Kalman Localization Kalman Localization with
More informationThe Kalman filter. Chapter 6
Chapter 6 The Kalman filter In the last chapter, we saw that in Data Assimilation, we ultimately desire knowledge of the full a posteriori p.d.f., that is the conditional p.d.f. of the state given the
More informationPrediction of ESTSP Competition Time Series by Unscented Kalman Filter and RTS Smoother
Prediction of ESTSP Competition Time Series by Unscented Kalman Filter and RTS Smoother Simo Särkkä, Aki Vehtari and Jouko Lampinen Helsinki University of Technology Department of Electrical and Communications
More informationKalman Filter and Ensemble Kalman Filter
Kalman Filter and Ensemble Kalman Filter 1 Motivation Ensemble forecasting : Provides flow-dependent estimate of uncertainty of the forecast. Data assimilation : requires information about uncertainty
More informationEE 565: Position, Navigation, and Timing
EE 565: Position, Navigation, and Timing Kalman Filtering Example Aly El-Osery Kevin Wedeward Electrical Engineering Department, New Mexico Tech Socorro, New Mexico, USA In Collaboration with Stephen Bruder
More informationECE 636: Systems identification
ECE 636: Systems identification Lectures 3 4 Random variables/signals (continued) Random/stochastic vectors Random signals and linear systems Random signals in the frequency domain υ ε x S z + y Experimental
More informationKalman Filter Computer Vision (Kris Kitani) Carnegie Mellon University
Kalman Filter 16-385 Computer Vision (Kris Kitani) Carnegie Mellon University Examples up to now have been discrete (binary) random variables Kalman filtering can be seen as a special case of a temporal
More informationPhysics 403. Segev BenZvi. Parameter Estimation, Correlations, and Error Bars. Department of Physics and Astronomy University of Rochester
Physics 403 Parameter Estimation, Correlations, and Error Bars Segev BenZvi Department of Physics and Astronomy University of Rochester Table of Contents 1 Review of Last Class Best Estimates and Reliability
More informationThe Kalman Filter. Data Assimilation & Inverse Problems from Weather Forecasting to Neuroscience. Sarah Dance
The Kalman Filter Data Assimilation & Inverse Problems from Weather Forecasting to Neuroscience Sarah Dance School of Mathematical and Physical Sciences, University of Reading s.l.dance@reading.ac.uk July
More informationState Estimation of Linear and Nonlinear Dynamic Systems
State Estimation of Linear and Nonlinear Dynamic Systems Part III: Nonlinear Systems: Extended Kalman Filter (EKF) and Unscented Kalman Filter (UKF) James B. Rawlings and Fernando V. Lima Department of
More informationChris Bishop s PRML Ch. 8: Graphical Models
Chris Bishop s PRML Ch. 8: Graphical Models January 24, 2008 Introduction Visualize the structure of a probabilistic model Design and motivate new models Insights into the model s properties, in particular
More informationLabor-Supply Shifts and Economic Fluctuations. Technical Appendix
Labor-Supply Shifts and Economic Fluctuations Technical Appendix Yongsung Chang Department of Economics University of Pennsylvania Frank Schorfheide Department of Economics University of Pennsylvania January
More information