The Stochastic Inverse Problem in Groundwater Hydrology: A Kalman Filter Approach


Franz Konecny
Institute of Mathematics and Applied Statistics, University of Agricultural Sciences, Vienna, Gregor-Mendel-Strasse 33, A-1180 Vienna, Austria
konecny@mail.boku.ac.at

Abstract

In this paper we are concerned with the identification of the hydraulic conductivity field on the basis of noisy head observations. An augmented Kalman smoother is applied to the stochastic flow equation. The implementation of the smoother requires the solution of a Riccati-type partial differential equation for the smoother covariance and a stochastic partial differential equation for the optimal estimate of the augmented state. We discuss finite-dimensional approximations of the smoothing problem and propose an EM algorithm for the estimation of the conductivity parameters.

1 Introduction

Groundwater models are used as tools for decision making in water resources management. The accuracy of their predictions depends on the reliability of the parameter estimates. In this paper we are mainly concerned with the identification of a hydraulic conductivity field from noisy head data. The Kalman filter (or Kalman-Bucy filter) is the optimal estimator of the state of a linear system at the current time, given noisy measurements. By augmenting the state space, we are able to estimate not only the hydraulic heads as the state of the flow equation, but also the conductivity field. In several papers (reviewed in Ginn and Cushman [4] and McLaughlin and Townley [6]), extended Kalman filtering was used for Bayesian estimation of the parameters in the context of the deterministic inverse problem.

Kalman smoothing refers to estimating the state of a linear system, given noisy measurements over a fixed time interval. In [8], Riedel was concerned with the evolution of a temperature field whose diffusivity was an unobservable, time-dependent random field. He pointed out that inverse problems are generally analyzed off-line and that the Kalman smoother is therefore more appropriate than the Kalman filter. In this paper we adopt this approach and combine it with the EM algorithm. The EM algorithm was introduced by Dempster et al. [3] as a general framework for handling incomplete-data problems; here we use it for the estimation of the unknown system parameters. We are unaware of any previous use of the EM algorithm in the context of distributed parameter systems.

2 The mathematical model

We consider groundwater flow through a porous, heterogeneous, isotropic medium. The equation governing the flow is

S ∂h/∂t = ∇·(K ∇h) + φ(t,x) + ξ(t,x),   x ∈ D,   (1)

where h is the hydraulic head, S the specific storage coefficient, K the hydraulic conductivity, φ the deterministic source term, ξ the stochastic forcing term and D the flow region. The equation may be adapted to 1D, 2D or 3D flow, subject to the initial and boundary conditions

h(0,x) = h_0(x),   x ∈ D,
h(t,x) = h_1(t,x),   x ∈ ∂D_1, 0 < t < t_f,   (2)
K ∂h/∂n(t,x) = h_2(t,x),   x ∈ ∂D_2, 0 < t < t_f,

where h_0, h_1 and h_2 are given functions and ∂D_1 ∪ ∂D_2 = ∂D. The log-conductivity Y(x) = log K(x) is assumed to be a homogeneous and isotropic Gaussian random field, which can be written as

Y(x) = μ_Y + Ỹ(x),   (3)

where μ_Y = E[Y(x)] and Ỹ(x) is the fluctuation part with E[Ỹ(x)] = 0. In the literature, the following exponential covariance function is often used to represent the spatial variability of the hydraulic conductivity:

C_Y(r) = σ_Y^2 exp(−|r| / l_Y),   (4)

where σ_Y^2 is the variance of Y and l_Y the correlation length.
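
As an illustration of the random-field model (3)-(4), and not part of the original paper, the following Python sketch draws one realization of Y(x) on a one-dimensional grid by Cholesky factorization of the exponential covariance matrix; the function name and all parameter values are purely illustrative.

```python
import numpy as np

def sample_log_conductivity(x, mu_Y, sigma_Y, l_Y, rng):
    """One realization of Y(x) = mu_Y + Y'(x) with covariance sigma_Y^2 exp(-|r|/l_Y)."""
    r = np.abs(x[:, None] - x[None, :])                   # pairwise distances
    C = sigma_Y**2 * np.exp(-r / l_Y)                     # exponential covariance matrix
    L = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))    # small jitter for stability
    return mu_Y + L @ rng.standard_normal(len(x))

x = np.linspace(0.0, 1.0, 101)
Y = sample_log_conductivity(x, mu_Y=-1.0, sigma_Y=0.5, l_Y=0.1, rng=np.random.default_rng(1))
K = np.exp(Y)                                             # hydraulic conductivity K = exp(Y)
print(K.min(), K.max())
```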

We assume that the stochastic forcing term ξ(t,x), which represents the modeling error, is a distributed Gaussian white-noise process with the properties

E[ξ(t,x)] = 0,   E[ξ(t,x) ξ(t',x')] = q δ(t − t') δ(x − x'),   (5)

where q is the intensity of the noise. Then the hydraulic heads h(t,x) are also random. We define h̄(t,x) = E[h(t,x)] and h̃(t,x) = h(t,x) − h̄(t,x). Neglecting nonlinear terms, Dagan [1] obtained the following equation for the mean head, with k_0 = e^{μ_Y}:

S ∂h̄/∂t = ∇·(k_0 ∇h̄) + φ(t,x),   (6)

subject to the same initial and boundary conditions as equation (1). The linearized evolution equation for the head fluctuations is

S ∂h̃/∂t = ∇·(k_0 ∇h̃) + ∇·(k_0 Ỹ ∇h̄) + ξ(t,x),   (7)

with the homogeneous boundary condition

h̃(t,x) = 0,   x ∈ ∂D,   (8)

and the initial condition

h̃(0,x) = ζ(x),   (9)

where ζ(x) is also Gaussian noise, with E[ζ(x)] = 0 and covariance function P_0(x,x') = E[ζ(x) ζ(x')].

An important task is the estimation of the unknown values of μ_Y, σ_Y^2 and l_Y. Usually only a few measured values of the conductivity are available, which cannot provide reliable estimates of these parameters. Estimation therefore has to be based on the measurements of the heads. For this task we augment the state space by adjoining the auxiliary equation

∂Ỹ/∂t = 0   (10)

with the initial condition

Ỹ(0,x) = Ỹ(x).   (11)
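
The effect of eqs. (10)-(11) is simply to carry the conductivity fluctuation along as an additional, time-invariant block of the state vector. A minimal discrete-time sketch of this augmentation follows (an illustration with made-up matrices, not the paper's continuous-time formulation; the function name is hypothetical).

```python
import numpy as np

def augmented_transition(A_hh, A_hy):
    """Transition matrix for the augmented state u = (h, Y): the heads are driven
    by the previous heads and by Y, while Y itself is carried along unchanged
    (the discrete analogue of dY/dt = 0)."""
    n, m = A_hh.shape[0], A_hy.shape[1]
    top = np.hstack([A_hh, A_hy])
    bottom = np.hstack([np.zeros((m, n)), np.eye(m)])
    return np.vstack([top, bottom])

A_hh = np.array([[0.90, 0.05], [0.05, 0.90]])   # illustrative head dynamics
A_hy = np.array([[0.10, 0.00], [0.00, 0.10]])   # illustrative coupling of Y into the heads
print(augmented_transition(A_hh, A_hy))
```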

We rewrite eqs. (7) and (10) in terms of the augmented state vector u = (h̃, Ỹ)^T as

∂u/∂t = A(t,x) u(t,x) + ξ̃(t,x),   (12)

where ξ̃ = (ξ, 0)^T and the initial condition is u(0,x) = (ζ(x), Ỹ(x))^T. Let us assume that the head measurements are taken at fixed points x_1, …, x_m ∈ D, that the measurement times are fast relative to the characteristic groundwater-flow evolution time, and that they are well modeled by a continuous measurement process. We define the m-vector h_m(t) = (h̃(t,x_1), …, h̃(t,x_m))^T (since the mean head h̄ is known, head measurements are readily converted into measurements of the fluctuations). Then the measurement process is given by

z(t) = h_m(t) + η(t),   (13)

where η(t) is the measurement noise, an m-dimensional Gaussian white noise with intensity matrix R(t), stochastically independent of ξ(t,x) and ζ(x).

3 Estimation theory

We apply the theory of optimal filtering for distributed parameter systems (d.p.s.) to our groundwater estimation problem. In our setting we have a linear state equation and Gaussian system and measurement noise; we are therefore in the realm of the Kalman filter. We follow the formal approach to optimal filtering of Omatu and Seinfeld [7, ch. 10]. The Kalman filter estimate û(t,x) is given by the filter equation

∂û(t,x)/∂t = A(t,x) û(t,x) + F(t,x) [z(t) − ĥ_m(t)],   (14)

where ĥ_m(t) is the vector of estimated head fluctuations at the measurement points and the filter gain is

F(t,x) = P_M(t,x) R^{-1}(t),   P_M(t,x) = (P(t,x,x_1), …, P(t,x,x_m)).

The error-covariance function P(t,x,x') satisfies the Riccati-type p.d.e.

∂P(t,x,x')/∂t = A(t,x) P(t,x,x') + A(t,x') P(t,x,x') + Q(t,x,x') − P_M(t,x) R^{-1}(t) P_M(t,x')^T,   (15)

P(0,x,x') = P_0(x,x'),

where Q(t,x,x') is the covariance kernel of the system noise ξ̃.
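
For intuition only, and not as the paper's implementation, the following Python sketch shows the standard discrete-time analogue of the filter equations (14)-(15) for a finite-dimensional system; the matrices F, Q, M and R are assumed to come from an already-discretized model.

```python
import numpy as np

def kalman_step(u, P, z, F, Q, M, R):
    """One predict/update cycle for u_{k+1} = F u_k + w_k, z_k = M u_k + v_k,
    with w_k ~ N(0, Q) and v_k ~ N(0, R)."""
    # prediction
    u_pred = F @ u
    P_pred = F @ P @ F.T + Q
    # update: the gain K plays the role of P_M R^{-1} in the continuous-time filter
    S = M @ P_pred @ M.T + R                    # innovation covariance
    K = np.linalg.solve(S, M @ P_pred).T        # K = P_pred M^T S^{-1} (S symmetric)
    u_new = u_pred + K @ (z - M @ u_pred)
    P_new = (np.eye(len(u)) - K @ M) @ P_pred
    return u_new, P_new
```

A forward pass of this recursion over the data produces the filtered means and covariances that the fixed-interval smoother described next refines backwards in time.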

For the details of the implementation we refer to [7, p. 319]. Since we are concerned with an off-line estimation problem, we shall use the fixed-interval Kalman smoother. It provides the optimal estimate û(t|t_f) of the state vector u(t), based on the complete measurement record Z_0^{t_f} = {z(s), 0 ≤ s ≤ t_f}. For the equations governing the estimate û(t|t_f) and its covariance P(t|t_f; x, x') we refer to Omatu and Seinfeld [7].

4 Numerical Implementation

Since the state space of a d.p.s. is an infinite-dimensional function space, a finite-dimensional approximation is required. Following Seinfeld and Koda [9], we approximate the d.p.s. by a lumped-parameter system at the very beginning of the problem and then use conventional estimation theory for finite-dimensional systems. A Galerkin approximation to eqs. (7) and (10) is constructed as follows:

h̃(t,x) ≈ Σ_{n=1}^{N} H_n(t) φ_n(x),   (16)

where {φ_n(x)} is a set of basis functions satisfying the boundary condition (8). The random field Ỹ(x) may be approximated by a truncated Karhunen-Loève expansion (cf. Wong [10]),

Ỹ(x) ≈ Σ_{n=1}^{N} F_n ψ_n(x).

We use the inner-product notation (f, g) = ∫_D f(x) g(x) dx. The expansion coefficients H_n(t) and F_n(t) are determined by the Galerkin equations (Seinfeld and Koda [9])

S Σ_{n=1}^{N} (φ_n, φ_m) dH_n/dt = −k_0 Σ_{n=1}^{N} (∇φ_n, ∇φ_m) H_n − k_0 Σ_{n=1}^{N} (ψ_n ∇h̄, ∇φ_m) F_n + (ξ(t,·), φ_m),   m = 1, …, N,   (17)

dF_m/dt = 0,   m = 1, …, N.   (18)

This is a linear system of ordinary differential equations.
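
The inner products appearing in (17) can be tabulated numerically. The following small Python sketch (an illustration built around a hypothetical helper, not code from the paper) forms (φ_n, φ_m) by a midpoint quadrature rule and confirms the orthonormality of the sine basis used later in Section 5.

```python
import numpy as np

def inner_product_matrix(basis_a, basis_b, a=0.0, b=1.0, n_quad=400):
    """Entry (i, j) approximates the inner product of basis_a[i] and basis_b[j]
    over [a, b], using a simple midpoint quadrature rule."""
    dx = (b - a) / n_quad
    x = a + (np.arange(n_quad) + 0.5) * dx       # midpoints of the quadrature cells
    A = np.array([f(x) for f in basis_a])        # basis functions sampled on the grid
    B = np.array([g(x) for g in basis_b])
    return (A * dx) @ B.T

phi = [lambda x, n=n: np.sqrt(2.0) * np.sin(n * np.pi * x) for n in range(1, 4)]
print(np.round(inner_product_matrix(phi, phi), 3))   # close to the 3x3 identity
```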

Setting U = (H, F)^T and collecting the inner products in (17)-(18) into N × N matrices, the Galerkin equations can be written compactly as

dU/dt = A_N U(t) + W(t),   (19)

where A_N is the resulting system matrix and W(t) the projected system noise. To (19) the conventional Kalman filter and Kalman smoother may be applied. If the domain is subdivided into triangular elements and the φ_n are taken as shape functions, the procedure becomes the finite element method.

Fig. 1: Covariance function of the Kalman filter (dotted) and Kalman smoother.
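
Because the estimation is performed off-line, the forward Kalman filter pass over (19) is followed by a backward smoothing pass. Assuming the standard discrete-time Rauch-Tung-Striebel recursion (an illustration, not the continuous-time smoother actually used in the paper), the backward pass can be sketched as follows.

```python
import numpy as np

def rts_smoother(u_filt, P_filt, u_pred, P_pred, F):
    """Backward Rauch-Tung-Striebel pass. Inputs are the lists of filtered and
    one-step-predicted means/covariances from the forward Kalman filter and the
    state transition matrix F; returns the smoothed means and covariances."""
    u_s, P_s = u_filt[-1].copy(), P_filt[-1].copy()
    out_u, out_P = [u_s], [P_s]
    for k in range(len(u_filt) - 2, -1, -1):
        G = P_filt[k] @ F.T @ np.linalg.inv(P_pred[k + 1])   # smoother gain
        u_s = u_filt[k] + G @ (u_s - u_pred[k + 1])
        P_s = P_filt[k] + G @ (P_s - P_pred[k + 1]) @ G.T
        out_u.append(u_s)
        out_P.append(P_s)
    return out_u[::-1], out_P[::-1]
```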

5 Numerical Example

In order to test the estimation procedures under consideration we performed a numerical experiment. We consider a 1D flow on the domain D = [0,1], with source term φ ≡ 0, boundary condition h̃(t,0) = h̃(t,1) = 0 and initial condition h̄_0(x) = cos(πx). Then the mean head is h̄(t,x) = e^{−(k_0 π^2 / S) t} cos(πx). The variances of ξ(t,x) and η(t) are q = 0.5 and r = 0.1, respectively. We use a Galerkin approximation of order N = 3 with φ_n(x) = √2 sin(nπx) and ψ_n(x) = u_n cos(u_n x) + sin(u_n x), where the u_n are the positive solutions of the transcendental equation tan u = 2u/(u^2 − 1). (For details concerning the Karhunen-Loève basis see Konecny [5].) Since {φ_n} and {ψ_n} are orthonormal function systems, the mass matrices in (19) reduce to the identity matrix of order 3. Furthermore, we choose three measurement points x_1 = 0.25, x_2 = 0.5 and x_3. We obtained approximations of the optimal filter and smoother by applying the standard schemes (cf. Riedel [8], eqs. (2.4)-(2.5) and (2.9)-(2.12)) to the lumped system (19). Fig. 1 shows the error covariance functions of the Kalman filter and the Kalman smoother at the measurement point x_2 = 0.5. Fig. 2 shows a typical estimate of the log-conductivity Y(x) obtained by Kalman smoothing, together with its nominal 95% confidence band.

Fig. 2: Smoothed log-conductivity with 95%-confidence bands.
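
The Karhunen-Loève basis above requires the positive roots u_n of tan u = 2u/(u^2 − 1). A small Python sketch (illustrative, not the paper's code; the function name is hypothetical) removes the poles by rewriting the equation and then brackets and refines the roots with a standard root finder.

```python
import numpy as np
from scipy.optimize import brentq

def kl_roots(n_roots):
    """Positive solutions u_n of tan(u) = 2u / (u**2 - 1).
    Rewritten as g(u) = (u**2 - 1)*sin(u) - 2*u*cos(u) = 0, which has the same
    positive roots but no poles, so each interval (k*pi, (k+1)*pi) is a bracket."""
    g = lambda u: (u**2 - 1.0) * np.sin(u) - 2.0 * u * np.cos(u)
    roots = []
    for k in range(n_roots):
        a = k * np.pi + 1e-9      # keep away from u = 0, where g vanishes trivially
        b = (k + 1) * np.pi
        roots.append(brentq(g, a, b))
    return np.array(roots)

u = kl_roots(3)
print(u)                                   # first three roots, used for psi_1..psi_3
print(np.tan(u) - 2 * u / (u**2 - 1.0))    # residuals of the original equation (close to 0)
```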

6 The EM Algorithm

For the calibration of a groundwater model, the conductivity parameters have to be estimated from the head measurements. We are thus faced with a parameter estimation problem with partial observations, which can be approached by a combination of the EM algorithm and Kalman smoothing (cf. Dembo and Zeitouni [2]). An interesting feature of this iterative algorithm is that it generates a maximizing sequence of estimates, in the sense that the corresponding sequence of likelihood values is monotonically increasing. The procedure involves, in each iteration, Kalman smoothing with the current parameter values (the E-step) followed by a deterministic maximization (the M-step). For a detailed description of the algorithm we refer to our working paper [5].

7 Conclusion

In this paper we have presented an application of the augmented Kalman smoother to the stochastic inverse problem in groundwater hydrology. It has been applied to a linearized form of the flow equation, augmented by a stochastic forcing term representing the model error. We have outlined the estimation of the conductivity parameters by an extension of the EM algorithm.

References

[1] Dagan, G.: Flow and Transport in Porous Formations. New York: Springer-Verlag.
[2] Dembo, A. and Zeitouni, O.: Parameter estimation of partially observed continuous time stochastic processes via the EM algorithm. Stochastic Processes and their Applications 23(1).
[3] Dempster, A. P., Laird, N. M. and Rubin, D. B.: Maximum likelihood from incomplete data via the EM algorithm. J. R. Statist. Soc. B 39, 1-38.
[4] Ginn, T. R. and Cushman, J. H.: Inverse methods for subsurface flow: A critical review of stochastic techniques. Stochastic Hydrol. Hydraul. 4, 1-26.
[5] Konecny, F.: The Stochastic Inverse Problem in Groundwater Hydrology: A Kalman Filter-Smoother Approach. Working paper, Institute of Mathematics and Applied Statistics, University of Agricultural Sciences, Vienna.
[6] McLaughlin, D. and Townley, L. R.: A reassessment of the groundwater inverse problem. Water Resour. Res. 32(5).
[7] Omatu, S. and Seinfeld, J. H.: Distributed Parameter Systems: Theory and Applications. Oxford: Clarendon Press.
[8] Riedel, K. S.: Optimal estimation of dynamically evolving diffusivities. Journal of Computational Physics 115, 1-11.
[9] Seinfeld, J. H. and Koda, M.: Numerical implementation of distributed parameter filters with application to problems in air pollution. In: Distributed Parameter Systems: Modelling and Identification, Lecture Notes in Control and Information Sciences, vol. 1. New York: Springer-Verlag.
[10] Wong, E.: Stochastic Processes in Information and Dynamic Systems. New York: McGraw-Hill, 1971.
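
To make the structure of the EM iteration of Section 6 concrete, here is a self-contained toy example (not the paper's distributed-parameter algorithm, and all names and values are illustrative): EM estimation of the measurement-noise variance r in a scalar linear-Gaussian state-space model, with a Kalman filter and RTS smoother as the E-step and a closed-form M-step.

```python
import numpy as np

def kalman_smoother(z, a, q, r, x0=0.0, p0=1.0):
    """Forward Kalman filter and backward RTS smoother for
    x_{k+1} = a x_k + w_k (variance q), z_k = x_k + v_k (variance r).
    Returns smoothed means, smoothed variances and the data log-likelihood."""
    T = len(z)
    xf, pf = np.zeros(T), np.zeros(T)      # filtered means / variances
    xp, pp = np.zeros(T), np.zeros(T)      # one-step-predicted means / variances
    loglik, x, p = 0.0, x0, p0
    for k in range(T):
        xp[k], pp[k] = x, p                # prediction for time k
        s = p + r                          # innovation variance
        loglik += -0.5 * (np.log(2 * np.pi * s) + (z[k] - x) ** 2 / s)
        kgain = p / s
        x = x + kgain * (z[k] - x)
        p = (1 - kgain) * p
        xf[k], pf[k] = x, p
        x, p = a * x, a * a * p + q        # predict ahead to time k+1
    xs, ps = xf.copy(), pf.copy()          # RTS backward pass
    for k in range(T - 2, -1, -1):
        g = a * pf[k] / pp[k + 1]          # smoother gain
        xs[k] = xf[k] + g * (xs[k + 1] - xp[k + 1])
        ps[k] = pf[k] + g * g * (ps[k + 1] - pp[k + 1])
    return xs, ps, loglik

def em_for_r(z, a, q, r_init, n_iter=50):
    """EM loop: E-step = smoother with current parameters, M-step = closed-form
    update of r; the likelihood of the successive estimates is non-decreasing."""
    r = r_init
    for _ in range(n_iter):
        xs, ps, _ = kalman_smoother(z, a, q, r)    # E-step: smoothed moments
        r = np.mean((z - xs) ** 2 + ps)            # M-step: update of r
    return r

rng = np.random.default_rng(0)
a, q, r_true, T = 0.95, 0.1, 0.5, 400
x = np.zeros(T)
for k in range(1, T):
    x[k] = a * x[k - 1] + rng.normal(0.0, np.sqrt(q))
z = x + rng.normal(0.0, np.sqrt(r_true), size=T)
print(em_for_r(z, a, q, r_init=2.0))   # moves from 2.0 towards r_true = 0.5
```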
