Data assimilation as an optimal control problem and applications to UQ

Walter Acevedo, Angwenyi David, Jana de Wiljes & Sebastian Reich
Universität Potsdam / University of Reading
IPAM, November 13th 2017

Numerical Weather Prediction
Model: highly nonlinear discretized partial differential equations
Data: heterogeneous mix of ground-, airborne-, satellite-based and radar data
24/7 data assimilation service for optimal weather prediction

Problem setting
Model: $dx_t = f(x_t, \lambda)\,dt + \sigma(x_t)\,dW_t$, $(x_0, \lambda) \sim \pi_0$
Data/observations:
(A) continuous-in-time: $dy_t = h(x_t, \lambda)\,dt + R^{1/2}\,dV_t$
(B) discrete-in-time: $y_{t_n} = h(x_{t_n}, \lambda) + R^{1/2}\,\Xi_{t_n}$
Goal: Approximate the conditional PDF $\pi_t(x, \lambda) = \pi_t(x, \lambda \mid Y_t)$, where $Y_t$ contains all the data up to time $t$ and $\widehat\pi_0 = \pi_0$ at initial time.
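
As a concrete point of reference, the following sketch generates a sample path of a scalar signal together with a continuous-in-time observation record using an Euler-Maruyama discretization; the drift $f$, observation function $h$ and noise levels are illustrative choices, not taken from the talk.

```python
import numpy as np

# Toy scalar instance of dx_t = f(x_t) dt + sigma dW_t and dy_t = h(x_t) dt + R^{1/2} dV_t.
# f, h, sigma and R are placeholder choices for illustration only.
f = lambda x: -x            # drift
h = lambda x: x             # observation function
sigma, R = 0.5, 0.1         # signal and observation noise levels
dt, T = 0.01, 10.0
rng = np.random.default_rng(0)

n_steps = int(T / dt)
x = np.zeros(n_steps + 1)   # signal path x_t
y = np.zeros(n_steps + 1)   # accumulated observation path y_t
for n in range(n_steps):
    dW = np.sqrt(dt) * rng.standard_normal()
    dV = np.sqrt(dt) * rng.standard_normal()
    x[n + 1] = x[n] + f(x[n]) * dt + sigma * dW
    y[n + 1] = y[n] + h(x[n]) * dt + np.sqrt(R) * dV
```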

Problem setting
(Figure: time-evolved marginal distributions, comparing the prior $P(x)$ with the filtering distribution $P(x \mid Y_{0:t})$ over time $t$.)

Problem setting
Applications:
- assimilation of observations into a computer model (e.g. weather forecasting),
- assimilation of synthetic data from a high-resolution model into a model of lower resolution (e.g. parameter estimation),
- rare event simulation (data comes from possible rare event scenarios),
- solution of inverse problems,
- minimization (model dynamics is trivial, e.g. $dx_t = 0$).

Problem setting
Particle filters: Approximate the conditional PDF $\pi_t$ by a set of particles $z_t^i = (x_t^i, \lambda_t^i)$, $i = 1, \ldots, M$, with weights $w_t^i \ge 0$ such that $\sum_i w_t^i = 1$; e.g. sequential Monte Carlo.
- Eulerian: $z_t^i = z_0^i$ (particle locations are fixed); e.g. grid-based methods.
- Lagrangian: $w_t^i = 1/M$ (weights are fixed); subject of this talk.

Ensemble Kalman-Bucy filter
A control approach to the continuous-in-time filtering problem:
Kalman gain factor: $K_t = P_t^{xh} R^{-1}$, where $P_t^{xh}$ denotes the covariance matrix between $x_t$ and $h(x_t)$.
Innovation: $dI_t = dy_t - \tfrac{1}{2}\big(h(x_t) + \bar h_t\big)\,dt$ or $dI_t = dy_t - h(x_t)\,dt - R^{1/2}\,dU_t$.
Ensemble Kalman-Bucy filter: $dx_t = f(x_t, \lambda)\,dt + \sigma(x_t)\,dW_t + K_t\,dI_t$.

Numerical implementation
(i) Draw samples $x_0^i$, $i = 1, \ldots, M$, from the initial distribution $\pi_0$.
(ii) Solve the interacting particle system
$dx_t^i = f(x_t^i, \lambda)\,dt + \sigma(x_t^i)\,dW_t^i + K_t\,dI_t^i$, $i = 1, \ldots, M$,
with innovation
$dI_t^i = dy_t - dy_t^i$, $dy_t^i := h(x_t^i)\,dt + R^{1/2}\,dU_t^i$,
and gain factor
$K_t = P_t^{xh} R^{-1}$, $P_t^{xh} := \frac{1}{M-1} \sum_{i=1}^M (x_t^i - \bar x_t)(h(x_t^i) - \bar h_t)^{\mathrm T}$.
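
A minimal sketch of one Euler-Maruyama step of this interacting particle system, for a scalar state and scalar observation; the model functions and parameters handed to it are illustrative placeholders.

```python
import numpy as np

def enkbf_step(x, dy, f, h, sigma, R, dt, rng):
    """One Euler-Maruyama step of the stochastic-innovation ensemble Kalman-Bucy filter.
    x: ensemble of shape (M,); dy: observed increment of y over this time step."""
    M = x.size
    hx = h(x)
    # empirical covariance between state and observed quantity, P^{xh}
    P_xh = np.sum((x - x.mean()) * (hx - hx.mean())) / (M - 1)
    K = P_xh / R                               # Kalman gain factor K_t = P^{xh} R^{-1}
    dW = np.sqrt(dt) * rng.standard_normal(M)  # model noise increments
    dU = np.sqrt(dt) * rng.standard_normal(M)  # perturbed-observation noise increments
    dy_i = hx * dt + np.sqrt(R) * dU           # simulated observation increments dy_t^i
    return x + f(x) * dt + sigma * dW + K * (dy - dy_i)

# illustrative usage with arbitrary choices of f, h, sigma, R
rng = np.random.default_rng(1)
ensemble = rng.standard_normal(100)
ensemble = enkbf_step(ensemble, dy=0.02, f=lambda x: -x, h=lambda x: x,
                      sigma=0.5, R=0.1, dt=0.01, rng=rng)
```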

Feedback particle filter: motivation
Consider the zero-drift scalar SDE $dx_t = \sigma(x_t)\,dW_t$ with time-evolved expectation values
$\pi_t[f] = \pi_0[f] + \int_0^t \pi_s[Lf]\,ds$, $Lf := \tfrac{1}{2}\,\sigma\,\partial_x(\sigma\,\partial_x f)$.
Interacting particle representation:
$\dot x_t = \tfrac{1}{2}\,\sigma(x_t)\,I_t(x_t)$, $I_t := -\pi_t^{-1}\,\partial_x(\pi_t \sigma)$.
It holds that (Liouville plus integration by parts)
$\hat\pi_t[f] := \pi_0[f] + \tfrac{1}{2}\int_0^t \pi_s[(\partial_x f)(\sigma I_s)]\,ds = \pi_0[f] + \int_0^t \pi_s[Lf]\,ds = \pi_t[f]$.
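
Spelled out, the first equality is Liouville's equation for the flow $\dot x_t = \tfrac{1}{2}\sigma I_t$, i.e. $\partial_t \hat\pi_t = -\partial_x\big(\hat\pi_t\,\tfrac{1}{2}\sigma I_t\big)$, and the second follows from one integration by parts:
$\tfrac{1}{2}\,\pi_s[(\partial_x f)(\sigma I_s)] = -\tfrac{1}{2}\int (\partial_x f)\,\sigma\,\partial_x(\pi_s \sigma)\,dx = \tfrac{1}{2}\int \partial_x(\sigma\,\partial_x f)\,\pi_s\,\sigma\,dx = \pi_s[Lf]$.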

Feedback particle filter
Generalization of the ensemble Kalman-Bucy filter:
$dx_t = f(x_t, \lambda)\,dt + \sigma(x_t)\,dW_t + K_t\,dI_t$
with innovation $dI_t$ as before, i.e.
$dI_t = dy_t - \tfrac{1}{2}\big(h(x_t) + \bar h_t\big)\,dt$ or $dI_t = dy_t - h(x_t)\,dt - R^{1/2}\,dU_t$,
and Kalman gain matrix $K_t = \nabla_x \psi_t$ with
$\nabla_x \cdot \big(\hat\pi_t \nabla_x \psi_t\big) = -R^{-1}\,\hat\pi_t\,\big(h - \pi_t[h]\big).$
It can be shown that $\pi_t[f] = \hat\pi_t[f]$.
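
A quick way to see the connection to the ensemble Kalman-Bucy filter is the constant-gain approximation: multiplying the Poisson equation by $x$ and integrating by parts gives
$\hat\pi_t[\nabla_x \psi_t] = \Big(\int (x - \bar x_t)\,\big(h(x) - \pi_t[h]\big)\,\hat\pi_t(x)\,dx\Big)\,R^{-1} = P_t^{xh} R^{-1},$
so the ensemble average of the gain $K_t = \nabla_x \psi_t$ is the Kalman gain; freezing $K_t$ at this constant value recovers the ensemble Kalman-Bucy filter.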

Feedback particle filter
Structural form of filter formulations: (figure)

Feedback particle filter: numerical implementation I
A) Diffusion maps: Define the elliptic operator $L_\pi \psi := \pi^{-1} \nabla_x \cdot (\pi \nabla_x \psi)$ and introduce the approximation
$\frac{e^{\varepsilon L_\pi} - \mathrm{Id}}{\varepsilon}\,\psi \approx h - \pi[h].$
The exponential $e^{\varepsilon L_\pi}$ can be approximated using diffusion maps.
B) Optimal transport: Find the optimal coupling $T_\varepsilon$ between $\pi$ and $\pi_\varepsilon = \pi\big(1 + \varepsilon(h - \pi[h])\big)$. Then
$\nabla_x \psi(x) \approx \frac{T_\varepsilon(x) - x}{\varepsilon}.$
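
A minimal sketch of variant B) for a one-dimensional ensemble, using the POT package (`pip install pot`) for the discrete optimal coupling; the step size eps and the observed quantity are illustrative choices.

```python
import numpy as np
import ot  # Python Optimal Transport (POT)

def ot_gain(x, h_vals, eps=0.05):
    """Approximate grad psi at the particles via an optimal coupling between the
    uniform empirical measure and its reweighted version pi*(1 + eps*(h - mean(h)))."""
    M = x.size
    a = np.full(M, 1.0 / M)                            # weights representing pi
    b = a * (1.0 + eps * (h_vals - h_vals.mean()))     # weights representing pi_eps
    b = np.maximum(b, 0.0)                             # eps must be small enough in practice
    b /= b.sum()
    cost = ot.dist(x.reshape(-1, 1), x.reshape(-1, 1)) # squared-Euclidean cost matrix
    plan = ot.emd(a, b, cost)                          # optimal coupling between a and b
    T_eps = (plan @ x) / plan.sum(axis=1)              # barycentric projection T_eps(x^i)
    return (T_eps - x) / eps                           # grad psi(x^i) ~ (T_eps(x^i) - x^i)/eps

# illustrative usage with h(x) = x as a placeholder
rng = np.random.default_rng(2)
particles = rng.standard_normal(50)
grad_psi = ot_gain(particles, h_vals=particles)
```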

Feedback particle filter: numerical implementation II
The time evolution of an ensemble of $M$ particles $x_t^i$ is given by
$dx_t^i = f(x_t^i, \lambda)\,dt + \sigma(x_t^i)\,dW_t^i + K_t^i\,dI_t^i$
Gain: $K_t^i := \sum_{j=1}^M x_t^j\,d_{ji} \approx \nabla_x \psi_t(x_t^i)$
Innovation: $dI_t^i = dy_t - \tfrac{1}{2}\big(h(x_t^i) + \bar h_t\big)\,dt$ or $dI_t^i = dy_t - h(x_t^i)\,dt - R^{1/2}\,dU_t^i$.

References
EnKBF: SR (BIT, 2011); Bergemann & SR (Meteorol. Zeitschrift, 2012)
FPF: de Wiljes, Stannat & SR (arXiv, 2016); Del Moral, Kurtzmann & Tugaut (arXiv, 2016); Yang, Mehta & Meyn (IEEE Trans. Autom. Contr., 2013); Taghvaei, de Wiljes, Mehta & SR (arXiv); David, de Wiljes & SR (arXiv)
Alternative control formulation: Crisan & Xiong (ESAIM, 2007); Crisan & Xiong (Stochastics, 2010)

Bayesian inference
Bayes: $\pi(x \mid y) \propto \pi(y \mid x)\,\pi(x)$
Task: Given a set of samples $x_0^i \sim \pi(x)$, produce an estimator for expectation values with respect to the posterior distribution.
Application: Learn models $y_{\mathrm{out}} = \Psi_x(y_{\mathrm{in}})$, parametrized by $x$, which gives rise to the likelihood $\pi(y \mid x)$.
Standard methodologies:
- importance sampling (a minimal sketch follows below)
- MCMC
- homotopy methods: learning models by making them interact
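
For reference, here is a minimal self-normalized importance sampling estimator of a posterior expectation; the prior samples, log-likelihood and test function are illustrative placeholders.

```python
import numpy as np

def importance_estimate(x_prior, log_lik, f):
    """Self-normalized importance sampling: E_posterior[f(x)] is approximated by
    sum_i w_i f(x_i) with weights w_i proportional to the likelihood pi(y | x_i)."""
    logw = log_lik(x_prior)
    logw -= logw.max()            # stabilize before exponentiating
    w = np.exp(logw)
    w /= w.sum()
    return np.sum(w * f(x_prior))

# illustrative usage: standard normal prior samples, Gaussian likelihood centred at y = 1
rng = np.random.default_rng(3)
samples = rng.standard_normal(10_000)
post_mean = importance_estimate(samples,
                                log_lik=lambda x: -0.5 * (1.0 - x) ** 2 / 0.25,
                                f=lambda x: x)
```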

Homotopy method I
Homotopy: Define the family of distributions $\pi_\alpha(x) \propto e^{\alpha L(x)}\,\pi(x)$, where $L(x) := \ln \pi(y \mid x)$ and $\alpha \ge 0$. The posterior is obtained for $\alpha = 1$. The function $L$ can also stand for a tilting potential in rare event simulation / importance sampling.
Dynamic formulation:
Bayes: $\partial_\alpha \pi_\alpha = \pi_\alpha\big(L - \pi_\alpha[L]\big)$, $\pi[e^{L}] = \exp\Big(\int_0^1 \pi_\alpha[L]\,d\alpha\Big)$,
Liouville: $\partial_\alpha \pi_\alpha = -\nabla_x \cdot (\pi_\alpha \nabla_x \psi_\alpha)$,
with parameter dynamics $\dot x = \nabla_x \psi_\alpha(x)$.
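
The Bayes identity follows by differentiating the normalized homotopy family: with $Z_\alpha := \int e^{\alpha L}\pi\,dx$ and $\pi_\alpha = Z_\alpha^{-1} e^{\alpha L}\pi$,
$\partial_\alpha \pi_\alpha = L\,\pi_\alpha - \pi_\alpha\,\partial_\alpha \ln Z_\alpha = \pi_\alpha\big(L - \pi_\alpha[L]\big),$
since $\partial_\alpha \ln Z_\alpha = Z_\alpha^{-1}\int L\,e^{\alpha L}\pi\,dx = \pi_\alpha[L]$. Integrating $\partial_\alpha \ln Z_\alpha = \pi_\alpha[L]$ over $\alpha \in [0, 1]$ gives the evidence formula $\pi[e^{L}] = \exp\big(\int_0^1 \pi_\alpha[L]\,d\alpha\big)$.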

Homotopy method II
Particle flow filter:
$\dot x_\alpha^i = \nabla_x \psi_\alpha(x_\alpha^i)$, $x_0^i \sim \pi(x)$, $\alpha \in [0, 1]$,
with potential $\psi_\alpha$ solving
$\nabla_x \cdot \big(\pi_\alpha \nabla_x \psi_\alpha\big) = -\pi_\alpha\big(L - \pi_\alpha[L]\big).$
Numerical implementation:
- diffusion maps
- optimal transportation
- (iterative) ensemble Kalman filter (see the sketch below)
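
A sketch of the ensemble Kalman filter variant for a Gaussian likelihood $L(x) = -\tfrac{1}{2}(y - h(x))^2/R$ (scalar, illustrative): the gradient of $\psi_\alpha$ is replaced by an EnKF-type update built from empirical covariances and the flow is integrated with a simple Euler scheme in $\alpha$.

```python
import numpy as np

def enkf_homotopy(x, y, h, R, n_steps=50):
    """Move prior particles towards (approximate) posterior particles by an EnKF-type
    particle flow in the artificial time alpha in [0, 1]."""
    d_alpha = 1.0 / n_steps
    for _ in range(n_steps):
        hx = h(x)
        # empirical covariance between state and observed quantity
        P_xh = np.sum((x - x.mean()) * (hx - hx.mean())) / (x.size - 1)
        # Kalman-type flow; as d_alpha -> 0 it reproduces the exact posterior
        # in the linear-Gaussian case
        x = x + d_alpha * P_xh / R * (y - 0.5 * (hx + hx.mean()))
    return x

# illustrative usage: Gaussian prior samples, observation y with h(x) = x
rng = np.random.default_rng(4)
prior = rng.standard_normal(200)
posterior = enkf_homotopy(prior, y=1.0, h=lambda x: x, R=0.5)
```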

Homotopy method III
Extensions:
Randomized particle flow:
$dx_\alpha = \epsilon\,\nabla_x \ln \pi_\alpha(x)\,d\alpha + \sqrt{2\epsilon}\,dW_\alpha + \nabla_x \psi_\alpha(x)\,d\alpha$
$\quad\;\, = \big(\nabla_x \psi_\alpha(x) + \epsilon\,\nabla_x \ln \pi_0(x) + \epsilon\alpha\,\nabla_x L(x)\big)\,d\alpha + \sqrt{2\epsilon}\,dW_\alpha$
for $\alpha \ge 0$. Here $\epsilon \ge 0$ is a parameter. It still holds that
$\partial_\alpha \pi_\alpha = \pi_\alpha\big(L - \pi_\alpha[L]\big).$
Minimization: Method for finding $x^\ast = \arg\min V(x)$ if $\alpha \to \infty$, $\epsilon\alpha \gg 1$, and $\pi_\alpha \propto e^{-\alpha V}\,\pi_0$.

References
Particle flow: Moser (1965); Daum et al. (SPIE, 2010); Daum et al. (SPIE, 2017); SR (BIT, 2011); Heng, Doucet & Pokern (arXiv, 2015)
Extensions:
- iterative Kalman filters (Dean Oliver, Marc Bocquet, Marco Iglesias, SR, ...)
- sequential Monte Carlo & rejuvenation (..., Beskos, Crisan & Jasra (Annals of Applied Probability, 2014), ...)
- Kalman filter as minimizer (Schillings & Stuart (arXiv, 2016))
- stochastic minimization techniques / simulated annealing (...)

Conclusions
- The ensemble Kalman filter has triggered renewed interest in Lagrangian filtering and data assimilation.
- Stability and accuracy of the resulting interacting particle systems are largely unknown.
- There are many applications outside the classic filtering context, such as rare event simulation and whenever a change of measure arises.
- Exploitation of geometric properties? E.g. data assimilation for highly oscillatory Hamiltonian systems (Reinhardt, Hastermann, Klein, SR, arXiv).
