1 / 31 Identification of nonlinear dynamic systems (guest lectures) Brussels, Belgium, June 8, 2015.
Outline Part 4

Nonlinear system identification using sequential Monte Carlo methods. Part 4: Identification strategy 2 (data augmentation).

Aim: Show how SMC can be used to implement identification strategy 2, data augmentation.

Thomas Schön, Division of Systems and Control, Department of Information Technology, Uppsala University. E-mail: thomas.schon@it.uu.se, www: user.it.uu.se/~thosc2

1. Summary of Part 3
2. Identification strategy 2: data augmentation
3. Maximum likelihood via expectation maximization (EM)
4. Sampling state trajectories using Markov kernels
5. Bayesian identification using Gibbs sampling
6. Some of our current research activities (if there is time)

Summary of Part 3

We introduced an auxiliary variable

    u = (x_{1:T}, a_{2:T}),    u ~ \psi_\theta(u | y_{1:T}),

and considered an extended target distribution

    \phi(\theta, u | y_{1:T}) = \hat{p}_{\theta,u}(y_{1:T}) \psi_\theta(u | y_{1:T}) p(\theta) / p(y_{1:T}).

We could construct a standard MH algorithm that operates on the non-standard extended space \Theta \times X^{NT} \times \{1, ..., N\}^{N(T-1)}.

p(\theta | y_{1:T}) is recovered exactly as the marginal of the extended target distribution \phi(\theta, u | y_{1:T}), despite the fact that we employ an SMC approximation of the likelihood using a finite number of particles N.

Identification strategy 2: data augmentation

Motivation: If we had access to the complete likelihood

    p_\theta(x_{1:T}, y_{1:T}) = \mu_\theta(x_1) \prod_{t=1}^{T} g_\theta(y_t | x_t) \prod_{t=1}^{T-1} f_\theta(x_{t+1} | x_t),

the problem would be much easier.

Key idea: Treat the state sequence x_{1:T} as an auxiliary variable that is estimated together with the parameters \theta. The data augmentation strategy breaks the original problem into two new and closely linked problems. Intuitively: alternate between updating \theta and x_{1:T}.
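As an aside (not from the slides), the complete-likelihood factorization above is straightforward to evaluate once a concrete model is fixed. A minimal sketch, assuming a toy scalar linear-Gaussian model of my own choosing (x_{t+1} = \theta x_t + v_t with v_t ~ N(0, 1), y_t = x_t + e_t with e_t ~ N(0, 0.1), and \mu_\theta = N(0, 1)):

```python
import numpy as np

def log_normal_pdf(z, mean, var):
    """Log density of N(z; mean, var), elementwise."""
    return -0.5 * (np.log(2 * np.pi * var) + (z - mean) ** 2 / var)

def complete_data_loglik(theta, x, y, q=1.0, r=0.1):
    """log p_theta(x_{1:T}, y_{1:T}) for the toy model:
    log mu(x_1) + sum_t log g(y_t | x_t) + sum_t log f(x_{t+1} | x_t)."""
    ll = log_normal_pdf(x[0], 0.0, 1.0)                      # initial density mu
    ll += np.sum(log_normal_pdf(y, x, r))                    # observation terms
    ll += np.sum(log_normal_pdf(x[1:], theta * x[:-1], q))   # transition terms
    return ll
```

Each of the three lines mirrors one factor of the product above, which is exactly why the complete likelihood is so much easier to work with than the marginal p_\theta(y_{1:T}).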
Maximum likelihood via EM

Expectation maximization (EM) employs the complete likelihood p_\theta(x_{1:T}, y_{1:T}) as a substitute for the observed likelihood p_\theta(y_{1:T}). They are of course related,

    p_\theta(x_{1:T}, y_{1:T}) = p_\theta(x_{1:T} | y_{1:T}) p_\theta(y_{1:T}).

It can be shown that by iteratively maximizing

    Q(\theta, \theta[k]) = \int \log p_\theta(x_{1:T}, y_{1:T}) p_{\theta[k]}(x_{1:T} | y_{1:T}) dx_{1:T} = E_{\theta[k]}[\log p_\theta(x_{1:T}, y_{1:T}) | y_{1:T}]

w.r.t. \theta, the obtained sequence \{\theta[k]\} results in a monotonic increase in likelihood values. SMC is used to approximate the joint smoothing distribution (JSD) p_{\theta[k]}(x_{1:T} | y_{1:T}).

Algorithm: EM for identifying nonlinear dynamical systems
1. Initialise: Set k = 0 and choose an initial \theta_0.
2. While not converged do:
   (a) Expectation (E) step: compute Q(\theta, \theta_k) = \int \log p_\theta(x_{1:T}, y_{1:T}) p_{\theta_k}(x_{1:T} | y_{1:T}) dx_{1:T} using sequential Monte Carlo (a particle smoother).
   (b) Maximization (M) step: compute \theta_{k+1} = \arg\max_{\theta \in \Theta} Q(\theta, \theta_k).
   (c) Set k <- k + 1.

Thomas B. Schön, Adrian Wills and Brett Ninness. System identification of nonlinear state-space models. Automatica, 47(1):39-49, January 2011.

Ex: blind Wiener identification (I/III) — using EM and particle smoothing together

A second-order LGSS model with complex poles, followed by static nonlinearities h_1(z_t, \beta) and h_2(z_t, \beta) producing the two measured outputs y_{1,t} and y_{2,t} with noises e_{1,t} and e_{2,t}:

    x_{t+1} = A x_t + B u_t,    u_t ~ N(0, Q),
    z_t = C x_t,
    y_t = h(z_t, \beta) + e_t,    e_t ~ N(0, R).

Identification problem: find L, \beta, r_1 and r_2 based on {y_{1,1:T}, y_{2,1:T}}.

Ex: blind Wiener identification (II/III)

Results obtained using T = samples. Employ the EM-PS with N = particles. The plots are based on realisations of data. The nonlinearities (dead zone and saturation) are shown on the next slide.

[Figure: Bode plot (gain in dB vs. normalised frequency) of the estimated mean (black), the true system (red) and the results for all realisations (gray).]
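The EM iteration above can be sketched in code. This is an illustrative toy, not the lecture's EM-PS implementation: the model is a scalar AR(1) with Gaussian observation noise (my own choice), the "particle smoother" is simply the stored ancestral paths of a bootstrap particle filter (crude and path-degenerate, but enough to show the mechanics), and the M-step has a closed form for the AR coefficient as a ratio of smoothed moments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (illustrative): x_{t+1} = theta x_t + v_t, v_t ~ N(0, 1),
#                           y_t = x_t + e_t,           e_t ~ N(0, 0.5).

def simulate(theta, T):
    """Generate observations y_{1:T} from the toy model."""
    x = np.zeros(T)
    x[0] = rng.normal()
    for t in range(T - 1):
        x[t + 1] = theta * x[t] + rng.normal()
    return x + rng.normal(scale=np.sqrt(0.5), size=T)

def smoothed_trajectories(theta, y, N=200):
    """Bootstrap PF keeping full ancestral paths; the weighted paths at time T
    serve as a crude approximation of the joint smoothing distribution."""
    T = len(y)
    paths = np.zeros((N, T))
    paths[:, 0] = rng.normal(size=N)
    logw = -(y[0] - paths[:, 0]) ** 2 / (2 * 0.5)
    for t in range(1, T):
        w = np.exp(logw - logw.max()); w /= w.sum()
        idx = rng.choice(N, size=N, p=w)          # multinomial resampling
        paths = paths[idx]                        # resample entire paths
        paths[:, t] = theta * paths[:, t - 1] + rng.normal(size=N)
        logw = -(y[t] - paths[:, t]) ** 2 / (2 * 0.5)
    w = np.exp(logw - logw.max()); w /= w.sum()
    return paths, w

def em(y, theta0=0.1, iters=10):
    """EM: E-step via the PF-based smoother, closed-form M-step for theta."""
    theta = theta0
    for _ in range(iters):
        paths, w = smoothed_trajectories(theta, y)
        # argmax_theta Q(theta, theta_k) = E[sum x_t x_{t+1}] / E[sum x_t^2]
        num = np.sum(w * np.sum(paths[:, :-1] * paths[:, 1:], axis=1))
        den = np.sum(w * np.sum(paths[:, :-1] ** 2, axis=1))
        theta = num / den
    return theta
```

In practice one would replace the stored-path smoother with a proper backward-simulation particle smoother, precisely because the resampled ancestral paths degenerate for small t.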
Ex: blind Wiener identification (III/III)

[Figure: input to and output of the two static nonlinearities; estimated mean (black), true static nonlinearity (red) and the results for all realisations (gray).]

Adrian Wills, Thomas B. Schön, Lennart Ljung and Brett Ninness. Identification of Hammerstein-Wiener models. Automatica, 49(1):70-81, January 2013.

Gibbs sampler for SSMs

Aim: Compute p(\theta, x_{1:T} | y_{1:T}).

MCMC: Gibbs sampling (blocked) for SSMs amounts to iterating
- Draw \theta[m] ~ p(\theta | x_{1:T}[m-1], y_{1:T});  OK!
- Draw x_{1:T}[m] ~ p(x_{1:T} | \theta[m], y_{1:T});  hard!

Problem: p(x_{1:T} | \theta[m], y_{1:T}) is not available.

Idea: Approximate p(x_{1:T} | \theta[m], y_{1:T}) using a sequential Monte Carlo method!

Sampling based on SMC: with P(x*_{1:T} = x^i_{1:T}) \propto w^i_T we obtain x*_{1:T} approximately distributed according to p(x_{1:T} | \theta, y_{1:T}).

[Figure: particle trajectories (state vs. time) from which one trajectory is drawn.]
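The naive trajectory draw described above can be sketched as follows (the scalar model is my own illustrative choice; the stored-path bootstrap PF exhibits exactly the path degeneracy that motivates PMCMC on the next slide):

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy model (illustrative): x_{t+1} = 0.9 x_t + v_t, v_t ~ N(0, 1),
#                           y_t = x_t + e_t,         e_t ~ N(0, 1).

def sample_trajectory(y, N=100):
    """Run a bootstrap PF storing full ancestral paths, then draw one stored
    path with probability proportional to its final weight w_T^i."""
    T = len(y)
    paths = rng.normal(size=(N, 1))               # x_1^i ~ N(0, 1)
    logw = -0.5 * (y[0] - paths[:, 0]) ** 2
    for t in range(1, T):
        w = np.exp(logw - logw.max()); w /= w.sum()
        anc = rng.choice(N, size=N, p=w)                   # resample paths
        x_t = 0.9 * paths[anc, -1] + rng.normal(size=N)    # propagate
        paths = np.column_stack([paths[anc], x_t])
        logw = -0.5 * (y[t] - x_t) ** 2
    w = np.exp(logw - logw.max()); w /= w.sum()
    return paths[rng.choice(N, p=w)]     # P(x*_{1:T} = x^i_{1:T}) propto w_T^i
```

Because the resampled ancestral paths coalesce for small t, this draw is only an approximate sample from p(x_{1:T} | \theta, y_{1:T}), which is precisely the problem discussed next.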
Problems and a solution: PMCMC

Problems with this approach:
- It is based on an approximate sample from a PF, and it does not leave p(x_{1:T} | \theta, y_{1:T}) invariant!
- It relies on a large N to be successful.
- A lot of wasted computations.

To get around these problems, the idea underlying PMCMC is to make use of a certain SMC sampler to construct a Markov kernel leaving the joint smoothing distribution p(x_{1:T} | \theta, y_{1:T}) invariant. This Markov kernel is then used in a standard MCMC algorithm (e.g. Gibbs, which results in the particle Gibbs (PG) sampler).

Use a conditional particle filter (CPF): one prespecified reference trajectory is retained throughout the sampler. SMC is used to build an MCMC kernel with p(x_{1:T} | \theta, y_{1:T}) as its stationary distribution, without introducing any systematic errors!

Christophe Andrieu, Arnaud Doucet and Roman Holenstein. Particle Markov chain Monte Carlo methods. Journal of the Royal Statistical Society: Series B, 72(3):269-342, 2010.

Three SMC samplers

Three SMC samplers leaving p(x_{1:T} | \theta, y_{1:T}) invariant:

1. Conditional particle filter (CPF)
Christophe Andrieu, Arnaud Doucet and Roman Holenstein. Particle Markov chain Monte Carlo methods. Journal of the Royal Statistical Society: Series B, 72(3):269-342, 2010.

2. CPF with backward simulation (CPF-BS)
N. Whiteley. Discussion on Particle Markov chain Monte Carlo methods. Journal of the Royal Statistical Society: Series B, 72(3):306-307, 2010.
N. Whiteley, C. Andrieu and A. Doucet. Efficient Bayesian inference for switching state-space models using discrete particle Markov chain Monte Carlo methods. Bristol Statistics Research Report 10:04, 2010.
Fredrik Lindsten and Thomas B. Schön. On the use of backward simulation in the particle Gibbs sampler. Proceedings of the 37th International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Kyoto, Japan, March 2012.

3. CPF with ancestor sampling (CPF-AS)
Fredrik Lindsten, Michael I. Jordan and Thomas B. Schön. Particle Gibbs with ancestor sampling. arXiv:1401.0604, 2014.
Fredrik Lindsten, Michael I. Jordan and Thomas B. Schön. Ancestor sampling for particle Gibbs. Advances in Neural Information Processing Systems (NIPS) 25, Lake Tahoe, NV, USA, December 2012.

Conditional particle filter (CPF)

Let x'_{1:T} = (x'_1, ..., x'_T) be a fixed reference trajectory.
- At each time t, sample only N - 1 particles in the standard way.
- Set the Nth particle deterministically: x^N_t = x'_t.

The CPF causes us to degenerate to something that is very similar to the reference trajectory, resulting in slow mixing.
CPF vs. CPF-AS: motivation

BS is problematic for models with more intricate dependencies. Reason: it requires complete trajectories of the latent variable in the backward sweep. Solution: modify the computation to achieve the same effect as BS, but without an explicit backward sweep. Implication: ancestor sampling opens up for inference in a wider class of models, e.g. non-Markovian SSMs, PGMs and BNP models.

CPF-AS intuition

Let x'_{1:T} = (x'_1, ..., x'_T) be a fixed reference trajectory.
- At each time t, sample only N - 1 particles in the standard way.
- Set the Nth particle deterministically: x^N_t = x'_t.
- Generate an artificial history for x^N_t by ancestor sampling.

Ancestor sampling is conceptually similar to backward simulation, but instead of separate forward and backward sweeps, we achieve the same effect in a single forward sweep.

CPF-AS algorithm

Algorithm: CPF-AS, conditioned on x'_{1:T}.
1. Initialize (t = 1):
   (a) Draw x^i_1 ~ r_{\theta,1}(x_1), for i = 1, ..., N - 1.
   (b) Set x^N_1 = x'_1.
   (c) Set w^i_1 = W_{\theta,1}(x^i_1).
2. For t = 2 to T do:
   (a) Draw a^i_t ~ C({w^j_{t-1}}_{j=1}^N), for i = 1, ..., N - 1.
   (b) Draw x^i_t ~ r_{\theta,t}(x_t | x^{a^i_t}_{1:t-1}), for i = 1, ..., N - 1.
   (c) Set x^N_t = x'_t.
   (d) Draw a^N_t with P(a^N_t = i) \propto w^i_{t-1} \gamma_{\theta,T}(x^i_{1:t-1}, x'_{t:T}) / \gamma_{\theta,t-1}(x^i_{1:t-1}).
   (e) Set x^i_{1:t} = {x^{a^i_t}_{1:t-1}, x^i_t} and w^i_t = W_{\theta,t}(x^i_{1:t}).

CPF-AS causes us to degenerate to something that can be very different from the reference trajectory, resulting in better mixing.

The PGAS Markov kernel

1. Run CPF-AS(x'_{1:T}) targeting p(x_{1:T} | \theta, y_{1:T}).
2. Sample x*_{1:T} with P(x*_{1:T} = x^i_{1:T}) \propto w^i_T.

This maps x'_{1:T} stochastically into x*_{1:T} and implicitly defines an ergodic Markov kernel P^N_\theta, referred to as the PGAS (particle Gibbs with ancestor sampling) kernel.
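A minimal sketch of CPF-AS for a scalar Markovian SSM (the model and the bootstrap proposal are my own illustrative choices, not from the slides; for this Markovian case the ancestor-sampling weight in step 2(d) reduces to w^i_{t-1} f_\theta(x'_t | x^i_{t-1})):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model (illustrative): x_{t+1} = a x_t + v_t, v_t ~ N(0, q),
#                           y_t = x_t + e_t,       e_t ~ N(0, r).

def cpf_as(y, x_ref, N=20, q=1.0, r=1.0, a=0.9):
    """One CPF-AS sweep conditioned on x_ref; returns a drawn trajectory."""
    T = len(y)
    x = np.zeros((N, T))                      # particle paths
    x[: N - 1, 0] = rng.normal(scale=np.sqrt(q), size=N - 1)
    x[N - 1, 0] = x_ref[0]                    # condition on the reference
    logw = -(y[0] - x[:, 0]) ** 2 / (2 * r)
    for t in range(1, T):
        w = np.exp(logw - logw.max()); w /= w.sum()
        anc = rng.choice(N, size=N - 1, p=w)  # ancestors for N - 1 particles
        x_new = np.zeros((N, T))
        x_new[: N - 1, :t] = x[anc, :t]
        x_new[: N - 1, t] = a * x[anc, t - 1] + rng.normal(scale=np.sqrt(q),
                                                           size=N - 1)
        # Ancestor sampling for the reference particle:
        # P(a_t^N = i) propto w_{t-1}^i * f(x'_t | x_{t-1}^i)
        logw_as = logw - (x_ref[t] - a * x[:, t - 1]) ** 2 / (2 * q)
        w_as = np.exp(logw_as - logw_as.max()); w_as /= w_as.sum()
        j = rng.choice(N, p=w_as)
        x_new[N - 1, :t] = x[j, :t]           # artificial history for x'_t
        x_new[N - 1, t] = x_ref[t]
        x = x_new
        logw = -(y[t] - x[:, t]) ** 2 / (2 * r)
    w = np.exp(logw - logw.max()); w /= w.sum()
    return x[rng.choice(N, p=w)]              # draw the output trajectory
```

Calling `cpf_as` repeatedly, feeding each output back in as the next reference, simulates the PGAS Markov kernel for fixed \theta.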
Theorem. For any number of particles N >= 2 and any \theta \in \Theta, the PGAS kernel P^N_\theta leaves p(x_{1:T} | \theta, y_{1:T}) invariant:

    p(dx*_{1:T} | \theta, y_{1:T}) = \int P^N_\theta(x_{1:T}, dx*_{1:T}) p(dx_{1:T} | \theta, y_{1:T}).
PGAS for Bayesian learning

Bayesian learning: Gibbs + CPF-AS = PGAS.

Algorithm: particle Gibbs with ancestor sampling (PGAS)
1. Initialize: set {\theta[0], x_{1:T}[0]} arbitrarily.
2. For m >= 1, iterate:
   (a) Draw \theta[m] ~ p(\theta | x_{1:T}[m-1], y_{1:T}).
   (b) Run CPF-AS(x_{1:T}[m-1]), targeting p(x_{1:T} | \theta[m], y_{1:T}).
   (c) Sample x_{1:T}[m] with P(x_{1:T}[m] = x^i_{1:T}) \propto w^i_T.

Toy example: stochastic volatility (I/II)

Consider the stochastic volatility model

    x_{t+1} = 0.9 x_t + w_t,    w_t ~ N(0, \theta),
    y_t = e_t exp(x_t / 2),     e_t ~ N(0, 1).

Let us study the ACF of the estimation error \theta - E[\theta | y_{1:T}].

[Figure: ACF of the estimation error versus lag for PG and PGAS, for several particle counts N; PGAS mixes well even with few particles.]

Toy example: stochastic volatility (II/II)

[Figure: plots of the update rate of x_t versus t for PG and PGAS, i.e. the proportion of iterations where x_t changes value, for several particle counts N. This provides another comparison of the mixing.]

Example: semiparametric Wiener model

A parametric LGSS model and a nonparametric static nonlinearity g(\cdot), with process noise v_t and measurement noise e_t:

    x_{t+1} = A x_t + B u_t + v_t,    v_t ~ N(0, Q),    with \Gamma = [A B],
    z_t = C x_t,
    y_t = g(z_t) + e_t,    e_t ~ N(0, R).
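For the stochastic volatility toy example above, step (a) of PGAS has a closed form if we place an inverse-Gamma prior on \theta (the prior and its hyperparameters are my own assumption for illustration; the slides do not specify them). The conditional is conjugate: p(\theta | x_{1:T}, y_{1:T}) = IG(a_0 + (T-1)/2, b_0 + (1/2) \sum_t (x_{t+1} - 0.9 x_t)^2).

```python
import numpy as np

rng = np.random.default_rng(2)

def draw_theta(x, a0=0.01, b0=0.01):
    """Draw theta ~ p(theta | x_{1:T}) for the stochastic volatility model,
    assuming an inverse-Gamma(a0, b0) prior on the state-noise variance."""
    resid = x[1:] - 0.9 * x[:-1]              # state-noise residuals w_t
    a_post = a0 + 0.5 * len(resid)
    b_post = b0 + 0.5 * np.sum(resid ** 2)
    # Inverse-Gamma sample via the reciprocal of a Gamma(a_post, rate=b_post)
    return 1.0 / rng.gamma(shape=a_post, scale=1.0 / b_post)
```

Note that y_{1:T} drops out of the conditional: given the full state trajectory, the residuals w_t carry all the information about \theta, which is exactly why data augmentation makes this step easy.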
Example: semiparametric Wiener model

Parameters: \theta = {A, B, Q, g(\cdot), r}. The Bayesian model is specified by the priors:
- Conjugate priors for \Gamma = [A B], Q and r: p(\Gamma, Q) is matrix-normal inverse-Wishart and p(r) is inverse-Wishart.
- A Gaussian process prior on g(\cdot): g(\cdot) ~ GP(z, k(z, z')).

Inference using PGAS with N = 5 particles. T = measurements. We ran 5 MCMC iterations and discarded 5 as burn-in.

[Figure: Bode diagram of the 4th-order linear system; estimated mean (dashed black), true (solid black) and 99% credibility intervals (blue).]

[Figure: static nonlinearity (non-monotonic); estimated mean (dashed black), true (black) and 99% credibility intervals (blue).]

Fredrik Lindsten, Thomas B. Schön and Michael I. Jordan. Bayesian semiparametric Wiener system identification. Automatica, 49(7):2053-2063, July 2013.

Some of our current research activities

Joint work with (alphabetical order): Christian A. Naesseth (Linköping University), John Aston (University of Cambridge), Alexandre Bouchard-Côté (University of British Columbia), Adam M. Johansen (University of Warwick), Bonnie Kirkpatrick (University of Miami), Fredrik Lindsten (University of Cambridge), Arno Solin (Aalto University), Andreas Svensson (Uppsala University), Simo Särkkä (Aalto University) and Johan Wågberg (Uppsala University).

Inference in probabilistic graphical models

Constructing an artificial sequence of intermediate target distributions for an SMC sampler is a powerful (and quite possibly underutilized) idea.

[Figure: a small probabilistic graphical model with latent variables x_1, ..., x_5 and observations y_1, y_2, y_3.]

Christian A. Naesseth, Fredrik Lindsten and Thomas B. Schön. Sequential Monte Carlo methods for graphical models. Advances in Neural Information Processing Systems (NIPS) 27, Montreal, Canada, December 2014.

Fredrik Lindsten, Adam M. Johansen, Christian A. Naesseth, Bonnie Kirkpatrick, Thomas B. Schön, John Aston and Alexandre Bouchard-Côté. Divide-and-Conquer with Sequential Monte Carlo. arXiv:1406.4993, June 2014.
SMC in high dimensions

The bootstrap PF suffers from weight collapse in high-dimensional settings. This degeneracy can be reduced by using so-called fully adapted proposals. We can mimic the efficient fully adapted proposals for arbitrary latent spaces and structures in high-dimensional models: we approximate the proposal distribution and use a nested coupling of multiple SMC samplers and backward simulators.

Christian A. Naesseth, Fredrik Lindsten and Thomas B. Schön. Nested sequential Monte Carlo. In Proceedings of the 32nd International Conference on Machine Learning (ICML), Lille, France, July 2015.

A nonparametric model: GP-SSM

Consider the Gaussian process SSM (GP-SSM):

    x_{t+1} = f(x_t) + w_t,    with f(x) ~ GP(0, \kappa_{\theta,f}(x, x')),
    y_t = g(x_t) + e_t,        with g(x) ~ GP(0, \kappa_{\theta,g}(x, x')).

The model functions f and g are assumed to be realizations from a Gaussian process prior. We identify this model by inferring the posterior distribution p(f, g, Q, R, \theta | y_{1:T}); the PGAS construction is used.

Andreas Svensson, Arno Solin, Simo Särkkä and Thomas B. Schön. Computationally efficient Bayesian learning of Gaussian process state space models. Submitted, available on arXiv, June 2015.
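The weight collapse mentioned above is easy to demonstrate with a single importance-sampling step (a Gaussian toy of my own choosing, not from the slides): with a prior proposal N(0, I_d) and a Gaussian likelihood, the effective sample size N_eff = 1 / \sum_i (w^i)^2 drops sharply as the dimension d grows.

```python
import numpy as np

rng = np.random.default_rng(3)

def ess(d, N=1000):
    """Effective sample size of one importance-sampling step in dimension d:
    proposal x ~ N(0, I_d), weights propto the likelihood N(y | x, I_d)."""
    x = rng.normal(size=(N, d))
    y = np.ones(d)                        # a fixed illustrative observation
    logw = -0.5 * np.sum((y - x) ** 2, axis=1)
    w = np.exp(logw - logw.max()); w /= w.sum()
    return 1.0 / np.sum(w ** 2)
```

With d = 1 a large fraction of the N particles carry weight, while already at d = 50 essentially a single particle dominates, which is the degeneracy that adapted and nested proposals aim to mitigate.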
The aim of these lectures

The aim of these lectures is to provide an introduction to the theory and application of sequential Monte Carlo (SMC) methods for nonlinear system identification (and state estimation) problems. When it comes to SMC we focus on the particle filter. After this course you should be able to derive your own SMC-based algorithms, allowing you to solve nonlinear system identification and state estimation problems. If you need inspiration, I have provided some exercises here: user.it.uu.se/~thosc2/courses.html

Conclusions

1. Provided two key strategies and four different approaches for system identification in nonlinear SSMs.
2. Showed how SMC can be used to implement these strategies.
3. (Hopefully) conveyed the intuition underlying SMC.
4. SMC is applicable to many problems, not just SSMs via the PF.

A manuscript is also available (just ask me for a draft): Thomas B. Schön and Fredrik Lindsten. Learning of dynamical systems: particle filters and Markov chain methods, 2015.

A PhD course on the topic is available here: user.it.uu.se/~thosc2/cids.html

There will be an invited session on SMC at SYSID in Beijing (China) in October. This is a fast-moving research area offering lots of opportunities!
More informationNotes on pseudo-marginal methods, variational Bayes and ABC
Notes on pseudo-marginal methods, variational Bayes and ABC Christian Andersson Naesseth October 3, 2016 The Pseudo-Marginal Framework Assume we are interested in sampling from the posterior distribution
More informationSampling Methods (11/30/04)
CS281A/Stat241A: Statistical Learning Theory Sampling Methods (11/30/04) Lecturer: Michael I. Jordan Scribe: Jaspal S. Sandhu 1 Gibbs Sampling Figure 1: Undirected and directed graphs, respectively, with
More informationSMC 2 : an efficient algorithm for sequential analysis of state-space models
SMC 2 : an efficient algorithm for sequential analysis of state-space models N. CHOPIN 1, P.E. JACOB 2, & O. PAPASPILIOPOULOS 3 1 ENSAE-CREST 2 CREST & Université Paris Dauphine, 3 Universitat Pompeu Fabra
More informationMarkov Chain Monte Carlo methods
Markov Chain Monte Carlo methods Tomas McKelvey and Lennart Svensson Signal Processing Group Department of Signals and Systems Chalmers University of Technology, Sweden November 26, 2012 Today s learning
More informationThe aim Part 3 2(58)
The aim Part (8 Part - onlinear state inference using sequential Monte Carlo The aim in part is to introduce the particle filter and the particle smoother. Division of Automatic Control Linköping University
More informationComputer Intensive Methods in Mathematical Statistics
Computer Intensive Methods in Mathematical Statistics Department of mathematics johawes@kth.se Lecture 5 Sequential Monte Carlo methods I 31 March 2017 Computer Intensive Methods (1) Plan of today s lecture
More informationChapter 4 Dynamic Bayesian Networks Fall Jin Gu, Michael Zhang
Chapter 4 Dynamic Bayesian Networks 2016 Fall Jin Gu, Michael Zhang Reviews: BN Representation Basic steps for BN representations Define variables Define the preliminary relations between variables Check
More informationInteracting Particle Markov Chain Monte Carlo
Technical report arxiv:62528v [statco] 6 Feb 26 Interacting Particle Markov Chain Monte Carlo Tom Rainforth*, Christian A Naesseth*, Fredrik Lindsten, Brooks Paige, Jan-Willem van de Meent, Arnaud Doucet
More informationan introduction to bayesian inference
with an application to network analysis http://jakehofman.com january 13, 2010 motivation would like models that: provide predictive and explanatory power are complex enough to describe observed phenomena
More informationCS242: Probabilistic Graphical Models Lecture 7B: Markov Chain Monte Carlo & Gibbs Sampling
CS242: Probabilistic Graphical Models Lecture 7B: Markov Chain Monte Carlo & Gibbs Sampling Professor Erik Sudderth Brown University Computer Science October 27, 2016 Some figures and materials courtesy
More informationLarge-scale Ordinal Collaborative Filtering
Large-scale Ordinal Collaborative Filtering Ulrich Paquet, Blaise Thomson, and Ole Winther Microsoft Research Cambridge, University of Cambridge, Technical University of Denmark ulripa@microsoft.com,brmt2@cam.ac.uk,owi@imm.dtu.dk
More informationMCMC and Gibbs Sampling. Kayhan Batmanghelich
MCMC and Gibbs Sampling Kayhan Batmanghelich 1 Approaches to inference l Exact inference algorithms l l l The elimination algorithm Message-passing algorithm (sum-product, belief propagation) The junction
More informationOptimal input design for nonlinear dynamical systems: a graph-theory approach
Optimal input design for nonlinear dynamical systems: a graph-theory approach Patricio E. Valenzuela Department of Automatic Control and ACCESS Linnaeus Centre KTH Royal Institute of Technology, Stockholm,
More informationSequential Monte Carlo and Particle Filtering. Frank Wood Gatsby, November 2007
Sequential Monte Carlo and Particle Filtering Frank Wood Gatsby, November 2007 Importance Sampling Recall: Let s say that we want to compute some expectation (integral) E p [f] = p(x)f(x)dx and we remember
More informationBayesian Calibration of Simulators with Structured Discretization Uncertainty
Bayesian Calibration of Simulators with Structured Discretization Uncertainty Oksana A. Chkrebtii Department of Statistics, The Ohio State University Joint work with Matthew T. Pratola (Statistics, The
More informationVariational Scoring of Graphical Model Structures
Variational Scoring of Graphical Model Structures Matthew J. Beal Work with Zoubin Ghahramani & Carl Rasmussen, Toronto. 15th September 2003 Overview Bayesian model selection Approximations using Variational
More informationGraphical model inference: Sequential Monte Carlo meets deterministic approximations
Graphical model inference: Sequential Monte Carlo meets deterministic approximations Fredrik Lindsten Department of Information Technology Uppsala University Uppsala, Sweden fredrik.lindsten@it.uu.se Jouni
More informationInferring biological dynamics Iterated filtering (IF)
Inferring biological dynamics 101 3. Iterated filtering (IF) IF originated in 2006 [6]. For plug-and-play likelihood-based inference on POMP models, there are not many alternatives. Directly estimating
More informationMCMC Sampling for Bayesian Inference using L1-type Priors
MÜNSTER MCMC Sampling for Bayesian Inference using L1-type Priors (what I do whenever the ill-posedness of EEG/MEG is just not frustrating enough!) AG Imaging Seminar Felix Lucka 26.06.2012 , MÜNSTER Sampling
More informationBlind Equalization via Particle Filtering
Blind Equalization via Particle Filtering Yuki Yoshida, Kazunori Hayashi, Hideaki Sakai Department of System Science, Graduate School of Informatics, Kyoto University Historical Remarks A sequential Monte
More informationHierarchical Bayesian approaches for robust inference in ARX models
Hierarchical Bayesian approaches for robust inference in ARX models Johan Dahlin, Fredrik Lindsten, Thomas Bo Schön and Adrian George Wills Linköping University Post Print N.B.: When citing this work,
More informationMCMC for big data. Geir Storvik. BigInsight lunch - May Geir Storvik MCMC for big data BigInsight lunch - May / 17
MCMC for big data Geir Storvik BigInsight lunch - May 2 2018 Geir Storvik MCMC for big data BigInsight lunch - May 2 2018 1 / 17 Outline Why ordinary MCMC is not scalable Different approaches for making
More informationPseudo-marginal MCMC methods for inference in latent variable models
Pseudo-marginal MCMC methods for inference in latent variable models Arnaud Doucet Department of Statistics, Oxford University Joint work with George Deligiannidis (Oxford) & Mike Pitt (Kings) MCQMC, 19/08/2016
More informationApproximate Bayesian Computation
Approximate Bayesian Computation Michael Gutmann https://sites.google.com/site/michaelgutmann University of Helsinki and Aalto University 1st December 2015 Content Two parts: 1. The basics of approximate
More informationLecture 7 and 8: Markov Chain Monte Carlo
Lecture 7 and 8: Markov Chain Monte Carlo 4F13: Machine Learning Zoubin Ghahramani and Carl Edward Rasmussen Department of Engineering University of Cambridge http://mlg.eng.cam.ac.uk/teaching/4f13/ Ghahramani
More informationContents lecture 6 2(17) Automatic Control III. Summary of lecture 5 (I/III) 3(17) Summary of lecture 5 (II/III) 4(17) H 2, H synthesis pros and cons:
Contents lecture 6 (7) Automatic Control III Lecture 6 Linearization and phase portraits. Summary of lecture 5 Thomas Schön Division of Systems and Control Department of Information Technology Uppsala
More informationLearning Gaussian Process Models from Uncertain Data
Learning Gaussian Process Models from Uncertain Data Patrick Dallaire, Camille Besse, and Brahim Chaib-draa DAMAS Laboratory, Computer Science & Software Engineering Department, Laval University, Canada
More informationarxiv: v1 [stat.co] 1 Jun 2015
arxiv:1506.00570v1 [stat.co] 1 Jun 2015 Towards automatic calibration of the number of state particles within the SMC 2 algorithm N. Chopin J. Ridgway M. Gerber O. Papaspiliopoulos CREST-ENSAE, Malakoff,
More informationComputational statistics
Computational statistics Markov Chain Monte Carlo methods Thierry Denœux March 2017 Thierry Denœux Computational statistics March 2017 1 / 71 Contents of this chapter When a target density f can be evaluated
More informationDensity Estimation. Seungjin Choi
Density Estimation Seungjin Choi Department of Computer Science and Engineering Pohang University of Science and Technology 77 Cheongam-ro, Nam-gu, Pohang 37673, Korea seungjin@postech.ac.kr http://mlg.postech.ac.kr/
More informationSequence labeling. Taking collective a set of interrelated instances x 1,, x T and jointly labeling them
HMM, MEMM and CRF 40-957 Special opics in Artificial Intelligence: Probabilistic Graphical Models Sharif University of echnology Soleymani Spring 2014 Sequence labeling aking collective a set of interrelated
More informationBayesian Inference and MCMC
Bayesian Inference and MCMC Aryan Arbabi Partly based on MCMC slides from CSC412 Fall 2018 1 / 18 Bayesian Inference - Motivation Consider we have a data set D = {x 1,..., x n }. E.g each x i can be the
More informationMCMC algorithms for fitting Bayesian models
MCMC algorithms for fitting Bayesian models p. 1/1 MCMC algorithms for fitting Bayesian models Sudipto Banerjee sudiptob@biostat.umn.edu University of Minnesota MCMC algorithms for fitting Bayesian models
More informationKernel adaptive Sequential Monte Carlo
Kernel adaptive Sequential Monte Carlo Ingmar Schuster (Paris Dauphine) Heiko Strathmann (University College London) Brooks Paige (Oxford) Dino Sejdinovic (Oxford) December 7, 2015 1 / 36 Section 1 Outline
More informationPart 1: Expectation Propagation
Chalmers Machine Learning Summer School Approximate message passing and biomedicine Part 1: Expectation Propagation Tom Heskes Machine Learning Group, Institute for Computing and Information Sciences Radboud
More informationBayesian Inference for Dirichlet-Multinomials
Bayesian Inference for Dirichlet-Multinomials Mark Johnson Macquarie University Sydney, Australia MLSS Summer School 1 / 50 Random variables and distributed according to notation A probability distribution
More informationLecture 4: State Estimation in Hidden Markov Models (cont.)
EE378A Statistical Signal Processing Lecture 4-04/13/2017 Lecture 4: State Estimation in Hidden Markov Models (cont.) Lecturer: Tsachy Weissman Scribe: David Wugofski In this lecture we build on previous
More informationFitting Narrow Emission Lines in X-ray Spectra
Outline Fitting Narrow Emission Lines in X-ray Spectra Taeyoung Park Department of Statistics, University of Pittsburgh October 11, 2007 Outline of Presentation Outline This talk has three components:
More informationMonte Carlo methods for sampling-based Stochastic Optimization
Monte Carlo methods for sampling-based Stochastic Optimization Gersende FORT LTCI CNRS & Telecom ParisTech Paris, France Joint works with B. Jourdain, T. Lelièvre, G. Stoltz from ENPC and E. Kuhn from
More informationIdentification using Convexification and Recursion. This Page will be Replaced before Printing
Identification using Convexification and Recursion This Page will be Replaced before Printing 3 Abstract System identification studies how to construct mathematical models for dynamical systems from the
More informationLikelihood-free MCMC
Bayesian inference for stable distributions with applications in finance Department of Mathematics University of Leicester September 2, 2011 MSc project final presentation Outline 1 2 3 4 Classical Monte
More information