ML Estimation of Process Noise Variance in Dynamic Systems
ML Estimation of Process Noise Variance in Dynamic Systems. Patrik Axelsson, Umut Orguner, Fredrik Gustafsson and Mikael Norrlöf. Division of Automatic Control, Department of Electrical Engineering, Linköping University, Sweden.
Outline
1. Problem Formulation
2. Estimation of process noise covariance
3. Alternative Methods
4. Simulation Results
Background
The elasticity (in joints, arms, ...) of industrial robots has increased over the last years, so the motion control must be improved for these new robots. One solution is to estimate the tool position and use that estimate in the feedback loop. The tool position can be estimated with e.g. an EKF, which uses knowledge about the process noise. The accuracy is very important: < 1 mm. (ABB IRB4600)
Problem Formulation
Serial robot with two degrees of freedom, with nonlinear stiffness and nonlinear friction.
(Figure: sketch of the two-link robot with joint angles q_{a1}, q_{a2} and accelerometer position \rho.)

  M(q)\ddot{q} + C(q, \dot{q}) + G(q) + T(q) + D\dot{q} + F(\dot{q}) = u

The model structure is common for mechanical systems derived by Newton's second law of motion or Lagrange's equation.
Accelerometer measurement:

  \ddot{\rho}_s(q_a, \dot{q}_a, \ddot{q}_a) = R_b^s(q_a) \left( \ddot{\rho}_b(q_a, \dot{q}_a, \ddot{q}_a) + g^b \right) + \delta_s
Problem Formulation (cont'd)
States:

  x = (q_a^T \; q_m^T \; \dot{q}_a^T \; \dot{q}_m^T)^T \in \mathbb{R}^8

Discrete-time state-space model (continuous to discrete using Euler forward):

  x_{k+1} = F_1(x_k, u_k) + F_2(x_k) v_k,  v_k \sim \mathcal{N}(0, Q)
  y_k = \begin{pmatrix} q_{m,k} \\ \ddot{\rho}_{s,k} \end{pmatrix} + e_k = h(x_k, u_k) + e_k,  e_k \sim \mathcal{N}(0, R)

where

  F_2(x_k) = \begin{pmatrix} 0 \\ \bar{F}_2(x_k) \end{pmatrix}

All model parameters are known except for the covariance matrix Q.
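As a rough sketch of the Euler-forward discretisation used above, here is a minimal Python example; the dynamics, input and sample time are hypothetical toy values (a damped point mass, not the robot model):

```python
import numpy as np

def euler_forward(f, x, u, Ts):
    """One Euler-forward step: x_{k+1} = x_k + Ts * f(x_k, u_k),
    where dx/dt = f(x, u) is the continuous-time model."""
    return x + Ts * f(x, u)

# Toy dynamics (hypothetical, not the robot model): a damped point mass
# with state (position, velocity) and damping coefficient 0.5.
def f(x, u):
    pos, vel = x
    return np.array([vel, u - 0.5 * vel])

x = np.array([0.0, 0.0])
for _ in range(100):              # simulate 1 s with Ts = 0.01
    x = euler_forward(f, x, u=1.0, Ts=0.01)
```

The same pattern gives the F_1 term of the slide's model when f is the mechanical model solved for the accelerations.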
The Expectation Maximisation (EM) Algorithm
Maximum likelihood (ML) method:

  \hat{\theta}_{ML} = \arg\max_{\theta \in \Theta} \log p_\theta(y_{1:N})

The solution is hard to find directly, which motivates the Expectation Maximisation (EM) algorithm:
1. Select an initial value \theta_0 and set l = 0.
2. Expectation step (E-step): calculate \Gamma(\theta; \theta_l) = \mathrm{E}_{\theta_l}[\log p_\theta(y_{1:N}, x_{1:N}) \mid y_{1:N}].
3. Maximisation step (M-step): compute \theta_{l+1} = \arg\max_{\theta \in \Theta} \Gamma(\theta; \theta_l).
4. If converged, stop. If not, set l = l + 1 and go to step 2.
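The four steps above can be sketched as a generic loop. The toy E- and M-steps below (estimating the variance of Gaussian data) are hypothetical stand-ins, only meant to illustrate the iteration structure:

```python
import numpy as np

def em(theta0, e_step, m_step, max_iter=100, tol=1e-6):
    """Generic EM loop: alternate E- and M-steps until the
    parameter estimate stops moving (or max_iter is reached)."""
    theta = theta0
    for _ in range(max_iter):
        stats = e_step(theta)        # E-step: expected sufficient statistics
        theta_next = m_step(stats)   # M-step: maximise Gamma(theta; theta_l)
        if np.linalg.norm(theta_next - theta) < tol:
            return theta_next
        theta = theta_next
    return theta

# Toy example: estimate the variance of zero-mean Gaussian data.
# Here the "E-step" is trivial; it just illustrates the loop structure.
rng = np.random.default_rng(0)
y = rng.normal(0.0, 2.0, size=10_000)
e_step = lambda theta: np.array([np.mean(y**2)])
m_step = lambda stats: stats
var_hat = em(np.array([1.0]), e_step, m_step)
```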
Estimation of process noise covariance
Estimate Q with the Expectation Maximisation (EM) algorithm. Main ideas:
- Calculate \Gamma(Q; Q_l) using linearisation.
- Use the smoothed states from an extended Kalman smoother (EKS) as an approximation when necessary. This is possible since the smoothed densities are peaky when the SNR is high, which is the case here.
Estimation of process noise covariance (cont'd)
Conditional densities:

  p(x_{k+1} \mid x_k) = \mathcal{N}\left(x_{k+1}; F_{1,k}, F_{2,k} Q F_{2,k}^T\right)
  p(y_k \mid x_k) = \mathcal{N}(y_k; h_k, R)

The joint log-likelihood function:

  L_Q(y_{1:N}, x_{1:N}) = \log p_Q(y_{1:N}, x_{1:N})
    = \text{const} - \frac{1}{2}\sum_{i=2}^{N} (x_i - F_{1,i-1})^T \left(F_{2,i-1} Q F_{2,i-1}^T\right)^\dagger (x_i - F_{1,i-1})
      - \frac{1}{2}\sum_{i=2}^{N} \sum_{\lambda_j \neq 0} \log \lambda_j\left(F_{2,i-1} Q F_{2,i-1}^T\right)

Since F_{2,i-1} Q F_{2,i-1}^T is singular, the quadratic form uses the pseudo-inverse and the log-determinant term runs over the nonzero eigenvalues \lambda_j only; with the structure F_2 = (0^T \; \bar{F}_2^T)^T this term equals \log\det\left(\bar{F}_{2,i-1} Q \bar{F}_{2,i-1}^T\right).
Estimation of process noise covariance (cont'd)
Expectation of the joint log-likelihood function:

  \Gamma(Q; Q_l) = \mathrm{E}_{Q_l}[L_Q(y_{1:N}, x_{1:N}) \mid y_{1:N}]
    = \text{const} - \frac{1}{2}\sum_{i=2}^{N} \mathrm{E}_{Q_l}\left[\log\det\left(\bar{F}_{2,i-1} Q \bar{F}_{2,i-1}^T\right) \mid y_{1:N}\right]
      - \frac{1}{2}\sum_{i=2}^{N} \mathrm{tr}\, \mathrm{E}_{Q_l}\left[\left(F_{2,i-1} Q F_{2,i-1}^T\right)^\dagger (x_i - F_{1,i-1})(x_i - F_{1,i-1})^T \mid y_{1:N}\right]
Estimation of process noise covariance (cont'd)
Calculation of the first expectation:

  \mathrm{E}_{Q_l}\left[\log\det\left(\bar{F}_{2,i-1} Q \bar{F}_{2,i-1}^T\right) \mid y_{1:N}\right]
    = \int \log\det\left(\bar{F}_2(x_{i-1}) Q \bar{F}_2^T(x_{i-1})\right) p_{Q_l}(x_{i-1} \mid y_{1:N})\, \mathrm{d}x_{i-1}

This integral cannot be solved in closed form. Approximation: use the smoothed states,

  \mathrm{E}_{Q_l}\left[\log\det\left(\bar{F}_{2,i-1} Q \bar{F}_{2,i-1}^T\right) \mid y_{1:N}\right]
    \approx \log\det\left(\bar{F}_2(\hat{x}^s_{i-1|N}) Q \bar{F}_2^T(\hat{x}^s_{i-1|N})\right)
Estimation of process noise covariance (cont'd)
Calculation of the second expectation:

  \mathrm{E}_{Q_l}\left[\left(F_{2,i-1} Q F_{2,i-1}^T\right)^\dagger (x_i - F_{1,i-1})(x_i - F_{1,i-1})^T \mid y_{1:N}\right]
    = \int\!\!\int \left(F_2(x_{i-1}) Q F_2^T(x_{i-1})\right)^\dagger (x_i - F_{1,i-1})(x_i - F_{1,i-1})^T p_{Q_l}(x_i, x_{i-1} \mid y_{1:N})\, \mathrm{d}x_i\, \mathrm{d}x_{i-1}

Use the smoothed states again:

  \mathrm{E}_{Q_l}\left[\cdot \mid y_{1:N}\right]
    \approx \left(F_2(\hat{x}^s_{i-1|N}) Q F_2^T(\hat{x}^s_{i-1|N})\right)^\dagger \int\!\!\int (x_i - F_{1,i-1})(x_i - F_{1,i-1})^T p_{Q_l}(x_i, x_{i-1} \mid y_{1:N})\, \mathrm{d}x_i\, \mathrm{d}x_{i-1}
Estimation of process noise covariance (cont'd)
A first-order Taylor approximation gives

  M = \int\!\!\int (x_i - F_{1,i-1})(x_i - F_{1,i-1})^T p_{Q_l}(x_i, x_{i-1} \mid y_{1:N})\, \mathrm{d}x_i\, \mathrm{d}x_{i-1}
    \approx \begin{pmatrix} J_1 & -I \end{pmatrix} P^{\xi,s}_{i|N} \begin{pmatrix} J_1 & -I \end{pmatrix}^T
      + \left(\hat{x}^s_{i|N} - F_1(\hat{x}^s_{i-1|N})\right)\left(\hat{x}^s_{i|N} - F_1(\hat{x}^s_{i-1|N})\right)^T

where

  J_1 = \left.\frac{\partial F_1(x)}{\partial x}\right|_{x = \hat{x}^s_{i-1|N}}

and P^{\xi,s}_{i|N} is the smoothed state covariance for the augmented state vector \xi = (x_{i-1}^T \; x_i^T)^T.
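A minimal numerical sketch of this M term, assuming J_1, the joint smoothed covariance and the smoothed means are already available (the function and variable names are hypothetical):

```python
import numpy as np

def M_term(J1, P_xi, xs_i, xs_im1, F1):
    """First-order Taylor approximation of
    M = E[(x_i - F_1(x_{i-1}))(x_i - F_1(x_{i-1}))^T | y_{1:N}]:
    the block row (J1  -I) acting on the joint smoothed covariance of
    xi = (x_{i-1}^T, x_i^T)^T, plus the outer product of the
    smoothed-mean residual."""
    n = J1.shape[0]
    J = np.hstack([J1, -np.eye(n)])        # block row (J1  -I)
    r = xs_i - F1(xs_im1)                  # smoothed-mean residual
    return J @ P_xi @ J.T + np.outer(r, r)
```

For the slide's model, J1 would be the EKF/EKS Jacobian of F_1 and P_xi the 2n x 2n joint smoothed covariance returned by the smoother.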
Estimation of process noise covariance (cont'd)
Finally,

  \Gamma(Q; Q_l) = \text{const} - \frac{N-1}{2}\log\det Q - \frac{1}{2}\,\mathrm{tr}\left(Q^{-1} W\right)

where

  W = \sum_{i=2}^{N} F_2^\dagger(\hat{x}^s_{i-1|N})\, M \left(F_2^\dagger(\hat{x}^s_{i-1|N})\right)^T

Maximisation of \Gamma(Q; Q_l) w.r.t. Q gives

  Q_{l+1} = \frac{1}{N-1}\sum_{i=2}^{N} F_2^\dagger(\hat{x}^s_{i-1|N})\, M \left(F_2^\dagger(\hat{x}^s_{i-1|N})\right)^T = \frac{W}{N-1}
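The closed-form M-step can be sketched as follows; the helper name and the toy matrices are hypothetical, and the check only verifies that the update recovers a known Q when each M_i is exactly consistent with it:

```python
import numpy as np

def update_Q(F2_list, M_list):
    """Closed-form M-step: Q_{l+1} = 1/(N-1) * sum_i F2_i^+ M_i (F2_i^+)^T,
    with F2_i evaluated at the smoothed state x^s_{i-1|N}.
    F2_list : the N-1 tall, full-column-rank F2 matrices
    M_list  : the N-1 smoothed residual second-moment matrices M_i"""
    terms = []
    for F2, M in zip(F2_list, M_list):
        F2_pinv = np.linalg.pinv(F2)          # Moore-Penrose pseudo-inverse
        terms.append(F2_pinv @ M @ F2_pinv.T)
    return sum(terms) / len(terms)

# Toy check: if M_i = F2 Q_true F2^T exactly, the update recovers Q_true.
Q_true = np.diag([0.1, 0.4])
F2 = np.vstack([np.zeros((2, 2)), np.eye(2)])  # structure (0; F2_bar)
M = F2 @ Q_true @ F2.T
Q_hat = update_Q([F2, F2], [M, M])
```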
Alternative Methods
Alt. 1: Minimise the estimation error, given by e.g. an EKF, w.r.t. Q, where Q is parametrised as a diagonal matrix.
Alt. 2: Calculate v_k from the state-space model:
1. Select an initial value Q_0 and set l = 0.
2. Run the EKS with Q_l.
3. Calculate the noise according to

  v_k = F_2^\dagger(\hat{x}^s_{k|N})\left(\hat{x}^s_{k+1|N} - F_1(\hat{x}^s_{k|N}, u_k)\right)

4. Let Q_{l+1} be the covariance matrix for v_k according to

  Q_{l+1} = \frac{1}{N}\sum_{k=1}^{N} v_k v_k^T

5. If converged, stop. If not, set l = l + 1 and go to step 2.
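Steps 3 and 4 of Alt. 2 can be sketched as below. The EKS itself is omitted, and in this toy linear, fully observed example the true states stand in for the smoothed states; all names and values are hypothetical:

```python
import numpy as np

def residual_covariance(xs, F1, F2_pinv, u):
    """Steps 3-4 of Alt. 2: recover v_k from the (smoothed) state
    trajectory and average the outer products into Q_{l+1}.
    xs : state trajectory, shape (N+1, n)"""
    vs = [F2_pinv @ (xs[k + 1] - F1(xs[k], u[k])) for k in range(len(u))]
    V = np.array(vs)
    return V.T @ V / len(V)     # Q_{l+1} = (1/N) sum_k v_k v_k^T

# Toy linear check: x_{k+1} = 0.9 x_k + u_k + v_k with known noise variance.
rng = np.random.default_rng(1)
N, q_true = 5000, 0.25
u = np.zeros((N, 1))
x = np.zeros((N + 1, 1))
for k in range(N):
    x[k + 1] = 0.9 * x[k] + u[k] + rng.normal(0.0, np.sqrt(q_true))
F1 = lambda xk, uk: 0.9 * xk + uk
Q_hat = residual_covariance(x, F1, np.eye(1), u)
```

In the actual algorithm, xs would be the EKS output under the current Q_l and F2_pinv the pseudo-inverse of F_2 evaluated at the smoothed state.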
Simulation Results
Monte Carlo simulations on one simulated path with different initial values. There are no true values for Q, so each estimated Q is instead used in an EKF and the estimation error (RMSE) for the path is calculated.
The EM algorithm converges to the same RMSE for all initial values, and the same holds for Alt. 2, whereas Alt. 1 ends up in different errors depending on the initial value. The EM algorithm gives the lowest RMSE (2-norm of the RMSE).
(Table: maximum and minimum RMSE for EM, Alt. 1 and Alt. 2; the numerical values did not survive transcription.)
Simulation Results (cont'd)
(Figure: path error [mm] over time [s] for the EM algorithm, Alt. 1 and Alt. 2.)
Presented in Milano, Italy, August 28 - September 2, 2011.
More informationOverfitting, Bias / Variance Analysis
Overfitting, Bias / Variance Analysis Professor Ameet Talwalkar Professor Ameet Talwalkar CS260 Machine Learning Algorithms February 8, 207 / 40 Outline Administration 2 Review of last lecture 3 Basic
More informationSIMULTANEOUS STATE AND PARAMETER ESTIMATION USING KALMAN FILTERS
ECE5550: Applied Kalman Filtering 9 1 SIMULTANEOUS STATE AND PARAMETER ESTIMATION USING KALMAN FILTERS 9.1: Parameters versus states Until now, we have assumed that the state-space model of the system
More informationCorner. Corners are the intersections of two edges of sufficiently different orientations.
2D Image Features Two dimensional image features are interesting local structures. They include junctions of different types like Y, T, X, and L. Much of the work on 2D features focuses on junction L,
More informationMMSE DECODING FOR ANALOG JOINT SOURCE CHANNEL CODING USING MONTE CARLO IMPORTANCE SAMPLING
MMSE DECODING FOR ANALOG JOINT SOURCE CHANNEL CODING USING MONTE CARLO IMPORTANCE SAMPLING Yichuan Hu (), Javier Garcia-Frias () () Dept. of Elec. and Comp. Engineering University of Delaware Newark, DE
More informationWeek 3: The EM algorithm
Week 3: The EM algorithm Maneesh Sahani maneesh@gatsby.ucl.ac.uk Gatsby Computational Neuroscience Unit University College London Term 1, Autumn 2005 Mixtures of Gaussians Data: Y = {y 1... y N } Latent
More informationIntroduction to Probabilistic Graphical Models: Exercises
Introduction to Probabilistic Graphical Models: Exercises Cédric Archambeau Xerox Research Centre Europe cedric.archambeau@xrce.xerox.com Pascal Bootcamp Marseille, France, July 2010 Exercise 1: basics
More informationTrajectory Tracking Control of a Very Flexible Robot Using a Feedback Linearization Controller and a Nonlinear Observer
Trajectory Tracking Control of a Very Flexible Robot Using a Feedback Linearization Controller and a Nonlinear Observer Fatemeh Ansarieshlaghi and Peter Eberhard Institute of Engineering and Computational
More informationStatistical Estimation
Statistical Estimation Use data and a model. The plug-in estimators are based on the simple principle of applying the defining functional to the ECDF. Other methods of estimation: minimize residuals from
More informationLatent Variable Models and EM Algorithm
SC4/SM8 Advanced Topics in Statistical Machine Learning Latent Variable Models and EM Algorithm Dino Sejdinovic Department of Statistics Oxford Slides and other materials available at: http://www.stats.ox.ac.uk/~sejdinov/atsml/
More informationMarkov Chains and Hidden Markov Models
Chapter 1 Markov Chains and Hidden Markov Models In this chapter, we will introduce the concept of Markov chains, and show how Markov chains can be used to model signals using structures such as hidden
More informationComputing Optimized Nonlinear Sliding Surfaces
Computing Optimized Nonlinear Sliding Surfaces Azad Ghaffari and Mohammad Javad Yazdanpanah Abstract In this paper, we have concentrated on real systems consisting of structural uncertainties and affected
More informationNeural Networks Lecture 10: Fault Detection and Isolation (FDI) Using Neural Networks
Neural Networks Lecture 10: Fault Detection and Isolation (FDI) Using Neural Networks H.A. Talebi Farzaneh Abdollahi Department of Electrical Engineering Amirkabir University of Technology Winter 2011.
More informationDynamic System Identification using HDMR-Bayesian Technique
Dynamic System Identification using HDMR-Bayesian Technique *Shereena O A 1) and Dr. B N Rao 2) 1), 2) Department of Civil Engineering, IIT Madras, Chennai 600036, Tamil Nadu, India 1) ce14d020@smail.iitm.ac.in
More information