Bearings-Only Tracking in Modified Polar Coordinate System: Initialization of the Particle Filter and Posterior Cramér-Rao Bound


1 Bearings-Only Tracking in Modified Polar Coordinate System: Initialization of the Particle Filter and Posterior Cramér-Rao Bound. Thomas Bréhard (IRISA/CNRS), Jean-Pierre Le Cadre (IRISA/CNRS). Journée AS 67 Méthodes Particulaires, Applications du Filtrage Particulaire, ENST, 3 December 2003.

2 Summary
1. What is the Bearings-Only Tracking problem?
2. Solving using a particle filtering algorithm.
3. From cartesian to modified polar coordinate system.
4. Initialization of the particle filtering algorithm.
5. Posterior Cramér-Rao bound.
6. Conclusion.

3 1. What is the Bearings-Only Tracking problem? (1/2)
FIG. 1: Trajectories of the observer (pink) and the target (blue), and simulated bearing measurements.
X(t) = (X_1(t), X_2(t), X_3(t), X_4(t))^T = (v_x(t), v_y(t), r_x(t), r_y(t))^T,   Z_t ≈ tan^{-1}(r_y(t)/r_x(t)).
X_t is the target state at time t, composed of the relative velocity and position of the target in the x-y plane; Z_t is the bearing measurement received at time t.
Problem: estimate X_t using {Z_1, ..., Z_t}.

4 1. What is the Bearings-Only Tracking problem? (2/2)
The stochastic system:
X_{t+1} = A X_t + H U_t + W_t,
Z_t = G(X_t) + V_t,
where:
- X_t is the target state at time t, composed of the relative velocity and position of the target in the x-y plane,
- Z_t is the bearing measurement received at time t,
- U_t is the known difference between the observer velocity at times t+1 and t,
- V_t has a centered normal distribution with known variance σ_v,
- W_t has a centered normal distribution with known covariance matrix Q.
This is a non-linear filtering problem, which can be solved using a particle filtering algorithm!
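
To make the state-space model concrete, here is a minimal simulation sketch in Python. The sampling period dt, the matrices A, H, Q and the bearing convention G(X) = tan^{-1}(r_y/r_x) are illustrative assumptions for a constant-velocity discretization, not the exact values of the talk.

```python
import numpy as np

dt = 4.0  # sampling period in seconds (illustrative choice, not from the talk)

# State X = (v_x, v_y, r_x, r_y): relative velocity and relative position.
A = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [dt,  0.0, 1.0, 0.0],
              [0.0, dt,  0.0, 1.0]])

# U_t is the known observer velocity increment between t and t+1; one common
# convention (assumed here) is that it enters the relative state negatively.
H = np.array([[-1.0, 0.0],
              [0.0, -1.0],
              [0.0,  0.0],
              [0.0,  0.0]])

Q = np.diag([1e-4, 1e-4, 1e-2, 1e-2])  # illustrative process-noise covariance
sigma_v = 0.05                          # bearing noise std (rad), as in the scenario

def G(X):
    """Bearing of the target: arctan of the relative position components."""
    return np.arctan2(X[3], X[2])

def step(X, U, rng):
    """State equation X_{t+1} = A X_t + H U_t + W_t, with W_t ~ N(0, Q)."""
    return A @ X + H @ U + rng.multivariate_normal(np.zeros(4), Q)

def measure(X, rng):
    """Measurement equation Z_t = G(X_t) + V_t, with V_t ~ N(0, sigma_v^2)."""
    return G(X) + sigma_v * rng.standard_normal()
```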

5 2. Solving using a particle filtering algorithm...
At each time step:
1. Propagate the set of particles using the state equation.
2. Weight each particle using the measurement equation.
3. Resample.
Reference: Doucet et al. (2001).
Problem: the particles must be properly initialized!
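
A minimal bootstrap-filter cycle implementing the three operations above, reusing step(), G() and sigma_v from the previous sketch (all assumptions, not the authors' code); the resampling threshold of 0.5 echoes the parameter quoted later in the talk.

```python
import numpy as np

def particle_filter_step(particles, weights, Z, U, rng):
    """One bootstrap-filter cycle: propagate, weight, resample."""
    N = len(particles)

    # 1. Propagate every particle through the state equation.
    particles = np.array([step(x, U, rng) for x in particles])

    # 2. Weight each particle with the bearing likelihood N(Z; G(x), sigma_v^2),
    #    wrapping the bearing difference to (-pi, pi].
    innov = np.array([np.angle(np.exp(1j * (Z - G(x)))) for x in particles])
    weights = weights * np.exp(-0.5 * (innov / sigma_v) ** 2) + 1e-300
    weights = weights / weights.sum()

    # 3. Resample (multinomial) when the effective sample size drops below N/2.
    if 1.0 / np.sum(weights ** 2) < 0.5 * N:
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)
    return particles, weights
```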

6 3. From cartesian to modified polar coordinate system (1/3)
There is an unobservability problem hidden in the cartesian formulation of the system:
X_{t+1} = A X_t + H U_t + W_t,
Z_t = G(X_t) + V_t.
Problem: the range is unobservable until the observer has maneuvered.
Solution: a coordinate system better suited to the problem, the modified polar coordinate system.
Reference: Aidala and Hammel (1983).

7 3. From cartesian to modified polar coordinate system (2/3)
The modified polar coordinates. We can show that
Y(t) = (Y_1(t), Y_2(t), Y_3(t), Y_4(t))^T = (β̇(t), ṙ(t)/r(t), β(t), 1/r(t))^T.
Y_4(t) = 1/r(t) is unobservable until the observer has maneuvered, whereas Y^r(t) = (Y_1(t), Y_2(t), Y_3(t))^T = (β̇(t), ṙ(t)/r(t), β(t))^T is always observable.
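
For reference, the change of coordinates from the relative cartesian state to the modified polar state can be sketched as follows; the bearing convention β = arctan2(r_y, r_x) is an assumption carried over from the earlier sketch.

```python
import numpy as np

def cartesian_to_mpc(X):
    """Convert the relative cartesian state X = (v_x, v_y, r_x, r_y) into
    modified polar coordinates Y = (beta_dot, r_dot/r, beta, 1/r)."""
    vx, vy, rx, ry = X
    r2 = rx**2 + ry**2
    beta = np.arctan2(ry, rx)
    beta_dot = (rx * vy - ry * vx) / r2       # d(beta)/dt
    rdot_over_r = (rx * vx + ry * vy) / r2    # (dr/dt)/r
    return np.array([beta_dot, rdot_over_r, beta, 1.0 / np.sqrt(r2)])
```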

8 3. From cartesian to modified polar coordinate system (3/3)
Before the observer maneuvers, the stochastic system in the modified polar coordinate system is
Y^r_{t+1} = F(Y^r_t, W_t),
Y_4(t+1) = H(Y_t, W_t),
Z_t = Y_3(t) + V_t,
where W_t = Y_4(t) w_t and w_t has a centered normal distribution with known covariance matrix Q.
An interesting model: this is a non-linear filtering problem whose state-noise covariance is unknown (Y_4(t) is unknown!).

9 4. Initialization of the particle filtering algorithm (1/7)
Two key ideas:
1. We can prove that if the target has a deterministic trajectory, then for all k,
Z(t_0 + k) = Y_3(t_0) + tan^{-1}( k δt Y_1(t_0) / (1 + k δt Y_2(t_0)) ) + V_k.
Estimating the observable components Y^r_{t_0} from the set of measurements {Z_{t_0}, ..., Z_{t_0+K}} is then just an optimization problem.
2. We only assume prior information on the unobservable component Y_4(t_0):
1/R_max ≤ Y_4(t_0) ≤ 1/R_min.
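
A sketch of the resulting optimization problem: fit (Y_1(t_0), Y_2(t_0), Y_3(t_0)) to the bearings collected before the maneuver using the deterministic-trajectory model above. A generic nonlinear least-squares solver stands in for the Gauss-Newton routine of the talk (described on the next slide), and the crude starting point stands in for its linear-regression initialization; both substitutions are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_observable(Z, delta_t):
    """Estimate (Y_1(t0), Y_2(t0), Y_3(t0)) from the bearings
    Z = [Z(t0), ..., Z(t0+K)] collected before any maneuver, using
    Z(t0+k) = Y_3(t0) + arctan(k*dt*Y_1(t0) / (1 + k*dt*Y_2(t0))) + V_k."""
    k = np.arange(len(Z))

    def residuals(theta):
        y1, y2, y3 = theta
        pred = y3 + np.arctan(k * delta_t * y1 / (1.0 + k * delta_t * y2))
        return np.angle(np.exp(1j * (Z - pred)))  # wrapped bearing residuals

    theta0 = np.array([0.0, 0.0, Z[0]])           # crude starting point
    sol = least_squares(residuals, theta0)

    # Approximate covariance (confidence area) from the Jacobian at the optimum.
    J = sol.jac
    cov = np.linalg.inv(J.T @ J) * np.var(sol.fun)
    return sol.x, cov
```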

10 4. Initialization of the particle filtering algorithm (2/7)
Initialization of the particle filtering algorithm: wait until time t_0 + K; particle i is then initialized by
(Y^(i)_1(t_0), Y^(i)_2(t_0), Y^(i)_3(t_0)) drawn in CA(Ŷ^r_{t_0}),   1/Y^(i)_4(t_0) ~ U[R_min, R_max],
where:
- Ŷ^r_{t_0} is computed using a Gauss-Newton optimization algorithm (initialized by linear regression),
- CA(Ŷ^r_{t_0}) is the confidence area of Ŷ^r_{t_0} (approximated by a hyperellipsoid).
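
A sketch of this initialization in Python. The confidence area CA(Ŷ^r_{t_0}) is approximated here by a Gaussian built from the covariance returned by the fitting sketch above, a simplification of the hyperellipsoid; the uniform draw of the range on [R_min, R_max] follows the slide.

```python
import numpy as np

def init_particles(y_r_hat, cov, r_min, r_max, N, rng):
    """Initialize N particles at time t0 + K in modified polar coordinates.
    Observable part (Y1, Y2, Y3): drawn around the estimate y_r_hat,
    the confidence area being approximated by N(y_r_hat, cov).
    Unobservable part: range r ~ U[r_min, r_max] and Y4 = 1/r."""
    obs = rng.multivariate_normal(y_r_hat, cov, size=N)   # (N, 3)
    r = rng.uniform(r_min, r_max, size=N)                 # uniform prior on range
    particles = np.hstack([obs, (1.0 / r)[:, None]])      # (N, 4): Y1, Y2, Y3, Y4
    weights = np.full(N, 1.0 / N)
    return particles, weights
```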

11 4. Initialization of the particle filtering algorithm (3/7)
Two important points:
1. Before the observer maneuvers, the observable components (Y^(i)_1(t), Y^(i)_2(t), Y^(i)_3(t)) and the unobservable component Y^(i)_4(t) of the particles must be resampled independently! (See the sketch below.)
2. How can the initialization time K be fixed? The particle filtering algorithm is initialized as soon as the volume of the confidence area for Ŷ^r_{t_0} is small enough to be filled by N particles.
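
One plausible reading of the first point, sketched in Python: draw the resampling indices for the observable block (Y_1, Y_2, Y_3) and for Y_4 independently, so that resampling does not artificially tie the unobservable component to the observable one. This is an interpretation, not the authors' exact scheme.

```python
import numpy as np

def resample_blocks_independently(particles, weights, rng):
    """Resample the observable block (Y1, Y2, Y3) and the unobservable
    component Y4 with independently drawn indices (multinomial scheme)."""
    N = len(particles)
    idx_obs = rng.choice(N, size=N, p=weights)  # indices for (Y1, Y2, Y3)
    idx_y4 = rng.choice(N, size=N, p=weights)   # independent indices for Y4
    new = np.empty_like(particles)
    new[:, :3] = particles[idx_obs, :3]
    new[:, 3] = particles[idx_y4, 3]
    return new, np.full(N, 1.0 / N)
```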

12 4. Initialization of the particle filtering algorithm (4/7)
Simulation scenario:
X^target_0 = (10 m/s, 2 m/s, 0 m, … m),   X^obs_0 = (6 m/s, 3 m/s, … m, 0 m).
The observer follows a leg-by-leg trajectory. Its velocity vector is constant on each leg and is modified at the two following instants:
(V^obs_x(2100), V^obs_y(2100)) = (10 m/s, 2 m/s),   (V^obs_x(3900), V^obs_y(3900)) = (10 m/s, 2 m/s).
Figure: trajectories of the observer (pink) and the target (blue), marked at t = 1800 s, t = 3600 s and t = 5400 s.

13 4. Initialization of the particle filtering algorithm (5/7)
Simulation scenario: simulated bearing measurements; the measurement standard deviation is 0.05 rad (about 3 deg).
Parameters of the particle filtering algorithm:
- number of particles: 10000,
- sampling threshold: 0.5,
- the single assumption: R_min = 1000 m and R_max = … m.

14 4. Initialization of the particle filtering algorithm (6/7)
Simulation results in modified polar coordinates.
Figure: the four components Y_1(t), Y_2(t), Y_3(t), Y_4(t); true value (blue), estimates (red), confidence (green).

15 4. Initialization of the particle filtering algorithm (7/7)
Simulation results in the cartesian system.
Figure: trajectory of the observer (pink), trajectory of the true target (blue), particles (red), confidence area for the estimate (green), at t = 1800 s, 3600 s and 5400 s.
Conclusion: we have initialized the particle filtering algorithm using a weak prior on range. Next: performance analysis in the modified polar coordinate system.

16 5. Posterior Cramér-Rao Bound (1/9)
Next step: the performance analysis in the modified polar coordinate system.
Tool: the Posterior Cramér-Rao Bound (PCRB). The PCRB gives a lower bound on the mean square error.

17 5. Posterior Cramér-Rao Bound (2/9)
Notations: Y_{0:t} = {Y_0, ..., Y_t} and Z_{0:t} = {Z_0, ..., Z_t}.
The bias, where g(Z_{0:t}) is the estimator of Y_{0:t}:
B(Y_{0:t}) = E( g(Z_{0:t}) - Y_{0:t} | Y_{0:t} ).
The asymptotic bias assumption:
lim_{Y_i(j) → Y_i^+} B(Y_{0:t}) p(Y_{0:t}) = lim_{Y_i(j) → Y_i^-} B(Y_{0:t}) p(Y_{0:t}),   for all i ∈ {1, ..., n_y} and j ∈ {0, ..., t},
where Y_i is the state space of Y_i(j) for all j in {0, ..., t}, and Y_i^- and Y_i^+ are the endpoints of Y_i.

18 5. Posterior Cramér-Rao Bound (3/9)
Proposition 1. Under the asymptotic bias assumption,
ECM_{0:t} ≥ J_{0:t}^{-1},
where
ECM_{0:t} = E{ (Y_{0:t} - g(Z_{0:t})) (Y_{0:t} - g(Z_{0:t}))^T },
J_{0:t} = E{ -Δ_{Y_{0:t}}^{Y_{0:t}} ln p(Z_{0:t}, Y_{0:t}) },
and g(Z_{0:t}) is the estimator of Y_{0:t}.
In the filtering context, we only want to approximate the lower-right block of J_{0:t}, denoted J_t.

19 5. Posterior Cramér-Rao Bound (4/9)
Using the results of Tichavsky et al., we have
J_{t+1} = D_t^{22} - D_t^{21} (J_t + D_t^{11})^{-1} D_t^{12},
where
D_t^{11} = E{ ∇_{Y_t} ln p(Y_{t+1} | Y_t) ∇_{Y_t}^T ln p(Y_{t+1} | Y_t) },
D_t^{12} = E{ ∇_{Y_t} ln p(Y_{t+1} | Y_t) ∇_{Y_{t+1}}^T ln p(Y_{t+1} | Y_t) },
D_t^{21} = E{ ∇_{Y_{t+1}} ln p(Y_{t+1} | Y_t) ∇_{Y_t}^T ln p(Y_{t+1} | Y_t) } = (D_t^{12})^T,
D_t^{22} = E{ ∇_{Y_{t+1}} ln p(Y_{t+1} | Y_t) ∇_{Y_{t+1}}^T ln p(Y_{t+1} | Y_t) } + E{ ∇_{Y_{t+1}} ln p(Z_{t+1} | Y_{t+1}) ∇_{Y_{t+1}}^T ln p(Z_{t+1} | Y_{t+1}) }.
Conclusion: J_t can be computed recursively.
Remark: D_t^{11}, D_t^{12}, D_t^{21}, D_t^{22} are approximated using a Monte Carlo method, and J(t_0) is obtained from a Cramér-Rao bound.
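
A sketch of the recursion and of the Monte Carlo approximation of the D blocks. The interfaces (grad_log_trans, grad_log_lik, the sampled triples) are hypothetical placeholders; only the algebra of the Tichavsky et al. recursion is taken from the slide.

```python
import numpy as np

def pcrb_step(J_t, D11, D12, D21, D22):
    """One step of the recursion J_{t+1} = D22 - D21 (J_t + D11)^{-1} D12."""
    return D22 - D21 @ np.linalg.inv(J_t + D11) @ D12

def monte_carlo_D(samples, grad_log_trans, grad_log_lik):
    """Approximate D11, D12, D21, D22 by Monte Carlo.
    samples: list of triples (y_t, y_t1, z_t1) drawn from the state and
    measurement equations.
    grad_log_trans(y_t, y_t1) -> (g_t, g_t1): gradients of ln p(y_{t+1}|y_t)
    with respect to y_t and y_{t+1}.
    grad_log_lik(y_t1, z_t1) -> gradient of ln p(z_{t+1}|y_{t+1})."""
    d = len(samples[0][0])
    n = len(samples)
    D11 = np.zeros((d, d)); D12 = np.zeros((d, d)); D22 = np.zeros((d, d))
    for y_t, y_t1, z_t1 in samples:
        g_t, g_t1 = grad_log_trans(y_t, y_t1)
        g_z = grad_log_lik(y_t1, z_t1)
        D11 += np.outer(g_t, g_t) / n
        D12 += np.outer(g_t, g_t1) / n
        D22 += (np.outer(g_t1, g_t1) + np.outer(g_z, g_z)) / n
    return D11, D12, D12.T, D22   # D21 = D12^T
```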

20 5. Posterior Cramér-Rao Bound (5/9)
Simulation results.
Figure: the four components Y_1(t), Y_2(t), Y_3(t), Y_4(t); mean square error in blue, PCRB in red.
Problem: the PCRB seems over-optimistic...
Hypotheses: Is the asymptotic bias assumption true? Is J_{0:t} ill-conditioned due to range unobservability?

21 5. Posterior Cramér-Rao Bound (6/9)
Is the asymptotic bias assumption true? A more general bound:
ECM_{0:t} ≥ C_{0:t} J_{0:t}^{-1} C_{0:t}^T,
where
C_{0:t} = E{ (g(Z_{0:t}) - Y_{0:t}) ∇_{Y_{0:t}}^T ln p(Z_{0:t}, Y_{0:t}) }.
Remarks: C_{0:t} can be approximated by Monte Carlo; the recursive formulation of the PCRB is no longer valid.
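
A sketch of the Monte Carlo approximation of C_{0:t} mentioned in the remarks. The estimator g (the tracker under study) and the gradient of the joint log-density are assumed to be supplied by the user; the interface is hypothetical.

```python
import numpy as np

def monte_carlo_C(simulated_pairs, estimator, grad_log_joint):
    """Monte Carlo approximation of
    C_{0:t} = E{ (g(Z_{0:t}) - Y_{0:t}) grad_{Y_{0:t}}^T ln p(Z_{0:t}, Y_{0:t}) }.
    simulated_pairs: list of (Y, Z) where Y is a flattened trajectory Y_{0:t}
    and Z the corresponding measurement sequence, drawn from the model.
    estimator(Z) -> flattened estimate of Y_{0:t};
    grad_log_joint(Y, Z) -> gradient of ln p(Z_{0:t}, Y_{0:t}) w.r.t. Y_{0:t}."""
    d = simulated_pairs[0][0].size
    M = len(simulated_pairs)
    C = np.zeros((d, d))
    for Y, Z in simulated_pairs:
        err = estimator(Z) - Y
        grad = grad_log_joint(Y, Z)
        C += np.outer(err, grad) / M
    return C
```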

22 5. Posterior Cramér-Rao Bound (7/9)
Is the asymptotic bias assumption true? Simulation results.
Figure: the four components Y_1(t), Y_2(t), Y_3(t), Y_4(t); mean square error in blue, classical PCRB in red, new PCRB in green.
Conclusion: the asymptotic bias assumption is not true in the bearings-only tracking problem.

23 5. Posterior Cramér-Rao Bound (8/9)
Problem: the posterior Cramér-Rao bound seems over-optimistic...
Hypotheses: Is the asymptotic bias assumption true? No. Is J_{0:t} ill-conditioned due to range unobservability?

24 5. Posterior Cramér-Rao Bound (9/9)
Is J_{0:t} ill-conditioned due to range unobservability?
Idea: we only construct a bound for the observable components of the state, Y^r_t. We can prove that
ECM^r_{0:t} ≥ E{ C_{0:t}(Y_4(t_0)) J_{0:t}^{-1}(Y_4(t_0)) C_{0:t}^T(Y_4(t_0)) },
where
J_{0:t}(Y_4(t_0)) = E{ -Δ_{Y^r_{0:t}}^{Y^r_{0:t}} ln p_{Y_4(t_0)}(Z_{0:t}, Y^r_{0:t}) | Y_4(t_0) },
C_{0:t}(Y_4(t_0)) = E{ (g^r(Z_{0:t}) - Y^r_{0:t}) ∇_{Y^r_{0:t}}^T ln p_{Y_4(t_0)}(Z_{0:t}, Y^r_{0:t}) | Y_4(t_0) }.
It is possible to approximate J_{0:t}(Y_4(t_0)) and C_{0:t}(Y_4(t_0)).
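
A sketch of how the outer expectation over Y_4(t_0) can be evaluated by Monte Carlo under the range prior of slide 9; bound_given_y4 is a hypothetical callable returning the conditional matrix C_{0:t}(y_4) J_{0:t}^{-1}(y_4) C_{0:t}^T(y_4).

```python
import numpy as np

def observable_components_bound(bound_given_y4, r_min, r_max, n_draws, rng):
    """Monte Carlo evaluation of E{ C(Y4(t0)) J^{-1}(Y4(t0)) C^T(Y4(t0)) } by
    averaging the conditional bound over the prior 1/Y4(t0) ~ U[r_min, r_max]."""
    total = None
    for _ in range(n_draws):
        y4 = 1.0 / rng.uniform(r_min, r_max)   # draw range, convert to Y4 = 1/r
        B = bound_given_y4(y4)                 # conditional bound matrix
        total = B if total is None else total + B
    return total / n_draws
```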

25 6. Conclusion
We have proposed:
- an initialization method for the particle filtering algorithm in modified polar coordinates, using a weak prior on range,
- a new interpretation of the Bearings-Only Tracking problem,
- a realistic posterior Cramér-Rao bound in the modified polar coordinate system.
Perspectives: 3D target tracking; maneuvering targets.
