Inference for multi-object dynamical systems: methods and analysis


Inference for multi-object dynamical systems: methods and analysis
Jérémie Houssineau, National University of Singapore
September 13, 2018
Joint work with: Daniel Clark, Ajay Jasra, Sumeetpal Singh, Emmanuel Delande, Isabel Schlangen, Pierre Del Moral, Adrian Bishop and others.

Outline
1. Overview
2. Modelling and assumptions: modelling; analysis
3. Point-process formulation: modelling; recursion; analysis; alternatives
4. Inference with outer measures: representing uncertainty; complex systems

Part 1: Overview

Applications: (a) microscopy, (b) surveillance, (c) space debris, (d) microfluidics.

Multi-object dynamical system
The number of objects changes in time (birth/death process).
Observation process:
- observation of a given object might fail (false negative)
- when successful, it is prone to errors
- some observations originate from background noise (false positives)
- data association is unknown a priori

Example: Space Situational Awareness
http://astria.tacc.utexas.edu/astriagraph/ (video)

Example: Finite-resolution sensor (from H., Clark, and Del Moral 2015)
[Figure: target trajectories and observations for two sensor resolutions, with cells of (5 m, 1 deg) and (20 m, 4 deg).] (video)

Example: Classification (from Pailhas, H., Petillot, and Clark 2016)
[Figure: sonar views of the harbour, x and y in metres.]
Harbour surveillance: threat detection from motion-based classification.

Example: Estimation of parameters (from H., Clark, Ivekovic, Lee, and Franco 2016)
Camera calibration: joint estimation of camera pose and paper plane trajectories. (video)

Part 2: Modelling and assumptions

Single-object modelling
Each object is characterised by an HMM parametrised by θ ∈ Θ with:
- a Markov kernel f_θ on the state space S ⊆ R^d
- an initial distribution µ
- a likelihood g_θ from S to the observation space O ⊆ R^d
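
To make this concrete, here is a minimal simulation sketch of such a single-object HMM, assuming a linear-Gaussian choice of f_θ and g_θ (the near-constant-velocity model and all parameter values below are illustrative, not taken from the talk):

```python
import numpy as np

def simulate_hmm(T, F, Q, H, R, mu0, P0, rng=None):
    """Simulate x_n = F x_{n-1} + v_n, y_n = H x_n + w_n (linear-Gaussian HMM)."""
    rng = np.random.default_rng() if rng is None else rng
    d, p = F.shape[0], H.shape[0]
    xs, ys = np.empty((T, d)), np.empty((T, p))
    x = rng.multivariate_normal(mu0, P0)                        # initial distribution mu
    for n in range(T):
        x = F @ x + rng.multivariate_normal(np.zeros(d), Q)     # Markov kernel f_theta
        xs[n] = x
        ys[n] = H @ x + rng.multivariate_normal(np.zeros(p), R) # likelihood g_theta
    return xs, ys

# Illustrative near-constant-velocity model: state (position, velocity), noisy position observed.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])
Q = 0.1 * np.eye(2)
H = np.array([[1.0, 0.0]])
R = np.array([[1.0]])
xs, ys = simulate_hmm(50, F, Q, H, R, mu0=np.zeros(2), P0=np.eye(2))
```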

Assumptions
- No interactions between objects (dynamics and observation)
- False positives are independent of all objects
- The birth/death process is independent of all objects

Multi-object modelling
Let's assume that the number of objects is known and fixed to K. [Illustration: simple scenario.]

Naive solution
First idea: solve the data association by matching each object to the closest observation.
→ Leads to track coalescence. [Illustration: simple scenario.]

Better solutions
Second idea: find the best global association. → Suboptimal over time.
Third idea: Multiple Hypothesis Tracking (Blackman 1986). → Potentially costly.

Multi-object modelling
- False positives: i.i.d. Poisson with rate λ and distribution p_ϑ
- Probability of detection p_D ∈ (0, 1]
- Multi-object parameter θ̄ = [θ, K, p_D, λ, ϑ]^t
- Multi-object observation function, for y ∈ O^m, m ≥ 0:
  g_θ̄(y | x) = Σ_{d ∈ {0,1}^K, |d| ≤ m} [ Po_λ(m − |d|) Σ_{σ ∈ Sym(m)} ( ∏_{i=|d|+1}^{m} p_ϑ(y_{σ(i)}) ) ( ∏_{i=1}^{|d|} g_θ(y_{σ(i)} | x_{r(i)}) ) u_m(σ) ] q_θ̄(d)
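
For small K and m, the sum over detection profiles d and associations σ can be evaluated by brute force. The sketch below follows the structure of the formula above, but the single-object likelihood, clutter density, and all parameter values are illustrative, and the uniform weight u_m(σ) over Sym(m) is accounted for by a factorial factor over injective associations (the exact combinatorial convention may differ from the talk's):

```python
import itertools, math
from scipy.stats import norm, poisson

def multiobject_likelihood(y, x, p_D, lam, clutter_pdf, g):
    """Brute-force multi-object observation function: sum over detection
    profiles d and injective associations of detected objects to
    observations; the remaining observations are clutter."""
    m, K = len(y), len(x)
    total = 0.0
    for d in itertools.product([0, 1], repeat=K):
        k = sum(d)
        if k > m:
            continue
        detected = [i for i in range(K) if d[i]]
        q_d = math.prod(p_D if di else 1.0 - p_D for di in d)    # q(d)
        assoc = 0.0
        for obs in itertools.permutations(range(m), k):          # injective sigma
            lik = math.prod(g(y[j], x[i]) for i, j in zip(detected, obs))
            clutter = math.prod(clutter_pdf(y[j])
                                for j in set(range(m)) - set(obs))
            assoc += lik * clutter
        # u_m uniform over Sym(m): each injective map stands for (m-k)! orderings
        u = math.factorial(m - k) / math.factorial(m)
        total += poisson.pmf(m - k, lam) * q_d * assoc * u
    return total

g = lambda yj, xi: norm.pdf(yj, loc=xi, scale=1.0)   # single-object likelihood
clutter = lambda yj: 1.0 / 100.0                     # uniform clutter on a window
print(multiobject_likelihood([0.3, 12.0, -7.0], [0.0, 10.0],
                             p_D=0.9, lam=1.0, clutter_pdf=clutter, g=g))
```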

Analysis
Multi-object Fisher information matrix (assumed positive definite):
I(θ*) = lim_n (1/n) Ē_{θ*}[ ∇_θ log p_{θ*}(Y_{1:n}) ∇_θ log p_{θ*}(Y_{1:n})^t ]
Theorem (H., Singh, and Jasra 2017): Under assumptions of boundedness for the single-object transition and observation functions, and under identifiability of θ*, it holds that
lim_n θ̂_{n,x_0} = θ*
for any x_0 ∈ S^K with K ∈ N. Under additional assumptions, it holds that
√n (θ̂_{n,x_0} − θ*) → N(0, I(θ*)^{−1})
for any x_0 ∈ S^K and any K ∈ N.

Analysis. Example: single static object with false alarms.
[Figure: relative information loss as a function of the Poisson parameter λ, plotted against log(λ+1), for a Gaussian false-alarm distribution and uniform ones U([−5,5]), U([−10,10]), U([−25,25]), U([−50,50]), U([−100,100]); reference curve E[N/(N+1)].]

Analysis. Example: 5 static objects with λ = 0 and p_D = 1.
[Figure: information loss for varying association uncertainty α and spatial separation τ.]

Analysis. Example: single object with λ = 0.
[Figure: relative information loss for a varying probability of detection p_D; the experimental curve closely follows 1 − p_D.]

More generally
- Objects persist between time steps with probability p_S ∈ (0, 1]
- The number of births per time step has a known distribution
Use MCMC to explore:
- the set of data associations, assuming a linear-Gaussian single-object model (Oh, Russell, and Sastry 2009)
- the data associations and the states in general (Jiang, Singh, and Yıldırım 2015)
In both cases, parameters can also be estimated.

Part 3: Point-process formulation

Multi-object modelling with point processes
Pros and cons:
+ allows for modelling uncertainty in the number of objects and in the birth/death process
+ can reuse existing results from the point-process literature
− prevents distinguishing between objects

Multi-object modelling
Assuming that:
- all model parameters are known
- object states and observations at time n are represented by the (simple) point processes X_n = Σ_{i=1}^K δ_{X^i} and Y_n = Σ_{i=1}^N δ_{Y^i}
- object births follow a point process X_b, independent of X_n
First idea: consider the first-moment density γ_n of X_n.
Useful results:
(1) If X′ results from applying the dynamics modelled by f_θ to the points of a point process X, then γ_{X′}(x) = ∫ f_θ(x | x′) γ_X(x′) dx′ for any x ∈ S.
(2) If X and X′ are independent point processes, then γ_{X+X′} = γ_X + γ_{X′}.

First-moment recursion: prediction
Denote by γ_{n−1}(· | Y_{1:n−1}) the posterior first-moment density at time n−1.
Theorem (Mahler 2003): The predicted first-moment density γ_n(· | Y_{1:n−1}) is characterised by
γ_n(x | Y_{1:n−1}) = γ_b(x) + p_S ∫ f_θ(x | x′) γ_{n−1}(x′ | Y_{1:n−1}) dx′
for any x ∈ S.
Sketch of proof:
- introduce ψ as a cemetery state and extend the state space to X̄ = S ∪ {ψ}
- extend f_θ to X̄ as F_θ(ψ | x) = 1 − p_S and F_θ(x | x′) = p_S f_θ(x | x′)
- apply (1) to F_θ and X_{n−1}, and (2) to the resulting point process and X_b

First-moment recursion: update
Theorem (Mahler 2003): Assuming that the distribution of X_n given Y_{1:n−1} is Poisson i.i.d., the posterior first-moment density γ_n(· | Y_{1:n}) is characterised by
γ_n(x | Y_{1:n}) = (1 − p_D) γ_n(x | Y_{1:n−1}) + ∫ [ p_D g_θ(y | x) γ_n(x | Y_{1:n−1}) / ( λ p_ϑ(y) + p_D ∫ g_θ(y | x′) γ_n(x′ | Y_{1:n−1}) dx′ ) ] Y_n(dy)
for any x ∈ S.
Sketch of proof (based on Caron, Del Moral, Doucet, and Pace 2011):
- introduce φ as an empty observation and Ȳ = O ∪ {φ}
- extend X_n to X̄_n on X̄ by adding false-positive generators on ψ
- extend g_θ to a likelihood from X̄ to Ȳ as G_θ(φ | x) = 1 − p_D and G_θ(y | x) = p_D g_θ(y | x), and such that G_θ(· | ψ) = p_ϑ on O

First-moment recursion: update (continued)
1. Denoting γ̄ the first-moment measure of X̄_n given Y_{1:n−1}, it holds that
   E(F(X̄_n) | Y_{1:n−1}, Ȳ_n) = ∫ F( Σ_{i=1}^N δ_{x_i} + Σ_{j=1}^{N_φ} δ_{x′_j} ) [ ∏_{i=1}^N Ψ_{G_θ(Y_i | ·)}(γ̄)(dx_i) ] [ ∏_{j=1}^{N_φ} Ψ_{G_θ(φ | ·)}(γ̄)(dx′_j) ]
2. Notice that the extension Ȳ_n = Y_n + N_φ δ_φ of Y_n to Ȳ verifies
   E(F(Ȳ_n) | Y_n) = exp(−(1 − p_D)Γ_n) Σ_{k≥0} ((1 − p_D)Γ_n)^k / k! F(Y_n + k δ_φ), with Γ_n = ∫ γ_n(x | Y_{1:n−1}) dx
3. Conclude by the law of total expectation and γ_n(f | Y_{1:n}) = E(F(X_n) | Y_{1:n}) with F(X_n) = X_n(f).
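
As an illustration of the update theorem, here is a sketch of the update applied to a weighted-particle approximation γ_n(· | Y_{1:n−1}) ≈ Σ_k w_k δ_{x_k} (scalar states, a Gaussian likelihood, and all parameter values are illustrative):

```python
import numpy as np

def phd_update(x, w, ys, p_D, lam, clutter_pdf, g):
    """PHD update: w_k <- (1 - p_D) w_k
    + sum_y p_D g(y|x_k) w_k / (lam p(y) + p_D sum_k' g(y|x_k') w_k')."""
    w_new = (1.0 - p_D) * w
    for y in ys:
        gy = g(y, x)                                   # g(y | x_k) for all particles
        denom = lam * clutter_pdf(y) + p_D * np.sum(gy * w)
        w_new = w_new + p_D * gy * w / denom
    return w_new

rng = np.random.default_rng(0)
x = rng.normal(0.0, 5.0, size=1000)                    # particle locations
w = np.full(1000, 3.0 / 1000)                          # predicted mass of ~3 objects
w = phd_update(x, w, ys=[0.2, 4.1, -9.3], p_D=0.9, lam=2.0,
               clutter_pdf=lambda y: 1.0 / 40.0,
               g=lambda y, x: np.exp(-0.5 * (y - x) ** 2) / np.sqrt(2 * np.pi))
print("posterior expected number of objects:", w.sum())
```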

First-moment recursion: implementations
SMC (Vo, Singh, and Doucet 2005):
- track extraction requires clustering in general (e.g. K-means)
- clustering can be based on the tracks' observation history (Pace and Del Moral 2013; Del Moral and H. 2015)
Gaussian mixture (Vo and Ma 2006):
- requires pruning and merging (a sketch of this step is given below)
- track extraction relies on merging in its simplest form
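
A minimal sketch of such a pruning and merging step, assuming moment-matched merging of nearby components (the thresholds and the Mahalanobis merging criterion are common choices, not necessarily those of the cited papers):

```python
import numpy as np

def prune_and_merge(ws, ms, Ps, trunc_thresh=1e-5, merge_thresh=4.0):
    """Discard low-weight Gaussian components, then greedily merge components
    close to the strongest one (Mahalanobis test), preserving total weight."""
    keep = [i for i, w in enumerate(ws) if w > trunc_thresh]
    out = []
    while keep:
        j = max(keep, key=lambda i: ws[i])             # strongest remaining component
        close = [i for i in keep
                 if (ms[i] - ms[j]) @ np.linalg.solve(Ps[j], ms[i] - ms[j])
                 <= merge_thresh]
        w = sum(ws[i] for i in close)
        m = sum(ws[i] * ms[i] for i in close) / w                  # matched mean
        P = sum(ws[i] * (Ps[i] + np.outer(ms[i] - m, ms[i] - m))
                for i in close) / w                                # matched covariance
        out.append((w, m, P))
        keep = [i for i in keep if i not in close]
    return out

ws = [0.6, 0.5, 1e-7]
ms = [np.array([0.0, 0.0]), np.array([0.1, 0.0]), np.array([9.0, 9.0])]
Ps = [np.eye(2)] * 3
print(len(prune_and_merge(ws, ms, Ps)))   # the two close components merge -> 1
```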

Example: simple scenario. [Animation: filter output on the simple scenario.]

First-moment recursion: analysis (Del Moral 2013)
Write the recursion as (m_{n+1|n}, η_{n+1|n}) = Λ_n(m_{n|n−1}, η_{n|n−1}), and denote by Λ_n^{(1)} and Λ_n^{(2)} the first and second components of Λ_n.
Introduce Φ^{(1)}_{n,η_n}(m) = Λ^{(1)}_n(m, η_n) and Φ^{(2)}_{n,m_n}(µ) = Λ^{(2)}_n(m_n, µ), as well as the semigroups
Φ^{(1)}_{n,n′,η} = Φ^{(1)}_{n,η_n} ∘ ... ∘ Φ^{(1)}_{n′,η_{n′}} and Φ^{(2)}_{n,n′,m} = Φ^{(2)}_{n,m_n} ∘ ... ∘ Φ^{(2)}_{n′,m_{n′}}

First-moment recursion: analysis (Del Moral 2013)
Assumptions:
(L) The following Lipschitz inequalities hold:
|Φ^{(1)}_{n,n′,η}(m) − Φ^{(1)}_{n,n′,η}(m′)| ≤ c^{(1)}_{n,n′} |m − m′|
|[Φ^{(2)}_{n,n′,m}(µ) − Φ^{(2)}_{n,n′,m}(µ′)](f)| ≤ c^{(2)}_{n,n′} ∫ |[µ − µ′](ϕ)| Q_{n,n′,µ}(f, dϕ)
with c^{(i)}_{n,n′} ≤ a_i e^{−b_i(n−n′)} for constants a_i and b_i > 0 verifying b_1 ≥ b_2.
(C) The following continuity inequalities hold:
|Φ^{(1)}_{n,µ}(m) − Φ^{(1)}_{n,µ′}(m)| ≤ c^{(1)}_n ∫ |[µ − µ′](ϕ)| P_{n,µ}(dϕ)
|[Φ^{(2)}_{n,m}(µ) − Φ^{(2)}_{n,m′}(µ)](f)| ≤ c^{(2)}_n |m − m′|
with c_i = sup_n c^{(i)}_n < ∞ and a_1 a_2 c_1 c_2 sufficiently small (a condition expressed in terms of e^{−(b_1 − b_2)}).

First-moment recursion
Theorem (from Del Moral 2013, Thm. 13.3.3): Under (L) and (C), the following Lipschitz inequalities hold:
|Λ^{(1)}_{n,n′}(m, µ) − Λ^{(1)}_{n,n′}(m′, µ′)| ≤ e^{−b(n−n′)} ( a_{1,1} |m − m′| + a_{1,2} ∫ |[µ − µ′](ϕ)| P_{n,n′,m,µ}(dϕ) )
and
|[Λ^{(2)}_{n,n′}(m, µ) − Λ^{(2)}_{n,n′}(m′, µ′)](f)| ≤ e^{−b(n−n′)} ( a_{2,1} |m − m′| + a_{2,2} ∫ |[µ − µ′](ϕ)| Q_{n,n′,m,µ}(f, dϕ) )

First-moment recursion
Assumption: for any y ∈ Y, it holds that
l^(−)(y) := inf_{x∈S} g_θ(y | x) > 0 and l^(+)(y) := sup_{x∈S} g_θ(y | x) < ∞.
Theorem (from Del Moral 2013, Thm. 13.4.1): If sup_n Y_n(f) is finite for f equal to l^(+)/l^(−) and l^(+)/(l^(−))², then there exist constants 0 < r_D ≤ 1, r_b < ∞ and r > 0 such that Φ^{(1)}_{n,n′,η} and Φ^{(2)}_{n,n′,m} satisfy conditions (L) and (C) whenever p_D ≥ r_D, λ_b ≥ r_b, and λ ≤ r.
⇒ The first-moment recursion is exponentially stable when
1. the probability of detection is sufficiently high,
2. the expected number of appearing objects is large enough,
3. the number of spurious observations is limited.

First-moment recursion: conclusions and alternatives
Shortcomings of the first-moment recursion:
- short memory
- objects are indistinguishable
- track extraction can be difficult
Some related techniques:
- use i.i.d. point processes instead (Mahler 2007)
  + confidence in the number of objects can be greatly improved
  − introduces long-range interactions that can be counter-intuitive
- use marked point processes (Vo, Vo, and Phung 2014)
  + allows for distinguishing objects
  − object birth is less natural to represent
- develop a representation of partial distinguishability (H. 2015)

Using partial distinguishability (H. and Clark 2018)
[Figure: a realisation of the target trajectories; sensor position and five targets at t = 0, X and Y in metres.]
[Figure: OSPA distance over time for the HISP, PHD, CPHD and LMB filters, with p_D = 0.995 and λ = 83.]
[Figure: the same comparison with p_D = 0.8 and λ = 167.]
[Figure: the same comparison with p_D = 0.5 and λ = 15.]

Higher-order moments
Since the posterior point process is not Poisson i.i.d. in general, one can compute the variance after update.
Theorem (Delande, Uney, H., and Clark 2014): The regional variance in B ⊆ S of X_n given Y_{1:n} is characterised by
var_{X_n|Y_{1:n}}(B) = (1 − p_D) ∫_B γ_n(x | Y_{1:n−1}) dx + ∫ R_y(B) (1 − R_y(B)) Y_n(dy)
with
R_y(B) = ∫_B p_D g_θ(y | x) γ_n(x | Y_{1:n−1}) dx / ( λ p_ϑ(y) + p_D ∫ g_θ(y | x′) γ_n(x′ | Y_{1:n−1}) dx′ )
Consequence: if the origin of observations is unambiguous → low variance.
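
The theorem can be evaluated directly on a weighted-particle approximation of the predicted first moment; a sketch with an interval region B and an illustrative Gaussian likelihood:

```python
import numpy as np

def regional_variance(x, w, ys, B, p_D, lam, clutter_pdf, g):
    """var(B) = (1 - p_D) * predicted mass in B + sum_y R_y(B)(1 - R_y(B))."""
    inB = (x >= B[0]) & (x <= B[1])
    var = (1.0 - p_D) * np.sum(w[inB])
    for y in ys:
        gy = g(y, x)
        denom = lam * clutter_pdf(y) + p_D * np.sum(gy * w)
        R = p_D * np.sum(gy[inB] * w[inB]) / denom      # R_y(B)
        var += R * (1.0 - R)
    return var

rng = np.random.default_rng(1)
x = rng.normal(0.0, 5.0, size=2000)
w = np.full(2000, 2.0 / 2000)
v = regional_variance(x, w, ys=[0.1, 6.0], B=(-1.0, 1.0), p_D=0.9, lam=1.0,
                      clutter_pdf=lambda y: 1.0 / 40.0,
                      g=lambda y, x: np.exp(-0.5 * (y - x) ** 2) / np.sqrt(2 * np.pi))
print("regional variance on B:", v)
```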

Higher-order moments (Schlangen, Delande, H., and Clark 2018)
Other parametrisations of the cardinality are also possible: we can consider instead that the number of points K in X_n is Panjer distributed,
p_K(n) = C(α + n − 1, n) (1 + β^{−1})^{−α} (β + 1)^{−n}
- Finite and positive α and β → negative binomial.
- Finite and negative α and β → binomial.
- In the limit α, β → ∞ with λ = α/β → Poisson.
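
A quick numerical check of the negative-binomial case and of the Poisson limit, using the parametrisation written above (scipy's nbinom convention matches it with success probability β/(β+1); the parameter values are illustrative):

```python
import numpy as np
from scipy.stats import nbinom, poisson

# Panjer pmf in its negative-binomial form, with mean alpha/beta:
# p_K(n) = C(alpha+n-1, n) (1 + 1/beta)^(-alpha) (beta+1)^(-n)
alpha, beta = 400.0, 100.0                      # mean alpha/beta = 4, variance > mean
n = np.arange(15)
pK = nbinom.pmf(n, alpha, beta / (beta + 1.0))

# Limit alpha, beta -> infinity with lambda = alpha/beta fixed: Poisson(lambda).
print(np.abs(pK - poisson.pmf(n, 4.0)).max())   # already small for alpha = 400
```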

Higher-order moments (Schlangen, Delande, H., and Clark 2018)
[Figure: tracking scenario with region A on the left and region B on the right; correlation corr(A, B) between the estimated number of targets in regions A and B over time, for Poisson, Panjer and general cardinality models.]

Fundamental limitations
- The distribution of false positives can vary dramatically in time
- There is often no prior information on the location of objects
- The observation process is difficult to describe in a standard way
  [Figure: radar cross section of an A-26 Invader (Wikipedia).]
What about the uncertainty quantification?

Part 4: Inference with outer measures

Outer probability measure
Assuming:
- a r.v. X with conditional probability distribution p(· | θ)
- no knowledge about θ ∈ Θ
Then
P(X ∈ B) ≤ sup_{θ∈Θ} p(B | θ) f(θ) = P̄(B)
with f : Θ → [0, 1] such that sup_θ f(θ) = 1.
Remarks:
- does not require a reference measure
- standard operations apply directly: if Θ = Θ_1 × Θ_2,
  f_2(θ_2) = sup_{θ_1∈Θ_1} f(θ_1, θ_2) and f_{1|2}(θ_1 | θ_2) = f(θ_1, θ_2) / f_2(θ_2)
Reference: J. H. "Parameter estimation with a class of outer probability measures". arXiv:1801.00569 (2018).
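
On a discretised parameter space, the sup-based operations above are one-liners. The sketch below uses an illustrative joint possibility function and an illustrative conditional model X | θ ~ N(θ_1, 1):

```python
import numpy as np
from scipy.stats import norm

# Joint possibility function f(theta_1, theta_2) on a grid (illustrative choice).
t1 = np.linspace(-5.0, 5.0, 201)
t2 = np.linspace(-5.0, 5.0, 201)
T1, T2 = np.meshgrid(t1, t2, indexing="ij")
f = np.exp(-0.5 * (T1 - T2) ** 2 - 0.125 * T2 ** 2)
f /= f.max()                       # enforce sup_theta f(theta) = 1

f2 = f.max(axis=0)                 # f_2(theta_2) = sup_{theta_1} f(theta_1, theta_2)
f1_given_2 = f / f2[None, :]       # f_{1|2}(theta_1 | theta_2)

# Outer-measure bound: P(X in B) <= sup_theta p(B | theta) f(theta), B = [0, 1].
pB = norm.cdf(1.0 - T1) - norm.cdf(-T1)   # p(B | theta) for X | theta ~ N(theta_1, 1)
print("upper bound on P(X in [0, 1]):", (pB * f).max())
```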

Possibility functions
  Possibility function    Parameter(s)        Function of x ∈ R
  Uniform Ū([a, b])       a, b ∈ R, a < b     1_{[a,b]}(x)
  Gaussian N̄(µ, σ²)       µ ∈ R, σ² > 0       exp(−(x − µ)²/(2σ²))
  Student's t             ν > 0               (1 + x²/ν)^{−(ν+1)/2}
  Cauchy                  x_0 ∈ R, γ > 0      γ²/((x − x_0)² + γ²)
Pros & cons:
+ can be easily truncated, discretised
+ easy to introduce new possibility functions
− less obvious for distributions on N
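
The entries of the table translate directly into code; a small sketch checking on a grid that each has supremum one, and that truncation followed by renormalisation stays within the class:

```python
import numpy as np

uniform   = lambda x, a, b: ((a <= x) & (x <= b)).astype(float)
gaussian  = lambda x, mu, s2: np.exp(-0.5 * (x - mu) ** 2 / s2)
student_t = lambda x, nu: (1.0 + x ** 2 / nu) ** (-(nu + 1.0) / 2.0)
cauchy    = lambda x, x0, g: g ** 2 / ((x - x0) ** 2 + g ** 2)

x = np.linspace(-10.0, 10.0, 10001)
for name, fx in [("uniform", uniform(x, -1.0, 2.0)),
                 ("gaussian", gaussian(x, 0.0, 1.0)),
                 ("student", student_t(x, 3.0)),
                 ("cauchy", cauchy(x, 0.0, 1.0))]:
    print(name, fx.max())          # grid approximation of sup f = 1

# Truncation: multiply by an indicator and renormalise so that sup f = 1 again.
trunc = gaussian(x, 0.0, 1.0) * uniform(x, 0.5, 10.0)
trunc /= trunc.max()
```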

Uncertain variable
Ingredients:
- a sample space Ω_u for deterministic but uncertain phenomena
- a probability space (Ω_r, F, P(· | ω_u)) for any ω_u ∈ Ω_u
- a state space X and a parameter space Θ
An uncertain variable is a mapping
X : Ω_u × Ω_r → Θ × X, (ω_u, ω_r) ↦ (X_u(ω_u), X_r(ω_r))
such that
1. X_r : Ω_r → X is a random variable
2. P(X_r^{−1}(B) | ·) is constant over X_u^{−1}[θ] for any B ⊆ X and θ ∈ Θ
→ 1. implies that θ is sufficiently informative about X_r
→ 2. allows for deducing the conditional distribution p(· | θ)

Assumption & basic concepts
Assumption: henceforth, p(· | θ) = δ_θ and Θ = X.
Concept: the (deterministic) uncertain variables X and Y are (weakly) independent if
f_{X,Y}(x, y) = f_X(x) f_Y(y)
Even if X and Y are not independent, f_X × 1 and 1 × f_Y are valid descriptions of (X, Y), with
f_X(x) = sup_y f_{X,Y}(x, y) and f_Y(y) = sup_x f_{X,Y}(x, y)
→ information loss

Expectation
By identification, for a non-negative function ϕ:
Ē(ϕ(X)) = sup_x ϕ(x) f(x)
Example: define the self-information as I(x) = −log f(x); then the entropy H̄(X) = Ē(I(X)) = sup_x (−f(x) log f(x)) remains meaningful on uncountable spaces.
Intuitively,
E*(X) = argmax_x f(x)
Example: maximum-likelihood estimation with i.i.d. samples y_1, ..., y_n ~ p(· | x):
E*(X | y_{1:n}) = argmax_x f(x) ∏_{i=1}^n p(y_i | x)
→ can justify profile likelihood. A grid-based sketch of this estimate is given below.
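
A grid-based sketch of the estimate E*(X | y_{1:n}), assuming a Gaussian data model and an illustrative possibility function on x:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
y = rng.normal(2.0, 1.0, size=20)             # i.i.d. samples from p(. | x*), x* = 2

xs = np.linspace(-10.0, 10.0, 4001)
f = np.exp(-0.5 * xs ** 2 / 25.0)             # possibility function on x (illustrative)
loglik = norm.logpdf(y[:, None], loc=xs[None, :], scale=1.0).sum(axis=0)
x_hat = xs[np.argmax(np.log(f) + loglik)]     # argmax_x f(x) * prod_i p(y_i | x)
print("estimate:", x_hat)                     # close to 2, shrunk slightly towards 0
```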

Law of large numbers
Let X_1, X_2, ... be a collection of weakly independent uncertain variables on R^d with possibility function f; then S_n = n^{−1} Σ_{i=1}^n X_i is described by
f_{S_n}(y) = sup{ ∏_{i=1}^n f(x_i) : (x_1 + ... + x_n)/n = y }
Proposition: If f(x) → 0 when ‖x‖ → ∞ and argmax_x f(x) = µ, then f_{S_n} verifies
lim_n f_{S_n} = 1_µ,
where the limit is considered point-wise.
→ Confirms the intuitive definition of the expectation E*.

Central limit theorem
Proposition: If d = 1, argmax_x f(x) = µ, and f is twice differentiable on the right of µ, then the possibility function f̄_n describing the uncertain variable √n(S_n − µ) verifies
lim_n f̄_n(x) = 1_µ(x) if ∂²₊f(µ) = −∞; exp(½ ∂²₊f(µ) (x − µ)²) if ∂²₊f(µ) ∈ (−∞, 0); 1(x) otherwise,
for any x ∈ [µ, ∞), and similarly on (−∞, µ].
Consequences:
- 9 limiting possibility functions (!)
- suggests a definition of the variance as −1/f″(µ)
- recovers exactly the Laplace approximation

Markov chain
Concept: a collection {X_n}_n is a (weak) Markov chain if
f_{X_n}(· | X_{1:n−1}) = f_{X_n}(· | X_{n−1})
Occupation time η_x at x ∈ X:
η_x = Σ_{n≥0} 1_x(X_n)
The point x is recurrent if Ē(η_x | X_0 = x) = ∞.
+ meaningful on uncountable spaces
− no guarantees on the actual behaviour of the chain

Filtering for possibility functions
A state-space model: consider a partially observed Markov chain {X_n}_n on X such that
X_n = G(X_{n−1}) + V_n
Y_n = H(X_n) + W_n
with {V_n}_n and {W_n}_n i.i.d. and such that f_{X_n}(· | X_{n−1}) = g(· | X_{n−1}) and f_{Y_n}(· | X_n) = h(· | X_n).
Filtering equations:
f_{X_n}(x | y_{1:n−1}) = sup_{x′∈X} g(x | x′) f_{X_{n−1}}(x′ | y_{1:n−1})
f_{X_n}(x | y_{1:n}) = h(y_n | x) f_{X_n}(x | y_{1:n−1}) / sup_{x′∈X} h(y_n | x′) f_{X_n}(x′ | y_{1:n−1})
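
A direct grid implementation of these two equations, with illustrative Gaussian possibility functions for g and h:

```python
import numpy as np

def possibility_filter(y_seq, xs, g, h, f0):
    """Grid version of the filtering equations: prediction takes a sup over x',
    the update multiplies by h(y_n | x) and renormalises so that sup f = 1."""
    f = f0 / f0.max()
    for y in y_seq:
        f_pred = np.max(g(xs[:, None], xs[None, :]) * f[None, :], axis=1)
        f = h(y, xs) * f_pred
        f /= f.max()
    return f

xs = np.linspace(-10.0, 10.0, 401)
g = lambda x, xp: np.exp(-0.5 * (x - xp) ** 2 / 0.5)    # transition possibility
h = lambda y, x: np.exp(-0.5 * (y - x) ** 2)            # observation possibility
f = possibility_filter([0.1, 0.4, 1.2], xs, g, h, f0=np.exp(-0.5 * xs ** 2 / 9.0))
print("posterior mode:", xs[np.argmax(f)])
```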

Kalman filter
Recursion with
f_{n−1}(x | y_{1:n−1}) = N̄(x; m_{n−1}, Σ_{n−1}), g(x | x′) = N̄(x; F x′, Q), h(y | x) = N̄(y; H x, R)
→ Same means m_{n|n−1}, m_n and spreads Σ_{n|n−1}, Σ_n as the standard Kalman filter
→ Different marginal likelihood:
f_{Y_n}(y_n) = exp(−½ (y_n − H m_{n|n−1})^T S_n^{−1} (y_n − H m_{n|n−1})), with S_n = H Σ_{n|n−1} H^T + R
Reference: J. H. and A. Bishop. "Smoothing and filtering with a class of outer measures". SIAM/ASA Journal on Uncertainty Quantification 6.2 (2018).
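
A one-step sketch: the mean/spread recursion is the familiar Kalman one, and only the marginal quantity f_{Y_n}(y_n) differs from the probabilistic marginal likelihood (it has no normalising constant and its supremum over y_n is one). The model matrices below are illustrative:

```python
import numpy as np

def possibility_kalman_step(m, S, y, F, Q, H, R):
    """One step of the possibilistic Kalman filter: means and spreads follow the
    standard Kalman recursion; only the marginal of y_n changes."""
    m_pred = F @ m
    S_pred = F @ S @ F.T + Q
    Sy = H @ S_pred @ H.T + R
    K = S_pred @ H.T @ np.linalg.inv(Sy)
    innov = y - H @ m_pred
    # marginal possibility of y_n: unnormalised Gaussian shape, sup equal to 1
    fy = np.exp(-0.5 * innov @ np.linalg.solve(Sy, innov))
    return m_pred + K @ innov, (np.eye(len(m)) - K @ H) @ S_pred, fy

m, S = np.zeros(2), np.eye(2)
F = np.array([[1.0, 1.0], [0.0, 1.0]]); Q = 0.1 * np.eye(2)
H = np.array([[1.0, 0.0]]); R = np.array([[1.0]])
m, S, fy = possibility_kalman_step(m, S, np.array([0.7]), F, Q, H, R)
print("marginal possibility of the observation:", fy)
```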

Natural language processing: bike theft
[Figure: map of the surroundings (Google Maps); red-dotted rectangle: area of interest, red dot: location of the bike theft.]
Reference: A. Bishop, J. H., D. Angley, and B. Ristić. "Spatio-temporal tracking from natural language statements using outer probability theory". Information Sciences 463-464 (2018).

Natural language processing: bike theft
Information to be confirmed:
1. Suspect alibi: "I was with a friend at the tram stop at the intersection of La Trobe St. and Elizabeth St."
2. CCTV: recording of the theft
The witnesses' declarations are:
1. The suspect was seen on Elizabeth St. around 2.07 p.m.
2. The suspect turned at the intersection of Swanston St. and A'Beckett St. between 2.25 p.m. and 2.35 p.m.
3. The suspect was seen near RMIT building 80 around 2.35 p.m.

Complex system
[Figure: position against time for the state sequence X_n and for the observation sequence Y_n, showing true observations and false alarms.]
Reference: J. H. "Detection and estimation of partially-observed dynamical systems: an outer-measure approach". arXiv:1801.00571 (2018).

Modelling
Uncertain counting measure:
X = Σ_{i=1}^{N(ω_u)} δ_{X_i(ω_u)}
with
- N an N-valued uncertain variable
- {X_i}_i a collection of X-valued uncertain variables
First-moment outer measure:
F_X(B) = Ē( max_{i∈{1,...,N}} 1_B(X_i) )
Proposition: If X and X′ are independent, then F_{X+X′}(x) = max{F_X(x), F_{X′}(x)}.

Detection and estimation of dynamical systems
- No information on false positives: f′(y_1, ..., y_n) = 1 for all y_1, ..., y_n ∈ O, n ≥ 0
- Lower bound on the probability of detection: h(φ | x) = α ⟹ p_D ≥ 1 − α
- Lower bound on the probability of staying in the state space: g(ψ | x) = β ⟹ p_S ≥ 1 − β