Inference for multi-object dynamical systems: methods and analysis
1 Inference for multi-object dynamical systems: methods and analysis
Jérémie Houssineau, National University of Singapore
September 13, 2018
Joint work with: Daniel Clark, Ajay Jasra, Sumeetpal Singh, Emmanuel Delande, Isabel Schlangen, Pierre Del Moral, Adrian Bishop and others.
2 Outline
1. Overview
2. Modelling and assumptions: Modelling; Analysis
3. Point-process formulation: Modelling; Recursion; Analysis; Alternatives
4. Inference with outer measures: Representing uncertainty; Complex systems
3 Overview
4 Applications
[Figure] (a) Microscopy (b) Surveillance (c) Space debris (d) Microfluidics
5 Multi-object dynamical system
The number of objects changes in time (birth/death process).
Observation process:
- observation of a given object might fail (false negative)
- when successful, it is prone to errors
- some observations originate from background noise (false positive)
- data association is unknown a priori
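The observation process described above can be made concrete with a small simulation. The sketch below is illustrative only, assuming a scalar state space, Gaussian detection noise and uniform clutter; the parameter values are assumptions, not values from the talk.

```python
import math
import random

def sample_poisson(lam, rng):
    """Sample from a Poisson distribution (Knuth's multiplicative method)."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

def observe(states, rng, p_d=0.9, noise=0.5, clutter_rate=2.0, region=(-10.0, 10.0)):
    """One scan of the multi-object observation process:
    - each object is detected with probability p_d (otherwise: false negative)
    - detections are corrupted by Gaussian noise (error-prone observations)
    - a Poisson number of uniform clutter points is added (false positives)
    - the scan is shuffled, so the data association is lost."""
    scan = [rng.gauss(x, noise) for x in states if rng.random() < p_d]
    scan += [rng.uniform(*region) for _ in range(sample_poisson(clutter_rate, rng))]
    rng.shuffle(scan)
    return scan
```

With p_d < 1 and clutter_rate > 0 the scan may be shorter or longer than the number of objects, and nothing in it identifies which point came from which object.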
6 Example: Space Situational Awareness [Video]
8 Example: Finite-resolution sensor
9 Example: Finite-resolution sensor (from H., Clark, and Del Moral)
[Figure] Trajectories and observations, for cell sizes (5 m, 1 deg) and (20 m, 4 deg). [Video]
11 Example: Classification (from Pailhas, H., Petillot, and Clark 2016)
12 Example: Classification (from Pailhas, H., Petillot, and Clark)
[Figure] Harbour surveillance: threat detection from motion-based classification
13 Example: Estimation of parameters (from H., Clark, Ivekovic, Lee, and Franco 2016)
Camera calibration: joint estimation of camera pose and paper plane trajectories. [Video]
15 Modelling and assumptions
16 Single-object modelling
Each object is characterised by a HMM parametrised by θ ∈ Θ with:
- a Markov kernel f_θ on the state space S ⊆ R^d
- an initial distribution µ
- a likelihood g_θ from S to the observation space O ⊆ R^d
17 Assumptions
- No interactions between objects (dynamics and observation)
- False positives are independent of all objects
- The birth/death process is independent of all objects
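As a concrete instance of the single-object model above, a scalar linear-Gaussian HMM can be simulated in a few lines; a sketch with illustrative parameter values (a, q, r are assumptions, not values from the talk).

```python
import random

def simulate_hmm(n_steps, rng, a=0.95, q=0.1, r=0.2, x0=0.0):
    """Scalar linear-Gaussian HMM: the Markov kernel f_theta(. | x) is
    N(a*x, q^2) and the likelihood g_theta(y | x) is N(x, r^2)."""
    xs, ys = [], []
    x = x0
    for _ in range(n_steps):
        x = a * x + rng.gauss(0.0, q)      # transition under the Markov kernel
        ys.append(x + rng.gauss(0.0, r))   # noisy observation via the likelihood
        xs.append(x)
    return xs, ys
```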
18 Multi-object modelling
Assume that the number of objects is known and fixed to K. [Figure] Simple scenario
22 Naive solution
First idea: solve the data association by finding the closest observation.
Leads to track coalescence. [Figure] Simple scenario
23 Better solutions
Second idea: find the best global association. Suboptimal over time.
Third idea: Multiple Hypothesis Tracking (Blackman 1986). Potentially costly.
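The "best global association" of the second idea can be sketched by brute force for small problems. The helper below is a hypothetical illustration with scalar tracks and squared-distance costs; in practice a polynomial-time assignment algorithm (e.g. the Hungarian method) or MHT is used instead of exhaustive enumeration.

```python
from itertools import permutations

def best_global_association(tracks, observations):
    """Global nearest neighbour: over all injective assignments of tracks to
    observations, pick the one minimising the total squared distance.
    Exhaustive enumeration, so exponential cost: small instances only."""
    best_cost, best = float("inf"), None
    for perm in permutations(range(len(observations)), len(tracks)):
        cost = sum((t - observations[j]) ** 2 for t, j in zip(tracks, perm))
        if cost < best_cost:
            best_cost, best = cost, list(perm)
    return best, best_cost
```

Unlike per-track nearest neighbour, two nearby tracks cannot claim the same observation here, which mitigates track coalescence.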
26 Multi-object modelling
False positives: i.i.d. Poisson with rate λ and distribution p_ϑ
Probability of detection p_D ∈ (0, 1]
Multi-object parameter θ̄ ≐ [θ, K, p_D, λ, ϑ]^t
Multi-object observation function, for y ∈ O^m, m ≥ 0:
g_θ̄(y | x) ≐ Σ_{d ∈ {0,1}^K : |d| ≤ m} [ Po_λ(m − |d|) Σ_{σ ∈ Sym(m)} ( Π_{i=|d|+1}^{m} p_ϑ(y_σ(i)) ) ( Π_{i=1}^{|d|} g_θ(y_σ(i) | x_r(i)) ) u_m(σ) ] q_θ̄(d)
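The multi-object observation function can be evaluated by brute force for small K and m, summing over which objects are detected and over injective assignments of detected objects to observations. The sketch below makes simplifying assumptions: scalar states, a Gaussian single-object likelihood, and a constant clutter density p_c; combinatorial conventions such as the u_m(σ) factor are absorbed into the enumeration, so this is a structural illustration rather than the exact formula from the talk.

```python
import math
from itertools import combinations, permutations

def g_single(y, x, s=1.0):
    """Assumed Gaussian single-object likelihood g_theta(y | x)."""
    return math.exp(-0.5 * ((y - x) / s) ** 2) / (s * math.sqrt(2.0 * math.pi))

def multi_object_likelihood(ys, xs, p_d=0.9, lam=1.0, p_c=0.1):
    """Sum over detection patterns d (which objects are seen) and injective
    assignments of detected objects to observations; unassigned observations
    are i.i.d. clutter, their number being Poisson with rate lam."""
    m, total = len(ys), 0.0
    for n_det in range(0, min(len(xs), m) + 1):
        # Poisson probability of the number of clutter points
        po = math.exp(-lam) * lam ** (m - n_det) / math.factorial(m - n_det)
        for detected in combinations(range(len(xs)), n_det):
            q_d = p_d ** n_det * (1.0 - p_d) ** (len(xs) - n_det)  # prob. of pattern d
            for assign in permutations(range(m), n_det):
                lik = 1.0
                for obj, j in zip(detected, assign):
                    lik *= g_single(ys[j], xs[obj])
                lik *= p_c ** (m - n_det)   # clutter density for unassigned points
                total += po * q_d * lik
    return total
```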
28 Analysis
Multi-object Fisher information matrix (assumed positive definite):
I(θ̄*) = lim_{n→∞} (1/n) E_θ̄*[ ∇_θ̄ log p_θ̄(Y_1:n) ∇_θ̄ log p_θ̄(Y_1:n)^t ]
Theorem (H., Singh, and Jasra 2017). Under assumptions of boundedness for the single-object transition and observation functions and the assumption of identifiability of θ̄, it holds that
lim_{n→∞} θ̂_{n,x_0} = θ̄*
for any x_0 ∈ S^K with K ∈ N. Under additional assumptions, it holds that
√n (θ̂_{n,x_0} − θ̄*) → N(0, I(θ̄*)^{-1})
for any x_0 ∈ S^K and any K ∈ N.
29 Analysis. Example: single static object with false alarms.
[Figure] Relative information loss as a function of the Poisson parameter λ (log scale), for Gaussian and uniform U([-5,5]), U([-10,10]), U([-25,25]), U([-50,50]), U([-100,100]) false-alarm distributions; reference levels E[N/(N+1)] and log(6+1).
30 Analysis. Example: 5 static objects with λ = 0 and fixed p_D.
[Figure] Information loss for varying association uncertainty α and spatial separation τ.
31 Analysis. Example: single object.
[Figure] Relative information loss for a varying probability of detection p_D (curves: experimental, 1 − p_D).
32 More generally
Objects persist between time steps with probability p_S ∈ (0, 1]
The number of births per time step has a known distribution
Use MCMC to explore:
- the set of data associations, assuming a linear-Gaussian single-object model (Oh, Russell, and Sastry 2009)
- the data associations and the states in general (Jiang, Singh, and Yıldırım 2015)
In both cases, parameters can also be estimated.
34 Point-process formulation
35 Multi-object modelling. Pros and cons:
+ allows for modelling uncertainty in the number of objects and the birth/death process
+ can use existing results in the literature
- prevents objects from being distinguished
36 Multi-object modelling
Assuming that:
- all model parameters are known
- object states and observations at time n are represented by the (simple) point processes X_n = Σ_{i=1}^{K} δ_{X_i} and Y_n = Σ_{i=1}^{N} δ_{Y_i}
- object births follow a point process X_b, independent of X_n
First idea: consider the first-moment density γ_n of X_n.
Useful results:
(1) If X' results from applying the dynamics modelled by f_θ to the points of the point process X, then γ_X'(x) = ∫ f_θ(x | x') γ_X(x') dx' for any x ∈ S
(2) If X and X' are independent point processes, then γ_{X+X'} = γ_X + γ_{X'}
39 First-moment recursion: Prediction
Denote by γ_{n−1}(· | Y_1:n−1) the posterior first-moment density at time n − 1.
Theorem (Mahler 2003). The predicted first-moment density γ_n(· | Y_1:n−1) is characterised by
γ_n(x | Y_1:n−1) = γ_b(x) + p_S ∫ f_θ(x | x') γ_{n−1}(x' | Y_1:n−1) dx'
for any x ∈ S.
Sketch of proof:
- Introduce ψ as a cemetery state and extend the state space to X̄ = S ∪ {ψ}
- Extend f_θ to X̄ as F_θ(ψ | x) = 1 − p_S and F_θ(x | x') = p_S f_θ(x | x')
- Apply (1) to F_θ and X_{n−1}, and (2) to the resulting point process and X_b
41 First-moment recursion: Update
Theorem (Mahler 2003). Assuming that the distribution of X_n given Y_1:n−1 is Poisson i.i.d., the posterior first-moment density γ_n(· | Y_1:n) is characterised by
γ_n(x | Y_1:n) = (1 − p_D) γ_n(x | Y_1:n−1) + ∫ [ p_D g_θ(y | x) γ_n(x | Y_1:n−1) / ( λ p_ϑ(y) + p_D ∫ g_θ(y | x') γ_n(x' | Y_1:n−1) dx' ) ] Y_n(dy)
for any x ∈ S.
Sketch of proof (based on Caron, Del Moral, Doucet, and Pace 2011):
- Introduce φ as an empty observation and Ȳ ≐ O ∪ {φ}
- Extend X_n to X̄_n on X̄ by adding false-positive generators on ψ
- Extend g_θ to a likelihood from X̄ to Ȳ as G_θ(φ | x) = 1 − p_D and G_θ(y | x) = p_D g_θ(y | x), and such that G_θ(· | ψ) = p_ϑ on O
43 First-moment recursion: Update (continued)
1. Denoting γ̄ the first-moment measure of X̄_n given Y_1:n−1, it holds that
E( F(X̄_n) | Y_1:n−1, Ȳ_n ) = ∫ F( Σ_{i=1}^{N} δ_{x_i} + Σ_{j=1}^{N_φ} δ_{x'_j} ) Π_{i=1}^{N} Ψ_{G_θ(Y_i | ·)}(γ̄)(dx_i) Π_{j=1}^{N_φ} Ψ_{G_θ(φ | ·)}(γ̄)(dx'_j)
2. Notice that the extension Ȳ_n = Y_n + N_φ δ_φ of Y_n to Ȳ verifies
E( F(Ȳ_n) | Y_n ) = exp( −(1 − p_D) Γ_n ) Σ_{k ≥ 0} ((1 − p_D) Γ_n)^k / k! · F(Y_n + k δ_φ), with Γ_n = ∫ γ_n(x | Y_1:n−1) dx
3. Conclude by the law of total expectation and γ_n(f | Y_1:n) = E( F(X_n) | Y_1:n ) with F(X_n) = X_n(f)
44 First-moment recursion: Implementations
SMC (Vo, Singh, and Doucet 2005):
- track extraction requires clustering in general (K-means)
- clustering can be based on the tracks' observation history (Pace and Del Moral 2013, Del Moral and H. 2015)
Gaussian mixture (Vo and Ma 2006):
- requires pruning and merging
- track extraction relies on merging in its simplest form
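A minimal Gaussian-mixture implementation of the prediction and update steps above, in one dimension and without the pruning/merging step; all numeric parameter values are illustrative assumptions, not values from the talk.

```python
import math

def gm_phd_step(mix, scan, a=1.0, q=0.1, r=0.2, p_s=0.99, p_d=0.9,
                lam=1.0, clutter_pdf=0.05, birth=(0.1, 0.0, 4.0)):
    """One predict+update of a 1-D Gaussian-mixture PHD filter (in the spirit
    of Vo & Ma 2006). mix is a list of (weight, mean, variance) components of
    gamma; the total weight of the output estimates the number of objects."""
    # prediction: survival thinning, linear-Gaussian dynamics, birth term
    pred = [(p_s * w, a * m, a * a * v + q) for (w, m, v) in mix]
    pred.append(birth)                       # gamma_b as a single component
    # update, part 1: missed-detection term (1 - p_D) * gamma_pred
    out = [((1.0 - p_d) * w, m, v) for (w, m, v) in pred]
    # update, part 2: one reweighted, Kalman-corrected copy per observation
    for y in scan:
        comps = []
        for (w, m, v) in pred:
            s = v + r                        # innovation variance
            g = math.exp(-0.5 * (y - m) ** 2 / s) / math.sqrt(2.0 * math.pi * s)
            k = v / s                        # Kalman gain
            comps.append((p_d * w * g, m + k * (y - m), (1.0 - k) * v))
        denom = lam * clutter_pdf + sum(c[0] for c in comps)
        out.extend((w / denom, m, v) for (w, m, v) in comps)
    return out
```

Without pruning and merging the number of components grows at each step, which is exactly why those operations are needed in practice.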
45 Example. [Figure] Simple scenario
49 First-moment recursion: Analysis (Del Moral 2013)
Write the recursion as (m_{n+1|n}, η_{n+1|n}) = Λ_n(m_{n|n−1}, η_{n|n−1}), and denote by Λ_n^(1) and Λ_n^(2) the first and second components of Λ_n.
Introduce Φ^(1)_{n,η_n}(m) = Λ^(1)_n(m, η_n) and Φ^(2)_{n,m_n}(µ) = Λ^(2)_n(m_n, µ), as well as the semigroups
Φ^(1)_{n,n',η} ≐ Φ^(1)_{n,η_n} ∘ ... ∘ Φ^(1)_{n',η_{n'}} and Φ^(2)_{n,n',m} ≐ Φ^(2)_{n,m_n} ∘ ... ∘ Φ^(2)_{n',m_{n'}}
50 First-moment recursion: Analysis (Del Moral 2013)
Assumptions:
(L) The following Lipschitz inequalities hold:
|Φ^(1)_{n,n',η}(m) − Φ^(1)_{n,n',η}(m')| ≤ c^(1)_{n,n'} |m − m'|
|[Φ^(2)_{n,n',m}(µ) − Φ^(2)_{n,n',m}(µ')](f)| ≤ c^(2)_{n,n'} ∫ |[µ − µ'](ϕ)| Q_{n,n',µ}(f, dϕ)
with c^(i)_{n,n'} ≤ a_i e^{−b_i(n−n')} for constants a_i and b_i > 0 verifying b_1 > b_2.
(C) The following continuity inequalities hold:
|Φ^(1)_{n,µ}(m) − Φ^(1)_{n,µ'}(m)| ≤ c^(1)_n ∫ |[µ − µ'](ϕ)| P_{n,µ}(dϕ)
|[Φ^(2)_{n,m}(µ) − Φ^(2)_{n,m'}(µ)](f)| ≤ c^(2)_n |m − m'|
with c_i = sup_n c^(i)_n < ∞ and a_1 a_2 c_1 c_2 small enough (the explicit bound involves the factors 1 − e^{−(b_1 − b_2)} and e^{−(b_1 − b_2)}).
51 First-moment recursion
Theorem (from Del Moral 2013). Under (L) and (C), the following Lipschitz inequalities hold:
|Λ^(1)_{n,n'}(m, µ) − Λ^(1)_{n,n'}(m', µ')| ≤ e^{−b(n−n')} ( a_{1,1} |m − m'| + a_{1,2} ∫ |[µ − µ'](ϕ)| P_{n,n',m,µ}(dϕ) )
and
|[Λ^(2)_{n,n'}(m, µ) − Λ^(2)_{n,n'}(m', µ')](f)| ≤ e^{−b(n−n')} ( a_{2,1} |m − m'| + a_{2,2} ∫ |[µ − µ'](ϕ)| Q_{n,n',m,µ}(f, dϕ) )
52 First-moment recursion
Assumption: for any y ∈ Ȳ, it holds that
l^(−)(y) ≐ inf_{x ∈ S} g_θ(y | x) > 0 and l^(+)(y) ≐ sup_{x ∈ S} g_θ(y | x) < ∞.
Theorem (from Del Moral 2013). If sup_n Y_n(f) is finite for f equal to l^(+)/l^(−) and l^(+)/(l^(−))^2, then there exist constants 0 < r_D ≤ 1, r_b < ∞ and r > 0 such that Φ^(1)_{n,n',η} and Φ^(2)_{n,n',m} satisfy the conditions (L) and (C) whenever p_D ≥ r_D, λ_b ≥ r_b, and λ ≤ r.
⇒ the first-moment recursion is exponentially stable when
1. the probability of detection is sufficiently high
2. the expected number of appearing objects is large enough
3. the number of spurious observations is limited
54 First-moment recursion: Conclusions and alternatives
Shortcomings of the first-moment recursion:
- short memory
- objects are indistinguishable
- track extraction can be difficult
Some related techniques:
- Use i.i.d. point processes instead (Mahler 2007): + confidence in the number of objects can be greatly improved; − introduces long-range interactions that can be counter-intuitive
- Use marked point processes (Vo, Vo, and Phung 2014): + allows for distinguishing objects; − object birth is less natural to represent
- Develop a representation of partial distinguishability (H.)
56 Using partial distinguishability (H. and Clark)
[Figure] A realisation of the target trajectories in the X-Y plane, with sensor and target positions shown.
57 Using partial distinguishability (H. and Clark)
[Figure] OSPA distance over time for the HISP, PHD, CPHD and LMB filters, with λ = 83.
58 [Figure] Same comparison with p_D = 0.8 and λ = 167.
59 [Figure] Same comparison with p_D = 0.5 and λ = 15.
60 Higher-order moments
Since the posterior point process is not Poisson i.i.d. in general, one can compute the variance after the update.
Theorem (Delande, Uney, H., and Clark 2014). The regional variance in B ⊆ S of X_n given Y_1:n is characterised by
var_{X_n | Y_1:n}(B) = (1 − p_D) ∫_B γ_n(x | Y_1:n−1) dx + ∫ R_y(B) (1 − R_y(B)) Y_n(dy)
with
R_y(B) = p_D ∫_B g_θ(y | x) γ_n(x | Y_1:n−1) dx / ( λ p_ϑ(y) + p_D ∫ g_θ(y | x') γ_n(x' | Y_1:n−1) dx' )
Consequence: if the origin of observations is unambiguous ⇒ low variance
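On a discretised state space the regional variance formula reduces to finite sums; a grid-based sketch, where gamma is the predicted first-moment density sampled on the grid and g a user-supplied single-object likelihood (all names and parameter values are illustrative assumptions).

```python
def regional_variance(grid, dx, gamma, in_B, scan, g, p_d=0.9, lam=1.0, p_c=0.05):
    """Evaluate var(B) = (1 - p_D) * int_B gamma + sum_y R_y(B)(1 - R_y(B)),
    with all integrals replaced by Riemann sums over cells of width dx."""
    mass_B = sum(gamma[i] for i, x in enumerate(grid) if in_B(x)) * dx
    var = (1.0 - p_d) * mass_B
    for y in scan:
        num_B = p_d * dx * sum(g(y, x) * gamma[i] for i, x in enumerate(grid) if in_B(x))
        num_S = p_d * dx * sum(g(y, x) * gamma[i] for i, x in enumerate(grid))
        r = num_B / (lam * p_c + num_S)   # R_y(B): prob. that y came from an object in B
        var += r * (1.0 - r)              # each observation contributes a Bernoulli variance
    return var
```

The Bernoulli term r(1 − r) vanishes when r is near 0 or 1, matching the remark that unambiguous observations yield low variance.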
62 Higher-order moments (Schlangen, Delande, H., and Clark 2018)
Other parametrisations of the cardinality are also possible: we can consider instead that the number of points K in X_n is Panjer distributed,
p_K(n) = (1 + β^{-1})^{−α} C(α + n − 1, n) (β + 1)^{−n}
- Finite and positive α and β ⇒ negative binomial
- Finite and negative α and β ⇒ binomial
- In the limit α, β → ∞ with λ = α/β ⇒ Poisson
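The Panjer probability mass function and its Poisson limit can be checked numerically; a sketch for the finite positive α, β (negative binomial) case, using lgamma for the generalised binomial coefficient.

```python
import math

def panjer_pmf(n, alpha, beta):
    """p_K(n) = (1 + 1/beta)^(-alpha) * C(alpha + n - 1, n) * (beta + 1)^(-n),
    for alpha, beta > 0 (negative binomial case); the mean is alpha/beta.
    Computed in log space for numerical stability."""
    log_coef = math.lgamma(alpha + n) - math.lgamma(alpha) - math.lgamma(n + 1)
    return math.exp(-alpha * math.log1p(1.0 / beta)
                    + log_coef - n * math.log(beta + 1.0))
```

Letting α and β grow with λ = α/β fixed recovers the Poisson pmf, as stated above.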
64 Higher-order moments (Schlangen, Delande, H., and Clark 2018)
[Figure] Tracking scenario with region A on the left and region B on the right; correlation corr(A, B) over time between the estimated numbers of targets in the two regions, for the Poisson, Panjer and general cardinality models.
65 Fundamental limitations
- The distribution of false positives can vary dramatically in time
- There is often no prior information on the location of objects
- The observation process is difficult to describe in a standard way ([Figure] radar cross section of an A-26 Invader, Wikipedia)
What about the uncertainty quantification?
69 Inference with outer measures
74 Outer probability measure
Assuming: a r.v. X with conditional probability distribution p(· | θ), and no knowledge about θ ∈ Θ. Then
P(X ∈ B) ≤ sup_{θ ∈ Θ} p(B | θ) f(θ) = P̄(B)
with f : Θ → [0, 1] such that sup_θ f(θ) = 1.
Remarks:
- Does not require a reference measure
- Standard operations apply directly: if Θ = Θ_1 × Θ_2, then
f_2(θ_2) = sup_{θ_1 ∈ Θ_1} f(θ_1, θ_2) and f_{1|2}(θ_1 | θ_2) = f(θ_1, θ_2) / f_2(θ_2)
J. H. "Parameter estimation with a class of outer probability measures". arXiv (2018)
76 Possibility functions
Possibility function | Parameter(s) | Function of x ∈ R
Uniform Ū([a, b]) | a, b ∈ R, a < b | 1_{[a,b]}(x)
Gaussian N̄(µ, σ²) | µ ∈ R, σ² > 0 | exp(−(x − µ)² / (2σ²))
Student's t | ν > 0 | (1 + x²/ν)^{−(ν+1)/2}
Cauchy | x_0 ∈ R, γ > 0 | γ² / ((x − x_0)² + γ²)
Pros & cons:
+ can be easily truncated, discretised
+ easy to introduce new possibility functions
- less obvious for distributions on N
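The possibility functions in the table differ from the corresponding probability densities only by the absence of a normalising constant: each attains 1 at its mode. A direct transcription:

```python
import math

def uniform_poss(x, a, b):
    """Uniform possibility: indicator of [a, b]."""
    return 1.0 if a <= x <= b else 0.0

def gaussian_poss(x, mu, sigma2):
    """Gaussian possibility: exponential kernel only, no 1/sqrt(2*pi*sigma2)."""
    return math.exp(-0.5 * (x - mu) ** 2 / sigma2)

def student_poss(x, nu):
    """Student's t possibility: kernel (1 + x^2/nu)^(-(nu+1)/2), no Gamma factor."""
    return (1.0 + x * x / nu) ** (-(nu + 1.0) / 2.0)

def cauchy_poss(x, x0, gamma):
    """Cauchy possibility: gamma^2 / ((x - x0)^2 + gamma^2)."""
    return gamma ** 2 / ((x - x0) ** 2 + gamma ** 2)
```

In all four cases sup_x f(x) = 1, so no reference measure is needed, and truncating or discretising preserves this as long as the mode is kept.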
79 Uncertain variable
Ingredients:
- a sample space Ω_u for deterministic but uncertain phenomena
- a probability space (Ω_r, F, P(· | ω_u)) for any ω_u ∈ Ω_u
- a state space X and a parameter space Θ
An uncertain variable is a mapping X : Ω_u × Ω_r → Θ × X, X(ω_u, ω_r) = (X_u(ω_u), X_r(ω_r)), such that
1. X_r : Ω_r → X is a random variable
2. P(X_r^{-1}(B) | ·) is constant over X_u^{-1}[θ] for any B ⊆ X and θ ∈ Θ
Consequences: 1. θ is sufficiently informative about X_r; 2. one can deduce the conditional distribution p(· | θ)
80 Inference with outer measures Representing uncertainty Uncertain variable Ingredients: A sample space Ω u for deterministic but uncertainty phenomena A probability space (Ω r, F, P( ω u )) for any ω u Ω u A state space X and a parameter space Θ An uncertain variable is a mapping such that X : Ω u Ω r Θ X (ω u, ω r ) (X u (ω u ), X r (ω r )) X r : Ω r X is a random variable P(Xr 1 (B) ) is constant over Xu 1 [θ] for any B X and θ Θ 1. implies that θ is sufficiently informative about X r 2. can deduce the conditional distribution p( θ) Jérémie Houssineau (NUS) Multi-object dynamical systems September 13, / 67
81 Inference with outer measures Representing uncertainty Uncertain variable Ingredients: A sample space Ω u for deterministic but uncertainty phenomena A probability space (Ω r, F, P( ω u )) for any ω u Ω u A state space X and a parameter space Θ An uncertain variable is a mapping such that X : Ω u Ω r Θ X (ω u, ω r ) (X u (ω u ), X r (ω r )) X r : Ω r X is a random variable P(Xr 1 (B) ) is constant over Xu 1 [θ] for any B X and θ Θ 1. implies that θ is sufficiently informative about X r 2. can deduce the conditional distribution p( θ) Jérémie Houssineau (NUS) Multi-object dynamical systems September 13, / 67
82 Inference with outer measures Representing uncertainty

Assumption & basic concepts

Assumption. Henceforth: p(· | θ) = δ_θ and Θ = X.

Concept. The (deterministic) uncertain variables X and Y are (weakly)
independent if
    f_{X,Y}(x, y) = f_X(x) f_Y(y)

Even if X and Y are not independent, f_X ⊗ 1 and 1 ⊗ f_Y are valid
descriptions of (X, Y), with
    f_X(x) = sup_y f_{X,Y}(x, y)   and   f_Y(y) = sup_x f_{X,Y}(x, y)
→ information loss

Jérémie Houssineau (NUS) Multi-object dynamical systems September 13, 2018
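Marginalisation by supremum is easy to illustrate on a grid. A sketch assuming a correlated Gaussian-shaped joint possibility function (the joint and all variable names are my own illustrative choices):

```python
import numpy as np

# Joint possibility function on a grid: correlated Gaussian-shaped example.
x = np.linspace(-4.0, 4.0, 201)
y = np.linspace(-4.0, 4.0, 201)
X, Y = np.meshgrid(x, y, indexing="ij")
rho = 0.8
f_xy = np.exp(-0.5 * (X**2 - 2 * rho * X * Y + Y**2) / (1 - rho**2))

# Marginal descriptions via supremum over the other variable:
#   f_X(x) = sup_y f_{X,Y}(x, y),   f_Y(y) = sup_x f_{X,Y}(x, y)
f_x = f_xy.max(axis=1)
f_y = f_xy.max(axis=0)

# For this joint, the supremum over y is attained at y = rho * x,
# so the marginal collapses to the standard Gaussian possibility function:
assert np.allclose(f_x, np.exp(-0.5 * x**2), atol=1e-2)
```

The marginals retain no trace of the correlation rho: that is exactly the "information loss" the slide refers to.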
86 Inference with outer measures Representing uncertainty

Expectation

By identification. For a non-negative function ϕ:
    Ē(ϕ(X)) = sup_x ϕ(x) f(x)

Example: define the self-information as I(x) = −log f(x); then the entropy
    H(X) = Ē(I(X)) = sup_x (−f(x) log f(x))
→ meaningful on uncountable spaces

Intuitively:
    E*(X) = argmax_x f(x)

Example: maximum-likelihood estimate with i.i.d. samples y_1, ..., y_n ~ p(· | x):
    E*(X | y_{1:n}) = argmax_x f(x) ∏_{i=1}^n p(y_i | x)
→ can justify profile likelihood

Jérémie Houssineau (NUS) Multi-object dynamical systems September 13, 2018
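The sup-based expectation can be computed on a grid. A minimal sketch (helper name `e_bar` is mine) evaluating the credibility of an event and the mode-based point estimate, assuming a Gaussian possibility function with mode 1:

```python
import numpy as np

x = np.linspace(-5.0, 5.0, 2001)
f = np.exp(-0.5 * (x - 1.0) ** 2)   # Gaussian possibility function, mode 1

def e_bar(phi_vals, f_vals):
    """Possibilistic expectation of a non-negative function: sup of phi * f."""
    return np.max(phi_vals * f_vals)

# Credibility of the event {X >= 3}: E-bar(1_{X>=3}(X)) = sup_{x>=3} f(x)
cred = e_bar((x >= 3.0).astype(float), f)
assert np.isclose(cred, np.exp(-2.0), atol=1e-3)   # attained at x = 3

# The "intuitive" expectation E*(X) is the mode of f
assert np.isclose(x[np.argmax(f)], 1.0, atol=1e-2)
```

Taking ϕ as an indicator function, as above, recovers the outer-measure value of an event, which is how these expectations connect back to the outer-measure viewpoint.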
89 Inference with outer measures Representing uncertainty

Law of large numbers

Let X_1, X_2, ... be a collection of weakly independent uncertain variables
on R^d, each described by the possibility function f; then S_n = n^{-1} ∑_{i=1}^n X_i
is described by
    f_{S_n}(y) = sup{ ∏_{i=1}^n f(x_i) : n^{-1}(x_1 + ... + x_n) = y }.

Proposition. If f(x) → 0 when |x| → ∞ and argmax_x f(x) = µ, then f_{S_n} verifies
    lim_{n→∞} f_{S_n} = 1_µ,
where the limit is considered point-wise.

→ Confirms the intuitive definition of the expectation

Jérémie Houssineau (NUS) Multi-object dynamical systems September 13, 2018
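For a Gaussian possibility function the constrained supremum has a closed form (a short derivation of mine, not from the slides): maximising ∏ exp(−(x_i − µ)²/(2σ²)) subject to the mean constraint is attained at x_1 = ... = x_n = y, giving f_{S_n}(y) = exp(−n(y − µ)²/(2σ²)), which indeed converges pointwise to the indicator of {µ}:

```python
import numpy as np

mu, sigma2 = 1.0, 2.0
y = np.linspace(-3.0, 5.0, 401)   # grid contains mu exactly

# Closed form of f_{S_n} for a Gaussian possibility function with mode mu:
# the sup over {x_1 + ... + x_n = n y} is attained at x_i = y for all i.
for n in (1, 10, 1000):
    f_Sn = np.exp(-0.5 * n * (y - mu) ** 2 / sigma2)
    assert np.isclose(f_Sn.max(), 1.0)   # mode stays at mu with value 1

# For very large n the possibility function is negligible away from mu,
# consistent with the pointwise limit 1_mu of the proposition.
f_big = np.exp(-0.5 * 1e6 * (y - mu) ** 2 / sigma2)
assert f_big[y != mu].max() < 1e-6
```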
93 Inference with outer measures Representing uncertainty

CLT

Proposition. If d = 1, argmax_x f(x) = µ and f is twice differentiable on the
right of µ, then the possibility function f_n describing the uncertain variable
√n (S_n − µ) verifies

    lim_{n→∞} f_n(x) = 1_µ(x)                         if f''_+(µ) = −∞
                       exp( f''_+(µ) (x − µ)² / 2 )    if f''_+(µ) in (−∞, 0)
                       1(x)                            otherwise,

for any x in [µ, ∞), and similarly on (−∞, µ].

Consequences:
- 9 limiting possibility functions (!)
- Suggests a definition of the variance as −1/f''(µ)
- Recovers exactly the Laplace approximation

Jérémie Houssineau (NUS) Multi-object dynamical systems September 13, 2018
95 Inference with outer measures Representing uncertainty

Markov chain

Concept. A collection {X_n}_n is a (weak) Markov chain if
    f_{X_n}(· | X_{1:n-1}) = f_{X_n}(· | X_{n-1})

Occupation time η_x at x in X:
    η_x = ∑_{n≥0} 1_x(X_n)

The point x is recurrent if Ē(η_x | X_0 = x) = ∞
→ meaningful on uncountable spaces
→ no guarantees on the actual behaviour of the chain

Jérémie Houssineau (NUS) Multi-object dynamical systems September 13, 2018
101 Inference with outer measures Representing uncertainty

Filtering for possibility functions

A state space model. Consider a partially-observed Markov chain {X_n}_n on X
such that
    X_n = G(X_{n-1}) + V_n
    Y_n = H(X_n) + W_n
with {V_n}_n and {W_n}_n i.i.d. such that
    f_{X_n}(· | X_{n-1}) = g(· | X_{n-1})   and   f_{Y_n}(· | X_n) = h(· | X_n)

Filtering equations:
    f_{X_n}(x | y_{1:n-1}) = sup_{x' in X} g(x | x') f_{X_{n-1}}(x' | y_{1:n-1})

    f_{X_n}(x | y_{1:n}) = h(y_n | x) f_{X_n}(x | y_{1:n-1})
                           / sup_{x' in X} h(y_n | x') f_{X_n}(x' | y_{1:n-1}).

Jérémie Houssineau (NUS) Multi-object dynamical systems September 13, 2018
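The two filtering equations above can be sketched directly on a grid; the transition and likelihood possibility functions below are illustrative Gaussian choices of mine, not taken from the slides:

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 401)   # discretised state space

def g(xn, xp):
    """Transition possibility function f_{X_n}(x | x'): Gaussian around 0.9 x'."""
    return np.exp(-0.5 * (xn - 0.9 * xp) ** 2 / 0.5)

def h(y, xn):
    """Observation possibility function f_{Y_n}(y | x): Gaussian around x."""
    return np.exp(-0.5 * (y - xn) ** 2 / 1.0)

f_prev = np.exp(-0.5 * x ** 2 / 4.0)   # prior possibility function

def predict(f_prev):
    # f_{X_n}(x | y_{1:n-1}) = sup_{x'} g(x | x') f_{X_{n-1}}(x' | y_{1:n-1})
    return np.max(g(x[:, None], x[None, :]) * f_prev[None, :], axis=1)

def update(f_pred, y):
    # f_{X_n}(x | y_{1:n}) = h(y_n | x) f_pred(x) / sup_{x'} h(y_n | x') f_pred(x')
    num = h(y, x) * f_pred
    return num / num.max()

f_pred = predict(f_prev)
f_post = update(f_pred, y=2.0)
assert np.isclose(f_post.max(), 1.0)   # posterior possibility peaks at 1
```

Note the structural parallel with the Bayesian filter: integration is replaced by supremum in both the prediction step and the normalisation.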
104 Inference with outer measures Representing uncertainty

Kalman filter

Recursion with
    f_{n-1}(x | y_{1:n-1}) = N̄(x; m_{n-1}, Σ_{n-1}),
    g(x | x') = N̄(x; F x', Q),   h(y | x) = N̄(y; H x, R)

- Same means m_{n|n-1}, m_n and spreads Σ_{n|n-1}, Σ_n as the standard
  Kalman filter
- Different marginal likelihood:
    f_{Y_n}(y_n) = exp( −½ (y_n − H m_{n|n-1})^T S_n^{-1} (y_n − H m_{n|n-1}) )
  with S_n = H Σ_{n|n-1} H^T + R

J. H. and A. Bishop. Smoothing and filtering with a class of outer measures.
SIAM Journal on Uncertainty Quantification 6.2 (2018)

Jérémie Houssineau (NUS) Multi-object dynamical systems September 13, 2018
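A single predict/update step of this possibilistic Kalman filter can be sketched as follows; the mean/spread recursion is the standard Kalman one, and only the marginal likelihood differs (it lacks the 1/√det(2πS) normalisation, so it always lies in (0, 1]). Function and variable names are mine:

```python
import numpy as np

def possibilistic_kalman_step(m, P, y, F, Q, H, R):
    """One predict/update step; means and spreads follow the usual Kalman
    recursion, only the marginal likelihood is a pure exponential."""
    # Prediction
    m_pred = F @ m
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    innov = y - H @ m_pred
    m_post = m_pred + K @ innov
    P_post = P_pred - K @ H @ P_pred
    # Possibilistic marginal likelihood: exp(-0.5 innov^T S^{-1} innov)
    lik = float(np.exp(-0.5 * innov @ np.linalg.solve(S, innov)))
    return m_post, P_post, lik

# Toy constant-velocity model (illustrative values only)
m = np.zeros(2); P = np.eye(2)
F = np.array([[1.0, 1.0], [0.0, 1.0]]); Q = 0.1 * np.eye(2)
H = np.array([[1.0, 0.0]]); R = np.array([[0.5]])
m1, P1, lik = possibilistic_kalman_step(m, P, np.array([0.3]), F, Q, H, R)
assert 0.0 < lik <= 1.0   # bounded, unlike a Gaussian density value
```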
106 Inference with outer measures Complex systems

Natural language processing: Bike theft

Figure: Map of the surroundings (Google Maps). Red-dotted rectangle: area of
interest; red dot: location of the bike theft.

A. Bishop, J. H., D. Angley, and B. Ristić. Spatio-temporal tracking from
natural language statements using outer probability theory. Information
Sciences (Elsevier, 2018)

Jérémie Houssineau (NUS) Multi-object dynamical systems September 13, 2018
107 Inference with outer measures Complex systems

Natural language processing: Bike theft

Information to be confirmed:
1. Suspect alibi: "I was with a friend at the tram stop on the intersection of
   La Trobe St. and Elizabeth St."
2. CCTV: recording of the theft

The witnesses' declarations are:
1. The suspect has been seen on Elizabeth St. around 2.07 p.m.
2. The suspect turned at the intersection of Swanston St. and A'Beckett St.
   between 2.25 p.m. and 2.35 p.m.
3. The suspect has been seen near RMIT building 80 around 2.35 p.m.

Jérémie Houssineau (NUS) Multi-object dynamical systems September 13, 2018
110 Inference with outer measures Complex systems

Complex system

[Figure: hidden states X_n and observations Y_n, plotted as position against
time, with false alarms and the true observation marked.]

J. H. Detection and estimation of partially-observed dynamical systems: an
outer-measure approach. arXiv:1801.00571 (2018)

Jérémie Houssineau (NUS) Multi-object dynamical systems September 13, 2018
111 Inference with outer measures Complex systems

Modelling

Uncertain counting measure:
    X(ω_u) = ∑_{i=1}^{N(ω_u)} δ_{X_i(ω_u)}
with
- N a N-valued uncertain variable
- {X_i}_i a collection of X-valued uncertain variables

First-moment outer measure:
    F_X(B) = Ē( max_{i in {1,...,N}} 1_B(X_i) )

Proposition. If X and X' are independent, then
    F_{X+X'}(x) = max{ F_X(x), F_{X'}(x) }.

Jérémie Houssineau (NUS) Multi-object dynamical systems September 13, 2018
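The max rule in the proposition is what makes superposition of independent populations cheap: first moments combine pointwise by maximum rather than by addition. A small grid sketch (the two first-moment shapes below are illustrative choices of mine):

```python
import numpy as np

# First-moment outer measures of two independent uncertain counting
# measures, discretised on a common grid.
x = np.linspace(-5.0, 5.0, 501)
F_X  = np.exp(-0.5 * (x + 1.0) ** 2)          # one population, mode at -1
F_Xp = 0.8 * np.exp(-0.5 * (x - 2.0) ** 2)    # another, lower credibility

# Superposition: F_{X + X'}(x) = max{ F_X(x), F_{X'}(x) }
F_sum = np.maximum(F_X, F_Xp)
assert np.isclose(F_sum.max(), 1.0)
```

Contrast this with the first moment (intensity) of a superposition of independent point processes, which is the sum of the individual intensities.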
113 Inference with outer measures Complex systems

Detection and estimation of dynamical systems

- No information on false positives:
    f(y_1, ..., y_n) = 1,   y_1, ..., y_n in O,   n ≥ 0
- Lower bound on the probability of detection:
    h(φ | x) = α   ⇔   p_D ≥ 1 − α
- Lower bound on the probability of staying in the state space:
    g(ψ | x) = β   ⇔   p_S ≥ 1 − β

Jérémie Houssineau (NUS) Multi-object dynamical systems September 13, 2018
More informationLecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable
Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed
More informationAsymptotic Statistics-III. Changliang Zou
Asymptotic Statistics-III Changliang Zou The multivariate central limit theorem Theorem (Multivariate CLT for iid case) Let X i be iid random p-vectors with mean µ and and covariance matrix Σ. Then n (
More informationPerhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows.
Chapter 5 Two Random Variables In a practical engineering problem, there is almost always causal relationship between different events. Some relationships are determined by physical laws, e.g., voltage
More informationLecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable
Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed
More informationProbability Models. 4. What is the definition of the expectation of a discrete random variable?
1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions
More informationSmoothing Algorithms for the Probability Hypothesis Density Filter
Smoothing Algorithms for the Probability Hypothesis Density Filter Sergio Hernández Laboratorio de Procesamiento de Información GeoEspacial. Universidad Católica del Maule. Talca, Chile. shernandez@ucm.cl.
More informationP (A G) dp G P (A G)
First homework assignment. Due at 12:15 on 22 September 2016. Homework 1. We roll two dices. X is the result of one of them and Z the sum of the results. Find E [X Z. Homework 2. Let X be a r.v.. Assume
More informationErgodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R.
Ergodic Theorems Samy Tindel Purdue University Probability Theory 2 - MA 539 Taken from Probability: Theory and examples by R. Durrett Samy T. Ergodic theorems Probability Theory 1 / 92 Outline 1 Definitions
More informationLECTURE 15 Markov chain Monte Carlo
LECTURE 15 Markov chain Monte Carlo There are many settings when posterior computation is a challenge in that one does not have a closed form expression for the posterior distribution. Markov chain Monte
More informationMath 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14
Math 325 Intro. Probability & Statistics Summer Homework 5: Due 7/3/. Let X and Y be continuous random variables with joint/marginal p.d.f. s f(x, y) 2, x y, f (x) 2( x), x, f 2 (y) 2y, y. Find the conditional
More informationLinear Dynamical Systems
Linear Dynamical Systems Sargur N. srihari@cedar.buffalo.edu Machine Learning Course: http://www.cedar.buffalo.edu/~srihari/cse574/index.html Two Models Described by Same Graph Latent variables Observations
More informationDistributions of Functions of Random Variables. 5.1 Functions of One Random Variable
Distributions of Functions of Random Variables 5.1 Functions of One Random Variable 5.2 Transformations of Two Random Variables 5.3 Several Random Variables 5.4 The Moment-Generating Function Technique
More informationContraction properties of Feynman-Kac semigroups
Journées de Statistique Marne La Vallée, January 2005 Contraction properties of Feynman-Kac semigroups Pierre DEL MORAL, Laurent MICLO Lab. J. Dieudonné, Nice Univ., LATP Univ. Provence, Marseille 1 Notations
More information6.1 Moment Generating and Characteristic Functions
Chapter 6 Limit Theorems The power statistics can mostly be seen when there is a large collection of data points and we are interested in understanding the macro state of the system, e.g., the average,
More informationQuick Tour of Basic Probability Theory and Linear Algebra
Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra CS224w: Social and Information Network Analysis Fall 2011 Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra Outline Definitions
More informationCPHD filtering in unknown clutter rate and detection profile
CPHD filtering in unnown clutter rate and detection profile Ronald. P. S. Mahler, Ba Tuong Vo, Ba Ngu Vo Abstract In Bayesian multi-target filtering we have to contend with two notable sources of uncertainty,
More informationIncorporating Track Uncertainty into the OSPA Metric
14th International Conference on Information Fusion Chicago, Illinois, USA, July 5-8, 211 Incorporating Trac Uncertainty into the OSPA Metric Sharad Nagappa School of EPS Heriot Watt University Edinburgh,
More informationA Review of Pseudo-Marginal Markov Chain Monte Carlo
A Review of Pseudo-Marginal Markov Chain Monte Carlo Discussed by: Yizhe Zhang October 21, 2016 Outline 1 Overview 2 Paper review 3 experiment 4 conclusion Motivation & overview Notation: θ denotes the
More informationUniversity of Toronto Department of Statistics
Norm Comparisons for Data Augmentation by James P. Hobert Department of Statistics University of Florida and Jeffrey S. Rosenthal Department of Statistics University of Toronto Technical Report No. 0704
More informationMcGill University. Faculty of Science. Department of Mathematics and Statistics. Part A Examination. Statistics: Theory Paper
McGill University Faculty of Science Department of Mathematics and Statistics Part A Examination Statistics: Theory Paper Date: 10th May 2015 Instructions Time: 1pm-5pm Answer only two questions from Section
More informationRobotics 2 Data Association. Giorgio Grisetti, Cyrill Stachniss, Kai Arras, Wolfram Burgard
Robotics 2 Data Association Giorgio Grisetti, Cyrill Stachniss, Kai Arras, Wolfram Burgard Data Association Data association is the process of associating uncertain measurements to known tracks. Problem
More informationMean field simulation for Monte Carlo integration. Part II : Feynman-Kac models. P. Del Moral
Mean field simulation for Monte Carlo integration Part II : Feynman-Kac models P. Del Moral INRIA Bordeaux & Inst. Maths. Bordeaux & CMAP Polytechnique Lectures, INLN CNRS & Nice Sophia Antipolis Univ.
More informationExponential Family and Maximum Likelihood, Gaussian Mixture Models and the EM Algorithm. by Korbinian Schwinger
Exponential Family and Maximum Likelihood, Gaussian Mixture Models and the EM Algorithm by Korbinian Schwinger Overview Exponential Family Maximum Likelihood The EM Algorithm Gaussian Mixture Models Exponential
More informationProbability and Measure
Chapter 4 Probability and Measure 4.1 Introduction In this chapter we will examine probability theory from the measure theoretic perspective. The realisation that measure theory is the foundation of probability
More informationAPPM/MATH 4/5520 Solutions to Exam I Review Problems. f X 1,X 2. 2e x 1 x 2. = x 2
APPM/MATH 4/5520 Solutions to Exam I Review Problems. (a) f X (x ) f X,X 2 (x,x 2 )dx 2 x 2e x x 2 dx 2 2e 2x x was below x 2, but when marginalizing out x 2, we ran it over all values from 0 to and so
More informationChp 4. Expectation and Variance
Chp 4. Expectation and Variance 1 Expectation In this chapter, we will introduce two objectives to directly reflect the properties of a random variable or vector, which are the Expectation and Variance.
More informationExercises with solutions (Set D)
Exercises with solutions Set D. A fair die is rolled at the same time as a fair coin is tossed. Let A be the number on the upper surface of the die and let B describe the outcome of the coin toss, where
More informationLecture 2: From Linear Regression to Kalman Filter and Beyond
Lecture 2: From Linear Regression to Kalman Filter and Beyond Department of Biomedical Engineering and Computational Science Aalto University January 26, 2012 Contents 1 Batch and Recursive Estimation
More informationExercises Tutorial at ICASSP 2016 Learning Nonlinear Dynamical Models Using Particle Filters
Exercises Tutorial at ICASSP 216 Learning Nonlinear Dynamical Models Using Particle Filters Andreas Svensson, Johan Dahlin and Thomas B. Schön March 18, 216 Good luck! 1 [Bootstrap particle filter for
More informationSTA205 Probability: Week 8 R. Wolpert
INFINITE COIN-TOSS AND THE LAWS OF LARGE NUMBERS The traditional interpretation of the probability of an event E is its asymptotic frequency: the limit as n of the fraction of n repeated, similar, and
More informationGraphical Models and Kernel Methods
Graphical Models and Kernel Methods Jerry Zhu Department of Computer Sciences University of Wisconsin Madison, USA MLSS June 17, 2014 1 / 123 Outline Graphical Models Probabilistic Inference Directed vs.
More informationsimple if it completely specifies the density of x
3. Hypothesis Testing Pure significance tests Data x = (x 1,..., x n ) from f(x, θ) Hypothesis H 0 : restricts f(x, θ) Are the data consistent with H 0? H 0 is called the null hypothesis simple if it completely
More informationn! (k 1)!(n k)! = F (X) U(0, 1). (x, y) = n(n 1) ( F (y) F (x) ) n 2
Order statistics Ex. 4. (*. Let independent variables X,..., X n have U(0, distribution. Show that for every x (0,, we have P ( X ( < x and P ( X (n > x as n. Ex. 4.2 (**. By using induction or otherwise,
More informationGaussian Mixture PHD and CPHD Filtering with Partially Uniform Target Birth
PREPRINT: 15th INTERNATIONAL CONFERENCE ON INFORMATION FUSION, ULY 1 Gaussian Mixture PHD and CPHD Filtering with Partially Target Birth Michael Beard, Ba-Tuong Vo, Ba-Ngu Vo, Sanjeev Arulampalam Maritime
More informationExpectation. DS GA 1002 Statistical and Mathematical Models. Carlos Fernandez-Granda
Expectation DS GA 1002 Statistical and Mathematical Models http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall16 Carlos Fernandez-Granda Aim Describe random variables with a few numbers: mean, variance,
More informationPROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS
PROBABILITY: LIMIT THEOREMS II, SPRING 218. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please
More information