Introduction to Sensor Data Fusion Methods and Applications


Introduction to Sensor Data Fusion Methods and Applications
Last lecture: Why sensor data fusion? Motivation, general context, discussion of examples.
- oral examination: 6 credit points after the end of the semester; prerequisite: participation in the exercises, running programs
- continuation in summer: lectures and seminar on advanced topics
- job opportunities as research assistant in ongoing projects, practicum; subsequently: master theses at Fraunhofer FKIE, PhD theses possible
- slides/script: email to wolfgang.koch@fkie.fraunhofer.de, download

Sensor & Information Fusion: Basic Task
- information sources: defined by operational requirements
- information to be fused: imprecise, incomplete, ambiguous, unresolved, false, deceptive, hard to formalize, contradictory ...

Processing chain:
- single sensors / sensor network: measurements (kinematical parameters, classification attributes)
- data processing / fusion: temporal integration / logical analysis, statistical estimation / data association, combination with a priori information
- condensed information: objects represented by track structures, quantitative accuracy/reliability measures

A Generic Tracking and Sensor Data Fusion System

Sensor systems:
- Sensing hardware: received waveforms (the source of information, space-time)
- Detection process: data rate reduction (a decision process)
- Signal processing: parameter estimation (the data fusion input: estimated target parameters, false plots)
- Sensor data output, sensor control

Tracking & fusion system:
- A priori knowledge: sensor performance, object characteristics, object environment (exploit available context information, e.g. topography)
- Track initiation: multiple frame track extraction (establish new tracks, re-initiate lost tracks)
- Sensor data to track association (the core function: associate sensor data with established tracks)
- Track maintenance: prediction, filtering, retrodiction; track file storage (updating of established target tracks)
- Track processing: track cancellation, object classification / ID, track-to-track fusion (fusion/management of pre-processed track information)
- Man-machine interface: object representation, displaying functions, interaction facilities (user interface: presentation, interaction, decision support)

Target Tracking: Basic Idea, Demonstration
Problem-inherent uncertainties and ambiguities! BAYES: a processing scheme for soft, delayed decisions.
- sensor performance: resolution conflicts, DOPPLER blindness
- environment: dense situations, clutter, jamming/deception
- target characteristics: qualitatively distinct maneuvering phases
- background knowledge: vehicles on road networks, tactics

pdf at $t_{k-1}$: Probability density functions (pdfs) $p(x_{k-1} \mid Z^{k-1})$ represent imprecise knowledge of the state $x_{k-1}$, based on imprecise measurements $Z^{k-1}$.

Prediction for $t_k$: exploit imprecise knowledge of the dynamical behavior of the object.
$$\underbrace{p(x_k \mid Z^{k-1})}_{\text{prediction}} \;=\; \int dx_{k-1}\; \underbrace{p(x_k \mid x_{k-1})}_{\text{dynamics}}\; \underbrace{p(x_{k-1} \mid Z^{k-1})}_{\text{old knowledge}}.$$
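
A minimal numerical sketch of this prediction integral on a discretized one-dimensional state space; the Gaussian random-walk dynamics $p(x_k \mid x_{k-1})$ and all numerical values are assumptions chosen purely for illustration.

```python
import numpy as np

# Grid-based evaluation of the prediction (Chapman-Kolmogorov) integral
#   p(x_k | Z^{k-1}) = int dx_{k-1} p(x_k | x_{k-1}) p(x_{k-1} | Z^{k-1}).
grid = np.linspace(-10.0, 10.0, 401)     # discretized 1D state space
dx = grid[1] - grid[0]

def gauss(x, mu, sigma):
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

prior = gauss(grid, 0.0, 1.0)                        # old knowledge p(x_{k-1} | Z^{k-1})
dynamics = gauss(grid[:, None], grid[None, :], 2.0)  # transition density p(x_k | x_{k-1})

predicted = dynamics @ prior * dx                    # numerical integration over x_{k-1}
print("predicted density integrates to", np.sum(predicted) * dx)  # ~ 1
```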

$t_k$: no plot. A missing sensor detection means data processing = prediction (not always: exploitation of negative sensor evidence).

Prediction for $t_{k+1}$: missing sensor information leads to increasing dissipation of knowledge.

$t_{k+1}$: one plot, i.e. sensor information on the kinematical object state.

Likelihood (sensor model) combined with the prediction for $t_{k+1}$. BAYES formula:
$$\underbrace{p(x_{k+1} \mid Z^{k+1})}_{\text{new knowledge}} \;=\; \frac{\overbrace{p(z_{k+1} \mid x_{k+1})}^{\text{plot}}\;\overbrace{p(x_{k+1} \mid Z^{k})}^{\text{prediction}}}{\int dx_{k+1}\; p(z_{k+1} \mid x_{k+1})\, p(x_{k+1} \mid Z^{k})}.$$
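
Continuing the grid example from the prediction step, a minimal sketch of the BAYES update; the Gaussian sensor model and all numbers are again assumptions for illustration only.

```python
import numpy as np

# Grid-based BAYES update: posterior is likelihood times prediction, normalized
# by the integral in the denominator of the formula above.
grid = np.linspace(-10.0, 10.0, 401)
dx = grid[1] - grid[0]

def gauss(x, mu, sigma):
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

predicted = gauss(grid, 0.0, np.sqrt(5.0))   # p(x_{k+1} | Z^k), the prediction
z = 1.5                                      # received plot (assumed measurement)
likelihood = gauss(z, grid, 1.0)             # sensor model p(z_{k+1} | x_{k+1})

unnormalized = likelihood * predicted
posterior = unnormalized / (np.sum(unnormalized) * dx)

print("posterior mean:", np.sum(grid * posterior) * dx)
```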

pdf at $t_{k+1}$ (Bayes): filtering = sensor data processing.

$t_{k+1}$: three plots. Ambiguities due to false plots: $1 + 3$ data interpretation hypotheses (detection probability, false alarm statistics).

Multimodal pdfs reflect ambiguities inherent in the data.

Prediction for $t_{k+2}$: temporal propagation leads to dissipation of the probability densities.

$t_{k+2}$: one plot. Association task: sensor data $\leftrightarrow$ interpretation hypotheses.

BAYES:
$$p(x_{k+2} \mid Z^{k+2}) \;=\; \frac{p(z_{k+2} \mid x_{k+2})\, p(x_{k+2} \mid Z^{k+1})}{\int dx_{k+2}\; p(z_{k+2} \mid x_{k+2})\, p(x_{k+2} \mid Z^{k+1})}.$$

In particular: re-calculation of the hypothesis weights.

How does new knowledge affect what is known about a past state?

Retrodiction for $t_{k+1}$: a retrospective analysis of the past.

$t_{k-1}, t_k, t_{k+1}, t_{k+2}$: optimal information processing at present and for the past.

Multiple Hypothesis Tracking: Basic Idea
Iterative updating of conditional probability densities! Kinematic target state $x_k$ at time $t_k$, accumulated sensor data $Z^k$. A priori knowledge: target dynamics models, sensor model, road maps.
- prediction: $p(x_{k-1} \mid Z^{k-1}) \;\to\; p(x_k \mid Z^{k-1})$ (dynamics model, road maps)
- filtering: $p(x_k \mid Z^{k-1}) \;\to\; p(x_k \mid Z^{k})$ (sensor data $Z_k$, sensor model)
- retrodiction: $p(x_{l+1} \mid Z^{k}) \;\to\; p(x_l \mid Z^{k})$ (filtering output, dynamics model)

Finite mixtures reflect the inherent ambiguity (data, model, road network). Optimal estimators: e.g. minimum mean squared error (MMSE). Initiation of the pdf iteration: multiple hypothesis track extraction.
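
The pdf iteration above can be sketched for the linear-Gaussian case, where every data interpretation hypothesis carries a weight and a Gaussian component. The models below (F, Q, H, R, detection probability p_d, false-alarm density rho_f) are illustrative assumptions, not values from the lecture; the weight update uses the common proportionality weight times (1 - p_d) for a missed detection and weight times p_d times likelihood / rho_f for each plot-to-hypothesis association.

```python
import numpy as np

# Minimal sketch of one MHT hypothesis-update step (single target,
# linear-Gaussian models); all numerical values are illustrative assumptions.
F = np.array([[1.0, 1.0], [0.0, 1.0]])   # constant-velocity dynamics
Q = 0.1 * np.eye(2)                      # process noise covariance
H = np.array([[1.0, 0.0]])               # position-only measurement
R = np.array([[0.5]])                    # measurement noise covariance
p_d = 0.9                                # detection probability
rho_f = 0.05                             # spatial false-alarm density

def predict(mean, cov):
    """Chapman-Kolmogorov step for a single Gaussian component."""
    return F @ mean, F @ cov @ F.T + Q

def update_hypotheses(hypotheses, measurements):
    """Expand every hypothesis with all data interpretations and renormalize."""
    children = []
    for w, mean, cov in hypotheses:
        m_pred, P_pred = predict(mean, cov)
        # interpretation 0: the target was not detected in this scan
        children.append((w * (1.0 - p_d), m_pred, P_pred))
        # interpretation i: measurement z_i originates from the target
        S = H @ P_pred @ H.T + R                      # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)           # Kalman gain
        for z in measurements:
            nu = z - H @ m_pred                       # innovation
            lik = np.exp(-0.5 * nu @ np.linalg.solve(S, nu)) \
                  / np.sqrt((2 * np.pi) ** len(nu) * np.linalg.det(S))
            children.append((w * p_d * lik / rho_f,
                             m_pred + K @ nu,
                             (np.eye(2) - K @ H) @ P_pred))
    total = sum(w for w, _, _ in children)
    return [(w / total, m, P) for w, m, P in children]

# one parent hypothesis, two plots in the new scan -> 1 + 2 interpretations
hyps = [(1.0, np.array([0.0, 1.0]), np.eye(2))]
hyps = update_hypotheses(hyps, [np.array([1.2]), np.array([4.0])])
for w, m, _ in hyps:
    print(f"weight {w:.3f}, estimated position {m[0]:.2f}")
```

Running the sketch with one parent hypothesis and two plots produces the "1 + 2" interpretation hypotheses discussed earlier, with weights reflecting how well each plot fits the prediction.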

Difficult Operational Conditions
- object detection: small objects (detection probability $P_D < 1$); fading: consecutive missing plots (interference); moving platforms: minimum detectable velocity
- measurements: false returns (residual clutter, birds, clouds); low data update rates (e.g. long-range radar); measurement errors, overlapping gates
- sensor resolution: characteristic parameters (band-/beam width); group measurements: resolution probability; important: qualitatively correct modeling
- object behavior / applications: high maneuvering capability; qualitatively distinct maneuvering phases; dynamic object parameters a priori unknown
Demonstration: Multiple Hypothesis Tracking

Tracking Application: Ground Picture Production
GMTI Radar (Ground Moving Target Indicator): wide-area, all-weather, day/night, real-time surveillance of a dynamically evolving ground or near-to-ground situation.
GMTI Tracking: some characteristic aspects
- backbone of a ground picture: moving target tracks
- airborne, dislocated, mobile sensor platforms
- vehicles, ships, low-flyers, radars, convoys
- occlusions: Doppler blindness, topography
- road maps, terrain information, tactical rules
- dense target / dense clutter situations: MHT

Examples of GMTI Tracks (live exercise)

Multiple Sensor Security Assistance Systems: General Task
Task: covert and automated surveillance of a person stream; identification of anomalous behavior.
Towards a solution: exploit heterogeneous multiple sensor systems.
Sensor surveillance data: kinematics (Where? When?) and attributes (What? When?); data fusion; person classification.

Security Applications: Consider Well-defined Access Regions
Important task: detect persons carrying hazardous materials in a person flow (escalators/stairways, tunnels/underground).
Fundamental problem: limited spatio-temporal resolution of chemical sensors.
Key to the solution: compensate for the poor resolution by space-time sensor data fusion.
Components: laser range scanner sensors, video data, chemical sensors (attributes), supporting information; track extraction / maintenance.
EU Project HAMLeT: Hazardous Material Localization and Person Tracking

Detection of Persons and Goods with High Threat Potential
- Exploit the full potential of specific sensors (attributes).
- Associate measured attributes/signatures with individuals.
- Tracking and classification of individuals in person streams.
- Covert operation: avoid interference with normal public life.
- Avoid fatigue in situations with a low frequency of suspicious events.
Sensors: indoor radar, video, laser, IR camera, chemical sensors.

Multiple Sensor Security Assistance Systems
The experience of human security personnel remains indispensable; technical assistance focuses their attention on critical situations. Security assistance systems: covert operation, continuous time. They combine the strengths of automated and human data exploitation: screening (real-time analysis of large data streams) and high decision confidence in individual situations.

On Characterizing Tracking / Fusion Performance
A well-understood paradigm: air surveillance with multiple radars. Many results can be transferred to other sensors (IR, E/O, sonar, acoustics).
Sensor data fusion: tracks represent the available information on the targets associated with them, with appropriate quality measures, thus providing answers to: When? Where? How many? In which direction? How fast, accelerating? What?
By sensor data fusion we wish to establish one-to-one associations between targets in the field of view and identified tracks in the tracking computer. Strictly speaking, this is only possible under ideal conditions regarding the sensor performance and the underlying target situation. The tracking/fusion performance can thus be measured by its deficiencies when compared with this ideal goal.

1. Let a target first be detected by a sensor at time $t_a$. Usually, a time delay is involved until a confirmed track has finally been established at time $t_e$ (track extraction). A measure of deficiency is thus the extraction delay $t_e - t_a$.

2. Unavoidably, false tracks will be extracted in the case of a high false-return density (e.g. clutter, jamming/deception), i.e. tracks related to unreal or unwanted targets. Corresponding deficiencies: mean number of falsely extracted tracks per unit time, mean lifetime of a false track before its deletion (see the computation sketch after this list).

3. A target should be represented by one and the same track until it leaves the field of view. Related performance measures/deficiencies: mean lifetime of tracks related to true targets, probability of an identity switch between targets, probability of a target not being represented by a track.

4. The track inaccuracy (error covariance of a state estimate) should be as small as possible. The deviations between estimated and actual target states should at least correspond to the error covariances produced (consistency; see the check sketched below). If this is not the case, we speak of a track loss. A track must really represent a target!

Challenges: low detection probability, high clutter density, low update rate, agile targets, dense target situations, formations, convoys, target-split events (formations, weapons), jamming, deception.

Basic tasks (and the disciplines they draw on): models of sensor, target, and environment (physics); data association problems (combinatorics); estimation problems (probability, statistics); process control and realization (computer science).
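
As a small illustration of the deficiency measures from items 1 and 2, the sketch below computes the extraction delay and false-track statistics from a hypothetical track log; the record format and all numbers are assumptions, not data from the lecture.

```python
from statistics import mean

# Hypothetical track-log records used only to illustrate the deficiency measures
# defined above; field names and values are assumptions, not lecture data.
tracks = [
    # (first detection t_a, confirmation t_e, deletion time, is_false_track)
    (0.0, 2.5, 60.0, False),
    (1.0, 4.0, 55.0, False),
    (3.0, 5.0, 9.0, True),
    (10.0, 11.5, 14.0, True),
]
observation_period = 60.0  # length of the observed interval in seconds

extraction_delays = [t_e - t_a for t_a, t_e, _, is_false in tracks if not is_false]
false_lifetimes = [t_del - t_e for _, t_e, t_del, is_false in tracks if is_false]

print("mean extraction delay t_e - t_a :", mean(extraction_delays), "s")
print("falsely extracted tracks per s  :", len(false_lifetimes) / observation_period)
print("mean false-track lifetime       :", mean(false_lifetimes), "s")
```

For the consistency requirement of item 4, one commonly used check (not defined in the lecture itself) is the normalized estimation error squared (NEES); the sketch below assumes that ground-truth states are available, e.g. from a simulation.

```python
import numpy as np

# NEES consistency check: (x - x_hat)^T P^{-1} (x - x_hat) should average about
# dim(x) if the reported covariance P matches the actual estimation errors.
def nees(x_true, x_est, P):
    e = x_true - x_est
    return float(e @ np.linalg.solve(P, e))

rng = np.random.default_rng(0)
dim, runs = 4, 1000
P = np.diag([4.0, 1.0, 4.0, 1.0])   # covariance reported by the tracker (assumed)
errors = rng.multivariate_normal(np.zeros(dim), P, size=runs)  # simulated estimation errors
values = [nees(e, np.zeros(dim), P) for e in errors]
print("average NEES:", np.mean(values), "(about", dim, "for a consistent track)")
```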

Summary: BAYESian (Multi-)Sensor Tracking
Basis: In the course of time, one or several sensors produce measurements of targets of interest. Each target is characterized by its current state vector, which is expected to change with time.
Objective: Learn as much as possible about the individual target states at each time by analyzing the time series constituted by the sensor data.
Problem: imperfect sensor information: inaccurate, incomplete, and possibly ambiguous. Moreover, the targets' temporal evolution is usually not well known.
Approach: Interpret measurements and target state vectors as random variables (RVs). Describe by probability density functions (pdfs) what is known about them.
Solution: Derive iteration formulae for calculating the pdfs! Develop a mechanism for initiation! In doing so, exploit all available background information! Derive state estimates from the pdfs along with appropriate quality measures!

How to deal with probability density functions?
- pdf $p(x)$: extract probability statements about the RV $x$ by integration!
- naïvely: positive and normalized functions ($p(x) \ge 0$, $\int dx\, p(x) = 1$)
- conditional pdf $p(x \mid y) = \frac{p(x,y)}{p(y)}$: impact of information on $y$ on the RV $x$?
- marginal density: $p(x) = \int dy\, p(x, y) = \int dy\, p(x \mid y)\, p(y)$, with $p(x \mid y)\, p(y) = p(y \mid x)\, p(x)$: enter $y$!
- Bayes: $p(x \mid y) = \frac{p(y \mid x)\, p(x)}{p(y)} = \frac{p(y \mid x)\, p(x)}{\int dx\, p(y \mid x)\, p(x)}$, i.e. $p(x \mid y)$ is determined by $p(y \mid x)$ and $p(x)$!
- certain knowledge on $x$: $p(x) = \delta(x - y) = \lim_{\sigma \to 0} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x - y)^2}{2\sigma^2}}$
- transformed RV $y = t[x]$: $p(y) = \int dx\, p(y, x) = \int dx\, p(y \mid x)\, p_x(x) = \int dx\, \delta(y - t[x])\, p_x(x) =: [T\, p_x](y)$ ($T: p_x \mapsto p_y$, the "transfer operator")
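
A small numerical illustration of the transfer operator for a transformed RV, using sampling; the linear map t[x] = 2x + 1 and the standard normal p_x are assumptions chosen so that the resulting p(y) is known analytically.

```python
import numpy as np

# Monte Carlo illustration of p(y) = [T p_x](y) for y = t[x];
# t(x) = 2x + 1 and x ~ N(0, 1) are assumed, so analytically y ~ N(1, 2).
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, size=200_000)      # samples from p_x
y = 2.0 * x + 1.0                           # push every sample through t

hist, edges = np.histogram(y, bins=40, range=(-5.0, 7.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
analytic = np.exp(-(centers - 1.0) ** 2 / (2 * 2.0 ** 2)) / (np.sqrt(2 * np.pi) * 2.0)

for c, h, a in list(zip(centers, hist, analytic))[::8]:
    print(f"y = {c:5.2f}: sampled {h:.4f}, analytic {a:.4f}")
```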

Characterize an object by quantitatively describable properties: the object state.
Examples:
- object position $x$ on a straight line: $x \in \mathbb{R}$
- kinematic state $x = (r^\top, \dot r^\top, \ddot r^\top)^\top \in \mathbb{R}^9$, with position $r = (x, y, z)^\top$, velocity $\dot r$, acceleration $\ddot r$
- joint state of two objects: $x = (x_1^\top, x_2^\top)^\top$
- kinematic state $x$ plus object extension $X$, e.g. an ellipsoid: a symmetric, positive definite matrix
- kinematic state $x$ plus object class, e.g. bird, sailplane, helicopter, passenger jet, ...
Learn unknown object states from imperfect measurements, and describe the imprecise knowledge mathematically precisely by functions $p(x)$!

Interpret unknown object states as random variables, $x$ [scalar] or $x$, $X$ [vector / matrix variate], characterized by corresponding probability density functions (pdfs). The concrete shape of the pdf $p(x)$ contains the full knowledge about $x$!

Information about a random variable (RV) can be extracted from the corresponding pdf by integration (for now: the one-dimensional case).
How probable is it that $x \in (a, b) \subset \mathbb{R}$ holds? Answer:
$$P\{x \in (a,b)\} = \int_a^b dx\; p(x) \quad\Rightarrow\quad p(x) \ge 0,$$
in particular:
$$P\{x \in \mathbb{R}\} = \int_{-\infty}^{\infty} dx\; p(x) = 1 \quad\text{(normalization)},$$
with the intuitive interpretation that the object is somewhere in $\mathbb{R}$. Loosely: $p(x)\,dx$ is the probability that $x$ has a value between $x$ and $x + dx$.
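
A quick numerical check of such a probability statement by integration over a grid; the Gaussian pdf and the interval (a, b) = (0, 1) are assumed example choices.

```python
import numpy as np

# Extract P{x in (a, b)} and the normalization from a pdf by numerical integration.
def p(x, mu=0.0, sigma=1.0):
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

a, b = 0.0, 1.0
n = 10_000
dx = (b - a) / n
grid = np.linspace(a + dx / 2, b - dx / 2, n)        # midpoints of (a, b)
prob = np.sum(p(grid)) * dx                          # P{x in (a, b)}

full = np.linspace(-10.0 + 0.0001, 10.0, 100_000)    # wide grid for normalization
norm = np.sum(p(full)) * (full[1] - full[0])
print(f"P(0 < x < 1) ~ {prob:.4f}, normalization ~ {norm:.4f}")
```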

How to characterize the properties of a pdf? Specifically: how to associate a single expected value with a RV?
The maximum of the pdf is sometimes, but not always, useful (see examples). Instead: calculate the centroid of the pdf!
$$E[x] = \int_{-\infty}^{\infty} dx\; x\, p(x) = \bar x \qquad\text{expectation value}$$
More generally, consider functions $g: x \mapsto g(x)$ of the RV $x$:
$$E[g(x)] = \int_{-\infty}^{\infty} dx\; g(x)\, p(x), \qquad\text{the expectation value of the observable } g.$$
Example: consider the observable $\tfrac{1}{2} m x^2$ (kinetic energy, $x$ = speed).

An important observable: the error of an estimate.
Quality: how useful is an expectation value $\bar x = E[x]$? Consider special observables as distance measures: $g(x) = |x - \bar x|$ or $g(x) = (x - \bar x)^2$; quadratic measures are computationally more convenient!
Expected error of the expectation value $\bar x$:
$$V[x] = E[(x - \bar x)^2], \qquad \sigma_x = \sqrt{V[x]} \qquad\text{variance, standard deviation}$$
Exercise 2.1: Show that $V[x] = E[x^2] - E[x]^2$ holds. The expectation value of the observable $x^2$ is also called the 2nd moment of the pdf of $x$.
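
A numerical check of E[x], V[x], and the identity from Exercise 2.1 for an assumed example distribution (a two-component Gaussian mixture), using sample averages in place of the integrals.

```python
import numpy as np

# Sample-based check of V[x] = E[x^2] - E[x]^2 for an assumed test distribution.
rng = np.random.default_rng(2)
samples = np.concatenate([rng.normal(-1.0, 0.5, 50_000),
                          rng.normal(2.0, 1.0, 50_000)])   # samples of the RV x

E_x = samples.mean()                  # E[x]
E_x2 = (samples ** 2).mean()          # E[x^2] (2nd moment)
V_x = ((samples - E_x) ** 2).mean()   # E[(x - E[x])^2]

print("E[x]            :", E_x)
print("V[x]            :", V_x)
print("E[x^2] - E[x]^2 :", E_x2 - E_x ** 2)   # agrees with V[x] (Exercise 2.1)
```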

Exercise 2.2: Calculate the expectation and variance of the uniform density of a RV $x \in \mathbb{R}$ on the interval $[a, b]$:
$$p(x) = \mathcal{U}(\underbrace{x}_{\text{RV}};\, \underbrace{a, b}_{\text{parameters}}) = \begin{cases} \frac{1}{b-a} & x \in [a,b] \\ 0 & \text{otherwise} \end{cases}$$
Is the pdf correctly normalized?
$$\int_{-\infty}^{\infty} dx\; \mathcal{U}(x; a, b) = \frac{1}{b-a} \int_a^b dx = 1$$
$$E[x] = \int_{-\infty}^{\infty} dx\; x\, \mathcal{U}(x; a, b) = \frac{a+b}{2}, \qquad V[x] = \frac{1}{b-a} \int_a^b dx\; x^2 \;-\; E[x]^2 = \frac{(b-a)^2}{12}$$
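
A quick numerical verification of the results of Exercise 2.2 for an assumed example interval [a, b] = [1, 4].

```python
import numpy as np

# Numerical check of E[x] = (a+b)/2 and V[x] = (b-a)^2/12 for the uniform density.
a, b = 1.0, 4.0
n = 1_000_000
dx = (b - a) / n
x = np.linspace(a + dx / 2, b - dx / 2, n)   # midpoints of [a, b]
u = 1.0 / (b - a)                            # uniform density value on [a, b]

E_x = np.sum(x * u) * dx
V_x = np.sum(x ** 2 * u) * dx - E_x ** 2
print("E[x]:", E_x, "expected:", (a + b) / 2)
print("V[x]:", V_x, "expected:", (b - a) ** 2 / 12)
```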

Important example: $x$ normally distributed over $\mathbb{R}$ (Gauss).
Wanted: probabilities concentrated around $\mu$. Quadratic distance: $\|x - \mu\|^2 = \tfrac{1}{2}(x-\mu)^2/\sigma^2$ (mathematically convenient!); the parameter $\sigma$ is a measure of the width of the pdf. For large distances, i.e. $\|x - \mu\|^2 \gg 1$, the pdf shall decay quickly.
Simplest approach: $\tilde p(x) = e^{-\|x-\mu\|^2}$ ($> 0\ \forall x \in \mathbb{R}$; normalization?). Normalize via $p(x) = \tilde p(x) / \int_{-\infty}^{\infty} dx\, \tilde p(x)$; a formula collection delivers $\int_{-\infty}^{\infty} dx\, \tilde p(x) = \sqrt{2\pi}\,\sigma$.
An admissible pdf with the required properties is thus given by:
$$\mathcal{N}(x; \mu, \sigma) = \frac{1}{\sqrt{2\pi}\,\sigma}\, \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$
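
A numerical check of the normalization constant and the resulting density N(x; mu, sigma); mu = 2 and sigma = 0.5 are assumed example values.

```python
import numpy as np

# Verify that int dx exp(-(x-mu)^2/(2 sigma^2)) = sqrt(2 pi) sigma and that the
# normalized density N(x; mu, sigma) integrates to 1.
mu, sigma = 2.0, 0.5
dx = 0.001
x = np.arange(mu - 10 * sigma, mu + 10 * sigma, dx)

p_tilde = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))      # unnormalized density
print("integral of p_tilde     :", np.sum(p_tilde) * dx)
print("sqrt(2*pi)*sigma        :", np.sqrt(2 * np.pi) * sigma)

p = p_tilde / (np.sqrt(2 * np.pi) * sigma)               # N(x; mu, sigma)
print("integral of N(x; mu, s) :", np.sum(p) * dx)       # ~ 1
```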

Exercise 2.3: Show for the Gauss density $p(x) = \mathcal{N}(x; \mu, \sigma)$ that $E[x] = \mu$ and $V[x] = \sigma^2$, i.e.
$$E[x] = \int_{-\infty}^{\infty} dx\; x\, \mathcal{N}(x; \mu, \sigma) = \mu, \qquad V[x] = E[x^2] - E[x]^2 = \sigma^2.$$
Use substitution and partial integration! Use $\int_{-\infty}^{\infty} dx\; e^{-\frac{1}{2}x^2} = \sqrt{2\pi}$.

How to incorporate certain knowledge into pdfs? Consider the special pdf with an infinitely sharp peak at $\mu$:
$$\delta(x; \mu) = \begin{cases} \infty & x = \mu \\ 0 & x \ne \mu \end{cases} \qquad\text{with}\qquad \int_{-\infty}^{\infty} dx\; \delta(x; \mu) = 1.$$
Intuitively interpretable as a limit:
$$\delta(x; \mu) = \lim_{\sigma \to 0} \mathcal{N}(x; \mu, \sigma) = \lim_{\sigma \to 0} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}.$$
Alternative: $\delta(x; a) = \lim_{b \to a} \mathcal{U}(x; a, b)$.
For observables $g$ this holds:
$$E[g(x)] = \int_{-\infty}^{\infty} dx\; g(x)\, \delta(x; y) = g(y), \quad\text{in particular}\quad V[x] = E[(x - E[x])^2] = E[x^2] - E[x]^2 = 0.$$
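
A small sampling illustration of this limit: for an increasingly narrow Gaussian around mu, E[g(x)] approaches g(mu) and V[x] approaches 0; g(x) = x^2 and mu = 3 are assumed example choices.

```python
import numpy as np

# As sigma -> 0 the Gaussian approaches the delta density: expectations of an
# observable g collapse onto g(mu) and the variance of x vanishes.
mu = 3.0
g = lambda x: x ** 2

for sigma in (1.0, 0.1, 0.001):
    rng = np.random.default_rng(0)
    x = rng.normal(mu, sigma, size=1_000_000)
    print(f"sigma={sigma}: E[g(x)]={g(x).mean():.4f} (g(mu)={g(mu)}), V[x]={x.var():.6f}")
```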
