Introduction to Sensor Data Fusion Methods and Applications


1 Introduction to Sensor Data Fusion: Methods and Applications
Last lecture: Why sensor data fusion? Motivation, general context, discussion of examples.
Oral examination: 6 credit points after the end of the semester; prerequisite: participation in the exercises, running programs.
Continuation in summer: lectures and seminar on advanced topics.
Job opportunities as research assistant in ongoing projects, practicum; subsequently: master theses at Fraunhofer FKIE, PhD theses possible.
Slides/script: e-mail to wolfgang.koch@fkie.fraunhofer.de, download.

2-4 Sensor & Information Fusion: Basic Task
information sources: defined by operational requirements
information to be fused: imprecise, incomplete, ambiguous, unresolved, false, deceptive, hard to formalize, contradictory...

5 single sensors / network: measurements, kinematical parameters, classification attributes
data processing / fusion: temporal integration / logical analysis, statistical estimation / data association, combination with a priori information
condensed information: objects represented by track structures, quantitative accuracy/reliability measures

6 A Generic Tracking and Sensor Data Fusion System
Sensor system(s): sensing hardware (received waveforms), detection process (data rate reduction), signal processing (parameter estimation); output: sensor data; input: sensor control
Tracking & fusion system:
- a priori knowledge: sensor performance, object characteristics, object environment
- track initiation: multiple frame track extraction
- sensor data to track association
- track maintenance: prediction, filtering, retrodiction; track file storage
- track processing: track cancellation, object classification / ID, track-to-track fusion
- man-machine interface: object representation, displaying functions, interaction facilities

7-15 The building blocks of the tracking & fusion system, step by step:
- source of information: received waveforms (space-time)
- detection: a decision process for data rate reduction
- data fusion input: estimated target parameters, false plots
- core function: associate sensor data to established tracks
- track maintenance: updating of established target tracks
- initiation: establish new tracks, re-initiate lost tracks
- exploit available context information (e.g. topography)
- fusion/management of pre-processed track information
- user interface: presentation, interaction, decision support

16 Target Tracking: Basic Idea, Demonstration
Problem-inherent uncertainties and ambiguities! BAYES: processing scheme for soft, delayed decisions.
- sensor performance: resolution conflicts, DOPPLER blindness
- environment: dense situations, clutter, jamming/deception
- target characteristics: qualitatively distinct maneuvering phases
- background knowledge: vehicles on road networks, tactics

17 pdf at t_{k-1}: Probability density functions (pdf) p(x_{k-1} | Z^{k-1}) represent imprecise knowledge on the state x_{k-1}, based on imprecise measurements Z^{k-1}.

18 pdf at t_{k-1}, prediction for t_k: Exploit imprecise knowledge on the dynamical behavior of the object.
p(x_k | Z^{k-1}) = \int dx_{k-1} \, p(x_k | x_{k-1}) \, p(x_{k-1} | Z^{k-1})
(prediction = dynamics applied to the old knowledge)
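On a discrete state grid the prediction integral becomes a sum over the old density. A minimal numerical sketch, assuming a 1-D grid, a constant-shift dynamics model, and Gaussian process noise (all numbers illustrative, not from the lecture):

```python
import math

# Discrete grid over a 1-D state, from -5.0 to 5.0 in steps of 0.1.
GRID = [i * 0.1 for i in range(-50, 51)]
DX = 0.1

def gauss(x, mu, sigma):
    """Normal density N(x; mu, sigma)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def predict(prior, shift=1.0, q=0.5):
    """Prediction step: p(x_k | Z^{k-1}) = sum_{x_{k-1}} p(x_k | x_{k-1})
    p(x_{k-1} | Z^{k-1}) dx, with an assumed Gaussian dynamics kernel that
    shifts the state by `shift` and adds process noise of std `q`."""
    return [sum(gauss(x, xp + shift, q) * w * DX for xp, w in zip(GRID, prior))
            for x in GRID]

# Old knowledge: density sharply peaked around x = 0.
p_old = [gauss(x, 0.0, 0.3) for x in GRID]
p_pred = predict(p_old)

# Moments of old and predicted density (numerical integration on the grid).
mean_pred = sum(x * w * DX for x, w in zip(GRID, p_pred))
var_old = sum(x * x * w * DX for x, w in zip(GRID, p_old))
var_pred = sum((x - mean_pred) ** 2 * w * DX for x, w in zip(GRID, p_pred))
```

The predicted density is shifted by the assumed dynamics and broader than the old one: exactly the knowledge dissipation described in the following slides.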

19 pdf at t_{k-1}; t_k: no plot. Missing sensor detection: data processing = prediction (not always: exploitation of negative sensor evidence is also possible).

20 pdf at t_{k-1}, pdf at t_k, prediction for t_{k+1}: missing sensor information leads to increasing knowledge dissipation.

21 pdf at t_{k-1}, pdf at t_k; t_{k+1}: one plot, i.e. sensor information on the kinematical object state.

22 pdf at t_{k-1}, likelihood (sensor model), pdf at t_k, prediction for t_{k+1}. BAYES formula:
p(x_{k+1} | Z^{k+1}) = p(z_{k+1} | x_{k+1}) \, p(x_{k+1} | Z^k) / \int dx_{k+1} \, p(z_{k+1} | x_{k+1}) \, p(x_{k+1} | Z^k)
(new knowledge = plot likelihood times prediction, normalized)
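On the same kind of discrete grid, the BAYES update is just "multiply the predicted density by the likelihood of the plot, then renormalize". A sketch with an assumed Gaussian sensor model (measurement noise and plot value illustrative):

```python
import math

GRID = [i * 0.1 for i in range(-50, 51)]  # 1-D state grid, -5.0 .. 5.0
DX = 0.1

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_update(prediction, z, r=0.4):
    """p(x | Z^{k+1}) proportional to p(z | x) p(x | Z^k); the denominator of
    the BAYES formula is simply the normalization over the grid.
    r is the assumed measurement noise std (illustrative)."""
    unnorm = [gauss(z, x, r) * w for x, w in zip(GRID, prediction)]
    c = sum(unnorm) * DX
    return [u / c for u in unnorm]

# Broad prediction; a single plot at z = 0.8 concentrates the density.
p_pred = [gauss(x, 0.0, 1.0) for x in GRID]
p_filt = bayes_update(p_pred, z=0.8)

mean_filt = sum(x * w * DX for x, w in zip(GRID, p_filt))
var_filt = sum((x - mean_filt) ** 2 * w * DX for x, w in zip(GRID, p_filt))
```

For this Gaussian case the posterior mean lands between prediction and plot, and the variance shrinks: filtering increases knowledge.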

23 pdf at t_{k-1}, pdf at t_k, pdf at t_{k+1} (Bayes): filtering = sensor data processing.

24 pdf at t_{k-1}, pdf at t_k; t_{k+1}: three plots. Ambiguities by false plots: data interpretation hypotheses (detection probability, false alarm statistics).

25 pdf at t_{k-1}, pdf at t_k, pdf at t_{k+1}: multimodal pdfs reflect ambiguities inherent in the data.

26 pdf at t_{k-1}, pdf at t_k, pdf at t_{k+1}, prediction for t_{k+2}. Temporal propagation: dissipation of the probability densities.

27 pdf at t_{k-1}, pdf at t_k, pdf at t_{k+1}; t_{k+2}: one plot. Association tasks: sensor data <-> interpretation hypotheses.

28 pdf at t_{k-1}, likelihood, pdf at t_k, pdf at t_{k+1}, prediction for t_{k+2}. BAYES:
p(x_{k+2} | Z^{k+2}) = p(z_{k+2} | x_{k+2}) \, p(x_{k+2} | Z^{k+1}) / \int dx_{k+2} \, p(z_{k+2} | x_{k+2}) \, p(x_{k+2} | Z^{k+1})

29 pdf at t_{k-1}, pdf at t_k, pdf at t_{k+1}, pdf at t_{k+2}: in particular, re-calculation of the hypothesis weights.

30 pdf at t_{k-1}, pdf at t_k, pdf at t_{k+1}, pdf at t_{k+2}: How does new knowledge affect the knowledge on a past state?

31 pdf at t_{k-1}, retrodiction for t_{k+1}, pdf at t_k, pdf at t_{k+2}. Retrodiction: a retrospective analysis of the past.

32 t_{k-1}, t_k, t_{k+1}, t_{k+2}: optimal information processing at present and for the past.

33 Multiple Hypothesis Tracking: Basic Idea
Iterative updating of conditional probability densities!
kinematic target state x_k at time t_k, accumulated sensor data Z^k
a priori knowledge: target dynamics models, sensor model, road maps
prediction: p(x_{k-1} | Z^{k-1}) -> p(x_k | Z^{k-1})   (dynamics model, road maps)
filtering: p(x_k | Z^{k-1}) -> p(x_k | Z^k)   (sensor data Z_k, sensor model)
retrodiction: p(x_{l+1} | Z^k) -> p(x_l | Z^k)   (filtering output, dynamics model)
finite mixtures: the densities reflect inherent ambiguity (data, model, road network)
optimal estimators: e.g. minimum mean squared error (MMSE)
initiation of the pdf iteration: multiple hypothesis track extraction
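For a single interpretation hypothesis with linear Gaussian dynamics and sensor models, the prediction/filtering iteration above reduces to the classical Kalman filter. A scalar sketch; the model parameters f, q, h, r and the measurement series are illustrative assumptions, not values from the lecture:

```python
def kf_predict(x, p, f=1.0, q=0.2):
    """Prediction: propagate mean and variance through the dynamics model
    x_k = f * x_{k-1} + noise of variance q (illustrative scalar model)."""
    return f * x, f * p * f + q

def kf_update(x, p, z, h=1.0, r=0.5):
    """Filtering: fuse a measurement z = h * x + noise of variance r
    with the predicted state (mean x, variance p)."""
    s = h * p * h + r          # innovation variance
    k = p * h / s              # Kalman gain
    return x + k * (z - h * x), (1.0 - k * h) * p

# Iterate prediction and filtering over a short plot series.
x, p = 0.0, 1.0                # initial knowledge: mean 0, variance 1
for z in [0.9, 1.1, 1.0, 0.95]:
    x, p = kf_predict(x, p)
    x, p = kf_update(x, p, z)
```

The estimate converges toward the measurements near 1.0 while the variance p settles at a steady-state value; the full MHT scheme runs many such filters, one per data interpretation hypothesis, and re-weights them.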

34-37 Difficult Operational Conditions
object detection: small objects (detection probability P_D < 1), fading (consecutive missing plots, interference), moving platforms (minimum detectable velocity)
measurements: false returns (residual clutter, birds, clouds), low data update rates (e.g. long-range radar), measurement errors, overlapping gates
sensor resolution: characteristic parameters (band-/beam width), group measurements (resolution probability); important: qualitatively correct modeling
object behavior: high maneuvering capability, qualitatively distinct maneuvering phases, dynamic object parameters a priori unknown

38 Demonstration: Multiple Hypothesis Tracking


41 Tracking Application: Ground Picture Production
GMTI Radar: Ground Moving Target Indicator. Wide-area, all-weather, day/night, real-time surveillance of a dynamically evolving ground or near-to-ground situation.
GMTI Tracking: Some Characteristic Aspects
- backbone of a ground picture: moving target tracks
- airborne, dislocated, mobile sensor platforms
- vehicles, ships, low-flyers, radars, convoys
- occlusions: Doppler blindness, topography
- road maps, terrain information, tactical rules
- dense target / dense clutter situations: MHT

42 Examples of GMTI Tracks (live exercise)

43-44 Multiple Sensor Security Assistance Systems
General task: covert & automated surveillance of a person stream, identification of anomalous behavior.
Towards a solution: exploit heterogeneous multiple sensor systems.
Sensor surveillance data: kinematics (Where? When?) and attributes (What? When?) -> data fusion -> person classification.
For OFFICIAL USE ONLY

45-46 Security Applications: Consider Well-defined Access Regions
Important task: detect persons carrying hazardous materials in a person flow (escalators / stairways, tunnels / underground).
Fundamental problem: limited spatio-temporal resolution of chemical sensors.
Key to the solution: compensate poor resolution by space-time sensor data fusion.
Track extraction / maintenance: laser range scanner sensors, video data; supporting information, attributes: chemical sensors.
EU Project HAMLeT: Hazardous Material Localization and Person Tracking

47 Detection of Persons and Goods with High Threat Potential
- exploit the full potential of specific sensors (attributes)
- associate measured attributes/signatures to individuals
- tracking and classification of individuals in person streams
- covert operation: avoid interference with normal public life
- avoid fatigue in situations with a low frequency of suspicious events
Sensors: indoor radar, video, laser, IR camera, chemical sensors

48 Multiple Sensor Security Assistance Systems
The experience of human security personnel remains indispensable; technical assistance works by focusing their attention on critical situations. Security assistance systems: covert operation, continuous time. Combination of the strengths of automated and human data exploitation: screening (real-time analysis of large data streams) and high decision confidence in individual situations.

49-50 On Characterizing Tracking / Fusion Performance
A well-understood paradigm: air surveillance with multiple radars. Many results can be transferred to other sensors (IR, E/O, sonar, acoustics).
Sensor data fusion: tracks represent the available information on the targets associated with them, with appropriate quality measures, thus providing answers to: When? Where? How many? In which direction? How fast, accelerating? What?
By sensor data fusion we wish to establish one-to-one associations between targets in the field of view and identified tracks in the tracking computer. Strictly speaking, this is only possible under ideal conditions regarding the sensor performance and the underlying target situation. The tracking/fusion performance can thus be measured by its deficiencies when compared with this ideal goal.

51-53
1. Let a target first be detected by a sensor at time t_a. Usually a time delay is involved until a confirmed track has finally been established at time t_e (track extraction). A measure of deficiency is thus the extraction delay t_e - t_a.
2. Unavoidably, false tracks will be extracted in the case of a high false return density (e.g. clutter, jamming/deception), i.e. tracks related to unreal or unwanted targets. Corresponding deficiencies: the mean number of falsely extracted targets per unit time, and the mean lifetime of a false track before its deletion.
3. A target should be represented by one and the same track until it leaves the field of view. Related performance measures/deficiencies: the mean lifetime of tracks related to true targets, the probability of an identity switch between targets, and the probability of a target not being represented by a track.

54-55
4. The track inaccuracy (error covariance of a state estimate) should be as small as possible. The deviations between estimated and actual target states should at least correspond to the error covariances produced (consistency). If this is not the case, we speak of a track loss. A track must really represent a target!
Challenges: low detection probability, high clutter density, low update rate, agile targets, dense target situations, formations/convoys, target-split events (formation, weapons), jamming, deception.
Basic tasks: models (sensor, target, environment) -> physics; data association problems -> combinatorics; estimation problems -> probability, statistics; process control, realization -> computer science.
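The consistency requirement can be made operational with the normalized estimation error squared (NEES), a standard statistic in the tracking literature: for a scalar state, (x_est - x_true)^2 / P should average to about 1 over a consistent track, and much larger values indicate a track loss. A Monte-Carlo sketch (the variance value and sample count are illustrative):

```python
import random

random.seed(0)  # fixed seed so the experiment is reproducible

def nees(est, truth, var):
    """Normalized estimation error squared for a scalar state."""
    e = est - truth
    return e * e / var

# Simulate a track whose errors really match its reported variance P:
# estimates scatter around the truth with std sqrt(P).
P = 0.25
truth = 1.0
samples = [nees(truth + random.gauss(0.0, P ** 0.5), truth, P)
           for _ in range(5000)]
avg_nees = sum(samples) / len(samples)
```

If the filter were optimistic (reported P smaller than the true error spread), avg_nees would come out well above 1, which is exactly the deficiency this performance measure exposes.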

56-60 Summary: BAYESian (Multi-)Sensor Tracking
Basis: In the course of time, one or several sensors produce measurements of targets of interest. Each target is characterized by its current state vector, which is expected to change with time.
Objective: Learn as much as possible about the individual target states at each time by analyzing the time series constituted by the sensor data.
Problem: imperfect sensor information: inaccurate, incomplete, and possibly ambiguous. Moreover, the targets' temporal evolution is usually not well known.
Approach: Interpret measurements and target state vectors as random variables (RVs). Describe by probability density functions (pdfs) what is known about them.
Solution: Derive iteration formulae for calculating the pdfs! Develop a mechanism for initiation! In doing so, exploit all available background information! Derive state estimates from the pdfs along with appropriate quality measures!

61-68 How to deal with probability density functions?
pdf p(x): extract probability statements about the RV x by integration!
naively: positive and normalized functions (p(x) >= 0, \int dx \, p(x) = 1)
conditional pdf p(x|y) = p(x,y) / p(y): the impact of information on y on the RV x
marginal density p(x) = \int dy \, p(x,y) = \int dy \, p(x|y) p(y), with p(x,y) = p(y|x) p(x): enter y!
Bayes: p(x|y) = p(y|x) p(x) / p(y) = p(y|x) p(x) / \int dx \, p(y|x) p(x); i.e. p(x|y) is determined by p(y|x) and p(x)!
certain knowledge on x: p(x) = \delta(x - y) = \lim_{\epsilon \to 0} N(x; y, \epsilon)
transformed RV y = t[x]: p(y) = \int dx \, p(y,x) = \int dx \, p(y|x) p_x(x) = \int dx \, \delta(y - t[x]) p_x(x) =: [T p_x](y)   (T: p_x -> p_y, the transfer operator)
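The marginalization and Bayes relations above can be checked on a discrete toy example, where the integrals become sums (the joint probabilities are illustrative values):

```python
# Discrete toy joint distribution p(x, y) over x in {'a','b'}, y in {0,1}.
joint = {('a', 0): 0.1, ('a', 1): 0.3, ('b', 0): 0.2, ('b', 1): 0.4}

def marginal_x(x):
    """p(x) = sum_y p(x, y)."""
    return sum(p for (xi, _), p in joint.items() if xi == x)

def marginal_y(y):
    """p(y) = sum_x p(x, y)."""
    return sum(p for (_, yi), p in joint.items() if yi == y)

def cond_x_given_y(x, y):
    """Directly from the definition: p(x | y) = p(x, y) / p(y)."""
    return joint[(x, y)] / marginal_y(y)

def bayes(x, y):
    """Via Bayes: p(x | y) = p(y | x) p(x) / p(y)."""
    p_y_given_x = joint[(x, y)] / marginal_x(x)
    return p_y_given_x * marginal_x(x) / marginal_y(y)
```

Both routes give the same conditional density, and the marginals sum to 1, as the slide's formulas require.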

69 Characterize an object by quantitatively describable properties: the object state.
Examples:
- object position x on a straight line: x \in R
- kinematic state x = (r^T, \dot r^T, \ddot r^T)^T \in R^9, with position r = (x, y, z)^T, velocity \dot r, acceleration \ddot r
- joint state of two objects: x = (x_1^T, x_2^T)^T
- kinematic state x plus object extension X, e.g. an ellipsoid: a symmetric, positive definite matrix
- kinematic state x plus object class, e.g. bird, sailplane, helicopter, passenger jet, ...
Learn unknown object states from imperfect measurements, and describe imprecise knowledge mathematically precisely by functions p(x)!

70 Interpret unknown object states as random variables (x [1D] or x, X [vector / matrix variate]), characterized by corresponding probability density functions (pdfs). The concrete shape of the pdf p(x) contains the full knowledge on x!

71 Information on a random variable (RV) can be extracted from the corresponding pdf by integration. At present: the one-dimensional case.
How probable is it that x \in (a, b) \subset R holds? Answer: P{x \in (a, b)} = \int_a^b dx \, p(x), which requires p(x) >= 0.
In particular: P{x \in R} = \int_{-\infty}^{\infty} dx \, p(x) = 1 (normalization); intuitive interpretation: the object is somewhere in R.
Loosely: p(x) dx is the probability of x having a value between x and x + dx.
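The defining property P{x in (a, b)} = integral of the pdf can be checked numerically for the standard normal density; the midpoint-rule quadrature and the step count below are implementation choices:

```python
import math

def gauss(x, mu=0.0, sigma=1.0):
    """Standard normal density for the default parameters."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def prob_interval(a, b, n=10000):
    """P{x in (a, b)} by midpoint-rule integration of the pdf."""
    h = (b - a) / n
    return sum(gauss(a + (i + 0.5) * h) for i in range(n)) * h

# For the standard normal, P{x in (-1, 1)} is about 0.6827 (the 1-sigma rule).
p_one_sigma = prob_interval(-1.0, 1.0)
```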

72-73 How to characterize the properties of a pdf? Specifically: how to associate a single expected value with a RV?
The maximum of the pdf is sometimes, but not always, useful! (-> examples)
Instead: calculate the centroid of the pdf!
E[x] = \int_{-\infty}^{\infty} dx \, x \, p(x) = \bar x   (expectation value)
More generally, consider functions g: x -> g(x) of the RV x:
E[g(x)] = \int_{-\infty}^{\infty} dx \, g(x) \, p(x)   (expectation value of the observable g)
Example: consider the observable (1/2) m x^2 (kinetic energy, with x = speed).

74 An important observable: the error of an estimate.
Quality: how useful is an expectation value \bar x = E[x]?
Consider special observables as distance measures: g(x) = |x - \bar x| or g(x) = (x - \bar x)^2; quadratic measures are computationally more convenient!
Expected error of the expectation value \bar x:
V[x] = E[(x - \bar x)^2],   \sigma_x = \sqrt{V[x]}   (variance, standard deviation)
Exercise 2.1: Show that V[x] = E[x^2] - E[x]^2 holds. The expectation value of the observable x^2 is also called the 2nd moment of the pdf of x.
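Exercise 2.1 can be verified numerically for an example density; here a Gauss density with illustrative parameters mu = 2.0, sigma = 0.5, and E[g(x)] evaluated by midpoint-rule quadrature:

```python
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def moment(g, mu=2.0, sigma=0.5, lo=-10.0, hi=10.0, n=20000):
    """E[g(x)] = integral of g(x) p(x) dx by the midpoint rule
    (integration range and step count are implementation choices)."""
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) * gauss(lo + (i + 0.5) * h, mu, sigma)
               for i in range(n)) * h

mean = moment(lambda x: x)                           # E[x]
var_direct = moment(lambda x: (x - mean) ** 2)       # E[(x - E[x])^2]
var_identity = moment(lambda x: x * x) - mean ** 2   # E[x^2] - E[x]^2
```

Both routes agree, and for this density they reproduce sigma^2 = 0.25.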

75 Exercise 2.2: Calculate the expectation and variance of the uniform density of a RV x \in R on the interval [a, b]:
p(x) = U(x; a, b) = 1/(b - a) for x \in [a, b], 0 otherwise   (RV x; parameters a, b)
Is the pdf correctly normalized? \int_{-\infty}^{\infty} dx \, U(x; a, b) = (1/(b - a)) \int_a^b dx = 1.
E[x] = \int_{-\infty}^{\infty} dx \, x \, U(x; a, b) = (b + a)/2
V[x] = (1/(b - a)) \int_a^b dx \, x^2 - E[x]^2 = (b - a)^2 / 12
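The closed-form results of Exercise 2.2 can be checked numerically; the interval endpoints a = 1, b = 4 are illustrative:

```python
def uniform_moments(a, b, n=10000):
    """Mean and variance of U(x; a, b) by midpoint-rule summation."""
    h = (b - a) / n
    xs = [a + (i + 0.5) * h for i in range(n)]
    pdf = 1.0 / (b - a)  # constant density on [a, b]
    mean = sum(x * pdf for x in xs) * h
    var = sum((x - mean) ** 2 * pdf for x in xs) * h
    return mean, var

m, v = uniform_moments(1.0, 4.0)
# Expected from the formulas: mean (a + b)/2 = 2.5, variance (b - a)^2/12 = 0.75.
```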

76 Important example: x normally distributed over R (Gauss).
Wanted: probabilities concentrated around \mu. Use the quadratic distance ||x - \mu||^2 := (x - \mu)^2 / (2\sigma^2) (mathematically convenient!); the parameter \sigma is a measure of the width of the pdf. For large distances, i.e. ||x - \mu||^2 >> 1, the pdf shall decay quickly.
Simplest ansatz: \tilde p(x) = e^{-||x - \mu||^2} (> 0 for all x \in R; normalization?)
Normalize via p(x) = \tilde p(x) / \int_{-\infty}^{\infty} dx \, \tilde p(x); a formula collection delivers \int_{-\infty}^{\infty} dx \, \tilde p(x) = \sqrt{2\pi} \sigma.
An admissible pdf with the required properties is obviously given by:
N(x; \mu, \sigma) = (1 / (\sqrt{2\pi} \sigma)) \exp( -(x - \mu)^2 / (2\sigma^2) )
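The quoted normalization integral can be reproduced numerically for sigma = 1: the unnormalized ansatz integrates to sqrt(2 pi) = 2.5066... (integration range and step count are implementation choices):

```python
import math

def unnorm(x, mu=0.0):
    """Unnormalized Gaussian ansatz exp(-(x - mu)^2 / (2 sigma^2)), sigma = 1."""
    return math.exp(-0.5 * (x - mu) ** 2)

def integrate(f, lo=-20.0, hi=20.0, n=40000):
    """Midpoint-rule quadrature over a range wide enough for the tails."""
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

norm_const = integrate(unnorm)  # should equal sqrt(2 pi) * sigma with sigma = 1
```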

77 Exercise 2.3: Show for the Gauss density p(x) = N(x; \mu, \sigma):
E[x] = \int_{-\infty}^{\infty} dx \, x \, N(x; \mu, \sigma) = \mu
V[x] = E[x^2] - E[x]^2 = \sigma^2
Use substitution and partial integration! Use \int_{-\infty}^{\infty} dx \, e^{-x^2/2} = \sqrt{2\pi}!

78 How to incorporate certain knowledge into pdfs?
Consider the special "pdf" \delta(x; \mu) with an infinitely sharp peak at \mu: \delta(x; \mu) = \infty for x = \mu and 0 for x \ne \mu, while \int_{-\infty}^{\infty} dx \, \delta(x; \mu) = 1.
Intuitively interpretable as a limit: \delta(x; \mu) = \lim_{\sigma \to 0} N(x; \mu, \sigma) = \lim_{\sigma \to 0} (1/(\sqrt{2\pi}\sigma)) e^{-(x - \mu)^2 / (2\sigma^2)}.
Alternative: \delta(x; a) = \lim_{b \to a} U(x; a, b).
For observables g this yields: E[g(x)] = \int_{-\infty}^{\infty} dx \, g(x) \delta(x; y) = g(y); in particular V[x] = E[(x - E[x])^2] = E[x^2] - E[x]^2 = 0.
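The limit interpretation can be observed numerically: as sigma shrinks, E[g(x)] under N(x; mu, sigma) approaches g(mu). A sketch with g = cos and illustrative mu and sigma values:

```python
import math

def expect(g, mu, sigma, n=20000):
    """E[g(x)] for x ~ N(mu, sigma), by midpoint quadrature over +-10 sigma."""
    lo, hi = mu - 10.0 * sigma, mu + 10.0 * sigma
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h
        p = math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
        total += g(x) * p * h
    return total

g = math.cos
# Narrower and narrower Gaussians around mu = 1.0:
vals = [expect(g, mu=1.0, sigma=s) for s in (0.5, 0.1, 0.01)]
# As sigma -> 0 the expectation approaches g(1.0) = cos(1.0).
```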



More information

Fundamentals of Data Assimila1on

Fundamentals of Data Assimila1on 014 GSI Community Tutorial NCAR Foothills Campus, Boulder, CO July 14-16, 014 Fundamentals of Data Assimila1on Milija Zupanski Cooperative Institute for Research in the Atmosphere Colorado State University

More information

2D Image Processing. Bayes filter implementation: Kalman filter

2D Image Processing. Bayes filter implementation: Kalman filter 2D Image Processing Bayes filter implementation: Kalman filter Prof. Didier Stricker Dr. Gabriele Bleser Kaiserlautern University http://ags.cs.uni-kl.de/ DFKI Deutsches Forschungszentrum für Künstliche

More information

Statistical Learning Theory

Statistical Learning Theory Statistical Learning Theory Part I : Mathematical Learning Theory (1-8) By Sumio Watanabe, Evaluation : Report Part II : Information Statistical Mechanics (9-15) By Yoshiyuki Kabashima, Evaluation : Report

More information

Introduction to Systems Analysis and Decision Making Prepared by: Jakub Tomczak

Introduction to Systems Analysis and Decision Making Prepared by: Jakub Tomczak Introduction to Systems Analysis and Decision Making Prepared by: Jakub Tomczak 1 Introduction. Random variables During the course we are interested in reasoning about considered phenomenon. In other words,

More information

Lecture Outline. Target Tracking: Lecture 3 Maneuvering Target Tracking Issues. Maneuver Illustration. Maneuver Illustration. Maneuver Detection

Lecture Outline. Target Tracking: Lecture 3 Maneuvering Target Tracking Issues. Maneuver Illustration. Maneuver Illustration. Maneuver Detection REGLERTEKNIK Lecture Outline AUTOMATIC CONTROL Target Tracking: Lecture 3 Maneuvering Target Tracking Issues Maneuver Detection Emre Özkan emre@isy.liu.se Division of Automatic Control Department of Electrical

More information

Introduction to Machine Learning

Introduction to Machine Learning Introduction to Machine Learning Brown University CSCI 1950-F, Spring 2012 Prof. Erik Sudderth Lecture 25: Markov Chain Monte Carlo (MCMC) Course Review and Advanced Topics Many figures courtesy Kevin

More information

SLAM Techniques and Algorithms. Jack Collier. Canada. Recherche et développement pour la défense Canada. Defence Research and Development Canada

SLAM Techniques and Algorithms. Jack Collier. Canada. Recherche et développement pour la défense Canada. Defence Research and Development Canada SLAM Techniques and Algorithms Jack Collier Defence Research and Development Canada Recherche et développement pour la défense Canada Canada Goals What will we learn Gain an appreciation for what SLAM

More information

From Bayes to Extended Kalman Filter

From Bayes to Extended Kalman Filter From Bayes to Extended Kalman Filter Michal Reinštein Czech Technical University in Prague Faculty of Electrical Engineering, Department of Cybernetics Center for Machine Perception http://cmp.felk.cvut.cz/

More information

Artificial Intelligence

Artificial Intelligence Artificial Intelligence Roman Barták Department of Theoretical Computer Science and Mathematical Logic Summary of last lecture We know how to do probabilistic reasoning over time transition model P(X t

More information

Introduction to Mobile Robotics Bayes Filter Particle Filter and Monte Carlo Localization

Introduction to Mobile Robotics Bayes Filter Particle Filter and Monte Carlo Localization Introduction to Mobile Robotics Bayes Filter Particle Filter and Monte Carlo Localization Wolfram Burgard, Cyrill Stachniss, Maren Bennewitz, Kai Arras 1 Motivation Recall: Discrete filter Discretize the

More information

Detection theory. H 0 : x[n] = w[n]

Detection theory. H 0 : x[n] = w[n] Detection Theory Detection theory A the last topic of the course, we will briefly consider detection theory. The methods are based on estimation theory and attempt to answer questions such as Is a signal

More information

Click Prediction and Preference Ranking of RSS Feeds

Click Prediction and Preference Ranking of RSS Feeds Click Prediction and Preference Ranking of RSS Feeds 1 Introduction December 11, 2009 Steven Wu RSS (Really Simple Syndication) is a family of data formats used to publish frequently updated works. RSS

More information

GMTI Tracking in the Presence of Doppler and Range Ambiguities

GMTI Tracking in the Presence of Doppler and Range Ambiguities 14th International Conference on Information Fusion Chicago, Illinois, USA, July 5-8, 2011 GMTI Tracing in the Presence of Doppler and Range Ambiguities Michael Mertens Dept. Sensor Data and Information

More information

Tracking of convoys by airborne STAP radar

Tracking of convoys by airborne STAP radar Tracking of convoys by airborne STAP radar Richard Klemm Department ARB FGAN-FHR Wachtberg, Germany Email: eur.klemm@t-online.de Michael Mertens Department SDF FGAN-FKIE Wachtberg, Germany Email: mertens@fgan.de

More information

Bayesian Networks: Construction, Inference, Learning and Causal Interpretation. Volker Tresp Summer 2016

Bayesian Networks: Construction, Inference, Learning and Causal Interpretation. Volker Tresp Summer 2016 Bayesian Networks: Construction, Inference, Learning and Causal Interpretation Volker Tresp Summer 2016 1 Introduction So far we were mostly concerned with supervised learning: we predicted one or several

More information

Introduction to Bayesian Statistics

Introduction to Bayesian Statistics School of Computing & Communication, UTS January, 207 Random variables Pre-university: A number is just a fixed value. When we talk about probabilities: When X is a continuous random variable, it has a

More information

Statistical Multisource-Multitarget Information Fusion

Statistical Multisource-Multitarget Information Fusion Statistical Multisource-Multitarget Information Fusion Ronald P. S. Mahler ARTECH H O U S E BOSTON LONDON artechhouse.com Contents Preface Acknowledgments xxm xxv Chapter 1 Introduction to the Book 1 1.1

More information

A Gaussian Mixture Motion Model and Contact Fusion Applied to the Metron Data Set

A Gaussian Mixture Motion Model and Contact Fusion Applied to the Metron Data Set 1th International Conference on Information Fusion Chicago, Illinois, USA, July 5-8, 211 A Gaussian Mixture Motion Model and Contact Fusion Applied to the Metron Data Set Kathrin Wilkens 1,2 1 Institute

More information

2D Image Processing (Extended) Kalman and particle filter

2D Image Processing (Extended) Kalman and particle filter 2D Image Processing (Extended) Kalman and particle filter Prof. Didier Stricker Dr. Gabriele Bleser Kaiserlautern University http://ags.cs.uni-kl.de/ DFKI Deutsches Forschungszentrum für Künstliche Intelligenz

More information

Problem Set 2. MAS 622J/1.126J: Pattern Recognition and Analysis. Due: 5:00 p.m. on September 30

Problem Set 2. MAS 622J/1.126J: Pattern Recognition and Analysis. Due: 5:00 p.m. on September 30 Problem Set 2 MAS 622J/1.126J: Pattern Recognition and Analysis Due: 5:00 p.m. on September 30 [Note: All instructions to plot data or write a program should be carried out using Matlab. In order to maintain

More information

CONDENSATION Conditional Density Propagation for Visual Tracking

CONDENSATION Conditional Density Propagation for Visual Tracking CONDENSATION Conditional Density Propagation for Visual Tracking Michael Isard and Andrew Blake Presented by Neil Alldrin Department of Computer Science & Engineering University of California, San Diego

More information

Probability Map Building of Uncertain Dynamic Environments with Indistinguishable Obstacles

Probability Map Building of Uncertain Dynamic Environments with Indistinguishable Obstacles Probability Map Building of Uncertain Dynamic Environments with Indistinguishable Obstacles Myungsoo Jun and Raffaello D Andrea Sibley School of Mechanical and Aerospace Engineering Cornell University

More information

Robots Autónomos. Depto. CCIA. 2. Bayesian Estimation and sensor models. Domingo Gallardo

Robots Autónomos. Depto. CCIA. 2. Bayesian Estimation and sensor models.  Domingo Gallardo Robots Autónomos 2. Bayesian Estimation and sensor models Domingo Gallardo Depto. CCIA http://www.rvg.ua.es/master/robots References Recursive State Estimation: Thrun, chapter 2 Sensor models and robot

More information

Expectation Propagation Algorithm

Expectation Propagation Algorithm Expectation Propagation Algorithm 1 Shuang Wang School of Electrical and Computer Engineering University of Oklahoma, Tulsa, OK, 74135 Email: {shuangwang}@ou.edu This note contains three parts. First,

More information

Multivariate statistical methods and data mining in particle physics

Multivariate statistical methods and data mining in particle physics Multivariate statistical methods and data mining in particle physics RHUL Physics www.pp.rhul.ac.uk/~cowan Academic Training Lectures CERN 16 19 June, 2008 1 Outline Statement of the problem Some general

More information

The Bayes classifier

The Bayes classifier The Bayes classifier Consider where is a random vector in is a random variable (depending on ) Let be a classifier with probability of error/risk given by The Bayes classifier (denoted ) is the optimal

More information

Performance Comparison of K-Means and Expectation Maximization with Gaussian Mixture Models for Clustering EE6540 Final Project

Performance Comparison of K-Means and Expectation Maximization with Gaussian Mixture Models for Clustering EE6540 Final Project Performance Comparison of K-Means and Expectation Maximization with Gaussian Mixture Models for Clustering EE6540 Final Project Devin Cornell & Sushruth Sastry May 2015 1 Abstract In this article, we explore

More information

Context Driven Tracking using Particle Filters

Context Driven Tracking using Particle Filters 18th International Conference on Information Fusion Washington, DC - July 6-9, 2015 Context Driven Tracking using Particle Filters Rik Claessens, Gregor Pavlin and Patrick de Oude University of Liverpool,

More information

Summary of Past Lectures. Target Tracking: Lecture 4 Multiple Target Tracking: Part I. Lecture Outline. What is a hypothesis?

Summary of Past Lectures. Target Tracking: Lecture 4 Multiple Target Tracking: Part I. Lecture Outline. What is a hypothesis? REGLERTEKNIK Summary of Past Lectures AUTOMATIC COROL Target Tracing: Lecture Multiple Target Tracing: Part I Emre Özan emre@isy.liu.se Division of Automatic Control Department of Electrical Engineering

More information

Human-Oriented Robotics. Probability Refresher. Kai Arras Social Robotics Lab, University of Freiburg Winter term 2014/2015

Human-Oriented Robotics. Probability Refresher. Kai Arras Social Robotics Lab, University of Freiburg Winter term 2014/2015 Probability Refresher Kai Arras, University of Freiburg Winter term 2014/2015 Probability Refresher Introduction to Probability Random variables Joint distribution Marginalization Conditional probability

More information

Markov localization uses an explicit, discrete representation for the probability of all position in the state space.

Markov localization uses an explicit, discrete representation for the probability of all position in the state space. Markov Kalman Filter Localization Markov localization localization starting from any unknown position recovers from ambiguous situation. However, to update the probability of all positions within the whole

More information

Bayesian Networks: Construction, Inference, Learning and Causal Interpretation. Volker Tresp Summer 2014

Bayesian Networks: Construction, Inference, Learning and Causal Interpretation. Volker Tresp Summer 2014 Bayesian Networks: Construction, Inference, Learning and Causal Interpretation Volker Tresp Summer 2014 1 Introduction So far we were mostly concerned with supervised learning: we predicted one or several

More information

CSE 598C Vision-based Tracking Seminar. Times: MW 10:10-11:00AM Willard 370 Instructor: Robert Collins Office Hours: Tues 2-4PM, Wed 9-9:50AM

CSE 598C Vision-based Tracking Seminar. Times: MW 10:10-11:00AM Willard 370 Instructor: Robert Collins Office Hours: Tues 2-4PM, Wed 9-9:50AM CSE 598C Vision-based Tracking Seminar Times: MW 10:10-11:00AM Willard 370 Instructor: Robert Collins Office Hours: Tues 2-4PM, Wed 9-9:50AM What is Tracking? typical idea: tracking a single target in

More information

SLAM for Ship Hull Inspection using Exactly Sparse Extended Information Filters

SLAM for Ship Hull Inspection using Exactly Sparse Extended Information Filters SLAM for Ship Hull Inspection using Exactly Sparse Extended Information Filters Matthew Walter 1,2, Franz Hover 1, & John Leonard 1,2 Massachusetts Institute of Technology 1 Department of Mechanical Engineering

More information

Mobile Robot Localization

Mobile Robot Localization Mobile Robot Localization 1 The Problem of Robot Localization Given a map of the environment, how can a robot determine its pose (planar coordinates + orientation)? Two sources of uncertainty: - observations

More information

Linear Dynamical Systems

Linear Dynamical Systems Linear Dynamical Systems Sargur N. srihari@cedar.buffalo.edu Machine Learning Course: http://www.cedar.buffalo.edu/~srihari/cse574/index.html Two Models Described by Same Graph Latent variables Observations

More information

Introduction to Statistical Inference

Introduction to Statistical Inference Structural Health Monitoring Using Statistical Pattern Recognition Introduction to Statistical Inference Presented by Charles R. Farrar, Ph.D., P.E. Outline Introduce statistical decision making for Structural

More information

Approximate Inference Part 1 of 2

Approximate Inference Part 1 of 2 Approximate Inference Part 1 of 2 Tom Minka Microsoft Research, Cambridge, UK Machine Learning Summer School 2009 http://mlg.eng.cam.ac.uk/mlss09/ 1 Bayesian paradigm Consistent use of probability theory

More information

Intro to Probability. Andrei Barbu

Intro to Probability. Andrei Barbu Intro to Probability Andrei Barbu Some problems Some problems A means to capture uncertainty Some problems A means to capture uncertainty You have data from two sources, are they different? Some problems

More information

Probability and Information Theory. Sargur N. Srihari

Probability and Information Theory. Sargur N. Srihari Probability and Information Theory Sargur N. srihari@cedar.buffalo.edu 1 Topics in Probability and Information Theory Overview 1. Why Probability? 2. Random Variables 3. Probability Distributions 4. Marginal

More information

Probabilistic Graphical Models Lecture 17: Markov chain Monte Carlo

Probabilistic Graphical Models Lecture 17: Markov chain Monte Carlo Probabilistic Graphical Models Lecture 17: Markov chain Monte Carlo Andrew Gordon Wilson www.cs.cmu.edu/~andrewgw Carnegie Mellon University March 18, 2015 1 / 45 Resources and Attribution Image credits,

More information

COM336: Neural Computing

COM336: Neural Computing COM336: Neural Computing http://www.dcs.shef.ac.uk/ sjr/com336/ Lecture 2: Density Estimation Steve Renals Department of Computer Science University of Sheffield Sheffield S1 4DP UK email: s.renals@dcs.shef.ac.uk

More information

Introduction to Mobile Robotics Probabilistic Robotics

Introduction to Mobile Robotics Probabilistic Robotics Introduction to Mobile Robotics Probabilistic Robotics Wolfram Burgard 1 Probabilistic Robotics Key idea: Explicit representation of uncertainty (using the calculus of probability theory) Perception Action

More information

LECTURE NOTE #3 PROF. ALAN YUILLE

LECTURE NOTE #3 PROF. ALAN YUILLE LECTURE NOTE #3 PROF. ALAN YUILLE 1. Three Topics (1) Precision and Recall Curves. Receiver Operating Characteristic Curves (ROC). What to do if we do not fix the loss function? (2) The Curse of Dimensionality.

More information

Human Pose Tracking I: Basics. David Fleet University of Toronto

Human Pose Tracking I: Basics. David Fleet University of Toronto Human Pose Tracking I: Basics David Fleet University of Toronto CIFAR Summer School, 2009 Looking at People Challenges: Complex pose / motion People have many degrees of freedom, comprising an articulated

More information

Statistics. Lent Term 2015 Prof. Mark Thomson. 2: The Gaussian Limit

Statistics. Lent Term 2015 Prof. Mark Thomson. 2: The Gaussian Limit Statistics Lent Term 2015 Prof. Mark Thomson Lecture 2 : The Gaussian Limit Prof. M.A. Thomson Lent Term 2015 29 Lecture Lecture Lecture Lecture 1: Back to basics Introduction, Probability distribution

More information

Perhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows.

Perhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows. Chapter 5 Two Random Variables In a practical engineering problem, there is almost always causal relationship between different events. Some relationships are determined by physical laws, e.g., voltage

More information

Introduction: MLE, MAP, Bayesian reasoning (28/8/13)

Introduction: MLE, MAP, Bayesian reasoning (28/8/13) STA561: Probabilistic machine learning Introduction: MLE, MAP, Bayesian reasoning (28/8/13) Lecturer: Barbara Engelhardt Scribes: K. Ulrich, J. Subramanian, N. Raval, J. O Hollaren 1 Classifiers In this

More information

Statistical Machine Learning from Data

Statistical Machine Learning from Data Samy Bengio Statistical Machine Learning from Data 1 Statistical Machine Learning from Data Ensembles Samy Bengio IDIAP Research Institute, Martigny, Switzerland, and Ecole Polytechnique Fédérale de Lausanne

More information

Synthetic Weather Radar: Offshore Precipitation Capability

Synthetic Weather Radar: Offshore Precipitation Capability Synthetic Weather Radar: Offshore Precipitation Capability Mark S. Veillette 5 December 2017 Sponsors: Randy Bass, FAA ANG-C6 and Rogan Flowers, FAA AJM-33 DISTRIBUTION STATEMENT A: Approved for public

More information

Failure prognostics in a particle filtering framework Application to a PEMFC stack

Failure prognostics in a particle filtering framework Application to a PEMFC stack Failure prognostics in a particle filtering framework Application to a PEMFC stack Marine Jouin Rafael Gouriveau, Daniel Hissel, Noureddine Zerhouni, Marie-Cécile Péra FEMTO-ST Institute, UMR CNRS 6174,

More information

Robot Localisation. Henrik I. Christensen. January 12, 2007

Robot Localisation. Henrik I. Christensen. January 12, 2007 Robot Henrik I. Robotics and Intelligent Machines @ GT College of Computing Georgia Institute of Technology Atlanta, GA hic@cc.gatech.edu January 12, 2007 The Robot Structure Outline 1 2 3 4 Sum of 5 6

More information

System identification and sensor fusion in dynamical systems. Thomas Schön Division of Systems and Control, Uppsala University, Sweden.

System identification and sensor fusion in dynamical systems. Thomas Schön Division of Systems and Control, Uppsala University, Sweden. System identification and sensor fusion in dynamical systems Thomas Schön Division of Systems and Control, Uppsala University, Sweden. The system identification and sensor fusion problem Inertial sensors

More information

MODELLING TEMPORAL VARIATIONS BY POLYNOMIAL REGRESSION FOR CLASSIFICATION OF RADAR TRACKS

MODELLING TEMPORAL VARIATIONS BY POLYNOMIAL REGRESSION FOR CLASSIFICATION OF RADAR TRACKS MODELLING TEMPORAL VARIATIONS BY POLYNOMIAL REGRESSION FOR CLASSIFICATION OF RADAR TRACKS Lars W. Jochumsen 1,2, Jan Østergaard 2, Søren H. Jensen 2, Morten Ø. Pedersen 1 1 Terma A/S, Hovmarken 4, Lystrup,

More information

Robotics. Mobile Robotics. Marc Toussaint U Stuttgart

Robotics. Mobile Robotics. Marc Toussaint U Stuttgart Robotics Mobile Robotics State estimation, Bayes filter, odometry, particle filter, Kalman filter, SLAM, joint Bayes filter, EKF SLAM, particle SLAM, graph-based SLAM Marc Toussaint U Stuttgart DARPA Grand

More information

COMPUTATIONAL LEARNING THEORY

COMPUTATIONAL LEARNING THEORY COMPUTATIONAL LEARNING THEORY XIAOXU LI Abstract. This paper starts with basic concepts of computational learning theory and moves on with examples in monomials and learning rays to the final discussion

More information

Introduction to Machine Learning. Maximum Likelihood and Bayesian Inference. Lecturers: Eran Halperin, Lior Wolf

Introduction to Machine Learning. Maximum Likelihood and Bayesian Inference. Lecturers: Eran Halperin, Lior Wolf 1 Introduction to Machine Learning Maximum Likelihood and Bayesian Inference Lecturers: Eran Halperin, Lior Wolf 2014-15 We know that X ~ B(n,p), but we do not know p. We get a random sample from X, a

More information

Computer Vision Group Prof. Daniel Cremers. 6. Mixture Models and Expectation-Maximization

Computer Vision Group Prof. Daniel Cremers. 6. Mixture Models and Expectation-Maximization Prof. Daniel Cremers 6. Mixture Models and Expectation-Maximization Motivation Often the introduction of latent (unobserved) random variables into a model can help to express complex (marginal) distributions

More information

Tracking and Data Fusion Applications

Tracking and Data Fusion Applications Wolfgang Koch FGAN-FKIE Neuenahrer Strasse 20 D 53343 Wachtberg, Germany Tel +49 228 9435 373 Fax +49 228 9435 685 em ail w. och@ fgan. de Abstract In many engineering applications, including surveillance,

More information

Clustering K-means. Clustering images. Machine Learning CSE546 Carlos Guestrin University of Washington. November 4, 2014.

Clustering K-means. Clustering images. Machine Learning CSE546 Carlos Guestrin University of Washington. November 4, 2014. Clustering K-means Machine Learning CSE546 Carlos Guestrin University of Washington November 4, 2014 1 Clustering images Set of Images [Goldberger et al.] 2 1 K-means Randomly initialize k centers µ (0)

More information

01 Probability Theory and Statistics Review

01 Probability Theory and Statistics Review NAVARCH/EECS 568, ROB 530 - Winter 2018 01 Probability Theory and Statistics Review Maani Ghaffari January 08, 2018 Last Time: Bayes Filters Given: Stream of observations z 1:t and action data u 1:t Sensor/measurement

More information

B4 Estimation and Inference

B4 Estimation and Inference B4 Estimation and Inference 6 Lectures Hilary Term 27 2 Tutorial Sheets A. Zisserman Overview Lectures 1 & 2: Introduction sensors, and basics of probability density functions for representing sensor error

More information

Linear discriminant functions

Linear discriminant functions Andrea Passerini passerini@disi.unitn.it Machine Learning Discriminative learning Discriminative vs generative Generative learning assumes knowledge of the distribution governing the data Discriminative

More information

COS Lecture 16 Autonomous Robot Navigation

COS Lecture 16 Autonomous Robot Navigation COS 495 - Lecture 16 Autonomous Robot Navigation Instructor: Chris Clark Semester: Fall 011 1 Figures courtesy of Siegwart & Nourbakhsh Control Structure Prior Knowledge Operator Commands Localization

More information

Signal Detection Basics - CFAR

Signal Detection Basics - CFAR Signal Detection Basics - CFAR Types of noise clutter and signals targets Signal separation by comparison threshold detection Signal Statistics - Parameter estimation Threshold determination based on the

More information

Approximate Inference Part 1 of 2

Approximate Inference Part 1 of 2 Approximate Inference Part 1 of 2 Tom Minka Microsoft Research, Cambridge, UK Machine Learning Summer School 2009 http://mlg.eng.cam.ac.uk/mlss09/ Bayesian paradigm Consistent use of probability theory

More information

Tracking of Extended Objects and Group Targets using Random Matrices A New Approach

Tracking of Extended Objects and Group Targets using Random Matrices A New Approach Tracing of Extended Objects and Group Targets using Random Matrices A New Approach Michael Feldmann FGAN Research Institute for Communication, Information Processing and Ergonomics FKIE D-53343 Wachtberg,

More information

Lecture 7 Introduction to Statistical Decision Theory

Lecture 7 Introduction to Statistical Decision Theory Lecture 7 Introduction to Statistical Decision Theory I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw December 20, 2016 1 / 55 I-Hsiang Wang IT Lecture 7

More information

MODULE -4 BAYEIAN LEARNING

MODULE -4 BAYEIAN LEARNING MODULE -4 BAYEIAN LEARNING CONTENT Introduction Bayes theorem Bayes theorem and concept learning Maximum likelihood and Least Squared Error Hypothesis Maximum likelihood Hypotheses for predicting probabilities

More information

NPFL108 Bayesian inference. Introduction. Filip Jurčíček. Institute of Formal and Applied Linguistics Charles University in Prague Czech Republic

NPFL108 Bayesian inference. Introduction. Filip Jurčíček. Institute of Formal and Applied Linguistics Charles University in Prague Czech Republic NPFL108 Bayesian inference Introduction Filip Jurčíček Institute of Formal and Applied Linguistics Charles University in Prague Czech Republic Home page: http://ufal.mff.cuni.cz/~jurcicek Version: 21/02/2014

More information

ME 597: AUTONOMOUS MOBILE ROBOTICS SECTION 2 PROBABILITY. Prof. Steven Waslander

ME 597: AUTONOMOUS MOBILE ROBOTICS SECTION 2 PROBABILITY. Prof. Steven Waslander ME 597: AUTONOMOUS MOBILE ROBOTICS SECTION 2 Prof. Steven Waslander p(a): Probability that A is true 0 pa ( ) 1 p( True) 1, p( False) 0 p( A B) p( A) p( B) p( A B) A A B B 2 Discrete Random Variable X

More information

Probabilistic Graphical Models

Probabilistic Graphical Models Probabilistic Graphical Models Introduction. Basic Probability and Bayes Volkan Cevher, Matthias Seeger Ecole Polytechnique Fédérale de Lausanne 26/9/2011 (EPFL) Graphical Models 26/9/2011 1 / 28 Outline

More information

Abnormal Activity Detection and Tracking Namrata Vaswani Dept. of Electrical and Computer Engineering Iowa State University

Abnormal Activity Detection and Tracking Namrata Vaswani Dept. of Electrical and Computer Engineering Iowa State University Abnormal Activity Detection and Tracking Namrata Vaswani Dept. of Electrical and Computer Engineering Iowa State University Abnormal Activity Detection and Tracking 1 The Problem Goal: To track activities

More information

Advanced statistical methods for data analysis Lecture 1

Advanced statistical methods for data analysis Lecture 1 Advanced statistical methods for data analysis Lecture 1 RHUL Physics www.pp.rhul.ac.uk/~cowan Universität Mainz Klausurtagung des GK Eichtheorien exp. Tests... Bullay/Mosel 15 17 September, 2008 1 Outline

More information

A MULTI-SENSOR FUSION TRACK SOLUTION TO ADDRESS THE MULTI-TARGET PROBLEM

A MULTI-SENSOR FUSION TRACK SOLUTION TO ADDRESS THE MULTI-TARGET PROBLEM Approved for public release; distribution is unlimited. A MULTI-SENSOR FUSION TRACK SOLUTION TO ADDRESS THE MULTI-TARGET PROBLEM By Dr. Buddy H. Jeun, Jay Jayaraman Lockheed Martin Aeronautical Systems,

More information

Rao-Blackwellized Particle Filter for Multiple Target Tracking

Rao-Blackwellized Particle Filter for Multiple Target Tracking Rao-Blackwellized Particle Filter for Multiple Target Tracking Simo Särkkä, Aki Vehtari, Jouko Lampinen Helsinki University of Technology, Finland Abstract In this article we propose a new Rao-Blackwellized

More information

Lecture 4 Discriminant Analysis, k-nearest Neighbors

Lecture 4 Discriminant Analysis, k-nearest Neighbors Lecture 4 Discriminant Analysis, k-nearest Neighbors Fredrik Lindsten Division of Systems and Control Department of Information Technology Uppsala University. Email: fredrik.lindsten@it.uu.se fredrik.lindsten@it.uu.se

More information

Dynamic models 1 Kalman filters, linearization,

Dynamic models 1 Kalman filters, linearization, Koller & Friedman: Chapter 16 Jordan: Chapters 13, 15 Uri Lerner s Thesis: Chapters 3,9 Dynamic models 1 Kalman filters, linearization, Switching KFs, Assumed density filters Probabilistic Graphical Models

More information

Related documents:

- Today: Probability and Statistics (Naïve Bayes Classification); Linear Algebra (Matrix Multiplication, Matrix Inversion); Calculus (Vector Calculus, Optimization, Lagrange Multipliers). Classical Artificial Intelligence.
- Robotics, Lecture 4: Probabilistic Robotics. Andrew Davison, Department of Computing, Imperial College London. See the course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up-to-date information.
- The Mixed Labeling Problem in Multi-Target Particle Filtering. Yvo Boers (yvo.boers@nl.thalesgroup.com) and Hans Driessen, Thales Nederland B.V., Haasbergerstraat 49, 7554 PA Hengelo, The Netherlands.
- Probabilistic Graphical Models for Image Analysis, Lecture 1. Alexey Gronskiy and Stefan Bauer, Max Planck ETH Center for Learning Systems, 21 September 2018. Overview: 1. Motivation - Why Graphical Models.
- Generative Classifiers: The Gaussian Classifier. Ata Kaban, School of Computer Science, University of Birmingham. How Bayes' rule can be turned into a classifier.
- Clustering: K-means. Machine Learning CSE546, Sham Kakade, University of Washington, November 15, 2016. Review: PCA; start of unsupervised learning.
- TSRT14: Sensor Fusion, Lecture 8: Particle filter theory, marginalized particle filter. Gustaf Hendeby (gustaf.hendeby@liu.se), Spring 2018.
- Structural Reliability. Thuong Van Dang, May 28, 2018. Introduction to structural reliability, concept of limit state and reliability, review of probability theory, first-order second-moment method.
- Machine Learning 1: Linear Classifiers. Marius Kloft, Humboldt University of Berlin, Summer Term 2014.
- Lecture 8: Signal Detection and Noise Assumption. ECE 830, Statistical Signal Processing, instructor: R. Nowak. Signal detection hypotheses H_0: X = W versus H_1: X = S + W, where W ~ N(0, sigma^2 I_{n x n}) and S = [s_1, ..., s_n]^T.
- Localization and Tracking of Radioactive Source Carriers in Person Streams. Monika Wieneke (monika.wieneke@fkie.fraunhofer.de) and Wolfgang Koch, Dept. Sensor Data and Information Fusion, Fraunhofer FKIE, Wachtberg, Germany.