Information fusion for scene understanding
1 Information fusion for scene understanding. Philippe XU, CNRS, HEUDIASYC, University of Technology of Compiègne, France. Franck DAVOINE, Thierry DENOEUX.
2 Context of the PhD thesis. Sino-French research collaboration: Heudiasyc, UTC and CNRS; Key Lab of Machine Perception, Peking University; LIAMA, Sino-European Laboratory; PSA Peugeot Citroën, Shanghai, Paris. Research projects: Labex MS2T, ANR-NSFC PRETIV, MPR LIAMA, ICT-ASIA PREDiMap.
3 Outline 1. Information fusion for scene understanding 2. Reasoning on sets with belief functions 3. Calibration of classifiers 4. Combination of pedestrian detectors 5. Local fusion in over-segmented images 3
4 Outline 1. Information fusion for scene understanding a) Scene understanding b) Combining pattern classifiers c) Bayesian fusion 2. Reasoning on sets with belief functions 3. Calibration of classifiers 4. Combination of pedestrian detectors 5. Local fusion in over-segmented images 4
5 Multi-sensor-based perception 5
6 Trainable multi-sensor fusion. [Diagram: sensors (camera, LIDAR, stereo) provide features to a multi-class classification over the classes sky, grass, tree, road, obstacle.]
7 Non-trainable multi-sensor fusion. Reasoning with images: what the driver sees, adapted for driver assistance. Flexible and robust framework: sensors can be added or removed, new classes can be easily defined, robustness to sensor failures. [Diagram: camera, LIDAR and stereo sensors feed a partial classification over the classes sky, grass, road, tree, car, obstacle.]
8 Probabilistic model. Uncertainty is represented by a posterior probability distribution: $P(\omega_i \mid x)$. Ignorance is represented by the uniform distribution: $U_\Omega(\omega_i) = 1/|\Omega|$. Source reliability: $P_\Omega(\omega_i \mid x, r) = P_\Omega(\omega_i \mid x)\,P_R(r=1) + U_\Omega(\omega_i)\,P_R(r=0)$.
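As a minimal sketch of the reliability weighting above (the function and variable names are mine, not from the slides):

```python
def discounted_posterior(posterior, p_reliable):
    """Mix a source's posterior with the uniform distribution U_Omega
    according to the probability P_R(r=1) that the source is reliable."""
    uniform = 1.0 / len(posterior)
    return [p * p_reliable + uniform * (1.0 - p_reliable) for p in posterior]
```

With `p_reliable = 0` the source is discarded entirely and the uniform (ignorance) distribution is returned.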
9 Bayesian combination. $P(\omega_i \mid x_1, x_2) \propto P(\omega_i \mid x_1)\,P(\omega_i \mid x_2)\,/\,P(\omega_i)$. The estimation of the prior class distribution is a difficult task; a uniform prior $P(\omega_i) = 1/|\Omega|$ is used. Validity requires conditional independence: $P(x_1, x_2 \mid \omega_i) = P(x_1 \mid \omega_i)\,P(x_2 \mid \omega_i)$.
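Under a uniform prior, this combination reduces to a normalized elementwise product; a small sketch (names are illustrative):

```python
def bayes_combine(p1, p2):
    """Combine two posteriors under conditional independence and a
    uniform class prior: elementwise product, then renormalization."""
    prod = [a * b for a, b in zip(p1, p2)]
    z = sum(prod)
    return [p / z for p in prod]
```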
10 Combination rules. Support of class $\omega_i$: product $\mu_i^{\times} = \prod_j P(\omega_i \mid x_j)$; average $\mu_i^{+} = \frac{1}{n}\sum_j P(\omega_i \mid x_j)$; minimum $\mu_i^{\wedge} = \min_j P(\omega_i \mid x_j)$; maximum $\mu_i^{\vee} = \max_j P(\omega_i \mid x_j)$. [Table: the four rules compared on four properties: uniform distribution as neutral element, idempotence, representation of categorical information, combination of contradictory information.]
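The four rules can be sketched in a few lines (my own helper, with renormalization added so the outputs are comparable); the test below illustrates two of the table's properties, the uniform distribution being neutral for the product and the idempotence of the minimum:

```python
import math

def combine(posteriors, rule):
    """Per-class support under the product, average, min and max rules,
    renormalized to sum to one."""
    k = len(posteriors[0])
    ops = {
        "product": math.prod,
        "average": lambda vals: sum(vals) / len(vals),
        "min": min,
        "max": max,
    }
    mu = [ops[rule]([p[i] for p in posteriors]) for i in range(k)]
    z = sum(mu)
    return [m / z for m in mu]
```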
11 How many classes do you see? 48 11
12 Class organization. [Tree diagram: top-level classes sky, ground, vegetation, infrastructure, movable obstacle and other stuff, refined into road, lane marking, grass, tree, building, vertical structures, four-wheeled vehicle, two-wheeled vehicle, pedestrian, animal, and other kinds.]
13 Class refinement. [Diagram: masses on the coarse frames $\Omega_S$ (sky), $\Omega_G$ (ground, e.g. 0.5), $\Omega_V$ (vegetation) are carried to the refined frame $\Theta$, e.g. Grass 0.25, Road 0.25, Tree/Bush 0.17, Obstacles 0.17.]
14 Outline 1. Information fusion for scene understanding 2. Reasoning on sets with belief functions a) Information representation b) Combination rules c) Operations over the frame of discernment 3. Calibration of classifiers 4. Combination of pedestrian detectors 5. Local fusion in over-segmented images 14
15 Information representation. Mass function: $m^\Omega: 2^\Omega \to [0,1]$, with $m^\Omega(\emptyset) = 0$ and $\sum_{A \subseteq \Omega} m^\Omega(A) = 1$. Vacuous mass function: $m^\Omega(\Omega) = 1$. Simple mass function $A^w$: $m(A) = 1 - w$, $m(\Omega) = w$. Discounting $^\alpha m$: $^\alpha m(A) = (1-\alpha)\,m(A)$ for $A \neq \Omega$, and $^\alpha m(\Omega) = (1-\alpha)\,m(\Omega) + \alpha$.
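Representing a mass function as a dict from frozensets to masses, discounting is a one-liner per focal set (a sketch, with the frame passed in explicitly; names are mine):

```python
def discount(m, alpha, omega):
    """Discount a mass function m (dict: frozenset -> mass) at rate alpha:
    scale every mass by (1 - alpha) and transfer alpha to the frame omega."""
    out = {a: (1.0 - alpha) * v for a, v in m.items()}
    out[omega] = out.get(omega, 0.0) + alpha
    return out
```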
16 Information representation. Belief and plausibility: $\mathrm{Bel}(A) = \sum_{B \subseteq A} m(B)$, $\mathrm{Pl}(A) = \sum_{B \cap A \neq \emptyset} m(B)$. Contour function: $pl(\omega) = \mathrm{Pl}(\{\omega\})$. Pignistic probability: $\mathrm{BetP}(\omega) = \sum_{A \subseteq \Omega,\, \omega \in A} m(A)/|A|$.
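The three set functions above translate directly with the same dict representation (a sketch, helper names are mine):

```python
def bel(m, a):
    """Belief: total mass of focal sets included in a."""
    return sum(v for b, v in m.items() if b <= a)

def pl(m, a):
    """Plausibility: total mass of focal sets intersecting a."""
    return sum(v for b, v in m.items() if b & a)

def betp(m, element):
    """Pignistic probability of a single element of the frame."""
    return sum(v / len(a) for a, v in m.items() if element in a)
```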
17 Class refinement. [Diagram: refinement of masses from the coarse frames $\Omega_S$, $\Omega_G$, $\Omega_V$ to subsets of the refined frame $\Theta$, e.g. a mass of 0.5 on Ground carried to the corresponding subset {Grass, Road} of $\Theta$.]
18 Dempster's rule of combination. $(m_1 \oplus m_2)(\emptyset) = 0$ and $(m_1 \oplus m_2)(A) = \frac{1}{1-\kappa} \sum_{B \cap C = A} m_1(B)\,m_2(C)$, where $\kappa = \sum_{B \cap C = \emptyset} m_1(B)\,m_2(C)$. Hypothesis: the sources of information are supposed to be independent. Properties: commutative, associative, and has the vacuous mass function as unique neutral element.
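The rule is short to implement on the dict representation of mass functions (a sketch; the intersection of focal sets accumulates the product of masses, and the conflict $\kappa$ on the empty set is renormalized away):

```python
from itertools import product

def dempster(m1, m2):
    """Dempster's rule for mass functions given as dicts frozenset -> mass:
    conjunctive combination, then renormalization by the conflict kappa."""
    joint = {}
    for (b, vb), (c, vc) in product(m1.items(), m2.items()):
        joint[b & c] = joint.get(b & c, 0.0) + vb * vc
    kappa = joint.pop(frozenset(), 0.0)   # mass assigned to the empty set
    return {a: v / (1.0 - kappa) for a, v in joint.items()}
```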
19 Cautious rule. $m_1 \wedge m_2 = \bigwedge_{A \subsetneq \Omega} A^{w_1(A) \wedge w_2(A)}$, with $w(A) = \prod_{B \supseteq A} q(B)^{(-1)^{|B| - |A| + 1}}$ and $q(A) = \sum_{B \supseteq A} m(B)$. Properties: idempotent, commutative and associative; the vacuous mass function is neutral for separable mass functions (all weights $w \le 1$).
20 Triangular norm-based rules. In terms of weight functions, $m_1 \oplus m_2 = \bigoplus_{A \subsetneq \Omega} A^{w_1(A)\,w_2(A)}$ and $m_1 \wedge m_2 = \bigwedge_{A \subsetneq \Omega} A^{w_1(A) \wedge w_2(A)}$. A family of rules is obtained with Frank t-norms: $w_1 \,T_s\, w_2 = w_1 \wedge w_2$ if $s = 0$, $w_1 w_2$ if $s = 1$, and $\log_s\!\big(1 + \frac{(s^{w_1}-1)(s^{w_2}-1)}{s-1}\big)$ otherwise. The combined mass is $m_1 \circledast_s m_2 = \bigcirc_{A \subsetneq \Omega}\, A^{w_1(A)\,T_s\,w_2(A)}$.
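A sketch of the Frank t-norm on weights, assuming the convention reconstructed above (minimum at $s=0$, i.e. the cautious rule; product at $s=1$, i.e. Dempster's rule); intermediate $s$ interpolates between the two:

```python
import math

def frank_tnorm(w1, w2, s):
    """Frank t-norm on weights: minimum at s = 0 (cautious rule),
    product at s = 1 (Dempster's rule), log_s interpolation otherwise."""
    if s == 0:
        return min(w1, w2)
    if s == 1:
        return w1 * w2
    return math.log(1 + (s ** w1 - 1) * (s ** w2 - 1) / (s - 1), s)
```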
21 Outline 1. Information fusion for scene understanding 2. Reasoning on sets with belief functions 3. Calibration of classifiers a) Probabilistic calibration b) Evidential extension c) Experimental results 4. Combination of pedestrian detectors 5. Local fusion in over-segmented images 21
22 Calibration of SVM scores. [Plot: posterior probability $P(y=1 \mid s)$ as a function of the SVM score $s$.]
23 Uncertainty of the calibration. [Plot: posterior probability $P(y=1 \mid s)$ as a function of the SVM score $s$.]
24 Isotonic regression. [Plot: belief and plausibility of $y = 1$ as a function of the SVM score $s$.]
25 Logistic regression. Sigmoid function: $P(y=1 \mid s) \approx h_\theta(s) = \frac{\exp(\theta_0 + \theta_1 s)}{1 + \exp(\theta_0 + \theta_1 s)}$. Likelihood function: $L(\theta; X) = \prod_i p_i^{y_i} (1 - p_i)^{1 - y_i}$ with $p_i = h_\theta(x_i)$. Plausibility contour: $pl_x^\Omega(\omega \mid s) = \sup_{\theta_1} \; pl_X^\Theta\!\big(\ln\tfrac{\omega}{1-\omega} - \theta_1 s,\; \theta_1\big)$.
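A grid-search sketch of this contour, using the relative likelihood as the plausibility over the parameters (the data, grids and helper names here are illustrative assumptions, not the thesis implementation): for a candidate value $\omega$ of $P(y=1\mid s)$, the constraint $h_\theta(s) = \omega$ fixes $\theta_0 = \ln\frac{\omega}{1-\omega} - \theta_1 s$, and we take the supremum over $\theta_1$.

```python
import math

def loglik(theta0, theta1, data):
    """Log-likelihood of a logistic model on (score, label) pairs."""
    ll = 0.0
    for s, y in data:
        p = 1.0 / (1.0 + math.exp(-(theta0 + theta1 * s)))
        ll += y * math.log(p) + (1 - y) * math.log(1.0 - p)
    return ll

def contour_pl(omega, s, data, theta1_grid, ll_max):
    """Plausibility that P(y=1|s) equals omega: sup of the relative
    likelihood over theta1, with theta0 = ln(omega/(1-omega)) - theta1*s."""
    logit = math.log(omega / (1.0 - omega))
    return max(math.exp(loglik(logit - t1 * s, t1, data) - ll_max)
               for t1 in theta1_grid)
```

Here `ll_max` is the unconstrained maximum log-likelihood, e.g. found on a grid over $(\theta_0, \theta_1)$.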
26 Evidential logistic regression 26
27 Evidential logistic regression. [Plots: belief and plausibility of $y = 1$ as a function of the SVM score $s$.]
28 Classification results

Dataset         Adult (#train=600, #test=)   Australian (#train=300, #test=390)   Diabetes (#train=300, #test=468)
Scenario        (a)     (b)     (c)          (a)     (b)     (c)                  (a)     (b)     (c)
Probabilistic   83.24   82.70   80.90        85.13   85.90   85.90                78.42   77.14   53.42
Inv. Pignistic  83.32   82.79   81.02        85.13   85.90   86.41                78.63   77.14   54.70
Likelihood      83.29   83.03   81.65        85.13   86.67   88.46                79.06   77.35   68.16
29 Decision boundary. [Plot: belief and plausibility as a function of the SVM score $s$, with the decision boundary.]
30 Outline 1. Information fusion for scene understanding 2. Theory of belief functions 3. Calibration of classifiers 4. Combination of pedestrian detectors a) Calibration and clustering of bounding boxes b) Combination of detectors c) Experimental results 5. Local fusion in over-segmented images 30
31

#   Algorithm        Features   Classifier   Training
1   VJ               Haar       AdaBoost     INRIA
2   HOG              HOG        Linear SVM   INRIA
3   HikSVM           HOG        HIK SVM      INRIA
4   LatSVM           HOG        Latent SVM   PASCAL/INRIA
5   MultiResC        HOG        Latent SVM   Caltech
6   PoseInv          HOG        AdaBoost     INRIA
7   DBN              HOG        DeepNet      INRIA/Caltech
8   MOCO             HOG+LBP    Latent SVM   Caltech
9   pAUCBoost        HOG+COV    pAUCBoost    INRIA
10  ACF              Channels   AdaBoost     INRIA/Caltech
11  MultiFtr+Motion  Multiple   Linear SVM   TUD-Motion
12  PLS              Multiple   PLS+QDA      INRIA
13  Shapelet         Gradients  AdaBoost     INRIA
14  ConvNet          Pixels     DeepNet      INRIA
32 [Example detections: VJ [Viola02], HOG [Dalal05], ACF+SDt [Park13].]
33 Clustering of bounding boxes. Non-maximal suppression (NMS). Greedy: NMS with decreasing score => hierarchical clustering. Overlap area: $a = \frac{\mathrm{area}(BB_i \cap BB_j)}{\mathrm{area}(BB_i \cup BB_j)}$, with threshold 1/2.
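The overlap criterion and the greedy scheme can be sketched as follows (boxes as `(x1, y1, x2, y2)` tuples; names are mine):

```python
def overlap(b1, b2):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    iw = max(0.0, min(b1[2], b2[2]) - max(b1[0], b2[0]))
    ih = max(0.0, min(b1[3], b2[3]) - max(b1[1], b2[1]))
    inter = iw * ih
    a1 = (b1[2] - b1[0]) * (b1[3] - b1[1])
    a2 = (b2[2] - b2[0]) * (b2[3] - b2[1])
    return inter / (a1 + a2 - inter)

def greedy_nms(boxes, scores, thr=0.5):
    """Greedy NMS: visit boxes by decreasing score, keep a box only if its
    overlap with every already-kept box is at most thr."""
    keep = []
    for i in sorted(range(len(boxes)), key=lambda i: -scores[i]):
        if all(overlap(boxes[i], boxes[j]) <= thr for j in keep):
            keep.append(i)
    return keep
```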
34 Calibration of the HOG detector. [Plots: calibrated probability (logistic and isotonic) and number of occurrences as a function of the SVM scores.] There are far more negative samples than positive samples.
35 Calibration of detectors 35
36 Combination rules. Combining multiple low-scored bounding boxes would lead to an even less confident one. Positive scores associated with bounding boxes should therefore be regarded as supporting only the presence of a pedestrian: $\{1\}^{\alpha_1} \oplus \{1\}^{\alpha_2} = \{1\}^{\alpha_1 \alpha_2}$, $\{1\}^{\alpha_1} \wedge \{1\}^{\alpha_2} = \{1\}^{\alpha_1 \wedge \alpha_2}$, $\{1\}^{\alpha_1} \,T_s\, \{1\}^{\alpha_2} = \{1\}^{\alpha_1 T_s \alpha_2}$.
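On simple mass functions $\{1\}^{\alpha}$ (confidence $1-\alpha$ in "pedestrian present"), these rules reduce to operations on the weights; a small sketch showing how Dempster's rule accumulates support while the cautious rule does not (function name is mine):

```python
import math

def combine_confidences(alphas, rule):
    """Fuse simple mass functions {1}^alpha supporting 'pedestrian present';
    each detection carries confidence 1 - alpha. Returns the fused confidence."""
    if rule == "dempster":
        w = math.prod(alphas)   # weights multiply: support accumulates
    elif rule == "cautious":
        w = min(alphas)         # idempotent: keep the strongest single support
    else:
        raise ValueError(rule)
    return 1.0 - w
```

Two weak detections with confidence 0.3 each fuse to confidence 0.51 under Dempster's rule, but stay at 0.3 under the cautious rule.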
37 Distance between detectors. Jousselme distance between two mass functions: $d(m_1, m_2) = \sqrt{\tfrac{1}{2} \sum_{A, B \subseteq \Omega \setminus \{\emptyset\}} \tfrac{|A \cap B|}{|A \cup B|}\,(m_1(A) - m_2(A))(m_1(B) - m_2(B))}$. For two simple mass functions: $d(\{1\}^{\alpha_1}, \{1\}^{\alpha_2}) = |\alpha_1 - \alpha_2|/\sqrt{2}$. Distance between two detectors: $D(C_k, C_l) = \frac{1}{n} \sum_{i=1}^n d(m_{k,i}, m_{l,i})$.
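A direct sketch of this distance on the dict representation (focal sets must be non-empty; names are mine), with the simple-mass-function special case as a check:

```python
import math

def jousselme(m1, m2):
    """Jousselme distance between two mass functions (dicts mapping
    non-empty frozensets to masses over the same frame)."""
    focal = set(m1) | set(m2)
    d2 = 0.0
    for a in focal:
        for b in focal:
            sim = len(a & b) / len(a | b)     # Jaccard similarity |A∩B|/|A∪B|
            d2 += sim * (m1.get(a, 0.0) - m2.get(a, 0.0)) \
                      * (m1.get(b, 0.0) - m2.get(b, 0.0))
    return math.sqrt(0.5 * d2)
```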
38 Distance between detectors: detector clustering. [Dendrogram over detector numbers, grouping related detectors such as #20 CrossTalk, #22 ACF, #27 MultiFtr+Motion with #28 MultiFtr+Motion+2Ped, #8 MT-DPM with #9 MT-DPM+Context, and #6 MultiResC with #7 MultiResC+2Ped.]
39 Comparison of combination rules 39
40 Reasonable case scenario 40
41 Overall case scenario 41
42 Results 42
43 Results 43
44 Outline 1. Information fusion for scene understanding 2. Reasoning on sets with belief functions 3. Calibration of classifiers 4. Combination of pedestrian detectors 5. Local fusion in over-segmented images a) Image over-segmentation b) Construction of detection modules c) Experimental results 44
45 Multi-sensor fusion. [Diagram: sensors (camera, LIDAR, stereo) feed detection modules whose outputs are fused in a unified space over the classes sky, grass, tree, road, obstacle.]
46 46
47 Pixel-based module. [Image regions labelled with masses, e.g. $m(\{\text{ground}\}) = 1$, $m(\Omega) = 1$, $m(\{\text{sky}\}) = 1$.]
48 Stereo-based module 48
49 LiDAR-based module 49
50 Surface layout and vegetation modules 50
51 Temporal propagation module 51
52 Classification modules

Module            Frame of discernment
#1 Pixel          Ω_S = {sky, ¬sky}
#2 Pixel          Ω_G = {ground, ¬ground}
#3 Stereo         Ω_G = {ground, ¬ground}
#4 LiDAR          Ω_G = {ground, ¬ground}
#5 Surface        Λ = {ground, vertical, sky}
#6 Texture        Ω_V = {vegetation, ¬vegetation}
#7 Optical flow   multiple
53 Ground/Non-ground classification 53
54 [Figure: raw image; stereo, LiDAR, pixel and optical-flow modules; combination; ground truth.]
55 Three-class classification 55
56 [Figure: raw image, surface layout, probabilistic result, evidential result, ground truth.]
57 Three-class classification 57
58 [Figure: probabilistic result, evidential result, ground truth.]
59 Our contributions 1. Information fusion for scene understanding 2. Reasoning on sets with belief functions 3. Calibration of classifiers 4. Combination of pedestrian detectors 5. Local fusion in over-segmented images 59
60 Conclusions Flexible and robust fusion framework New detection modules can be added/removed New classes can be defined Calibration can be used to transform the output of many types of classifiers The combination framework can be used at both the object and pixel levels 60
61 Perspectives Combination with additional sources of information (GPS, maps, etc.) Combination at both object and segment levels Selection of subsets of classifiers Reinforcing existing algorithms 61
62 Publications
[1] Ph. Xu, F. Davoine, J.-B. Bordes, H. Zhao and T. Denoeux. Multimodal Information Fusion for Urban Scene Understanding. Machine Vision and Applications (MVA).
[2] Ph. Xu, F. Davoine, J.-B. Bordes and T. Denoeux. Fusion d'informations pour la compréhension de scènes. Traitement du signal (TS).
[3] Ph. Xu, F. Davoine and T. Denoeux. Evidential Logistic Regression for Binary SVM Classifier Calibration. International Conference on Belief Functions (BELIEF).
[4] Ph. Xu, F. Davoine and T. Denoeux. Evidential Combination of Pedestrian Detectors. British Machine Vision Conference (BMVC).
[5] Ph. Xu, F. Davoine, J.-B. Bordes, H. Zhao and T. Denoeux. Information Fusion on Oversegmented Images: An Application for Urban Scene Understanding. International Conference on Machine Vision Applications (MVA).
[6] J.-B. Bordes, Ph. Xu, F. Davoine, H. Zhao and T. Denoeux. Information Fusion and Evidential Grammars for Object Class Segmentation. IROS Workshop on Planning, Perception and Navigation for Intelligent Vehicles.
[7] J.-B. Bordes, F. Davoine, Ph. Xu and T. Denoeux. Evidential Grammars for Image Interpretation. Application to Multimodal Traffic Scene Understanding. Integrated Uncertainty in Knowledge Modelling and Decision Making (IUKM).
[8] Ph. Xu, F. Davoine and T. Denoeux. Transformation de scores SVM en fonctions de croyance. Congrès national sur la Reconnaissance de Formes et l'Intelligence Artificielle (RFIA).
[9] Ph. Xu, F. Davoine, J.-B. Bordes and T. Denoeux. Fusion d'informations sur des images sursegmentées : Une application à la compréhension de scènes routières. Congrès des jeunes chercheurs en vision par ordinateur (ORASIS).
More informationCredal Classification
Credal Classification A. Antonucci, G. Corani, D. Maua {alessandro,giorgio,denis}@idsia.ch Istituto Dalle Molle di Studi sull Intelligenza Artificiale Lugano (Switzerland) IJCAI-13 About the speaker PhD
More informationJoint Tracking and Classification of Airbourne Objects using Particle Filters and the Continuous Transferable Belief Model
Joint Tracking and Classification of Airbourne Objects using Particle Filters and the Continuous Transferable Belief Model Gavin Powell & David Marshall The Geometric Computing & Computer Vision Group,
More informationProbabilistic Machine Learning
Probabilistic Machine Learning by Prof. Seungchul Lee isystes Design Lab http://isystes.unist.ac.kr/ UNIST Table of Contents I.. Probabilistic Linear Regression I... Maxiu Likelihood Solution II... Maxiu-a-Posteriori
More informationMulti-Object Association Decision Algorithms with Belief Functions
ulti-object Association Decision Algorithms with Belief Functions Jérémie Daniel and Jean-Philippe Lauffenburger Université de Haute-Alsace UHA) odélisation Intelligence Processus Systèmes IPS) laboratory
More informationMachine Learning Basics Lecture 7: Multiclass Classification. Princeton University COS 495 Instructor: Yingyu Liang
Machine Learning Basics Lecture 7: Multiclass Classification Princeton University COS 495 Instructor: Yingyu Liang Example: image classification indoor Indoor outdoor Example: image classification (multiclass)
More information2D Image Processing (Extended) Kalman and particle filter
2D Image Processing (Extended) Kalman and particle filter Prof. Didier Stricker Dr. Gabriele Bleser Kaiserlautern University http://ags.cs.uni-kl.de/ DFKI Deutsches Forschungszentrum für Künstliche Intelligenz
More informationUncertainty Quantification for Machine Learning and Statistical Models
Uncertainty Quantification for Machine Learning and Statistical Models David J. Stracuzzi Joint work with: Max Chen, Michael Darling, Stephen Dauphin, Matt Peterson, and Chris Young Sandia National Laboratories
More informationCS 446 Machine Learning Fall 2016 Nov 01, Bayesian Learning
CS 446 Machine Learning Fall 206 Nov 0, 206 Bayesian Learning Professor: Dan Roth Scribe: Ben Zhou, C. Cervantes Overview Bayesian Learning Naive Bayes Logistic Regression Bayesian Learning So far, we
More informationCS4495/6495 Introduction to Computer Vision. 8C-L3 Support Vector Machines
CS4495/6495 Introduction to Computer Vision 8C-L3 Support Vector Machines Discriminative classifiers Discriminative classifiers find a division (surface) in feature space that separates the classes Several
More informationUniversität Potsdam Institut für Informatik Lehrstuhl Maschinelles Lernen. Bayesian Learning. Tobias Scheffer, Niels Landwehr
Universität Potsdam Institut für Informatik Lehrstuhl Maschinelles Lernen Bayesian Learning Tobias Scheffer, Niels Landwehr Remember: Normal Distribution Distribution over x. Density function with parameters
More informationA hierarchical fusion of expert opinion in the Transferable Belief Model (TBM) Minh Ha-Duong, CNRS, France
Ambiguity, uncertainty and climate change, UC Berkeley, September 17-18, 2009 A hierarchical fusion of expert opinion in the Transferable Belief Model (TBM) Minh Ha-Duong, CNRS, France Outline 1. Intro:
More informationBayesian Deep Learning
Bayesian Deep Learning Mohammad Emtiyaz Khan AIP (RIKEN), Tokyo http://emtiyaz.github.io emtiyaz.khan@riken.jp June 06, 2018 Mohammad Emtiyaz Khan 2018 1 What will you learn? Why is Bayesian inference
More informationSYDE 372 Introduction to Pattern Recognition. Probability Measures for Classification: Part I
SYDE 372 Introduction to Pattern Recognition Probability Measures for Classification: Part I Alexander Wong Department of Systems Design Engineering University of Waterloo Outline 1 2 3 4 Why use probability
More informationIntroduction to Machine Learning
Introduction to Machine Learning Brown University CSCI 1950-F, Spring 2012 Prof. Erik Sudderth Lecture 25: Markov Chain Monte Carlo (MCMC) Course Review and Advanced Topics Many figures courtesy Kevin
More informationShape of Gaussians as Feature Descriptors
Shape of Gaussians as Feature Descriptors Liyu Gong, Tianjiang Wang and Fang Liu Intelligent and Distributed Computing Lab, School of Computer Science and Technology Huazhong University of Science and
More informationGaussian Processes as Continuous-time Trajectory Representations: Applications in SLAM and Motion Planning
Gaussian Processes as Continuous-time Trajectory Representations: Applications in SLAM and Motion Planning Jing Dong jdong@gatech.edu 2017-06-20 License CC BY-NC-SA 3.0 Discrete time SLAM Downsides: Measurements
More informationGlobal Behaviour Inference using Probabilistic Latent Semantic Analysis
Global Behaviour Inference using Probabilistic Latent Semantic Analysis Jian Li, Shaogang Gong, Tao Xiang Department of Computer Science Queen Mary College, University of London, London, E1 4NS, UK {jianli,
More informationarxiv: v2 [cs.ne] 22 Feb 2013
Sparse Penalty in Deep Belief Networks: Using the Mixed Norm Constraint arxiv:1301.3533v2 [cs.ne] 22 Feb 2013 Xanadu C. Halkias DYNI, LSIS, Universitè du Sud, Avenue de l Université - BP20132, 83957 LA
More informationContinuous updating rules for imprecise probabilities
Continuous updating rules for imprecise probabilities Marco Cattaneo Department of Statistics, LMU Munich WPMSIIP 2013, Lugano, Switzerland 6 September 2013 example X {1, 2, 3} Marco Cattaneo @ LMU Munich
More informationPattern Recognition and Machine Learning
Christopher M. Bishop Pattern Recognition and Machine Learning ÖSpri inger Contents Preface Mathematical notation Contents vii xi xiii 1 Introduction 1 1.1 Example: Polynomial Curve Fitting 4 1.2 Probability
More information