Large Scale Environment Partitioning in Mobile Robotics Recognition Tasks


Large Scale Environment Partitioning in Mobile Robotics Recognition Tasks
Boyan Bonev, Miguel Cazorla ({boyan,miguel}@dccia.ua.es)
Robot Vision Group, Department of Computer Science and Artificial Intelligence, University of Alicante
April 27th, 2007
Bonev, Cazorla · Environment Partitioning in Recognition Tasks · 1 / 35 · WAF 2008

Outline
1. Localization and vision (initial approach, approach)
2. …
3. …
4. …

Localization
Mobile robots: sensors + maps → localization

Vision-based localization
Environments: ad-hoc / natural; indoor / outdoor

Appearance-based visual recognition
Visual recognition approaches:
- Structural-description: structure from high-level features
- Appearance-based: images or low-level features
(figure: grid of appearance-based reference images)

Omnidirectional images
- Local views with 360° visibility
- Independence of the direction of the route
- Convenient representation for rotation-invariant recognition
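The rotation invariance mentioned above can be seen directly: a rotation of the robot is a circular shift of the columns of the panoramic image, so any global histogram descriptor is unchanged. A minimal NumPy sketch with a synthetic panorama:

```python
import numpy as np

rng = np.random.default_rng(0)
pano = rng.random((32, 128))          # toy panoramic image; 128 columns span 360 degrees
rotated = np.roll(pano, 40, axis=1)   # the robot rotated: columns shift circularly

h1, _ = np.histogram(pano, bins=16, range=(0, 1))
h2, _ = np.histogram(rotated, bins=16, range=(0, 1))
assert np.array_equal(h1, h2)         # identical histograms: the descriptor is rotation-invariant
```

The same argument holds for any statistic computed over a full ring of the omnidirectional view.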

Feature selection approach
- Low-level filters: Nitzberg, Canny, gradient, color filters
- Histogram comparison: 2, 4, … bins discretization
- 10-fold CV (train/test) over the image feature vectors: all features → selected features → best feature set, driven by the error
(figure: single similarity function response H over reference vs. test images)

Large environments

Approach
- Unsupervised partitioning of the environment → multiple hypotheses
- MCL to select a single hypothesis
(figures: gradient of the JR divergence along the image sequence; similarity function responses H1–H20 over reference vs. test images; particle filter likelihoods at iterations 1, 5, 8, …)

Sequence of images

Divide the problem
How to divide the problem?
- Try all possible partitions
- Clustering algorithms
- Look for local variations in the information

Jensen-Rényi divergence
Information-theoretic divergence measures:
- entropy based
- unfeasible for multidimensional data
Jensen-Rényi divergence:
- α-entropy based
- feasible estimation in high-dimensional spaces (Hero and Michel, 2002)
J-R divergence applications:
- may be defined between any number of probability distributions
- may be used to detect edges with a sliding window
(figure: two sliding windows W1, W2 moved across data spanning regions A and B; JR rises from 0.4 to 0.7 at the region boundary and falls back to 0.4)

Jensen-Rényi divergence
J-R divergence simplified for two equally weighted distributions:

    JR_α(p1, p2) = H_α((p1 + p2) / 2) - (H_α(p1) + H_α(p2)) / 2

The Rényi entropy H_α is estimated with Hero and Michel's method (based on minimal spanning trees), with complexity O(N log N) in the number of samples.
(figure: sliding windows W1, W2 with JR values 0.4 → 0.7 → 0.4 across the region boundary)
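A sketch of this estimator, assuming SciPy. Following the Hero–Michel idea, the Rényi α-entropy of a sample is estimated from the sum of MST edge lengths raised to the power γ = d(1 - α); the estimator's additive normalization constant is omitted here, since it cancels in the JR divergence. A dense pairwise-distance matrix is used for brevity, so this sketch is O(N²); the O(N log N) bound needs a smarter MST construction:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

def renyi_entropy_mst(X, alpha=0.5):
    """Renyi alpha-entropy of samples X (n x d), up to an additive constant."""
    n, d = X.shape
    gamma = d * (1 - alpha)
    mst = minimum_spanning_tree(squareform(pdist(X)))
    L = np.sum(mst.data ** gamma)          # sum of MST edge lengths^gamma
    return np.log(L / n ** alpha) / (1 - alpha)

def jr_divergence(X1, X2, alpha=0.5):
    """JR_alpha(p1, p2): entropy of the pooled samples minus the mean entropy."""
    pooled = np.vstack([X1, X2])
    return renyi_entropy_mst(pooled, alpha) - 0.5 * (
        renyi_entropy_mst(X1, alpha) + renyi_entropy_mst(X2, alpha))
```

Two well-separated sample sets yield a clearly larger divergence than two draws from the same distribution, which is what the sliding-window boundary detector relies on.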

Jensen-Rényi divergence
Response and entropy divergence analysis of the Nitzberg filter
(figure: response of the Nitzberg filter, local entropy divergence, and divergence gradient along the image sequence)

Multiscale Jensen-Rényi divergence
JR divergence at various window sizes
(figure: JR divergence over image index and window size; example images #241, #254, #273)

Resulting partitions
(figure: gradient of the JR divergence along the image sequence)
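The partitioning above can be sketched as a two-window scan: at each position t, pool the features of the w images before and after t and compute their JR divergence; partition boundaries are the local maxima of the resulting profile (equivalently, zero crossings of its gradient). For brevity this sketch estimates the Rényi entropy from a plain 1-D histogram instead of the MST estimator used in the talk:

```python
import numpy as np

def renyi_entropy_hist(x, alpha=0.5, bins=16, value_range=(0.0, 1.0)):
    # histogram-based Renyi alpha-entropy (stand-in for the MST estimator)
    p, _ = np.histogram(x, bins=bins, range=value_range)
    p = p / p.sum()
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1 - alpha)

def jr_profile(seq, w=8, alpha=0.5):
    # JR divergence between the features of the w images before and after t
    out = np.zeros(len(seq))
    for t in range(w, len(seq) - w):
        a = np.concatenate(seq[t - w:t])
        b = np.concatenate(seq[t:t + w])
        m = np.concatenate([a, b])
        out[t] = renyi_entropy_hist(m, alpha) - 0.5 * (
            renyi_entropy_hist(a, alpha) + renyi_entropy_hist(b, alpha))
    return out
```

On a synthetic sequence whose feature distribution changes at frame 30, the profile peaks at that frame, which is where a partition boundary would be placed.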

Feature extraction
Extraction of global features:
- Rings: each image is divided into C = 4 concentric rings
- Filter bank: K = 17 filters applied to each ring, giving C × K = 68 responses
- Histograms: each response is discretized into B = 4 bins
- Feature vector: N = C × K × (B - 1) = 204 features
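A sketch of this pipeline, with a toy 3-filter bank standing in for the 17-filter bank of the slides (so the vector has 3 × 4 × 3 = 36 entries instead of 204). The ring masks are computed from the distance to the image centre, as suits an omnidirectional view; the last histogram bin is dropped because normalized bins sum to 1, which is why the slides count B - 1 values per histogram:

```python
import numpy as np
from scipy import ndimage

def ring_masks(h, w, n_rings=4):
    # boolean masks for n_rings concentric rings around the image centre
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2)
    edges = np.linspace(0, r.max() + 1e-9, n_rings + 1)
    return [(r >= edges[i]) & (r < edges[i + 1]) for i in range(n_rings)]

def global_features(img, filters, n_rings=4, bins=4):
    # normalized histogram (minus one redundant bin) of every filter response in every ring
    masks = ring_masks(*img.shape, n_rings)
    feats = []
    for f in filters:
        resp = f(img)
        lo, hi = resp.min(), resp.max() + 1e-9
        for m in masks:
            hist, _ = np.histogram(resp[m], bins=bins, range=(lo, hi))
            hist = hist / hist.sum()
            feats.extend(hist[:-1])
    return np.asarray(feats)

# toy filter bank (the actual 17 filters are not listed in the transcription)
filters = [lambda im: ndimage.gaussian_gradient_magnitude(im, 2),
           lambda im: ndimage.laplace(im),
           lambda im: im]
```

With the full K = 17 bank this yields exactly the 4 × 17 × 3 = 204-dimensional vector of the slides.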

Training
Selection of features: 10-fold CV (train/test) over the M image feature vectors; starting from all N_F features, a subset of N_S features is selected, and the best feature set is chosen by the CV error.
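The transcription does not detail the selection algorithm, so this sketches one common realization of the scheme: greedy forward selection driven by the 10-fold CV error of a 1-NN classifier. The names `cv_error` and `greedy_select` are illustrative, not from the talk:

```python
import numpy as np

def cv_error(X, y, feats, k=10):
    # k-fold CV error of a 1-NN classifier restricted to the chosen features
    rng = np.random.default_rng(0)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    errs = 0
    for f in folds:
        tr = np.setdiff1d(idx, f)
        Xtr, Xte = X[np.ix_(tr, feats)], X[np.ix_(f, feats)]
        d = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
        errs += (y[tr][d.argmin(1)] != y[f]).sum()
    return errs / len(y)

def greedy_select(X, y, max_feats=5):
    # forward selection: add the feature that most reduces the CV error
    chosen, best_err = [], 1.0
    while len(chosen) < max_feats:
        err, j = min((cv_error(X, y, chosen + [j]), j)
                     for j in range(X.shape[1]) if j not in chosen)
        if err >= best_err:        # stop when no candidate improves the CV error
            break
        chosen.append(j)
        best_err = err
    return chosen, best_err
```

On synthetic data where only one feature determines the label, the procedure recovers that feature and stops once extra features no longer help.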

NNs in the feature space
The ideal case: (figure: classes F1, F2, …, FN well separated in the feature space)

NNs in the feature space
The wrong case: (figure: classes F1, F2, …, FN in the feature space)

Response on the test set
(figure: responses of two classifiers trained on different reference-image ranges, plotted over reference vs. test images)

Several localization hypotheses
(figure: similarity function responses H1–H20 over reference vs. test images)

Several localization hypotheses
Monte Carlo Localization, given:
- a motion model
- a likelihood function for a given position
(figure: similarity function responses H1–H20 over reference vs. test images)

MCL algorithm for disambiguation
(figure: particle filter likelihoods over particle position at iterations 1, 5, 8, …)
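A minimal 1-D Monte Carlo Localization sketch over the reference-image index. The talk's motion model and similarity-based likelihood are not given in the transcription, so a zero-motion model and a synthetic Gaussian likelihood stand in, and multinomial resampling is used for brevity:

```python
import numpy as np

def mcl_step(particles, motion, likelihood, rng, noise=2.0):
    """One MCL iteration: predict with the motion model, weight, resample."""
    particles = particles + motion + rng.normal(0.0, noise, len(particles))
    w = likelihood(particles)
    w = w / w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)  # multinomial resampling
    return particles[idx]

# toy run: particles spread over a 400-image route, observations favour position ~120
rng = np.random.default_rng(0)
likelihood = lambda p: np.exp(-0.5 * ((p - 120.0) / 10.0) ** 2) + 1e-9
particles = rng.uniform(0, 400, 500)
for _ in range(10):
    particles = mcl_step(particles, motion=0.0, likelihood=likelihood, rng=rng)
```

After a few iterations the particle cloud collapses onto the hypothesis consistent with the observations, which is exactly the disambiguation effect shown in the likelihood plots above.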

Single classifier
(figure: single similarity function response H over reference vs. test images; particle filter likelihood at iteration 56)

Visual localization approach
- Scalability
- Unsupervised IT-based partitioning
- Fast image recognition (~0.1 s)
- Suitable for corridor-like scenarios
Future work: generalize to 2D scenarios
