CS 534: Computer Vision Segmentation III Statistical Nonparametric Methods for Segmentation


CS 534: Computer Vision
Segmentation III: Statistical Nonparametric Methods for Segmentation
Ahmed Elgammal, Dept. of Computer Science

Outline
- Density estimation
- Nonparametric kernel density estimation
- Mean shift
- Mean shift clustering
- Mean shift filtering and segmentation

Statistical Background
Density estimation: given a sample S = {x_i}, i = 1..n, from a distribution, obtain an estimate of the density function at any point.
Parametric: assume a parametric density family f(.; θ), e.g., N(µ, σ²), and obtain the best estimator of θ.
- Advantages: efficient; robust to noise (robust estimators can be used).
- Problem: an incorrectly specified parametric model has a bias that cannot be removed even by a large number of samples.
Nonparametric: obtain a good estimate directly from the sample. The most famous example is the histogram.

Kernel Density Estimation
Introduced in the 1950s and onward (Fix & Hodges 51; Rosenblatt 56; Parzen 62; Cencov 62). Given a set of samples S = {x_i}, i = 1..n, we can obtain an estimate for the density at x as:

\hat{f}(x) = \frac{1}{n} \sum_{i=1}^{n} K_h(x - x_i)

where K_h(t) = K(t/h)/h is the kernel (window) function and h is the scale or bandwidth. K must satisfy certain conditions, e.g., K(t) ≥ 0 and ∫ K(t) dt = 1.

Kernel Estimation
A variety of kernel shapes with different properties are available; the Gaussian kernel is typically used for its continuity and differentiability.
Multivariate case, product kernel: use the same kernel function with a different bandwidth h_j for each dimension:

\hat{f}(x) = \frac{1}{n} \sum_{i=1}^{n} \prod_{j=1}^{d} K_{h_j}\big(x^{(j)} - x_i^{(j)}\big)

A more general form uses a full bandwidth matrix. A practical concern is to avoid storing all the samples.
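As a concrete illustration of the estimator above, here is a minimal 1-D KDE sketch in NumPy. The function name `kde` and the bimodal sample data are illustrative, not from the slides:

```python
import numpy as np

def kde(x, samples, h):
    """Kernel density estimate at points x from 1-D samples,
    using a Gaussian kernel with bandwidth h."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    s = np.asarray(samples, dtype=float).reshape(1, -1)
    u = (x - s) / h
    k = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)  # standard normal kernel K
    return k.mean(axis=1) / h                       # (1/n) * sum_i K_h(x - x_i)

# Sample from a bimodal distribution and evaluate the estimate on a grid.
rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(-2, 0.5, 200), rng.normal(1, 1.0, 200)])
density = kde(np.linspace(-4, 3, 50), samples, h=0.3)
```

The estimate is nonnegative everywhere and integrates to one, since each kernel does.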

Kernel Density Estimation: Advantages
- Converges to any density shape with sufficient samples: asymptotically, the estimate converges to any density.
- No need for model specification.
- Unlike histograms, the density estimates are smooth, continuous, and differentiable.
- Easily generalizes to higher dimensions.
- Other density estimation methods, e.g., histograms, are asymptotically kernel methods.
In computer vision, densities are typically multivariate and multimodal, with irregular cluster shapes.

Example: Color Clusters
Cluster shapes are irregular, and cluster boundaries are not well defined.

KDE Examples
[Figure slides: density estimates of the same sample using a Gaussian kernel vs. a uniform kernel.]

Scale Selection
An important problem with a large literature. A small h results in ragged densities; a large h results in oversmoothing. The best choice of h depends on the number of samples:
- small n: wide kernels
- large n: narrow kernels

Optimal Scale
The optimal kernel and optimal scale can be obtained by minimizing the mean integrated squared error — if we know the true density. Normal reference rule (for a Gaussian kernel in one dimension):

h = 1.06 \, \hat{\sigma} \, n^{-1/5}
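The normal reference rule is a one-liner; this sketch (the function name `silverman_bandwidth` is illustrative) computes it from the sample standard deviation:

```python
import numpy as np

def silverman_bandwidth(samples):
    """Normal reference rule for a 1-D Gaussian kernel:
    h = 1.06 * sigma_hat * n^(-1/5)."""
    s = np.asarray(samples, dtype=float)
    return 1.06 * s.std(ddof=1) * len(s) ** (-1 / 5)

h = silverman_bandwidth(np.arange(1.0, 101.0))  # 100 evenly spaced "samples"
```

Note the rule is derived assuming the true density is Gaussian, so it tends to oversmooth multimodal data.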

Scale Selection (cont.)
[Figure slides illustrating the effect of the bandwidth, from R. O. Duda, P. E. Hart, and D. G. Stork, Pattern Classification, 2nd edition, Wiley, New York, 2000.]

Mean Shift
Given a sample S = {s_i : s_i ∈ R^n} and a kernel K, the sample mean using K at point x is

m(x) = \frac{\sum_i K(s_i - x) \, s_i}{\sum_i K(s_i - x)}

Iterations of the form x ← m(x) lead to a local mode of the density:
- Let x be the center of the window.
- Iterate until convergence:
  - Compute the sample mean m(x) from the samples inside the window.
  - Replace x with m(x).

Mean Shift: Background
Fukunaga and Hostetler (1975) introduced the mean shift as the difference m(x) − x using a flat kernel; iterations of the form x ← m(x) lead to the density mode. Cheng (1995) generalized the definition to general kernels and weighted data. The method was recently popularized by D. Comaniciu and P. Meer (1999 onward). Applications: clustering [Cheng, Fu 85], image filtering and segmentation [Meer 99], and tracking [Meer 00].
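The iteration above can be sketched in a few lines. This is a minimal mode-seeking routine with a Gaussian kernel (the function name `mean_shift` and the toy two-cluster data are illustrative):

```python
import numpy as np

def mean_shift(x, samples, h, max_iter=100, tol=1e-6):
    """Hill-climb to a density mode: iterate x <- m(x), where m(x) is the
    Gaussian-kernel weighted sample mean around x."""
    s = np.asarray(samples, dtype=float)
    x = np.asarray(x, dtype=float)
    for _ in range(max_iter):
        w = np.exp(-0.5 * np.sum((s - x) ** 2, axis=1) / h ** 2)  # K(s_i - x)
        m = (w[:, None] * s).sum(axis=0) / w.sum()                # sample mean m(x)
        if np.linalg.norm(m - x) < tol:
            break
        x = m
    return x

# Two well-separated 2-D clusters; starting near the origin climbs to the first.
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
mode = mean_shift(np.array([0.5, -0.5]), pts, h=1.0)
```

Because the far cluster's kernel weights are vanishingly small at this bandwidth, the iteration converges to the mode of the cluster near the origin.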

Mean Shift and the Density Gradient
Iterations of the form x ← m(x) are called the mean shift algorithm. If K is a Gaussian kernel K_σ(x) = exp(−‖x‖²/2σ²) and the density estimate using K is

\hat{f}(x) = \frac{1}{n} \sum_i K_\sigma(x - s_i),

then, using the derivative of the Gaussian kernel, we can show that

m(x) - x = \sigma^2 \, \frac{\nabla \hat{f}(x)}{\hat{f}(x)},

i.e., the mean shift is in the gradient direction of the density estimate.

Mean Shift Properties
- The mean shift is in the gradient direction of the density estimate.
- Successive iterations converge to a local maximum of the density, i.e., a stationary point where m(x) = x.
- Mean shift is a steepest-ascent-like procedure with variable-size steps that lead to fast convergence: a well-adjusted steepest ascent.
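The gradient identity can be checked numerically: the mean shift vector m(x) − x should equal σ²∇f̂(x)/f̂(x), where the gradient is approximated by central differences. All names below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
s = rng.normal(0.0, 1.0, (100, 2))   # samples s_i
sigma = 0.8
x0 = np.array([0.4, -0.2])

def f_hat(x):
    """Density estimate (up to a constant factor) with Gaussian kernel K_sigma."""
    return np.exp(-np.sum((s - x) ** 2, axis=1) / (2 * sigma ** 2)).sum()

def mean_shift_vector(x):
    """m(x) - x with the same Gaussian kernel."""
    w = np.exp(-np.sum((s - x) ** 2, axis=1) / (2 * sigma ** 2))
    return (w[:, None] * s).sum(axis=0) / w.sum() - x

# Central-difference gradient of f_hat at x0.
eps = 1e-5
grad = np.array([(f_hat(x0 + eps * e) - f_hat(x0 - eps * e)) / (2 * eps)
                 for e in np.eye(2)])
predicted = sigma ** 2 * grad / f_hat(x0)   # sigma^2 * grad f_hat / f_hat
```

The constant normalization factor of the kernel cancels in the ratio, so it can be dropped from `f_hat`.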

Mean Shift and Image Filtering
Discontinuity-preserving smoothing: recall that averaging or Gaussian filters blur images and do not preserve region boundaries.
Mean shift application:
- Represent each pixel x by its spatial location x^s and its range value x^r (color or intensity).
- Look for modes in the joint spatial-range space, using a product of two kernels: a spatial kernel with bandwidth h_s and a range kernel with bandwidth h_r.
Algorithm:
- For each pixel x_i = (x_i^s, x_i^r), apply mean shift until convergence; let the convergence point be (y_i^s, y_i^r).
- Assign z_i = (x_i^s, y_i^r) as the filter output.
Results: see the paper.
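The filtering algorithm can be sketched as below, using a flat kernel in the normalized joint spatial-range space. This is a brute-force O(N²) sketch for tiny images only; the function name `mean_shift_filter` and the two-region test image are illustrative:

```python
import numpy as np

def mean_shift_filter(image, hs, hr, max_iter=20):
    """Discontinuity-preserving smoothing: move each pixel's joint
    spatial-range point (x_s, x_r) to its mode (y_s, y_r) with a flat
    kernel of radius hs (spatial) and hr (range), then keep the original
    position with the converged range value, z = (x_s, y_r)."""
    H, W = image.shape[:2]
    ys, xs = np.mgrid[0:H, 0:W]
    feats = np.column_stack([xs.ravel() / hs, ys.ravel() / hs,
                             image.reshape(H * W, -1) / hr])  # normalized joint space
    out = np.empty_like(feats)
    for i in range(len(feats)):
        x = feats[i]
        for _ in range(max_iter):
            inside = np.sum((feats - x) ** 2, axis=1) < 1.0  # flat joint kernel
            m = feats[inside].mean(axis=0)                   # sample mean m(x)
            if np.linalg.norm(m - x) < 1e-3:
                break
            x = m
        out[i] = m
    return (out[:, 2:] * hr).reshape(image.shape)  # keep converged range only

# A two-region image is preserved exactly, unlike Gaussian blurring.
img = np.zeros((8, 8))
img[:, 4:] = 100.0
filtered = mean_shift_filter(img, hs=2.0, hr=10.0)
```

Because pixels across the step differ by more than h_r in range, they never fall inside each other's windows, so the boundary survives while within-region noise would be averaged away.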


CS 534 A. Elgammal
Mean Shift Segmentation
(Excerpt from D. Comaniciu and P. Meer, "Mean Shift: A Robust Approach Toward Feature Space Analysis," IEEE Transactions on Pattern Analysis and Machine Intelligence, May 2002; reconstructed from a garbled transcription.)
Mean shift procedure-based image segmentation is a straightforward extension of the discontinuity-preserving smoothing algorithm: each pixel is associated with a significant mode of the joint-domain density located in its neighborhood.
Let x_i and z_i, i = 1..n, be the d-dimensional input and filtered image pixels in the joint spatial-range domain, and L_i the label of the i-th pixel in the segmented image.
1. Run the mean shift filtering procedure for the image and store the d-dimensional convergence point of each pixel in z_i.
2. Delineate in the joint domain the clusters {C_p} by grouping together all z_i which are closer than h_s in the spatial domain and h_r in the range domain, i.e., concatenate the basins of attraction of the corresponding convergence points.
3. For each i = 1..n, assign L_i = {p | z_i ∈ C_p}.
4. Optional: eliminate spatial regions containing fewer than M pixels.
The cluster delineation step can be refined according to a priori information, so physics-based segmentation algorithms can be incorporated; since the process is performed on region adjacency graphs, hierarchical techniques can provide significant speed-up.

Mean Shift and Segmentation
Similar to filtering, but then group clusters from the filtered image: group together all z_i which are closer than h_s in the spatial domain and closer than h_r in the range domain.

Mean Shift Tracking
[Slide diagram, from R. Collins: appearance-based tracking. Given the object location in the previous frame and an appearance model (e.g., an image template, or color, intensity, and edge histograms), compute a likelihood over the object location in the current frame and find the current location by mode-seeking (e.g., mean shift, Lucas-Kanade, particle filtering).]

Sources
R. O. Duda, P. E. Hart, and D. G. Stork. Pattern Classification. Wiley, New York, 2nd edition, 2000.