Lec 12 Review of Part I: (Hand-crafted) Features and Classifiers in Image Classification

1 Image Analysis & Retrieval, Spring 2017: Image Analysis. Lec 12: Review of Part I: (Hand-crafted) Features and Classifiers in Image Classification. Zhu Li, Dept of CSEE, UMKC. Office: FH560E, Ph: x2346

2 Outline
HW-2
Quiz-1
Review of Part I: Image Features & Classifiers
o Image Formation: Homography
o Color Space & Color Features
o Filtering and Edge Features
o Harris Detector
o SIFT
o Aggregation: BoW, VLAD, Fisher Vector, and Supervector
o Image Retrieval System Metrics: TPR/Precision, FPR, Recall, mAP
o Classification: kNN, GMM Bayesian, SVM

3 Homework-2: Aggregation
Data set: CDVS data set.
Global codebook for VLAD and Fisher Vector:
o Compute SIFT and DenseSIFT from each image, and sort them into two arrays: hw2_sift, hw2_dsift
o Randomly sample a subset of them using randperm():
  indx = randperm(n_sift); indx = indx(1:20*2^10); % sample ~20k
o Compute PCA:
  [A0, s, eigv] = princomp(hw2_sift(indx, :));
o Project SIFT for dimension reduction: x = sift * A0(:, 1:kd)
o Compute k-means and GMM models for VLAD and FV aggregation with vl_feat: vl_kmeans(), vl_gmm()
A sketch of this pipeline follows below.
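
A minimal MATLAB sketch of the codebook-training pipeline above, assuming VLFeat is on the path; hw2_sift is the N x 128 SIFT array from the slide, and the kd and K values are hypothetical picks from the HW grid:

  % sample ~20k descriptors for PCA and codebook training
  indx = randperm(size(hw2_sift, 1));
  indx = indx(1:20*2^10);
  X = hw2_sift(indx, :);
  % PCA (rows = observations), then project to kd dimensions
  kd = 24;
  [A0, s, eigv] = princomp(X);
  x = X * A0(:, 1:kd);
  % VLFeat expects kd x N single-precision columns
  Xc = single(x');
  % k-means codebook for VLAD, GMM codebook for Fisher Vector
  K = 48;
  km = vl_kmeans(Xc, K);              % kd x K centroids
  [mu, sigma, w] = vl_gmm(Xc, K);     % GMM codebook {mu_k, Sigma_k, pi_k}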

4 VLAD - Vector of Locally Aggregated Descriptors
Aggregate feature differences from the codebook:
o Hard assignment: assign each feature x_j to its nearest codeword NN(x_j) in {μ_k}
o Compute aggregated differences: v_k = Σ_{j : NN(x_j) = μ_k} (x_j − μ_k)
o L2 normalize: v_k = v_k / ||v_k||_2
o Final feature dimension: k × d
[Figure: ① assign descriptors to cells, ② compute residuals x − μ_i, ③ v_i = sum of residuals in cell i; stacked blocks v_1 v_2 v_3 v_4 v_5]
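
A sketch of this encoding with VLFeat's vl_vlad (an assumption beyond the slide, which only names vl_kmeans), where X is a kd x N single-precision descriptor matrix and km the kd x K centroids from the codebook step; vl_vlad takes an explicit K x N assignment matrix, built here by hard nearest-neighbor assignment:

  kdtree = vl_kdtreebuild(km);
  nn = vl_kdtreequery(kdtree, km, X);           % 1 x N nearest-centroid ids
  K = size(km, 2);  N = size(X, 2);
  assign = zeros(K, N, 'single');
  assign(sub2ind([K N], double(nn), 1:N)) = 1;  % one-hot hard assignment
  % aggregate residuals x - mu_k and L2-normalize each v_k; output is K*kd x 1
  vlad = vl_vlad(X, km, assign, 'NormalizeComponents');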

5 Fisher Vector Aggregation
FV encoding: gradient of the GMM log-likelihood w.r.t. the mean (and variance) of component k, elementwise over dimensions j = 1..D, with q_{t,k} the posterior of descriptor x_t under component k:
u_k = (1/(N√π_k)) Σ_t q_{t,k} (x_t − μ_k)/σ_k,   v_k = (1/(N√(2π_k))) Σ_t q_{t,k} [((x_t − μ_k)/σ_k)² − 1]
In the end, we have a 2K × D aggregation of the derivatives w.r.t. the means and variances: FV = [u_1, u_2, ..., u_K, v_1, v_2, ..., v_K]^T
Richer characterization of the codebook with a GMM: {μ_k, Σ_k, π_k}
Normalize by the Fisher kernel.
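
The matching Fisher Vector step is a one-liner with vl_fisher (which the HW slides name), assuming the GMM codebook {mu, sigma, w} trained earlier and X a kd x N descriptor matrix of one image; the output stacks [u_1,...,u_K, v_1,...,v_K] into a 2*K*kd vector:

  fv = vl_fisher(X, mu, sigma, w, 'Normalized');   % Fisher-kernel normalized FV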

6 HW-2 Deliverables
Functions:
[vlad] = getvladsift(im, km, opt); % opt = 'sift' or 'dsift'
[fv] = getfvsift(im, gmm, opt); % can use vl_fisher()
Plots: for each image in mp_fid, nmp_fid we have {vlad, fv} × {kd = [16, 24]} × {n = [24, 48, 96]}, i.e. 12 features per image; plot the true-positive distance and true-negative distance histograms. Hint: hist(d0, 30); hist(d1, 30);
Also report the TPR-FPR performance.

7 Quiz-1
Covers what we reviewed today.
Format: True or False, Multiple Choice, Problem Solving, Extra Credit (25%)
Closed book; a 1-page A4 cheating sheet is allowed:
o the cheating sheet needs to be turned in with your name and ID number
o the cheating sheet will also be graded, for up to 10 pts
Relax, the quiz is actually on me.

8 Best Cheating Sheet Award
The cheating sheet is also graded: up to 10 pts extra.
A nice way to summarize and organize/index your knowledge base.
Cheating sheet + HW = what you really get from the class.

9 Outline
HW-2
Quiz-1
Review of Part I: Image Features & Classifiers
o Image Formation: Homography
o Color Space & Color Features
o Filtering and Edge Features
o Harris Detector
o SIFT
o Aggregation: BoW, VLAD, Fisher Vector, and Supervector
o Image Retrieval System Metrics: TPR/Precision, FPR, Recall, mAP
o Classification: kNN, GMM Bayesian, SVM

10 Classification in Image Retrieval
An image retrieval pipeline (hand-crafted features): Image Formation → Feature Computing → Feature Aggregation → Classification, backed by a Knowledge/Data Base.
o Image Formation: homography, color space
o Feature Computing: color histogram; filtering, edge detection; HoG, Harris detector, SIFT
o Feature Aggregation: BoW, VLAD, Fisher Vector, Supervector

11 Image Formation - Geometry
Perspective projection in homogeneous coordinates: x = K [I | 0] X, i.e.
[u; v; w] = [f 0 0; 0 f 0; 0 0 1] [I₃ | 0] [x; y; z; 1], with pixel coordinates (u/w, v/w)
Intrinsic assumptions: unit aspect ratio, optical center at (0,0), no skew of pixels.
Extrinsic assumptions: no rotation, camera at (0,0,0).
(Slide credit: GaTech Computer Vision class)

12 Image Formation: Homography
In general, a homography H maps 2D points according to x′ = Hx:
H = [h1 h2 h3; h4 h5 h6; h7 h8 h9]
Defined up to a scale, as [x, y, w] = [sx, sy, sw], so H has 8 DoF.
Affine transform, 6 DoF: contains a translation [t1, t2] and an invertible 2×2 affine matrix A:
H = [a1 a2 t1; a3 a4 t2; 0 0 1]
Similarity transform, 4 DoF: a rigid transform that preserves distance if s = 1:
H = [s cos θ  −s sin θ  t1; s sin θ  s cos θ  t2; 0 0 1]
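
A quick illustration of applying a homography in homogeneous coordinates; H and the points here are made-up example values:

  H = [1.1 0.02 5; -0.01 0.95 3; 1e-4 2e-4 1];  % example homography
  p = [10 50; 20 80];                  % 2 x n points, rows [x; y]
  n = size(p, 2);
  ph = H * [p; ones(1, n)];            % homogeneous result [sx; sy; s]
  p2 = ph(1:2, :) ./ ph([3 3], :);     % divide by w to recover [x'; y']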

13 VL_FEAT Implementation
VL_FEAT has an implementation: see http://www.vlfeat.org/

14 Image Formation - Color Space
RGB color space: 8-bit Red, Green, Blue channels, mixed together.
YUV/YCbCr color space.
HSV color space: Hue (color in polar coordinates), Saturation, Value.

15 Color Histogram - Fixed Codebook
Color histogram: a fixed codebook generates the histogram.
Example: MPEG-7 Scalable Color Descriptor
o HSV color space, 16×4×4 partition for bins
o Scalability through quantization and the Haar transform
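
A sketch of the 16×4×4 HSV histogram (256 bins) for an RGB image im; this mirrors the SCD binning but leaves out the Haar/quantization scalability layers:

  hsv = rgb2hsv(im2double(im));
  h = min(floor(hsv(:,:,1) * 16), 15);   % hue -> 16 bins
  s = min(floor(hsv(:,:,2) * 4), 3);     % saturation -> 4 bins
  v = min(floor(hsv(:,:,3) * 4), 3);     % value -> 4 bins
  bin = h*16 + s*4 + v + 1;              % combined bin index, 1..256
  chist = accumarray(bin(:), 1, [256 1]);
  chist = chist / sum(chist);            % normalize to a pmf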

16 Color Histogram - Adaptive Codebook
Adaptive color histogram: no fixed codebook.
Example: MPEG-7 Dominant Color Descriptor
o Needs its own distance metric
o Not index-able

17 Image Filters
Basic filter operation: 2D convolution g = f * h with a 3×3 averaging kernel,
h = (1/9) [1 1 1; 1 1 1; 1 1 1]
The 3×3 neighborhood is the region of support.
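
In MATLAB this averaging filter is two lines, assuming a grayscale image f:

  h = ones(3) / 9;                      % 3x3 region of support, weights 1/9
  g = conv2(im2double(f), h, 'same');   % g = f * h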

18 Filter Properties
Linear filters are:
o Shift-invariant: if f*h = g, then f(m−k, n−j)*h = g(m−k, n−j)
o Associative: (f*h_1)*h_2 = f*(h_1*h_2); this can save a lot of complexity
o Distributive: f*h_1 + f*h_2 = f*(h_1+h_2); useful in SIFT's DoG filtering

19 Edge Detection
The need for smoothing: differentiation amplifies noise, so smooth (e.g. with a Gaussian) before differentiating.
Combine differentiation and Gaussian smoothing into a single filter, the derivative of Gaussian:
d/dx (g * f) = (dg/dx) * f

20 Image Gradient from DoG Filtering
Image gradient (at a scale): ∇f = (∂f/∂x, ∂f/∂y)
The gradient points in the direction of most rapid increase in intensity.
The edge strength is given by the gradient magnitude: ||∇f|| = sqrt((∂f/∂x)² + (∂f/∂y)²)
The gradient direction is given by: θ = tan⁻¹((∂f/∂y) / (∂f/∂x))
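
A sketch of gradient computation at a scale for a grayscale double image f; by the filter properties above, smoothing and then differencing is equivalent to filtering with a derivative of Gaussian:

  sigma = 2;
  g = fspecial('gaussian', 2*ceil(3*sigma)+1, sigma);
  fs = conv2(f, g, 'same');          % Gaussian smoothing at scale sigma
  [fx, fy] = gradient(fs);           % derivative-of-Gaussian responses
  mag = sqrt(fx.^2 + fy.^2);         % edge strength
  theta = atan2(fy, fx);             % gradient direction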

21 HOG - Histogram of Oriented Gradients
Generate a local histogram per block: each cell votes a 9-bin orientation histogram (binary or magnitude-weighted voting), and each block concatenates the 4 cell histograms.
Size of HoG: n·m·4·9 = 36nm, for n × m cells:
f = [h_1,...,h_9, h_1,...,h_9, h_1,...,h_9, h_1,...,h_9]  (4 × 9 per block)
[Figure: angle and magnitude maps; HOG with cell size 5; binary vs. magnitude voting]

22 Harris Detector
Smoothed 2nd moment matrix:
μ(σ_I, σ_D) = g(σ_I) * [ I_x²(σ_D)  I_xI_y(σ_D) ; I_xI_y(σ_D)  I_y²(σ_D) ]
1. Image derivatives I_x, I_y (optionally blur first)
2. Squares of derivatives: I_x², I_y², I_xI_y
3. Gaussian filter g(σ_I): g(I_x²), g(I_y²), g(I_xI_y)
4. Harris function detects corners, with no need to compute the eigenvalues:
har = det[μ(σ_I, σ_D)] − α [trace(μ(σ_I, σ_D))]² = g(I_x²) g(I_y²) − [g(I_xI_y)]² − α [g(I_x²) + g(I_y²)]²

23 Harris Detector Steps
Compute the response function R(x, y) → threshold → keep local maxima.
The Harris detector is NOT scale invariant.
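
A compact Harris sketch following the four steps above, for a grayscale double image f; sigma_d, sigma_i, and alpha are typical but hypothetical settings:

  sigma_d = 1; sigma_i = 2; alpha = 0.05;
  fs = conv2(f, fspecial('gaussian', 7, sigma_d), 'same');   % 1. derivatives
  [Ix, Iy] = gradient(fs);
  gi = fspecial('gaussian', 13, sigma_i);
  Sxx = conv2(Ix.^2,  gi, 'same');                           % 2.-3. squares,
  Syy = conv2(Iy.^2,  gi, 'same');                           %      then g(sigma_i)
  Sxy = conv2(Ix.*Iy, gi, 'same');
  R = (Sxx.*Syy - Sxy.^2) - alpha*(Sxx + Syy).^2;            % 4. Harris function
  corners = (R > 0.01*max(R(:))) & (R == imdilate(R, ones(3)));  % threshold + local maxima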

24 Scale Space Theory - Lindeberg
Scale-space response via the Laplacian of Gaussian (LoG); the scale is controlled by σ:
∇²g = ∂²g/∂x² + ∂²g/∂y²,  with g = e^(−(x²+y²)/(2σ²))
Characteristic scale: the σ at which the LoG response of an image structure of radius r peaks.
[Figure: LoG responses on an image blob of radius r at σ = 0.8r, 1.2r, 2r; the peak marks the characteristic scale]

25 SIFT Detector via Difference of Gaussian
However, LoG computation is expensive: the LoG is a non-separable filter.
SIFT uses Difference of Gaussian (DoG) filters to approximate the LoG; Gaussian filters are separable, so this is much faster. Box-filter approximation: you will see this later.
LoG scale-space construction: Gaussian filtering at successive scales, followed by image differencing.

26 Peak Strength & Edge Removal
Peak strength: interpolate the true DoG response and pixel location via a Taylor expansion.
Edge removal: re-do a Harris-type detection to remove edge responses, on a much-reduced pixel set.

27 Rotation Invariance thru Dominant Orientation Coding
Vote for the dominant gradient orientation around the keypoint.
Weighted by a Gaussian window to give more emphasis to the gradients closer to the center.

28 SIFT Description
Rotate to the dominant direction, then divide the image patch (16×16 pixels) around the SIFT point into 4×4 cells.
For each cell, compute an 8-bin edge-orientation histogram over the 0 to 2π angle range.
Overall, we have a 4×4×8 = 128-dimensional descriptor.
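
Computing these descriptors with VLFeat; the dsift option of HW-2 would call vl_dsift instead (the file name here is hypothetical):

  I = im2single(rgb2gray(imread('example.jpg')));
  [frames, descr] = vl_sift(I);   % frames: 4 x K [x; y; scale; orientation]
                                  % descr: 128 x K uint8 SIFT descriptors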

29 SIFT Matching and Repeatability Prediction
SIFT distance "true love" test (ratio test): accept a match only if the distance to the nearest descriptor is well below the distance to the second nearest, d(s¹_1, s²_k1) < θ · d(s¹_1, s²_k2).
Not all SIFT points are created equal:
o Peak strength (DoG response at the interpolated position)
o Combined scale/peak-strength pmf
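
VLFeat's vl_ubcmatch implements exactly this ratio test: a pair is kept only when the second-best distance exceeds the best-match distance by the given factor (1.5 here):

  [matches, scores] = vl_ubcmatch(d1, d2, 1.5);   % d1, d2: 128 x K descriptors
  % matches(1,:) indexes descriptors in image 1, matches(2,:) in image 2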

30 Outline
HW-2
Quiz-1
Review of Part I: Image Features & Classifiers
o Image Formation: Homography
o Color Space & Color Features
o Filtering and Edge Features
o Harris Detector
o SIFT
o Aggregation: BoW, VLAD, Fisher Vector, and Supervector
o Image Retrieval System Metrics: TPR/Precision, FPR, Recall, mAP
o Classification: kNN, GMM Bayesian, SVM

31 Why Aggregation?
Curse of dimensionality + decision boundary / indexing: aggregation turns a variable-size set of local descriptors into one fixed-length vector, which keeps the dimensionality manageable for decision boundaries and makes indexing possible.

32 Histogram Aggregation: Bag-of-Words
Codebook: in feature space R^d, run k-means to get k centroids {μ_1, μ_2, ..., μ_k}.
BoW hard encoding: for n feature points {x_1, x_2, ..., x_n}, the assignment matrix is k×n, with only one non-zero entry per column.
Aggregated dimension: k.
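
A BoW hard-encoding sketch, assuming descriptors X (kd x n, single) and the k-means centroids km (kd x k) from before:

  kdtree = vl_kdtreebuild(km);
  nn = vl_kdtreequery(kdtree, km, X);          % nearest codeword per point
  k = size(km, 2);
  bow = accumarray(double(nn(:)), 1, [k 1]);   % k-bin histogram
  bow = bow / sum(bow);                        % normalize by n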

33 Histogram Aggregation: Kernel Codebook Soft Encoding
Kernel affinity: K(x_j, μ_k) = e^(−κ ||x_j − μ_k||²)
Assignment matrix: A_{j,k} = K(x_j, μ_k) / Σ_k K(x_j, μ_k)
Encoding, k-dimensional: X(k) = (1/n) Σ_j A_{j,k}

34 VLAD - Vector of Locally Aggregated Descriptors
Aggregate feature differences from the codebook:
o Hard assignment: assign each feature x_j to its nearest codeword NN(x_j) in {μ_k}
o Compute aggregated differences: v_k = Σ_{j : NN(x_j) = μ_k} (x_j − μ_k)
o L2 normalize: v_k = v_k / ||v_k||_2
o Final feature dimension: k × d
[Figure: ① assign descriptors to cells, ② compute residuals x − μ_i, ③ v_i = sum of residuals in cell i; stacked blocks v_1 v_2 v_3 v_4 v_5]

35 Fisher Vector Aggregation
FV encoding: gradient of the GMM log-likelihood w.r.t. the mean (and variance) of component k, elementwise over dimensions j = 1..D:
u_k = (1/(N√π_k)) Σ_t q_{t,k} (x_t − μ_k)/σ_k,   v_k = (1/(N√(2π_k))) Σ_t q_{t,k} [((x_t − μ_k)/σ_k)² − 1]
In the end, we have a 2K × D aggregation of the derivatives w.r.t. the means and variances: FV = [u_1, u_2, ..., u_K, v_1, v_2, ..., v_K]^T
Richer characterization of the codebook with a GMM: {μ_k, Σ_k, π_k}
Normalize by the Fisher kernel.

36 Supervector Aggregation
Assume we have a UBM GMM model λ_UBM = {P_k, μ_k, Σ_k}, with identical priors and covariances across the adapted models.
Then for two utterance samples a and b, with GMM models λᵃ = {P_k, μ_kᵃ, Σ_k} and λᵇ = {P_k, μ_kᵇ, Σ_k}, the SV distance is
K(λᵃ, λᵇ) = Σ_k ( √P_k Σ_k^(−1/2) μ_kᵃ )ᵀ ( √P_k Σ_k^(−1/2) μ_kᵇ )
It means the means of the two models are normalized by the Mahalanobis distance metric induced by the UBM covariance.
This is also a linear kernel function, scaled by the UBM covariances.

37 AKULA - Adaptive KLUster Aggregation
AKULA descriptor: no fixed codebook!
o Outer loop: subspace optimization
o Inner loop: cluster centroids + SIFT counts
A_1 = {yc¹_1, yc¹_2, ..., yc¹_k; pc¹_1, pc¹_2, ..., pc¹_k}
A_2 = {yc²_1, yc²_2, ..., yc²_k; pc²_1, pc²_2, ..., pc²_k}
Distance metric: minimum centroid distance, weighted by SIFT count (a kind of manifold tangent distance):
d(A_1, A_2) = (1/k) Σ_{j=1..k} d¹_min(j) w¹_min(j) + (1/k) Σ_{i=1..k} d²_min(i) w²_min(i)
where d¹_min(j) = min_i d(j,i), w¹_min(j) = w(j, i*) with i* = argmin_i d(j,i), and d²_min(i), w²_min(i) are defined symmetrically over j.

38 Image Retrieval Performance Metrics
o True Positive Rate: TPR = TP/(TP+FN)
o False Positive Rate: FPR = FP/(FP+TN)
o Precision = TP/(TP+FP)
o Recall = TP/(TP+FN)
o F-measure = 2·(precision·recall)/(precision + recall)
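
From raw counts these metrics are one-liners; here y are the true binary labels and yhat the predictions (hypothetical variable names):

  TP = sum(yhat == 1 & y == 1);  FP = sum(yhat == 1 & y == 0);
  FN = sum(yhat == 0 & y == 1);  TN = sum(yhat == 0 & y == 0);
  TPR = TP/(TP+FN);  FPR = FP/(FP+TN);
  precision = TP/(TP+FP);  recall = TP/(TP+FN);
  F = 2*precision*recall/(precision + recall);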

39 Retrieval Performance: Mean Average Precision
mAP measures the retrieval performance across all queries.

40 mAP Example
mAP is computed across all queries as the mean of the per-query average precision, i.e. the precision averaged over the recall levels.

41 Compute TPR-FPR
ROC curve (Receiver Operating Characteristic):
o Confusion/distance matrix: d(j, k)
o Ground truth: m(j, k) = 1 if doc j and doc k are a match, 0 if not
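
A sketch of the ROC sweep over the distance matrix: declare a match whenever d(j,k) <= t, and trace TPR against FPR as the threshold t varies:

  dv = d(:);  mv = (m(:) == 1);
  ts = linspace(min(dv), max(dv), 100);          % 100 operating points
  tpr = zeros(size(ts));  fpr = zeros(size(ts));
  for i = 1:numel(ts)
      pred = dv <= ts(i);
      tpr(i) = sum(pred & mv)  / sum(mv);        % TP/(TP+FN)
      fpr(i) = sum(pred & ~mv) / sum(~mv);       % FP/(FP+TN)
  end
  plot(fpr, tpr); xlabel('FPR'); ylabel('TPR');  % ROC curve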

42 Classification in Image Retrieval
A typical image retrieval pipeline: Image Formation → Feature Computing → Feature Aggregation → Classification, backed by a Knowledge/Data Base.

43 kNN Classifier
kNN is a nonlinear classifier. Operation: majority vote among the k nearest neighbors.
Performance bound: asymptotically, the NN classifier's error rate is no worse than 2 times the Bayesian error rate. As k increases it improves, but not by much.
Accelerating the NN retrieval operation:
o Kd-tree: a space-partitioning scheme that limits the number of data-point comparisons
o Kd-tree supported kNN search: approximate kNN search
o Kd-forest supported approximate search
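
A majority-vote kNN sketch with knnsearch (Statistics Toolbox), which builds a kd-tree internally for low-dimensional data; x and y are the training features and labels, q the query points:

  k = 5;
  idx = knnsearch(x, q, 'K', k);     % m x k neighbor indices (rows = points)
  rec_label = mode(y(idx), 2);       % majority vote over the k neighbors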

44 Gaussian Generative Model Classifier
The decision boundary is shaped by the relative shapes of the class covariance matrices.
Matlab: [rec_label, err] = classify(q, x, y, method); with method = 'quadratic'

45 Why Is It Called SVM? The Support Vector Solution
The decision function depends on the training data only through the support vectors: f(x) = Σ_i α_i y_i K(x_i, x) + b, where only the support vectors have α_i > 0.

46 Support Vector Machine
Matlab:
  % train svm
  svm = svmtrain(x, y, 'showplot', true);
  % svm classification: click points, classify each
  for k = 1:5
      figure(41); hold on; grid on;
      axis([ ]);                   % axis limits omitted
      [u, v] = ginput(1);
      q = [u, v];
      plot(u, v, '*b');
      rec = svmclassify(svm, q);
      fprintf('\n recognized as %c', rec);
  end

47 Summary
Image Analysis & Retrieval, Part I:
o Image formation
o Color and texture features
o Interest point detection: Harris detector; invariant interest point detection: SIFT
o Aggregation
o Classification tools: kNN, GMM, SVM
o Performance metrics: TP, FP, TN, FN, TPR/precision, FPR, recall, mAP
Part II: the holistic approach: just throw the pixels at the algorithm and let it figure out what the good features are.
o Subspace approaches: Eigenface, Fisherface, Laplacian embedding
o Deep learning
