Affine-invariant Shape Recognition Using Grassmann Manifold


Vol. 38, No. 2                         ACTA AUTOMATICA SINICA                         February, 2012

LIU Yun-Peng 1,2,3,4    LI Guang-Wei 5    SHI Ze-Lin 1,2,3

Abstract   Traditional Kendall shape space theory applies only to similarity transforms. However, in most situations the geometric transform that an object undergoes during imaging is better represented by an affine transform. We analyze the nonlinear geometric structure of the affine-invariant shape space and propose an affine-invariant shape recognition algorithm based on Grassmann manifold geometry. First, we compute the mean shape and covariance of every shape class in the training set. Then, we construct a normal probability model for each class on the tangent space at its mean shape. Finally, we assign a measured object to the maximum-likelihood class under the learned shape models. We apply the proposed algorithm to shapes from a standard shape dataset and from real images. Experimental results on the MPEG-7 shape dataset show that the proposed algorithm outperforms the algorithm based on the Procrustean metric of traditional Kendall shape space theory. Experimental results on real images further show that the proposed algorithm is more robust to affine transforms than the Procrustean-metric-based algorithm and recognizes object classes with higher posterior probability.

Key words   Shape recognition, Grassmann manifold, affine invariant, shape space, mean shapes

DOI   10.3724/SP.J.1004.2012.00248

Manuscript received June 11, 2010; accepted October 13, 2010. Supported by National Natural Science Foundation of China (60603097) and National Defense Innovation Foundation of Chinese Academy of Sciences (CXJJ-65). Recommended by Associate Editor HU Zhan-Yi.
1. Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016   2. Key Laboratory of Opto-Electronic Information Processing, Chinese Academy of Sciences, Shenyang 110016   3. Key Laboratory of Image Understanding and Computer Vision, Liaoning Province, Shenyang 110016   4. Graduate University of Chinese Academy of Sciences, Beijing 100049   5. Department of Management Science and Engineering, Qingdao University, Qingdao 266071

Shape is one of the most important cues for recognizing an object in an image, and shape analysis has long been an active topic in computer vision and pattern recognition. Interest remains strong: in April 2010, IEEE Transactions on Pattern Analysis and Machine Intelligence published a special section on shape analysis and its applications in image understanding [1]. Among the many shape representations that have been proposed, landmark-based shape space theory, founded by Kendall [2], describes a shape by the configuration of a finite set of points modulo a group of geometric transforms, and it is the line of work this paper builds on.

In Kendall's theory, a shape is what remains of a point configuration after translation, scale, and rotation have been removed, and the resulting shape space carries the Procrustean metric. A number of recognition methods have been developed within this framework: Zhang et al. [3] studied object representation and recognition in shape spaces; Huckemann et al. [4] developed an intrinsic MANOVA for Riemannian manifolds and applied it to Kendall's space of planar shapes; Han et al. [5] used Procrustean-type analysis to recognize multiple configurations of objects from limited data. These methods, however, remove only similarity transforms, whereas the geometric distortion of an object contour produced by the imaging process is, in most situations, better approximated by an affine transform [6].

Other lines of shape analysis exist as well. In the pattern-theoretic framework represented by Grenander [7], shapes are modeled as orbits of templates under groups of deformations and statistical inference is carried out on those deformations; Fletcher and Whitaker [8] studied Riemannian metrics on the space of solid shapes; Klassen, Srivastava et al. [9-10] analyzed planar shapes using geodesic paths on shape spaces and developed clustering, learning, and testing tools for statistical shape analysis. Active shape models [11] learn a statistical point distribution model from training contours in a linear (Euclidean) setting. Further representations include invariant descriptors such as the interior angle chain [12] and the contour-based recognition method of Chen et al. [13]; Mumford [14] discusses how well such mathematical theories of shape model perception.

This paper extends landmark-based shape space theory from similarity transforms to affine transforms. We analyze the nonlinear geometric structure of the affine-invariant shape space, identify it with a Grassmann manifold, learn a normal probability model for each shape class on the tangent space at its Karcher mean, and recognize an observed shape by maximum likelihood. The proposed algorithm is compared with the Procrustean-metric-based algorithm on the MPEG-7 shape dataset and on real images. The remainder of the paper is organized as follows: Section 1 formulates the problem, Section 2 describes the affine shape space, Section 3 presents the recognition algorithm based on the Grassmann manifold, Section 4 reports the experiments, and the final section concludes the paper.

1 Problem description

1.1 Shape representation

An object contour is represented by n landmark points {z_1, z_2, ..., z_n}, where z_i = (x_i, y_i) are the image coordinates of the i-th point. Stacking the points row by row gives an n x p matrix X (p = 2), called the shape matrix. The points are assumed not to be collinear, so rank(X) = p. Fig. 1 shows two example shapes from the MPEG-7 dataset [15], sampled with (a) n = 60 and (b) n = 20 points.

Fig. 1  Shape representations ((a) n = 60; (b) n = 20)

Two shapes X_1 = {u_1, ..., u_n} and X_2 = {v_1, ..., v_n} are affine equivalent if their corresponding points satisfy

    u_i^T = A v_i^T + b,   i = 1, ..., n                                     (1)

where A is a 2 x 2 nonsingular matrix and b is a 2 x 1 translation vector.
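To make the representation of Section 1.1 concrete, the following sketch builds the n x 2 shape matrix and applies the affine model of Eq. (1). The paper's experiments were implemented in Matlab; this illustrative Python/NumPy code, and every name in it, is ours rather than the authors'.

import numpy as np

def affine_transform(shape, A, b):
    """Apply the affine model of Eq. (1): u_i^T = A v_i^T + b.

    shape : (n, 2) array of landmarks v_1, ..., v_n
    A     : (2, 2) nonsingular matrix
    b     : (2,)   translation vector
    """
    return shape @ A.T + b

rng = np.random.default_rng(0)
V = rng.standard_normal((20, 2))            # an n = 20 point shape matrix X (n x p, p = 2)
A = np.array([[1.2, 0.4], [-0.3, 0.9]])     # 2 x 2 nonsingular matrix
b = np.array([5.0, -2.0])                   # translation vector
U = affine_transform(V, A, b)               # a shape affine-equivalent to V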

1.2 Bayesian classification framework

Suppose there are N shape classes and that class C_i (i = 1, ..., N) follows a normal model N(µ_i, Σ_i) with mean shape µ_i and covariance Σ_i. Let P(C_i) denote the prior probability of class C_i. Given an observed shape X, the Bayes rule assigns it to the class

    Ĉ = arg max_{C_i} P(C_i | X) = arg max_{C_i} P(X | C_i) P(C_i) = arg max_{C_i} P(X | C_i)        (2)

where the last equality holds because the class priors are assumed equal. Since the shape space is a nonlinear manifold rather than a Euclidean space, the class means and covariances in (2) cannot be computed with ordinary vector-space statistics; the required geometric tools are developed in the following sections.

2 Affine shape space

2.1 Grassmann manifold

The Grassmann manifold is a classical object of differential geometry [16-17]. Its geometry underlies optimization algorithms with orthogonality constraints [18] and subspace learning methods [19], and it has been applied to non-coherent multiple-antenna communications [20], interpolation of reduced-order models in aeroelasticity [21], and nonlinear mean shift on Riemannian manifolds [22]. The notions needed here are summarized below; see [18, 23] for a systematic treatment.

Definition 1. The Grassmann manifold G(k, n) is the set of k-dimensional linear subspaces of the n-dimensional space R^n [24]. A subspace is represented by an n x k matrix Y with orthonormal columns spanning it, and two such matrices represent the same point of G(k, n) if they differ by a right orthogonal factor:

    [Y] = Y O_k = {Y V : V ∈ O_k}                                            (3)

where O_k is the group of k x k orthogonal matrices.

For a point p ∈ G(k, n), represented by an n x k orthonormal matrix also denoted p, the tangent space at p is

    T_p G(k, n) = {ω : ω = p_⊥ g,  g ∈ R^{(n-k) x k}}                        (4)

where p_⊥ is an n x (n-k) orthonormal basis of the orthogonal complement of p. The norm of a tangent vector is

    ||ω|| = (tr(ω^T ω))^{1/2}                                                (5)

Let γ : t → γ(t) be the geodesic with γ(0) = p and dγ/dt(0) = ω. The exponential map sends ω to the endpoint Exp_p(ω) = γ(1) of this geodesic:

    Exp_p(ω) = p V cos(θ) + U sin(θ)                                         (6)

where U θ V^T is the thin singular value decomposition of ω. Conversely, the logarithm map is log_p(q) = U θ V^T with θ = arctan(S) and U S V^T the singular value decomposition of p_⊥ p_⊥^T q (p^T q)^{-1}. The geodesic distance between two points p, q ∈ G(k, n) is [23]

    d_G(p, q) = (\sum_{i=1}^{k} θ_i^2)^{1/2} = ||θ||_2                       (7)

where θ_1, ..., θ_k are the principal angles between the subspaces p and q. Fig. 2 illustrates the principal angles and the Grassmann distance; Algorithm 1 summarizes its computation.

Fig. 2  Principal angles and Grassmann distance

Algorithm 1. Grassmann distance d(p, q)
Input: matrices Y_1 and Y_2 spanning the subspaces p and q. Output: the distance d(p, q).
1) Orthonormalize Y_1 and Y_2 (e.g., by QR decomposition) to obtain p = Q_1 and q = Q_2;
2) Compute the singular value decomposition U S V^T = SVD(Q_1^T Q_2);
3) Compute the principal angles θ = cos^{-1}(S);
4) Return d(p, q) = (\sum_{i=1}^{k} θ_i^2)^{1/2}.
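A minimal Python/NumPy sketch of Algorithm 1 (our illustration; the function name and the numerical clipping are ours):

import numpy as np

def grassmann_distance(Y1, Y2):
    """Geodesic distance on G(k, n) between the column spans of Y1 and Y2 (Algorithm 1, Eq. (7))."""
    Q1, _ = np.linalg.qr(Y1)                           # step 1): orthonormalize the bases
    Q2, _ = np.linalg.qr(Y2)
    s = np.linalg.svd(Q1.T @ Q2, compute_uv=False)     # step 2): singular values are the cosines of the principal angles
    theta = np.arccos(np.clip(s, -1.0, 1.0))           # step 3): principal angles (clip guards against rounding)
    return np.sqrt(np.sum(theta ** 2))                 # step 4): d(p, q) = ||theta||_2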

2.2 Affine shape space

Consider two shapes X_1 = {v_1, v_2, ..., v_n} and X_2 = {u_1, u_2, ..., u_n} related by the affine transform (1). Appending a coordinate equal to 1 to every point, the relation can be written as a single matrix product:

    V = [u_1 1; u_2 1; ... ; u_n 1] = [v_1 1; v_2 1; ... ; v_n 1] [A^T 0; b^T 1]                    (8)

where each row [u_i 1] is a 1 x 3 vector, so V is the n x 3 matrix formed by the two coordinate columns of the shape and the all-ones vector 1 = [1, 1, ..., 1]^T. Because the 3 x 3 factor in (8) is nonsingular, affine-equivalent shapes produce matrices with the same column space, a 3-dimensional subspace of R^n. The affine shape space can therefore be identified with the Grassmann manifold G(3, n): an affine shape is the subspace spanned by the columns of V.

Definition 2. Let C_n^2 denote the set of configurations of n points {u_1, u_2, ..., u_n} in R^2, and let A be the group of affine transforms acting on C_n^2. The affine shape space is the quotient C_n^2 / A of C_n^2 by this action; as illustrated in Fig. 3, for χ ∈ C_n^2 the orbit s(χ) ∈ C_n^2 / A is the affine shape of χ, and all configurations on the same orbit have the same affine shape.

Fig. 3  Orbits of C_n^2: affine invariant shape space

3 Shape recognition based on Grassmann manifold

3.1 Statistical modeling on the shape space

To build the class models required by (2) we need a mean and a covariance for a set of shapes. Because the shape space is a nonlinear manifold, they cannot be computed with Euclidean formulas; instead the mean is defined intrinsically and the covariance is defined on the tangent space at the mean.

Definition 3. The Karcher mean of points x_1, x_2, ..., x_m on a manifold M is

    µ = arg min_{x ∈ M} \sum_{i=1}^{m} d(x, x_i)^2                           (9)

where d(x, x_i) is the geodesic distance on M [24]. Following [24], the Karcher mean is computed by the iterative gradient procedure of Algorithm 2, specialized here to the Grassmann manifold.

Algorithm 2. Mean(x_1, x_2, ..., x_m)
Input: x_1, x_2, ..., x_m ∈ G(k, n). Output: the Karcher mean µ.
1) Initialize µ = x_1;
2) Compute A = (τ/m) \sum_{i=1}^{m} log_µ(x_i), where τ is a step size;
3) If ||A|| < ε, return µ; otherwise go to step 4);
4) Compute the singular value decomposition U Σ V^T = SVD(A), update µ = µ V cos(Σ) + U sin(Σ), and go to step 2).

Iterating until convergence yields the Karcher mean of the given shapes. The spread of a class around its mean is then described on the tangent space T_µ G. Let x_1, x_2, ..., x_m ∈ G(k, n) be the training shapes of one class and µ = Mean(x_1, ..., x_m). Mapping the shapes to the tangent space at µ gives the vectors v_i = log_µ(x_i), whose sample covariance is

    Σ = (1/m) \sum_{i=1}^{m} log_µ(x_i) log_µ(x_i)^T                         (10)

The class is modeled as a normal distribution N(µ, Σ) on the tangent space, with density

    ϕ(x) = (2π)^{-m/2} |Σ|^{-1/2} exp(-(1/2) log_µ(x) Σ^{-1} log_µ(x)^T)     (11)

where log_µ(x) is treated as a row vector. For the N shape classes, the learning procedure is summarized in Algorithm 3.

Algorithm 3. Learn Shape Model(x_1, x_2, ..., x_m)
Input: the training shapes x_1, x_2, ..., x_m ∈ G(k, n) of one class. Output: the shape model S of the class, consisting of the mean µ and the covariance Σ.
1) Compute the Karcher mean µ with Algorithm 2;
2) Compute the tangent-space covariance Σ = (1/m) \sum_{i=1}^{m} log_µ(x_i) log_µ(x_i)^T;
3) Compute the eigen decomposition {eigenvectors, eigenvalues} = eig(Σ); the leading eigenvectors give the principal modes of variation of the class.
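The embedding of Eq. (8) and the iteration of Algorithm 2 can be sketched as follows, reusing the exponential and logarithm maps of Section 2.1. This is our Python/NumPy illustration (the paper used Matlab); names such as shape_to_grassmann and karcher_mean, as well as the step size, tolerance, and re-orthonormalization, are our own choices.

import numpy as np

def shape_to_grassmann(shape):
    """Embed an (n, 2) point shape into G(3, n): the column span of [x, y, 1], cf. Eq. (8)."""
    n = shape.shape[0]
    V = np.hstack([shape, np.ones((n, 1))])      # n x 3 matrix with an appended ones column
    Q, _ = np.linalg.qr(V)                       # orthonormal basis of the column space
    return Q

def grassmann_log(p, q):
    """log_p(q): tangent vector at p pointing towards q (Section 2.1)."""
    n = p.shape[0]
    M = (np.eye(n) - p @ p.T) @ q @ np.linalg.inv(p.T @ q)
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.arctan(s)) @ Vt

def grassmann_exp(p, w):
    """Exp_p(w) = p V cos(theta) + U sin(theta), with U theta V^T the thin SVD of w, Eq. (6)."""
    U, s, Vt = np.linalg.svd(w, full_matrices=False)
    return p @ Vt.T @ np.diag(np.cos(s)) + U @ np.diag(np.sin(s))

def karcher_mean(points, tau=1.0, eps=1e-6, max_iter=100):
    """Algorithm 2: iterative Karcher mean of points x_1, ..., x_m on G(k, n)."""
    mu = points[0]                               # step 1): initialize with the first sample
    for _ in range(max_iter):
        A = tau / len(points) * sum(grassmann_log(mu, x) for x in points)   # step 2)
        if np.linalg.norm(A) < eps:              # step 3): converged
            break
        mu = grassmann_exp(mu, A)                # step 4): move along the mean tangent direction
        mu, _ = np.linalg.qr(mu)                 # keep an orthonormal representative
    return mu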

3.2 Shape recognition algorithm

Given an observed shape X (a contour extracted from an image), recognition amounts to evaluating, for every learned class model, the likelihood of X and selecting the class with the largest value, as in (2). The likelihood computation is summarized in Algorithm 4.

Algorithm 4. Shape likelihood(x, S, k)
Input: a test shape x, the shape model S of a class (mean µ and covariance Σ), and the dimension k of the Grassmann manifold G(k, n). Output: the likelihood f of x under S.
1) Map the test shape x into the affine shape space and then onto the tangent space at the class mean: log_µ(x);
2) Compute the Mahalanobis distance between x and the class S: log_µ(x) Σ^{-1} log_µ(x)^T;
3) Return the likelihood of x under the class model S:
   f = (2π)^{-m/2} |Σ|^{-1/2} exp(-(1/2) log_µ(x) Σ^{-1} log_µ(x)^T).
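Putting Algorithms 3 and 4 together, one class model is a (Karcher mean, tangent covariance) pair and classification picks the class with the largest tangent-space Gaussian likelihood. In this sketch (ours, not the authors' code) tangent vectors are flattened to row vectors, and a small regularization term plus a log-domain evaluation are added for numerical stability; the paper evaluates the density of Eq. (11) directly.

import numpy as np
# reuses shape_to_grassmann, grassmann_log, and karcher_mean from the previous sketch

def learn_shape_model(train_shapes, reg=1e-6):
    """Algorithm 3: Karcher mean and tangent-space covariance of one shape class."""
    points = [shape_to_grassmann(s) for s in train_shapes]
    mu = karcher_mean(points)
    T = np.stack([grassmann_log(mu, x).ravel() for x in points])   # m x (3n) flattened tangent vectors
    Sigma = T.T @ T / len(points) + reg * np.eye(T.shape[1])       # Eq. (10), with regularization
    return mu, Sigma

def shape_log_likelihood(shape, mu, Sigma):
    """Algorithm 4 in the log domain: tangent-space Gaussian log-density of a test shape, cf. Eq. (11)."""
    v = grassmann_log(mu, shape_to_grassmann(shape)).ravel()
    maha = v @ np.linalg.solve(Sigma, v)                           # step 2): Mahalanobis distance
    _, logdet = np.linalg.slogdet(Sigma)
    return -0.5 * (maha + logdet + v.size * np.log(2.0 * np.pi))   # step 3), taken in logs

def classify(shape, models):
    """Maximum-likelihood classification over the learned class models, Eq. (2)."""
    return int(np.argmax([shape_log_likelihood(shape, mu, Sigma) for mu, Sigma in models]))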

4 Experiments

To evaluate the proposed algorithm, experiments were carried out on the MPEG-7 shape dataset and on real images. The algorithm was implemented in Matlab and run on a Pentium dual-core 2.5 GHz PC with 2 GB of memory. The proposed Grassmann-manifold-based algorithm is compared with the recognition algorithm based on the Procrustean metric of Kendall shape space theory [2-5]. The MPEG-7 shape dataset [15] is a widely used benchmark for shape recognition and retrieval; it contains 70 object classes with 20 shapes per class. Because some classes depict 3D objects whose silhouettes change strongly with viewpoint, 16 classes of essentially planar objects were selected for the experiments.

4.1 Experiments on the MPEG-7 shape dataset

4.1.1 Mean shape experiments

For each class, 15 of the 20 shapes were randomly selected to learn the shape model and the remaining 5 were kept for testing; each contour was sampled with 150 points. Fig. 4 shows one class: (a) the training shapes, (b) the mean computed with the proposed Grassmann-based procedure, and (c) the mean computed with the Procrustean metric. Fig. 5 compares the Grassmann and Procrustean means of all 16 classes. For several classes (for example, Shapes 6 and 11) the Procrustean mean deviates visibly from the member shapes; Fig. 6 shows a zoomed comparison for Shape 11. Overall, the Grassmann mean preserves the structure of the class shapes better than the Procrustean mean.

Fig. 4  Results of mean shapes ((a) Trained shapes; (b) Grassmann mean; (c) Procrustean mean)

Fig. 5  Comparison of Grassmann mean and Procrustean mean ((a) Grassmann mean; (b) Procrustean mean)

Fig. 6  Comparison of zoomed shape means of Shape 11 ((a) Grassmann mean; (b) Procrustean mean)

4.1.2 Shape recognition experiments

The means of the 16 classes were computed first, and Algorithm 3 was then used to learn the statistical model of each class. For every class, 15 shapes were used for training and the remaining 5 as test shapes, which were classified with Algorithm 4. The classification performance was evaluated against the number of contour points, the noise level, and the amount of affine deformation.

1) Classification performance versus the number of points. The number of contour points n was varied from 10 to 150; each setting was repeated 10 times and the average of the 10 classification rates was recorded. Fig. 7 shows four test shapes and the posterior probabilities computed by Algorithm 4, and Fig. 8(a) compares the two algorithms. The classification rate of both algorithms increases with the number of points and becomes stable at about 100 points; with more than 25 points the Grassmann-based algorithm achieves a higher classification rate than the Procrustean-based algorithm, while with fewer than 25 points its rate is lower.

Fig. 7  Four shapes and Shapes (10)-(11)'s posterior probabilities

2) Classification performance versus noise. The number of contour points was fixed at 100. To evaluate robustness to noise, zero-mean Gaussian noise was added to the x and y coordinates of the shape points, with the signal-to-noise ratio varied from 40 dB to 17.5 dB; each noise level was repeated 10 times and the average of the 10 classification rates was recorded. Fig. 9 shows four shapes with 30 dB and 25 dB noise added, and Fig. 8(b) compares the two algorithms. The classification rate of both algorithms decreases as the noise grows; when the signal-to-noise ratio is above about 30 dB, the Grassmann-based algorithm clearly outperforms the Procrustean-based algorithm.

Fig. 8  Classification performance versus number of samples and noise ((a) Number of samples; (b) Noise)

Fig. 9  Four shapes with added noise (The top row shows the shapes with 30 dB noise; the bottom row shows the shapes with 25 dB noise)
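The paper does not spell out its exact SNR convention for the coordinate noise; the short sketch below assumes the SNR is measured against the power of the centered shape coordinates, which is only one plausible choice, and all names in it are ours.

import numpy as np

def add_noise_snr(shape, snr_db, rng=np.random.default_rng()):
    """Add zero-mean Gaussian noise to an (n, 2) shape at a prescribed SNR in dB (assumed convention)."""
    centered = shape - shape.mean(axis=0)
    signal_power = np.mean(centered ** 2)                    # average power of the coordinates
    noise_power = signal_power / (10.0 ** (snr_db / 10.0))   # SNR = 10 log10(P_signal / P_noise)
    return shape + rng.normal(scale=np.sqrt(noise_power), size=shape.shape)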

3) Classification performance versus affine deformation. The affine shape representation is by construction invariant to affine transforms, whereas the Procrustean metric removes only similarity transforms. To evaluate robustness to affine deformation, random affine transforms were applied to the test shapes and the deformed shapes were then classified. Following [25], a random affine transform is generated by perturbing reference points with Gaussian noise of a given standard deviation and fitting the affine map to the perturbed points; the deformation standard deviation was varied from 2 to 20, and each setting was repeated 10 times with the average of the 10 classification rates recorded. Fig. 10 shows four shapes deformed with standard deviations 16 and 20.

Fig. 10  Four affinely deformed shapes (The top row shows the original shapes; the middle row uses transform standard deviation 16; the bottom row uses transform standard deviation 20)

Fig. 11(a) shows the classification performance on the clean shapes as the affine deformation grows: the classification rate of the Procrustean-based algorithm drops as the deformation increases, while the rate of the Grassmann-based algorithm is essentially unaffected. The experiment was then repeated after adding 30 dB noise to the affinely deformed shapes, again varying the deformation standard deviation and averaging over 10 repetitions; the results are shown in Fig. 11(b). With noise the classification rates of both algorithms decrease somewhat, but as the affine deformation increases the Procrustean-based rate keeps falling whereas the Grassmann-based rate changes little. These results illustrate the advantage of the proposed algorithm under affine deformation.
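A sketch of the affine-deformation protocol: [25] generates random affine warps by perturbing reference points with Gaussian noise of standard deviation sigma and fitting the map to the perturbed points. The canonical triangle and the least-squares fit below are our assumptions, not the paper's exact settings; the resulting (A, b) can be applied with affine_transform from the earlier sketch.

import numpy as np

def random_affine(sigma, rng=np.random.default_rng()):
    """Draw a random affine map (A, b) by perturbing reference points with std sigma (cf. [25])."""
    ref = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])   # assumed canonical reference triangle
    dst = ref + rng.normal(scale=sigma, size=ref.shape)        # perturbed point locations
    # Solve dst_i^T = A ref_i^T + b for all i:  [ref, 1] [A^T; b^T] = dst
    X = np.hstack([ref, np.ones((3, 1))])
    P, *_ = np.linalg.lstsq(X, dst, rcond=None)                # 3 x 2 matrix [A^T; b^T]
    return P[:2].T, P[2]

# Example: deform a shape with deformation standard deviation 16, as in Fig. 10
# A, b = random_affine(16.0)
# deformed = affine_transform(shape, A, b)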

Fig. 11  Classification performance versus affine deformation ((a) Experiments on the original shapes; (b) Experiments on the shapes with 30 dB noise added)

4.2 Experiments on real images

4.2.1 Shape recognition in real images

The shape models are those learned from the MPEG-7 data in Section 4.1; the test shapes are object contours extracted from real images, so the recognition performance also depends on the quality of the contour extraction. Contour extraction is not the topic of this paper, and simple procedures were used. For some of the images in Fig. 12(a), the contour was extracted as follows: compute the image gradients I_x and I_y of the image I and form I_w = |I_x| + |I_y|; smooth I_w with a Gaussian filter of standard deviation 3; threshold the smoothed map to obtain the binary edge image shown in Fig. 12(b); and trace the object contour in the edge image. For the remaining images in Fig. 12(a), the object regions were segmented with the method of [26] and their boundaries were taken as the contours. Every contour was resampled to 100 points. Fig. 12(c) shows, for each extracted contour, the posterior probabilities of the shape classes and the recognized class. Although the viewpoints differ from those of the training shapes and the extracted contours are distorted and imperfect, the proposed algorithm assigns each of the four objects to its correct class, which shows that it can recognize objects in real scenes.

Fig. 12  Results of real scene images (The correct classes are: 8, 10, 4, 12)
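A sketch of the gradient-based contour extraction described above; the threshold value, the choice of the longest traced contour, and the uniform resampling are our assumptions (the paper does not give these details), and the helper relies on SciPy and scikit-image.

import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import measure

def extract_contour(image, threshold=0.2, n_points=100):
    """Gradient-based contour extraction of Section 4.2.1 (threshold and resampling are our choices)."""
    Ix, Iy = np.gradient(image.astype(float))         # image gradients
    Iw = np.abs(Ix) + np.abs(Iy)                       # I_w = |I_x| + |I_y|
    Iw = gaussian_filter(Iw, sigma=3)                  # smooth with a Gaussian of standard deviation 3
    edges = Iw > threshold * Iw.max()                  # threshold to a binary edge map
    contours = measure.find_contours(edges.astype(float), 0.5)
    contour = max(contours, key=len)                   # keep the longest traced contour
    idx = np.linspace(0, len(contour) - 1, n_points).astype(int)
    return contour[idx, ::-1]                          # (n_points, 2) array of (x, y) points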

4.2.2 Affine robustness on real images

When the distance between the camera and the object changes, the extracted object contour undergoes a deformation that is approximately affine. To evaluate the robustness of the algorithm to such deformations of real shapes, images of piers were taken at six relative distances, from the nearest (1x) to the farthest (6x), and from different views; Fig. 13 shows example images from near to far. One hundred images were collected for each distance, of which 80 randomly selected images were used for training and the remaining 20 for testing. The contours were extracted as in the previous subsection and resampled to 100 points, and both the Grassmann-based and the Procrustean-based algorithms were applied; the average posterior probability of the correct class over the 20 test shapes was taken as the recognition result at each distance.

Fig. 13  The different-view real images of the piers, from near to far

Fig. 14 compares the mean shapes learned at the different distances: at the 4x and 5x distances the Procrustean mean is already visibly distorted, and at the 6x distance it no longer resembles the pier shape. Fig. 15 shows the recognition results. At every distance the Grassmann-based algorithm recognizes the piers with a high posterior probability, including the 6x distance, whereas the posterior probabilities produced by the Procrustean-based algorithm are markedly lower, lying roughly between 0.45 and 0.61 (0.6057 at the 6x distance). As the extracted shapes deviate further from a similarity transform of the training shapes, Kendall's shape space theory, which accounts only for similarity transforms, becomes inadequate for shapes extracted from real images taken at different distances.

Fig. 14  Mean shapes of the piers at different distances

Fig. 15  Recognition results of the piers at different distances

These experiments show that the Grassmann-manifold-based recognition algorithm clearly outperforms the Procrustean-metric-based algorithm and can recognize affinely deformed object shapes.

5 Conclusions

We analyzed the nonlinear geometric structure of the affine-invariant shape space and identified it with a Grassmann manifold. On this basis, the mean shape and the covariance of a shape class are defined on the manifold and its tangent space, and observed shapes are recognized by maximum likelihood under the learned class models. Experiments on the MPEG-7 shape dataset and on real images show that the proposed recognition algorithm outperforms the algorithm based on the Procrustean metric of Kendall shape space theory. An issue for future work is that the normal model on the tangent space is only an approximation; probability models defined directly on the manifold, such as the Bingham distribution [27], could be considered, and further improvements to the efficiency and practical robustness of the algorithm are also left for future study.

References

[1] Srivastava A, Damon J N, Dryden I L, Jermyn I H. Guest editors' introduction to the special section on shape analysis and its applications in image understanding. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2010, 32(4): 577-578
[2] Kendall D G. Shape manifolds, procrustean metrics and complex projective spaces. Bulletin of the London Mathematical Society, 1984, 16(2): 81-121
[3] Zhang J, Zhang X, Krim H, Walter G G. Object representation and recognition in shape spaces. Pattern Recognition, 2003, 36(5): 1143-1154
[4] Huckemann S, Hotz T, Munk A. Intrinsic MANOVA for Riemannian manifolds with an application to Kendall's space of planar shapes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2010, 32(4): 593-603
[5] Han Y X, Wang B, Idesawa M, Shimai H. Recognition of multiple configurations of objects with limited data. Pattern Recognition, 2010, 43(4): 1467-1475

[6] Huttenlocher D P, Ullman S. Recognizing solid objects by alignment with an image. International Journal of Computer Vision, 1990, 5(2): 195-212
[7] Grenander U, Miller M I. Pattern Theory: From Representation to Inference. New York: Oxford University Press, 2007
[8] Fletcher P T, Whitaker R T. Riemannian metrics on the space of solid shapes. In: Proceedings of the International Workshop on Mathematical Foundations of Computational Anatomy. Copenhagen, Denmark: MICCAI, 2006. 1-11
[9] Klassen E, Srivastava A, Mio W, Joshi S. Analysis of planar shapes using geodesic paths on shape spaces. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2004, 26(3): 372-383
[10] Srivastava A, Joshi S, Mio W, Liu X W. Statistical shape analysis: clustering, learning and testing. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(4): 590-602
[11] Cootes T F, Taylor C J, Cooper D H, Graham J. Active shape models: their training and application. Computer Vision and Image Understanding, 1995, 61(1): 38-59
[12] Wang B, Chen Y Q. An invariant shape representation: interior angle chain. International Journal of Pattern Recognition and Artificial Intelligence, 2007, 21(3): 543-559
[13] Chen Xiao-Chun, Ye Mao-Dong, Ni Chen-Min. A method for shape recognition. Pattern Recognition and Artificial Intelligence, 2006, 19(6): 758-763 (in Chinese)
[14] Mumford D. Mathematical theories of shape: do they model perception? In: Proceedings of the Conference on Geometric Methods in Computer Vision. San Diego, USA: SPIE, 1991. 2-10
[15] Latecki L J, Lakämper R, Eckhardt U. Shape descriptors for non-rigid shapes with a single closed contour. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Hilton Head Island, USA: IEEE, 2000. 424-429
[16] Spivak M. A Comprehensive Introduction to Differential Geometry. Berkeley: Publish or Perish, 1979
[17] Berger M. A Panoramic View of Riemannian Geometry. Berlin: Springer, 2003
[18] Edelman A, Arias T A, Smith S T. The geometry of algorithms with orthogonality constraints. SIAM Journal on Matrix Analysis and Applications, 1999, 20(2): 303-353
[19] Lin D, Yan S, Tang X. Pursuing informative projection on Grassmann manifold. In: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition. New York, USA: IEEE, 2006. 1727-1734
[20] Zheng L, Tse D N C. Communication on the Grassmann manifold: a geometric approach to the noncoherent multiple-antenna channel. IEEE Transactions on Information Theory, 2002, 48(2): 359-383
[21] Amsallem D, Farhat C. An interpolation method for adapting reduced-order models and application to aeroelasticity. AIAA Journal, 2008, 46(7): 1803-1813
[22] Subbarao R, Meer P. Nonlinear mean shift over Riemannian manifolds. International Journal of Computer Vision, 2009, 84(1): 1-20
[23] Absil P A, Mahony R, Sepulchre R. Riemannian geometry of Grassmann manifolds with a view on algorithmic computation. Acta Applicandae Mathematicae, 2004, 80(2): 199-220
[24] Fletcher P T, Lu C, Pizer S M, Joshi S. Principal geodesic analysis for the study of nonlinear statistics of shape. IEEE Transactions on Medical Imaging, 2004, 23(8): 995-1005
[25] Baker S, Matthews I. Lucas-Kanade 20 years on: a unifying framework. International Journal of Computer Vision, 2004, 56(3): 221-255
[26] Marszalek M, Schmid C. Accurate object localization with shape masks. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Minnesota, USA: IEEE, 2007. 1-8
[27] Dryden I L, Mardia K V. Statistical Shape Analysis. New York: John Wiley and Sons, 1998

LIU Yun-Peng   Ph.D. candidate at Shenyang Institute of Automation, Chinese Academy of Sciences. His research interest covers object tracking and recognition. Corresponding author of this paper. E-mail: ypliu@sia.cn

LI Guang-Wei   Lecturer at Qingdao University. His research interest covers object tracking and recognition, and robust control. E-mail: liguangweispacetime@gmail.com

SHI Ze-Lin   Professor at Shenyang Institute of Automation, Chinese Academy of Sciences. His research interest covers image processing, pattern recognition, and intelligent control. E-mail: zlshi@sia.cn