On the Equivariance of the Orientation and the Tensor Field Representation. Klas Nordberg, Hans Knutsson, Gösta Granlund. LiTH-ISY-R-1530, 1993-09-08.

On the Equivariance of the Orientation and the Tensor Field Representation

Klas Nordberg, Hans Knutsson, Gösta Granlund
Computer Vision Laboratory, Department of Electrical Engineering
Linköping University, S-581 83 Linköping, SWEDEN
Phone: +46-3-8 6 34, Telefax: +46-3-3 85 6

Abstract

The tensor representation has proven a successful tool as a means to describe local multi-dimensional orientation. In this respect, the tensor representation is a map from the local orientation to a second order tensor. This paper investigates how variations of the orientation are mapped to variations of the tensor, thereby giving an explicit equivariance relation. The results may be used to design tensor based algorithms for extraction of image features defined in terms of local variations of the orientation, e.g. multi-dimensional curvature or circular symmetries. It is assumed that the variation of the local orientation can be described in terms of an orthogonal transformation group. Under this assumption a corresponding orthogonal transformation group, acting on the tensor, is constructed. Several correspondences between the two groups are demonstrated.

1 Introduction

The tensor representation for orientation was introduced by [Knutsson, 1989] as a tool for managing orientation representation of images with dimensionality greater than two. The representation may be employed for arbitrary dimensionality, even though theoretical investigations and practical implementations have been carried out only for images of dimensionality two, three and four, see [Knutsson et al., 1992a] and [Knutsson et al., 1992b]. The main idea is to let the eigensystem of a symmetric and positive semidefinite tensor, in practice corresponding to an $n \times n$ matrix, represent the orientation structure of a neighbourhood.

As a simple example, consider the case of two-dimensional images. The local orientation of an image neighbourhood is then represented by a tensor T. Due to its symmetry, the tensor can be decomposed as

$T = \lambda_1 \hat{e}_1 \hat{e}_1^T + \lambda_2 \hat{e}_2 \hat{e}_2^T$    (1)

where $\{\hat{e}_1, \hat{e}_2\}$ are orthonormal eigenvectors of T with corresponding eigenvalues $\{\lambda_1, \lambda_2\}$. The positive semidefiniteness of T implies that the eigenvalues can be ordered such that $\lambda_1 \geq \lambda_2 \geq 0$. The tensor representation suggested by [Knutsson, 1989] uses the eigenvalues of T to describe both the energy content and the orientation structure of the neighbourhood, and it uses the eigenvectors to describe the direction of the orientation. This is exemplified with the following three ideal cases.

- $\lambda_1 = \lambda_2 > 0$. The neighbourhood is isotropic, i.e. does not contain any oriented structure.
- $\lambda_1 > 0$, $\lambda_2 = 0$. The neighbourhood contains a dominant orientation which is perpendicular to $\hat{e}_1$.
- $\lambda_1 = \lambda_2 = 0$. The neighbourhood contains no energy.

The three-dimensional case is almost as simple. Here, the local orientation is represented by a $3 \times 3$ tensor T which is decomposed as

$T = \lambda_1 \hat{e}_1 \hat{e}_1^T + \lambda_2 \hat{e}_2 \hat{e}_2^T + \lambda_3 \hat{e}_3 \hat{e}_3^T$    (2)

Again, $\{\hat{e}_1, \hat{e}_2, \hat{e}_3\}$ are orthonormal eigenvectors of T with corresponding eigenvalues $\lambda_1 \geq \lambda_2 \geq \lambda_3 \geq 0$. The orientation representation is exemplified with the following four ideal cases.

- $\lambda_1 = \lambda_2 = \lambda_3 > 0$. The neighbourhood is isotropic.
- $\lambda_1 = \lambda_2 > 0$, $\lambda_3 = 0$. The neighbourhood contains iso-curves with one dominant orientation which is perpendicular to the plane spanned by $\hat{e}_1$ and $\hat{e}_2$.

- $\lambda_1 > 0$, $\lambda_2 = \lambda_3 = 0$. The neighbourhood contains iso-surfaces with one dominant orientation. The iso-surfaces are perpendicular to $\hat{e}_1$.
- $\lambda_1 = \lambda_2 = \lambda_3 = 0$. The neighbourhood contains no energy.

[Knutsson et al., 1992a] describes how tensors with the above characteristics may be constructed using the responses from quadrature filters.

In general, an arbitrary variation of the orientation structure will be reflected both in the eigenvalues and the eigenvectors of T. As an example, the variation may indicate a transition from a line to a plane structure in three dimensions. In the following, however, it is assumed that there is no variation in the eigenvalues of T, implying that the character of the orientation structure is constant and only the orientation changes. Furthermore, it is assumed that this variation can be described as some type of transformation, A, acting on the eigenvectors of the tensor, i.e. acting on the orientation.

As an example, consider a two-dimensional image containing a circle. An image neighbourhood on the circle will then contain a dominant orientation which is described by a vector $\hat{e}$ perpendicular to the circle segment and, hence, represented by a tensor $T = \hat{e}\hat{e}^T$. When moving along the circle, the dominant orientation will change with a speed determined by the radius of the circle. In fact, this variation can be described as a rotation of the vector $\hat{e}$. Consequently, the representation tensors along the circle will also vary somehow. The relation between the variation of the vectors and that of the tensors is, however, not apparent.

Generally, A may be of arbitrary type, but in the following section it is assumed to be a linear operator, corresponding to an $n \times n$ matrix. For instance, in the above example A would be represented by a rotation matrix. Let $\{\hat{e}_k\}$ denote the eigenvectors of T. According to the above, $\{\hat{e}_k\}$ will change between two image points, $x_0$ and $x$, as

$\{\hat{e}_k\}_x = A \{\hat{e}_k\}_{x_0}$    (3)

The representation tensor, T, is a function of the eigensystem, i.e.

$T = T(\{\hat{e}_k\})$    (4)

and will therefore transform according to

$T_x = T(A \{\hat{e}_k\}_{x_0})$    (5)

Since T is a linear combination of outer products of the eigenvectors, Equation (5) may be rewritten as

$T_x = A\, T_{x_0}\, A^T$    (6)

Though correct, this description of how T transforms is of little practical use, and it would be much more convenient to find an operator $\mathcal{A}$, corresponding to A, such that

$T_x = \mathcal{A}[\, T_{x_0} \,]$    (7)

The concept of equivariance was introduced by [Wilson and Knutsson, 1988] and [Wilson and Spann, 1988] in order to make a formal theory for feature representation. It implies that transformations of a feature are reflected in transformations of the representation. In view of the previous discussion, A and $\mathcal{A}$ form such a pair of transformations, called equivariance operators. The purpose of this paper is to establish a pair of equivariance operators for the tensor representation of orientation. The results may be used to design tensor based algorithms for extraction of features defined in terms of local variation of the orientation, e.g. curvature or circular symmetries.

2 Derivation of results

Let V and V* denote a vector space and its corresponding dual space, both of the type $R^n$ for some integer n. The vector space $V \otimes V^*$ is then the set of all linear maps from V to itself. For convenience, elements of V are referred to as vectors whereas elements of $V \otimes V^*$ are referred to as tensors. Let v be an arbitrary vector and define

$T = v\, v^*$    (8)

where the *-sign indicates transpose. This implies that T is a tensor.
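As a minimal numerical illustration of Equations (1), (6) and (8), the following sketch (not part of the original paper; the helper names `orientation_tensor` and `rotation_2d` are our own) builds the rank-one tensor of a single dominant orientation and checks that rotating the orientation vector by a matrix A conjugates the tensor by A:

```python
import numpy as np

def orientation_tensor(v):
    """Rank-one orientation tensor T = v v^T of a dominant orientation v, cf. Eq. (8)."""
    v = np.asarray(v, dtype=float)
    return np.outer(v, v)

def rotation_2d(angle):
    """Two-dimensional rotation matrix, playing the role of the operator A."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s], [s, c]])

e1 = np.array([1.0, 0.0])          # dominant orientation at the image point x0
T0 = orientation_tensor(e1)        # representation tensor at x0

A  = rotation_2d(0.3)              # orientation change between x0 and x, cf. Eq. (3)
T1 = orientation_tensor(A @ e1)    # tensor built from the rotated eigenvector

# Equation (6): the tensor transforms by conjugation with A.
assert np.allclose(T1, A @ T0 @ A.T)
```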
We will now consider the case where v is a function of a real variable x, defined as

$v = v(x) = e^{xH} v_0$    (9)

where H is an anti-Hermitian tensor. The exponential function is here defined in terms of the familiar Taylor series,

$e^{xH} = I + xH + \frac{x^2}{2!} H^2 + \dots$    (10)

valid for any tensor H. This function has the interesting property of mapping anti-Hermitian tensors to orthogonal tensors, see e.g. [Nordberg, 1992]. Hence, for $x \in R$, Equation (10) defines a continuous set of orthogonal tensors which in fact forms a group. The motivation for introducing the exponential function is that it provides a means to realize both rotations and other quite general orthogonal transformations.

It can be proved, see e.g. [Nordberg, 1992], that any $n \times n$ anti-Hermitian tensor, H, can be decomposed as

$H = \sum_{k=1}^{n} i\lambda_k\, f_k f_k^*$    (11)

where $\{f_k,\ k = 1, \dots, n\}$ are orthonormal eigenvectors of H with corresponding eigenvalues $i\lambda_k$, where $\lambda_k \in R$. It should be noted that in general the eigenvectors $f_k$ are complex, implying that the *-operation also must include complex conjugation. Furthermore, these vectors are not proper elements of V but rather of a complexified version thereof. For all cases of interest for this paper, H is real, which implies that its eigenvectors as well as their corresponding eigenvalues come in complex conjugate pairs. Furthermore, the real and imaginary parts of the eigenvectors are orthogonal, at least for eigenvectors with non-zero eigenvalues. Hence, each pair of complex conjugate eigenvectors, $f_k$ and $\bar{f}_k$, will define a two-dimensional subspace of V. The corresponding eigenvalues, $\pm i\lambda_k$, will then determine the relative speed by which $e^{xH}$ rotates the projection of v on the subspace. In practice, we are often interested in operators $e^{xH}$ which are periodic in the parameter x. If the period is normalized to $2\pi$, this implies that all $\lambda_k$ are integers. For more details on this subject see e.g. [Nordberg, 1992]. Hence, by choosing H appropriately, it is possible to construct an orthogonal operator which rotates the projections of v on arbitrary mutually orthogonal two-dimensional subspaces of V, the angular velocities relative to x being arbitrary integers.

Insertion of Equation (9) into the right hand side of Equation (8) gives

$T = T(x) = e^{xH}\, v_0 v_0^*\, e^{-xH}$    (12)

If v describes the orientation of an image neighbourhood, and Equation (9) describes how the vector changes when moving along some path in the image, then Equation (12) will describe how the orientation tensor changes along the same path. In the form presented by Equation (12), however, the variation of T with respect to x is quite obscure. The right hand side consists of the matrix product between a variable orthogonal tensor, the tensor $v_0 v_0^*$, and the transpose of the first orthogonal tensor. There is, however, another way of expressing this product. Let the parameter x have some fixed value $x_0$. The mapping $Q : V \otimes V^* \to V \otimes V^*$, defined as

$Q[\, X \,] = e^{x_0 H}\, X\, e^{-x_0 H}$    (13)

is then a linear map. Allowing x to vary implies that Q is a function of x, which suggests the notation Q(x) instead of Q. The introduction of the map Q(x) means that the right hand side of Equation (12) can be rewritten as

$T(x) = Q(x)[\, v_0 v_0^* \,]$    (14)

Hence, the tensor T is the image of $v_0 v_0^*$ under the map Q(x). We will now investigate the structure of Q. A scalar product on $V \otimes V^*$ is defined by the function $s : V \otimes V^* \times V \otimes V^* \to R$, where

$s(X, Y) = \mathrm{trace}[\, X^* Y \,]$    (15)

This gives

$s(Q(x)X, Q(x)Y) = \mathrm{trace}[\, (Q(x)X)^*\, Q(x)Y \,] = \mathrm{trace}[\, (e^{xH} X e^{-xH})^*\, e^{xH} Y e^{-xH} \,] = \mathrm{trace}[\, e^{xH} X^* e^{-xH}\, e^{xH} Y e^{-xH} \,] = \mathrm{trace}[\, e^{-xH} e^{xH} X^*\, e^{-xH} e^{xH} Y \,] = \mathrm{trace}[\, X^* Y \,] = s(X, Y)$    (16)

which implies that Q(x) is an orthogonal transformation on $V \otimes V^*$ for all $x \in R$. Evidently, $Q(x)Q(y) = Q(x + y)$, which implies that the set of all Q forms a group under composition of transformations. Hence, if v is subject to an orthogonal transformation group, then T is subject to an orthogonal transformation group as well.
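The following sketch (ours, not the paper's; it assumes SciPy's `expm` for the matrix exponential) checks the statements above numerically for a random real anti-Hermitian H: $e^{xH}$ is orthogonal, the eigenvalues of H are purely imaginary, and Q(x) preserves the scalar product of Equation (15) and satisfies the group property:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n = 3
M = rng.standard_normal((n, n))
H = M - M.T                              # real anti-Hermitian (antisymmetric) tensor

x = 0.7
R = expm(x * H)                          # Eq. (10): e^{xH}
assert np.allclose(R @ R.T, np.eye(n))   # e^{xH} is orthogonal

# Eq. (11): the eigenvalues of H are purely imaginary, i*lambda_k.
assert np.allclose(np.linalg.eigvals(H).real, 0.0)

def Q(x, X):
    """Q(x)[X] = e^{xH} X e^{-xH}, Eq. (13)."""
    return expm(x * H) @ X @ expm(-x * H)

s = lambda X, Y: np.trace(X.T @ Y)       # scalar product on V (x) V*, Eq. (15)

X = rng.standard_normal((n, n))
Y = rng.standard_normal((n, n))

# Eq. (16): Q(x) is orthogonal with respect to s, and Q(x)Q(y) = Q(x+y).
assert np.isclose(s(Q(x, X), Q(x, Y)), s(X, Y))
assert np.allclose(Q(0.2, Q(0.5, X)), Q(0.7, X))
```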
Let H be a fixed anti-Hermitian tensor. A linear map $\mathcal{H} : V \otimes V^* \to V \otimes V^*$ is then defined by

$\mathcal{H}[\, X \,] = H X - X H$    (17)
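A concrete way to realize the map of Equation (17) (our sketch, using the standard vec/Kronecker identity rather than anything from the paper) is as an $n^2 \times n^2$ matrix acting on the column-stacked elements of X:

```python
import numpy as np

def commutator_superoperator(H):
    """Matrix of the map X -> H X - X H of Eq. (17), acting on vec(X).

    Uses the standard identity vec(A X B) = (B^T kron A) vec(X) for
    column-major vec, so vec(H X - X H) = (I kron H - H^T kron I) vec(X).
    """
    n = H.shape[0]
    I = np.eye(n)
    return np.kron(I, H) - np.kron(H.T, I)

# Quick check against a direct evaluation of Eq. (17).
rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3))
H = M - M.T
X = rng.standard_normal((3, 3))
calH = commutator_superoperator(H)
assert np.allclose(calH @ X.flatten(order='F'),
                   (H @ X - X @ H).flatten(order='F'))
```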

Using the notation $H^0 = I$, where I is the identity tensor, the following equation, which is easily proved by induction, gives an explicit form for repeated applications of $\mathcal{H}$ on X:

$\mathcal{H}^k[\, X \,] = \sum_{l=0}^{k} \binom{k}{l}\, H^{k-l} X (-H)^l$    (18)

This equation is valid for all integers $k \geq 0$, using the convention that $\mathcal{H}^0 = \mathcal{I}$, where $\mathcal{I}$ is the identity map on $V \otimes V^*$. With this result at hand, insertion of Equation (10) into Equation (13) gives

$Q(x)[\, X \,] = \left[ \sum_k \frac{x^k}{k!} H^k \right] X \left[ \sum_l \frac{(-x)^l}{l!} H^l \right] = \sum_k \sum_l \frac{x^{k+l}}{k!\, l!} H^k X (-H)^l = \sum_k \sum_{l=0}^{k} \frac{x^k}{(k-l)!\, l!} H^{k-l} X (-H)^l = \sum_k \frac{x^k}{k!} \sum_{l=0}^{k} \binom{k}{l} H^{k-l} X (-H)^l = \sum_k \frac{x^k}{k!} \mathcal{H}^k[\, X \,] = e^{x\mathcal{H}}[\, X \,]$    (19)

Inserted into Equation (14) this gives

$T(x) = e^{x\mathcal{H}}[\, v_0 v_0^* \,]$    (20)

Hence, if the vector v is transformed by the operator $e^{xH}$, then the tensor T is transformed by the operator $e^{x\mathcal{H}}$, where $\mathcal{H}$ is defined by Equation (17). We have now established a correspondence between the transformations of the vector v and of the tensor T, given by Equations (9), (17) and (20). In this form, however, the correspondence is quite implicit and does not reveal any interesting properties. By examining the eigensystems of H and $\mathcal{H}$ and how they are related, much more information can be obtained.

The eigensystem of H has already been treated in the text accompanying Equation (11). When considering the eigensystem of $\mathcal{H}$ it is natural to use the term eigentensor for any tensor which after the mapping of $\mathcal{H}$ equals itself times a scalar constant. Assume that $\{f_k,\ k = 1, \dots, n\}$ and $\{i\lambda_k\}$ is the eigensystem of H. It is then straightforward to prove that

$\mathcal{H}[\, f_k f_l^* \,] = i(\lambda_k - \lambda_l)\, f_k f_l^*$    (21)

Hence, any tensor of the type $f_k f_l^*$ is an eigentensor of $\mathcal{H}$ with corresponding eigenvalue $i(\lambda_k - \lambda_l)$. In fact, these tensors are the only eigentensors of $\mathcal{H}$. More details on the eigensystem of $\mathcal{H}$ are found in [Nordberg, 1992], Section 5.6. Since the eigenvectors and eigenvalues of H come in complex conjugate pairs, so must the eigentensors and the corresponding eigenvalues of $\mathcal{H}$ as well. Consequently, $\mathcal{H}$ constitutes an anti-Hermitian map from $V \otimes V^*$ to itself. This result could also have been derived using the scalar product defined by Equation (15).

In the same way as each pair of eigenvectors of H defines a two-dimensional subspace of V, each pair of eigentensors, $f_k f_l^*$ and $\bar{f}_k \bar{f}_l^*$, will define a two-dimensional subspace of $V \otimes V^*$. Consequently, the operator $e^{x\mathcal{H}}$ rotates the projection of T on each such subspace with a relative speed determined by $\lambda_k - \lambda_l$.

To summarize, if the transformation properties of v can be attributed to an orthogonal operator $e^{xH}$, then T is transformed by the operator $e^{x\mathcal{H}}$. The eigensystem of the anti-Hermitian tensor H describes how v is transformed in terms of how its projections on two-dimensional subspaces of V, defined by the eigenvectors, rotate with a speed relative to x determined by the corresponding eigenvalues. Given the eigensystem of H it is possible to construct the eigensystem of $\mathcal{H}$. The eigentensors of $\mathcal{H}$ are simply the outer products of any possible choice of two eigenvectors of H, i.e. $f_k f_l^*$, and the corresponding eigenvalues are the differences $i(\lambda_k - \lambda_l)$. The eigentensors $f_k f_l^*$ define a number of two-dimensional subspaces of $V \otimes V^*$, and T will transform according to rotations in each such subspace with relative speed $\lambda_k - \lambda_l$.
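As a numerical check of Equations (19)-(21) (again our own sketch, using the same vec/Kronecker construction as in the previous sketch and SciPy's `expm`), exponentiating the matrix of $\mathcal{H}$ reproduces the conjugation by $e^{xH}$, and each outer product $f_k f_l^*$ of eigenvectors of H is an eigentensor with eigenvalue $i(\lambda_k - \lambda_l)$:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
n = 3
M = rng.standard_normal((n, n))
H = M - M.T                              # anti-Hermitian generator
calH = np.kron(np.eye(n), H) - np.kron(H.T, np.eye(n))   # matrix of Eq. (17)

# Eqs. (19)-(20): e^{x calH}[X] equals e^{xH} X e^{-xH}.
x = 0.4
X = rng.standard_normal((n, n))
lhs = (expm(x * calH) @ X.flatten(order='F')).reshape(n, n, order='F')
rhs = expm(x * H) @ X @ expm(-x * H)
assert np.allclose(lhs, rhs)

# Eq. (21): f_k f_l^* is an eigentensor of calH with eigenvalue i(lambda_k - lambda_l),
# where the i*lambda_k are the eigenvalues of H with eigenvectors f_k.
eigvals, F = np.linalg.eig(H)
for k in range(n):
    for l in range(n):
        E = np.outer(F[:, k], F[:, l].conj())    # eigentensor f_k f_l^*
        assert np.allclose(H @ E - E @ H, (eigvals[k] - eigvals[l]) * E)
```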
3 Examples

The previous section showed a correspondence between the transformation of v and that of T in terms of the eigensystems of H and $\mathcal{H}$, two anti-Hermitian mappings on V and $V \otimes V^*$ respectively. In this section the correspondence is exemplified for the cases n = 2 and n = 3.

The two-dimensional case

Assume n = 2. Orthogonal transformations of v are then simple two-dimensional rotations. With

$\hat{f}_1 = \frac{1}{\sqrt{2}}(f_1 + i f_2), \qquad \hat{f}_2 = \frac{1}{\sqrt{2}}(f_1 - i f_2)$    (22)

where $f_1$ and $f_2$ are two arbitrary orthonormal vectors in $R^2$, the anti-Hermitian tensor H which defines the transformation of v can be expressed as

$H = i \hat{f}_1 \hat{f}_1^* - i \hat{f}_2 \hat{f}_2^*$    (23)

The operator $e^{xH}$ will then rotate any vector in $R^2$ around the origin by the angle x. Furthermore, the eigenvalues of H are $\pm i$ and, according to the results from the previous section, this implies that $\mathcal{H}$, the anti-Hermitian map which governs the transformation of T, has the following eigensystem:

  Eigentensor                  Eigenvalue
  $\hat{f}_1 \hat{f}_1^*$       0
  $\hat{f}_2 \hat{f}_2^*$       0
  $\hat{f}_1 \hat{f}_2^*$       2i
  $\hat{f}_2 \hat{f}_1^*$      -2i
                                          (24)

It is easy to prove that, independently of the choice of $f_1$ and $f_2$, this amounts to

  Eigentensor                                                                      Eigenvalue
  $\frac{1}{2}\begin{pmatrix} 1 & -i \\ i & 1 \end{pmatrix}$                        0
  $\frac{1}{2}\begin{pmatrix} 1 & i \\ -i & 1 \end{pmatrix}$                        0
  $e^{-2i\varphi}\,\frac{1}{2}\begin{pmatrix} 1 & i \\ i & -1 \end{pmatrix}$        2i
  $e^{2i\varphi}\,\frac{1}{2}\begin{pmatrix} 1 & -i \\ -i & -1 \end{pmatrix}$      -2i
                                          (25)

where $\varphi$ is a constant determined by the choice of $f_1$ and $f_2$. The exponential factors in front of the last two eigentensors can, however, be omitted since it is the eigenspaces of $\mathcal{H}$ which are of interest rather than specific eigentensors.

The first eigentensor pair of $\mathcal{H}$, with eigenvalue 0, defines a two-dimensional subspace of $V \otimes V^*$ which is spanned by the tensors

$\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$ and $\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$    (26)

Since the eigenvalues are 0, this implies that the projection of T on this subspace is invariant with respect to the parameter x. The second eigentensor pair, with eigenvalues $\pm 2i$, defines a two-dimensional subspace of $V \otimes V^*$ which is spanned by the tensors

$\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$ and $\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$    (27)

Since the eigenvalues are $\pm 2i$, this implies that the projection of T on this subspace rotates with twice the speed of v. Hence, the tensor representation of two-dimensional orientation is in fact a type of double angle representation. This representation was introduced by [Granlund, 1978], who suggested that a two-dimensional vector should be used to represent the orientation, constructed such that it rotates with twice the speed of the orientation. In the tensor case a tensor, corresponding to a four-dimensional vector, is used instead, and it is the projection of the tensor on a specific two-dimensional subspace which rotates with twice the speed of the orientation.
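The double angle property can be seen directly in a short numerical sketch (ours, with basis tensors normalized with respect to the scalar product of Equation (15)): as the orientation vector rotates by x, the coordinates of its tensor in the subspace of Equation (27) trace out the angle 2x, while the coordinates in the subspace of Equation (26) stay fixed:

```python
import numpy as np

def rot2(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s], [s, c]])

# Basis tensors of the invariant subspace (Eq. 26) and of the
# double-angle subspace (Eq. 27), normalized w.r.t. the trace scalar product.
B_inv = [np.eye(2) / np.sqrt(2),
         np.array([[0.0, -1.0], [1.0, 0.0]]) / np.sqrt(2)]
B_rot = [np.array([[1.0, 0.0], [0.0, -1.0]]) / np.sqrt(2),
         np.array([[0.0, 1.0], [1.0, 0.0]]) / np.sqrt(2)]

coord = lambda T, B: np.trace(B.T @ T)       # coordinate of T along B, cf. Eq. (15)

v0 = np.array([1.0, 0.0])
for x in np.linspace(0.0, 1.0, 5):
    v = rot2(x) @ v0                         # orientation rotated by the angle x
    T = np.outer(v, v)                       # orientation tensor
    c_inv = [coord(T, B) for B in B_inv]
    c_rot = [coord(T, B) for B in B_rot]
    assert np.allclose(c_inv, [1 / np.sqrt(2), 0.0])              # invariant part
    assert np.allclose(c_rot, np.array([np.cos(2 * x), np.sin(2 * x)]) / np.sqrt(2))
```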

The three-dimensional case

Assume n = 3. Any orthogonal transformation of v is then described by a two-dimensional plane in which the projection of v is rotated by the angle x. Let $\{f_1, f_2, f_3\}$ be an orthonormal set of vectors such that $f_1$ and $f_2$ span the plane of rotation. With $\hat{f}_1$ and $\hat{f}_2$ as defined by Equation (22), the anti-Hermitian tensor can be written

$H = i \hat{f}_1 \hat{f}_1^* - i \hat{f}_2 \hat{f}_2^* + 0 \cdot f_3 f_3^*$    (28)

The eigenvalues of H are thus $\pm i$ and 0. Hence, the eigensystem of $\mathcal{H}$ is

  Eigentensors                                                    Eigenvalue
  $\hat{f}_1 \hat{f}_1^*$, $\hat{f}_2 \hat{f}_2^*$, $f_3 f_3^*$    0
  $\hat{f}_1 f_3^*$, $f_3 \hat{f}_2^*$                              i
  $f_3 \hat{f}_1^*$, $\hat{f}_2 f_3^*$                             -i
  $\hat{f}_1 \hat{f}_2^*$                                           2i
  $\hat{f}_2 \hat{f}_1^*$                                          -2i
                                          (29)

According to the above, the projection of T on the two-dimensional subspace defined by $\hat{f}_1 \hat{f}_2^*$ and $\hat{f}_2 \hat{f}_1^*$ will rotate with twice the speed of the parameter x. Hence, in the three-dimensional case, the orthogonal transformation of T will depend on the plane of rotation of v.

As an example, consider a cylinder in three dimensions. If cutting the cylinder with a plane perpendicular to its axis of symmetry, the result will be a circle. The normal vectors of the cylinder along the circle will lie in the plane. When moving along the circle, the normal vectors will transform according to a rotation in the plane. The three-dimensional rotation is determined by the three orthonormal vectors $f_1$, $f_2$ and $f_3$, where the first two span the plane of rotation and the third is perpendicular to it. The orientation tensor, T, along the circular path will, according to the above, transform according to rotations in different two-dimensional subspaces of $V \otimes V^*$ with relative speeds 2, 1 and 0. In the case of a cylinder, however, the projections of the tensor on the subspaces with relative speed 1 vanish, and the projection on the subspace with relative speed 0 is constant. The only varying projection of the tensor is that on the two-dimensional subspace defined by $\hat{f}_1 \hat{f}_2^*$ and $\hat{f}_2 \hat{f}_1^*$, and this projection rotates with twice the speed of x.
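The cylinder example can be checked with the following sketch (ours; it takes $f_1, f_2, f_3$ to be the standard basis and uses a few of the eigentensors of Equation (29)): along the circular cut, the component of T on a relative-speed-1 eigentensor vanishes, the speed-0 component is constant, and the speed-2 component rotates as $e^{-2ix}$, i.e. with twice the speed of x:

```python
import numpy as np

f1, f2, f3 = np.eye(3)                       # f1, f2 span the plane of rotation

def normal(x):
    """Cylinder surface normal after moving an angle x along the circular cut."""
    return np.cos(x) * f1 + np.sin(x) * f2

# Complex eigenvectors of H (Eq. 22) and a few eigentensors of calH (Eq. 29).
fh1 = (f1 + 1j * f2) / np.sqrt(2)
fh2 = (f1 - 1j * f2) / np.sqrt(2)
E_speed0 = np.outer(fh1, fh1.conj())         # eigenvalue  0  (invariant)
E_speed1 = np.outer(fh1, f3.conj())          # eigenvalue  i  (relative speed 1)
E_speed2 = np.outer(fh1, fh2.conj())         # eigenvalue 2i  (relative speed 2)

coord = lambda T, E: np.trace(E.conj().T @ T)    # component of T along E

for x in np.linspace(0.0, 1.0, 5):
    T = np.outer(normal(x), normal(x))           # orientation tensor on the cylinder
    assert np.isclose(coord(T, E_speed0), 0.5)                  # constant along the path
    assert np.isclose(coord(T, E_speed1), 0.0)                  # vanishes
    assert np.isclose(coord(T, E_speed2), np.exp(-2j * x) / 2)  # rotates at 2x
```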

4 Discussion

This paper has demonstrated an equivariance of the orientation and the tensor field representation. The equivariance is based on the assumption that there is one and the same transformation A which acts on all eigenvectors of the representation tensor. This transformation is furthermore assumed to form an orthogonal operator group, $e^{xH}$. These assumptions may of course not be valid for every possible type of variation between two adjacent neighbourhoods of an image. If they are valid, however, then Section 2 has proved that it is possible to construct an orthogonal operator group, $e^{x\mathcal{H}}$, which acts on the representation tensor T. The latter group describes the transformation of T equivalent to the former transformation of the orientation.

Both H and $\mathcal{H}$ are anti-Hermitian maps, on V and $V \otimes V^*$ respectively, and their eigensystems are closely related. As a general result from Section 2, we see that if $i\lambda$ is the eigenvalue of H with largest absolute value, then the corresponding value for $\mathcal{H}$ is $2i\lambda$. Hence, in terms of rotations, there is always a projection of T on some two-dimensional subspace which rotates with twice the speed compared to the eigenvectors of T. In general, there are also projections of T which rotate in other subspaces with relative speed less than 2.

The tensor representation of orientation has inspired a number of algorithms for detection of local gradients in the tensor field, e.g. [Bårman, 1991], [Westin, 1991] or [Westin and Knutsson, 1992]. These algorithms are based on correlating the tensors in each neighbourhood with a fixed set of tensor filters. By choosing the filters appropriately and combining the filter outputs carefully, estimates of e.g. local curvature can be obtained.

The results of this paper suggest that the variation of the local tensor field in an image may be seen as a consequence of the variation of the orientation field. Assuming that the latter is subject to an orthogonal transformation group, it has been proved that the representation tensor is subject to an orthogonal transformation group as well. This implies the possibility of designing algorithms for detection of local variation of orientation, e.g. three-dimensional curvature, based on the transformation characteristics of the representation tensor T. For a given feature, defined in terms of local variation of the orientation, it must first be established what transformation group acts on the orientation.
If the transformation can be assumed to be orthogonal, the result will then be the eigensystem of H. Given this eigensystem, this paper explains how to form $\mathcal{H}$, describing the transformation group of the tensor T. Hence, one way of defining new algorithms is to search for image neighbourhoods in which the tensor transforms according to the transformation group $e^{x\mathcal{H}}$. Another way is to estimate the transformation group in each neighbourhood of the image. If the assumption of orthogonal groups is valid, this implies that $\mathcal{H}$ is estimated for each neighbourhood and also that this anti-Hermitian map may be used to represent the variation of the neighbourhood.
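As a purely illustrative sketch of the second approach (this is our own hypothetical estimator, not an algorithm from the paper): given two adjacent unit orientation vectors, one orthogonal map consistent with them is the minimal rotation taking the first onto the second, and its matrix logarithm gives a candidate anti-Hermitian generator H, from which $\mathcal{H}$ follows via Equation (17):

```python
import numpy as np
from scipy.linalg import logm

def minimal_rotation(v0, v1):
    """Rotation in the plane spanned by the unit vectors v0, v1 that maps v0 to v1.

    A hypothetical estimate of the operator A in Eq. (3); it is only one of many
    orthogonal maps consistent with a single pair of orientation vectors.
    """
    c = float(v0 @ v1)                       # cosine of the angle between v0 and v1
    w = v1 - c * v0
    if np.linalg.norm(w) < 1e-12:            # parallel vectors: nothing to do
        return np.eye(len(v0))               # (antiparallel case omitted in this sketch)
    w = w / np.linalg.norm(w)
    s = float(v1 @ w)                        # sine of the angle
    P = np.outer(v0, v0) + np.outer(w, w)    # projection onto the rotation plane
    G = np.outer(w, v0) - np.outer(v0, w)    # generator of rotations in that plane
    return np.eye(len(v0)) + (c - 1.0) * P + s * G

v0 = np.array([1.0, 0.0, 0.0])
v1 = np.array([np.cos(0.2), np.sin(0.2), 0.0])
A = minimal_rotation(v0, v1)
assert np.allclose(A @ v0, v1)

H_est = logm(A).real                         # candidate generator, A = e^{H_est}
assert np.allclose(H_est, -H_est.T)          # anti-Hermitian, as assumed in Section 2
```

In the setting of the paper one would, of course, have to estimate such a generator consistently over a whole neighbourhood rather than from a single pair of vectors; the sketch only illustrates the kind of quantity that would be estimated.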

5 Acknowledgements

This work was financially supported by the Swedish Board of Technical Development.

References

[Bårman, 1991] Bårman, H. (1991). Hierarchical Curvature Estimation in Computer Vision. PhD thesis, Linköping University, S-581 83 Linköping, Sweden. Dissertation No. 253, ISBN 91-7870-797-8.

[Granlund, 1978] Granlund, G. H. (1978). In search of a general picture processing operator. Computer Graphics and Image Processing, 8(2):155-178.

[Knutsson, 1989] Knutsson, H. (1989). Representing local structure using tensors. In The 6th Scandinavian Conference on Image Analysis, pages 244-251, Oulu, Finland. Report LiTH-ISY-I-1019, Computer Vision Laboratory, Linköping University, Sweden, 1989.

[Knutsson et al., 1992a] Knutsson, H., Bårman, H., and Haglund, L. (1992a). Robust orientation estimation in 2d, 3d and 4d using tensors. In Proceedings of the International Conference on Automation, Robotics and Computer Vision.

[Knutsson et al., 1992b] Knutsson, H., Haglund, L., and Granlund, G. (1992b). Adaptive filtering of image sequences and volumes. In Proceedings of the International Conference on Automation, Robotics and Computer Vision.

[Nordberg, 1992] Nordberg, K. (1992). Signal Representation and Signal Processing using Operators. Report LiTH-ISY-I-1387, Computer Vision Laboratory, Linköping University, Sweden.

[Westin, 1991] Westin, C.-F. (1991). Feature extraction based on a tensor image description. Thesis No. 288, ISBN 91-7870-85-X.

[Westin and Knutsson, 1992] Westin, C.-F. and Knutsson, H. (1992). Extraction of local symmetries using tensor field filtering. In Proceedings of the 2nd Singapore International Conference on Image Processing. IEEE Singapore Section. Report LiTH-ISY-R-55, Linköping University, Sweden.