Rotations & reflections


Rotations & reflections

Aim lecture: We use the spectral theorem for normal operators to show how any orthogonal matrix can be built up from rotations & reflections.

In this lecture we work over the fields $F = \mathbb{R}$ & $\mathbb{C}$. We let
$$R_\theta = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix},$$
the $2\times 2$ matrix which rotates anti-clockwise about $\binom{0}{0}$ through angle $\theta$. Note that $\det R_\theta = 1$ & that $R_\pi$ is the diagonal matrix $-I_2 = (-1)\oplus(-1)$.

Defn: Let $A \in M_{nn}(\mathbb{R})$. We say that $A$ is a rotation matrix if it is orthogonally similar to $I_{n-2}\oplus R_\theta$ for some $\theta$. We say that $A$ is a reflection matrix if it is orthogonally similar to $I_{n-1}\oplus(-1)$.

Rem: Note that this generalises the usual notions in dimensions 2 & 3. Also, rotations are orthogonal with determinant $1$, whilst reflections are orthogonal with determinant $-1$.

Daniel Chan (UNSW), Lecture 4: Orthogonal matrices & rotations, Semester 2 2013, slide 1 / 9
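The objects just defined are easy to experiment with numerically. A minimal sketch (numpy, not part of the lecture) confirming that $\det R_\theta = 1$ and that $R_\pi = -I_2$:

```python
import numpy as np

def R(theta):
    """The 2x2 matrix rotating the plane anti-clockwise through angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# det R_theta = 1 for every theta, and R_theta is orthogonal
assert abs(np.linalg.det(R(0.7)) - 1) < 1e-12
assert np.allclose(R(0.3) @ R(0.3).T, np.eye(2))

# R_pi is the diagonal matrix -I_2 = (-1) ⊕ (-1)
assert np.allclose(R(np.pi), -np.eye(2))
```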

Orthogonal matrices in $O_2$, $O_3$

Recall that orthogonal matrices have $\det = \pm 1$. We defer the proof of

Theorem
1. Let $A \in O_2$. If $\det A = 1$ then $A$ is orthogonally similar to $R_\theta$ for some $\theta$ (so $\operatorname{tr} A = 2\cos\theta$), & if $\det A = -1$ then $A$ is a reflection.
2. Let $A \in O_3$. If $\det A = 1$ then $A$ is orthogonally similar to $(1)\oplus R_\theta$ (so $\operatorname{tr} A = 1 + 2\cos\theta$), & if $\det A = -1$ then $A$ is orthogonally similar to $(-1)\oplus R_\theta$ (so $\operatorname{tr} A = -1 + 2\cos\theta$) for some $\theta$.

Corollary
Let $A, B \in M_{22}(\mathbb{R})$ or $A, B \in M_{33}(\mathbb{R})$. If $A, B$ are rotation matrices, then so is $AB$. If $A, B$ are reflection matrices, then $AB$ is a rotation.

Proof. In both cases, $AB$ is an orthogonal matrix with $\det AB = (\pm 1)(\pm 1) = 1$, so the theorem gives the result.
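The corollary can be tested numerically. The sketch below (numpy; the particular reflections chosen are illustrative, not from the lecture) builds two reflection matrices in $O_2$, checks that their product is a rotation, and recovers the rotation angle from $\operatorname{tr} A = 2\cos\theta$:

```python
import numpy as np

def R(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# reflection in the x-axis: orthogonal with det = -1
F = np.diag([1.0, -1.0])

A = R(0.4) @ F          # det = -1: a reflection
B = R(1.1) @ F          # det = -1: another reflection
AB = A @ B

# the product of two reflections is a rotation: orthogonal with det = +1
assert np.allclose(AB @ AB.T, np.eye(2))
assert abs(np.linalg.det(AB) - 1) < 1e-12

# for a rotation in O_2, tr = 2 cos(theta) recovers the angle up to sign
theta = np.arccos(np.trace(AB) / 2)
assert np.allclose(AB, R(theta)) or np.allclose(AB, R(-theta))
```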

Example

E.g. Verify that the following is a rotation matrix & determine the axis of rotation & the angle of rotation:
$$A = \tfrac{1}{3}(\,\cdots\,) \quad \text{(matrix entries not recovered in the transcription)}$$

E-values & e-vectors of real matrices

Recall the conjugation map $\overline{(\,\cdot\,)} : \mathbb{C}^n \to \mathbb{C}^n$ is a conjugate linear isomorphism. Furthermore, for $v, v' \in \mathbb{C}^n$ we have $\|\bar v\| = \|v\|$ & $\bar v \cdot \bar v' = \overline{v \cdot v'}$.

Prop 1
Let $A \in M_{nn}(\mathbb{R})$ & $\lambda \in \mathbb{C}$.
1. The conjugation map restricts to a conjugate linear isomorphism $\overline{(\,\cdot\,)} : E_\lambda \to E_{\bar\lambda}$, that is, $\overline{E_\lambda} = E_{\bar\lambda}$. In particular, $\lambda$ is an e-value of $A$ iff $\bar\lambda$ is.
2. The conjugation map sends any basis for $E_\lambda$ to a basis for $E_{\bar\lambda}$.

Proof. 1) Let $v \in E_\lambda$. Then $A\bar v = \bar A \bar v = \overline{Av} = \overline{\lambda v} = \bar\lambda \bar v$. Hence $\overline{E_\lambda} \subseteq E_{\bar\lambda}$, & the reverse inclusion follows from this result applied to $\bar\lambda$. Conjugation is injective, so 1) follows. 2) follows from the fact that conjugate linear isomorphisms take bases to bases, just as is the case for linear isomorphisms; the proof is a mild modification of the linear case.
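Prop 1 can be observed numerically: the e-values of a real matrix come in conjugate pairs, and conjugating an e-vector gives an e-vector for the conjugate e-value. A quick numpy check (on a random real matrix, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))   # a real matrix, generically with complex e-values

evals, evecs = np.linalg.eig(A)

# e-values come in conjugate pairs: conjugating the spectrum permutes it
assert np.allclose(np.sort_complex(np.conj(evals)), np.sort_complex(evals))

# if v is an e-vector for lambda, then conj(v) is an e-vector for conj(lambda)
lam, v = evals[0], evecs[:, 0]
assert np.allclose(A @ np.conj(v), np.conj(lam) * np.conj(v))
```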

E-theory of $R_\theta$

To study rotations, we need to know the e-theory of $R_\theta$ well.

Prop 2
We may unitarily diagonalise (over $\mathbb{C}$): $R_\theta = U\big((e^{i\theta})\oplus(e^{-i\theta})\big)U^*$ where
$$U = \frac{1}{\sqrt 2}\begin{pmatrix} 1 & 1 \\ -i & i \end{pmatrix}.$$
In particular, $E_{e^{i\theta}} = \mathbb{C}\binom{1}{-i}$ and $E_{e^{-i\theta}} = \mathbb{C}\overline{\binom{1}{-i}} = \mathbb{C}\binom{1}{i}$.

Proof. Easy exercise: just check the $e^{i\theta}$-e-space and use Prop 1.
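The proposition is a direct computation that numpy can confirm: the sketch below checks that $U$ is unitary, that $U\big((e^{i\theta})\oplus(e^{-i\theta})\big)U^* = R_\theta$, and that $\binom{1}{-i}$ spans the $e^{i\theta}$-e-space (the angle $\theta = 0.9$ is an arbitrary choice):

```python
import numpy as np

theta = 0.9
R_theta = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])

U = np.array([[1, 1], [-1j, 1j]]) / np.sqrt(2)
D = np.diag([np.exp(1j * theta), np.exp(-1j * theta)])

# U is unitary and R_theta = U D U*
assert np.allclose(U @ U.conj().T, np.eye(2))
assert np.allclose(U @ D @ U.conj().T, R_theta)

# the e^{i theta}-e-space is spanned by (1, -i)
v = np.array([1, -1j])
assert np.allclose(R_theta @ v, np.exp(1j * theta) * v)
```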

General classification of orthogonal matrices

Theorem
Let $A \in O_n$.
1. If $\det A = 1$ then $A$ is orthogonally similar to $I_r \oplus R_{\theta_1} \oplus \dots \oplus R_{\theta_s}$ for some $r \in \mathbb{N}$, $\theta_1, \dots, \theta_s \in \mathbb{R}$.
2. If $\det A = -1$ then $A$ is orthogonally similar to $(-1) \oplus I_r \oplus R_{\theta_1} \oplus \dots \oplus R_{\theta_s}$ for some $r \in \mathbb{N}$, $\theta_1, \dots, \theta_s \in \mathbb{R}$.

Proof. This proof is a classic example of the use of complex numbers to answer questions about the reals! Note that $A$, viewed as a complex matrix, is unitary & hence normal. Hence the e-values have modulus 1, & we may apply the spectral theorem for normal operators to conclude there is an $A$-invariant orthogonal direct sum decomposition into e-spaces of the form
$$\mathbb{C}^n = E_1 \oplus E_{-1} \oplus E_{e^{i\theta_1}} \oplus E_{e^{-i\theta_1}} \oplus \dots \oplus E_{e^{i\theta_s}} \oplus E_{e^{-i\theta_s}},$$
where we used the proposition to re-write some of the e-spaces.
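The first step of the proof, that an orthogonal matrix viewed over $\mathbb{C}$ has all e-values of modulus 1, can be checked numerically. A short numpy sketch (a random orthogonal matrix obtained via QR, not from the lecture):

```python
import numpy as np

# build a random Q in O_4 via QR factorisation (the Q factor is orthogonal)
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
assert np.allclose(Q.T @ Q, np.eye(4))

# viewed over C, Q is unitary hence normal: all e-values have modulus 1
evals = np.linalg.eigvals(Q)
assert np.allclose(np.abs(evals), 1.0)
```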

Alternate version of theorem

The theorem will follow if we can find an orthonormal basis of real vectors (abbreviated to real orthonormal basis) such that, wrt the corresponding co-ordinate system $P \in O_n$, the representing matrix $P^T A P$ is a direct sum of rotation matrices & matrices of the form $\pm I$. Indeed, the only thing left to do is replace all copies of $-I_2$ with $R_\pi$ to get the desired final form. The theorem thus follows from the following more precise result.

Theorem (version for purposes of proof)
1. There are real orthonormal bases for $E_1$, $E_{-1}$.
2. There is a real orthonormal basis for $V_j = E_{e^{i\theta_j}} \oplus E_{e^{-i\theta_j}}$ such that the matrix representing $A$ restricted to $V_j$, wrt the corresponding co-ordinate system, is $R_{\theta_j} \oplus \dots \oplus R_{\theta_j}$ (there are $\dim E_{e^{i\theta_j}}$ copies of $R_{\theta_j}$).

Proof. 1) Note $E_1 = \ker(A - I)$, the kernel of a matrix with real entries, so we may find an orthonormal basis for it consisting of real vectors. Similarly $E_{-1}$ has an orthonormal basis of real vectors. Part 1) follows.
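Part 1) hinges on $E_1 = \ker(A - I)$ being the kernel of a real matrix, so it admits a real orthonormal basis. One way to compute such a basis numerically is from the SVD of $A - I$ (a numpy sketch; the test matrix, built as $(1)\oplus R_\theta$ conjugated by a random orthogonal matrix, is an illustrative choice, not from the lecture):

```python
import numpy as np

theta = 1.2
R2 = np.array([[np.cos(theta), -np.sin(theta)],
               [np.sin(theta),  np.cos(theta)]])
rng = np.random.default_rng(3)
P, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = P @ np.block([[np.eye(1), np.zeros((1, 2))],
                  [np.zeros((2, 1)), R2]]) @ P.T   # here dim E_1 = 1

# E_1 = ker(A - I): right singular vectors of A - I with singular value 0
# give a real orthonormal basis for the kernel
_, s, Vt = np.linalg.svd(A - np.eye(3))
basis = Vt[s < 1e-10]            # rows spanning ker(A - I)

assert basis.shape == (1, 3)                  # dim E_1 = 1
assert np.allclose(A @ basis[0], basis[0])    # a real vector fixed by A
```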

Proof cont'd

2) To simplify notation, we drop the subscript $j$, so $\theta = \theta_j$, $V = V_j$ etc. Let $\{v_1, \dots, v_m\}$ be an orthonormal basis for $E_{e^{i\theta}}$, so $\{v_1, \dots, v_m, \bar v_1, \dots, \bar v_m\}$ is an orthonormal basis for $V$. Hence $V$ is an orthogonal direct sum of the subspaces $W_l = \operatorname{Span}(v_l, \bar v_l)$, so it suffices to find a real orthonormal co-ordinate system for $W_l$. Using the o/n co-ordinate system $C = (v_l \mid \bar v_l) : \mathbb{C}^2 \to W_l$, the representing matrix is $C^{-1} T_A C = (e^{i\theta}) \oplus (e^{-i\theta})$, where $T_A : W_l \to W_l$ is left multiplication by $A$ on $W_l$. Prop 2 then shows that for
$$U = \frac{1}{\sqrt 2}\begin{pmatrix} 1 & 1 \\ -i & i \end{pmatrix} \quad \& \quad C_R = CU^*$$
we have
$$C_R^{-1} T_A C_R = U C^{-1} T_A C\, U^* = U\big((e^{i\theta}) \oplus (e^{-i\theta})\big)U^* = R_\theta.$$
Hence the theorem follows from the lemma below.
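The construction in part 2) can be tested numerically: from a complex e-vector $v$ of a real orthogonal matrix, the real vectors $(v + \bar v)/\sqrt 2$ and $i(v - \bar v)/\sqrt 2$ form an orthonormal pair on which $A$ acts as a rotation. A numpy sketch (the test matrix is built by conjugating $(1)\oplus R_\theta$ by a random orthogonal matrix, an illustrative choice not from the lecture):

```python
import numpy as np

theta = 0.6
R2 = np.array([[np.cos(theta), -np.sin(theta)],
               [np.sin(theta),  np.cos(theta)]])
rng = np.random.default_rng(2)
P, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = P @ np.block([[np.eye(1), np.zeros((1, 2))],
                  [np.zeros((2, 1)), R2]]) @ P.T

# pick a unit e-vector v for a non-real e-value e^{i theta}
evals, evecs = np.linalg.eig(A)
k = np.argmax(np.abs(evals.imag))
lam, v = evals[k], evecs[:, k]
if lam.imag < 0:                     # use the e-value in the upper half plane
    lam, v = np.conj(lam), np.conj(v)

# real orthonormal pair w_+ = (v + conj v)/sqrt(2), w_- = i(v - conj v)/sqrt(2)
w_plus = (v + np.conj(v)) / np.sqrt(2)
w_minus = 1j * (v - np.conj(v)) / np.sqrt(2)
W = np.column_stack([w_plus, w_minus]).real

# W has orthonormal columns spanning an A-invariant plane,
# and A acts on that plane as a 2x2 rotation
assert np.allclose(W.T @ W, np.eye(2))
B = W.T @ A @ W
assert np.allclose(A @ W, W @ B)
assert np.allclose(B @ B.T, np.eye(2)) and abs(np.linalg.det(B) - 1) < 1e-9
```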

Final lemma to complete proof

Lemma
The co-ordinate system $C_R$ is 1) real and 2) orthonormal.

Proof. Note that
$$C_R = \left(\frac{v_l + \bar v_l}{\sqrt 2} \;\middle|\; \frac{i(v_l - \bar v_l)}{\sqrt 2}\right) = (w_+ \mid w_-), \text{ say.}$$
For 1), we just check $\overline{w_\pm} = w_\pm$. For 2), just note that $C_R$ is a composite of isomorphisms of inner product spaces, so is an isomorphism of inner product spaces itself.