Lecture Notes: Orthogonal and Symmetric Matrices

Yufei Tao
Department of Computer Science and Engineering
Chinese University of Hong Kong
taoyf@cse.cuhk.edu.hk

1 Orthogonal Matrix

Definition. An n x n matrix A is orthogonal if (i) its inverse A^{-1} exists, and (ii) A^T = A^{-1}.

Example 1. Consider

    A = [ cos θ  -sin θ ]
        [ sin θ   cos θ ].

It is orthogonal because

    A^T = A^{-1} = [  cos θ  sin θ ]
                   [ -sin θ  cos θ ].

The following is a 3 x 3 orthogonal matrix:

    [ 2/3   1/3   2/3 ]
    [ 2/3  -2/3  -1/3 ]
    [ 1/3   2/3  -2/3 ].

Lemma 1. If A is orthogonal, then A^T is also orthogonal.

Proof. (A^T)^T = (A^{-1})^T = (A^T)^{-1}. The lemma thus follows.

To explain the next property of orthogonal matrices, we need to define two new concepts. Let S be a set of non-zero vectors v_1, v_2, ..., v_k of the same dimensionality. We say that S is orthogonal if v_i · v_j = 0 for any i ≠ j. Furthermore, we say that S is orthonormal if (i) S is orthogonal, and (ii) |v_i| = √(v_i · v_i) = 1 for any i ∈ [1, k]. For example,

    { (-1, 1, 0), (1, 1, -2), (1, 1, 1) }

is orthogonal but not orthonormal. If, however, we scale each of the above vectors to have length 1, then the resulting vector set becomes orthonormal:

    { (-1/√2, 1/√2, 0), (1/√6, 1/√6, -2/√6), (1/√3, 1/√3, 1/√3) }.
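These conditions are easy to check numerically. Below is a minimal sketch in Python with NumPy, using the 3 x 3 matrix and the vector set from the examples above:

```python
import numpy as np

# The 3 x 3 orthogonal matrix from the example above.
A = np.array([[2/3,  1/3,  2/3],
              [2/3, -2/3, -1/3],
              [1/3,  2/3, -2/3]])

# A is orthogonal if and only if A^T A equals the identity matrix.
print(np.allclose(A.T @ A, np.eye(3)))   # True

# The orthogonal (but not orthonormal) set from the example.
v1, v2, v3 = np.array([-1, 1, 0]), np.array([1, 1, -2]), np.array([1, 1, 1])
print(v1 @ v2, v1 @ v3, v2 @ v3)         # 0 0 0

# Scaling each vector to length 1 yields the orthonormal set.
u1, u2, u3 = (v / np.linalg.norm(v) for v in (v1, v2, v3))
print(np.linalg.norm(u1), np.linalg.norm(u2), np.linalg.norm(u3))  # 1.0 1.0 1.0
```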

Lemma 2. An orthogonal set of vectors must be linearly independent.

Proof. Suppose that S = {v_1, v_2, ..., v_k}. Assume, on the contrary, that S is not linearly independent. Hence, there exist real values c_1, c_2, ..., c_k that are not all zero, and make the following hold:

    c_1 v_1 + c_2 v_2 + ... + c_k v_k = 0.

Suppose, without loss of generality, that c_i ≠ 0 for some i ∈ [1, k]. Then, we multiply both sides of the above equation by v_i, and obtain:

    c_1 v_1 · v_i + c_2 v_2 · v_i + ... + c_k v_k · v_i = 0
    ⟹ c_i v_i · v_i = 0

where the second line used the fact that, since S is orthogonal, v_j · v_i = 0 for all j ≠ i. The above equation contradicts the fact that c_i ≠ 0 and v_i is a non-zero vector.

We are now ready to reveal another way to define an orthogonal matrix:

Lemma 3. Let A be an n x n matrix with row vectors r_1, r_2, ..., r_n, and column vectors c_1, c_2, ..., c_n. Both the following statements are true:

- A is orthogonal if and only if {r_1, r_2, ..., r_n} is orthonormal.
- A is orthogonal if and only if {c_1, c_2, ..., c_n} is orthonormal.

Proof. We will prove only the first statement because applying the same argument on A^T proves the second. Let B = A A^T. Denote by b_ij the element of B at the i-th row and j-th column. We know that b_ij = r_i · r_j (note that the j-th column of A^T has the same components as r_j). A is orthogonal if and only if B is an identity matrix, which in turn is true if and only if b_ij = 1 when i = j, and b_ij = 0 otherwise. The lemma thus follows.

Lemma 4. The determinant of an orthogonal matrix A can only be 1 or -1.

Proof. From A^T = A^{-1}, we know that A A^T = I, where I is an identity matrix. Hence, det(A A^T) = det(A) det(A^T) = (det(A))^2 = 1. The lemma thus follows.

2 Symmetric Matrix

Recall that an n x n matrix A is symmetric if A = A^T. Next, we give several nice properties of such matrices.

Lemma 5. All the eigenvalues of a symmetric matrix must be real values (i.e., they cannot be complex numbers).

We omit the proof of the lemma. Note that the above lemma is not true for general square matrices (i.e., it is possible for an eigenvalue to be a complex number).
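Lemmas 4 and 5 are easy to observe numerically. The following is a small sketch in Python with NumPy; the two matrices below are illustrative choices, not taken from the examples in these notes:

```python
import numpy as np

# A symmetric matrix: its eigenvalues are guaranteed to be real (Lemma 5).
S = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])
print(np.linalg.eigvals(S))      # all real (up to floating-point noise)

# A non-symmetric matrix may have complex eigenvalues: a 90-degree rotation.
R = np.array([[0., -1.],
              [1.,  0.]])
print(np.linalg.eigvals(R))      # [0.+1.j  0.-1.j]

# R happens to be orthogonal, so its determinant is 1 or -1 (Lemma 4).
print(np.linalg.det(R))          # 1.0
```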

Lemma 6. Let λ_1 and λ_2 be two different eigenvalues of a symmetric matrix A. Also, suppose that x_1 is an eigenvector of A corresponding to λ_1, and x_2 is an eigenvector of A corresponding to λ_2. It must hold that x_1 · x_2 = 0.

Proof. By the definition of eigenvalue and eigenvector, we know:

    A x_1 = λ_1 x_1    (1)
    A x_2 = λ_2 x_2    (2)

From (1), we have:

    x_1^T A^T = λ_1 x_1^T
    ⟹ x_1^T A = λ_1 x_1^T             (A is symmetric)
    ⟹ x_1^T A x_2 = λ_1 x_1^T x_2
    ⟹ x_1^T λ_2 x_2 = λ_1 x_1^T x_2   (by (2))
    ⟹ x_1^T x_2 (λ_2 - λ_1) = 0
    ⟹ x_1^T x_2 = 0                   (by λ_1 ≠ λ_2)

The lemma then follows from the fact that x_1 · x_2 = x_1^T x_2.

Example 2. Consider

    A = [ 1 1 1 ]
        [ 1 1 1 ]
        [ 1 1 1 ].

We know that A has two eigenvalues λ_1 = 0 and λ_2 = 3. For eigenvalue λ_1 = 0, all the eigenvectors can be represented as x = (x_1, x_2, x_3) satisfying:

    x_1 = -u - v,  x_2 = u,  x_3 = v

with u, v ∈ R. Setting (u, v) to (1, 0) and (0, 1) respectively gives us two linearly independent eigenvectors:

    x_1 = (-1, 1, 0),  x_2 = (-1, 0, 1).

For eigenvalue λ_2 = 3, all the eigenvectors can be represented as x = (x_1, x_2, x_3) satisfying:

    x_1 = t,  x_2 = t,  x_3 = t

with t ∈ R. Setting t = 1 gives us another eigenvector:

    x_3 = (1, 1, 1).

Vectors x_1, x_2, and x_3 are linearly independent. According to Lemma 6, both x_1 · x_3 and x_2 · x_3 must be 0. You can verify that this is indeed the case.
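The claims in Example 2 can be verified with a few lines of NumPy; a minimal sketch:

```python
import numpy as np

# The matrix A from Example 2 and the eigenvectors derived by hand.
A  = np.ones((3, 3))
x1 = np.array([-1, 1, 0])   # eigenvalue 0
x2 = np.array([-1, 0, 1])   # eigenvalue 0
x3 = np.array([ 1, 1, 1])   # eigenvalue 3

# Check the eigenvector equations A x = lambda x.
print(np.allclose(A @ x1, 0 * x1), np.allclose(A @ x3, 3 * x3))  # True True

# Lemma 6: eigenvectors of distinct eigenvalues are orthogonal.
print(x1 @ x3, x2 @ x3)     # 0 0

# x1 and x2 share the eigenvalue 0, so they need not be orthogonal:
print(x1 @ x2)              # 1
```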

From an earlier lecture, we already know that every symmetric matrix can be diagonalized because it definitely has n linearly independent eigenvectors. The next lemma strengthens this fact:

Lemma 7. Every n x n symmetric matrix has an orthogonal set of n eigenvectors.

We omit the proof of the lemma (which is rather non-trivial). Note that the n eigenvectors in the lemma must be linearly independent, according to Lemma 2.

Example 3. Let us consider again the matrix A in Example 2. We have obtained eigenvectors x_1, x_2, x_3. Clearly, they do not constitute an orthogonal set because x_1, x_2 are not orthogonal. We will replace x_2 with a different x_2' that is still an eigenvector of A for eigenvalue λ_1 = 0, and is orthogonal to x_1. From Example 2, we know that all eigenvectors corresponding to λ_1 have the form

    (-u - v, u, v).

For such a vector to be orthogonal to x_1 = (-1, 1, 0), we need:

    (-1)(-u - v) + u = 0  ⟹  v = -2u.

As you can see, there are infinitely many such vectors, any of which can serve as x_2', except the zero vector. To produce one, we can choose u = 1, v = -2, which gives

    x_2' = (1, 1, -2).

{x_1, x_2', x_3} is thus an orthogonal set of eigenvectors of A.

Corollary 1. Every n x n symmetric matrix has an orthonormal set of n eigenvectors.

Proof. The orthonormal set can be obtained by scaling all vectors in the orthogonal set of Lemma 7 to have length 1.

Now we prove an important lemma about symmetric matrices.

Lemma 8. Let A be an n x n symmetric matrix. There exists an orthogonal matrix Q such that A = Q diag[λ_1, λ_2, ..., λ_n] Q^{-1}, where λ_1, λ_2, ..., λ_n are eigenvalues of A.

Proof. From an earlier lecture, we know that given a set of linearly independent eigenvectors v_1, v_2, ..., v_n corresponding to eigenvalues λ_1, λ_2, ..., λ_n respectively, we can produce Q by placing v_i as the i-th column of Q, for each i ∈ [1, n], such that A = Q diag[λ_1, λ_2, ..., λ_n] Q^{-1}. From Corollary 1, we know that we can find an orthonormal set of v_1, v_2, ..., v_n. By Lemma 3, it follows that Q is an orthogonal matrix.
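Lemma 8 is precisely what NumPy's eigensolver for symmetric matrices, np.linalg.eigh, computes: it returns the eigenvalues together with a matrix Q whose columns are orthonormal eigenvectors. A minimal sketch, again with the matrix from Example 2:

```python
import numpy as np

A = np.ones((3, 3))                  # the symmetric matrix from Example 2
lam, Q = np.linalg.eigh(A)           # eigenvalues and orthonormal eigenvectors

print(lam)                           # approximately [0, 0, 3]
print(np.allclose(Q.T @ Q, np.eye(3)))         # Q is orthogonal
print(np.allclose(Q @ np.diag(lam) @ Q.T, A))  # A = Q diag[...] Q^{-1}
```

Note that the Q returned here may differ from the one we will build by hand in Example 4 below: an orthonormal set of eigenvectors is not unique.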

Example 4. Consider once again the matrix A in Example 2. In Example 3, we have obtained an orthogonal set of eigenvectors:

    { (-1, 1, 0), (1, 1, -2), (1, 1, 1) }.

By scaling, we obtain the following orthonormal set of eigenvectors:

    { (-1/√2, 1/√2, 0), (1/√6, 1/√6, -2/√6), (1/√3, 1/√3, 1/√3) }.

Recall that these eigenvectors correspond to eigenvalues 0, 0, and 3, respectively. We thus produce:

    Q = [ -1/√2   1/√6   1/√3 ]
        [  1/√2   1/√6   1/√3 ]
        [   0    -2/√6   1/√3 ]

such that A = Q diag[0, 0, 3] Q^{-1}.
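As a final check, the Q assembled by hand in Example 4 can be verified numerically; a small sketch:

```python
import numpy as np

s2, s6, s3 = np.sqrt(2), np.sqrt(6), np.sqrt(3)

# The matrix Q built from the orthonormal eigenvectors in Example 4.
Q = np.array([[-1/s2,  1/s6, 1/s3],
              [ 1/s2,  1/s6, 1/s3],
              [    0, -2/s6, 1/s3]])

A = np.ones((3, 3))
D = np.diag([0., 0., 3.])

print(np.allclose(Q.T @ Q, np.eye(3)))   # Q is orthogonal (Lemma 3)
print(np.allclose(Q @ D @ Q.T, A))       # A = Q diag[0, 0, 3] Q^{-1}
```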