Lecture Notes: Orthogonal and Symmetric Matrices

Yufei Tao
Department of Computer Science and Engineering
Chinese University of Hong Kong
taoyf@cse.cuhk.edu.hk

1 Orthogonal Matrix

Definition 1. Let $\mathbf{u} = [u_{i1}]$ and $\mathbf{v} = [v_{i1}]$ be two $n \times 1$ vectors. Define the dot product between them, denoted as $\mathbf{u} \cdot \mathbf{v}$, as the real value $\sum_{i=1}^{n} u_{i1} v_{i1}$. Similarly, let $\mathbf{u} = [u_{1j}]$ and $\mathbf{v} = [v_{1j}]$ be two $1 \times n$ vectors. Define the dot product between them, again denoted as $\mathbf{u} \cdot \mathbf{v}$, as the real value $\sum_{j=1}^{n} u_{1j} v_{1j}$.

Definition 2. Let $\mathbf{u}$ be a vector. Define $\|\mathbf{u}\| = (\mathbf{u} \cdot \mathbf{u})^{1/2}$, and call it the norm (or the length) of $\mathbf{u}$.

Definition 3. Let $S$ be a set of non-zero (column or row) vectors $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_k$ of the same dimensionality. We say that $S$ is orthogonal if $\mathbf{v}_i \cdot \mathbf{v}_j = 0$ for any $i \neq j$. Furthermore, we say that $S$ is orthonormal if (i) $S$ is orthogonal, and (ii) $\|\mathbf{v}_i\| = 1$ for any $i \in [1, k]$.

For example, the set
\[
\left\{
\begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix},
\begin{bmatrix} 1 \\ 1 \\ -2 \end{bmatrix},
\begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}
\right\}
\]
is orthogonal but not orthonormal. If, however, we scale each of the above vectors to have length 1, then the resulting vector set becomes orthonormal:
\[
\left\{
\begin{bmatrix} -1/\sqrt{2} \\ 1/\sqrt{2} \\ 0 \end{bmatrix},
\begin{bmatrix} 1/\sqrt{6} \\ 1/\sqrt{6} \\ -2/\sqrt{6} \end{bmatrix},
\begin{bmatrix} 1/\sqrt{3} \\ 1/\sqrt{3} \\ 1/\sqrt{3} \end{bmatrix}
\right\}.
\]

Lemma 1. An orthogonal set of vectors must be linearly independent.

Proof. Suppose that $S = \{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_k\}$. Assume, on the contrary, that $S$ is not linearly independent. Hence, there exist real values $c_1, c_2, \ldots, c_k$ that are not all zero, and make the following hold:
\[
c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \ldots + c_k \mathbf{v}_k = \mathbf{0}.
\]
Suppose, without loss of generality, that $c_i \neq 0$ for some $i \in [1, k]$. Then, we multiply (i.e., take the dot product of) both sides of the above equation by $\mathbf{v}_i$, and obtain:
\[
c_1 \mathbf{v}_1 \cdot \mathbf{v}_i + c_2 \mathbf{v}_2 \cdot \mathbf{v}_i + \ldots + c_k \mathbf{v}_k \cdot \mathbf{v}_i = 0
\;\Rightarrow\;
c_i \, \mathbf{v}_i \cdot \mathbf{v}_i = 0,
\]
where the implication holds because orthogonality forces every term $c_j \mathbf{v}_j \cdot \mathbf{v}_i$ with $j \neq i$ to vanish. The above equation contradicts the fact that $c_i \neq 0$ and $\mathbf{v}_i$ is a non-zero vector.
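The example above can also be checked mechanically. The following short Python/NumPy sketch is an illustrative addition (not part of the original notes); it verifies that the set is orthogonal per Definition 3, and that scaling each vector to length 1 makes it orthonormal.

```python
import numpy as np

# The orthogonal (but not orthonormal) set from the example above.
vectors = [np.array([-1.0, 1.0, 0.0]),
           np.array([1.0, 1.0, -2.0]),
           np.array([1.0, 1.0, 1.0])]

# Orthogonality (Definition 3): every pairwise dot product is 0.
for i in range(len(vectors)):
    for j in range(i + 1, len(vectors)):
        assert np.isclose(vectors[i] @ vectors[j], 0.0)

# Scaling each vector by 1/norm yields an orthonormal set (Definition 2).
unit = [v / np.linalg.norm(v) for v in vectors]
for v in unit:
    assert np.isclose(np.linalg.norm(v), 1.0)
```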

Definition 4. An $n \times n$ matrix $A$ is orthogonal if (i) its inverse $A^{-1}$ exists, and (ii) $A^{-1} = A^T$.

Example 1. Consider
\[
A = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}.
\]
It is orthogonal because
\[
A^{-1} = A^T = \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix}.
\]
The following is a $3 \times 3$ orthogonal matrix:
\[
\begin{bmatrix}
2/3 & 1/3 & 2/3 \\
2/3 & -2/3 & -1/3 \\
1/3 & 2/3 & -2/3
\end{bmatrix}.
\]

An orthogonal matrix must be formed by an orthonormal set of vectors:

Lemma 2. Let $A$ be an $n \times n$ matrix with row vectors $\mathbf{r}_1, \mathbf{r}_2, \ldots, \mathbf{r}_n$, and column vectors $\mathbf{c}_1, \mathbf{c}_2, \ldots, \mathbf{c}_n$. Both of the following statements are true:

- $A$ is orthogonal if and only if $\{\mathbf{r}_1, \mathbf{r}_2, \ldots, \mathbf{r}_n\}$ is orthonormal.
- $A$ is orthogonal if and only if $\{\mathbf{c}_1, \mathbf{c}_2, \ldots, \mathbf{c}_n\}$ is orthonormal.

Proof. We will prove only the first statement because applying the same argument to $A^T$ proves the second. Let $B = AA^T$. Denote by $b_{ij}$ the element of $B$ at the $i$-th row and $j$-th column. We know that $b_{ij} = \mathbf{r}_i \cdot \mathbf{r}_j$ (note that the $j$-th column of $A^T$ has the same components as $\mathbf{r}_j$). $A$ is orthogonal if and only if $B$ is an identity matrix, which in turn is true if and only if $b_{ij} = 1$ when $i = j$, and $b_{ij} = 0$ otherwise. The lemma thus follows.

2 Symmetric Matrix

Recall that an $n \times n$ matrix $A$ is symmetric if $A = A^T$. In this section, we will learn several nice properties of such matrices.

Lemma 3. All the eigenvalues of a symmetric matrix must be real values (i.e., they cannot be complex numbers). All eigenvectors of the matrix must contain only real values.

We omit the proof of the lemma (which is not difficult, but requires the definition of matrices on complex numbers). Note that the above lemma is not true for general square matrices (i.e., it is possible for an eigenvalue to be a complex number).
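Definition 4 and Lemma 2 are again easy to confirm numerically. The sketch below (an illustrative addition in Python/NumPy, with an arbitrarily chosen angle) checks that the rotation matrix of Example 1 satisfies $A^{-1} = A^T$, and that the $3 \times 3$ matrix above satisfies $BB^T = B^TB = I$, i.e., both its rows and its columns form orthonormal sets.

```python
import numpy as np

theta = 0.7  # an arbitrary angle; any value works
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
# Definition 4: A is orthogonal when its inverse equals its transpose.
assert np.allclose(np.linalg.inv(A), A.T)

# The 3 x 3 orthogonal matrix from the text.
B = np.array([[2,  1,  2],
              [2, -2, -1],
              [1,  2, -2]]) / 3.0
# Lemma 2: B B^T = I exactly when the rows form an orthonormal set,
# and B^T B = I exactly when the columns do.
assert np.allclose(B @ B.T, np.eye(3))
assert np.allclose(B.T @ B, np.eye(3))
```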

Lemma 4. Let $\lambda_1$ and $\lambda_2$ be two different eigenvalues of a symmetric matrix $A$. Also, suppose that $\mathbf{x}_1$ is an eigenvector of $A$ corresponding to $\lambda_1$, and $\mathbf{x}_2$ is an eigenvector of $A$ corresponding to $\lambda_2$. It must hold that $\mathbf{x}_1 \cdot \mathbf{x}_2 = 0$.

Proof. By the definition of eigenvalues and eigenvectors, we know:
\[
A \mathbf{x}_1 = \lambda_1 \mathbf{x}_1 \tag{1}
\]
\[
A \mathbf{x}_2 = \lambda_2 \mathbf{x}_2 \tag{2}
\]
From (1), we have
\begin{align*}
\mathbf{x}_1^T A^T &= \lambda_1 \mathbf{x}_1^T \\
\Rightarrow\quad \mathbf{x}_1^T A &= \lambda_1 \mathbf{x}_1^T && \text{(as $A = A^T$)} \\
\Rightarrow\quad \mathbf{x}_1^T A \mathbf{x}_2 &= \lambda_1 \mathbf{x}_1^T \mathbf{x}_2 \\
\Rightarrow\quad \lambda_2 \mathbf{x}_1^T \mathbf{x}_2 &= \lambda_1 \mathbf{x}_1^T \mathbf{x}_2 && \text{(by (2))} \\
\Rightarrow\quad \mathbf{x}_1^T \mathbf{x}_2 (\lambda_2 - \lambda_1) &= 0 \\
\Rightarrow\quad \mathbf{x}_1^T \mathbf{x}_2 &= 0. && \text{(by $\lambda_1 \neq \lambda_2$)}
\end{align*}
The lemma then follows from the fact that $\mathbf{x}_1 \cdot \mathbf{x}_2 = \mathbf{x}_1^T \mathbf{x}_2$.

Example 2. Consider
\[
A = \begin{bmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{bmatrix}.
\]
We know that $A$ has two eigenvalues $\lambda_1 = -1$ and $\lambda_2 = 2$. For eigenvalue $\lambda_1 = -1$, all the eigenvectors can be represented as $\mathbf{x} = [x_1, x_2, x_3]^T$ satisfying:
\[
x_1 = -v - u, \quad x_2 = u, \quad x_3 = v
\]
with $u, v \in \mathbb{R}$. Setting $(u, v)$ to $(1, 0)$ and $(0, 1)$ respectively gives us two linearly independent eigenvectors:
\[
\mathbf{x}_1 = \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix}, \quad
\mathbf{x}_2 = \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix}.
\]
For eigenvalue $\lambda_2 = 2$, all the eigenvectors can be represented as $\mathbf{x} = [x_1, x_2, x_3]^T$ satisfying:
\[
x_1 = t, \quad x_2 = t, \quad x_3 = t
\]
with $t \in \mathbb{R}$. Setting $t = 1$ gives us another eigenvector:
\[
\mathbf{x}_3 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}.
\]
Vectors $\mathbf{x}_1$, $\mathbf{x}_2$, and $\mathbf{x}_3$ are linearly independent. According to Lemma 4, both $\mathbf{x}_1 \cdot \mathbf{x}_3$ and $\mathbf{x}_2 \cdot \mathbf{x}_3$ must be 0, which is indeed the case.
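Example 2 can be reproduced with a numerical eigensolver. The sketch below (an illustrative Python/NumPy addition) uses `np.linalg.eigh`, NumPy's routine for symmetric matrices, which returns real eigenvalues in ascending order, and then checks the orthogonality predicted by Lemma 4.

```python
import numpy as np

A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])

# eigh handles symmetric matrices; Lemma 3 guarantees the eigenvalues
# are real, and eigh returns them in ascending order.
w, V = np.linalg.eigh(A)
assert np.allclose(w, [-1., -1., 2.])

# Lemma 4: eigenvectors for the distinct eigenvalues -1 and 2 are
# orthogonal. Columns 0 and 1 belong to -1; column 2 belongs to 2.
assert np.isclose(V[:, 0] @ V[:, 2], 0.0)
assert np.isclose(V[:, 1] @ V[:, 2], 0.0)
```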

From an earlier lecture, we already know that every symmetric matrix can be diagonalized because it definitely has $n$ linearly independent eigenvectors. The next lemma strengthens this fact:

Lemma 5. Every $n \times n$ symmetric matrix has an orthogonal set of $n$ eigenvectors.

We omit the proof of the lemma (which is rather non-trivial). Note that the $n$ eigenvectors in the lemma must be linearly independent, according to Lemma 1.

Example 3. Let us consider again the matrix $A$ in Example 2. We have obtained eigenvectors $\mathbf{x}_1, \mathbf{x}_2, \mathbf{x}_3$. Clearly, they do not constitute an orthogonal set because $\mathbf{x}_1, \mathbf{x}_2$ are not orthogonal. We will replace $\mathbf{x}_2$ with a different $\mathbf{x}_2'$ that is still an eigenvector of $A$ for eigenvalue $\lambda_1 = -1$, and is orthogonal to $\mathbf{x}_1$. From Example 2, we know that all eigenvectors corresponding to $\lambda_1$ have the form
\[
\begin{bmatrix} -v - u \\ u \\ v \end{bmatrix}.
\]
For such a vector to be orthogonal to $\mathbf{x}_1 = [-1, 1, 0]^T$, we need:
\[
(-1)(-v - u) + (1)(u) + (0)(v) = 0 \;\Rightarrow\; v = -2u.
\]
As you can see, there are infinitely many such vectors, any of which can be $\mathbf{x}_2'$ except the zero vector. To produce one, we can choose $u = 1$, $v = -2$, which gives
\[
\mathbf{x}_2' = \begin{bmatrix} 1 \\ 1 \\ -2 \end{bmatrix}.
\]
$\{\mathbf{x}_1, \mathbf{x}_2', \mathbf{x}_3\}$ is thus an orthogonal set of eigenvectors of $A$.
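The replacement vector in Example 3 can also be obtained mechanically: subtracting from $\mathbf{x}_2$ its component along $\mathbf{x}_1$ is one step of the Gram-Schmidt process (a standard technique, not named in the original notes). Since the result is a linear combination of $\mathbf{x}_1$ and $\mathbf{x}_2$, it stays inside the $\lambda_1$-eigenspace. A minimal Python/NumPy sketch of this view:

```python
import numpy as np

x1 = np.array([-1., 1., 0.])   # eigenvector of A for lambda_1 = -1
x2 = np.array([-1., 0., 1.])   # another eigenvector for lambda_1 = -1

# One Gram-Schmidt step: remove from x2 its component along x1.
# Linear combinations of x1 and x2 remain in the same eigenspace,
# so the result is still an eigenvector for lambda_1 = -1.
x2_prime = x2 - ((x2 @ x1) / (x1 @ x1)) * x1

assert np.isclose(x2_prime @ x1, 0.0)
# Up to scaling, this is the vector [1, 1, -2]^T found in Example 3.
assert np.allclose(-2 * x2_prime, [1., 1., -2.])
```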

Corollary 1. Every $n \times n$ symmetric matrix has an orthonormal set of $n$ eigenvectors.

Proof. The orthonormal set can be obtained by scaling all vectors in the orthogonal set of Lemma 5 to have length 1.

Now we prove an important lemma about symmetric matrices.

Lemma 6. Let $A$ be an $n \times n$ symmetric matrix. There exists an orthogonal matrix $Q$ such that $A = Q \cdot \mathrm{diag}[\lambda_1, \lambda_2, \ldots, \lambda_n] \cdot Q^{-1}$, where $\lambda_1, \lambda_2, \ldots, \lambda_n$ are the eigenvalues of $A$.

Proof. From an earlier lecture, we know that given a set of linearly independent eigenvectors $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n$ corresponding to eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$ respectively, we can produce $Q$ by placing $\mathbf{v}_i$ as the $i$-th column of $Q$, for each $i \in [1, n]$, such that $A = Q \cdot \mathrm{diag}[\lambda_1, \lambda_2, \ldots, \lambda_n] \cdot Q^{-1}$. From Corollary 1, we know that we can find an orthonormal set of $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n$. By Lemma 2, it follows that $Q$ is an orthogonal matrix.

Example 4. Consider once again the matrix $A$ in Example 2. In Example 3, we have obtained an orthogonal set of eigenvectors:
\[
\begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix}, \quad
\begin{bmatrix} 1 \\ 1 \\ -2 \end{bmatrix}, \quad
\begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}.
\]
By scaling, we obtain the following orthonormal set of eigenvectors:
\[
\begin{bmatrix} -1/\sqrt{2} \\ 1/\sqrt{2} \\ 0 \end{bmatrix}, \quad
\begin{bmatrix} 1/\sqrt{6} \\ 1/\sqrt{6} \\ -2/\sqrt{6} \end{bmatrix}, \quad
\begin{bmatrix} 1/\sqrt{3} \\ 1/\sqrt{3} \\ 1/\sqrt{3} \end{bmatrix}.
\]
Recall that these eigenvectors correspond to eigenvalues $-1$, $-1$, and $2$, respectively. We thus produce
\[
Q = \begin{bmatrix}
-1/\sqrt{2} & 1/\sqrt{6} & 1/\sqrt{3} \\
1/\sqrt{2} & 1/\sqrt{6} & 1/\sqrt{3} \\
0 & -2/\sqrt{6} & 1/\sqrt{3}
\end{bmatrix}
\]
such that $A = Q \cdot \mathrm{diag}[-1, -1, 2] \cdot Q^{-1}$.
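To close the loop, the factorization of Example 4 can be verified directly. The Python/NumPy sketch below (an illustrative addition) confirms that $Q$ is orthogonal, so that $Q^{-1} = Q^T$, and that $A = Q \cdot \mathrm{diag}[-1, -1, 2] \cdot Q^T$ as Lemma 6 promises.

```python
import numpy as np

s2, s3, s6 = np.sqrt(2.), np.sqrt(3.), np.sqrt(6.)
Q = np.array([[-1/s2, 1/s6, 1/s3],
              [ 1/s2, 1/s6, 1/s3],
              [ 0.,  -2/s6, 1/s3]])
D = np.diag([-1., -1., 2.])
A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])

# Q is orthogonal (its columns are the orthonormal eigenvectors),
# hence Q^{-1} = Q^T per Definition 4 ...
assert np.allclose(Q.T @ Q, np.eye(3))
# ... and the diagonalization of Lemma 6 holds: A = Q D Q^{-1} = Q D Q^T.
assert np.allclose(Q @ D @ Q.T, A)
```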