MODULE 11 Topics: Hermitian and symmetric matrices

Setting: $A$ is an $n \times n$ real or complex matrix defined on $\mathbb{C}^n$ with the complex dot product
$$\langle x, y \rangle = \sum_{j=1}^n x_j \bar{y}_j.$$

Notation: $A^* = \bar{A}^T$, i.e., $a^*_{ij} = \bar{a}_{ji}$. We know from Module 4 that
$$\langle Ax, y \rangle = \langle x, A^* y \rangle \quad \text{for all } x, y \in \mathbb{C}^n.$$

Definition: If $A = A^T$ then $A$ is symmetric. If $A = A^*$ then $A$ is Hermitian.

Examples: $\begin{pmatrix} 1 & i \\ i & 1 \end{pmatrix}$ is symmetric but not Hermitian; $\begin{pmatrix} 1 & i \\ -i & 1 \end{pmatrix}$ is Hermitian but not symmetric.

Note that a real symmetric matrix is Hermitian.

Theorem: If $A$ is Hermitian then $\langle Ax, x \rangle$ is real for all $x \in \mathbb{C}^n$.

Proof: $\langle Ax, x \rangle = \langle x, A^* x \rangle = \langle x, Ax \rangle = \overline{\langle Ax, x \rangle}$, so the number $\langle Ax, x \rangle$ is real.

Theorem: If $A$ is Hermitian then its eigenvalues are real.

Proof: Suppose that $Au = \lambda u$. Then
$$\lambda \langle u, u \rangle = \langle Au, u \rangle = \langle u, A^* u \rangle = \langle u, Au \rangle = \langle u, \lambda u \rangle = \bar{\lambda} \langle u, u \rangle.$$
It follows that $\lambda = \bar{\lambda}$ so that $\lambda$ is real.

Theorem: If $A$ is Hermitian then eigenvectors corresponding to distinct eigenvalues are orthogonal.

Proof: Suppose that $Au = \lambda u$ and $Av = \mu v$ for $\lambda \neq \mu$. Then
$$\lambda \langle u, v \rangle = \langle Au, v \rangle = \langle u, Av \rangle = \langle u, \mu v \rangle = \bar{\mu} \langle u, v \rangle = \mu \langle u, v \rangle$$
since $\mu$ is real.
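These facts are easy to sanity-check numerically. The following is a minimal numpy sketch (mine, not part of the module) using the Hermitian example above; note that `np.vdot` conjugates its first argument, so `np.vdot(x, A @ x)` equals $\langle Ax, x \rangle$ in the module's convention.

```python
import numpy as np

# The Hermitian example from the text (Hermitian but not symmetric).
A = np.array([[1, 1j],
              [-1j, 1]])
assert np.allclose(A, A.conj().T)        # A = A*

# <Ax, x> is real for every x: np.vdot conjugates its first argument,
# so np.vdot(x, A @ x) is <Ax, x> in the module's convention.
rng = np.random.default_rng(0)
x = rng.standard_normal(2) + 1j * rng.standard_normal(2)
print(np.vdot(x, A @ x).imag)            # ~ 0 up to rounding

# Eigenvalues are real; eigenvectors for distinct eigenvalues orthogonal.
lam, U = np.linalg.eigh(A)               # eigh assumes a Hermitian input
print(lam)                               # [0. 2.]
print(abs(np.vdot(U[:, 0], U[:, 1])))    # ~ 0
```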

Since $\mu \neq \lambda$ it follows that $\langle u, v \rangle = 0$.

Suppose that $A$ has $n$ distinct eigenvalues $\{\lambda_1, \dots, \lambda_n\}$ with corresponding orthogonal eigenvectors $\{u_1, \dots, u_n\}$. Let us also agree to scale the eigenvectors so that $\langle u_i, u_j \rangle = \delta_{ij}$, where $\delta_{ij}$ is the so-called Kronecker delta
$$\delta_{ij} = \begin{cases} 0 & \text{if } i \neq j \\ 1 & \text{if } i = j. \end{cases}$$
We recall that the eigenvalue equation can be written in matrix form $AU = U\Lambda$, where $U = (u_1\ u_2\ \dots\ u_n)$ and $\Lambda = \operatorname{diag}\{\lambda_1, \dots, \lambda_n\}$. We now observe from the orthonormality of the eigenvectors that $U^* U = I$. Hence $U^{-1} = U^*$ and consequently
$$A = U \Lambda U^*.$$
In other words, $A$ can be diagonalized in a particularly simple way.

Example: Suppose
$$A = \begin{pmatrix} 1 & i \\ -i & 1 \end{pmatrix}.$$
We saw that $A$ is Hermitian; its eigenvalues are the roots of $(1 - \lambda)^2 - 1 = 0$, so that
$$\lambda_1 = 0, \quad u_1 = (1, i)/\sqrt{2}$$
$$\lambda_2 = 2, \quad u_2 = (1, -i)/\sqrt{2},$$
which shows that $\langle u_1, u_2 \rangle = 0$. Thus
$$U^* U = \frac{1}{2} \begin{pmatrix} 1 & -i \\ 1 & i \end{pmatrix} \begin{pmatrix} 1 & 1 \\ i & -i \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$
and
$$A = U \begin{pmatrix} 0 & 0 \\ 0 & 2 \end{pmatrix} U^*.$$
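The diagonalization in this example can be verified directly. A short sketch (mine, not the module's) that builds $U$ from the $u_1, u_2$ given above:

```python
import numpy as np

A = np.array([[1, 1j],
              [-1j, 1]])

# Columns are the orthonormal eigenvectors from the example:
# u1 = (1, i)/sqrt(2) and u2 = (1, -i)/sqrt(2).
U = np.array([[1, 1],
              [1j, -1j]]) / np.sqrt(2)
Lam = np.diag([0.0, 2.0])

print(np.allclose(U.conj().T @ U, np.eye(2)))   # U* U = I
print(np.allclose(U @ Lam @ U.conj().T, A))     # A = U Lam U*
```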

We saw from the example
$$A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$$
that the eigenvector system for $A$ can only be written in the form
$$A \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}.$$
This matrix $A$ cannot be diagonalized because we do not have $n$ linearly independent eigenvectors. However, a Hermitian matrix can always be diagonalized because we can find an orthonormal eigenvector basis of $\mathbb{C}^n$ regardless of whether the eigenvalues are distinct or not.

Theorem: If $A$ is Hermitian then $A$ has $n$ orthonormal eigenvectors $\{u_1, \dots, u_n\}$ and $A = U \Lambda U^*$ where $\Lambda = \operatorname{diag}\{\lambda_1, \dots, \lambda_n\}$.

Proof: If all the eigenvalues were distinct then the result follows from the foregoing discussion. Here we shall outline a proof of this result for the case where $A$ is real, so that the setting is $\mathbb{R}^n$ rather than $\mathbb{C}^n$. Define the function
$$f(y) = \langle Ay, y \rangle.$$
Analysis tells us that this function has a minimum on the set $\langle y, y \rangle = 1$. The minimizer can be found with Lagrange multipliers. We have the Lagrangian
$$L(y) = \langle Ay, y \rangle - \lambda (\langle y, y \rangle - 1),$$
and the minimizer $x_1$ is found from the necessary condition $\nabla L(y) = 0$. We compute
$$\frac{\partial L}{\partial y_k} = \langle A_k, y \rangle + \langle Ay, e_k \rangle - 2\lambda y_k = 2(\langle A_k, y \rangle - \lambda y_k),$$
where $A_k$ denotes the $k$th column of $A$ and, by symmetry, $\langle A_k, y \rangle = (Ay)_k$,

so that we have to solve
$$\nabla L = 2(Ay - \lambda y) = 0, \qquad \langle y, y \rangle = 1.$$
The solution to this problem is the eigenvector $x_1$ with eigenvalue $\lambda_1$. If we now apply the Gram-Schmidt method to the set of $n + 1$ dependent vectors $\{x_1, e_1, \dots, e_n\}$ we end up with $n$ orthogonal vectors $\{x_1, y_2, \dots, y_n\}$. We now minimize $f(y) = \langle Ay, y \rangle$ subject to $\|y\| = 1$ over all $y \in \operatorname{span}\{y_2, \dots, y_n\}$ and get the next eigenvalue $\lambda_2$ and eigenvector $x_2$, which then necessarily satisfies $\langle x_1, x_2 \rangle = 0$. Next we find an orthogonal basis of $\mathbb{R}^n$ of the form $\{x_1, x_2, y_3, \dots, y_n\}$ and minimize $f(y)$ over $\operatorname{span}\{y_3, \dots, y_n\}$ subject to the constraint that $\|y\| = 1$. We keep going and eventually find $n$ eigenvectors, all of which are mutually orthogonal. Afterwards all eigenvectors can be normalized so that we have an orthonormal eigenvector basis of $\mathbb{R}^n$, which we denote by $\{u_1, \dots, u_n\}$.

If $A$ is real and symmetric then this choice of eigenvectors satisfies $U^T U = I$. If $A$ is complex then the orthonormal eigenvectors satisfy $U^* U = I$. In either case the eigenvalue equation $AU = U\Lambda$ can be rewritten as
$$A = U \Lambda U^* \quad \text{or equivalently,} \quad U^* A U = \Lambda.$$
We say that $A$ can be diagonalized. We shall have occasion to use this result when we talk about solving first order systems of linear ordinary differential equations.
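The variational argument above can be imitated numerically. The sketch below is mine, not the module's: the function name and the use of `eigh` on the compressed matrix $Q^T A Q$ are implementation choices. It finds the eigenpairs of a real symmetric $A$ one at a time by minimizing $\langle Ay, y \rangle$ over unit vectors orthogonal to the eigenvectors already found.

```python
import numpy as np

def symmetric_eig_by_minimization(A):
    """Find eigenpairs of a real symmetric A one at a time, mimicking
    the proof: each eigenvector minimizes <Ay, y> over unit vectors
    orthogonal to the eigenvectors already found."""
    n = A.shape[0]
    vals, vecs = [], []
    for k in range(n):
        if vecs:
            # Orthonormal basis Q of the complement of span(vecs):
            # the trailing right-singular vectors of the stacked rows.
            _, _, Vt = np.linalg.svd(np.column_stack(vecs).T)
            Q = Vt[k:].T                       # n x (n - k)
        else:
            Q = np.eye(n)
        # Minimizing <Ay, y> over unit y in span(Q) is the smallest
        # eigenpair of the compressed matrix Q^T A Q.
        w, V = np.linalg.eigh(Q.T @ A @ Q)
        vals.append(w[0])
        vecs.append(Q @ V[:, 0])
    return np.array(vals), np.column_stack(vecs)

A = np.array([[2., 1., 0.],
              [1., 2., 0.],
              [0., 0., 5.]])
vals, U = symmetric_eig_by_minimization(A)
print(vals)                                     # [1. 3. 5.]
print(np.allclose(U @ np.diag(vals) @ U.T, A))  # True
```

Each lifted vector $Q V[:, 0]$ really is an eigenvector of $A$ because the orthogonal complement of the eigenvectors found so far is $A$-invariant, exactly as in the proof.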

The diagonalization theorem guarantees that for each eigenvalue one can find an eigenvector which is orthogonal to all the other eigenvectors, regardless of whether the eigenvalue is distinct or not. This means that if an eigenvalue $\mu$ is a repeated root of multiplicity $k$ (meaning that $\det(A - \lambda I) = (\lambda - \mu)^k g(\lambda)$, where $g(\mu) \neq 0$), then $\dim N(A - \mu I) = k$. We simply find an orthogonal basis of this null space. The process is, as usual, the application of Gaussian elimination to $(A - \mu I)x = 0$ and finding $k$ linearly independent vectors in the null space, which can then be made orthogonal with the Gram-Schmidt process.

Some consequences of this theorem: Throughout we assume that $A$ is Hermitian and that $x$ is not the zero vector.

Theorem: $\langle Ax, x \rangle > 0$ for all $x \neq 0$ if and only if all eigenvalues of $A$ are strictly positive.

Proof: Suppose that $\langle Ax, x \rangle > 0$. If $\lambda$ is a non-positive eigenvalue with eigenvector $u$ then $\langle Au, u \rangle = \lambda \langle u, u \rangle \leq 0$, contrary to assumption. Hence the eigenvalues must be positive. Conversely, assume that all eigenvalues are positive. Let $x$ be arbitrary; then the existence of an orthonormal eigenvector basis $\{u_j\}$ assures that
$$x = \alpha_1 u_1 + \dots + \alpha_n u_n.$$
If we substitute this expansion into $\langle Ax, x \rangle$ we obtain
$$\langle Ax, x \rangle = \sum_{j=1}^n \lambda_j \alpha_j \bar{\alpha}_j = \sum_{j=1}^n \lambda_j |\alpha_j|^2 > 0$$
for any non-zero $x$.

Definition: $A$ is positive definite if $\langle Ax, x \rangle > 0$ for $x \neq 0$. $A$ is positive semi-definite if $\langle Ax, x \rangle \geq 0$ for all $x$. $A$ is negative definite if $\langle Ax, x \rangle < 0$ for $x \neq 0$. $A$ is negative semi-definite if $\langle Ax, x \rangle \leq 0$ for all $x$.

Note that for a positive or negative definite matrix $Ax = 0$ if and only if $x = 0$, while a semi-definite matrix may have a zero eigenvalue.
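The eigenvalue test for definiteness translates directly into a few lines of numpy. The helper below is an illustrative sketch (the function name and tolerance are mine), classifying a Hermitian matrix per the definitions above.

```python
import numpy as np

def classify_definiteness(A, tol=1e-12):
    """Classify a Hermitian matrix by the signs of its (real) eigenvalues."""
    lam = np.linalg.eigvalsh(A)           # real eigenvalues, ascending
    if lam[0] > tol:
        return "positive definite"
    if lam[-1] < -tol:
        return "negative definite"
    if lam[0] >= -tol:
        return "positive semi-definite"
    if lam[-1] <= tol:
        return "negative semi-definite"
    return "indefinite"

print(classify_definiteness(np.array([[2., 1.], [1., 2.]])))   # positive definite
print(classify_definiteness(np.array([[1., 0.], [0., -1.]])))  # indefinite
```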

An application: Consider the real quadratic polynomial
$$P(x, y, z) = a_{11} x^2 + a_{12} xy + a_{13} xz + a_{22} y^2 + a_{23} yz + a_{33} z^2 + b_1 x + b_2 y + b_3 z + c.$$
We know from analytic geometry that $P(x, y, z) = 0$ describes an ellipsoid, paraboloid or hyperboloid in $\mathbb{R}^3$. But how do we find out? We observe that $P(x, y, z) = 0$ can be rewritten as
$$P(x, y, z) = \left\langle A \begin{pmatrix} x \\ y \\ z \end{pmatrix}, \begin{pmatrix} x \\ y \\ z \end{pmatrix} \right\rangle + \left\langle b, \begin{pmatrix} x \\ y \\ z \end{pmatrix} \right\rangle + c = 0,$$
where
$$A = \begin{pmatrix} a_{11} & a_{12}/2 & a_{13}/2 \\ a_{12}/2 & a_{22} & a_{23}/2 \\ a_{13}/2 & a_{23}/2 & a_{33} \end{pmatrix}$$
is a symmetric real matrix. Hence $A$ can be diagonalized:
$$A = U \Lambda U^T.$$
Define the new vector (new coordinates)
$$\begin{pmatrix} X \\ Y \\ Z \end{pmatrix} = U^T \begin{pmatrix} x \\ y \\ z \end{pmatrix}.$$
Then
$$P(X, Y, Z) = \left\langle \Lambda \begin{pmatrix} X \\ Y \\ Z \end{pmatrix}, \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} \right\rangle + \left\langle b, U \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} \right\rangle + c = \lambda_1 X^2 + \lambda_2 Y^2 + \lambda_3 Z^2 + \text{lower order terms},$$
where $\{\lambda_j\}$ are the eigenvalues of $A$. If all are positive or all are negative we have an ellipsoid, if all are non-zero but not of the same sign we have a hyperboloid, and if at least one eigenvalue is zero we have a paraboloid. The lower order terms determine where these shapes are centered but not their type.
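This classification is easy to mechanize. Here is an illustrative sketch (the function name, tolerance, and example surface are mine, not the module's) that assembles the symmetric coefficient matrix and inspects its spectrum:

```python
import numpy as np

def classify_quadric(a11, a12, a13, a22, a23, a33, tol=1e-12):
    """Classify a11 x^2 + a12 xy + a13 xz + a22 y^2 + a23 yz + a33 z^2
    + (lower order terms) = 0 by the eigenvalue signs of A."""
    A = np.array([[a11,     a12 / 2, a13 / 2],
                  [a12 / 2, a22,     a23 / 2],
                  [a13 / 2, a23 / 2, a33    ]])
    lam = np.linalg.eigvalsh(A)
    if np.all(lam > tol) or np.all(lam < -tol):
        return lam, "ellipsoid"
    if np.all(np.abs(lam) > tol):
        return lam, "hyperboloid"
    return lam, "paraboloid"

# Example: x^2 + xy + 4y^2 + 4z^2 = 16 has all positive eigenvalues,
# so the surface is an ellipsoid.
print(classify_quadric(1, 1, 0, 4, 0, 4))
```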

Module 11 Homework

1) Suppose that $A$ has the property $A^* = -A$. In this case $A$ is said to be skew-Hermitian.

i) Show that all eigenvalues of $A$ have to be purely imaginary.

ii) Prove or disprove: The eigenvectors corresponding to distinct eigenvalues of a skew-Hermitian matrix are orthogonal with respect to the complex dot product.

iii) If $A$ is real and skew-Hermitian, what does this imply about $A^T$?

2) Let $A$ be a skew-Hermitian matrix.

i) Show that for each $x \in \mathbb{C}^n$ we have $\operatorname{Re}\langle Ax, x \rangle = 0$.

ii) Show that for any matrix $A$ we have $\langle Ax, x \rangle = \langle Bx, x \rangle + \langle Cx, x \rangle$, where $B$ is Hermitian and $C$ is skew-Hermitian.

3) Suppose that $U^* U = I$. What can you say about the eigenvalues of $U$?

4) Find a new coordinate system such that the conic section $x^2 - 2xy + 4y^2 = 6$ is in standard form, so that you can read off whether it is an ellipse, parabola or hyperbola.