22m:033 Notes: 7.1 Diagonalization of Symmetric Matrices


Dennis Roseman
University of Iowa, Iowa City, IA
http://www.math.uiowa.edu/~roseman
May 3

Symmetric matrices

Definition 1.1 A symmetric matrix is a matrix A such that A = A^T. In other words, a symmetric matrix is a square matrix A such that a_ij = a_ji for all i and j.

Example.

[ 1  3  4 ]
[ 3  5  7 ]
[ 4  7  2 ]

is symmetric, but

[ 1  3  4 ]
[ 5  3  7 ]
[ 4  5  7 ]

is not.
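As a quick numerical companion to the definition, here is a sketch (NumPy assumed; the helper name is ours, not the text's) that tests a matrix against the condition A = A^T:

```python
import numpy as np

def is_symmetric(M):
    """True exactly when M is square and equals its own transpose."""
    M = np.asarray(M)
    return M.ndim == 2 and M.shape[0] == M.shape[1] and np.array_equal(M, M.T)

# A symmetric matrix and a non-symmetric one, in the spirit of the example:
A = np.array([[1, 3, 4],
              [3, 5, 7],
              [4, 7, 2]])
B = np.array([[1, 3, 4],
              [5, 3, 7],
              [4, 5, 7]])

print(is_symmetric(A))  # True
print(is_symmetric(B))  # False
```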

Diagonalization of Symmetric Matrices

We will see that any symmetric matrix is diagonalizable. This is surprising enough, but we will also see that in fact a symmetric matrix is similar to a diagonal matrix in a very special way.

Recall that, by our definition, a matrix A is diagonalizable if and only if there is an invertible matrix P such that A = P D P^(-1) where D is a diagonal matrix. We make a stronger definition.

Definition 2.1 A matrix A is orthogonally diagonalizable if and only if there is an orthogonal matrix P such that A = P D P^(-1) where D is a diagonal matrix.

Remark 2.2 Recall that any orthogonal matrix A is invertible and also that A^(-1) = A^T. Thus we can say that a matrix A is orthogonally diagonalizable if there is an orthogonal matrix P such that A = P D P^T where D is a diagonal matrix.

Remark 2.3 Recall (see page 5) the formula for the transpose of a product: (M N)^T = N^T M^T. Using this we can

see that any orthogonally diagonalizable A must be symmetric. This is because

A^T = (P D P^T)^T = (P^T)^T D^T P^T = P D P^T = A,

since D is diagonal and therefore D^T = D.

Although we do not prove the Spectral Theorem below, the following proposition, which is used in its proof, will help us find the matrix P and is interesting in its own right:

Proposition 2.4 If A is symmetric then any two eigenvectors from different eigenspaces are orthogonal.

For example, if all of the eigenspaces are one dimensional, then a set formed by choosing one eigenvector from each eigenspace is an orthogonal set.
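The computation above can be spot-checked numerically. The sketch below (NumPy assumed) builds a random orthogonal P via a QR factorization and confirms that P D P^T comes out symmetric:

```python
import numpy as np

rng = np.random.default_rng(42)

# Any matrix of the form P D P^T with P orthogonal and D diagonal is
# symmetric, since (P D P^T)^T = (P^T)^T D^T P^T = P D P^T.
P, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # QR yields an orthogonal P
D = np.diag([5.0, 2.0, -1.0, 3.0])                # arbitrary diagonal entries

A = P @ D @ P.T
print(np.allclose(A, A.T))  # True
```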

Example.5 Suppose A = 3 We calculate the characteristic polynomial by expanding the determinant along top row (or left column): 3 λ A = λ λ = = (3 λ) (( λ)( λ) 4) (( λ)) + 4 = (3 λ)( 3λ + λ 4) ( λ) = (3 λ)(λ 3λ ) 4 + 4λ = 3λ 9λ 6 λ 3 + 3λ + λ 4 + 4λ = λ 3 + 6λ 3λ 5

The eigenvalues are the roots of

λ^3 - 6λ^2 + 3λ + 10 = 0

We could use the usual test for rational roots, or just stare at this and note that λ = -1 is a root. We then do long division of polynomials and see that:

(λ^3 - 6λ^2 + 3λ + 10) / (λ + 1) = λ^2 - 7λ + 10 = (λ - 5)(λ - 2)

So we have three distinct eigenvalues, 5, 2 and -1, and we know the matrix is diagonalizable. Next we find eigenvectors for these values. We recall that for each eigenvalue these vectors are not unique: here they are all the nonzero multiples of one another. To make a long story short, here are three such vectors:

[ 2 ]    [  2 ]    [  1 ]
[ 2 ],   [ -1 ],   [ -2 ]
[ 1 ]    [ -2 ]    [  2 ]

First we should at least verify this. We note that

[ 3  2  0 ] [ 2 ]   [ 10 ]     [ 2 ]
[ 2  2  2 ] [ 2 ] = [ 10 ] = 5 [ 2 ]
[ 0  2  1 ] [ 1 ]   [  5 ]     [ 1 ]

so the first vector is an eigenvector with eigenvalue 5. Next,

[ 3  2  0 ] [  2 ]   [  4 ]     [  2 ]
[ 2  2  2 ] [ -1 ] = [ -2 ] = 2 [ -1 ]
[ 0  2  1 ] [ -2 ]   [ -4 ]     [ -2 ]

so the second vector is an eigenvector with eigenvalue 2, and finally

[ 3  2  0 ] [  1 ]   [ -1 ]        [  1 ]
[ 2  2  2 ] [ -2 ] = [  2 ] = (-1) [ -2 ]
[ 0  2  1 ] [  2 ]   [ -2 ]        [  2 ]

so the third vector is an eigenvector with eigenvalue -1.

At this point we should check our calculations by directly verifying that this is an orthogonal set of vectors. But now, to get our orthogonal matrix, we need an orthonormal basis. It is clear that each of our vectors has length sqrt(2^2 + 2^2 + 1^2) = 3.
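The hand checks above can be repeated in a few lines (NumPy assumed; the matrix and vectors are those of this worked example as we read them):

```python
import numpy as np

A = np.array([[3, 2, 0],
              [2, 2, 2],
              [0, 2, 1]])

# eigenvalue / eigenvector pairs claimed above
pairs = [(5, np.array([2, 2, 1])),
         (2, np.array([2, -1, -2])),
         (-1, np.array([1, -2, 2]))]

for lam, v in pairs:
    assert np.array_equal(A @ v, lam * v)   # A v = lambda v

# roots of the characteristic polynomial lambda^3 - 6 lambda^2 + 3 lambda + 10
print(sorted(np.roots([1, -6, 3, 10]).real))  # approximately [-1.0, 2.0, 5.0]

# the eigenvectors are mutually orthogonal, as expected for a symmetric
# matrix, and each has length 3
vs = [v for _, v in pairs]
print([vs[i] @ vs[j] for i in range(3) for j in range(i + 1, 3)])  # [0, 0, 0]
print([int(np.linalg.norm(v)) for v in vs])                        # [3, 3, 3]
```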

So we construct our matrix

P = [ 2/3   2/3   1/3 ]
    [ 2/3  -1/3  -2/3 ]
    [ 1/3  -2/3   2/3 ]

How would we check that this matrix is orthogonal? One way is to calculate P^(-1): make the augmented matrix [P | I], row reduce, and so on, then compare the result with P^T. However, if we only want to verify orthogonality and not do all that calculation, we need only check that P P^T = I, which is a lot easier. We would expect that

P^T A P = [ 5  0  0 ]
          [ 0  2  0 ]
          [ 0  0 -1 ]

Why? We leave it as an exercise to check this.

Example 2.6 In general, if the characteristic polynomial has a root λ of multiplicity greater than one, we do not know in advance whether we can find that many linearly independent eigenvectors for the corresponding eigenspace. But for symmetric matrices, we always can. In fact, we can sum all this up in what the text calls the Spectral Theorem:

Proposition.7 (The Spectral Theorem) An n n symmetric matrix has the following properties:. A has n real eigenvalues if we count multiplicity. For each eigenvalue the dimension of the corresponding eigenspace is equal to the algebraic multiplicity of that eigenvalue. 3. The eigenspaces are mutually orthogonal 4. A is orthogonally diagonalizable The next example will need our skills at the Gram Schmidt process. Consider A =. The characteristic polynomial is (λ ) (λ 4) There is one eigenvector with eigenvalue 4 and we can verify that we could use. 9

Even though the Spectral Theorem tells us we can diagonalize this matrix, to get the matrix P we will have to find an orthogonal basis for the eigenspace for λ = 1. We are looking for an orthonormal basis for the null space of

A - (1)I = [ 1  1  1 ]
           [ 1  1  1 ]
           [ 1  1  1 ]

This matrix clearly row reduces to

[ 1  1  1 ]
[ 0  0  0 ]
[ 0  0  0 ]

We have two free variables. If we let z = t and y = s, then x + s + t = 0, and the null space consists of the vectors

[ x ]   [ -s - t ]     [ -1 ]     [ -1 ]
[ y ] = [    s   ] = s [  1 ] + t [  0 ]
[ z ]   [    t   ]     [  0 ]     [  1 ]

So

[ -1 ]    [ -1 ]
[  1 ],   [  0 ]
[  0 ]    [  1 ]

is a basis for this eigenspace. Denote these vectors by b1 and b2. However, we need an orthonormal basis.
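The orthonormalization we are about to carry out by hand is the Gram-Schmidt process. A generic sketch (NumPy assumed; gram_schmidt is our own illustrative helper, not from the text), applied to the basis b1, b2 just found:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        w = v - sum((v @ u) * u for u in basis)  # remove components along earlier vectors
        basis.append(w / np.linalg.norm(w))
    return basis

b1 = np.array([-1.0, 1.0, 0.0])
b2 = np.array([-1.0, 0.0, 1.0])
u1, u2 = gram_schmidt([b1, b2])

print(abs(u1 @ u2) < 1e-12)             # True: u1 and u2 are orthogonal
print(abs(np.linalg.norm(u1) - 1) < 1e-12,
      abs(np.linalg.norm(u2) - 1) < 1e-12)  # True True: both unit vectors
```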

We let

v1 = b1 = [ -1 ]
          [  1 ]
          [  0 ]

Next we need

v2 = b2 - ((b2 · v1) / (v1 · v1)) v1

Now

b2 · v1 = 1   and   v1 · v1 = 2

so

v2 = [ -1 ]         [ -1 ]   [ -1/2 ]
     [  0 ] - (1/2) [  1 ] = [ -1/2 ]
     [  1 ]         [  0 ]   [   1  ]

For hand calculation we could make life easier by choosing instead

v2 = [ -1 ]
     [ -1 ]
     [  2 ]

Explain. So it should be clear that our orthogonal matrix could

be taken to be

P = [ -1/sqrt(2)   -1/sqrt(6)   1/sqrt(3) ]
    [  1/sqrt(2)   -1/sqrt(6)   1/sqrt(3) ]
    [   0           2/sqrt(6)   1/sqrt(3) ]
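Finally, a sketch (NumPy assumed) confirming that this P is orthogonal and that P^T A P is diagonal with the eigenvalues on the diagonal:

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

# columns: the two Gram-Schmidt vectors for eigenvalue 1, then (1,1,1) for 4
P = np.column_stack([np.array([-1.0, 1.0, 0.0]) / np.sqrt(2),
                     np.array([-1.0, -1.0, 2.0]) / np.sqrt(6),
                     np.array([1.0, 1.0, 1.0]) / np.sqrt(3)])

print(np.allclose(P @ P.T, np.eye(3)))                     # True
print(np.allclose(P.T @ A @ P, np.diag([1.0, 1.0, 4.0])))  # True
```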