
Econ 204 Supplement to Section 3.6: Diagonalization and Quadratic Forms

De La Fuente notes that if an $n \times n$ matrix has $n$ distinct eigenvalues, it can be diagonalized. In this supplement, we will provide an additional, very important diagonalization result: symmetric matrices can always be diagonalized; moreover, the change of basis matrices that carry out the diagonalization have a special form. This has an important application to quadratic forms, which in turn have applications to the geometry of level sets of preferences and to the analysis of variance-covariance matrices.

1 Diagonalization and Change of Basis

Before proceeding with the diagonalization result for symmetric matrices, it is useful to discuss the relationship between diagonalization and change of basis. De La Fuente (page 151) defines a square matrix $M$ to be diagonalizable if there exists an invertible matrix $P$ such that $P^{-1}MP$ is diagonal; he also (page 146) defines two square matrices $A$ and $B$ to be similar if there is an invertible matrix $P$ such that $P^{-1}AP = B$. Thus a square matrix $M$ is diagonalizable if and only if it is similar to a diagonal matrix. Theorem 2 tells us that a matrix is diagonalizable if and only if there is another basis in which the representation of the same transformation is diagonal.

Proposition 1 Fix an $n$-dimensional vector space $X$ and a basis $U = \{u_1, \ldots, u_n\}$. An $n \times n$ matrix $P$ is invertible if and only if there is a basis $W$ such that $P = \mathrm{Mtx}_{U,W}(\mathrm{id})$; in this case, $(\mathrm{Mtx}_{U,W}(\mathrm{id}))^{-1} = \mathrm{Mtx}_{W,U}(\mathrm{id})$.

Proof: Suppose that $P$ is invertible. Let $W = \{w_1, \ldots, w_n\}$, where $w_j = \sum_{i=1}^n p_{ij} u_i$. Since $P$ is invertible, $\mathrm{rank}\, P = n$, so $W$ is a basis and $P = \mathrm{Mtx}_{U,W}(\mathrm{id})$.

Conversely, suppose there is a basis $W$ such that $P = \mathrm{Mtx}_{U,W}(\mathrm{id})$. Then $w_j = \sum_{i=1}^n p_{ij} u_i$. By the Commutative Diagram Theorem (see the Supplement to Section 3.3),

$$\mathrm{Mtx}_{U,W}(\mathrm{id}) \cdot \mathrm{Mtx}_{W,U}(\mathrm{id}) = \mathrm{Mtx}_{U,U}(\mathrm{id} \circ \mathrm{id}) = \mathrm{Mtx}_U(\mathrm{id}) = I$$

so $P$ is invertible and $\mathrm{Mtx}_{W,U}(\mathrm{id}) = (\mathrm{Mtx}_{U,W}(\mathrm{id}))^{-1}$.

Theorem 2 Suppose that $X$ is finite-dimensional.

- If $T \in L(X, X)$ and $U, W$ are any two bases of $X$, then $\mathrm{Mtx}_W(T)$ and $\mathrm{Mtx}_U(T)$ are similar.
- Conversely, given similar matrices $A, B$ with $A = P^{-1}BP$ and any basis $U$, there is a basis $W$ and $T \in L(X, X)$ such that $B = \mathrm{Mtx}_U(T)$, $A = \mathrm{Mtx}_W(T)$, $P = \mathrm{Mtx}_{U,W}(\mathrm{id})$, and $P^{-1} = \mathrm{Mtx}_{W,U}(\mathrm{id})$.

Proof: For the first bullet, note that

$$\mathrm{Mtx}_W(T) = \mathrm{Mtx}_{W,U}(\mathrm{id}) \cdot \mathrm{Mtx}_U(T) \cdot \mathrm{Mtx}_{U,W}(\mathrm{id})$$

by the Commutative Diagram Theorem. By Proposition 1, $\mathrm{Mtx}_{W,U}(\mathrm{id}) = (\mathrm{Mtx}_{U,W}(\mathrm{id}))^{-1}$, so $\mathrm{Mtx}_W(T)$ and $\mathrm{Mtx}_U(T)$ are similar.

For the second bullet, $P$ is invertible, so by Proposition 1 there is a basis $W$ such that $P = \mathrm{Mtx}_{U,W}(\mathrm{id})$ and $P^{-1} = \mathrm{Mtx}_{W,U}(\mathrm{id})$. Let $T \in L(X, X)$ be the linear transformation such that $\mathrm{Mtx}_U(T) = B$. Then

$$A = P^{-1}BP = \mathrm{Mtx}_{W,U}(\mathrm{id}) \cdot \mathrm{Mtx}_U(T) \cdot \mathrm{Mtx}_{U,W}(\mathrm{id}) = \mathrm{Mtx}_W(\mathrm{id} \circ T \circ \mathrm{id}) = \mathrm{Mtx}_W(T)$$

by the Commutative Diagram Theorem.
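Theorem 2 is easy to check numerically. Here is a minimal sketch in Python with NumPy (the matrix is a hypothetical example, not taken from the notes): the columns of $P$ are eigenvectors of $M$ and play the role of the change-of-basis matrix $\mathrm{Mtx}_{U,W}(\mathrm{id})$, and $P^{-1}MP$ is diagonal.

```python
import numpy as np

# Hypothetical example: a matrix with distinct eigenvalues (5 and 2),
# hence diagonalizable by de la Fuente's result.
M = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of P are eigenvectors of M; P is the change-of-basis matrix.
eigenvalues, P = np.linalg.eig(M)
print(eigenvalues)            # [5. 2.]

# P^{-1} M P is diagonal, so M is similar to a diagonal matrix.
D = np.linalg.inv(P) @ M @ P
print(np.round(D, 10))        # diag(5, 2), in the order returned by eig
```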

2 Eigenvalues and Eigenvectors of a Linear Transformation

De la Fuente defines eigenvalue and eigenvector for a matrix. Here, we define an eigenvalue for a linear transformation. For finite-dimensional spaces, we show that $\lambda$ is an eigenvalue of $T$ if and only if $\lambda$ is an eigenvalue for every matrix representation of $T$.

Definition 3 Let $X$ be a vector space and $T \in L(X, X)$. We say that $\lambda$ is an eigenvalue of $T$ and $v \neq 0$ is an eigenvector corresponding to $\lambda$ if $T(v) = \lambda v$.

Theorem 4 Let $X$ be a finite-dimensional vector space, $T \in L(X, X)$, and $U$ any basis. Then

- $\lambda$ is an eigenvalue of $T$ if and only if $\lambda$ is an eigenvalue of $\mathrm{Mtx}_U(T)$;
- $v$ is an eigenvector of $T$ corresponding to $\lambda$ if and only if $\mathrm{crd}_U(v)$ is an eigenvector of $\mathrm{Mtx}_U(T)$ corresponding to $\lambda$.

Proof: By the Commutative Diagram Theorem,

$$T(v) = \lambda v \iff \mathrm{crd}_U(T(v)) = \mathrm{crd}_U(\lambda v) \iff \mathrm{Mtx}_U(T)(\mathrm{crd}_U(v)) = \lambda(\mathrm{crd}_U(v))$$

Theorem 5 (Theorem 6.7) Let $X$ be an $n$-dimensional vector space, $T \in L(X, X)$, $U$ any basis of $X$, and $A = \mathrm{Mtx}_U(T)$. Then the following are equivalent:

- $A$ can be diagonalized;
- there is a basis $W$ for $X$ consisting of eigenvectors of $T$;
- there is a basis $V$ for $\mathbb{R}^n$ consisting of eigenvectors of $A$.

Proof: Use de la Fuente's Theorem 6.7 and Theorem 4 above.
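To illustrate Theorems 4 and 5, the following NumPy sketch (a hypothetical example of mine, not from the notes) represents one transformation in two bases and confirms that both matrix representations, being similar, have the same eigenvalues.

```python
import numpy as np

B = np.array([[2.0, 1.0],
              [0.0, 3.0]])        # Mtx_U(T): eigenvalues 2 and 3

P = np.array([[1.0, 1.0],
              [1.0, 2.0]])        # an invertible change-of-basis matrix
A = np.linalg.inv(P) @ B @ P      # Mtx_W(T), similar to B

# Similar matrices share the same eigenvalues.
print(np.sort(np.linalg.eigvals(B)))   # [2. 3.]
print(np.sort(np.linalg.eigvals(A)))   # [2. 3.]
```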

3 Diagonalization of Symmetric Real Matrices

Definition 6 Let

$$\delta_{ij} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \neq j \end{cases}$$

A basis $V = \{v_1, \ldots, v_n\}$ of $\mathbb{R}^n$ is orthonormal if $v_i \cdot v_j = \delta_{ij}$. In other words, each basis element has unit length, and distinct basis elements are perpendicular.

Example 7 The standard basis of $\mathbb{R}^n$ is orthonormal.

Definition 8 A real $n \times n$ matrix $A$ is unitary if $A' = A^{-1}$ (here, $A'$ denotes the transpose of $A$: $(A')_{ij} = A_{ji}$).

Theorem 9 A real $n \times n$ matrix $A$ is unitary if and only if the columns of $A$ are orthonormal.

Proof: Let $\alpha_j$ denote the $j$-th column of $A$. Then

$$A' = A^{-1} \iff A'A = I \iff \alpha_i \cdot \alpha_j = \delta_{ij} \iff \{\alpha_1, \ldots, \alpha_n\} \text{ is orthonormal}$$

Remark 10 Let $A$ be unitary, let $V$ be the set of columns of $A$, and let $W$ be the standard basis of $\mathbb{R}^n$. Since $A$ is unitary, it is invertible, so $V$ is a basis of $\mathbb{R}^n$ and $A = \mathrm{Mtx}_{V,W}(\mathrm{id})$, where $\mathrm{id}$ is the identity transformation on $\mathbb{R}^n$. Since $V$ is orthonormal, the transformation between the bases $W$ and $V$ preserves all geometry: the lengths of vectors, and the angles between vectors, are not changed by changing from one basis to the other.

Theorem 11 Let $T \in L(\mathbb{R}^n, \mathbb{R}^n)$, $\mathrm{id}$ the identity transformation in $L(\mathbb{R}^n, \mathbb{R}^n)$, and suppose that $\mathrm{Mtx}_W(T)$, the matrix representation of $T$ with respect to the standard basis $W$, is symmetric. Then there is an orthonormal basis $V = \{v_1, \ldots, v_n\}$ of $\mathbb{R}^n$ consisting of eigenvectors of $T$, so that

$$\mathrm{Mtx}_W(T) = \mathrm{Mtx}_{W,V}(\mathrm{id}) \cdot \mathrm{Mtx}_V(T) \cdot \mathrm{Mtx}_{V,W}(\mathrm{id})$$

where $\mathrm{Mtx}_V(T)$ is diagonal and the change of basis matrices $\mathrm{Mtx}_{V,W}(\mathrm{id})$ and $\mathrm{Mtx}_{W,V}(\mathrm{id})$ are unitary.

The proof of the theorem requires a lengthy digression into the linear algebra of complex vector spaces. Here is a very brief outline. Let $M = \mathrm{Mtx}_W(T)$. Since $M$ is a real matrix, its characteristic polynomial $\det(\mathrm{Mtx}_W(T) - \lambda I)$ is an $n$-th degree polynomial with real coefficients. Recall that an $n$-th degree polynomial with coefficients in $\mathbb{R}$ or in $\mathbb{C}$ always has $n$ roots (not necessarily distinct) in $\mathbb{C}$.[1]

The inner product in $\mathbb{C}^n$ is defined as follows:

$$x \cdot y = \sum_{j=1}^n x_j \overline{y_j}$$

where $\overline{c}$ denotes the complex conjugate of any $c \in \mathbb{C}$; note that this implies that $x \cdot y = \overline{y \cdot x}$. The usual inner product in $\mathbb{R}^n$ is the restriction of this inner product on $\mathbb{C}^n$ to $\mathbb{R}^n$. Given any complex matrix $A$, define $A^*$ to be the matrix whose $(i,j)$-th entry is $\overline{a_{ji}}$; in other words, $A^*$ is formed by taking the complex conjugate of each element of the transpose of $A$. It is easy to verify that given $x, y \in \mathbb{C}^n$ and a complex $n \times n$ matrix $A$, $Ax \cdot y = x \cdot A^* y$.

Since $M$ is real and symmetric, $M = M^*$. If $\lambda \in \mathbb{C}$ is an eigenvalue of $M$, with eigenvector $x \in \mathbb{C}^n$, then

$$\lambda \|x\|^2 = \lambda(x \cdot x) = (\lambda x) \cdot x = (Mx) \cdot x = x \cdot (M^* x) = x \cdot (Mx) = x \cdot (\lambda x) = \overline{(\lambda x) \cdot x} = \overline{\lambda}(x \cdot x) = \overline{\lambda}\|x\|^2$$

which proves that $\lambda = \overline{\lambda}$, hence $\lambda \in \mathbb{R}$.

[1] Although we do not need it here, you should also recall that if the coefficients of the polynomial lie in $\mathbb{R}$, the roots occur in conjugate pairs. This fact is important for the solution of differential equations.
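The complex inner product and the conjugate transpose $A^*$ are easy to experiment with numerically. Here is a brief NumPy sketch (my own check, under the definitions above, with randomly generated data): it verifies $Ax \cdot y = x \cdot A^* y$ for complex $A$, $x$, $y$, and that a real symmetric matrix has purely real eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)

def inner(x, y):
    # The C^n inner product: x . y = sum_j x_j * conjugate(y_j).
    return np.sum(x * np.conj(y))

A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
y = rng.standard_normal(3) + 1j * rng.standard_normal(3)

# A* is the conjugate transpose; check Ax . y = x . A*y.
A_star = A.conj().T
print(np.isclose(inner(A @ x, y), inner(x, A_star @ y)))   # True

# Eigenvalues of a real symmetric matrix are real.
M = rng.standard_normal((4, 4))
M = (M + M.T) / 2
print(np.max(np.abs(np.linalg.eigvals(M).imag)))           # 0.0
```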

Notice that if $M$ is real (not necessarily symmetric) and $\lambda \in \mathbb{R}$ is an eigenvalue, then $\det(M - \lambda I) = 0$, so there is $v \in \mathbb{R}^n$, $v \neq 0$, such that $(M - \lambda I)v = 0$. Since the characteristic polynomial has at least one root in $\mathbb{C}$ and, as we just showed, every root is real, $M$ has at least one real eigenvalue, with a corresponding real eigenvector.

The fact that $M$ is real and symmetric implies that, if an eigenvalue has multiplicity $m$, one can find $m$ linearly independent real eigenvectors corresponding to that eigenvalue. Thus, there is a basis of $\mathbb{R}^n$ consisting of eigenvectors, hence $M$ is diagonalizable over $\mathbb{R}$.

To see that eigenvectors corresponding to distinct eigenvalues are orthogonal, suppose that $Mx = \lambda x$ and $My = \rho y$ with $\rho \neq \lambda$. Then

$$\lambda(x \cdot y) = (\lambda x) \cdot y = (Mx) \cdot y = x \cdot (M^* y) = x \cdot (My) = x \cdot (\rho y) = \overline{\rho}(x \cdot y) = \rho(x \cdot y)$$

since $\rho$ is real. Thus $(\lambda - \rho)(x \cdot y) = 0$; since $\lambda - \rho \neq 0$, we must have $x \cdot y = 0$.
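Numerically, the content of Theorem 11 and the orthogonality argument above is captured by NumPy's np.linalg.eigh, which is specialized to symmetric (Hermitian) matrices. A minimal sketch with a hypothetical matrix:

```python
import numpy as np

# A real symmetric matrix (hypothetical example).
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# eigh returns real eigenvalues and an orthonormal basis of eigenvectors
# (the columns of Q), as Theorem 11 guarantees for symmetric matrices.
lam, Q = np.linalg.eigh(M)

print(np.allclose(Q.T @ Q, np.eye(3)))          # True: Q' = Q^{-1}, Q is unitary
print(np.allclose(Q @ np.diag(lam) @ Q.T, M))   # True: M = Q D Q'
```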

4 Application to Quadratic Forms

In the remainder of this note, we use Theorem 11 to study quadratic forms. Consider a quadratic form

$$f(x_1, \ldots, x_n) = \sum_{i=1}^n \alpha_{ii} x_i^2 + \sum_{i<j} \beta_{ij} x_i x_j \qquad (1)$$

Let

$$\alpha_{ij} = \begin{cases} \frac{\beta_{ij}}{2} & \text{if } i < j \\ \frac{\beta_{ji}}{2} & \text{if } i > j \end{cases}$$

and let $A = (\alpha_{ij})$. Then $f(x) = x'Ax$. Since $A$ is symmetric, $\mathbb{R}^n$ has an orthonormal basis $V = \{v_1, \ldots, v_n\}$ consisting of eigenvectors of $A$; let $\lambda_1, \ldots, \lambda_n$ denote the corresponding eigenvalues. Thus $A = U'DU$, where $D$ is diagonal (with diagonal elements $\lambda_1, \ldots, \lambda_n$) and $U = \mathrm{Mtx}_{V,W}(\mathrm{id})$ is unitary. The columns of $U'$ (which, of course, are the rows of $U$) are the coordinates of $v_1, \ldots, v_n$, expressed in terms of the standard basis $W$. Writing $x = \sum_i \gamma_i v_i$ and noting that $U v_i = e_i$, the $i$-th standard basis vector, we compute

$$f\left(\sum_i \gamma_i v_i\right) = \left(\sum_i \gamma_i v_i\right)' A \left(\sum_i \gamma_i v_i\right) = \left(\sum_i \gamma_i v_i\right)' U'DU \left(\sum_i \gamma_i v_i\right) = \left(\sum_i \gamma_i U v_i\right)' D \left(\sum_i \gamma_i U v_i\right) = (\gamma_1, \ldots, \gamma_n)\, D \begin{pmatrix} \gamma_1 \\ \vdots \\ \gamma_n \end{pmatrix} = \sum_i \lambda_i \gamma_i^2$$

This proves the following corollary of Theorem 11.

Corollary 12 Consider the quadratic form (1).

1. $f$ has a global minimum at $0$ if and only if $\lambda_i \geq 0$ for all $i$; the level sets of $f$ are ellipsoids with principal axes aligned with the orthonormal eigenvectors $v_1, \ldots, v_n$.
2. $f$ has a global maximum at $0$ if and only if $\lambda_i \leq 0$ for all $i$; the level sets of $f$ are ellipsoids with principal axes aligned with the orthonormal eigenvectors $v_1, \ldots, v_n$.
3. If $\lambda_i < 0$ for some $i$ and $\lambda_j > 0$ for some $j$, then $f$ has a saddle point at $0$; the level sets of $f$ are hyperboloids with principal axes aligned with the orthonormal eigenvectors $v_1, \ldots, v_n$.
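To close, here is a short NumPy sketch of Corollary 12 (an illustrative example of mine, not from the notes). For $f(x_1, x_2) = 3x_1^2 + 3x_2^2 + 2x_1x_2$, i.e. (1) with $\alpha_{11} = \alpha_{22} = 3$ and $\beta_{12} = 2$, the off-diagonal entries of $A$ are $\beta_{12}/2 = 1$; both eigenvalues are positive, so $f$ has a global minimum at $0$ and elliptical level sets.

```python
import numpy as np

# Symmetric matrix of the form (1): alpha_11 = alpha_22 = 3, beta_12 = 2.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

lam, V = np.linalg.eigh(A)      # lam = [2. 4.]: all positive, global min at 0

def f(x):
    return x @ A @ x            # f(x) = x' A x

# In eigenvector coordinates gamma = V' x, f collapses to sum_i lambda_i gamma_i^2.
x = np.array([1.0, 2.0])
gamma = V.T @ x
print(np.isclose(f(x), np.sum(lam * gamma**2)))   # True
```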