
Chapter 6: Eigenvalues

Josef Leydold, Mathematical Methods, WS 2018/19

Closed Leontief Model

In a closed Leontief input-output model, consumption and production coincide, i.e.,

$$V\,x = x = 1 \cdot x$$

Is this possible for the given technology matrix $V$? This is a special case of a so-called eigenvalue problem.

Eigenvalue and Eigenvector

A vector $x \in \mathbb{R}^n$, $x \neq 0$, is called an eigenvector of an $n \times n$ matrix $A$ corresponding to the eigenvalue $\lambda \in \mathbb{R}$ if

$$A\,x = \lambda\,x$$

The eigenvalues of matrix $A$ are all numbers $\lambda$ for which an eigenvector exists.

Example: Eigenvalue and Eigenvector

For a $3 \times 3$ diagonal matrix we find

$$A\,e_1 = \begin{pmatrix} a_{11} & 0 & 0 \\ 0 & a_{22} & 0 \\ 0 & 0 & a_{33} \end{pmatrix} \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} = \begin{pmatrix} a_{11} \\ 0 \\ 0 \end{pmatrix} = a_{11}\,e_1$$

Thus $e_1$ is an eigenvector corresponding to the eigenvalue $\lambda = a_{11}$. Analogously we find for an $n \times n$ diagonal matrix

$$A\,e_i = a_{ii}\,e_i$$

So the eigenvalues of a diagonal matrix are its diagonal elements, with the unit vectors $e_i$ as the corresponding eigenvectors.

Computation of Eigenvalues

In order to find the eigenvectors of an $n \times n$ matrix $A$ we have to solve the equation

$$A\,x = \lambda\,x = \lambda I\,x \quad\Longleftrightarrow\quad (A - \lambda I)\,x = 0.$$

If $(A - \lambda I)$ is invertible, then we get $x = (A - \lambda I)^{-1}\,0 = 0$. However, $x = 0$ cannot be an eigenvector (by definition) and thus $\lambda$ cannot be an eigenvalue. Thus $\lambda$ is an eigenvalue of $A$ if and only if $(A - \lambda I)$ is not invertible, i.e., if and only if

$$\det(A - \lambda I) = 0$$

Example: Eigenvalues

Compute the eigenvalues of matrix $A = \begin{pmatrix} 1 & 2 \\ -1 & 4 \end{pmatrix}$.

We have to find all $\lambda \in \mathbb{R}$ where $\det(A - \lambda I)$ vanishes:

$$\det(A - \lambda I) = \begin{vmatrix} 1-\lambda & 2 \\ -1 & 4-\lambda \end{vmatrix} = \lambda^2 - 5\lambda + 6 = 0$$

The roots of this quadratic equation are $\lambda_1 = 2$ and $\lambda_2 = 3$. Thus matrix $A$ has eigenvalues 2 and 3.
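A quick numerical cross-check of this example, as a minimal NumPy sketch (note that np.linalg.eigvals returns the roots in no guaranteed order):

```python
import numpy as np

A = np.array([[ 1.0, 2.0],
              [-1.0, 4.0]])

# Eigenvalues are the roots of det(A - lambda*I) = lambda^2 - 5*lambda + 6.
print(np.linalg.eigvals(A))  # -> [2. 3.] (in some order)
```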

Characteristic Polynomial

For an $n \times n$ matrix $A$, $\det(A - \lambda I)$ is a polynomial of degree $n$ in $\lambda$. It is called the characteristic polynomial of matrix $A$. The eigenvalues are then the roots of the characteristic polynomial. For that reason eigenvalues and eigenvectors are sometimes called the characteristic roots and characteristic vectors, respectively, of $A$.

The set of all eigenvalues of $A$ is called the spectrum of $A$. Spectral methods make use of eigenvalues.

Remark: It may happen that characteristic roots are complex ($\lambda \in \mathbb{C}$). These are then called complex eigenvalues.

Computation of Eigenvectors

Eigenvectors $x$ corresponding to a known eigenvalue $\lambda_0$ can be computed by solving the equation $(A - \lambda_0 I)\,x = 0$.

Eigenvectors of $A = \begin{pmatrix} 1 & 2 \\ -1 & 4 \end{pmatrix}$ corresponding to $\lambda_1 = 2$:

$$(A - \lambda_1 I)\,x = \begin{pmatrix} -1 & 2 \\ -1 & 2 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$$

Gaussian elimination yields $x_2 = \alpha$ and $x_1 = 2\alpha$ for an $\alpha \in \mathbb{R} \setminus \{0\}$, i.e.,

$$v_1 = \alpha \begin{pmatrix} 2 \\ 1 \end{pmatrix}$$

Eigenspace

If $x$ is an eigenvector corresponding to eigenvalue $\lambda$, then each multiple $\alpha x$ is an eigenvector, too:

$$A\,(\alpha x) = \alpha\,(A\,x) = \alpha\,\lambda\,x = \lambda\,(\alpha x)$$

If $x$ and $y$ are eigenvectors corresponding to the same eigenvalue $\lambda$, then $x + y$ is an eigenvector, too:

$$A\,(x + y) = A\,x + A\,y = \lambda\,x + \lambda\,y = \lambda\,(x + y)$$

The set of all eigenvectors corresponding to eigenvalue $\lambda$ (including the zero vector $0$) is thus a subspace of $\mathbb{R}^n$ and is called the eigenspace corresponding to $\lambda$.

Computer programs always return bases of eigenspaces. (Beware: bases are not uniquely determined!)

Example: Eigenspace

Let $A = \begin{pmatrix} 1 & 2 \\ -1 & 4 \end{pmatrix}$.

Eigenvector corresponding to eigenvalue $\lambda_1 = 2$: $v_1 = \begin{pmatrix} 2 \\ 1 \end{pmatrix}$

Eigenvector corresponding to eigenvalue $\lambda_2 = 3$: $v_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$

The eigenvectors corresponding to eigenvalue $\lambda_i$ are all non-vanishing multiples $\alpha v_i$ (i.e., $\alpha \neq 0$).

Computer programs return normalized eigenvectors:

$$v_1 = \begin{pmatrix} 2/\sqrt{5} \\ 1/\sqrt{5} \end{pmatrix} \quad\text{and}\quad v_2 = \begin{pmatrix} 1/\sqrt{2} \\ 1/\sqrt{2} \end{pmatrix}$$
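This is what np.linalg.eig reports for the example above; a minimal sketch (keep in mind that the sign of each returned column is arbitrary, since $-v$ spans the same eigenspace as $v$):

```python
import numpy as np

A = np.array([[ 1.0, 2.0],
              [-1.0, 4.0]])

# eig returns the eigenvalues and unit-length eigenvectors (columns of V).
lam, V = np.linalg.eig(A)
print(lam)  # -> [2. 3.]
print(V)    # columns ~ (2/sqrt(5), 1/sqrt(5)) and (1/sqrt(2), 1/sqrt(2)), up to sign
```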

Example

Eigenvalues and eigenvectors of $A = \begin{pmatrix} 2 & 0 & 1 \\ 0 & 3 & 1 \\ 0 & 6 & 2 \end{pmatrix}$.

Create the characteristic polynomial and compute its roots:

$$\det(A - \lambda I) = \begin{vmatrix} 2-\lambda & 0 & 1 \\ 0 & 3-\lambda & 1 \\ 0 & 6 & 2-\lambda \end{vmatrix} = (2-\lambda)\,\lambda\,(\lambda - 5) = 0$$

Eigenvalues: $\lambda_1 = 2$, $\lambda_2 = 0$, and $\lambda_3 = 5$.

Example

Eigenvector(s) corresponding to eigenvalue $\lambda_3 = 5$, obtained by solving the equation

$$(A - \lambda_3 I)\,x = \begin{pmatrix} 2-5 & 0 & 1 \\ 0 & 3-5 & 1 \\ 0 & 6 & 2-5 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = 0$$

Gaussian elimination yields

$$\begin{pmatrix} -3 & 0 & 1 & 0 \\ 0 & -2 & 1 & 0 \\ 0 & 6 & -3 & 0 \end{pmatrix} \to \begin{pmatrix} -3 & 0 & 1 & 0 \\ 0 & -2 & 1 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}$$

Thus $x_3 = \alpha$, $x_2 = \frac{1}{2}\alpha$, and $x_1 = \frac{1}{3}\alpha$ for arbitrary $\alpha \in \mathbb{R} \setminus \{0\}$. Choosing $\alpha = 6$ gives the eigenvector $v_3 = (2, 3, 6)^t$.

Example

Eigenvectors corresponding to

$$\lambda_1 = 2:\ v_1 = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \qquad \lambda_2 = 0:\ v_2 = \begin{pmatrix} 3 \\ 2 \\ -6 \end{pmatrix}, \qquad \lambda_3 = 5:\ v_3 = \begin{pmatrix} 2 \\ 3 \\ 6 \end{pmatrix}$$

The eigenvectors corresponding to eigenvalue $\lambda_i$ are all non-vanishing multiples of $v_i$.

Properties of Eigenvalues

1. $A$ and $A^t$ have the same eigenvalues.
2. Let $A$ and $B$ be $n \times n$ matrices. Then $A\,B$ and $B\,A$ have the same eigenvalues.
3. If $x$ is an eigenvector of $A$ corresponding to $\lambda$, then $x$ is an eigenvector of $A^k$ corresponding to eigenvalue $\lambda^k$.
4. If $x$ is an eigenvector of a regular matrix $A$ corresponding to $\lambda$, then $x$ is an eigenvector of $A^{-1}$ corresponding to eigenvalue $\frac{1}{\lambda}$.

Properties of Eigenvalues

5. The product of all eigenvalues $\lambda_i$ of an $n \times n$ matrix $A$ is equal to the determinant of $A$:

$$\det(A) = \prod_{i=1}^n \lambda_i$$

This implies: $A$ is regular if and only if all its eigenvalues are non-zero.

6. The sum of all eigenvalues $\lambda_i$ of an $n \times n$ matrix $A$ is equal to the sum of the diagonal elements of $A$ (called the trace of $A$):

$$\operatorname{tr}(A) = \sum_{i=1}^n a_{ii} = \sum_{i=1}^n \lambda_i$$
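Properties 5 and 6 are easy to check numerically; a minimal NumPy sketch reusing the $3 \times 3$ matrix from the previous example:

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 1.0],
              [0.0, 6.0, 2.0]])

lam = np.linalg.eigvals(A)             # eigenvalues 2, 0, 5 (in some order)
print(np.prod(lam), np.linalg.det(A))  # product = determinant = 0 here
print(np.sum(lam), np.trace(A))        # sum = trace = 7
```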

Eigenvalues of Similar Matrices

Let $U$ be a transformation matrix and $C = U^{-1} A\,U$. If $x$ is an eigenvector of $A$ corresponding to eigenvalue $\lambda$, then $U^{-1} x$ is an eigenvector of $C$ corresponding to $\lambda$:

$$C\,(U^{-1} x) = (U^{-1} A\,U)\,U^{-1} x = U^{-1} A\,x = U^{-1} \lambda\,x = \lambda\,(U^{-1} x)$$

Similar matrices $A$ and $C$ have the same eigenvalues and (up to the change of basis) the same eigenvectors.

We want to choose a basis such that the matrix that represents a given linear map becomes as simple as possible. The simplest matrices are diagonal matrices. Can we always find a representation by means of a diagonal matrix? Unfortunately not in the general case. But...

Symmetric Matrix

An $n \times n$ matrix $A$ is called symmetric if $A^t = A$. For a symmetric matrix $A$ we find:

- All $n$ eigenvalues are real.
- Eigenvectors $u_i$ corresponding to distinct eigenvalues $\lambda_i$ are orthogonal (i.e., $u_i^t u_j = 0$ if $i \neq j$).
- There exists an orthonormal basis $\{u_1, \dots, u_n\}$ (i.e., the vectors $u_i$ are normalized and mutually orthogonal) that consists of eigenvectors of $A$.

The transformation matrix $U = (u_1, \dots, u_n)$ is then an orthogonal matrix:

$$U^t U = I \quad\Longleftrightarrow\quad U^{-1} = U^t$$

Diagonalization

For the $i$-th unit vector $e_i$ we find

$$U^t A\,U\,e_i = U^t A\,u_i = U^t \lambda_i u_i = \lambda_i U^t u_i = \lambda_i e_i$$

and thus

$$U^t A\,U = D = \begin{pmatrix} \lambda_1 & 0 & \dots & 0 \\ 0 & \lambda_2 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & \lambda_n \end{pmatrix}$$

Every symmetric matrix $A$ becomes a diagonal matrix with the eigenvalues of $A$ as its entries if we use the orthonormal basis of eigenvectors. This procedure is called the diagonalization of matrix $A$.

Example: Diagonalization

We want to diagonalize $A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}$.

Its eigenvalues are $\lambda_1 = -1$ and $\lambda_2 = 3$ with respective normalized eigenvectors

$$u_1 = \begin{pmatrix} 1/\sqrt{2} \\ -1/\sqrt{2} \end{pmatrix} \quad\text{and}\quad u_2 = \begin{pmatrix} 1/\sqrt{2} \\ 1/\sqrt{2} \end{pmatrix}$$

With respect to the basis $\{u_1, u_2\}$, matrix $A$ becomes the diagonal matrix $\begin{pmatrix} -1 & 0 \\ 0 & 3 \end{pmatrix}$.
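A minimal NumPy sketch of this diagonalization (np.linalg.eigh is the routine for symmetric matrices; it returns the eigenvalues in ascending order and the orthonormal eigenvectors as columns of U):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

lam, U = np.linalg.eigh(A)
print(lam)                        # -> [-1.  3.]
print(np.round(U.T @ A @ U, 10))  # -> diag(-1, 3), i.e. U^t A U = D
print(np.round(U.T @ U, 10))      # -> identity matrix (U is orthogonal)
```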

A Geometric Interpretation I

The function $x \mapsto A\,x$ with $A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}$ maps the unit circle in $\mathbb{R}^2$ onto an ellipse. The two semi-axes of the ellipse are given by $\lambda_1 v_1$ and $\lambda_2 v_2$, respectively.

(Figure: the unit circle with eigenvectors $v_1$ and $v_2$ is mapped by $A$ onto an ellipse with semi-axes $v_1$ and $3 v_2$.)

Quadratic Form

Let $A$ be a symmetric matrix. Then the function

$$q_A : \mathbb{R}^n \to \mathbb{R}, \quad x \mapsto q_A(x) = x^t A\,x$$

is called a quadratic form.

Let $A = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 3 \end{pmatrix}$. Then

$$q_A(x) = \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix}^t \begin{pmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 3 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = x_1^2 + 2 x_2^2 + 3 x_3^2$$

Example: Quadratic Form

In general we find for an $n \times n$ matrix $A = (a_{ij})$:

$$q_A(x) = \sum_{i=1}^n \sum_{j=1}^n a_{ij}\, x_i x_j$$

For example,

$$q_A(x) = \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix}^t \begin{pmatrix} 1 & 1 & -2 \\ 1 & 2 & 3 \\ -2 & 3 & 1 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix}^t \begin{pmatrix} x_1 + x_2 - 2x_3 \\ x_1 + 2x_2 + 3x_3 \\ -2x_1 + 3x_2 + x_3 \end{pmatrix} = x_1^2 + 2x_1 x_2 - 4x_1 x_3 + 2x_2^2 + 6x_2 x_3 + x_3^2$$
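A minimal sketch for evaluating a quadratic form numerically (the test vector x is an arbitrary choice for illustration):

```python
import numpy as np

A = np.array([[ 1.0, 1.0, -2.0],
              [ 1.0, 2.0,  3.0],
              [-2.0, 3.0,  1.0]])

def q(A, x):
    """Quadratic form q_A(x) = x^t A x."""
    return x @ A @ x

x = np.array([1.0, -1.0, 2.0])
print(q(A, x))  # -> -15.0
# Compare with the expanded polynomial above:
x1, x2, x3 = x
print(x1**2 + 2*x1*x2 - 4*x1*x3 + 2*x2**2 + 6*x2*x3 + x3**2)  # -> -15.0
```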

Definiteness

A quadratic form $q_A$ is called

- positive definite, if $q_A(x) > 0$ for all $x \neq 0$,
- positive semidefinite, if $q_A(x) \geq 0$ for all $x$,
- negative definite, if $q_A(x) < 0$ for all $x \neq 0$,
- negative semidefinite, if $q_A(x) \leq 0$ for all $x$,
- indefinite in all other cases.

A matrix $A$ is called positive (negative) definite (semidefinite) if the corresponding quadratic form is positive (negative) definite (semidefinite).

Definiteness

Every symmetric matrix is diagonalizable. Let $\{u_1, \dots, u_n\}$ be the orthonormal basis of eigenvectors of $A$. Then for every $x$:

$$x = \sum_{i=1}^n c_i(x)\, u_i = U\,c(x)$$

where $U = (u_1, \dots, u_n)$ is the transformation matrix for the orthonormal basis and $c$ the corresponding coefficient vector. So if $D$ is the diagonal matrix of eigenvalues $\lambda_i$ of $A$, we find

$$q_A(x) = x^t A\,x = (U c)^t A\,U c = c^t U^t A\,U\,c = c^t D\,c$$

and thus

$$q_A(x) = \sum_{i=1}^n c_i^2\, \lambda_i$$

Definiteness and Eigenvalues

The equation $q_A(x) = \sum_{i=1}^n c_i^2 \lambda_i$ immediately implies: Let $\lambda_i$ be the eigenvalues of a symmetric matrix $A$. Then $A$ (the quadratic form $q_A$) is

- positive definite, if all $\lambda_i > 0$,
- positive semidefinite, if all $\lambda_i \geq 0$,
- negative definite, if all $\lambda_i < 0$,
- negative semidefinite, if all $\lambda_i \leq 0$,
- indefinite, if there exist $\lambda_i > 0$ and $\lambda_j < 0$.

Example: Definiteness and Eigenvalues

The eigenvalues of $\begin{pmatrix} 2 & 2 \\ 2 & 5 \end{pmatrix}$ are $\lambda_1 = 6$ and $\lambda_2 = 1$. Thus the matrix is positive definite.

The eigenvalues of $\begin{pmatrix} 5 & -1 & 4 \\ -1 & 2 & 1 \\ 4 & 1 & 5 \end{pmatrix}$ are $\lambda_1 = 0$, $\lambda_2 = 3$, and $\lambda_3 = 9$. The matrix is positive semidefinite.

The eigenvalues of $\begin{pmatrix} 7 & -5 & 4 \\ -5 & 7 & 4 \\ 4 & 4 & -2 \end{pmatrix}$ are $\lambda_1 = -6$, $\lambda_2 = 6$, and $\lambda_3 = 12$. Thus the matrix is indefinite.
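The eigenvalue criterion translates directly into a small helper; a minimal NumPy sketch (the function name and tolerance are illustrative choices, not a library API):

```python
import numpy as np

def definiteness(A, tol=1e-10):
    """Classify a symmetric matrix via the signs of its eigenvalues."""
    lam = np.linalg.eigvalsh(A)  # eigvalsh: real eigenvalues of a symmetric matrix
    if np.all(lam > tol):
        return "positive definite"
    if np.all(lam < -tol):
        return "negative definite"
    if np.all(lam >= -tol):
        return "positive semidefinite"
    if np.all(lam <= tol):
        return "negative semidefinite"
    return "indefinite"

print(definiteness(np.array([[2.0, 2.0], [2.0, 5.0]])))  # -> positive definite
print(definiteness(np.array([[ 7.0, -5.0,  4.0],
                             [-5.0,  7.0,  4.0],
                             [ 4.0,  4.0, -2.0]])))      # -> indefinite
```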

Leading Principal Minors

The definiteness of a matrix can also be determined by means of minors. Let $A = (a_{ij})$ be a symmetric $n \times n$ matrix. Then the determinant of the top-left $k \times k$ submatrix,

$$A_k = \det \begin{pmatrix} a_{11} & \dots & a_{1k} \\ \vdots & \ddots & \vdots \\ a_{k1} & \dots & a_{kk} \end{pmatrix},$$

is called the $k$-th leading principal minor of $A$.

Leading Principal Minors and Definiteness

A symmetric matrix $A$ is

- positive definite, if and only if all $A_k > 0$,
- negative definite, if and only if $(-1)^k A_k > 0$ for all $k$,
- indefinite, if $\det(A) \neq 0$ and neither of the two cases above holds.

$(-1)^k A_k > 0$ means that $A_1, A_3, A_5, \dots < 0$ and $A_2, A_4, A_6, \dots > 0$.

Example: Leading Principal Minors

Definiteness of the matrix $A = \begin{pmatrix} 2 & 1 & 0 \\ 1 & 3 & 1 \\ 0 & 1 & 2 \end{pmatrix}$:

$$A_1 = \det(a_{11}) = 2 > 0, \qquad A_2 = \begin{vmatrix} 2 & 1 \\ 1 & 3 \end{vmatrix} = 5 > 0, \qquad A_3 = |A| = \begin{vmatrix} 2 & 1 & 0 \\ 1 & 3 & 1 \\ 0 & 1 & 2 \end{vmatrix} = 8 > 0$$

Thus $A$ and $q_A$ are positive definite.
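A minimal NumPy sketch for computing the leading principal minors (the helper name is an illustrative choice):

```python
import numpy as np

def leading_principal_minors(A):
    """Determinants of the top-left k x k submatrices A_k, k = 1..n."""
    return [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
print(np.round(leading_principal_minors(A), 10))  # -> [2. 5. 8.], all > 0
```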

Example: Leading Principal Minors

Definiteness of the matrix $A = \begin{pmatrix} 1 & 1 & -2 \\ 1 & 2 & 3 \\ -2 & 3 & 1 \end{pmatrix}$:

$$A_1 = \det(a_{11}) = 1 > 0, \qquad A_2 = \begin{vmatrix} 1 & 1 \\ 1 & 2 \end{vmatrix} = 1 > 0, \qquad A_3 = |A| = \begin{vmatrix} 1 & 1 & -2 \\ 1 & 2 & 3 \\ -2 & 3 & 1 \end{vmatrix} = -28 < 0$$

Thus $A$ and $q_A$ are indefinite.

Principal Minors

Unfortunately the condition for semidefiniteness is more tedious. Let $A = (a_{ij})$ be a symmetric $n \times n$ matrix. Then

$$A_{i_1,\dots,i_k} = \det \begin{pmatrix} a_{i_1 i_1} & \dots & a_{i_1 i_k} \\ \vdots & \ddots & \vdots \\ a_{i_k i_1} & \dots & a_{i_k i_k} \end{pmatrix}, \qquad 1 \leq i_1 < \dots < i_k \leq n,$$

is called a principal minor of order $k$ of $A$.

Principal Minors and Semidefiniteness

A symmetric matrix $A$ is

- positive semidefinite, if and only if all $A_{i_1,\dots,i_k} \geq 0$,
- negative semidefinite, if and only if $(-1)^k A_{i_1,\dots,i_k} \geq 0$ for all $k$,
- indefinite in all other cases.

$(-1)^k A_{i_1,\dots,i_k} \geq 0$ means that $A_{i_1,\dots,i_k} \geq 0$ if $k$ is even, and $A_{i_1,\dots,i_k} \leq 0$ if $k$ is odd.

Example: Principal Minors

Definiteness of the matrix $A = \begin{pmatrix} 5 & -1 & 4 \\ -1 & 2 & 1 \\ 4 & 1 & 5 \end{pmatrix}$:

Principal minors of order 1: $A_1 = 5 \geq 0$, $A_2 = 2 \geq 0$, $A_3 = 5 \geq 0$.

Principal minors of order 2:

$$A_{1,2} = \begin{vmatrix} 5 & -1 \\ -1 & 2 \end{vmatrix} = 9 \geq 0, \qquad A_{1,3} = \begin{vmatrix} 5 & 4 \\ 4 & 5 \end{vmatrix} = 9 \geq 0, \qquad A_{2,3} = \begin{vmatrix} 2 & 1 \\ 1 & 5 \end{vmatrix} = 9 \geq 0$$

Principal minor of order 3: $A_{1,2,3} = |A| = 0 \geq 0$.

The matrix is positive semidefinite. (But not positive definite!)
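The non-leading principal minors can be enumerated the same way; a minimal NumPy sketch (again, the helper name is an illustrative choice):

```python
import numpy as np
from itertools import combinations

def principal_minors(A):
    """All principal minors of A, grouped by order k."""
    n = A.shape[0]
    return {k: [np.linalg.det(A[np.ix_(idx, idx)])
                for idx in combinations(range(n), k)]
            for k in range(1, n + 1)}

A = np.array([[ 5.0, -1.0, 4.0],
              [-1.0,  2.0, 1.0],
              [ 4.0,  1.0, 5.0]])
for k, minors in principal_minors(A).items():
    print(k, np.round(minors, 10))
# order 1: [5. 2. 5.], order 2: [9. 9. 9.], order 3: [0.] -- all >= 0
```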

Example: Principal Minors

Definiteness of the matrix $A = \begin{pmatrix} -5 & 1 & -4 \\ 1 & -2 & -1 \\ -4 & -1 & -5 \end{pmatrix}$:

Principal minors of order 1: $A_1 = -5 \leq 0$, $A_2 = -2 \leq 0$, $A_3 = -5 \leq 0$.

Principal minors of order 2:

$$A_{1,2} = \begin{vmatrix} -5 & 1 \\ 1 & -2 \end{vmatrix} = 9 \geq 0, \qquad A_{1,3} = \begin{vmatrix} -5 & -4 \\ -4 & -5 \end{vmatrix} = 9 \geq 0, \qquad A_{2,3} = \begin{vmatrix} -2 & -1 \\ -1 & -5 \end{vmatrix} = 9 \geq 0$$

Principal minor of order 3: $A_{1,2,3} = |A| = 0 \leq 0$.

The matrix is negative semidefinite. (But not negative definite!)

Leading Principal Minors and Semidefiniteness

Obviously every positive definite matrix is also positive semidefinite (but not necessarily the other way round).

Matrix $A = \begin{pmatrix} 2 & 1 & 0 \\ 1 & 3 & 1 \\ 0 & 1 & 2 \end{pmatrix}$ is positive definite, as all its leading principal minors are positive (see above). Therefore $A$ is also positive semidefinite. In this case there is no need to compute the non-leading principal minors.

Recipe for Semidefiniteness

Recipe for determining the semidefiniteness of a matrix $A$:

1. Compute all leading principal minors:
   - If the condition for positive definiteness holds, then $A$ is positive definite and thus positive semidefinite.
   - Else, if the condition for negative definiteness holds, then $A$ is negative definite and thus negative semidefinite.
   - Else, if $\det(A) \neq 0$, then $A$ is indefinite.
2. Else also compute all non-leading principal minors:
   - If the condition for positive semidefiniteness holds, then $A$ is positive semidefinite.
   - Else, if the condition for negative semidefiniteness holds, then $A$ is negative semidefinite.
   - Else $A$ is indefinite.

Ellipse

The equation $a x^2 + b y^2 = 1$, $a, b > 0$, describes an ellipse in canonical form. The semi-axes have lengths $\frac{1}{\sqrt{a}}$ and $\frac{1}{\sqrt{b}}$, respectively.

(Figure: ellipse in canonical form with semi-axes of lengths $1/\sqrt{a}$ and $1/\sqrt{b}$.)

A Geometric Interpretation II

The term $a x^2 + b y^2$ is a quadratic form with matrix

$$A = \begin{pmatrix} a & 0 \\ 0 & b \end{pmatrix}$$

It has eigenvalues and normalized eigenvectors $\lambda_1 = a$ with $v_1 = e_1$, and $\lambda_2 = b$ with $v_2 = e_2$. These eigenvectors coincide with the semi-axes of the ellipse.

(Figure: ellipse with semi-axes $\frac{1}{\sqrt{\lambda_1}} v_1$ and $\frac{1}{\sqrt{\lambda_2}} v_2$.)

A Geometric Interpretation II

Now let $A$ be a symmetric $2 \times 2$ matrix with positive eigenvalues. The equation $x^t A\,x = 1$ describes an ellipse whose semi-axes (principal axes) point in the directions of the normalized eigenvectors and have lengths $\frac{1}{\sqrt{\lambda_1}}$ and $\frac{1}{\sqrt{\lambda_2}}$, as seen below.

(Figure: rotated ellipse with semi-axes $\frac{1}{\sqrt{\lambda_1}} v_1$ and $\frac{1}{\sqrt{\lambda_2}} v_2$.)

A Geometric Interpretation II

By a change of basis from $\{e_1, e_2\}$ to $\{v_1, v_2\}$ by means of the transformation $U = (v_1, v_2)$, this ellipse is rotated into canonical form. (That is, we have diagonalized matrix $A$.)

(Figure: $U^t$ maps the rotated ellipse with semi-axes $\frac{1}{\sqrt{\lambda_1}} v_1$, $\frac{1}{\sqrt{\lambda_2}} v_2$ onto the canonical ellipse with semi-axes $\frac{1}{\sqrt{\lambda_1}} e_1$, $\frac{1}{\sqrt{\lambda_2}} e_2$.)

An Application in Statistics

Suppose we have $n$ observations of $k$ metric attributes $X_1, \dots, X_k$, which we combine into vectors:

$$x_i = (x_{i1}, \dots, x_{ik}) \in \mathbb{R}^k$$

The arithmetic mean then is (as for univariate data)

$$\bar{x} = \frac{1}{n} \sum_{i=1}^n x_i = (\bar{x}_1, \dots, \bar{x}_k)$$

The total sum of squares is a measure of statistical dispersion,

$$\mathrm{TSS} = \sum_{i=1}^n \|x_i - \bar{x}\|^2 = \sum_{j=1}^k \left( \sum_{i=1}^n (x_{ij} - \bar{x}_j)^2 \right) = \sum_{j=1}^k \mathrm{TSS}_j,$$

and can be computed component-wise.

An Application in Statistics

A change of basis by means of an orthogonal matrix does not change the TSS. However, it changes the contributions of each of the components. Can we find a basis such that a few components contribute much more to the TSS than the remaining ones?

Principal Component Analysis (PCA)

Assumptions: The data are approximately multinormally distributed.

Procedure (a sketch in code follows below):

1. Compute the covariance matrix $\Sigma$.
2. Compute all eigenvalues and normalized eigenvectors of $\Sigma$.
3. Sort the eigenvalues such that $\lambda_1 \geq \lambda_2 \geq \dots \geq \lambda_k$.
4. Use the corresponding eigenvectors $v_1, \dots, v_k$ as the new basis.
5. The contribution of the first $m$ components in this basis to the TSS is

$$\frac{\sum_{j=1}^m \mathrm{TSS}_j}{\sum_{j=1}^k \mathrm{TSS}_j} = \frac{\sum_{j=1}^m \lambda_j}{\sum_{j=1}^k \lambda_j}$$
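A minimal NumPy sketch of this procedure on synthetic data (the example data, seed, and covariance matrix are assumptions for illustration):

```python
import numpy as np

# Synthetic multinormal sample: 200 observations of k = 3 attributes.
rng = np.random.default_rng(0)
X = rng.multivariate_normal([0.0, 0.0, 0.0],
                            [[4.0, 2.0, 0.0],
                             [2.0, 3.0, 0.0],
                             [0.0, 0.0, 0.5]], size=200)

# 1. Covariance matrix Sigma of the data
Sigma = np.cov(X, rowvar=False)

# 2.-3. Eigenvalues/eigenvectors of Sigma, sorted in descending order
lam, V = np.linalg.eigh(Sigma)   # eigh returns ascending order
lam, V = lam[::-1], V[:, ::-1]

# 4. Coordinates of the centered data with respect to the eigenvector basis
scores = (X - X.mean(axis=0)) @ V

# 5. Share of the TSS contributed by the first m components
m = 2
print(lam[:m].sum() / lam.sum())  # close to 1: two components suffice
```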

Principal Component Analysis (PCA)

By means of PCA it is possible to reduce the number of dimensions without reducing the TSS substantially.

Summary

- Eigenvalues and eigenvectors
- Characteristic polynomial
- Eigenspace
- Properties of eigenvalues
- Symmetric matrices and diagonalization
- Quadratic forms
- Definiteness
- Principal minors
- Principal component analysis