Lecture 2: Eigenvalues, eigenvectors and similarity
KTH Signal Processing, Magnus Jansson / Emil Björnson

The single most important concept in matrix theory. The German word "eigen" means proper or characteristic.

Definition

Consider a square matrix A ∈ Mn. If there exist x ∈ C^n (x ≠ 0) and λ ∈ C such that Ax = λx, then λ is an eigenvalue of A and x is an eigenvector of A associated with λ.
More terminology:
Spectrum of A: the set of all eigenvalues. Notation: σ(A).
Eigenspace of A: the set of all eigenvectors.
Spectral radius of A: ρ(A) = max{|λ| : λ ∈ σ(A)}.

Example: Stability of linear systems

Consider a linear discrete-time homogeneous system: x(n+1) = Ax(n). If the spectral radius of A is greater than 1, and x(n) is not orthogonal to the corresponding eigenvectors, x(n+1) will grow in the direction of the unstable modes.

Example: Filter output power maximization

Let y(t) be a vector of observations at time t, and let z(t) = x^T y(t) be the output of a filter with filter coefficients in the vector x. The mean output power is

E{z^2(t)} = E{(x^T y(t))(y^T(t) x)} = x^T E{y(t) y^T(t)} x = x^T R_y x.

Assume we want to find the filter that maximizes the output power under a normalizing constraint on the filter gain:

max_x x^T R_y x  subject to  x^T x = 1.

The solution is given by the unit-length eigenvector corresponding to the largest eigenvalue of R_y.
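As a quick numerical illustration of the two examples above (a sketch with NumPy; the toy matrix, the simulated observations and the variable names are my own, not part of the original slides):

```python
import numpy as np

# Stability example: x(n+1) = A x(n) is unstable if the spectral radius exceeds 1.
A = np.array([[0.5, 0.9],
              [0.0, 1.1]])
rho = max(abs(np.linalg.eigvals(A)))
print("spectral radius:", rho, "-> unstable" if rho > 1 else "-> stable")

# Filter output power maximization: max_x x^T R_y x subject to x^T x = 1.
rng = np.random.default_rng(0)
Y = rng.standard_normal((3, 1000))        # toy observations y(t), one column per time instant
R_y = Y @ Y.T / Y.shape[1]                # sample estimate of E{y(t) y^T(t)}
eigvals, eigvecs = np.linalg.eigh(R_y)    # R_y is symmetric, so eigh applies (ascending eigenvalues)
x_opt = eigvecs[:, -1]                    # unit-length eigenvector of the largest eigenvalue
print("max output power:", x_opt @ R_y @ x_opt, "= largest eigenvalue:", eigvals[-1])
```

The achieved output power x_opt^T R_y x_opt equals the largest eigenvalue of R_y up to numerical precision, in line with the claim on the slide.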

What about zero?

Recall: if there exists x ∈ C^n (x ≠ 0) and λ ∈ C such that Ax = λx, then λ is an eigenvalue of A and x is an eigenvector of A associated with λ.
Question: Why x ≠ 0?
Answer: We always have A0 = λ0 (uninteresting solution!).
However, λ = 0 is an important case: Ax = 0 = 0x.
In fact, A ∈ Mn is singular iff there exists x ≠ 0 such that Ax = 0x = 0 or, equivalently, iff 0 ∈ σ(A).

Polynomials of matrices

Consider a scalar polynomial p(t) = a_k t^k + a_{k-1} t^{k-1} + ... + a_0.
Define the matrix polynomial p(A) = a_k A^k + a_{k-1} A^{k-1} + ... + a_0 I for A ∈ Mn.
Theorem: If λ is an eigenvalue of A and x the associated eigenvector, then p(A)x = p(λ)x.
Thus, x is also an eigenvector of p(A), associated with the eigenvalue p(λ).

How to find an eigenvalue?

Rewrite the eigenvalue definition (A ∈ Mn): λx - Ax = 0, i.e., (λI - A)x = 0.
Observation: eigenvalues make (λI - A) singular, det(λI - A) = 0.
Definition: the characteristic polynomial is p_A(t) = det(tI - A).
p_A(t) is a polynomial of degree n, so p_A(t) = 0 has n solutions/roots.
Conclusions: these roots are the eigenvalues of A. A ∈ Mn has n (complex) eigenvalues. Some eigenvalues may have (algebraic) multiplicity!

Trace and determinant

Definition: the trace is tr(A) = Σ_{i=1}^n a_ii.
Definition: the determinant is det(A) = [Laplace expansion in 0.3.1].
Theorem: expressed using the eigenvalues, tr(A) = Σ_{i=1}^n λ_i and det(A) = Π_{i=1}^n λ_i.
Observation: in the characteristic polynomial p_A(t) = t^n + a_{n-1} t^{n-1} + ... + a_0, the coefficients satisfy a_{n-1} = -tr(A) and a_0 = (-1)^n det(A). Formulas exist for all a_k; see the book.
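The trace/determinant identities and the matrix-polynomial theorem are easy to verify numerically; the matrix and the polynomial below are arbitrary illustrative choices:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam, X = np.linalg.eig(A)

# tr(A) = sum of eigenvalues, det(A) = product of eigenvalues
print(np.isclose(np.trace(A), lam.sum()))          # True
print(np.isclose(np.linalg.det(A), lam.prod()))    # True

# Matrix polynomial p(A) = A^2 - 4A + I applied to an eigenvector x gives p(lambda) x
pA = A @ A - 4 * A + np.eye(2)
p = lambda t: t**2 - 4 * t + 1
x = X[:, 0]
print(np.allclose(pA @ x, p(lam[0]) * x))          # True: x is also an eigenvector of p(A)
```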

Similarity

Consider A, B ∈ Mn.
Definition: B is similar to A if there exists a nonsingular S ∈ Mn such that A = S^{-1} B S.
Notation: B ~ A.
The transformation A → S^{-1} A S is called a similarity transformation by the similarity matrix S.

Equivalence class

Similarity is
reflexive: A ~ A,
symmetric: B ~ A implies A ~ B,
transitive: C ~ B and B ~ A imply C ~ A.
It divides all matrices into (disjoint) equivalence classes: each class has a representative matrix A, and the class includes all matrices similar to A.
Theorem: If B ~ A, then p_B(t) = p_A(t). B and A have the same eigenvalues (counting multiplicity).

Diagonalizable matrices

Definition: A ∈ Mn is diagonalizable if it is similar to a diagonal matrix.
Theorem: A is diagonalizable iff A has n linearly independent eigenvectors.

Linearly independent eigenvectors

Assume λ1, λ2, ..., λn are distinct eigenvalues of A ∈ Mn and x_i is an eigenvector associated with λ_i.
Theorem: {x1, x2, ..., xn} is a linearly independent set.
Conclusion: If A ∈ Mn has n distinct eigenvalues, A is diagonalizable. (The converse is not true.)
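A minimal diagonalization sketch, assuming a matrix with distinct eigenvalues so that the eigenvector matrix is nonsingular (the matrices below are my own examples):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])                    # eigenvalues 5 and 2 (distinct)
lam, S = np.linalg.eig(A)                     # columns of S are eigenvectors of A

# A is similar to the diagonal matrix of its eigenvalues: S^{-1} A S = diag(lam)
print(np.allclose(np.linalg.inv(S) @ A @ S, np.diag(lam)))          # True

# Similar matrices have the same characteristic polynomial, hence the same eigenvalues.
T = np.array([[1.0, 2.0],
              [0.0, 1.0]])                    # any nonsingular T
B = np.linalg.inv(T) @ A @ T
print(np.allclose(np.sort(np.linalg.eigvals(B)), np.sort(lam)))     # True: same spectrum
```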

Simultaneous diagonalization

Definition: A, B ∈ Mn commute if AB = BA.
Definition: Two diagonalizable matrices A, B ∈ Mn are simultaneously diagonalizable if there exists a single similarity matrix S ∈ Mn diagonalizing both A and B.
Theorem: A, B commute iff they are simultaneously diagonalizable.

Eigenvalues of products

Assume A ∈ M_{m,n} and B ∈ M_{n,m} with m ≤ n.
Theorem: p_{BA}(t) = t^{n-m} p_{AB}(t).
BA has the same eigenvalues as AB plus n - m additional eigenvalues at zero.
If m = n and A (or B) is nonsingular, AB is similar to BA.

How to find the eigenvector?

Rewrite the eigenvalue definition (A ∈ Mn): λx - Ax = 0, i.e., (λI - A)x = 0.
Observation: x lies in the nullspace of λI - A.
Calculate eigenvectors: solve (λI - A)x = 0 for eigenvalue λ.
System of equations: use Gauss elimination.
The eigenvector is non-unique: any scaling (x ≠ 0).
The nullspace can have large dimension.

Eigenspace

The set of all eigenvectors satisfying Ax = λx for a given λ ∈ σ(A) is called the eigenspace of A corresponding to λ.
The eigenspace, together with the zero vector, is a subspace of C^n, and it is exactly the nullspace of λI - A.
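Both recipes can be checked numerically. The sketch below uses scipy.linalg.null_space to extract the nullspace of λI - A and then compares the spectra of AB and BA for a small rectangular example (the matrices and sizes are arbitrary choices of mine):

```python
import numpy as np
from scipy.linalg import null_space

# Eigenvector as the nullspace of (lambda*I - A)
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])
lam = 3.0                                     # eigenvalue of this triangular matrix (a diagonal entry)
x = null_space(lam * np.eye(2) - A)           # orthonormal basis for the eigenspace (here one column)
print(np.allclose(A @ x, lam * x))            # True

# Spectra of AB and BA: same nonzero eigenvalues; BA has n - m extra zeros (here m = 2, n = 3)
rng = np.random.default_rng(1)
Am = rng.standard_normal((2, 3))
Bm = rng.standard_normal((3, 2))
print(np.linalg.eigvals(Am @ Bm))             # 2 eigenvalues
print(np.linalg.eigvals(Bm @ Am))             # the same 2 plus one (numerically) zero eigenvalue
```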

Multiplicity

Algebraic multiplicity: the multiplicity of the corresponding root of the characteristic polynomial.
Geometric multiplicity: the number of linearly independent eigenvectors associated with the eigenvalue.
Theorem: algebraic multiplicity ≥ geometric multiplicity.
Definition: If the inequality is strict for some eigenvalue, the matrix is defective.
Theorem: A is diagonalizable iff it is not defective.

Left eigenvectors

A nonzero vector y ∈ C^n is a left eigenvector of A ∈ Mn if y*A = μy*. Observe that μ ∈ σ(A).
Theorem (biorthogonality): Let y*A = μy* and Ax = λx. Then, if μ ≠ λ, we have y*x = 0.
Observation: If A is Hermitian (A = A*), then x = y for the same eigenvalue. Biorthogonality then implies that a Hermitian A has n pairwise orthogonal eigenvectors (at least if the eigenvalues are distinct; more on this later).

Transpose and conjugate transpose

Transpose: A and A^T have the same eigenvalues. Left/right eigenvectors are interchanged and complex conjugated.
Conjugate transpose: the eigenvalues of A* are the complex conjugates of the eigenvalues of A. Left/right eigenvectors are interchanged.
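A small sketch illustrating a defective matrix (a Jordan block, my own example) and left/right eigenvectors; scipy.linalg.eig with left=True returns the left eigenvectors:

```python
import numpy as np
from scipy.linalg import eig

# A 2x2 Jordan block: eigenvalue 2 with algebraic multiplicity 2 ...
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0
# ... but geometric multiplicity 1, so J is defective and not diagonalizable.
geom_mult = 2 - np.linalg.matrix_rank(lam * np.eye(2) - J)   # dimension of the nullspace of (lam*I - J)
print("geometric multiplicity:", geom_mult)                  # 1

# Left and right eigenvectors and biorthogonality: y_i* x_j = 0 when the eigenvalues differ.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])                    # eigenvalues 1 and 3 (distinct)
w, Y, X = eig(A, left=True, right=True)       # columns of Y are left, columns of X are right eigenvectors
print(np.isclose(Y[:, 0].conj() @ X[:, 1], 0))   # True
```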