AN ELEMENTARY PROOF OF THE SPECTRAL RADIUS FORMULA FOR MATRICES


JOEL A. TROPP

Abstract. We present an elementary proof that the spectral radius of a matrix $A$ may be obtained using the formula $\rho(A) = \lim_{n\to\infty} \|A^n\|^{1/n}$, where $\|\cdot\|$ represents any matrix norm.

1. Introduction

It is a well-known fact from the theory of Banach algebras that the spectral radius of any element $A$ is given by the formula
$$\rho(A) = \lim_{n\to\infty} \|A^n\|^{1/n}. \tag{1.1}$$
For a matrix, the spectrum is just the collection of eigenvalues, so this formula yields a technique for estimating the top eigenvalue. The proof of Equation (1.1) is beautiful but advanced. See, for example, Rudin's treatment in his Functional Analysis. It turns out that elementary techniques suffice to develop the formula for matrices.

2. Preliminaries

For completeness, we shall briefly introduce the major concepts required in the proof. It is expected that the reader is already familiar with these ideas.

2.1. Norms. A norm is a mapping $\|\cdot\|$ from a vector space $X$ into the nonnegative real numbers $\mathbb{R}_+$ which has three properties:
(1) $\|x\| = 0$ if and only if $x = 0$;
(2) $\|\alpha x\| = |\alpha| \, \|x\|$ for any scalar $\alpha$ and vector $x$; and
(3) $\|x + y\| \le \|x\| + \|y\|$ for any vectors $x$ and $y$.
The most fundamental example of a norm is the Euclidean norm $\|\cdot\|_2$, which corresponds to the standard topology on $\mathbb{R}^n$. It is defined by
$$\|x\|_2 = \|(x_1, x_2, \ldots, x_n)\|_2 = \sqrt{x_1^2 + x_2^2 + \cdots + x_n^2}.$$

Date: 30 November 2001.
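Although the argument in this paper is entirely analytic, Equation (1.1) is easy to probe numerically. The following is a minimal sketch, assuming NumPy; the matrix and the exponent $n$ are illustrative choices, not taken from the paper.

```python
import numpy as np

# Illustrative matrix with eigenvalues -1 and -2, so rho(A) = 2.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

# Spectral radius: the largest eigenvalue magnitude.
rho = max(abs(np.linalg.eigvals(A)))

# Estimate rho via ||A^n||^(1/n) for a moderately large n,
# here using the spectral (2-) norm.
n = 200
estimate = np.linalg.norm(np.linalg.matrix_power(A, n), ord=2) ** (1.0 / n)

print(rho, estimate)  # the estimate should lie close to rho
```

The convergence is slow in general (the error decays like $\log n / n$ for defective matrices), so the estimate is only close, not exact, at finite $n$.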

One particular norm we shall consider is the $\ell^\infty$ norm, which is defined for $x$ in $\mathbb{R}^n$ by the formula
$$\|x\|_\infty = \|(x_1, x_2, \ldots, x_n)\|_\infty = \max_i |x_i|.$$
For a matrix $A$, define the infinity norm as
$$\|A\|_\infty = \max_i \sum_j |A_{ij}|.$$
This norm is consistent with itself and with the $\ell^\infty$ vector norm. That is, $\|AB\|_\infty \le \|A\|_\infty \|B\|_\infty$ and $\|Ax\|_\infty \le \|A\|_\infty \|x\|_\infty$, where $A$ and $B$ are matrices and $x$ is a vector.

Two norms $\|\cdot\|$ and $\|\cdot\|'$ on a vector space $X$ are said to be equivalent if there are positive constants $c$ and $C$ such that
$$c \, \|x\| \le \|x\|' \le C \, \|x\|$$
for every vector $x$. For finite-dimensional spaces, we have the following powerful result.

Theorem 2.1. All norms on a finite-dimensional vector space are equivalent.

Proof. We shall demonstrate that any norm $\|\cdot\|$ on $\mathbb{R}^n$ is equivalent to the Euclidean norm. Let $\{e_i\}$ be the canonical basis for $\mathbb{R}^n$, so any vector has an expression as $x = \sum_i x_i e_i$. First, let us check that $\|\cdot\|$ is continuous with respect to the Euclidean norm. For all pairs of vectors $x$ and $y$,
$$\big|\,\|x\| - \|y\|\,\big| \le \|x - y\| = \Big\|\sum_i (x_i - y_i) e_i\Big\| \le \sum_i |x_i - y_i| \, \|e_i\| \le \max_i \{\|e_i\|\} \sum_i |x_i - y_i| \le M \Big\{\sum_i (x_i - y_i)^2\Big\}^{1/2} = M \, \|x - y\|_2,$$
where $M = \sqrt{n} \, \max_i \{\|e_i\|\}$; the last inequality bounds $\sum_i |x_i - y_i|$ by $\sqrt{n} \, \|x - y\|_2$. In other words, when two vectors are nearby with respect to the Euclidean norm, they are also nearby with respect to any other norm. Notice that the Cauchy-Schwarz inequality for real numbers has played a starring role at this stage.

Now, consider the unit sphere with respect to the Euclidean norm, $S = \{x \in \mathbb{R}^n : \|x\|_2 = 1\}$. This set is evidently closed and bounded in the Euclidean topology, so the Heine-Borel theorem shows that it
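Theorem 2.1 can be illustrated numerically. For the particular pair $\|\cdot\|_\infty$ and $\|\cdot\|_2$ on $\mathbb{R}^n$, the optimal equivalence constants are $1/\sqrt{n}$ and $1$. A quick sketch with random vectors, assuming NumPy (the dimension and sample count are arbitrary):

```python
import numpy as np

# Equivalence of the l-infinity and Euclidean norms on R^n:
#   (1/sqrt(n)) * ||x||_2 <= ||x||_inf <= ||x||_2.
rng = np.random.default_rng(0)
n = 5
ratios = [np.linalg.norm(x, np.inf) / np.linalg.norm(x, 2)
          for x in rng.standard_normal((1000, n))]

print(min(ratios), max(ratios))  # both lie within [1/sqrt(n), 1]
```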

is compact. Therefore, the continuous function $\|\cdot\|$ attains maximum and minimum values on $S$, say $C$ and $c$. That is,
$$c \, \|x\|_2 \le \|x\| \le C \, \|x\|_2$$
for any $x$ with unit Euclidean norm. But every vector $y$ can be expressed as $y = \alpha x$ for some $x$ on the Euclidean unit sphere. If we multiply the foregoing inequality by $|\alpha|$ and draw the scalar into the norms, we reach
$$c \, \|y\|_2 \le \|y\| \le C \, \|y\|_2$$
for any vector $y$. It remains to check that the constants $c$ and $C$ are positive. They are clearly nonnegative since $\|\cdot\|$ is nonnegative, and $c \le C$ by definition. Assume that $c = 0$, which implies the existence of a point $x$ on the Euclidean unit sphere for which $\|x\| = 0$. But then $x = 0$, a contradiction. $\square$

2.2. The spectrum of a matrix. For an $n$-dimensional matrix $A$, consider the equation
$$Ax = \lambda x, \tag{2.1}$$
where $x$ is a nonzero vector and $\lambda$ is a complex number. Numbers $\lambda$ which satisfy Equation (2.1) are called eigenvalues, and the corresponding $x$ are called eigenvectors. Nonzero vector solutions to this equation exist if and only if
$$\det(A - \lambda I) = 0, \tag{2.2}$$
where $I$ is the identity matrix. The left-hand side of Equation (2.2) is called the characteristic polynomial of $A$ because it is a polynomial in $\lambda$ of degree $n$ whose roots are identical with the eigenvalues of $A$. The algebraic multiplicity of an eigenvalue $\lambda$ is the multiplicity of $\lambda$ as a root of the characteristic polynomial. Meanwhile, the geometric multiplicity of $\lambda$ is the number of linearly independent eigenvectors corresponding to this eigenvalue. The geometric multiplicity of an eigenvalue never exceeds the algebraic multiplicity.

Now, the collection of eigenvalues of a matrix, along with their geometric and algebraic multiplicities, completely determines the eigenstructure of the matrix. It turns out that all matrices with the same eigenstructure are similar to each other. That is, if $A$ and $B$ have the same eigenstructure, there exists a nonsingular matrix $S$ such that $S^{-1} A S = B$. We call the set of all eigenvalues of a matrix $A$ its spectrum, which is written as $\sigma(A)$. The spectral radius $\rho(A)$ is defined by
$$\rho(A) = \sup \{ |\lambda| : \lambda \in \sigma(A) \}.$$
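In computational terms, the spectrum and the spectral radius are immediate once the eigenvalues are in hand. A sketch assuming NumPy; the upper-triangular matrix is a hypothetical example whose eigenvalues are its diagonal entries:

```python
import numpy as np

# Hypothetical upper-triangular example; its eigenvalues are 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

spectrum = np.linalg.eigvals(A)   # sigma(A)
rho = max(abs(spectrum))          # spectral radius

# Each eigenvalue is a root of the characteristic polynomial
# det(A - lambda*I), per Equation (2.2).
residuals = [abs(np.linalg.det(A - lam * np.eye(2))) for lam in spectrum]

print(sorted(spectrum.real), rho)
```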

In other words, the spectral radius measures the largest magnitude attained by any eigenvalue.

2.3. Jordan canonical form. We say that a matrix is in Jordan canonical form if it is block-diagonal and each block has the form
$$J = \begin{pmatrix} \lambda & 1 & & \\ & \lambda & \ddots & \\ & & \ddots & 1 \\ & & & \lambda \end{pmatrix}_{d \times d}.$$
It can be shown that the lone eigenvalue of this Jordan block is $\lambda$. Moreover, the geometric multiplicity of $\lambda$ is exactly one, and the algebraic multiplicity of $\lambda$ is exactly $d$, the block size. The eigenvalues of a block-diagonal matrix are simply the eigenvalues of its blocks, with the algebraic and geometric multiplicities of identical eigenvalues summed across the blocks. Therefore, a block-diagonal matrix composed of Jordan blocks has its eigenstructure laid bare. Using the foregoing facts, it is easy to construct a matrix in Jordan canonical form which has any eigenstructure whatsoever. Therefore, every matrix is similar to a matrix in Jordan canonical form.

Define the choose function $\binom{n}{k}$ according to the following convention:
$$\binom{n}{k} = \begin{cases} \dfrac{n!}{k! \, (n-k)!} & \text{when } k = 0, \ldots, n, \\[4pt] 0 & \text{otherwise.} \end{cases}$$

Lemma 2.1. If $J$ is a $d \times d$ Jordan block with eigenvalue $\lambda$, then the components of its $n$th power satisfy
$$(J^n)_{ij} = \binom{n}{j-i} \lambda^{n-j+i}. \tag{2.3}$$

Proof. For $n = 1$, it is straightforward to verify that Equation (2.3) yields
$$J^1 = \begin{pmatrix} \lambda & 1 & & \\ & \lambda & \ddots & \\ & & \ddots & 1 \\ & & & \lambda \end{pmatrix}_{d \times d}.$$
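Lemma 2.1 is easy to spot-check numerically. A minimal sketch assuming NumPy; the eigenvalue, block size, and power are arbitrary illustrative choices:

```python
import numpy as np
from math import comb

lam, d, n = 0.5, 4, 7

# Build the d x d Jordan block: lam on the diagonal, 1 on the superdiagonal.
J = lam * np.eye(d) + np.eye(d, k=1)
Jn = np.linalg.matrix_power(J, n)

# Compare every entry against (J^n)_{ij} = C(n, j-i) * lam^(n-j+i),
# with the convention that C(n, k) = 0 outside k = 0, ..., n.
for i in range(d):
    for j in range(d):
        k = j - i
        predicted = comb(n, k) * lam ** (n - k) if 0 <= k <= n else 0.0
        assert abs(Jn[i, j] - predicted) < 1e-12
```

Note that the lemma's indices are 1-based while NumPy's are 0-based; the difference $j - i$ is the same either way.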

Now, for arbitrary indices $i$ and $j$, use induction to calculate that
$$(J^{n+1})_{ij} = \sum_{k=1}^{d} (J^n)_{ik} \, J_{kj} = \sum_{k=1}^{d} \binom{n}{k-i} \lambda^{n-k+i} \binom{1}{j-k} \lambda^{1-j+k}.$$
Only the terms $k = j$ and $k = j - 1$ survive, so Pascal's rule gives
$$(J^{n+1})_{ij} = \left[ \binom{n}{j-i} + \binom{n}{j-(i+1)} \right] \lambda^{(n-j+i)+1} = \binom{n+1}{j-i} \lambda^{(n+1)-j+i},$$
as advertised. $\square$

3. The spectral radius formula

First, we prove the following special case.

Theorem 3.1. For any matrix $A$, the spectral radius formula holds for the infinity matrix norm: $\lim_{n\to\infty} \|A^n\|_\infty^{1/n} = \rho(A)$.

Proof. Throughout this argument, we shall denote the $\ell^\infty$ vector and matrix norms by $\|\cdot\|$. Let $S$ be a similarity transform such that $S^{-1} A S$ has Jordan form:
$$S^{-1} A S = J = \begin{pmatrix} J_1 & & \\ & \ddots & \\ & & J_s \end{pmatrix}.$$
Using the consistency of $\|\cdot\|$, we develop the following bounds:
$$\|A^n\|^{1/n} = \|S J^n S^{-1}\|^{1/n} \le \{\|S\| \, \|S^{-1}\|\}^{1/n} \, \|J^n\|^{1/n}$$
and
$$\|A^n\|^{1/n} = \|S J^n S^{-1}\|^{1/n} \ge \left\{ \frac{1}{\|S\| \, \|S^{-1}\|} \right\}^{1/n} \|J^n\|^{1/n}.$$
In each inequality, the former term on the right-hand side tends toward one as $n$ approaches infinity. Therefore, we need only investigate the behavior of $\|J^n\|^{1/n}$.

Now, the matrix $J$ is block-diagonal, so its powers are also block-diagonal with the blocks exponentiated individually:
$$J^n = \begin{pmatrix} J_1^n & & \\ & \ddots & \\ & & J_s^n \end{pmatrix}.$$
Since we are using the infinity norm,
$$\|J^n\| = \max_k \{\|J_k^n\|\}.$$
The $n$th root is monotonic, so we may draw it inside the maximum to obtain
$$\|J^n\|^{1/n} = \max_k \{\|J_k^n\|^{1/n}\}.$$
What is the norm of an exponentiated Jordan block? Recall the fact that the infinity norm of a matrix equals its greatest absolute row sum, and apply it to the explicit form of $J_k^n$ provided in Lemma 2.1. The first row carries the largest sum, so
$$\|J_k^n\| = \sum_{j=1}^{d_k} |(J_k^n)_{1j}| = \sum_{j=1}^{d_k} \binom{n}{j-1} |\lambda_k|^{n-j+1} = |\lambda_k|^n \sum_{j=1}^{d_k} \binom{n}{j-1} |\lambda_k|^{1-j},$$
where $\lambda_k$ is the eigenvalue of block $J_k$ and $d_k$ is the block size. Bound the choose function above and below with $1 \le \binom{n}{j-1} \le n^{d_k}$, and write $M_k = \sum_{j=1}^{d_k} |\lambda_k|^{1-j}$ to obtain the relation
$$M_k \, |\lambda_k|^n \le \|J_k^n\| \le M_k \, n^{d_k} \, |\lambda_k|^n.$$
Extracting the $n$th root and taking the limit as $n$ approaches infinity, we reach
$$\lim_{n\to\infty} \|J_k^n\|^{1/n} = |\lambda_k|.$$
A careful reader will notice that the foregoing argument does not apply to a Jordan block $J_k$ with a zero eigenvalue. But such a matrix is nilpotent: placing a large enough exponent on $J_k$ yields the zero matrix. The norm of a zero matrix is zero, so we have
$$\lim_{n\to\infty} \|J_k^n\|^{1/n} = 0.$$
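The convergence $\|J_k^n\|^{1/n} \to |\lambda_k|$ for a single Jordan block can be watched directly. A sketch assuming NumPy; the eigenvalue and block size are illustrative choices:

```python
import numpy as np

lam, d = 0.9, 3
J = lam * np.eye(d) + np.eye(d, k=1)   # Jordan block with eigenvalue 0.9

# ||J^n||_inf is the greatest absolute row sum; its nth root should
# decrease toward |lam| = 0.9 as n grows.
vals = [np.linalg.norm(np.linalg.matrix_power(J, n), ord=np.inf) ** (1.0 / n)
        for n in (10, 100, 1000)]

print(vals)  # decreasing, approaching 0.9
```

The polynomial factor $n^{d_k}$ in the bound above explains why the approach to $|\lambda_k|$ is slow: the $n$th root of $n^{d_k}$ tends to one, but only like $1 + d_k \log n / n$.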

Combining these facts, we conclude that
$$\lim_{n\to\infty} \|J^n\|^{1/n} = \max_k \{|\lambda_k|\} = \rho(J).$$
The spectrum of the Jordan form $J$ is identical with the spectrum of $A$, which completes the proof. $\square$

It is quite easy to bootstrap the general result from this special case.

Corollary 3.1. The spectral radius formula holds for any matrix and any norm: $\lim_{n\to\infty} \|A^n\|^{1/n} = \rho(A)$.

Proof. Theorem 2.1 on the equivalence of norms yields the inequality
$$c \, \|A^n\|_\infty \le \|A^n\| \le C \, \|A^n\|_\infty$$
for positive numbers $c$ and $C$. Extract the $n$th root of this inequality, and take the limit. The root drives the constants toward one, which leaves the relation
$$\lim_{n\to\infty} \|A^n\|_\infty^{1/n} \le \lim_{n\to\infty} \|A^n\|^{1/n} \le \lim_{n\to\infty} \|A^n\|_\infty^{1/n}.$$
Apply Theorem 3.1 to the upper and lower bounds to reach
$$\lim_{n\to\infty} \|A^n\|^{1/n} = \rho(A). \qquad \square$$
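Corollary 3.1 asserts that the limit is the same for every matrix norm. A numerical sketch assuming NumPy; the matrix and the particular norms tried are illustrative choices:

```python
import numpy as np

# Illustrative matrix with spectral radius 2 (eigenvalues -1 and -2).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

rho = max(abs(np.linalg.eigvals(A)))
n = 200
An = np.linalg.matrix_power(A, n)

# The estimate ||A^n||^(1/n) lands near rho for every matrix norm tried:
# the 1-norm, the spectral norm, the infinity norm, and the Frobenius norm.
estimates = {ord_: np.linalg.norm(An, ord=ord_) ** (1.0 / n)
             for ord_ in (1, 2, np.inf, 'fro')}

print(estimates)
```

Exactly as the corollary predicts, the four estimates differ only by factors of the form $C^{1/n}$, which are already indistinguishable from one at $n = 200$.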