THE nth POWER OF A MATRIX AND APPROXIMATIONS FOR LARGE n. Christopher S. Withers and Saralees Nadarajah (Received July 2008)

NEW ZEALAND JOURNAL OF MATHEMATICS
Volume 38 (2008), 171-178

Abstract. When a square matrix A is diagonalizable (for example, when A is Hermitian or has distinct eigenvalues), A^n can be written as a sum of the nth powers of its eigenvalues with matrix weights. However, if a 1 occurs on the superdiagonal of its Jordan form, then the form is more complicated: A^n can be written as a sum of polynomials of degree n in its eigenvalues with coefficients depending on n. In this case, to a first approximation for large n, A^n is proportional to n^{m-1} λ^n with a constant matrix multiplier, where λ is the eigenvalue of maximum modulus and m is the maximum multiplicity of λ.

2000 Mathematics Subject Classification: Primary 15A21 (canonical forms, reductions, classification); Secondary 15A24 (matrix equations and identities).
Key words and phrases: powers of matrices, approximations, Jordan form, singular value decomposition.

1. Introduction and Summary

Let A be an s × s complex matrix with eigenvalues λ_1, ..., λ_s, ordered so that

    r_1 := |λ_1| = ... = |λ_N| > r_0 := |λ_{N+1}| ≥ ... ≥ |λ_s|.   (1.1)

If all eigenvalues have the same modulus, then (1.1) should be interpreted as r_1 := |λ_1| = ... = |λ_s| with N = s. If A is Hermitian, or more generally if it has diagonal Jordan form, then for n ≥ 1,

    A^n = Σ_{j=1}^{s} λ_j^n W_j

for certain matrices {W_j}. Let us write r_j and θ_j for the amplitude and phase of λ_j, that is,

    λ_j = r_j e^{iθ_j}   (1.2)

for 1 ≤ j ≤ s. Then this implies the approximations

    A^n = r_1^n C_n + r_0^n E_n,   A^n = λ_1^n W_1 + r_0^n F_n   (1.3)

(with the latter holding if N = 1), where

    C_n = Σ_{j=1}^{N} e^{inθ_j} W_j

and E_n, F_n and C_n are bounded as n increases. These cases are covered in Sections 2 and 3.
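
As an informal numerical illustration of (1.3), not taken from the paper, the following NumPy sketch builds the weights W_j as p_j q_j^T (the rank-one products from an eigendecomposition, as in Section 3) for an arbitrarily chosen positive 4 × 4 matrix, so that N = 1, and checks both the exact expansion and the dominant-term approximation; the matrix and the random seed are arbitrary assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.random((4, 4))                  # entrywise positive, so a single dominant real eigenvalue (N = 1)

    lam, P = np.linalg.eig(A)               # A = P diag(lam) P^{-1}
    Q = np.linalg.inv(P)                    # rows q_j^T of P^{-1}
    W = [np.outer(P[:, j], Q[j, :]) for j in range(4)]   # W_j = p_j q_j^T

    n = 30
    An = np.linalg.matrix_power(A, n)
    print(np.allclose(An, sum(lam[j] ** n * W[j] for j in range(4))))   # A^n = sum_j lambda_j^n W_j

    j1 = int(np.argmax(np.abs(lam)))        # eigenvalue of maximum modulus
    leading = (lam[j1] ** n * W[j1]).real   # lambda_1^n W_1, the first approximation in (1.3)
    print(np.linalg.norm(An - leading) / np.linalg.norm(An))            # decays like (r_0 / r_1)^n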

Otherwise, A^n can be written as a sum of polynomials in each eigenvalue:

    A^n = Σ_{j=1}^{r} f_j(λ_j),

where

    f_j(λ) = Σ_{k=0}^{m_j-1} \binom{n}{k} λ^{n-k} W_{jk}

for n ≥ 1 and certain matrices {W_{jk}}. Here m_j is the dimension of the jth Jordan block of A and r is the number of blocks. So λ_j has multiplicity at least m_j. Suppose the blocks for λ_1 are ordered so that

    m = m_1 = ... = m_M > m_j,   M < j ≤ N,   (1.4)

with (1.4) interpreted as m = m_1 = ... = m_N when all Jordan blocks have dimension m. Then for n ≥ 1,

    A^n = n^{m-1} r_1^n [C_n + n^{-1} E_n],

where

    C_n = Σ_{j=1}^{M} e^{i(n-m+1)θ_j} W_{j,m-1}

and E_n and C_n are bounded as n increases. For example, if there is only one Jordan block with eigenvalue λ_1, then

    A^n = n^{m-1} λ_1^n [W_{1,m-1} + n^{-1} F_n],

where F_n is bounded as n increases. Details are given in Section 4.

Section 5 gives results similar to (1.3) for (AA^T)^n and related forms for large n, based on the singular value decomposition (SVD); there A need not be square.

Throughout this note, a_n = O(b_n) means that there exist an integer m and a positive constant K such that |a_n| ≤ K |b_n| for all n ≥ m.

2. A^n for A Hermitian

Consider the case where A is Hermitian, that is, Ā^T = A, where Ā^T is the transpose of the complex conjugate of A. Then the eigenvalues are real and

    A = H Λ H̄^T = Σ_{j=1}^{s} λ_j p_j p̄_j^T,   (2.1)

where Λ = diag(λ_1, ..., λ_s), H = (p_1, ..., p_s), and p_j is the eigenvector of λ_j, that is, a solution of A p_j = λ_j p_j. The eigenvectors can be taken as orthogonal and scaled to have unit norm, so that p̄_j^T p_k = δ_{jk} and Σ_{j=1}^{s} p_j p̄_j^T = I_s, where δ_{jj} = 1 and δ_{jk} = 0 for j ≠ k. If A is real symmetric, then H can be taken as real. So, by (1.1) we have the well-known expression

    A^n = H Λ^n H̄^T = Σ_{j=1}^{s} λ_j^n p_j p̄_j^T.   (2.2)

It follows that

    A^n = r_1^n C_n + r_0^n E_n,   n = 0, 1, ...,

where C_n = Σ_{j=1}^{N} s_j^n p_j p̄_j^T, s_j = sign(λ_j), and E_n is bounded as n increases. So, for n ≥ 1, if N = 1 then

    A^n = λ_1^n D + r_0^n F_n,

where D = p_1 p̄_1^T and F_n is bounded as n increases.
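
A short NumPy check of (2.2) and of the sign-alternating multiplier C_n may be useful; the Hermitian test matrix and the seed below are arbitrary assumptions, so this is only a sketch of the computation, not code from the paper.

    import numpy as np

    rng = np.random.default_rng(1)
    B = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
    A = (B + B.conj().T) / 2                    # arbitrary Hermitian test matrix

    lam, H = np.linalg.eigh(A)                  # real eigenvalues; unitary H with A = H diag(lam) H*
    r1 = np.max(np.abs(lam))
    dom = np.flatnonzero(np.isclose(np.abs(lam), r1))   # indices j with |lambda_j| = r_1

    for n in (15, 16):
        An = np.linalg.matrix_power(A, n)
        spectral = (H * lam ** n) @ H.conj().T          # sum_j lambda_j^n p_j p_j*, i.e. (2.2)
        C_n = sum(np.sign(lam[j]) ** n * np.outer(H[:, j], H[:, j].conj()) for j in dom)
        print(np.allclose(An, spectral),                              # exact
              np.linalg.norm(An - r1 ** n * C_n) / np.linalg.norm(An))  # error ~ (r_0/r_1)^n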

On the other hand, if N ≥ 2 then the multiplier C_n alternates between C_+, its form for n even, and C_-, its form for n odd, where

    C_+ = Σ_{j=1}^{N} p_j p̄_j^T,   C_- = Σ_{j≤N: s_j=1} p_j p̄_j^T - Σ_{j≤N: s_j=-1} p_j p̄_j^T.

Also, if det(A) ≠ 0 then (2.2) extends to n = -1, -2, ..., and in fact to complex n. For example,

    A^{-1} = H Λ^{-1} H̄^T = Σ_{j=1}^{s} λ_j^{-1} p_j p̄_j^T.

Section 5 gives results similar to those of Section 3 but for (AA^T)^n and (AA^T)^n A. These are based on the SVD of A, which need not be square.

3. Diagonal Jordan Form

Now suppose that A has diagonal Jordan form, that is, A = P Λ P^{-1} for some matrix P and Λ as in (2.1). Such a matrix is said to be diagonalizable. If A and its eigenvalues are real, then P can be taken as real. Writing P = (p_1, ..., p_s) and Q^T = P^{-1} = (q_1, ..., q_s)^T, we have q_j^T p_k = δ_{jk} and Σ_{j=1}^{s} p_j q_j^T = I_s. By (1.1) and (1.2), for n ≥ 1,

    A^n = P Λ^n P^{-1} = Σ_{j=1}^{s} λ_j^n p_j q_j^T = r_1^n C_n + r_0^n E_n,   (3.1)

where C_n = Σ_{j=1}^{N} e^{inθ_j} p_j q_j^T and E_n is bounded as n increases. For example, if N = 1 then

    A^n = λ_1^n D + r_0^n F_n,

where D = p_1 q_1^T and F_n is bounded as n increases. On the other hand, if N = 2 then

    A^n = λ_1^n D_n + r_0^n F_n,

where D_n = p_1 q_1^T + e^{in(θ_2-θ_1)} p_2 q_2^T and F_n is bounded as n increases. In particular, if θ_2 - θ_1 = 2π/K for some K = 1, 2, ..., then D_n takes K distinct forms: D_n = D_m when n modulo K equals m, m = 0, 1, ..., K - 1. If det(A) ≠ 0 then (3.1) extends to n = -1, -2, ..., and to complex n. For example,

    A^{-1} = P Λ^{-1} P^{-1} = Σ_{j=1}^{s} λ_j^{-1} p_j q_j^T.
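
To make the N = 2 case concrete, the sketch below uses an arbitrarily chosen 3 × 3 matrix (an illustration, not an example from the paper) whose dominant eigenvalues are r_1 e^{±2πi/K}, so that the multiplier C_n of (3.1) cycles through K distinct forms as n increases.

    import numpy as np

    K, r1, r0 = 6, 2.0, 0.5
    theta = 2 * np.pi / K
    A = np.zeros((3, 3))
    A[:2, :2] = r1 * np.array([[np.cos(theta), -np.sin(theta)],
                               [np.sin(theta),  np.cos(theta)]])   # eigenvalues r_1 e^{±i theta}
    A[2, 2] = r0                                                   # sub-dominant eigenvalue

    lam, P = np.linalg.eig(A)
    Q = np.linalg.inv(P)
    dom = np.flatnonzero(np.isclose(np.abs(lam), r1))

    def C(n):                           # C_n of (3.1), keeping only the N = 2 dominant terms
        return sum(np.exp(1j * n * np.angle(lam[j])) * np.outer(P[:, j], Q[j, :]) for j in dom)

    n = 40
    An = np.linalg.matrix_power(A, n)
    print(np.allclose(An / r1 ** n, C(n).real))   # the r_0^n E_n remainder is negligible here
    print(np.allclose(C(n), C(n + K)))            # C_n cycles with period K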

4. General Jordan Form

Now suppose that A has non-diagonal Jordan form, that is, it has one or more multiple eigenvalues and

    A = P J P^{-1},   J = diag(J_1, ..., J_r),   J_j = J_{m_j}(λ_j),

for some matrix P, where J_m(λ) = λ I_m + U_m is the m × m Jordan block with λ on the diagonal and 1s on the superdiagonal, and U_m is the m × m matrix with 1s on the superdiagonal and 0s elsewhere: (U_m)_{jk} = δ_{j,k-1}. Again, if A and its eigenvalues are real, then P can be taken as real. Methods for obtaining P are known; see, for example, Dunford and Schwartz (1958), Finkbeiner II (1978), Horn and Johnson (1985) and Golub and van Loan (1996).

The m × m matrix U_m^n is defined as J_m(0)^n, so that U_m^0 = I_m, U_m^j has 1s on the jth superdiagonal and 0s elsewhere for 1 ≤ j ≤ m - 1, and U_m^k = 0 for k ≥ m. Also, by the binomial expansion, for λ ≠ 0 and n ≥ 1,

    J_m(λ)^n = Σ_{j=0}^{min(n,m-1)} \binom{n}{j} λ^{n-j} U_m^j   (4.1)
             = λ^n e_{n,m-1}(λ) [U_m^{m-1} + n^{-1} Ω_n]
             = d_{n,m-1}(λ) + n^{m-2} λ^n Ω'_n
             = n^{m-1} λ^n Ω''_n,

where

    e_{nk}(λ) = \binom{n}{k} λ^{-k},   d_{nk}(λ) = λ^n e_{nk}(λ) U_m^{m-1},

and U_m^{m-1} is the m × m matrix of 0s except for a 1 in the upper right corner. Further, Ω_n, Ω'_n and Ω''_n are bounded as n increases. Another way to write (4.1) is

    J_m(λ)^n = λ^n ×
        [ 1   e_{n1}   e_{n2}   ...   e_{n,m-1} ]
        [ 0   1        e_{n1}   ...   e_{n,m-2} ]
        [ ...                         ...       ]
        [ 0   0        ...      0     1         ],

where e_{nk} = e_{nk}(λ); that is, λ^{-n} J_m(λ)^n is the upper triangular Toeplitz matrix with (j, k) entry e_{n,k-j}(λ) for k ≥ j.
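
The binomial formula (4.1) is easy to verify numerically. The sketch below is illustrative only, with arbitrary choices of λ, m and n; it compares (4.1) with direct matrix powering and checks the dominant corner entry \binom{n}{m-1} λ^{n-m+1}.

    import numpy as np
    from math import comb

    def jordan_block(lam, m):
        return lam * np.eye(m) + np.eye(m, k=1)      # lambda I_m + U_m

    def jordan_power(lam, m, n):
        # J_m(lambda)^n = sum_{j=0}^{min(n, m-1)} binom(n, j) lambda^{n-j} U_m^j, i.e. (4.1)
        U = np.eye(m, k=1)
        return sum(comb(n, j) * lam ** (n - j) * np.linalg.matrix_power(U, j)
                   for j in range(min(n, m - 1) + 1))

    lam, m, n = 1.5, 4, 10
    J = jordan_block(lam, m)
    Jn = np.linalg.matrix_power(J, n)
    print(np.allclose(Jn, jordan_power(lam, m, n)))                       # True

    # The (1, m) entry is binom(n, m-1) lambda^{n-m+1}, the dominant term for large n.
    print(np.isclose(Jn[0, -1], comb(n, m - 1) * lam ** (n - m + 1)))     # True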

So we obtain the exact formula

    A^n = P J^n P^{-1} = P diag(J_1^n, ..., J_r^n) P^{-1},

where J_j^n is given by (4.1). Suppose now that N of the Jordan blocks {J_j} have eigenvalues with the maximum amplitude r_1, and that these are ordered so that

    r_1 = |λ_1| = ... = |λ_N| > r_0 = max_{N<j≤r} |λ_j|.

Because of multiplicities, this N is not the same as the N of (1.1). Suppose also that the first M ≤ N blocks have the maximum multiplicity m, that is, (1.4) holds, and that m_0 is the maximum multiplicity of the eigenvalues of amplitude r_0. Then for n ≥ 1,

    A^n = P diag(J_1^n, ..., J_N^n, 0, ..., 0) P^{-1} + n^{m_0-1} r_0^n Ψ_n
        = r_1^{n-m+1} P diag(d_{n,m-1}(e^{iθ_1}), ..., d_{n,m-1}(e^{iθ_M}), 0, ..., 0) P^{-1} + n^{m-2} r_1^n Ψ'_n

for {θ_j} of (1.2), where Ψ_n and Ψ'_n are bounded as n increases. Now partition P and its inverse into r × r blocks to match J, say P = (P_{jk}) and P^{-1} = (P^{jk}). So A^n is the r × r block matrix with (j, k) element

    (A^n)_{jk} = Σ_{c=1}^{r} P_{jc} J_c^n P^{ck}   (4.2)
               = Σ_{c=1}^{N} P_{jc} J_c^n P^{ck} + O(n^{m_0-1} r_0^n)
               = r_1^{n-m+1} Σ_{c=1}^{M} P_{jc} d_{n,m-1}(e^{iθ_c}) P^{ck} + O(n^{m-2} r_1^n)
               = \binom{n}{m-1} r_1^{n-m+1} [(C_n)_{jk} + n^{-1} (Ψ_n)_{jk}] = O(n^{m-1} r_1^n),

where

    (C_n)_{jk} = Σ_{c=1}^{M} e^{i(n-m+1)θ_c} P_{jc} U_m^{m-1} P^{ck} = O(1)

and Ψ_n is bounded as n increases. The matrix P_{jc} U_m^{m-1} P^{ck} has (p, q) element (P_{jc})_{p1} (P^{ck})_{mq}. For example, if |λ_1| > |λ_j| for j > 1, then M = 1 and

    A^n = \binom{n}{m-1} λ_1^{n-m+1} [D + n^{-1} Ψ_n],

where [D_{jk}]_{pq} = (P_{j1})_{p1} (P^{1k})_{mq} and Ψ_n is bounded as n increases. If det(A) ≠ 0 then (4.2) and (4.1) extend to n = -1, -2, ..., and to complex n. For example,

    (A^{-1})_{jk} = Σ_{c=1}^{r} P_{jc} J_c^{-1} P^{ck},   J_m(λ)^{-1} = Σ_{j=0}^{m-1} (-1)^j λ^{-1-j} U_m^j.
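
The following sketch illustrates the leading-order behaviour of Section 4 for a matrix whose dominant eigenvalue sits in a single Jordan block: A^n divided by \binom{n}{m-1} λ_1^{n-m+1} settles down to a fixed matrix D. The Jordan form and the similarity P below are arbitrary choices made for the illustration, not taken from the paper.

    import numpy as np
    from math import comb

    rng = np.random.default_rng(2)
    lam1, m = 3.0, 3
    J = np.zeros((5, 5))
    J[:3, :3] = lam1 * np.eye(3) + np.eye(3, k=1)   # dominant block J_3(3)
    J[3, 3], J[4, 4] = 1.0, -2.0                    # smaller eigenvalues, so r_0 = 2
    P = rng.standard_normal((5, 5))                 # assumed invertible (true with probability 1)
    A = P @ J @ np.linalg.inv(P)

    prev = None
    for n in (20, 40, 80, 160):
        scale = comb(n, m - 1) * lam1 ** (n - m + 1)   # binom(n, m-1) lambda_1^{n-m+1}
        Dn = np.linalg.matrix_power(A, n) / scale
        if prev is not None:
            print(n, np.linalg.norm(Dn - prev))        # successive differences shrink like O(1/n)
        prev = Dn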

Example 4.1. Take a 4 × 4 matrix A = P J P^{-1} whose Jordan form has r = 3 blocks J_1 = (1), J_2 = (2) and J_3 = J_2(4), for a suitable invertible P. Then

    J^n = diag( 1, 2^n, J_2(4)^n ),   J_2(4)^n = [ 4^n   n 4^{n-1} ; 0   4^n ],

and

    A^n = 4^n (Bn/4 + C) + 2^n D + E = n 4^n [B/4 + n^{-1} Φ_n]

for certain constant matrices B, C, D and E determined by P, where Φ_n is bounded as n increases.
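
The explicit matrices of Example 4.1 are not reproduced above, so the sketch below builds some matrix with the same Jordan structure diag(1, 2, J_2(4)) — the similarity P is a hypothetical choice, not the paper's — and checks the stated decomposition A^n = 4^n (Bn/4 + C) + 2^n D + E together with the leading term n 4^n B/4.

    import numpy as np

    J = np.diag([1.0, 2.0, 4.0, 4.0])
    J[2, 3] = 1.0                                    # Jordan blocks (1), (2), J_2(4)
    P = np.array([[1., 1., 0., 1.],
                  [0., 1., 1., 0.],
                  [1., 0., 1., 1.],
                  [0., 0., 0., 1.]])                 # arbitrary invertible similarity (not the paper's)
    Pinv = np.linalg.inv(P)
    A = P @ J @ Pinv

    def lift(S):                                     # map a matrix supported on J's blocks to A's basis
        return P @ S @ Pinv

    E = lift(np.diag([1., 0., 0., 0.]))              # 1^n part
    D = lift(np.diag([0., 1., 0., 0.]))              # 2^n part
    C = lift(np.diag([0., 0., 1., 1.]))              # 4^n I_2 part of J_2(4)^n
    S = np.zeros((4, 4))
    S[2, 3] = 1.0
    B = lift(S)                                      # n 4^{n-1} U_2 part of J_2(4)^n

    n = 9
    print(np.allclose(np.linalg.matrix_power(A, n),
                      4.0**n * (B * n / 4 + C) + 2.0**n * D + E))     # exact decomposition
    for n in (10, 50, 250):
        err = np.linalg.norm(np.linalg.matrix_power(A, n) / (n * 4.0**n) - B / 4)
        print(n, err)                                # A^n / (n 4^n) -> B/4, with error O(1/n)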

Example 4.2. Take a 3 × 3 matrix A = P J P^{-1} whose Jordan form has r = 2 blocks J_1 = J_2(2) and J_2 = J_1(1) = (1), for a suitable invertible P. Then

    J^n = diag( J_2(2)^n, 1 ),   J_2(2)^n = [ 2^n   n 2^{n-1} ; 0   2^n ],

and

    18 A^n = (Bn/2 + C) 2^n + D = n 2^n [B/2 + n^{-1} Φ_n]

for certain constant matrices B, C and D determined by P, where Φ_n is bounded as n increases.

5. Behaviour of (AA^T)^n and Related Forms for Large n

This section gives results similar to those of Section 3 but for (AA^T)^n and (AA^T)^n A. These are based on the SVD of A. The SVD of an m × n matrix A can be written

    A = Σ_{j=1}^{r} θ_j l_j r_j^T,

where r = min(m, n), l_j^T l_k = r_j^T r_k = δ_{jk} and 0 < θ_1 ≤ ... ≤ θ_r. So, for 1 ≤ j ≤ r,

    A r_j = θ_j l_j   and   A^T l_j = θ_j r_j.

So,

    AA^T = Σ_{j=1}^{r} θ_j^2 l_j l_j^T,   A^T A = Σ_{j=1}^{r} θ_j^2 r_j r_j^T,   AA^T A = Σ_{j=1}^{r} θ_j^3 l_j r_j^T.

That is, {θ_j^2} are the non-zero eigenvalues of AA^T and A^T A, and the corresponding eigenvectors are {l_j} and {r_j}.
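
These SVD identities are straightforward to confirm numerically. The sketch below is an illustration with an arbitrary 4 × 6 matrix, not code from the paper; it checks A r_j = θ_j l_j, A^T l_j = θ_j r_j, the expansion of AA^T, and the dominance of the largest singular value in (AA^T)^n for large n. Note that NumPy returns singular values in decreasing order, whereas the paper orders them increasingly.

    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.standard_normal((4, 6))                      # A need not be square

    L, theta, RT = np.linalg.svd(A, full_matrices=False) # A = L diag(theta) R^T, theta decreasing
    R = RT.T                                             # columns r_j

    print(np.allclose(A @ R, L * theta))                 # A r_j = theta_j l_j
    print(np.allclose(A.T @ L, R * theta))               # A^T l_j = theta_j r_j
    print(np.allclose(A @ A.T, (L * theta**2) @ L.T))    # AA^T = sum_j theta_j^2 l_j l_j^T

    n = 25
    lhs = np.linalg.matrix_power(A @ A.T, n)
    print(np.allclose(lhs, (L * theta**(2 * n)) @ L.T))  # (AA^T)^n = sum_j theta_j^{2n} l_j l_j^T
    print(np.linalg.norm(lhs / theta[0]**(2 * n)
                         - np.outer(L[:, 0], L[:, 0])))  # small: the largest singular value dominates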

Similarly, for n ≥ 1,

    (AA^T)^n = Σ_{j=1}^{r} θ_j^{2n} l_j l_j^T,
    (A^T A)^n = Σ_{j=1}^{r} θ_j^{2n} r_j r_j^T,
    (AA^T)^n A = Σ_{j=1}^{r} θ_j^{2n+1} l_j r_j^T,
    (A^T A)^n A^T = Σ_{j=1}^{r} θ_j^{2n+1} r_j l_j^T.

So, if max_{1≤j≤s} θ_j < θ_{s+1} = ... = θ_r, then as n → ∞,

    (AA^T)^n = θ_r^{2n} L_{sr} + θ_s^{2n} Δ_n = θ_r^{2n} Ξ_n,
    (A^T A)^n = θ_r^{2n} R_{sr} + θ_s^{2n} Δ'_n = θ_r^{2n} Ξ'_n,
    (AA^T)^n A = θ_r^{2n+1} B_{sr} + θ_s^{2n} Δ''_n = θ_r^{2n} Ξ''_n,
    (A^T A)^n A^T = θ_r^{2n+1} C_{sr} + θ_s^{2n} Δ'''_n = θ_r^{2n} Ξ'''_n,

where

    L_{sr} = Σ_{j=s+1}^{r} l_j l_j^T,   R_{sr} = Σ_{j=s+1}^{r} r_j r_j^T,
    B_{sr} = Σ_{j=s+1}^{r} l_j r_j^T,   C_{sr} = Σ_{j=s+1}^{r} r_j l_j^T,

with Δ_n, Δ'_n, Δ''_n, Δ'''_n, Ξ_n, Ξ'_n, Ξ''_n, Ξ'''_n, L_{sr}, R_{sr}, B_{sr} and C_{sr} bounded as n increases.

Acknowledgments. The authors would like to thank the Managing Editor and the referee for carefully reading the paper and for their comments, which greatly improved the paper.

References

[1] N. Dunford and J. T. Schwartz, Linear Operators, Part I: General Theory, Interscience, 1958.
[2] D. T. Finkbeiner II, Introduction to Matrices and Linear Transformations, Third Edition, Freeman, 1978.
[3] G. H. Golub and C. F. van Loan, Matrix Computations, Third Edition, Johns Hopkins University Press, Baltimore, 1996.
[4] R. A. Horn and C. R. Johnson, Matrix Analysis, Cambridge University Press, 1985.

C.S. Withers
Applied Mathematics Group
Industrial Research Limited
Lower Hutt
NEW ZEALAND
c.withers@irl.cri.nz

S. Nadarajah
School of Mathematics
University of Manchester
Manchester M13 9PL
UNITED KINGDOM
mbbsssn2@manchester.ac.uk
