Problems of Eigenvalues/Eigenvectors


Contents:
- Review of Eigenvalues and Eigenvectors
- Gershgorin's Disk Theorem
- Power and Inverse Power Methods
- Jacobi Transform for Symmetric Matrices
- Spectrum Decomposition Theorem
- Singular Value Decomposition with Applications
- QR Iterations for Computing Eigenvalues
- A Markov Process
- e^A and Differential Equations
- Other Topics with Applications

Definition and Examples

Let A ∈ R^{n×n}. If there is a vector v ≠ 0 such that Av = λv, then λ is called an eigenvalue of the matrix A, and v is called an eigenvector corresponding to (or belonging to) the eigenvalue λ. Note that if v is an eigenvector, then αv is also an eigenvector for all α ≠ 0. We define Eigenspace(λ) as the vector space spanned by all of the eigenvectors corresponding to the eigenvalue λ.

Examples: the slide works through several small matrices A, B, C together with their eigenvalues and eigenvectors, including real eigenvalues (e.g., λ = 2, λ = 4, λ = 3) and a complex-conjugate pair λ = ±j (j = √-1), as well as singular values τ_1 = 4, τ_2 = 2 of one of the matrices; the matrix entries and eigenvectors are given on the original slide.

Ax = λx  ⟺  (λI - A)x = 0, x ≠ 0  ⟺  det(λI - A) = P(λ) = 0.
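As a quick numerical check (a minimal MATLAB sketch added here, not part of the original slides), eig returns eigenvalues and eigenvectors directly and poly returns the characteristic polynomial; the 2-by-2 matrix below is only an illustration.

% Sketch: verify the definitions numerically (illustrative matrix only)
A = [2 1; 1 2];              % any square matrix will do
[V, D] = eig(A);             % columns of V are eigenvectors, diag(D) the eigenvalues
p = poly(A);                 % coefficients of det(lambda*I - A)
norm(A*V - V*D)              % numerically zero, since A*v = lambda*v column by column
roots(p)                     % the same eigenvalues, as roots of the characteristic polynomial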

Gershgorin's Disk Theorem

Note that ||u_i||_2 = 1 and ||v_i||_2 = 1 for i = 1, 2. Denote U = [u_1, u_2] and V = [v_1, v_2]; then U^{-1}BU and V^{-1}CV are diagonal. Note that V^t = V^{-1} but U^t ≠ U^{-1}.

Let A ∈ R^{n×n}; then det(λI - A) is called the characteristic polynomial of the matrix A.

Fundamental Theorem of Algebra: a real polynomial P(λ) = λ^n + a_{n-1}λ^{n-1} + ... + a_0 of degree n has n roots {λ_i} such that

    P(λ) = (λ - λ_1)(λ - λ_2)...(λ - λ_n) = λ^n - (Σ_{i=1}^n λ_i) λ^{n-1} + ... + (-1)^n Π_{i=1}^n λ_i.

Consequently, Σ_{i=1}^n λ_i = Σ_{i=1}^n a_ii = tr(A) and Π_{i=1}^n λ_i = det(A).

Gershgorin's Disk/Circle Theorem: every eigenvalue of a matrix A ∈ R^{n×n} lies in at least one of the disks

    D_i = {x : |x - a_ii| ≤ Σ_{j≠i} |a_ij|},   1 ≤ i ≤ n.

Example: for the 3×3 matrix B on the slide, whose diagonal entries are 3, 4, and 5, we have λ_1, λ_2, λ_3 ∈ D_1 ∪ D_2 ∪ D_3, where D_1 = {z : |z - 3| ≤ 2}, D_3 = {z : |z - 5| ≤ 4}, and D_2 is centered at 4 (radius as on the slide). The computed eigenvalues λ_1 = 6.566 and λ_2 = 3.0000 indeed lie in this union.

A matrix is said to be diagonally dominant if Σ_{j≠i} |a_ij| < |a_ii| for all 1 ≤ i ≤ n. A diagonally dominant matrix is invertible.
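A short MATLAB sketch (added here, using an arbitrary test matrix rather than the slide's B) that computes the Gershgorin centers and radii and reports which disks contain each eigenvalue.

% Sketch: Gershgorin disks of a test matrix
A = [3 1 1; 0 4 1; 2 2 5];                 % illustrative matrix, not the slide's B
centers = diag(A);
radii   = sum(abs(A), 2) - abs(centers);   % off-diagonal row sums
lam     = eig(A);
for k = 1:length(lam)
  inDisk = find(abs(lam(k) - centers) <= radii);   % disks containing lambda_k
  fprintf('eigenvalue %d lies in disk(s) %s\n', k, mat2str(inDisk'));
end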

Theorem: Let A, P ∈ R^{n×n} with P nonsingular. Then λ is an eigenvalue of A with eigenvector x iff λ is an eigenvalue of P^{-1}AP with eigenvector P^{-1}x.

Theorem: Let A ∈ R^{n×n} and let λ be an eigenvalue of A with eigenvector x. Then
(a) αλ is an eigenvalue of the matrix αA with eigenvector x;
(b) λ - μ is an eigenvalue of the matrix A - μI with eigenvector x;
(c) if A is nonsingular, then λ ≠ 0 and λ^{-1} is an eigenvalue of A^{-1} with eigenvector x.

Definition: A matrix A is similar to B, denoted by A ~ B, iff there exists an invertible matrix U such that U^{-1}AU = B. Furthermore, a matrix A is orthogonally similar to B iff there exists an orthogonal matrix Q such that Q^t AQ = B.

Theorem: Two similar matrices have the same eigenvalues, i.e., A ~ B ⟹ λ(A) = λ(B).

Diagonalization of Matrices

Theorem: Suppose A ∈ R^{n×n} has n linearly independent eigenvectors v_1, v_2, ..., v_n corresponding to eigenvalues λ_1, λ_2, ..., λ_n. Let V = [v_1, v_2, ..., v_n]; then V^{-1}AV = diag[λ_1, λ_2, ..., λ_n].

If A ∈ R^{n×n} has n distinct eigenvalues, then their corresponding eigenvectors are linearly independent. Thus any matrix with distinct eigenvalues can be diagonalized. Not all matrices have distinct eigenvalues, and not all matrices are diagonalizable.

The slide lists two nondiagonalizable matrices A, B and four diagonalizable matrices C, D, E, K (their entries are given on the original slide).

Spectrum Decomposition Theorem*: Every real symmetric matrix can be diagonalized.
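In MATLAB the relation V^{-1}AV = diag[λ_i] can be checked directly; the sketch below (an added illustration, with matrices chosen here) also shows a defective matrix for which the eigenvector matrix is rank deficient and no diagonalization exists.

% Sketch: diagonalizable vs. nondiagonalizable (defective) matrices
A = [2 1; 1 2];            % symmetric, hence diagonalizable
[V, D] = eig(A);
norm(V\A*V - D)            % essentially zero: V^{-1} A V = D
J = [1 1; 0 1];            % a Jordan block: only one independent eigenvector
[W, E] = eig(J);
rank(W)                    % rank 1 < 2, so J cannot be diagonalized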

Similarity Transformation and Triangularization

Schur's Theorem: For every A ∈ R^{n×n} there is an orthogonal matrix U such that U^t AU = T is upper triangular. The eigenvalues must be shared by the similar matrix T and appear along its main diagonal.

Hint: By induction, suppose that the theorem has been proved for all matrices of order n-1, and consider A ∈ R^{n×n} with Ax = λx and ||x||_2 = 1. There is a Householder matrix H_1 such that H_1 x = βe_1 (with |β| = ||x||_2 = 1); hence

    H_1 A H_1^t e_1 = H_1 A (H_1 e_1) = H_1 A (β^{-1} x) = β^{-1} H_1 (Ax) = β^{-1} λ (H_1 x) = β^{-1} λ (βe_1) = λ e_1.

Thus

    H_1 A H_1^t = [ λ  * ; 0  A_1 ],

and the induction hypothesis applies to A_1 ∈ R^{(n-1)×(n-1)}.

Spectrum Decomposition Theorem: Every real symmetric matrix can be diagonalized by an orthogonal matrix:

    Q^t AQ = Λ,  or  A = QΛQ^t = Σ_{i=1}^n λ_i q_i q_i^t.

Definition: A symmetric matrix A ∈ R^{n×n} is nonnegative definite if x^t Ax ≥ 0 for all x ∈ R^n.
Definition: A symmetric matrix A ∈ R^{n×n} is positive definite if x^t Ax > 0 for all x ∈ R^n, x ≠ 0.

Singular Value Decomposition Theorem: Each matrix A ∈ R^{m×n} can be decomposed as A = UΣV^t, where both U ∈ R^{m×m} and V ∈ R^{n×n} are orthogonal. Moreover, Σ ∈ R^{m×n} is essentially diagonal, Σ = diag[σ_1, σ_2, ..., σ_k, 0, ..., 0], with the singular values satisfying σ_1 ≥ σ_2 ≥ ... ≥ σ_k > 0, and

    A = UΣV^t = Σ_{i=1}^k σ_i u_i v_i^t.

Example: the slide computes the SVD of three small matrices A, B, C (entries as on the original slide).
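The decompositions on this slide are all available as MATLAB built-ins; the following sketch (added for illustration, with an arbitrary matrix) computes a Schur form, a spectral decomposition of a symmetric matrix, and an SVD.

% Sketch: Schur form, spectral decomposition, and SVD
A = [4 1 0; 2 3 1; 0 1 1];      % illustrative matrix
[U, T] = schur(A);              % U'*A*U = T is (quasi-)upper triangular
S = (A + A')/2;                 % a symmetric matrix
[Q, L] = eig(S);                % Q orthogonal, Q'*S*Q = L diagonal
norm(Q*L*Q' - S)                % spectral decomposition: S = sum lambda_i q_i q_i'
[Us, Sig, Vs] = svd(A);         % A = Us*Sig*Vs', singular values on diag(Sig)
norm(Us*Sig*Vs' - A)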

A Jacobi Transform (Givens Rotation)

The rotation J(i, k; θ) agrees with the identity matrix except for four entries:
    J_hh = 1 if h ≠ i and h ≠ k (where i < k),
    J_ii = J_kk = c = cos θ,
    J_ki = -s = -sin θ,   J_ik = s = sin θ.

Let x, y ∈ R^n. Then y = J(i, k; θ)x implies
    y_i = c x_i + s x_k,
    y_k = -s x_i + c x_k,
and y_h = x_h for h ≠ i, k. To annihilate y_k, choose

    c = x_i / sqrt(x_i^2 + x_k^2),   s = x_k / sqrt(x_i^2 + x_k^2).

Example: for x = (1, 2, 3, 4)^t, take (cos θ, sin θ) = (1/√5, 2/√5); then J(2, 4; θ)x = (1, 2√5, 3, 0)^t.
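The sketch below (added here) builds the rotation J(2, 4; θ) for this example vector and confirms that the fourth component is zeroed out.

% Sketch: a Givens rotation J(2,4;theta) that annihilates x(4)
x = [1; 2; 3; 4];
i = 2;  k = 4;
c = x(i)/sqrt(x(i)^2 + x(k)^2);      % cos(theta) = 1/sqrt(5)
s = x(k)/sqrt(x(i)^2 + x(k)^2);      % sin(theta) = 2/sqrt(5)
J = eye(4);
J(i,i) = c;  J(k,k) = c;
J(i,k) = s;  J(k,i) = -s;
y = J*x                              % y = [1; 2*sqrt(5); 3; 0]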

Jacobi Transforms (Givens Rotations)

The Jacobi method consists of a sequence of orthogonal similarity transformations such that

    J_K^t J_{K-1}^t ... J_2^t J_1^t A J_1 J_2 ... J_{K-1} J_K = Λ,

where each J_i is orthogonal, and so is Q = J_1 J_2 ... J_{K-1} J_K. Each Jacobi transform (Givens rotation) is just a plane rotation designed to annihilate one of the off-diagonal matrix elements.

Let A = (a_ij) be symmetric and let B = J^t(p, q; θ) A J(p, q; θ). Then
    b_rp = c a_rp - s a_rq   for r ≠ p, r ≠ q,
    b_rq = s a_rp + c a_rq   for r ≠ p, r ≠ q,
    b_pp = c^2 a_pp + s^2 a_qq - 2sc a_pq,
    b_qq = s^2 a_pp + c^2 a_qq + 2sc a_pq,
    b_pq = (c^2 - s^2) a_pq + sc (a_pp - a_qq).

To set b_pq = 0, we choose c, s such that

    α = cot(2θ) = (c^2 - s^2) / (2sc) = (a_qq - a_pp) / (2 a_pq).

For computational convenience, let t = s/c; then t^2 + 2αt - 1 = 0, whose smaller root (in the absolute sense) can be computed by

    t = sgn(α) / ( sqrt(α^2 + 1) + |α| ),   and   c = 1 / sqrt(1 + t^2),   s = ct,   τ = s / (1 + c).

Remark:
    b_pp = a_pp - t a_pq,
    b_qq = a_qq + t a_pq,
    b_rp = a_rp - s (a_rq + τ a_rp),
    b_rq = a_rq + s (a_rp - τ a_rq).

Algorithm of Jacobi Transforms to Diagonalize A

    A^(0) ← A
    for k = 0, 1, 2, ... until convergence
        let |a_pq^(k)| = max_{i<j} |a_ij^(k)|
        compute α_k = (a_qq^(k) - a_pp^(k)) / (2 a_pq^(k))    (i.e., solve cot(2θ_k) = α_k for θ_k)
        t = sgn(α_k) / ( sqrt(α_k^2 + 1) + |α_k| ),   c = 1/sqrt(1 + t^2),   s = ct,   τ = s/(1 + c)
        A^(k+1) ← J_k^t A^(k) J_k,   where J_k = J(p, q; θ_k)
    endfor
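A minimal MATLAB implementation of the classical Jacobi iteration above (a sketch added here, not the course's official code); it follows the formulas for α, t, c, s on the previous slide and forms the rotation explicitly for clarity rather than speed.

% Sketch: classical Jacobi method for a symmetric matrix
A = [4 2 0; 2 3 1; 0 1 1];                  % symmetric test matrix
tol = 1e-12;
while true
  B = A - diag(diag(A));                    % off-diagonal part
  [m, idx] = max(abs(B(:)));                % largest off-diagonal entry
  if m < tol, break; end
  [p, q] = ind2sub(size(A), idx);
  alpha = (A(q,q) - A(p,p)) / (2*A(p,q));
  t = sign(alpha) / (abs(alpha) + sqrt(alpha^2 + 1));
  if alpha == 0, t = 1; end                 % cot(2*theta) = 0: take theta = pi/4
  c = 1/sqrt(1 + t^2);  s = c*t;
  J = eye(size(A));
  J(p,p) = c;  J(q,q) = c;  J(p,q) = s;  J(q,p) = -s;
  A = J' * A * J;                           % annihilates A(p,q) and A(q,p)
end
diag(A)'                                    % approximate eigenvalues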

Convergence of the Jacobi Algorithm to Diagonalize A

Proof: Since |a_pq^(k)| ≥ |a_ij^(k)| for all i ≠ j, we have (a_pq^(k))^2 ≥ off(A^(k)) / (2N), where N = n(n-1)/2 and

    off(A^(k)) = Σ_{i≠j} (a_ij^(k))^2,

the sum of squared off-diagonal elements of A^(k). Furthermore,

    off(A^(k+1)) = off(A^(k)) - 2 (a_pq^(k))^2 + 2 (a_pq^(k+1))^2
                 = off(A^(k)) - 2 (a_pq^(k))^2,          since a_pq^(k+1) = 0,
                 ≤ off(A^(k)) (1 - 1/N),                 since (a_pq^(k))^2 ≥ off(A^(k)) / (2N).

Thus off(A^(k+1)) ≤ (1 - 1/N)^{k+1} off(A^(0)) → 0 as k → ∞.

Example: let

    A = [ 4 2 0 ; 2 3 1 ; 0 1 1 ],    J(1, 2; θ) = [ c s 0 ; -s c 0 ; 0 0 1 ].

Then

    A^(1) = J^t(1, 2; θ) A J(1, 2; θ) = [ 4c^2 - 4cs + 3s^2    2c^2 + cs - 2s^2    -s ;
                                          2c^2 + cs - 2s^2     3c^2 + 4cs + 4s^2    c ;
                                          -s                   c                    1 ].

With θ chosen so that the (1,2) entry vanishes, off(A^(1)) = 2(s^2 + c^2) = 2 < 10 = off(A^(0)) = off(A).

Example for Convergence of the Jacobi Algorithm

The slide displays the successive iterates A^(0), A^(1), ..., A^(6) of the Jacobi algorithm applied to a symmetric matrix: the off-diagonal entries shrink toward zero at each step, and the diagonal entries converge to the eigenvalues (the numerical entries are given on the original slide).

Powers of a Matrix and Its Eigenvalues

Theorem: Let λ_1, λ_2, ..., λ_n be the eigenvalues of A ∈ R^{n×n}. Then λ_1^k, λ_2^k, ..., λ_n^k are the eigenvalues of A^k ∈ R^{n×n}, with the same corresponding eigenvectors as A. That is,

    A v_i = λ_i v_i  ⟹  A^k v_i = λ_i^k v_i,   1 ≤ i ≤ n.

Suppose that the matrix A ∈ R^{n×n} has n linearly independent eigenvectors v_1, v_2, ..., v_n corresponding to the eigenvalues λ_1, λ_2, ..., λ_n. Then any x ∈ R^n can be written as

    x = c_1 v_1 + c_2 v_2 + ... + c_n v_n,   so that   A^k x = λ_1^k c_1 v_1 + λ_2^k c_2 v_2 + ... + λ_n^k c_n v_n.

In particular, if |λ_1| > |λ_j| for 2 ≤ j ≤ n and c_1 ≠ 0, then A^k x will tend to lie in the direction of v_1 when k is large enough.

Power Method for Computing the Largest Eigenvalue

Suppose that the matrix A ∈ R^{n×n} is diagonalizable, that U^{-1}AU = diag(λ_1, λ_2, ..., λ_n) with U = [v_1, v_2, ..., v_n], and that |λ_1| > |λ_2| ≥ |λ_3| ≥ ... ≥ |λ_n|. Given u^(0) ∈ R^n, the power method produces a sequence of vectors u^(k) as follows:

    for k = 1, 2, ...
        z^(k) = A u^(k-1)
        r^(k) = z_m^(k), where |z_m^(k)| = ||z^(k)||_∞ for some 1 ≤ m ≤ n
        u^(k) = z^(k) / r^(k)
    endfor

λ_1 must be real, since the complex eigenvalues of a real matrix appear in conjugate pairs (which have equal modulus).

Example: for the 2×2 matrix A on the slide, with dominant eigenvalue λ_1 = 3 and eigenvectors v_1, v_2, starting from the given u^(0) the iterates u^(5) and r^(5) already give good approximations to the dominant eigenvector v_1 and the dominant eigenvalue λ_1 = 3.
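A direct MATLAB transcription of the iteration above (a sketch added here); the test matrix is chosen symmetric with dominant eigenvalue 3, so r^(k) converges to 3 and u^(k) to a multiple of the dominant eigenvector.

% Sketch: power method with infinity-norm scaling
A = [2 1; 1 2];                       % eigenvalues 3 and 1
u = [1; 0];                           % initial guess u^(0)
for k = 1:25
  z = A*u;                            % z^(k) = A u^(k-1)
  [zmax, m] = max(abs(z));            % index of the largest component
  r = z(m);                           % r^(k) approximates lambda_1
  u = z / r;                          % u^(k), scaled so its m-th entry is 1
end
r, u                                  % r -> 3, u -> multiple of [1; 1]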

QR Iterations for Computing Eigenvalues

%
% Script File: shiftqr.m
% Solving eigenvalues by shift-QR factorization
%
Nrun = 5;
fin = fopen('datamatrix.txt');
fgetl(fin);                       % read off the header line
n = fscanf(fin, '%d', 1);
A = fscanf(fin, '%f', [n n]);
A = A';
SaveA = A;
for k = 1:Nrun,
  s = A(n,n);                     % shift
  A = A - s*eye(n);
  [Q, R] = qr(A);
  A = R*Q + s*eye(n);
end
eig(SaveA)
%
% datamatrix.txt
% Matrices for computing eigenvalues by QR factorization or shift-QR studies
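If the data file is not at hand, the same shifted-QR loop can be tried on a matrix generated in place; the short sketch below (an added illustration, not part of the course script) compares the diagonal of the iterated matrix with eig.

% Sketch: shift-QR iteration on a randomly generated symmetric matrix
n = 5;
A = randn(n); A = A + A';        % symmetric test matrix (real eigenvalues)
SaveA = A;
for k = 1:40
  s = A(n,n);                    % shift from the last diagonal entry
  [Q, R] = qr(A - s*eye(n));
  A = R*Q + s*eye(n);            % similar to the previous A
end
sort(diag(A))                    % approximate eigenvalues (no deflation, so agreement may be rough)
sort(eig(SaveA))                 % reference values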

Norms of Vectors and Matrices

Definition: A vector norm on R^n is a function τ : R^n → R_+ = {x ≥ 0 : x ∈ R} that satisfies
(1) τ(x) > 0 for all x ≠ 0, and τ(0) = 0;
(2) τ(cx) = |c| τ(x) for all c ∈ R, x ∈ R^n;
(3) τ(x + y) ≤ τ(x) + τ(y) for all x, y ∈ R^n.

Hölder norm (p-norm): ||x||_p = ( Σ_{i=1}^n |x_i|^p )^{1/p} for p ≥ 1.
(p = 1)   ||x||_1 = Σ_{i=1}^n |x_i|   (Manhattan or city-block distance)
(p = 2)   ||x||_2 = ( Σ_{i=1}^n |x_i|^2 )^{1/2}   (Euclidean distance)
(p = ∞)   ||x||_∞ = max_{1≤i≤n} |x_i|   (∞-norm)
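In MATLAB the three Hölder norms are available through norm; the sketch below (added here, with an arbitrary vector) checks them against the defining formulas.

% Sketch: vector p-norms vs. their definitions
x = [3; -4; 12];
[norm(x,1),   sum(abs(x))]          % 1-norm: 19
[norm(x,2),   sqrt(sum(x.^2))]      % 2-norm: 13
[norm(x,inf), max(abs(x))]          % infinity-norm: 12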

Definition: A matrix norm on R^{m×n} is a function τ : R^{m×n} → R_+ = {x ≥ 0 : x ∈ R} that satisfies
(1) τ(A) > 0 for all A ≠ O, and τ(O) = 0;
(2) τ(cA) = |c| τ(A) for all c ∈ R, A ∈ R^{m×n};
(3) τ(A + B) ≤ τ(A) + τ(B) for all A, B ∈ R^{m×n}.

Consistency property: τ(AB) ≤ τ(A) τ(B) for all A, B of compatible sizes.

Examples:
(a) τ(A) = max{ |a_ij| : 1 ≤ i ≤ m, 1 ≤ j ≤ n }
(b) ||A||_F = [ Σ_{i=1}^m Σ_{j=1}^n a_ij^2 ]^{1/2}   (Frobenius norm)

Subordinate matrix norm: ||A|| = max_{x≠0} { ||Ax|| / ||x|| }.
(1) If A ∈ R^{m×n}, then ||A||_1 = max_{1≤j≤n} ( Σ_{i=1}^m |a_ij| ).
(2) If A ∈ R^{m×n}, then ||A||_∞ = max_{1≤i≤m} ( Σ_{j=1}^n |a_ij| ).
(3) If A ∈ R^{n×n} is real symmetric, then ||A||_2 = max_{1≤i≤n} |λ_i|, where λ_i ∈ λ(A).
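Similarly for matrix norms (a sketch added here): norm(A,1) is the maximum column sum, norm(A,inf) the maximum row sum, norm(A,'fro') the Frobenius norm, and norm(A,2) the largest singular value.

% Sketch: subordinate and Frobenius matrix norms
A = [1 -2; 3 4];
[norm(A,1),     max(sum(abs(A),1))]        % max column sum: 6
[norm(A,inf),   max(sum(abs(A),2))]        % max row sum: 7
[norm(A,'fro'), sqrt(sum(A(:).^2))]        % Frobenius norm
[norm(A,2),     max(svd(A))]               % spectral norm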

Theorem: Let A = (a_ij) ∈ R^{n×n} and define ||A||_1 = sup_{||u||_1 = 1} ||Au||_1. Then ||A||_1 = max_{1≤j≤n} Σ_{i=1}^n |a_ij|.

Proof: For ||u||_1 = 1,

    ||Au||_1 = Σ_i | Σ_j a_ij u_j | ≤ Σ_i Σ_j |a_ij| |u_j| = Σ_j |u_j| Σ_i |a_ij|.

Then

    ||A||_1 ≤ max_{1≤j≤n} ( Σ_i |a_ij| ) Σ_j |u_j| = max_{1≤j≤n} Σ_i |a_ij|.

On the other hand, let Σ_i |a_ik| = max_{1≤j≤n} Σ_i |a_ij| and choose u = e_k, which attains this bound and completes the proof.

Theorem: Let A = [a_ij] ∈ R^{m×n} and define ||A||_∞ = max_{||u||_∞ = 1} ||Au||_∞. Show that ||A||_∞ = max_{1≤i≤m} Σ_{j=1}^n |a_ij|.

Proof: Let Σ_j |a_Kj| = max_{1≤i≤m} Σ_j |a_ij|. For any x ∈ R^n with ||x||_∞ = 1, we have

    ||Ax||_∞ = max_i | Σ_j a_ij x_j | ≤ max_i Σ_j |a_ij| |x_j| ≤ max_i Σ_j |a_ij| ||x||_∞ = max_i Σ_j |a_ij| = Σ_j |a_Kj|.

In particular, if we pick y ∈ R^n such that y_j = sign(a_Kj), 1 ≤ j ≤ n, then ||y||_∞ = 1 and ||Ay||_∞ = Σ_j |a_Kj|, which completes the proof.
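The constructive step of both proofs (the 1-norm is attained at u = e_k, the ∞-norm at y with y_j = sign(a_Kj)) can be checked numerically; a small added sketch with an arbitrary matrix:

% Sketch: the vectors used in the proofs attain the norms
A = [1 -2; 3 4];
[cmax, kcol] = max(sum(abs(A),1));   % column with the largest absolute sum
ek = zeros(2,1); ek(kcol) = 1;
[norm(A*ek,1), norm(A,1)]            % equal: e_k attains the 1-norm
[rmax, Krow] = max(sum(abs(A),2));   % row with the largest absolute sum
y = sign(A(Krow,:))';                % y_j = sign(a_Kj), so ||y||_inf = 1
[norm(A*y,inf), norm(A,inf)]         % equal: y attains the infinity-norm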

Theorem: Let A = [a_ij] ∈ R^{n×n} and define ||A||_2 = max_{||x||_2 = 1} ||Ax||_2. Then ||A||_2 = sqrt(ρ(A^t A)), the square root of the maximum eigenvalue of A^t A (its spectral radius).

Proof: Let λ_1, λ_2, ..., λ_n be the eigenvalues, with corresponding unit eigenvectors u_1, u_2, ..., u_n, of the matrix A^t A; that is,

    (A^t A) u_i = λ_i u_i   and   ||u_i||_2 = 1,   1 ≤ i ≤ n.

Since u_1, u_2, ..., u_n form an orthonormal basis (by the spectrum decomposition theorem), any x ∈ R^n can be written as x = Σ_{i=1}^n c_i u_i. Then

    ||A||_2^2 = max_{||x||_2 = 1} ||Ax||_2^2 = max_{||x||_2 = 1} x^t A^t A x = max_{Σ c_i^2 = 1} Σ_{i=1}^n λ_i c_i^2 = max_{1≤j≤n} λ_j,

so ||A||_2 = sqrt(max_j λ_j).
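The identity ||A||_2 = sqrt(ρ(A^t A)) can be verified numerically; a small added check with an illustrative matrix:

% Sketch: ||A||_2 equals the square root of the largest eigenvalue of A'*A
A = [1 2; -1 3];
[norm(A,2), sqrt(max(eig(A'*A))), max(svd(A))]   % all three agree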

A Markov Process

Suppose that 10% of the people outside Taiwan move in, and 20% of the people inside Taiwan move out, in each year. Let y_k and z_k be the populations at the end of the k-th year, outside Taiwan and inside Taiwan, respectively. Then we have

    [ y_k ; z_k ] = A [ y_{k-1} ; z_{k-1} ],   A = [ 0.9 0.2 ; 0.1 0.8 ],   with eigenvalues λ_1 = 1.0, λ_2 = 0.7,

and therefore

    [ y_k ; z_k ] = A^k [ y_0 ; z_0 ] = (1/3) [ 2 1 ; 1 -1 ] [ 1 0 ; 0 (0.7)^k ] [ 1 1 ; 1 -2 ] [ y_0 ; z_0 ].

A Markov matrix A is nonnegative with each column adding to 1.
(a) λ_1 = 1 is an eigenvalue with a nonnegative eigenvector x_1.
(b) The other eigenvalues satisfy |λ_i| ≤ 1.
(c) If some power of A has all positive entries, then the other eigenvalues satisfy |λ_i| < 1, and A^k u_0 approaches the steady state u_∞, which is a multiple of x_1, as long as the projection of u_0 on x_1 is not zero.

Check the Perron-Frobenius theorem in Strang's book.
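A short MATLAB simulation of this process (added here, with made-up initial populations) shows the convergence to the steady state, which is the eigenvector of λ = 1 scaled to the total population.

% Sketch: the migration model as a Markov process
A = [0.9 0.2; 0.1 0.8];            % columns sum to 1
u = [100; 200];                    % illustrative initial populations [outside; inside]
for k = 1:50
  u = A*u;                         % one year of migration
end
u                                  % steady state, proportional to [2; 1]
[V, D] = eig(A);
[dmax, i1] = max(diag(D));         % eigenvalue 1
v1 = V(:,i1);
v1 / sum(v1) * sum([100; 200])     % the same steady state from the eigenvector of lambda = 1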

e^A and Differential Equations

    e^A = I + A/1! + A^2/2! + ... + A^m/m! + ...

    du/dt = λu  ⟹  u(t) = e^{λt} u(0)
    du/dt = Au  ⟹  u(t) = e^{tA} u(0)

If A = UΛU^t for an orthogonal matrix U, then

    e^A = U e^Λ U^t = U diag[e^{λ_1}, e^{λ_2}, ..., e^{λ_n}] U^t.

Solve x''' - 3x'' + 2x' = 0. Let y = x', z = y' = x'', and let u = [x, y, z]^t. The problem is reduced to solving

    u' = Au = [ 0 1 0 ; 0 0 1 ; 0 -2 3 ] u.

Then u(t) = e^{tA} u(0), where the entries of e^{tA} are combinations of 1, e^t, and e^{2t} (the explicit matrix is given on the slide).
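MATLAB's expm computes the matrix exponential directly; the sketch below (added here) solves the third-order equation above through u' = Au and compares with the closed-form solution built from 1, e^t, and e^{2t}. The initial condition is only an illustration.

% Sketch: solving x''' - 3x'' + 2x' = 0 via the matrix exponential
A = [0 1 0; 0 0 1; 0 -2 3];        % companion form for u = [x; x'; x'']
u0 = [1; 1; 2];                    % illustrative initial condition
t = 0.5;
u = expm(t*A)*u0                   % u(t) = e^{tA} u(0)
% closed form: x(t) = c1 + c2*e^t + c3*e^{2t}, with c matched to x(0), x'(0), x''(0)
c = [1 1 1; 0 1 2; 0 1 4] \ u0;
x = c(1) + c(2)*exp(t) + c(3)*exp(2*t)   % agrees with u(1)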

Problems Solved by Matlab

Let A, B, H, x, y, u, b be the matrices and vectors defined on the slide, with H = I - 2uu^t and u = [1/2, 1/2, 1/2, 1/2]^t (the entries of A, B, x, y, and b are given on the original slide).

1. Let A = LU = QR; find L, U, Q, R.
2. Find the determinants and inverses of the matrices A, B, and H.
3. Solve Ax = b. How can the number of floating-point operations required be found?
4. Find the ranks of the matrices A, B, and H.
5. Find the characteristic polynomials of the matrices A and B.
6. Find the 1-norm, 2-norm, and ∞-norm of the matrices A, B, and H.
7. Find the eigenvalues/eigenvectors of the matrices A and B.
8. Find matrices U and V such that U^{-1}AU and V^{-1}BV are diagonal matrices.
9. Find the singular values and singular vectors of the matrices A and B.
10. Randomly generate a 4×4 matrix C with 0 ≤ C(i,j) ≤ 9.

A sketch of the corresponding MATLAB commands follows this list.
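Most of the items above map to one-line MATLAB calls; the following sketch (added here, using a randomly generated matrix in place of the slide's A, whose entries are not reproduced) indicates which commands apply.

% Sketch: MATLAB commands for the exercises (illustrative matrix only)
A = rand(4);  b = rand(4,1);
[L, U, P] = lu(A);                  % 1. LU factorization (with pivoting, P*A = L*U)
[Q, R] = qr(A);                     % 1. QR factorization
det(A), inv(A)                      % 2. determinant and inverse
x = A\b;                            % 3. solve Ax = b
rank(A)                             % 4. rank
poly(A)                             % 5. characteristic polynomial coefficients
norm(A,1), norm(A,2), norm(A,inf)   % 6. matrix norms
[V, D] = eig(A);                    % 7.-8. eigenvalues/eigenvectors; V\A*V is diagonal
[Us, S, Vs] = svd(A);               % 9. singular values and singular vectors
C = floor(10*rand(4));              % 10. random 4x4 matrix with entries 0..9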
