Eigenvalues, Eigenvectors, Diagonalization

Lecture notes for Mathematische Methoden der Physik I

J. Mark Heinzle
Gravitational Physics, Faculty of Physics, University of Vienna


1 Basics

vector space $V$ of dimension $\dim V = n$ over the field $\mathbb{R}$ (or $\mathbb{C}$); basis $\{b_1, b_2, \ldots, b_n\}$.

every vector $v \in V$ is represented as a unique linear combination of the basis vectors,
$$v = v_1 b_1 + v_2 b_2 + \cdots + v_n b_n = \sum_{i=1}^n v_i b_i .$$

different bases lead to different components: for a basis $\{\hat b_1, \ldots, \hat b_n\}$ and a basis $\{\check b_1, \ldots, \check b_n\}$ we have
$$v = \sum_{i=1}^n \hat v_i \hat b_i = \sum_{i=1}^n \check v_i \check b_i ;$$
the components $(\hat v_i)_{i=1,\ldots,n}$ and $(\check v_i)_{i=1,\ldots,n}$ are completely different but represent one and the same vector (w.r.t. two different bases).

once a basis has been chosen (or if it is clear which basis is used), it is customary to write the components of a vector $v$ (w.r.t. that basis) as
$$v = \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix} ;$$
one speaks of a column vector.

an endomorphism is a linear map $A$ of the vector space $V$ onto itself, i.e., $A : V \to V$.

such a linear map takes vectors $v \in V$ and maps these to $A(v) \in V$. it is customary to write $Av$ instead of $A(v)$, because of the linearity of $A$.

once a basis has been chosen (or if it is clear which basis is used), the linear map is represented by a matrix, which is usually denoted by the same letter.

the map $A$ is linear, hence $Av = A\big(\sum_{j=1}^n v_j b_j\big) = \sum_{j=1}^n v_j\, A b_j$. now, $A b_j$ is a vector in $V$ and can thus be decomposed w.r.t. the basis,
$$A b_j = \sum_{i=1}^n A_{ij}\, b_i .$$
it follows that
$$Av = \sum_{j=1}^n v_j\, A b_j = \sum_{j=1}^n v_j \sum_{i=1}^n A_{ij} b_i = \sum_{i=1}^n \Big( \sum_{j=1}^n A_{ij} v_j \Big) b_i ;$$
therefore the $i = 1, \ldots, n$ components of the transformed vector are $\sum_{j=1}^n A_{ij} v_j$.

collecting the components $A_{ij}$ into a matrix
$$A = \big(A_{ij}\big)_{i,j=1,\ldots,n} = \begin{pmatrix} A_{11} & A_{12} & \cdots & A_{1n} \\ A_{21} & A_{22} & \cdots & A_{2n} \\ \vdots & \vdots & & \vdots \\ A_{n1} & A_{n2} & \cdots & A_{nn} \end{pmatrix} ,$$
we obtain the transformed vector $Av$ in its column vector representation through matrix multiplication,
$$Av = \begin{pmatrix} A_{11} & A_{12} & \cdots & A_{1n} \\ A_{21} & A_{22} & \cdots & A_{2n} \\ \vdots & \vdots & & \vdots \\ A_{n1} & A_{n2} & \cdots & A_{nn} \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix} .$$

it is a common source of confusion to denote the linear map $A$ and the matrix representation of $A$ (w.r.t. a basis) by the same letter. always keep in mind that while the linear map is an abstract (and fixed) entity, its matrix representation is not at all fixed but depends on the basis we choose. different basis: same map but different matrix.
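
a quick numerical illustration of the component formula $(Av)_i = \sum_j A_{ij} v_j$ is the following minimal Python/NumPy sketch; the matrix and vector entries are sample values chosen for illustration only.

    import numpy as np

    # sample matrix representation of a linear map and sample components of v
    A = np.array([[2.0, 1.0, 0.0],
                  [0.0, 3.0, 1.0],
                  [1.0, 0.0, 1.0]])
    v = np.array([1.0, 2.0, 3.0])

    # components of the transformed vector: (Av)_i = sum_j A_ij v_j
    Av_explicit = np.array([sum(A[i, j] * v[j] for j in range(3)) for i in range(3)])

    # the same result via matrix multiplication
    Av_matmul = A @ v

    print(Av_explicit)                            # [4. 9. 4.]
    print(np.allclose(Av_explicit, Av_matmul))    # True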

2 Eigenvalues and eigenvectors

a vector $v \neq o$ is an eigenvector of a linear map $A$ if it is merely stretched or compressed by the map $A$, i.e., if there is a number $\lambda$ such that
$$Av = \lambda v .$$
eigenvectors are the linear map's pampered children. while the linear map might have some nasty effect on a general vector (rotate, reflect, ...), the map is rather kind to an eigenvector: the effect of the map on an eigenvector is to simply multiply it by a number. the number $\lambda$ is called the eigenvalue that is associated with the eigenvector $v$.

Example. Consider the vector space $\mathbb{R}^2$ (with the standard basis) and a linear map represented by a $(2\times 2)$ matrix $A$. A (nonzero) vector $v$ satisfying $Av = 2v$ is an eigenvector of $A$; the associated eigenvalue is $2$.
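
to make the defining relation concrete, the following minimal NumPy check verifies $Av = \lambda v$ for a sample matrix and vector; the values are chosen for illustration and are not necessarily those of the example above.

    import numpy as np

    # sample 2x2 matrix and candidate eigenvector (illustrative values)
    A = np.array([[1.0, 1.0],
                  [2.0, 0.0]])
    v = np.array([1.0, 1.0])

    Av = A @ v
    lam = 2.0
    print(Av)                          # [2. 2.]
    print(np.allclose(Av, lam * v))    # True: Av = 2 v, so v is an eigenvector with eigenvalue 2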

Example. The set of $C^\infty$ functions $x \mapsto f(x)$ forms a vector space, and $\frac{d}{dx}$ is a linear map. The function $e^{-x}$ is an eigenvector ("eigenfunction") since $\frac{d}{dx} e^{-x} = -e^{-x}$. The associated eigenvalue is $(-1)$. (Since the vector space of $C^\infty$ functions is not finite-dimensional, we cannot operate with matrices.)

$v$ is an eigenvector of $A$ if and only if there exists $\lambda$ such that
$$Av = \lambda v \quad\Longleftrightarrow\quad (A - \lambda)v = o .$$
reinterpreting this equation we see that the set of eigenvalues of $A$ is the set of numbers $\lambda$ such that $(A-\lambda)v = o$ possesses a non-trivial solution $v$. this implies a number of equivalent statements:

$\lambda$ is an eigenvalue of $A$ $\Longleftrightarrow$ $(A-\lambda)v = o$ has non-trivial solutions $v$ $\Longleftrightarrow$ $\ker(A-\lambda) \neq \{o\}$ $\Longleftrightarrow$ $(A-\lambda)$ is not invertible $\Longleftrightarrow$ $\det(A-\lambda) = 0$.

consequently, to obtain the set of eigenvalues of $A$ we must solve the equation
$$\det(A - \lambda) = \begin{vmatrix} A_{11} - \lambda & A_{12} & \cdots & A_{1n} \\ A_{21} & A_{22} - \lambda & \cdots & A_{2n} \\ \vdots & \vdots & & \vdots \\ A_{n1} & A_{n2} & \cdots & A_{nn} - \lambda \end{vmatrix} = 0 .$$
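
the criterion $\det(A-\lambda) = 0$ can be checked numerically; in the following sketch the matrix is an arbitrary sample chosen for illustration.

    import numpy as np

    # sample matrix (illustrative values)
    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])

    for lam in np.linalg.eigvals(A):
        # det(A - lambda*I) vanishes (up to rounding) exactly at the eigenvalues
        print(lam, np.linalg.det(A - lam * np.eye(2)))

    # a number that is not an eigenvalue gives a nonzero determinant
    print(np.linalg.det(A - 1.0 * np.eye(2)))    # 1.0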

the expression $\det(A-\lambda)$ is called the characteristic polynomial of $A$. it is a polynomial of degree $n$,
$$\det(A-\lambda) = (-1)^n \big( \lambda^n + c_{n-1}\lambda^{n-1} + \cdots + c_1 \lambda + c_0 \big) .$$
the coefficients $(c_i)_{i=0,\ldots,n-1}$ are determined by the components $(A_{ij})_{i,j=1,\ldots,n}$ of $A$. the eigenvalues of $A$ are the zeros of the characteristic polynomial, i.e., the solutions of the equation
$$\det(A-\lambda) = (-1)^n \big( \lambda^n + c_{n-1}\lambda^{n-1} + \cdots + c_1 \lambda + c_0 \big) = 0 .$$

how many eigenvalues does a linear map $A$ possess? it is crucial to distinguish real vector spaces and complex vector spaces.

consider a vector space over the field $\mathbb{R}$. then the linear map $A$ is represented by a real matrix (i.e., a matrix whose entries are real) and the coefficients of the characteristic polynomial are real. the eigenvalues of $A$ are the (real!) zeros of the characteristic polynomial. the characteristic polynomial is a polynomial of degree $n$. therefore, if $n$ is even, the number of eigenvalues of the map $A$ can be anything between $0$ and $n$; if $n$ is odd, the number of eigenvalues of the map $A$ can be anything between $1$ and $n$ (a real polynomial of odd degree always has at least one real zero).
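
numerically, the coefficients of the characteristic polynomial and its zeros can be obtained as follows; the matrix is a sample chosen for illustration. (NumPy's convention is $\det(\lambda\,I - A)$, which differs from $\det(A-\lambda)$ only by the overall sign $(-1)^n$ and hence has the same zeros.)

    import numpy as np

    # sample matrix (illustrative values)
    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    coeffs = np.poly(A)    # coefficients of det(lambda*I - A) = lambda^2 - 7*lambda + 10
    print(coeffs)                             # approx. [ 1. -7. 10.]
    print(np.sort(np.roots(coeffs)))          # zeros of the characteristic polynomial: approx. [2. 5.]
    print(np.sort(np.linalg.eigvals(A)))      # the eigenvalues of A: approx. [2. 5.]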

Example. Consider the vector space $\mathbb{R}^2$ and the linear maps represented by the matrices
$$A_0 = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}, \qquad A_1 = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \qquad A_2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} .$$
The characteristic polynomials are
$$\det(A_0-\lambda) = \begin{vmatrix} -\lambda & -1 \\ 1 & -\lambda \end{vmatrix} = \lambda^2 + 1 = 0, \qquad \det(A_1-\lambda) = \begin{vmatrix} 1-\lambda & 1 \\ 0 & 1-\lambda \end{vmatrix} = (1-\lambda)^2 = \lambda^2 - 2\lambda + 1 = 0, \qquad \det(A_2-\lambda) = \begin{vmatrix} -\lambda & 1 \\ 1 & -\lambda \end{vmatrix} = \lambda^2 - 1 = 0 .$$
Therefore, $A_0$ has no eigenvalue, $A_1$ has one eigenvalue $\lambda = 1$, and $A_2$ has two eigenvalues $\lambda_1 = 1$, $\lambda_2 = -1$. We see that real $(2\times 2)$ matrices can have $0$, $1$, or $2$ eigenvalues.

consider a vector space over the field $\mathbb{C}$. then the linear map $A$ is represented by a complex matrix (i.e., a matrix whose entries are complex) and the coefficients of the characteristic polynomial are complex. note that this does not necessarily mean that a complex matrix features an $\mathrm{i}$ somewhere (but it could): a matrix with an entry such as $4+\mathrm{i}$ is a complex matrix, but so is a matrix with purely real entries (since $\mathbb{R} \subset \mathbb{C}$, real numbers are automatically complex numbers). the eigenvalues of $A$ are the (complex) zeros of the characteristic polynomial. since this is a polynomial of degree $n$, the number of eigenvalues of the map $A$ can be anything between $1$ and $n$. (the fundamental theorem of algebra states that every polynomial has at least one (complex) zero.)
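
the count of real eigenvalues for these three matrices (a rotation, a shear, and a reflection) can be confirmed numerically:

    import numpy as np

    A0 = np.array([[0.0, -1.0], [1.0, 0.0]])   # rotation
    A1 = np.array([[1.0,  1.0], [0.0, 1.0]])   # shear
    A2 = np.array([[0.0,  1.0], [1.0, 0.0]])   # reflection

    for name, A in [("A0", A0), ("A1", A1), ("A2", A2)]:
        eigs = np.linalg.eigvals(A)
        # keep only the eigenvalues with (numerically) vanishing imaginary part
        real_eigs = sorted(set(np.round(eigs[np.abs(eigs.imag) < 1e-12].real, 10)))
        print(name, "real eigenvalues:", real_eigs)
    # A0: []            (no real eigenvalue)
    # A1: [1.0]
    # A2: [-1.0, 1.0]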

Example. Consider the vector space $\mathbb{C}^2$ and the linear maps represented by the matrices
$$B_1 = \begin{pmatrix} 1+\mathrm{i} & 3-\mathrm{i} \\ 0 & 1+\mathrm{i} \end{pmatrix}, \qquad B_2 = \begin{pmatrix} \mathrm{i} & 2 \\ -1 & 0 \end{pmatrix} .$$
The characteristic polynomials are
$$\det(B_1-\lambda) = \begin{vmatrix} 1+\mathrm{i}-\lambda & 3-\mathrm{i} \\ 0 & 1+\mathrm{i}-\lambda \end{vmatrix} = (1+\mathrm{i}-\lambda)^2 = \lambda^2 - 2(1+\mathrm{i})\lambda + (1+\mathrm{i})^2 = 0 ,$$
$$\det(B_2-\lambda) = \begin{vmatrix} \mathrm{i}-\lambda & 2 \\ -1 & -\lambda \end{vmatrix} = (\mathrm{i}-\lambda)(-\lambda) + 2 = \lambda^2 - \mathrm{i}\lambda + 2 = 0 .$$
Therefore, $B_1$ has one eigenvalue $\lambda = 1+\mathrm{i}$, and $B_2$ has two eigenvalues $\lambda_1 = 2\mathrm{i}$, $\lambda_2 = -\mathrm{i}$. We see that complex $(2\times 2)$ matrices can have $1$ or $2$ eigenvalues.

Example. Consider the vector space $\mathbb{C}^2$ and the linear maps represented by the matrices
$$A_0 = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}, \qquad A_1 = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \qquad A_2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} .$$
The characteristic polynomials are the same as before,
$$\det(A_0-\lambda) = \lambda^2 + 1 = 0, \qquad \det(A_1-\lambda) = (1-\lambda)^2 = \lambda^2 - 2\lambda + 1 = 0, \qquad \det(A_2-\lambda) = \lambda^2 - 1 = 0 .$$
Therefore, $A_0$ has two eigenvalues $\lambda_1 = \mathrm{i}$, $\lambda_2 = -\mathrm{i}$; $A_1$ has one eigenvalue $\lambda = 1$; $A_2$ has two eigenvalues $\lambda_1 = 1$, $\lambda_2 = -1$. We see that complex $(2\times 2)$ matrices can have $1$ or $2$ eigenvalues.

like every polynomial, the characteristic polynomial can be factorized by using its roots. suppose that there are $r$ roots (eigenvalues) $\{\lambda_1, \lambda_2, \ldots, \lambda_r\}$ (we know that $r \leq n$). then
$$\det(A-\lambda) = (-1)^n\big(\lambda^n + c_{n-1}\lambda^{n-1} + \cdots + c_1\lambda + c_0\big) = (-1)^n\big(\lambda-\lambda_1\big)^{m_1}\big(\lambda-\lambda_2\big)^{m_2}\cdots\big(\lambda-\lambda_r\big)^{m_r} ,$$
where $m_1 + m_2 + \cdots + m_r = n$. the integer numbers $m_1, \ldots, m_r$ are the multiplicities of the roots $\lambda_1, \ldots, \lambda_r$; in our present context we say that $m_1, m_2, \ldots, m_r$ are the algebraic multiplicities of the eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_r$. (note that $\sum_{i=1}^r m_i = n$ is a specialty of the complex numbers; in the case of a real vector space, the sum of the algebraic multiplicities can be less than $n$.)

consider a linear map $A$ of the vector space $V$ onto itself; the field can be $\mathbb{R}$ or $\mathbb{C}$. suppose that $\lambda$ is an eigenvalue of $A$. the associated eigenvectors are obtained by solving the equation
$$Av = \lambda v \quad\Longleftrightarrow\quad (A-\lambda)v = o .$$
this is a system of linear equations. the existence of non-trivial solutions $v$ is guaranteed by the fact that $\det(A-\lambda) = 0$ (since $\lambda$ is an eigenvalue). applying the theory of systems of linear equations we see that the solutions of $(A-\lambda)v = o$ form a (non-trivial) linear subspace $E_\lambda$ in $V$,
$$E_\lambda = \ker(A-\lambda) .$$
each vector $v \in E_\lambda$ satisfies the equation $(A-\lambda)v = o$ and is thus an eigenvector of $A$ with eigenvalue $\lambda$. we call the space $E_\lambda$ the eigenspace of the map $A$ associated with the eigenvalue $\lambda$.
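
in floating-point practice the eigenspace $E_\lambda = \ker(A-\lambda)$ can be computed from a singular value decomposition of $A-\lambda$; a minimal sketch, with a sample matrix chosen for illustration:

    import numpy as np

    def eigenspace(A, lam, tol=1e-10):
        """Basis of E_lam = ker(A - lam*I); the basis vectors are the columns of the result."""
        M = A - lam * np.eye(A.shape[0])
        _, s, vh = np.linalg.svd(M)
        # right singular vectors belonging to (numerically) zero singular values span ker(M)
        return vh[s < tol].conj().T

    # sample matrix with eigenvalue 2 (illustrative values)
    A = np.array([[2.0, 0.0, 0.0],
                  [0.0, 2.0, 1.0],
                  [0.0, 0.0, 3.0]])
    E2 = eigenspace(A, 2.0)
    print(E2.shape[1])                      # dimension of the eigenspace (here: 2)
    print(np.allclose(A @ E2, 2.0 * E2))    # True: every column satisfies Av = 2v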

Example. Consider the vector space $\mathbb{C}^3$ and a linear map represented by a $(3\times 3)$ matrix $A$. Computing the characteristic polynomial, one finds that it factorizes as
$$\det(A-\lambda) = (2-\lambda)\big(\lambda^2 + \lambda - 6\big) = 0 .$$
Accordingly, one eigenvalue is $2$; the remaining eigenvalue(s) are obtained by solving the quadratic equation $\lambda^2 + \lambda - 6 = 0$, whose solutions are $\{-3, 2\}$. The eigenvalue $2$ appears again and the number $(-3)$ is one more eigenvalue. Accordingly, in the present example the eigenvalues of $A$ are $\lambda_1 = 2$ and $\lambda_2 = -3$. The algebraic multiplicities of the eigenvalues are $m_1 = 2$ and $m_2 = 1$, because the characteristic polynomial reads
$$(2-\lambda)\big(\lambda^2+\lambda-6\big) = -(\lambda-2)(\lambda-2)(\lambda+3) = -(\lambda-2)^2(\lambda+3) .$$
Let us compute the eigenspace $E_1$ (the set of eigenvectors) associated with $\lambda_1 = 2$ by solving
$$(A - \lambda_1)v = (A-2)v = o .$$
In this example only one independent linear equation for the components $(v_1, v_2, v_3)$ remains. To be continued...

...and now the continuation. The eigenspace $E_1$ is a two-dimensional subspace of $V$; it is the space of eigenvectors of $A$ w.r.t. the eigenvalue $\lambda_1 = 2$. Every vector in $E_1$ is an eigenvector of $A$ w.r.t. $\lambda_1$. Since $E_1$ is two-dimensional, it is spanned by any two (linearly independent) vectors in $E_1$; for instance we can write $E_1 = \langle u_1, u_2 \rangle$ for two such vectors, where $\langle\,\cdot\,\rangle$ denotes the linear span. (Recall that the linear span $\langle v_1, \ldots, v_n\rangle$ is defined as $\{c_1 v_1 + \cdots + c_n v_n\}$ with constants $c_1, \ldots, c_n$.)

Analogously, we compute the eigenspace $E_2$ associated with $\lambda_2 = -3$ by solving
$$(A - \lambda_2)v = (A+3)v = o .$$
This time two independent equations remain; hence the solution space is spanned by one vector and thus one-dimensional. Every vector in $E_2$ is an eigenvector associated with the eigenvalue $\lambda_2 = -3$. To be continued...

... and now the conclusion. Let us summarize. The linear map represented by the matrix $A$ possesses two eigenvalues: $\lambda_1 = 2$ and $\lambda_2 = -3$. The associated spaces of eigenvectors (eigenspaces) $E_1$ and $E_2$ are two-dimensional and one-dimensional, respectively. The algebraic multiplicity of $\lambda_1 = 2$ is $m_1 = 2$; the algebraic multiplicity of $\lambda_2 = -3$ is $m_2 = 1$.

Let us define the geometric multiplicity $d_\lambda$ of an eigenvalue $\lambda$ as the dimension of the associated eigenspace $E_\lambda$. In our example we get $d_1 = \dim E_1 = 2$ and $d_2 = \dim E_2 = 1$. Comparing the algebraic multiplicities with the geometric multiplicities we see that
$$d_1 = m_1 = 2 , \qquad d_2 = m_2 = 1 .$$
An obvious question to ask is whether this statement generalizes: does the geometric multiplicity always coincide with the algebraic multiplicity? Unfortunately, as we will see in the subsequent example, the answer is no.

Example. Consider the vector space $\mathbb{C}^3$ and the linear map represented by the matrix
$$A = \begin{pmatrix} 0 & 1 & 0 \\ -1 & 2 & 0 \\ 0 & 0 & -1 \end{pmatrix} .$$
To compute the eigenvalues we calculate the characteristic polynomial; we obtain
$$\det(A-\lambda) = \begin{vmatrix} -\lambda & 1 & 0 \\ -1 & 2-\lambda & 0 \\ 0 & 0 & -1-\lambda \end{vmatrix} = (-1-\lambda)\big(-\lambda(2-\lambda)+1\big) = -(\lambda+1)\big(\lambda^2-2\lambda+1\big) = -(\lambda+1)(\lambda-1)^2 = 0 .$$
Accordingly, there exist two eigenvalues, $\lambda_1 = -1$ and $\lambda_2 = 1$; the algebraic multiplicities are $m_1 = 1$ and $m_2 = 2$.

Computing the eigenvectors (eigenspace) associated with $\lambda_1 = -1$ we obtain
$$(A-\lambda_1)v = (A+1)v = \begin{pmatrix} 1 & 1 & 0 \\ -1 & 3 & 0 \\ 0 & 0 & 0 \end{pmatrix}\begin{pmatrix} v_1 \\ v_2 \\ v_3 \end{pmatrix} = o ,$$
hence $v_1 + v_2 = 0$, $-v_1 + 3v_2 = 0$, and therefore $v_1 = 0$ and $v_2 = 0$. Accordingly,
$$E_1 = \Big\langle \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} \Big\rangle .$$
Analogously, we compute the eigenvectors (eigenspace) associated with $\lambda_2 = 1$. We obtain
$$(A-\lambda_2)v = (A-1)v = \begin{pmatrix} -1 & 1 & 0 \\ -1 & 1 & 0 \\ 0 & 0 & -2 \end{pmatrix}\begin{pmatrix} v_1 \\ v_2 \\ v_3 \end{pmatrix} = o ,$$
hence $-v_1 + v_2 = 0$ and $v_3 = 0$. To be continued...

... and now the conclusion. Accordingly,
$$E_2 = \Big\langle \begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix} \Big\rangle .$$
We see that the geometric multiplicities are $d_1 = \dim E_1 = 1$ and $d_2 = \dim E_2 = 1$. In particular, we conclude that the geometric multiplicity of the eigenvalue $\lambda_2 = 1$ is less than its algebraic multiplicity,
$$d_1 = m_1 = 1 , \qquad d_2 = 1 < m_2 = 2 .$$
This statement is true in general: the geometric multiplicity of an eigenvalue is less than or equal to its algebraic multiplicity.

consider a linear map $A$ of $V$ onto itself. let $\lambda$ be an eigenvalue of $A$ with algebraic multiplicity $m_\lambda$; let $E_\lambda = \ker(A-\lambda)$ denote the space of eigenvectors (eigenspace) associated with $\lambda$. we define the geometric multiplicity $d_\lambda$ of $\lambda$ as the dimension of the associated eigenspace $E_\lambda$, $d_\lambda = \dim E_\lambda$. then there is the following important statement: $d_\lambda \leq m_\lambda$; in particular, the geometric multiplicity of an eigenvalue $\lambda$ is less than or equal to its algebraic multiplicity.

suppose the linear map $A$ possesses the eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_r$ with geometric multiplicities $d_1, d_2, \ldots, d_r$, i.e., the associated eigenspaces $E_1, E_2, \ldots, E_r$ satisfy $\dim E_i = d_i$ for $i = 1, \ldots, r$. so, how many linearly independent eigenvectors does the map $A$ have? the answer is $d_1 + d_2 + \cdots + d_r$. since $d_i \leq m_i$ and $m_1 + \cdots + m_r = n$ (or $\leq n$ in the case of real vector spaces) we obtain
$$d_1 + d_2 + \cdots + d_r \leq n .$$
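
a quick numerical comparison of algebraic and geometric multiplicities, using a sample matrix with a deficient eigenvalue (chosen for illustration):

    import numpy as np

    # lambda = 1 has algebraic multiplicity 2 but only a one-dimensional eigenspace
    A = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 2.0]])

    eigs = np.linalg.eigvals(A)
    for lam in np.unique(np.round(eigs, 10)):
        m = int(np.sum(np.isclose(eigs, lam)))            # algebraic multiplicity
        d = 3 - np.linalg.matrix_rank(A - lam * np.eye(3))  # geometric multiplicity = dim ker(A - lam)
        print(f"lambda = {lam}: algebraic {m}, geometric {d}")
    # lambda = 1: algebraic 2, geometric 1
    # lambda = 2: algebraic 1, geometric 1   -- d <= m in every case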

this is a non-trivial statement, which follows essentially from the fact that eigenvectors associated with different eigenvalues are always linearly independent. (we omit the proof.)

an alternative way of stating the above is
$$E_1 + \cdots + E_r = E_1 \oplus \cdots \oplus E_r \ (\subseteq V) ,$$
or
$$\dim\big(E_1 \oplus \cdots \oplus E_r\big) = \dim E_1 + \cdots + \dim E_r = d_1 + \cdots + d_r \ (\leq n) .$$
every eigenspace $E_i$ adds $d_i$ linearly independent eigenvectors to the set of eigenvectors. (we never have to worry that we get a linearly dependent one.)

we conclude this section with some useful remarks and observations.

triangular matrices

consider an (upper or lower) triangular matrix, i.e.,
$$A = \begin{pmatrix} A_{11} & A_{12} & A_{13} & \cdots & A_{1n} \\ 0 & A_{22} & A_{23} & \cdots & A_{2n} \\ 0 & 0 & A_{33} & \cdots & A_{3n} \\ \vdots & & & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & A_{nn} \end{pmatrix} .$$

to compute the eigenvalues of this matrix we search for the zeros of the characteristic polynomial, i.e., expanding successively along the first column,
$$\det(A-\lambda) = \begin{vmatrix} A_{11}-\lambda & A_{12} & A_{13} & \cdots & A_{1n} \\ & A_{22}-\lambda & A_{23} & \cdots & A_{2n} \\ & & A_{33}-\lambda & \cdots & A_{3n} \\ & & & \ddots & \vdots \\ & & & & A_{nn}-\lambda \end{vmatrix} = (A_{11}-\lambda)\begin{vmatrix} A_{22}-\lambda & A_{23} & \cdots & A_{2n} \\ & A_{33}-\lambda & \cdots & A_{3n} \\ & & \ddots & \vdots \\ & & & A_{nn}-\lambda \end{vmatrix} = (A_{11}-\lambda)(A_{22}-\lambda)(A_{33}-\lambda)\cdots(A_{nn}-\lambda) = 0 .$$
it follows that the eigenvalues coincide with the diagonal elements of the triangular matrix, i.e., the set of eigenvalues is $\{A_{11}, A_{22}, A_{33}, \ldots, A_{nn}\}$.
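
a numerical confirmation for a sample triangular matrix (illustrative values):

    import numpy as np

    # sample upper triangular matrix
    A = np.array([[4.0, 2.0, 7.0],
                  [0.0, 1.0, 5.0],
                  [0.0, 0.0, 3.0]])

    eigs = np.sort(np.linalg.eigvals(A))
    diag = np.sort(np.diag(A))
    print(eigs)                       # [1. 3. 4.]
    print(np.allclose(eigs, diag))    # True: the eigenvalues are the diagonal entries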

eigenvalues, determinant, and trace

recall that the trace of a matrix is the sum of its diagonal elements, i.e.,
$$\operatorname{tr} A = A_{11} + A_{22} + \cdots + A_{nn} = \sum_{i=1}^n A_{ii} .$$
consider for consistency an ($n$-dimensional) vector space $V$ over the field $\mathbb{C}$. let $A$ be a linear map of $V$ onto itself. then there exist $r \leq n$ eigenvalues of $A$, $\{\lambda_1, \lambda_2, \ldots, \lambda_r\}$, with algebraic multiplicities $m_1, m_2, \ldots, m_r$, such that $m_1 + m_2 + \cdots + m_r = n$.

the eigenvalues $\{\lambda_1, \ldots, \lambda_r\}$ of a linear map $A$ are intimately connected with its determinant and its trace: the determinant is the product of the eigenvalues, the trace is the sum of the eigenvalues. however, we must take care of the algebraic multiplicities; an eigenvalue $\lambda_i$ with algebraic multiplicity $m_i$ appears $m_i$ times in the product or sum. therefore,
$$\det A = \underbrace{\lambda_1 \cdots \lambda_1}_{m_1\ \text{times}}\,\underbrace{\lambda_2 \cdots \lambda_2}_{m_2\ \text{times}}\cdots\underbrace{\lambda_r \cdots \lambda_r}_{m_r\ \text{times}} = \lambda_1^{m_1}\lambda_2^{m_2}\cdots\lambda_r^{m_r} = \prod_{i=1}^r \lambda_i^{m_i} ,$$
$$\operatorname{tr} A = \underbrace{\lambda_1 + \cdots + \lambda_1}_{m_1\ \text{times}} + \underbrace{\lambda_2 + \cdots + \lambda_2}_{m_2\ \text{times}} + \cdots + \underbrace{\lambda_r + \cdots + \lambda_r}_{m_r\ \text{times}} = m_1\lambda_1 + m_2\lambda_2 + \cdots + m_r\lambda_r = \sum_{i=1}^r m_i\lambda_i .$$
the proof is not difficult if one uses the characteristic polynomial and its decomposition into its roots, i.e.,
$$\det(A-\lambda) = (-1)^n\big(\lambda-\lambda_1\big)^{m_1}\big(\lambda-\lambda_2\big)^{m_2}\cdots\big(\lambda-\lambda_r\big)^{m_r} .$$
we restrict ourselves to the simple example of a $(2\times 2)$ matrix, i.e.,
$$A = \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix} .$$
let $\lambda_1$ and $\lambda_2$ denote the eigenvalues of $A$ (either $\lambda_1 \neq \lambda_2$, or $\lambda_1 = \lambda_2$; in the latter case there exists only one eigenvalue whose multiplicity is $2$). we obtain
$$\det(A-\lambda) = \begin{vmatrix} A_{11}-\lambda & A_{12} \\ A_{21} & A_{22}-\lambda \end{vmatrix} = (A_{11}-\lambda)(A_{22}-\lambda) - A_{12}A_{21} = \lambda^2 - \big(A_{11}+A_{22}\big)\lambda + A_{11}A_{22} - A_{12}A_{21} = \lambda^2 - (\operatorname{tr} A)\,\lambda + \det A ,$$
and on the other hand
$$(\lambda-\lambda_1)(\lambda-\lambda_2) = \lambda^2 - (\lambda_1+\lambda_2)\lambda + \lambda_1\lambda_2 .$$
comparing the coefficients of the polynomials we are led to the result
$$\det A = \lambda_1\lambda_2 , \qquad \operatorname{tr} A = \lambda_1 + \lambda_2 .$$
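
a numerical check of these two relations for a sample matrix (illustrative values):

    import numpy as np

    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 4.0]])

    eigs = np.linalg.eigvals(A)
    print(np.isclose(np.prod(eigs), np.linalg.det(A)))   # True: det A = product of the eigenvalues
    print(np.isclose(np.sum(eigs),  np.trace(A)))        # True: tr A  = sum of the eigenvalues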

Example. Consider a linear map on $\mathbb{C}^2$ represented by a $(2\times 2)$ matrix $A$. We can use $\det A$ and $\operatorname{tr} A$ to compute the eigenvalues of $A$: since
$$\lambda_1\lambda_2 = \det A , \qquad \lambda_1 + \lambda_2 = \operatorname{tr} A ,$$
the eigenvalues are the solutions of the quadratic equation $\lambda^2 - (\operatorname{tr} A)\,\lambda + \det A = 0$.

a simple corollary of the relation $\det A = \prod_{i=1}^r \lambda_i^{m_i}$ is the statement that a linear map $A$ is singular (i.e., not invertible) if and only if zero is an eigenvalue of $A$.
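
a minimal sketch of this procedure, with a sample matrix chosen for illustration:

    import numpy as np

    # sample 2x2 matrix (illustrative values)
    A = np.array([[3.0, 1.0],
                  [1.0, 1.0]])

    tr, det = np.trace(A), np.linalg.det(A)
    # solve lambda^2 - (tr A) lambda + det A = 0
    lam = np.roots([1.0, -tr, det])
    print(np.sort(lam))                      # eigenvalues from det and trace: approx. [0.586, 3.414]
    print(np.sort(np.linalg.eigvals(A)))     # agrees with the direct computation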

3 Diagonalization

before we begin let us reiterate: a vector $v$ of a vector space $V$ can be represented by a column vector once a basis of $V$ has been chosen. the column vector representation depends on the choice of basis. likewise, a linear map $A$ of a vector space $V$ onto itself can be represented as a matrix once a basis of $V$ has been chosen. the matrix representation of $A$ depends on the choice of basis.

Example. Consider the vector space $\mathbb{R}^2$ and the linear map $A$ which describes a reflection at $45^\circ$ (i.e., a reflection at the straight line with slope $45^\circ$). Under this reflection, the standard basis vectors are mapped to
$$Ae_1 = A\begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 0 \\ 1 \end{pmatrix} = e_2 , \qquad Ae_2 = A\begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix} = e_1 .$$
The matrix representation of $A$ is obtained by using $Ae_1$ and $Ae_2$ as columns; hence
$$A = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} .$$
Now let us choose a different basis $\{b_1, b_2\}$ and represent the linear map $A$ w.r.t. $\{b_1, b_2\}$. Choose
$$b_1 = e_1 + e_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix} , \qquad b_2 = e_1 - e_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix} .$$
We obtain (from purely geometric considerations, i.e., by applying the reflection)
$$Ab_1 = b_1 , \qquad Ab_2 = -b_2 .$$
Note that $b_1$ and $b_2$ are eigenvectors of $A$ (associated with $\lambda_1 = 1$ and $\lambda_2 = -1$, respectively). Since we have chosen a (non-standard) basis, column vectors do not quite represent what we are used to. For instance, the vector
$$v = \begin{pmatrix} 2 \\ 0 \end{pmatrix}$$
now means $v = 2\,b_1 + 0\,b_2$, i.e., this vector $v$ points along the $45^\circ$ line. (It corresponds to $\begin{pmatrix} 2 \\ 2 \end{pmatrix}$ w.r.t. the old standard basis.) To be continued...

... and now the conclusion. Likewise, the vector
$$v = \begin{pmatrix} -1 \\ -1 \end{pmatrix}$$
now means $v = (-1)\,b_1 + (-1)\,b_2$, i.e., this vector $v$ points in the direction of the negative $x$-axis. (It corresponds to $\begin{pmatrix} -2 \\ 0 \end{pmatrix}$ w.r.t. the old standard basis.)

W.r.t. the new basis, the transformation $Ab_1 = b_1$, $Ab_2 = -b_2$ looks like
$$Ab_1 = A\begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix} = b_1 , \qquad Ab_2 = A\begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \\ -1 \end{pmatrix} = -b_2 .$$
The matrix representation of $A$ is obtained by using $Ab_1$ and $Ab_2$ as columns; hence
$$A = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} .$$
We obtain a different matrix representation for the same linear map. This matrix representation is preferred to the original one since the matrix is diagonal.
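
anticipating the switch-matrix construction discussed below, the change of basis can also be carried out numerically: with the eigenvectors $b_1$, $b_2$ as the columns of a matrix $S$, the representation of the reflection w.r.t. the new basis is $S^{-1}AS$. A minimal sketch:

    import numpy as np

    A = np.array([[0.0, 1.0],
                  [1.0, 0.0]])        # reflection at the 45-degree line (standard basis)
    S = np.array([[1.0,  1.0],
                  [1.0, -1.0]])       # columns: the eigenvector basis b1 = (1,1), b2 = (1,-1)

    D = np.linalg.inv(S) @ A @ S
    print(D)                          # [[ 1.  0.]
                                      #  [ 0. -1.]]  -- the same map, now represented by a diagonal matrix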

we call a linear map $A$ diagonalizable if $A$ possesses $n$ linearly independent eigenvectors, i.e., a basis of eigenvectors. when does this happen? suppose that $\{\lambda_1, \lambda_2, \ldots, \lambda_r\}$ are the eigenvalues of the linear map $A$; the associated eigenspaces (spaces of eigenvectors) are $E_1, E_2, \ldots, E_r$, and the geometric multiplicity of the eigenvalue $\lambda_i$ is $d_i = \dim E_i$. we know that there exist $d_1 + d_2 + \cdots + d_r \ (\leq n)$ linearly independent eigenvectors. therefore, if and only if
$$d_1 + d_2 + \cdots + d_r = n ,$$
then there exist $n$ linearly independent eigenvectors. alternatively we can write $E_1 \oplus \cdots \oplus E_r = V$.

an important case of diagonalizability is the case of a linear map $A$ that possesses $n$ different eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$ (i.e., $r = n$). then, automatically, there exist $n$ linearly independent eigenvectors. (this is simply because $d_i = \dim E_i \geq 1$ for all $i$; hence, if there exist $n$ different eigenvalues, then $d_i = \dim E_i = 1$ for all $i$, and by the general considerations on linear independence, these eigenspaces/-vectors are linearly independent.)

let us suppose that the map $A$ is diagonalizable and let us choose a basis of (i.e., $n$ linearly independent) eigenvectors. we do this by successively choosing bases $\{v_{i;1}, \ldots, v_{i;d_i}\}$ in the eigenspaces $E_i$, i.e.,
$$V = \big\langle\, \underbrace{v_{1;1}, \ldots, v_{1;d_1}}_{E_1:\ d_1\ \text{vectors}},\ \underbrace{v_{2;1}, \ldots, v_{2;d_2}}_{E_2:\ d_2\ \text{vectors}},\ \ldots,\ \underbrace{v_{r;1}, \ldots, v_{r;d_r}}_{E_r:\ d_r\ \text{vectors}} \,\big\rangle .$$
since $v_{i;j}$ is in $E_i$, it is an eigenvector associated with the eigenvalue $\lambda_i$, i.e., $A v_{i;j} = \lambda_i v_{i;j}$.

let us consider the matrix representation of the diagonalizable map $A$ w.r.t. this basis of eigenvectors. let us denote the matrix we obtain by $D$ (instead of $A$). we straightforwardly obtain
$$D = \operatorname{diag}\big(\underbrace{\lambda_1, \ldots, \lambda_1}_{d_1\ \text{times}}, \underbrace{\lambda_2, \ldots, \lambda_2}_{d_2\ \text{times}}, \ldots, \underbrace{\lambda_r, \ldots, \lambda_r}_{d_r\ \text{times}}\big) = \begin{pmatrix} \lambda_1 & & & & \\ & \ddots & & & \\ & & \lambda_1 & & \\ & & & \ddots & \\ & & & & \lambda_r \end{pmatrix} .$$

it therefore follows that a diagonalizable linear map can be represented by a diagonal matrix, whose entries are the eigenvalues. we call the diagonal matrix
$$D = \operatorname{diag}\big(\underbrace{\lambda_1, \ldots, \lambda_1}_{d_1\ \text{times}}, \underbrace{\lambda_2, \ldots, \lambda_2}_{d_2\ \text{times}}, \ldots, \underbrace{\lambda_r, \ldots, \lambda_r}_{d_r\ \text{times}}\big)$$
the eigenvalue matrix of the linear map $A$.

conversely, if a map $A$ can be represented by a diagonal matrix, then the eigenvalues are the entries of this matrix (so that the matrix is automatically the eigenvalue matrix) and the eigenvectors of $A$ are represented by the column vectors
$$\begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{pmatrix}, \ldots, \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 1 \end{pmatrix} .$$
hence, there exists a basis of eigenvectors and thus $A$ is diagonalizable.

summing up, we see that a linear map $A$ is diagonalizable if and only if it can be represented by a diagonal matrix.

Example. Consider the vector space $\mathbb{R}^3$ and the linear map $A$ represented by the Sudoku matrix
$$A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix} .$$
(The basis is tacitly assumed to be the standard basis.) The characteristic polynomial is
$$\det(A-\lambda) = -\lambda^3 + 15\lambda^2 + 18\lambda .$$
The eigenvalues are the zeros of the characteristic polynomial, i.e.,
$$\lambda_1 = 0 , \qquad \lambda_2 = \tfrac{3}{2}\big(5 + \sqrt{33}\big) , \qquad \lambda_3 = \tfrac{3}{2}\big(5 - \sqrt{33}\big) .$$
Since the map $A$ has three different eigenvalues, it must have three linearly independent eigenvectors and thus a basis of eigenvectors. Therefore, the Sudoku map is diagonalizable and it can be represented by the diagonal eigenvalue matrix
$$D = \operatorname{diag}\big(0,\ \tfrac{3}{2}(5+\sqrt{33}),\ \tfrac{3}{2}(5-\sqrt{33})\big) .$$
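
a numerical check of the eigenvalues of the Sudoku matrix:

    import numpy as np

    A = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0],
                  [7.0, 8.0, 9.0]])

    eigs = np.sort(np.linalg.eigvals(A).real)
    expected = np.sort([0.0, 1.5 * (5 + np.sqrt(33)), 1.5 * (5 - np.sqrt(33))])
    print(eigs)                             # approx. [-1.1168, 0., 16.1168]
    print(np.allclose(eigs, expected))      # True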

Example. Consider the vector space $\mathbb{C}^2$ and the linear map $A$ represented by the matrix
$$A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} .$$
(The basis is tacitly assumed to be the standard basis.) The eigenvalues can be read off directly, since this is a triangular matrix: there is only one eigenvalue, $\lambda = 1$. (Its algebraic multiplicity must be $m_\lambda = 2$.) Let us compute the space of eigenvectors $E_\lambda$. From
$$(A-\lambda)v = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$$
we deduce that
$$E_\lambda = \Big\langle \begin{pmatrix} 1 \\ 0 \end{pmatrix} \Big\rangle , \qquad d_\lambda = \dim E_\lambda = 1 .$$
In particular, there is only one eigenvector (and not two linearly independent ones). There does not exist a basis of eigenvectors; therefore, the map $A$ is not diagonalizable.

In connection with the previous example we consider a rather trivial example: the identity map $\mathbb{1}$ has one eigenvalue, $\lambda = 1$ (with algebraic multiplicity $m_\lambda = 2$). Every vector is an eigenvector of $\mathbb{1}$, hence $E_\lambda = \mathbb{C}^2$ and $d_\lambda = \dim E_\lambda = 2$. The identity map is diagonalizable (and the standard matrix representation of $\mathbb{1}$ is already diagonal).

We see that it is not a problem if an eigenvalue appears multiple times (i.e., if its algebraic multiplicity is greater than $1$). A problem occurs if the geometric multiplicity is strictly less than the algebraic multiplicity, $d_\lambda < m_\lambda$. In that case, $\sum_{i=1}^r m_i = n$ but $\sum_{i=1}^r d_i < n$, whence diagonalizability is ruled out.

Example. Consider the vector space $\mathbb{R}^2$ and the linear map represented by the matrix
$$A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} .$$
The characteristic polynomial is $\lambda^2 + 1 = 0$, hence there do not exist any eigenvalues. If we consider the same map as a map on the vector space $\mathbb{C}^2$, then there exist two eigenvalues: $\lambda_1 = \mathrm{i}$, $\lambda_2 = -\mathrm{i}$. The map is not diagonalizable as a real map, but it is in fact diagonalizable regarded as a complex map. (Since there exist two different eigenvalues, there exist two linearly independent eigenvectors.)

in practice, a linear map is given in its matrix representation w.r.t. some (standard) basis,
$$A = \begin{pmatrix} A_{11} & A_{12} & \cdots & A_{1n} \\ A_{21} & A_{22} & \cdots & A_{2n} \\ \vdots & \vdots & & \vdots \\ A_{n1} & A_{n2} & \cdots & A_{nn} \end{pmatrix} .$$
we know that, if and only if $A$ is diagonalizable, then we can switch to a matrix representation in terms of a diagonal matrix (the eigenvalue matrix $D$). how do we switch in practice?

we need a switch matrix $S$. the switch matrix is supposed to transform the standard basis to a basis of eigenvectors. on the basis of eigenvectors, the linear map then acts as a diagonal matrix (the eigenvalue matrix $D$). having applied the map in this simple form, we then switch back to the standard basis. hence,
$$D = S^{-1} A S .$$
the switch matrix contains the eigenvectors of $A$ as columns, i.e.,
$$S = \begin{pmatrix} v_{1;1} & v_{1;2} & \cdots & v_{r;d_r} \end{pmatrix} .$$
to prove that $D = S^{-1} A S$ we show that $SDw = ASw$ for all $w \in V$. due to linearity, if we aim at proving a statement for all $w \in V$, it suffices to show this statement for all basis vectors.

consider the standard basis vector
$$e_1 = \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix} .$$
we obtain
$$S e_1 = v_{1;1} \qquad\Longrightarrow\qquad A S e_1 = \lambda_1 v_{1;1} ;$$
on the other hand,
$$D e_1 = \lambda_1 e_1 \qquad\Longrightarrow\qquad S D e_1 = \lambda_1 S e_1 = \lambda_1 v_{1;1} .$$
we conclude that $S D e_1 = A S e_1$; analogously, we obtain $S D e_i = A S e_i$ for all standard basis vectors $e_i$ and thus $SDw = ASw$ for all $w \in V$. this completes the proof of the claim.
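
a minimal numerical illustration of $D = S^{-1} A S$, with a sample diagonalizable matrix (illustrative values) and the switch matrix built from its eigenvectors:

    import numpy as np

    # sample diagonalizable matrix (three distinct eigenvalues)
    A = np.array([[2.0, 0.0, 0.0],
                  [1.0, 3.0, 0.0],
                  [0.0, 0.0, 5.0]])

    eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are eigenvectors of A
    S = eigvecs                           # switch matrix: eigenvectors as columns
    D = np.linalg.inv(S) @ A @ S

    print(np.allclose(D, np.diag(eigvals)))   # True: S^{-1} A S is the diagonal eigenvalue matrix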

Example. Consider a map $A$ on $\mathbb{C}^3$ given by a $(3\times 3)$ complex matrix. The characteristic polynomial is
$$\det(A-\lambda) = -\lambda^3 + (2+\mathrm{i})\lambda^2 - (1+2\mathrm{i})\lambda + \mathrm{i} ;$$
it is not difficult to convince oneself that the factorization into roots is
$$\det(A-\lambda) = -(\lambda-\mathrm{i})(\lambda-1)^2 .$$
Therefore, the eigenvalues are
$$\lambda_1 = \mathrm{i} \ \ (m_1 = 1,\ d_1 = 1) , \qquad \lambda_2 = 1 \ \ (m_2 = 2) .$$
For $A$ to be diagonalizable there must exist two linearly independent eigenvectors associated with the eigenvalue $\lambda_2$ (i.e., $d_2 = \dim E_2 = 2$ is required). A straightforward computation of the eigenspaces shows that $E_1$ is spanned by one vector and $E_2$ by two linearly independent vectors, hence $d_2 = \dim E_2 = 2$ indeed; accordingly, there exist $3$ linearly independent eigenvectors and $A$ is diagonalizable. The switch matrix $S$ contains these eigenvectors as columns, and it is straightforward to check that
$$S^{-1} A S = D = \operatorname{diag}(\mathrm{i}, 1, 1) .$$
