Eigenwerte, Eigenvektoren, Diagonalisierung
Vorlesungsnotizen zu Mathematische Methoden der Physik I
J. Mark Heinzle
Gravitational Physics, Faculty of Physics, University of Vienna
Version 5/5/2, SoSe 2
Chapter 1. Basics

Consider a vector space V of dimension dim V = n over the field R (or C). A basis {b_1, b_2, ..., b_n} is a set of n linearly independent vectors. Every vector v ∈ V admits a unique decomposition w.r.t. a chosen basis, i.e., it is represented as a unique linear combination of the basis vectors

    v = v_1 b_1 + v_2 b_2 + ... + v_n b_n = Σ_{i=1}^n v_i b_i.

An obvious but important observation is that different bases lead to different decompositions and thus to different vector components. Suppose we have a basis {b̂_1, ..., b̂_n} and a different basis {b̌_1, ..., b̌_n}. Then

    v = Σ_{i=1}^n v̂_i b̂_i = Σ_{i=1}^n v̌_i b̌_i,

where the components (v̂_i)_{i=1,...,n} and (v̌_i)_{i=1,...,n} are in general completely different but represent one and the same vector (w.r.t. two different bases, however).

Once a basis, say {b_1, b_2, ..., b_n}, has been chosen (or if it is clear which basis is used), it is customary to collect the components of a vector v (w.r.t. that basis) into a column vector. We thus write that v is represented by

    v = (v_1, v_2, ..., v_n)ᵗ

w.r.t. the chosen basis.
An endomorphism is a linear map A of the vector space V onto itself, i.e., A : V → V. Such a linear map takes vectors v ∈ V and maps these to vectors A(v) ∈ V. It is customary to write Av instead of A(v) because of the linearity of A. Once a basis has been chosen (or if it is clear which basis is used), the linear map A is represented by a matrix, which we will denote by the same letter; its components are (A_{ij})_{i,j=1,...,n} (w.r.t. the basis {b_1, b_2, ..., b_n}).

Let us elaborate. Take an arbitrary vector v; application of the map A yields a different vector; we choose to denote this vector by v', i.e., v' = Av. The map A is linear, hence

    v' = Av = A( Σ_{j=1}^n v_j b_j ) = Σ_{j=1}^n v_j (A b_j).

Now, b'_j = A b_j is a vector in V and can thus be decomposed w.r.t. the basis,

    b'_j = Σ_{i=1}^n A_{ij} b_i.

It follows that

    v' = Av = Σ_{j=1}^n v_j b'_j = Σ_{j=1}^n v_j Σ_{i=1}^n A_{ij} b_i = Σ_{i=1}^n ( Σ_{j=1}^n A_{ij} v_j ) b_i.

Therefore, the i = 1, ..., n components of the transformed vector v', i.e., v'_1, ..., v'_n, are given by

    v'_i = Σ_{j=1}^n A_{ij} v_j.

We collect the components A_{ij} into a matrix (which we choose to denote by the same letter as the linear map), i.e.,

    A = (A_{ij})_{i,j=1,...,n} = ( A_{11}  A_{12}  ...  A_{1n}
                                   A_{21}  A_{22}  ...  A_{2n}
                                    ...
                                   A_{n1}  A_{n2}  ...  A_{nn} ).
This matrix representation of the linear map is very useful. Using the column vector representations of the vectors involved, i.e., v = (v_1, ..., v_n)ᵗ and v' = (v'_1, ..., v'_n)ᵗ, we find that

    ( v'_1 ; v'_2 ; ... ; v'_n ) = ( A_{11} A_{12} ... A_{1n} ; A_{21} A_{22} ... A_{2n} ; ... ; A_{n1} A_{n2} ... A_{nn} ) ( v_1 ; v_2 ; ... ; v_n ).

In other words, using column vectors and the matrix representation of A, the components of v' = Av are obtained from the components of v through matrix multiplication.

Although, admittedly, it is a common source of confusion to denote the linear map A and the matrix representation of A (w.r.t. a chosen basis) by the same letter, we will nonetheless stick to this convention, at least in connection with those problems where the basis is regarded as fixed (i.e., when there is no change of basis to consider). It is useful to always keep in mind that while a linear map is an abstract (and fixed) entity, its matrix representation is not at all fixed but depends on the basis we choose.

Let us elaborate. Consider two bases, {b̂_1, ..., b̂_n} and {b̌_1, ..., b̌_n}. The matrix representation of the linear map A w.r.t. the basis {b̂_1, ..., b̂_n} is

    Â = (Â_{ij})_{i,j=1,...,n},

where the components of this matrix are determined through the relation

    A b̂_j = Σ_{i=1}^n Â_{ij} b̂_i.
The matrix representation of the linear map A w.r.t. the basis {b̌_1, ..., b̌_n} is

    Ǎ = (Ǎ_{ij})_{i,j=1,...,n},

where the components of this matrix are determined by

    A b̌_j = Σ_{i=1}^n Ǎ_{ij} b̌_i.

The relation between the two matrices Â and Ǎ is obtained by considering the change of basis, which is represented by a matrix S = (S_{ij})_{i,j=1,...,n} through the relation

    b̌_j = Σ_{i=1}^n S_{ij} b̂_i,

which is merely the decomposition of the vector b̌_j w.r.t. the basis {b̂_1, ..., b̂_n}. From A b̌_j = Σ_{i=1}^n Ǎ_{ij} b̌_i we thus obtain

    A b̌_j = Σ_{i=1}^n Ǎ_{ij} b̌_i = Σ_{i=1}^n Ǎ_{ij} Σ_{k=1}^n S_{ki} b̂_k = Σ_{k=1}^n ( Σ_{i=1}^n S_{ki} Ǎ_{ij} ) b̂_k.

On the other hand,

    A b̌_j = A ( Σ_{i=1}^n S_{ij} b̂_i ) = Σ_{i=1}^n S_{ij} A b̂_i = Σ_{i=1}^n S_{ij} Σ_{k=1}^n Â_{ki} b̂_k = Σ_{k=1}^n ( Σ_{i=1}^n Â_{ki} S_{ij} ) b̂_k.

Since the two expressions are equal, we may equate the components, which yields

    Σ_{i=1}^n S_{ki} Ǎ_{ij} = Σ_{i=1}^n Â_{ki} S_{ij}

for all j and k, or, in matrix notation,

    S Ǎ = Â S.

Therefore, the matrix Ǎ is obtained from Â by

    Ǎ = S⁻¹ Â S,    (1.1)
where S is the matrix that encodes the change of basis. This switch matrix contains the basis vectors {b̌_1, ..., b̌_n}, represented as column vectors w.r.t. the original basis {b̂_1, ..., b̂_n}, as its columns, i.e.,

    S = ( b̌_1 | b̌_2 | ... | b̌_n ).    (1.2)
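The change-of-basis formula (1.1) is easy to check numerically. The following sketch (not part of the original notes; the matrices are made up for illustration) computes Ǎ = S⁻¹ÂS and confirms that basis-independent quantities such as trace and determinant are unchanged:

```python
import numpy as np

# Sketch of eq. (1.1); the matrices here are assumptions for illustration.
A_hat = np.array([[2.0, 1.0],
                  [0.0, 3.0]])          # representation w.r.t. the old basis
S = np.array([[1.0, 1.0],
              [1.0, -1.0]])             # columns: new basis vectors in old components
A_check = np.linalg.inv(S) @ A_hat @ S  # representation w.r.t. the new basis

# Both matrices represent one and the same linear map, so basis-independent
# quantities (trace, determinant) must coincide.
print(np.trace(A_hat), np.trace(A_check))            # -> 5.0 5.0
print(np.linalg.det(A_hat), np.linalg.det(A_check))  # both 6 (up to rounding)
```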
Chapter 2. Eigenvalues and eigenvectors

A vector v ≠ o of a vector space V is an eigenvector of a linear map A if it is merely stretched, compressed, or inverted by the map A, i.e., if there is a number λ (of the underlying field) such that

    Av = λv.

Eigenvectors are the linear map's pampered children. While the linear map might have some nasty effect on a general vector (rotate, reflect, ...), the map is rather kind to an eigenvector: the effect of the map on an eigenvector is simply to multiply it by a number. The number λ is called the eigenvalue that is associated with the eigenvector v.

Consider the vector space R² (with the standard basis) and the linear map represented by the matrix

    ( 1  1 ; 0  2 ).

The vector (1 ; 1) is an eigenvector, since

    ( 1  1 ; 0  2 ) ( 1 ; 1 ) = ( 2 ; 2 ) = 2 ( 1 ; 1 ).

The associated eigenvalue is 2.
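A quick numerical check of this example (a sketch in numpy; the matrix and vector entries are an assumption used for illustration):

```python
import numpy as np

# Example matrix and candidate eigenvector (entries assumed for illustration).
A = np.array([[1, 1],
              [0, 2]])
v = np.array([1, 1])
print(A @ v)  # -> [2 2], i.e. A v = 2 v: v is an eigenvector with eigenvalue 2
```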
The set of C^∞ functions x ↦ f(x) forms a vector space, and d/dx is a linear map. The function e^{−x} is an eigenvector ("eigenfunction") since

    d/dx e^{−x} = −e^{−x}.

The associated eigenvalue is (−1). (Since the vector space of C^∞ functions is not finite-dimensional, we cannot operate with matrices.)

A vector v ≠ o is an eigenvector of A if and only if there exists λ such that Av = λv; we write

    (A − λ)v = o.

Reinterpreting this equation we see that the set of eigenvalues of A is the set of numbers λ such that the system of linear equations

    (A − λ)v = o

possesses a nontrivial solution v. This implies a number of equivalent statements along the following lines:

    λ is an eigenvalue of A
    ⇔ (A − λ)v = o has nontrivial solutions v
    ⇔ ker(A − λ) ≠ {o}
    ⇔ (A − λ) is not invertible
    ⇔ det(A − λ) = 0

Consequently, to obtain the set of eigenvalues of A, we solve the equation

    det(A − λ) = | A_{11}−λ  A_{12}   ...  A_{1n}
                   A_{21}   A_{22}−λ  ...  A_{2n}
                    ...
                   A_{n1}   A_{n2}   ...  A_{nn}−λ | = 0.
The expression det(A − λ) is called the characteristic polynomial of A. It is a polynomial of degree n,

    det(A − λ) = (−1)ⁿ ( λⁿ + c_{n−1} λ^{n−1} + ... + c_1 λ + c_0 ),

where the coefficients (c_i)_{i=0,...,n−1} are (complicated) expressions in the components (A_{ij})_{i,j=1,...,n} of A.

The case of a two-dimensional vector space is particularly simple. The characteristic polynomial of A is

    det(A − λ) = | A_{11}−λ  A_{12} ; A_{21}  A_{22}−λ | = (A_{11}−λ)(A_{22}−λ) − A_{12}A_{21}
               = λ² − (A_{11}+A_{22}) λ + (A_{11}A_{22} − A_{12}A_{21}),

which is a polynomial of degree 2; here c_1 = −(A_{11}+A_{22}) and c_0 = A_{11}A_{22} − A_{12}A_{21}.

The eigenvalues of A are the zeros of the characteristic polynomial, i.e., the solutions of the equation

    det(A − λ) = (−1)ⁿ ( λⁿ + c_{n−1} λ^{n−1} + ... + c_1 λ + c_0 ) = 0.

How many eigenvalues does a linear map A possess, then? To answer this question it is crucial to distinguish real vector spaces and complex vector spaces.

Consider a vector space over the field R. Then the linear map A is represented by a real matrix (i.e., a matrix whose entries are real) and the coefficients of the characteristic polynomial are real. The eigenvalues of A are the (real!) zeros of the characteristic polynomial, which is a polynomial of degree n. Therefore, recalling basic algebra, we find that:

    if n is even, the number of eigenvalues of the map A can be anything between 0 and n;
    if n is odd, the number of eigenvalues of the map A can be anything between 1 and n.
Consider the vector space R² and the linear maps represented by the matrices

    A = ( 0  −1 ; 1  0 ),   A_1 = ( 1  0 ; 0  1 ),   A_2 = ( 0  1 ; 1  0 ).

The characteristic polynomials are

    det(A − λ)   = | −λ  −1 ; 1  −λ |     = λ² + 1 = 0,
    det(A_1 − λ) = | 1−λ  0 ; 0  1−λ |    = (1−λ)² = λ² − 2λ + 1 = 0,
    det(A_2 − λ) = | −λ  1 ; 1  −λ |      = λ² − 1 = 0.

Therefore, A has no eigenvalue, A_1 has one eigenvalue λ = 1, and A_2 has two eigenvalues λ_1 = 1, λ_2 = −1. We see that real (2×2) matrices can have 0, 1, or 2 eigenvalues.
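These eigenvalue counts can be reproduced numerically; the helper below (written for illustration, not part of the notes) extracts the distinct real eigenvalues, i.e., the real zeros of the characteristic polynomial:

```python
import numpy as np

# Count the distinct real eigenvalues of the three example matrices.
def distinct_real_eigs(A, tol=1e-9):
    lam = np.linalg.eigvals(A)
    # keep eigenvalues with (numerically) vanishing imaginary part ...
    real = [l.real for l in lam if abs(l.imag) < tol]
    # ... and deduplicate repeated roots
    out = []
    for r in real:
        if not any(abs(r - s) < tol for s in out):
            out.append(r)
    return sorted(out)

A = np.array([[0, -1], [1, 0]])    # rotation: no real eigenvalue
A1 = np.array([[1, 0], [0, 1]])    # one eigenvalue, 1
A2 = np.array([[0, 1], [1, 0]])    # two eigenvalues, -1 and 1
print(distinct_real_eigs(A))   # -> []
print(distinct_real_eigs(A1))  # -> [1.0]
print(distinct_real_eigs(A2))  # -> [-1.0, 1.0]
```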
Consider the vector space C² and the linear maps represented by the matrices

    B_1 = ( 1+i  3−i ; 0  1+i ),   B_2 = ( i  2 ; −1  0 ).

The characteristic polynomials are

    det(B_1 − λ) = | 1+i−λ  3−i ; 0  1+i−λ | = (1+i−λ)² = λ² − 2(1+i)λ + (1+i)² = 0,
    det(B_2 − λ) = | i−λ  2 ; −1  −λ | = (i−λ)(−λ) + 2 = λ² − iλ + 2 = 0.

Therefore, B_1 has one eigenvalue λ = 1+i, and B_2 has two eigenvalues λ_1 = 2i, λ_2 = −i. We see that complex (2×2) matrices can have 1 or 2 eigenvalues.
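Again a short numerical cross-check (illustration only); numpy works over C throughout, so complex eigenvalues require no special treatment:

```python
import numpy as np

B1 = np.array([[1 + 1j, 3 - 1j],
               [0, 1 + 1j]])
B2 = np.array([[1j, 2],
               [-1, 0]])
print(np.linalg.eigvals(B1))   # both eigenvalues equal 1+i
print(np.linalg.eigvals(B2))   # 2i and -i (in some order)
```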
Consider the vector space C² and the linear maps represented by the matrices

    A = ( 0  −1 ; 1  0 ),   A_1 = ( 1  0 ; 0  1 ),   A_2 = ( 0  1 ; 1  0 ).

The characteristic polynomials are

    det(A − λ)   = | −λ  −1 ; 1  −λ |   = λ² + 1 = 0,
    det(A_1 − λ) = | 1−λ  0 ; 0  1−λ |  = (1−λ)² = λ² − 2λ + 1 = 0,
    det(A_2 − λ) = | −λ  1 ; 1  −λ |    = λ² − 1 = 0.

Therefore, A has two eigenvalues λ_1 = i, λ_2 = −i, A_1 has one eigenvalue λ = 1, and A_2 has two eigenvalues λ_1 = 1, λ_2 = −1. We see that complex (2×2) matrices can have 1 or 2 eigenvalues.

Like every polynomial, the characteristic polynomial can be factorized by using its roots. Suppose that there are r roots (which correspond to eigenvalues) {λ_1, λ_2, ..., λ_r}. (We know that r ≤ n when the underlying field is C; in the case of R, the set of roots might be the empty set.) Then

    det(A − λ) = (−1)ⁿ ( λⁿ + c_{n−1} λ^{n−1} + ... + c_1 λ + c_0 )
               = (−1)ⁿ (λ − λ_1)^{m_1} (λ − λ_2)^{m_2} ··· (λ − λ_r)^{m_r} p(λ),

where p(λ) ≡ 1 in the case of C, while in the case of R, p(λ) is a polynomial without real zeros. Accordingly,

    m_1 + m_2 + ... + m_r ≤ n in the case of R,
    m_1 + m_2 + ... + m_r = n in the case of C.

This is a straightforward consequence of the fundamental theorem of algebra.
The integer numbers m_1, ..., m_r are the multiplicities of the roots λ_1, ..., λ_r; in our present context we say that m_1, m_2, ..., m_r are the algebraic multiplicities of the eigenvalues λ_1, λ_2, ..., λ_r.

Consider a linear map A of the vector space V onto itself. The field can be R or C. Suppose that λ is an eigenvalue of A. By definition, the eigenvectors associated with λ are obtained by solving the equation Av = λv, which corresponds to (A − λ)v = o. Since this is a system of linear equations, the existence of nontrivial solutions v is guaranteed by the fact that det(A − λ) = 0 (which in turn follows from the fact that λ is an eigenvalue). Applying the theory of systems of linear equations we see that the solutions of (A − λ)v = o form a (nontrivial) linear subspace E_λ in V. Each vector v ∈ E_λ satisfies the equation (A − λ)v = o and is thus an eigenvector of A with eigenvalue λ. We may write E_λ as

    E_λ = { v ∈ V | Av = λv },

or, equivalently, as E_λ = ker(A − λ). We call the space E_λ the eigenspace of the map A associated with the eigenvalue λ.
Consider the vector space C³ and the linear map represented by the matrix

    A = ( 2    5/2  −5/2
          0   −1/2   5/2
          0    5/2  −1/2 ).

Computing the eigenvalues we obtain

    det(A − λ) = | 2−λ   5/2    −5/2
                   0    −1/2−λ   5/2
                   0     5/2    −1/2−λ |
               = (2−λ) [ (−1/2−λ)(−1/2−λ) − 25/4 ]
               = (2−λ) ( λ² + λ − 6 ) = 0.

Accordingly, one eigenvalue is 2; the remaining eigenvalue(s) are obtained by solving the quadratic equation λ² + λ − 6 = 0,

    λ = ( −1 ± √(1+24) ) / 2 ∈ { −3, 2 }.

The eigenvalue 2 appears again, and the number (−3) is one more eigenvalue. Accordingly, in the present example the eigenvalues of A are λ_1 = 2 and λ_2 = −3. The algebraic multiplicities of the eigenvalues are m_1 = 2 and m_2 = 1, because the characteristic polynomial reads (up to an overall sign)

    (λ − 2)( λ² + λ − 6 ) = (λ − 2)(λ − 2)(λ + 3) = (λ − 2)² (λ + 3).

Let us compute the eigenspace E_1 (which is the set of eigenvectors) associated with λ_1 = 2. We solve

    (A − λ_1) v = ( 0   5/2  −5/2
                    0  −5/2   5/2
                    0   5/2  −5/2 ) ( v_1 ; v_2 ; v_3 ) = o

for v = (v_1, v_2, v_3)ᵗ. The only equation we get is (5/2)v_2 − (5/2)v_3 = 0; hence the solution is

    E_1 = { v = (v_1 ; v_2 ; v_3) | v_2 = v_3 }.

To be continued...
...and now the continuation. The eigenspace E_1 is a two-dimensional subspace of V; it is the space of eigenvectors of A w.r.t. the eigenvalue λ_1 = 2. Every vector in E_1 is an eigenvector of A w.r.t. λ_1; examples are

    (1 ; 0 ; 0),   (0 ; 1 ; 1),   (1 ; 1 ; 1),   (1 ; 2 ; 2).

Since E_1 is two-dimensional, it is spanned by any two (linearly independent) vectors in E_1; for instance we can write

    E_1 = ⟨ (1 ; 0 ; 0), (0 ; 1 ; 1) ⟩,

where ⟨·⟩ denotes the linear span. (Recall that the linear span ⟨v_1, ..., v_n⟩ is defined as {c_1 v_1 + ... + c_n v_n} with constants c_1, ..., c_n.)

Analogously, we compute the eigenspace E_2 associated with λ_2 = −3:

    (A − λ_2) v = ( 5   5/2  −5/2
                    0   5/2   5/2
                    0   5/2   5/2 ) ( v_1 ; v_2 ; v_3 ) = o.

We obtain two independent equations: 5v_1 + (5/2)v_2 − (5/2)v_3 = 0 and (5/2)v_2 + (5/2)v_3 = 0; hence the solution is

    E_2 = { v = (v_1 ; v_2 ; v_3) | v_1 = −v_2 = v_3 }.

This eigenspace is spanned by one vector and thus one-dimensional; we write

    E_2 = ⟨ (1 ; −1 ; 1) ⟩.

Every vector in E_2 is an eigenvector associated with the eigenvalue λ_2 = −3. To be continued...
...and now the conclusion. Let us summarize. The linear map represented by the matrix

    A = ( 2    5/2  −5/2
          0   −1/2   5/2
          0    5/2  −1/2 )

possesses two eigenvalues: λ_1 = 2 and λ_2 = −3. The associated spaces of eigenvectors (eigenspaces) E_1 and E_2 are

    E_1 = ⟨ (1 ; 0 ; 0), (0 ; 1 ; 1) ⟩,   E_2 = ⟨ (1 ; −1 ; 1) ⟩.

The algebraic multiplicity of λ_1 = 2 is m_1 = 2; the algebraic multiplicity of λ_2 = −3 is m_2 = 1.

Let us define the geometric multiplicity d_λ of an eigenvalue λ as the dimension of the associated eigenspace E_λ. In our example we get d_1 = dim E_1 = 2 and d_2 = dim E_2 = 1. Comparing the algebraic multiplicities with the geometric multiplicities we see that

    d_1 = m_1 = 2,   d_2 = m_2 = 1.

An obvious question to ask is whether this statement generalizes: does the geometric multiplicity always coincide with the algebraic multiplicity? Unfortunately, as we will see in the subsequent example, the answer is no.
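The whole example can be cross-checked numerically. The geometric multiplicity is computed as d_λ = 3 − rank(A − λ), which is just the rank–nullity theorem applied to ker(A − λ). (Sketch; the matrix entries are the reconstruction used for illustration.)

```python
import numpy as np

# The 3x3 example (entries assumed for illustration).
A = np.array([[2.0,  2.5, -2.5],
              [0.0, -0.5,  2.5],
              [0.0,  2.5, -0.5]])

# eigenvalues: 2 (algebraic multiplicity 2) and -3
lam = np.sort(np.linalg.eigvals(A).real)
print(lam)  # approximately [-3, 2, 2]

# geometric multiplicities: d_lambda = dim ker(A - lambda) = 3 - rank(A - lambda)
d1 = 3 - np.linalg.matrix_rank(A - 2 * np.eye(3))
d2 = 3 - np.linalg.matrix_rank(A + 3 * np.eye(3))
print(d1, d2)  # -> 2 1
```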
Consider the vector space C³ and the linear map represented by the matrix

    A = ( 0  −1   0
          1   2   0
          0   0  −1 ).

To compute the eigenvalues we calculate the characteristic polynomial; we obtain

    det(A − λ) = | −λ  −1   0
                   1   2−λ  0
                   0   0   −1−λ |
               = (−1−λ) [ −λ(2−λ) + 1 ]
               = −(λ + 1)( λ² − 2λ + 1 ) = −(λ + 1)(λ − 1)² = 0.

Accordingly, there exist two eigenvalues, λ_1 = −1 and λ_2 = 1; the algebraic multiplicities are m_1 = 1 and m_2 = 2.

Computing the eigenvectors (eigenspace) associated with λ_1 = −1 we obtain

    (A − λ_1) v = (A + 1) v = ( 1  −1  0
                                1   3  0
                                0   0  0 ) ( v_1 ; v_2 ; v_3 ) = o,

hence v_1 − v_2 = 0 and v_1 + 3v_2 = 0, and therefore v_1 = 0 and v_2 = 0. Accordingly,

    E_1 = ⟨ (0 ; 0 ; 1) ⟩.

Analogously, we compute the eigenvectors (eigenspace) associated with λ_2 = 1. We obtain

    (A − λ_2) v = (A − 1) v = ( −1  −1   0
                                 1   1   0
                                 0   0  −2 ) ( v_1 ; v_2 ; v_3 ) = o,

hence v_1 + v_2 = 0 and v_3 = 0. To be continued...
...and now the conclusion. Accordingly,

    E_2 = ⟨ (1 ; −1 ; 0) ⟩.

We see that the geometric multiplicities are d_1 = dim E_1 = 1 and d_2 = dim E_2 = 1. In particular, we conclude that the geometric multiplicity of the eigenvalue λ_2 = 1 is less than its algebraic multiplicity:

    d_1 = m_1 = 1,   d_2 = 1 < m_2 = 2.

This statement is true in general: the geometric multiplicity of an eigenvalue is less than or equal to its algebraic multiplicity. Consider a linear map A of V onto itself. Let λ be an eigenvalue of A with algebraic multiplicity m_λ; let E_λ = ker(A − λ) denote the space of eigenvectors (eigenspace) associated with λ. We define the geometric multiplicity d_λ of λ as the dimension of the associated eigenspace E_λ,

    d_λ = dim E_λ.

Then there is the following important statement:

    d_λ ≤ m_λ,

i.e., the geometric multiplicity of an eigenvalue λ is less than or equal to its algebraic multiplicity. (The proof is not particularly difficult and thus omitted.)

Suppose the linear map A possesses the eigenvalues λ_1, λ_2, ..., λ_r with geometric multiplicities d_1, d_2, ..., d_r, i.e., the associated eigenspaces E_1, E_2, ..., E_r satisfy dim E_i = d_i for all i = 1, ..., r. So, how many linearly independent eigenvectors does the map A have? The answer is d_1 + d_2 + ... + d_r. Since d_i ≤ m_i and m_1 + ... + m_r = n (or ≤ n in the case of real vector spaces) we obtain

    d_1 + d_2 + ... + d_r ≤ n.
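A numerical sketch of the same computation (illustration only), again using d_λ = n − rank(A − λ):

```python
import numpy as np

A = np.array([[0, -1,  0],
              [1,  2,  0],
              [0,  0, -1]])
# eigenvalues: -1 (m = 1) and 1 (m = 2)
print(np.sort(np.linalg.eigvals(A).real))  # approximately [-1, 1, 1]

# geometric multiplicities via d = n - rank(A - lambda)
d_minus1 = 3 - np.linalg.matrix_rank(A + np.eye(3))
d_plus1 = 3 - np.linalg.matrix_rank(A - np.eye(3))
print(d_minus1, d_plus1)  # -> 1 1, so d = 1 < m = 2 for lambda = 1
```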
The fact that there are d_1 + d_2 + ... + d_r linearly independent eigenvectors is a nontrivial statement, which is intimately connected with the statement that eigenvectors associated with different eigenvalues are always linearly independent. Let us give a proof.

Suppose the linear map A possesses the eigenvalues λ_1, λ_2, ..., λ_r and associated eigenspaces E_1, E_2, ..., E_r with dim E_i = d_i for all i = 1, ..., r. Let us choose, separately, in each E_i, i = 1, ..., r, a set {v_{i;1}, ..., v_{i;d_i}} of linearly independent vectors,

    E_1 = ⟨ v_{1;1}, ..., v_{1;d_1} ⟩ (d_1 vectors),   E_2 = ⟨ v_{2;1}, ..., v_{2;d_2} ⟩ (d_2 vectors),   ...,   E_r = ⟨ v_{r;1}, ..., v_{r;d_r} ⟩ (d_r vectors).

To prove that the collection of these vectors is linearly independent, we need to show that every linear combination of the kind

    μ_{1;1} v_{1;1} + ... + μ_{1;d_1} v_{1;d_1} + ... + μ_{r;1} v_{r;1} + ... + μ_{r;d_r} v_{r;d_r} = o    (2.1)

is trivial, i.e., μ_{1;1} = 0, ..., μ_{r;d_r} = 0. Let us apply the linear map A to (2.1). Since the vectors are eigenvectors, we obtain

    λ_1 ( μ_{1;1} v_{1;1} + ... + μ_{1;d_1} v_{1;d_1} ) + ... + λ_r ( μ_{r;1} v_{r;1} + ... + μ_{r;d_r} v_{r;d_r} ) = o.

Dividing by λ_1 we get

    μ_{1;1} v_{1;1} + ... + μ_{1;d_1} v_{1;d_1} + (λ_2/λ_1) ( ... ) + ... + (λ_r/λ_1) ( ... ) = o,    (2.2)

which we may subtract from (2.1) to obtain

    (1 − λ_2/λ_1) ( μ_{2;1} v_{2;1} + ... + μ_{2;d_2} v_{2;d_2} ) + ... + (1 − λ_r/λ_1) ( μ_{r;1} v_{r;1} + ... + μ_{r;d_r} v_{r;d_r} ) = o.

We choose to write this relation as

    μ'_{2;1} v_{2;1} + ... + μ'_{2;d_2} v_{2;d_2} + ... + μ'_{r;1} v_{r;1} + ... + μ'_{r;d_r} v_{r;d_r} = o    (2.3)

with constants μ'_{2;1}, ..., μ'_{r;d_r} that are multiples of μ_{2;1}, ..., μ_{r;d_r}. Equation (2.3) has the same structure as (2.1), and we may repeat the entire procedure to obtain

    μ''_{3;1} v_{3;1} + ... + μ''_{3;d_3} v_{3;d_3} + ... + μ''_{r;1} v_{r;1} + ... + μ''_{r;d_r} v_{r;d_r} = o

with constants μ''_{3;1}, ..., μ''_{r;d_r} that are multiples of μ_{3;1}, ..., μ_{r;d_r}. After a finite number of iterations we thus arrive at the equation

    μ'''_{r;1} v_{r;1} + ... + μ'''_{r;d_r} v_{r;d_r} = o
with constants μ'''_{r;1}, ..., μ'''_{r;d_r} that are multiples of μ_{r;1}, ..., μ_{r;d_r}. However, the vectors v_{r;1}, ..., v_{r;d_r} are linearly independent; therefore, we find that every constant vanishes, i.e., μ'''_{r;1} = 0, ..., μ'''_{r;d_r} = 0, and thus μ_{r;1} = 0, ..., μ_{r;d_r} = 0. Insertion into (2.1) yields

    μ_{1;1} v_{1;1} + ... + μ_{1;d_1} v_{1;d_1} + ... + μ_{r−1;1} v_{r−1;1} + ... + μ_{r−1;d_{r−1}} v_{r−1;d_{r−1}} = o.    (2.1′)

Repeating the entire procedure we find that μ_{r−1;1} = 0, ..., μ_{r−1;d_{r−1}} = 0. After a finite number of iterations we thus arrive at the conclusion that

    μ_{1;1} = 0, ..., μ_{1;d_1} = 0, ..., μ_{r;1} = 0, ..., μ_{r;d_r} = 0,

i.e., every single constant in (2.1) is zero. In other words, linear combinations of the kind (2.1) are trivial, which entails that the vectors are linearly independent, as claimed.

Remark. There is a slight subtlety which we overlooked: one eigenvalue might be zero, which makes the division impossible. However, the argument can readily be modified to cover this case.

An alternative way of stating that there are d_1 + d_2 + ... + d_r linearly independent eigenvectors is

    E_1 + ... + E_r = E_1 ⊕ ... ⊕ E_r (⊆ V),

or

    dim ( E_1 ⊕ ... ⊕ E_r ) = dim E_1 + ... + dim E_r = d_1 + ... + d_r (≤ n).

In brief, every eigenspace E_i adds d_i linearly independent eigenvectors to the set of eigenvectors (and we never have to worry that we get a linearly dependent one).

We conclude this section with some useful remarks and observations.

Triangular matrices

Consider an (upper or lower) triangular matrix, i.e.,

    A = ( A_{11}  A_{12}  A_{13}  ...  A_{1n}
          0      A_{22}  A_{23}  ...  A_{2n}
          0      0       A_{33}  ...  A_{3n}
           ...
          0      0       0       ...  A_{nn} ).
To compute the eigenvalues of this matrix we search for the zeros of the characteristic polynomial, i.e.,

    det(A − λ) = | A_{11}−λ  A_{12}   A_{13}  ...  A_{1n}
                   0        A_{22}−λ  A_{23} ...  A_{2n}
                    ...
                   0        0        0      ...  A_{nn}−λ |
               = (A_{11}−λ) | A_{22}−λ  A_{23}  ...  A_{2n} ; ... ; 0  0  ...  A_{nn}−λ |
               = (A_{11}−λ)(A_{22}−λ) | A_{33}−λ  ...  A_{3n} ; ... ; 0  ...  A_{nn}−λ |
               = (A_{11}−λ)(A_{22}−λ)(A_{33}−λ) ··· (A_{nn}−λ).

It follows that the eigenvalues coincide with the diagonal elements of the triangular matrix, i.e., the set of eigenvalues is {A_{11}, A_{22}, A_{33}, ..., A_{nn}}.

Eigenvalues, determinant, and trace

Recall that the trace of a matrix is the sum of its diagonal elements, i.e.,

    tr A = A_{11} + A_{22} + ... + A_{nn} = Σ_{i=1}^n A_{ii}.

Consider (for consistency) an (n-dimensional) vector space V over the field C. Let A be a linear map of V onto itself. Then there exist r ≤ n eigenvalues of A, {λ_1, λ_2, ..., λ_r}, with algebraic multiplicities m_1, m_2, ..., m_r, such that m_1 + m_2 + ... + m_r = n. The eigenvalues {λ_1, ..., λ_r} of a linear map A are intimately connected with its determinant and its trace: The determinant is the product of the eigenvalues, the
trace is the sum of the eigenvalues. However, we must take care of the algebraic multiplicities; an eigenvalue λ_i with algebraic multiplicity m_i appears m_i times in the product or sum. Therefore,

    det A = λ_1 ··· λ_1 (m_1 times) · λ_2 ··· λ_2 (m_2 times) ··· λ_r ··· λ_r (m_r times) = λ_1^{m_1} λ_2^{m_2} ··· λ_r^{m_r} = Π_{i=1}^r λ_i^{m_i},

    tr A = λ_1 + ... + λ_1 (m_1 times) + λ_2 + ... + λ_2 (m_2 times) + ... + λ_r + ... + λ_r (m_r times) = m_1 λ_1 + m_2 λ_2 + ... + m_r λ_r = Σ_{i=1}^r m_i λ_i.

The proof of these relations is not difficult if one uses the characteristic polynomial and its decomposition into its roots, i.e.,

    det(A − λ) = (−1)ⁿ (λ − λ_1)^{m_1} (λ − λ_2)^{m_2} ··· (λ − λ_r)^{m_r}.

We restrict ourselves to the simple example of a (2×2) matrix, i.e.,

    A = ( A_{11}  A_{12} ; A_{21}  A_{22} ).

Let λ_1 and λ_2 denote the eigenvalues of A. (Either λ_1 ≠ λ_2 or λ_1 = λ_2; in the latter case there exists only one eigenvalue whose multiplicity is 2.) We obtain

    det(A − λ) = | A_{11}−λ  A_{12} ; A_{21}  A_{22}−λ | = (A_{11}−λ)(A_{22}−λ) − A_{12}A_{21}
               = λ² − (A_{11} + A_{22}) λ + A_{11}A_{22} − A_{12}A_{21}
               = λ² − (tr A) λ + det A,

and, on the other hand,

    (λ − λ_1)(λ − λ_2) = λ² − (λ_1 + λ_2) λ + λ_1 λ_2.

Comparing the coefficients of the polynomials we are led to the result

    det A = λ_1 λ_2,   tr A = λ_1 + λ_2.
Consider the linear map on C² represented by the (2×2) matrix

    A = ( 1  3 ; −1  3 ).

We use det A and tr A to compute the eigenvalues of A. We have det A = 6 and tr A = 4. Accordingly,

    λ_1 λ_2 = det A = 6,   λ_1 + λ_2 = tr A = 4,

which leads to a quadratic equation, λ² − 4λ + 6 = 0, that can be solved to yield λ_1 = 2 + i√2, λ_2 = 2 − i√2.

A simple corollary of the relation det A = Π_{i=1}^r λ_i^{m_i} is the statement that a linear map A is singular (i.e., not invertible) if and only if zero is an eigenvalue of A.
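A sketch of this det/trace shortcut in numpy (illustration only): for a 2×2 matrix the eigenvalues are the roots of λ² − (tr A)λ + det A.

```python
import numpy as np

# Eigenvalues of a 2x2 matrix from lambda^2 - (tr A) lambda + det A = 0.
A = np.array([[1.0, 3.0],
              [-1.0, 3.0]])
t, d = np.trace(A), np.linalg.det(A)   # t = 4, d = 6
lam = np.roots([1.0, -t, d])           # roots of lambda^2 - 4*lambda + 6
print(lam)                             # 2 + i*sqrt(2) and 2 - i*sqrt(2)
```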
Chapter 3. Diagonalization

Before we begin let us reiterate: A vector v of a vector space V can be represented by a column vector once a basis of V has been chosen. The column vector representation depends on the choice of basis. Likewise, a linear map A of a vector space V onto itself can be represented as a matrix once a basis of V has been chosen. The matrix representation of A depends on the choice of basis.
Consider the vector space R² and the standard basis vectors

    e_1 = (1 ; 0),   e_2 = (0 ; 1).

We shall analyze the linear map A that describes a reflection at the straight line with slope 45° (i.e., a reflection at ⟨e_1 + e_2⟩). Under this reflection, the standard basis vectors e_1 and e_2 are mapped to

    A e_1 = A (1 ; 0) = (0 ; 1) = e_2,   A e_2 = A (0 ; 1) = (1 ; 0) = e_1,

which is immediate from the geometry of the problem. The matrix representation of A is obtained by using the images of e_1 and e_2, i.e., Ae_1 and Ae_2, as columns; hence

    A = ( 0  1 ; 1  0 )    (3.1)

w.r.t. {e_1, e_2}.

Now let us choose a different basis {b_1, b_2} and represent the linear map A w.r.t. {b_1, b_2}. Choose

    b_1 = e_1 + e_2,   b_2 = e_1 − e_2.

We obtain (from purely geometric considerations, i.e., by applying the reflection)

    A b_1 = b_1,   A b_2 = −b_2.

Note that b_1 and b_2 are eigenvectors of A (associated with λ_1 = 1 and λ_2 = −1, respectively). Since we have chosen a (non-standard) basis, column vectors do not quite represent what we are used to. For instance, the vector

    v = (2 ; 0)

now means v = 2 b_1 + 0 b_2, i.e., this vector v points along the 45° line. (It corresponds to (2 ; 2) w.r.t. the old standard basis.) To be continued...
...and now the conclusion. Likewise, the vector

    v = (−1 ; −1)

now means v = (−1) b_1 − b_2, i.e., this vector v points in the direction of the negative x-axis. (It corresponds to (−2 ; 0) w.r.t. the old standard basis.)

To obtain the matrix representation of A w.r.t. the new basis {b_1, b_2}, we write b_1 and b_2 as column vectors,

    b_1 = (1 ; 0),   b_2 = (0 ; 1).

(Column vectors are w.r.t. {b_1, b_2}; in particular b_1 = 1·b_1 + 0·b_2 and b_2 = 0·b_1 + 1·b_2.) The transformation A, which is described by A b_1 = b_1, A b_2 = −b_2, then looks like

    A b_1 = A (1 ; 0) = (1 ; 0) = b_1,   A b_2 = A (0 ; 1) = (0 ; −1) = −b_2.

The matrix representation of A is obtained by using A b_1 and A b_2 as columns; hence

    A = ( 1  0 ; 0  −1 )    (3.2)

w.r.t. the basis {b_1, b_2}. We obtain a different matrix representation for the same linear map. This matrix representation is preferred to the original one since the matrix is diagonal.
It is straightforward to make a connection between the previous example and the considerations of chapter 1. Set

    {b̂_1, b̂_2} = {e_1, e_2},   {b̌_1, b̌_2} = {b_1, b_2}.

Denote by Â the matrix representation of A w.r.t. the standard basis {b̂_1, b̂_2}; in the previous example we have seen that

    Â = ( 0  1 ; 1  0 ),

see (3.1). On the other hand, the matrix representation of A w.r.t. the second basis {b̌_1, b̌_2}, which we denote by Ǎ, is

    Ǎ = ( 1  0 ; 0  −1 ),

see (3.2). The switch matrix S is the matrix that contains the basis vectors {b̌_1, b̌_2}, represented as column vectors w.r.t. the original basis {b̂_1, b̂_2}, as its columns, i.e., S = ( b̌_1 | b̌_2 ), see (1.2). In our example we have b̌_1 = b̂_1 + b̂_2 and b̌_2 = b̂_1 − b̂_2, hence

    b̌_1 = (1 ; 1),   b̌_2 = (1 ; −1)

w.r.t. {b̂_1, b̂_2}, and the switch matrix becomes

    S = ( 1  1 ; 1  −1 ).

From equation (1.1) we see that Ǎ = S⁻¹ Â S is supposed to hold. To be continued...
...and now the conclusion. Indeed, since

    S⁻¹ = (1/2) ( 1  1 ; 1  −1 ),

we find

    S⁻¹ Â S = (1/2) ( 1  1 ; 1  −1 ) ( 0  1 ; 1  0 ) ( 1  1 ; 1  −1 ) = ( 1  0 ; 0  −1 ),

which is in fact Ǎ.

We call a linear map A diagonalizable if A possesses n linearly independent eigenvectors, i.e., a basis of eigenvectors. When does this happen? Suppose that {λ_1, λ_2, ..., λ_r} are the eigenvalues of the linear map A. The associated eigenspaces (spaces of eigenvectors) are E_1, E_2, ..., E_r. The geometric multiplicity of the eigenvalue λ_i is d_i = dim E_i. We know that there exist d_1 + d_2 + ... + d_r (≤ n) linearly independent eigenvectors. Therefore, if and only if

    d_1 + d_2 + ... + d_r = n,

then there exist n linearly independent eigenvectors. Alternatively we can use the condition

    E_1 ⊕ ... ⊕ E_r = V.

An important case of diagonalizability is the case of a linear map A that possesses n different eigenvalues λ_1, λ_2, ..., λ_n (i.e., r = n). Then, automatically, there exist n linearly independent eigenvectors. (This is simply because d_i = dim E_i ≥ 1 for all i; hence, if there exist n different eigenvalues, then d_i = dim E_i = 1 for all i, and by the general considerations on linear independence, these eigenspaces/vectors are linearly independent.)
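The matrix computation S⁻¹ÂS for the reflection example can be checked numerically (sketch, not part of the notes):

```python
import numpy as np

A_hat = np.array([[0.0, 1.0],
                  [1.0, 0.0]])   # reflection w.r.t. the standard basis, eq. (3.1)
S = np.array([[1.0, 1.0],
              [1.0, -1.0]])      # columns: the eigenvectors b1 = e1+e2, b2 = e1-e2
A_check = np.linalg.inv(S) @ A_hat @ S
print(A_check)                   # -> diag(1, -1), eq. (3.2)
```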
Let us suppose that the map A is diagonalizable and let us choose a basis of (i.e., n linearly independent) eigenvectors. We do this by successively choosing bases {v_{i;1}, ..., v_{i;d_i}} in the eigenspaces E_i, i.e.,

    V = ⟨ v_{1;1}, ..., v_{1;d_1} (d_1 vectors from E_1), v_{2;1}, ..., v_{2;d_2} (d_2 vectors from E_2), ..., v_{r;1}, ..., v_{r;d_r} (d_r vectors from E_r) ⟩.

Since v_{i;j} is in E_i, it is an eigenvector associated with the eigenvalue λ_i, i.e., A v_{i;j} = λ_i v_{i;j}. Let us consider the matrix representation of the diagonalizable map A w.r.t. this basis of eigenvectors. Let us denote the matrix we obtain by D (instead of A). We straightforwardly obtain

    D = diag( λ_1, ..., λ_1 (d_1 times), λ_2, ..., λ_2 (d_2 times), ..., λ_r, ..., λ_r (d_r times) ).

It therefore follows that a diagonalizable linear map can be represented by a diagonal matrix, whose entries are the eigenvalues. We call the diagonal matrix

    D = diag( λ_1, ..., λ_1, λ_2, ..., λ_2, ..., λ_r, ..., λ_r )

the eigenvalue matrix of the linear map A.

Conversely, if a map A can be represented by a diagonal matrix, then the eigenvalues are the entries of this matrix (so that the matrix is automatically the
eigenvalue matrix) and the eigenvectors of A are represented by the column vectors

    (1 ; 0 ; ... ; 0),   (0 ; 1 ; ... ; 0),   ...,   (0 ; 0 ; ... ; 1).

Hence, there exists a basis of eigenvectors and thus A is diagonalizable. Summing up, we see that a linear map A is diagonalizable if and only if it can be represented by a diagonal matrix.

Consider the vector space R³ and the linear map A represented by the Sudoku matrix

    A = ( 1  2  3
          4  5  6
          7  8  9 ).

(The basis is tacitly assumed to be the standard basis.) The characteristic polynomial is

    det(A − λ) = −λ³ + 15λ² + 18λ.

The eigenvalues are the zeros of the characteristic polynomial, i.e.,

    λ_1 = 0,   λ_2 = (3/2)(5 + √33),   λ_3 = (3/2)(5 − √33).

Since the map A has three different eigenvalues, it must have three linearly independent eigenvectors and thus a basis of eigenvectors. Therefore, the Sudoku map is diagonalizable and it can be represented by the diagonal eigenvalue matrix

    D = diag( 0, (3/2)(5 + √33), (3/2)(5 − √33) ).
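A numerical cross-check of the Sudoku example (sketch):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])
lam = np.sort(np.linalg.eigvals(A).real)
expected = np.sort([0.0, 1.5 * (5 + np.sqrt(33)), 1.5 * (5 - np.sqrt(33))])
print(np.allclose(lam, expected))  # -> True
```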
Consider the vector space C² and the linear map A represented by the matrix

    A = ( 1  1 ; 0  1 ).

(The basis is tacitly assumed to be the standard basis.) The eigenvalues can be read off directly, since this is a triangular matrix: there is only one eigenvalue, λ = 1. (Its algebraic multiplicity must be m_λ = 2.) Let us compute the space of eigenvectors E_λ. From

    (A − λ)v = ( 0  1 ; 0  0 ) ( v_1 ; v_2 ) = ( 0 ; 0 )

we deduce that v_2 = 0 and

    E_λ = ⟨ (1 ; 0) ⟩,   d_λ = dim E_λ = 1.

In particular, there is only one linearly independent eigenvector (and not two linearly independent ones). There does not exist a basis of eigenvectors; therefore, the map A is not diagonalizable.

In connection with the previous example we consider a rather trivial example: the identity map 𝟙. The identity map has one eigenvalue, λ = 1 (with algebraic multiplicity m_λ = 2). Every vector is an eigenvector for 𝟙, hence E_λ = C² and d_λ = dim E_λ = 2. The identity map is diagonalizable (and the standard matrix representation of 𝟙 is already diagonal).

We see that it is not a problem if an eigenvalue appears multiple times (i.e., if its algebraic multiplicity is greater than 1). A problem occurs if the geometric multiplicity is strictly less than the algebraic multiplicity, d_λ < m_λ. In that case, Σ_{i=1}^r m_i = n but Σ_{i=1}^r d_i < n, whence diagonalizability is ruled out.
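The failure of diagonalizability shows up in the rank: the geometric multiplicity is d_λ = n − rank(A − λ) (numerical sketch, not part of the notes):

```python
import numpy as np

A = np.array([[1, 1],
              [0, 1]])
# lambda = 1 has algebraic multiplicity 2, but the geometric multiplicity is
# d = 2 - rank(A - 1) = 1: there is no basis of eigenvectors.
d = 2 - np.linalg.matrix_rank(A - np.eye(2))
print(d)  # -> 1
```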
Consider the vector space R² and the linear map represented by the matrix

    A = [ 0  −1 ]
        [ 1   0 ].

The characteristic polynomial is λ² + 1; it has no real zeros, hence there do not exist any (real) eigenvalues. If we consider the same map as a map on the vector space C², then there exist two eigenvalues: λ₁ = i, λ₂ = −i. The map is not diagonalizable as a real map, but it is in fact diagonalizable when regarded as a complex map. (Since there exist two different eigenvalues, there exist two linearly independent eigenvectors.)

In practice, a linear map is given in its matrix representation w.r.t. some (standard) basis,

    A = [ A₁₁  A₁₂  ...  A₁ₙ ]
        [ A₂₁  A₂₂  ...  A₂ₙ ]
        [  :    :         :  ]
        [ Aₙ₁  Aₙ₂  ...  Aₙₙ ].

We know that we can switch to a representation by a diagonal matrix (the eigenvalue matrix D) if and only if A is diagonalizable. How do we switch in practice? We need a switch matrix S. The switch matrix is supposed to transform the standard basis into a basis of eigenvectors. On the basis of eigenvectors, the linear map acts as the diagonal eigenvalue matrix D. Having applied the map in this simple form, we then switch back to the standard basis. Hence

    D = S⁻¹ A S.

The switch matrix contains the eigenvectors of A as its columns, i.e.,

    S = ( v_{1;1}  v_{1;2}  ...  v_{r;d_r} ).

To prove that D = S⁻¹ A S we show that S D w = A S w for all w ∈ V. Due to linearity, if we aim at proving a statement for all w ∈ V, it suffices to show this
statement for all basis vectors. Consider the standard basis vector

    e₁ = (1, 0, ..., 0)ᵀ.

We obtain

    S e₁ = v_{1;1},   and thus   A S e₁ = λ₁ v_{1;1}.

On the other hand,

    D e₁ = λ₁ e₁,   and thus   S D e₁ = λ₁ S e₁ = λ₁ v_{1;1}.

We conclude that S D e₁ = A S e₁; analogously, we obtain S D e_i = A S e_i for all standard basis vectors e_i, and thus S D w = A S w for all w ∈ V. This completes the proof of the claim.
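The proof strategy above, checking A S e_i = S D e_i column by column, can be illustrated numerically. In the sketch below the matrix A = [[2, 1], [1, 2]] is our own illustrative choice (it does not appear in the text); its eigenvalues are 3 and 1 with eigenvectors (1, 1)ᵀ and (1, −1)ᵀ:

```python
# Verify A S = S D (equivalently D = S^{-1} A S) for a hand-picked
# diagonalizable matrix; the columns of S are eigenvectors of A.

def matmul(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1],
     [1, 2]]
S = [[1,  1],
     [1, -1]]        # columns: eigenvectors (1,1) and (1,-1)
D = [[3, 0],
     [0, 1]]         # eigenvalue matrix

# Each column of S is stretched by its eigenvalue, so A S = S D.
assert matmul(A, S) == matmul(S, D)
```

Checking A S = S D avoids computing S⁻¹ explicitly; invertibility of S (its columns form a basis) then gives D = S⁻¹ A S.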
Consider the map A on C³ given by

    A = [ 3+2i  −2−2i  −4 ]
        [ 1+i   −i     −2 ]
        [ 1+i   −1−i   −1 ].

The characteristic polynomial is

    det(A − λ·1) = −λ³ + (2+i)λ² − (1+2i)λ + i ;

it is not difficult to convince oneself that the factorization into roots is

    det(A − λ·1) = −(λ − i)(λ − 1)².

Therefore, the eigenvalues are

    λ₁ = i  (m₁ = 1, d₁ = 1),   λ₂ = 1  (m₂ = 2).

For A to be diagonalizable there must exist two linearly independent eigenvectors associated with the eigenvalue λ₂ (i.e., d₂ = dim E₂ = 2 is required). A straightforward computation shows that

    E₁ = ⟨ (2, 1, 1)ᵀ ⟩,   E₂ = ⟨ (1, 1, 0)ᵀ, (2, 0, 1+i)ᵀ ⟩,

hence d₂ = dim E₂ = 2 indeed; accordingly, there exist 3 linearly independent eigenvectors and A is diagonalizable. The switch matrix S is

    S = [ 2  1  2   ]
        [ 1  1  0   ]
        [ 1  0  1+i ],

and its inverse is

    S⁻¹ = [ −i        i         1+i      ]
          [ i         1−i       −1−i     ]
          [ (1+i)/2   −(1+i)/2  −(1+i)/2 ].

It is straightforward to check that

    S⁻¹ A S = D = diag(i, 1, 1).
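The final identity can be verified with Python's built-in complex arithmetic (the literal 1j plays the role of i). The sketch below assumes the matrices A, S and S⁻¹ as stated in the example, and checks both that S⁻¹ really inverts S and that S⁻¹ A S = diag(i, 1, 1):

```python
# Check of the C^3 example using Python's native complex numbers.

def matmul(X, Y):
    """Product of two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def close(X, Y, tol=1e-12):
    """Entrywise comparison of two square matrices up to tol."""
    n = len(X)
    return all(abs(X[i][j] - Y[i][j]) < tol
               for i in range(n) for j in range(n))

A = [[3 + 2j, -2 - 2j, -4],
     [1 + 1j, -1j,     -2],
     [1 + 1j, -1 - 1j, -1]]

S = [[2, 1, 2],            # columns: eigenvectors for i, 1, 1
     [1, 1, 0],
     [1, 0, 1 + 1j]]

Sinv = [[-1j,          1j,           1 + 1j],
        [1j,           1 - 1j,       -1 - 1j],
        [(1 + 1j) / 2, -(1 + 1j) / 2, -(1 + 1j) / 2]]

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
D  = [[1j, 0, 0], [0, 1, 0], [0, 0, 1]]

assert close(matmul(S, Sinv), I3)            # Sinv is the inverse of S
assert close(matmul(matmul(Sinv, A), S), D)  # S^{-1} A S = diag(i, 1, 1)
```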