MAT1302F Mathematical Methods II Lecture 19
MAT1302F Mathematical Methods II, Lecture 19
Aaron Christie
2 April 2015

Eigenvectors, Eigenvalues, and Diagonalization

Now that the basic theory of eigenvalues and eigenvectors is in place (most importantly, a procedure that allows us to find all the eigenvalues, eigenvectors, and eigenspaces of a given matrix A), it is time to discuss an important application of these ideas: diagonalizing a matrix.

First, recall what a diagonal matrix is: it is a matrix whose only nonzero entries lie on its main diagonal, that is, any matrix of the form

[ d1 0 ]       [ d1 0  0  ]
[ 0 d2 ]  or   [ 0  d2 0  ]
               [ 0  0  d3 ].

Diagonal matrices have a special form that is nice for a number of reasons:

- Diagonal matrices are both upper and lower triangular, so the eigenvalues of a diagonal matrix are just its diagonal entries, and the determinant of a diagonal matrix is just the product of its diagonal entries.
- They are particularly easy to multiply: just multiply the diagonal entries in matching positions.
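These properties are easy to confirm numerically. Here is a small sketch in Python with NumPy; the diagonal entries are illustrative choices of mine, not matrices from the notes:

```python
import numpy as np

# Illustrative diagonal matrices (entries chosen here, not from the notes).
D = np.diag([2.0, -3.0, 5.0])
E = np.diag([1.0, 4.0, 2.0])

# The eigenvalues of a diagonal matrix are just its diagonal entries.
print(np.allclose(sorted(np.linalg.eigvals(D)), [-3.0, 2.0, 5.0]))  # True

# The determinant is the product of the diagonal entries: 2 * (-3) * 5.
print(np.isclose(np.linalg.det(D), -30.0))                          # True

# Multiplying diagonal matrices multiplies entries in matching positions.
print(np.allclose(D @ E, np.diag([2.0, -12.0, 10.0])))              # True
```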
This makes it easy to determine invertibility and find an inverse for a diagonal matrix: as long as no diagonal entry is 0, the matrix is invertible, and the inverse is diagonal with entries equal to the reciprocals of the diagonal entries of the original matrix.

These properties are legitimately important in application. For instance, computing powers of a matrix is generally quite difficult, especially by hand. But even using computing software, real applications often require computing very large powers of very large matrices, large enough to sometimes be a problem even for a machine made for doing that kind of thing. Computing A^17 by hand for even a small matrix A with integer entries is going to be a pain, and relatively speaking that is not even a bad case: the matrix is small, its entries are integers, and the power is not especially large. On the other hand, if A is diagonal, computing its powers is not bad at all; it just reduces to computing powers of its entries:

if A = [ d1 0 ]     then   A^17 = [ d1^17  0     ]
       [ 0 d2 ],                  [ 0      d2^17 ].

So, it would certainly be computationally helpful if it were possible to take advantage of this kind of simplification with matrices that are not themselves diagonal. Here is an example of the sort of thing it might be reasonable to expect in a somewhat general context.
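The same point in code: for a diagonal matrix, a large power reduces to entrywise powers. The entries 2 and -3 are illustrative assumptions:

```python
import numpy as np

d = np.array([2, -3])          # illustrative diagonal entries
A = np.diag(d)

# Generic route: repeated matrix multiplication.
direct = np.linalg.matrix_power(A, 17)

# Shortcut for a diagonal matrix: raise each diagonal entry to the 17th power.
shortcut = np.diag(d ** 17)

print((direct == shortcut).all())   # True
```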
Example 1.1. Take a diagonal matrix D and an invertible matrix P with inverse P^-1, and let A be the product

A = P D P^-1.

Now suppose we want to compute just the 4th power of A. By hand, this will be difficult and lengthy. But the way we formed A to begin with actually makes this easier:

A^4 = (P D P^-1)(P D P^-1)(P D P^-1)(P D P^-1)
    = P D (P^-1 P) D (P^-1 P) D (P^-1 P) D P^-1
    = P D I D I D I D P^-1
    = P D^4 P^-1.

Even though A is not diagonal, it is still possible to use the fact that diagonal matrices can be easily exponentiated to take some of the work out of this chore. The form appearing in this example is, in some ways, the nearest we can reasonably expect to get to a diagonal matrix if we are given an arbitrary matrix.

Definition 1.2. If A = P B P^-1 for some invertible matrix P, then we say that A and B are similar.

Remark 1.3. Notice that this definition applies beyond the situation we are planning on using it for: the matrix B does not have to be diagonal; it can be any matrix at all that is related to A through an invertible matrix.

Our goal now is to find a diagonal matrix similar to a given matrix, or to find out when it is possible to do this.

Definition 1.4. If A is similar to a diagonal matrix, then A is said to be diagonalizable. (Explicitly, A is diagonalizable when there is a diagonal matrix D and an invertible matrix P such that A = P D P^-1.)
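The telescoping identity A^4 = P D^4 P^-1 can be spot-checked numerically. The specific D and P below are my own illustrative choices, since any diagonal D and any invertible P will do:

```python
import numpy as np

D = np.diag([2.0, -1.0])            # any diagonal matrix
P = np.array([[1.0, 1.0],
              [0.0, 4.0]])          # any invertible matrix (det = 4)
Pinv = np.linalg.inv(P)

A = P @ D @ Pinv                    # A is similar to D

# (P D P^-1)^4 telescopes to P D^4 P^-1, and D^4 is computed entrywise.
lhs = np.linalg.matrix_power(A, 4)
rhs = P @ np.diag([2.0**4, (-1.0)**4]) @ Pinv
print(np.allclose(lhs, rhs))        # True
```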
To diagonalize a matrix A is a matter of finding a diagonal matrix D and an invertible matrix P such that A = P D P^-1. It turns out that not every matrix is diagonalizable, but the circumstances under which a matrix is diagonalizable have everything to do with its eigenvalues and eigenvectors.

Example 1.5. To see how this works out, recall an example from last time. Let

A = [ 4 -1 ]
    [ 2  1 ].

The characteristic polynomial of this matrix is

λ^2 - 5λ + 6 = (λ - 2)(λ - 3),

so the eigenvalues are λ_1 = 2 and λ_2 = 3. Now let's find the corresponding eigenspaces.

λ_1 = 2:

2I - A = [ -2 1 ]
         [ -2 1 ]

The solutions in vector parametric form are

t [ 1/2 ]
  [  1  ],

so we can choose any nonzero vector of this form as a basis vector for the eigenspace. Taking t = 1 gives [ 1/2 ; 1 ], but we could also remove the fractional entry by choosing t = 2 instead: [ 1 ; 2 ].

λ_2 = 3:

3I - A = [ -1 1 ]
         [ -2 2 ]
This has solutions in vector parametric form t [ 1 ; 1 ], so [ 1 ; 1 ] is an eigenvector with eigenvalue 3 and a basis vector for the 3-eigenspace (as would be any nonzero multiple of this vector).

Recall that two classes ago we saw a theorem that told us that eigenvectors for distinct eigenvalues of a matrix are linearly independent. This means that if we form a matrix with the basis vectors we chose for each eigenvalue, we get an invertible matrix (a matrix with linearly independent columns is invertible by the Invertible Matrix Theorem):

P = [ 1 1 ]
    [ 2 1 ]

(you can check the determinant of this matrix to prove that it is invertible if you are doubtful). The inverse of this matrix can be quickly calculated to be

P^-1 = [ -1  1 ]
       [  2 -1 ].

Now, here is the miracle (it is not a miracle, but it seems to come out of nowhere for us). If we form a diagonal matrix D by taking the diagonal entry in the ith column to be the eigenvalue of the eigenvector appearing in the ith column of P, then

P D P^-1 = [ 1 1 ] [ 2 0 ] [ -1  1 ]   [ 4 -1 ]
           [ 2 1 ] [ 0 3 ] [  2 -1 ] = [ 2  1 ] = A.

So: A is actually similar to the diagonal matrix whose entries are the eigenvalues of A! And now, with this similarity in hand, we can compute large powers of A:

A^100 = (P D P^-1)^100 = P D^100 P^-1
      = [ 1 1 ] [ 2^100    0   ] [ -1  1 ]
        [ 2 1 ] [   0    3^100 ] [  2 -1 ]
      = [ 2(3^100) - 2^100    2^100 - 3^100 ]
        [ 2(3^100) - 2^101    2^101 - 3^100 ].
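Example 1.5 can be replayed in code. NumPy confirms the similarity, and exact Python integers confirm the closed form for A^100 (floating point would lose precision at numbers this large):

```python
import numpy as np

A = np.array([[4, -1], [2, 1]])
P = np.array([[1, 1], [2, 1]])
D = np.diag([2, 3])
Pinv = np.array([[-1, 1], [2, -1]])   # inverse of P, since det(P) = -1

# A = P D P^-1, so A is similar to diag(2, 3).
print((P @ D @ Pinv == A).all())      # True

# Check the closed form for A^100 using exact integer arithmetic.
M = [[4, -1], [2, 1]]
Apow = [[1, 0], [0, 1]]               # identity; will become M^100
for _ in range(100):
    Apow = [[sum(Apow[i][k] * M[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

expected = [[2 * 3**100 - 2**100, 2**100 - 3**100],
            [2 * 3**100 - 2**101, 2**101 - 3**100]]
print(Apow == expected)               # True
```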
Example 1.6. Suppose

A = [ 4 2 2 ]
    [ 2 4 2 ]
    [ 2 2 4 ].

First, we find the eigenvalues of A by calculating and factoring its characteristic polynomial. Along the way, we use a row operation and a column operation to simplify the cofactor expansion:

det(λI - A) = | λ-4  -2   -2  |
              | -2   λ-4  -2  |
              | -2   -2   λ-4 |

            = | λ-2  -(λ-2)  0  |      (R1 -> R1 - R2)
              | -2    λ-4   -2  |
              | -2    -2   λ-4  |

            = | λ-2   0    0   |       (C2 -> C2 + C1)
              | -2   λ-6  -2   |
              | -2   -4   λ-4  |

            = (λ-2)[(λ-4)(λ-6) - (-4)(-2)]
            = (λ-2)[λ^2 - 10λ + 16]
            = (λ-2)(λ-8)(λ-2)
            = (λ-2)^2 (λ-8).

Therefore, A has two eigenvalues,

λ_1 = 2 and λ_2 = 8,

one with algebraic multiplicity 2 and the other with algebraic multiplicity 1. Next, find the corresponding eigenspaces.

λ_1 = 2:

2I - A = [ -2 -2 -2 ]        [ 1 1 1 ]
         [ -2 -2 -2 ]   ->   [ 0 0 0 ]
         [ -2 -2 -2 ]        [ 0 0 0 ]

So the general solutions to (2I - A)x = 0 are

x_1 = -x_2 - x_3,  x_2 = s,  x_3 = t.
Therefore, the eigenspace corresponding to the eigenvalue 2 consists of all vectors

s [ -1 ]     [ -1 ]
  [  1 ] + t [  0 ]
  [  0 ]     [  1 ].

This eigenspace is 2-dimensional, and a basis for it is

[ -1 ]        [ -1 ]
[  1 ]  and   [  0 ]
[  0 ]        [  1 ].

These will take up two columns in the matrix P when we form it.

λ_2 = 8:

8I - A = [  4 -2 -2 ]        [ 1 0 -1 ]
         [ -2  4 -2 ]   ->   [ 0 1 -1 ]
         [ -2 -2  4 ]        [ 0 0  0 ]

So the general solutions to (8I - A)x = 0 are

x_1 = x_3,  x_2 = x_3,  x_3 = t.

Therefore, the eigenspace corresponding to the eigenvalue 8 consists of all vectors t [ 1 ; 1 ; 1 ], which is 1-dimensional. A basis vector for it is [ 1 ; 1 ; 1 ].

Now, with all of these parts, we can assemble them into a diagonalization. First, the diagonal matrix

D = [ 2 0 0 ]
    [ 0 2 0 ]
    [ 0 0 8 ]
consisting of the eigenvalues of A. Then, the invertible matrix

P = [ -1 -1 1 ]
    [  1  0 1 ]
    [  0  1 1 ]

consisting of the basis vectors for the two eigenspaces (NOTE: the eigenvectors are in the same columns of P as their matching eigenvalues in D!). The inverse of P (which you can calculate using the row reduction method) is

P^-1 = (1/3) [ -1  2 -1 ]
             [ -1 -1  2 ]
             [  1  1  1 ].

Then, A is similar to D because P D P^-1 = A.

Remark 1.7. In the above example, we could have used matrices D and P different from the ones above in a few different ways:

- The columns of D and P could be in a different order. For example, we could put the eigenvalue 8 first, using D = diag(8, 2, 2), provided we also move the eigenvector [ 1 ; 1 ; 1 ] to the first column of P. The only requirement is that corresponding eigenvalues and eigenvectors must appear in the same columns in the two matrices. So, for example, reordering the diagonal entries of D without changing the order of the columns in P would not successfully diagonalize A.
- For a given eigenvalue, you can use any basis for the corresponding eigenspace as columns in P.
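This diagonalization is quick to verify in code; checking AP = PD avoids computing P^-1 at all:

```python
import numpy as np

A = np.array([[4, 2, 2],
              [2, 4, 2],
              [2, 2, 4]])
D = np.diag([2, 2, 8])
P = np.array([[-1, -1, 1],
              [ 1,  0, 1],
              [ 0,  1, 1]])

# AP = PD: each column of P gets scaled by its matching eigenvalue in D.
print((A @ P == P @ D).all())                    # True

# Equivalently, P D P^-1 reproduces A.
print(np.allclose(P @ D @ np.linalg.inv(P), A))  # True
```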
An important thing to notice about the last example is that we would not have been able to complete the diagonalization if the eigenspace for 2 had been only 1-dimensional (remember that the dimension of the eigenspace is also called the geometric multiplicity). If that had been the case, we would have found ourselves one column short when forming the matrix P, and there would have been no remedy for this. So:

A given matrix is diagonalizable if and only if the geometric multiplicity of each eigenvalue is equal to its algebraic multiplicity. (It will always be the case that the geometric multiplicity is less than or equal to the algebraic multiplicity.)

Equivalently, an n x n matrix is diagonalizable if and only if the sum of the dimensions of its eigenspaces is n.

Remark 1.8 (Some Warnings).

- Diagonalizable is not equivalent to invertible! It is possible to find matrices that are diagonalizable but not invertible, and invertible matrices that are not diagonalizable.
- Similarity is not the same as row equivalence! Row equivalence means there is a sequence of row operations that will take you from one matrix A to another, B. Similarity means there is an invertible matrix P such that A = P B P^-1.

Example 1.9. If

A = [ 3 0 0 ]
    [ 1 3 0 ]
    [ 0 0 2 ],

then we can see immediately that the eigenvalues of A are 3 and 2, since A is triangular. What might stop us from diagonalizing A is the eigenspace for 3 being less than 2-dimensional, and this does turn out to be what happens:

3I - A = [  0 0 0 ]
         [ -1 0 0 ]
         [  0 0 1 ]
Therefore, the solutions to (3I - A)x = 0 are

x_1 = 0,  x_2 = t,  x_3 = 0.

Therefore, the eigenspace for 3 is 1-dimensional, with basis vector [ 0 ; 1 ; 0 ]. The eigenspace for 2 will be 1-dimensional (it cannot be more than 1, nor less than 1), so there are insufficient vectors to build P, and thus A is not diagonalizable.

Remark 1.10. A given scalar λ is an eigenvalue if and only if (λI - A)x = 0 has nontrivial solutions. If there are nontrivial solutions, the solution set involves at least one free variable, so the eigenspace for λ is at least 1-dimensional.

There is at least one circumstance where we can be sure that a matrix will be diagonalizable.

Theorem 1.11. An n x n matrix with n distinct eigenvalues is diagonalizable.

In this case, each eigenvalue has algebraic multiplicity 1, and each eigenvalue must have geometric multiplicity at least 1, so algebraic multiplicity = geometric multiplicity and the matrix must be diagonalizable. Therefore, the only instances we need to worry about are those where an eigenvalue has algebraic multiplicity greater than 1.

Procedure For Diagonalizing A Matrix. Let A be n x n.

1. Find the eigenvalues of A by solving the characteristic equation det(λI - A) = 0.
2. Find the eigenspace for each eigenvalue:
   - Row reduce [λI - A | 0].
   - Write the solution in vector parametric form: x = t_1 v_1 + ... + t_k v_k. Then the eigenspace is Span{v_1, ..., v_k}, and v_1, ..., v_k is a basis for it.
   - If the dimension of some eigenspace is less than the algebraic multiplicity of its eigenvalue, then A is not diagonalizable. If the geometric multiplicity of every eigenvalue is equal to its algebraic multiplicity, then A is diagonalizable; proceed to the next step.

3. Construct P using the eigenvectors found in the previous step as columns: P = [ v_1 v_2 ... v_n ].
4. Construct the diagonal matrix D from the eigenvalues of A, where λ_i is the eigenvalue of v_i:

   D = [ λ_1           ]
       [     λ_2       ]
       [         ...   ]
       [           λ_n ].

The diagonalization can be verified by checking that A = P D P^-1, or that AP = PD (note that the latter does not require you to compute P^-1!).

Definition 1.12. If A is an n x n matrix, then a basis of R^n consisting of eigenvectors of A is called an eigenvector basis for A.

Theorem 1.13 (The Diagonalization Theorem).

1. An n x n matrix A is diagonalizable if and only if A has n linearly independent eigenvectors; equivalently, if and only if there is an eigenvector basis for A.
2. A = P D P^-1, where D is a diagonal matrix, if and only if the columns of P form an eigenvector basis for A and the diagonal entries of D are the corresponding eigenvalues of A.
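Here is a numerical sketch of this procedure. It is an assumption-laden approximation: it uses NumPy's `np.linalg.eig` to get candidate eigenvectors rather than factoring the characteristic polynomial by hand, and it judges linear independence by numerical rank:

```python
import numpy as np

def diagonalize(A, tol=1e-8):
    """Return (P, D) with A = P D P^-1, or None if A is not
    diagonalizable (judged numerically)."""
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    # eig returns one candidate eigenvector per eigenvalue occurrence;
    # A is diagonalizable exactly when these n columns are independent,
    # i.e. when they form an eigenvector basis (Theorem 1.13).
    if np.linalg.matrix_rank(eigvecs, tol=tol) < n:
        return None
    return eigvecs, np.diag(eigvals)

# A symmetric 3x3 matrix (as in Example 1.6) diagonalizes; check AP = PD.
A = np.array([[4., 2., 2.], [2., 4., 2.], [2., 2., 4.]])
P, D = diagonalize(A)
print(np.allclose(A @ P, P @ D))   # True

# A defective triangular matrix (in the spirit of Example 1.9) does not:
# its repeated eigenvalue 3 has only a 1-dimensional eigenspace.
B = np.array([[3., 0., 0.], [1., 3., 0.], [0., 0., 2.]])
print(diagonalize(B) is None)      # True
```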
Remark 1.14. The importance of the theorem is that it says that the method for diagonalizing a matrix that we have seen this class is in fact the only method there is. There is not some other, completely different method that could be used to cover cases where the eigenvector/eigenvalue method does not work.

Example 1.15. Take a 4 x 4 triangular matrix A whose diagonal entries are four distinct numbers, one of them 0. Is A diagonalizable? Yes, which you can tell without going to the trouble of actually diagonalizing it: because A is triangular, it is possible to see immediately that it has four distinct eigenvalues. Note also that such an A is an example of a matrix that is diagonalizable but not invertible (which you can see because 0 is one of its eigenvalues).

Next Class: One last look at some applications: difference equations, Markov chains, and maybe a few words about Google PageRank.
7 71 Eigenvalues and Eigenvectors Basic Definitions Let L be a linear operator on some given vector space V A scalar λ and a nonzero vector v are referred to, respectively, as an eigenvalue and corresponding
More information2 b 3 b 4. c c 2 c 3 c 4
OHSx XM511 Linear Algebra: Multiple Choice Questions for Chapter 4 a a 2 a 3 a 4 b b 1. What is the determinant of 2 b 3 b 4 c c 2 c 3 c 4? d d 2 d 3 d 4 (a) abcd (b) abcd(a b)(b c)(c d)(d a) (c) abcd(a
More informationCS 246 Review of Linear Algebra 01/17/19
1 Linear algebra In this section we will discuss vectors and matrices. We denote the (i, j)th entry of a matrix A as A ij, and the ith entry of a vector as v i. 1.1 Vectors and vector operations A vector
More informationMATH 310, REVIEW SHEET
MATH 310, REVIEW SHEET These notes are a summary of the key topics in the book (and follow the book pretty closely). You should be familiar with everything on here, but it s not comprehensive, so please
More informationDiagonalization. Hung-yi Lee
Diagonalization Hung-yi Lee Review If Av = λv (v is a vector, λ is a scalar) v is an eigenvector of A excluding zero vector λ is an eigenvalue of A that corresponds to v Eigenvectors corresponding to λ
More informationUNDERSTANDING THE DIAGONALIZATION PROBLEM. Roy Skjelnes. 1.- Linear Maps 1.1. Linear maps. A map T : R n R m is a linear map if
UNDERSTANDING THE DIAGONALIZATION PROBLEM Roy Skjelnes Abstract These notes are additional material to the course B107, given fall 200 The style may appear a bit coarse and consequently the student is
More informationLinear Algebra Practice Problems
Math 7, Professor Ramras Linear Algebra Practice Problems () Consider the following system of linear equations in the variables x, y, and z, in which the constants a and b are real numbers. x y + z = a
More informationAnnouncements Wednesday, November 01
Announcements Wednesday, November 01 WeBWorK 3.1, 3.2 are due today at 11:59pm. The quiz on Friday covers 3.1, 3.2. My office is Skiles 244. Rabinoffice hours are Monday, 1 3pm and Tuesday, 9 11am. Section
More informationGeneralized Eigenvectors and Jordan Form
Generalized Eigenvectors and Jordan Form We have seen that an n n matrix A is diagonalizable precisely when the dimensions of its eigenspaces sum to n. So if A is not diagonalizable, there is at least
More informationDifferential Equations
This document was written and copyrighted by Paul Dawkins. Use of this document and its online version is governed by the Terms and Conditions of Use located at. The online version of this document is
More informationEigenvalues and Eigenvectors 7.2 Diagonalization
Eigenvalues and Eigenvectors 7.2 Diagonalization November 8 Goals Suppose A is square matrix of order n. Provide necessary and sufficient condition when there is an invertible matrix P such that P 1 AP
More informationMath 416, Spring 2010 More on Algebraic and Geometric Properties January 21, 2010 MORE ON ALGEBRAIC AND GEOMETRIC PROPERTIES
Math 46, Spring 2 More on Algebraic and Geometric Properties January 2, 2 MORE ON ALGEBRAIC AND GEOMETRIC PROPERTIES Algebraic properties Algebraic properties of matrix/vector multiplication Last time
More informationMATH 1120 (LINEAR ALGEBRA 1), FINAL EXAM FALL 2011 SOLUTIONS TO PRACTICE VERSION
MATH (LINEAR ALGEBRA ) FINAL EXAM FALL SOLUTIONS TO PRACTICE VERSION Problem (a) For each matrix below (i) find a basis for its column space (ii) find a basis for its row space (iii) determine whether
More informationEcon Slides from Lecture 7
Econ 205 Sobel Econ 205 - Slides from Lecture 7 Joel Sobel August 31, 2010 Linear Algebra: Main Theory A linear combination of a collection of vectors {x 1,..., x k } is a vector of the form k λ ix i for
More informationLinear Algebra. Rekha Santhanam. April 3, Johns Hopkins Univ. Rekha Santhanam (Johns Hopkins Univ.) Linear Algebra April 3, / 7
Linear Algebra Rekha Santhanam Johns Hopkins Univ. April 3, 2009 Rekha Santhanam (Johns Hopkins Univ.) Linear Algebra April 3, 2009 1 / 7 Dynamical Systems Denote owl and wood rat populations at time k
More informationMATH 1553, C. JANKOWSKI MIDTERM 3
MATH 1553, C JANKOWSKI MIDTERM 3 Name GT Email @gatechedu Write your section number (E6-E9) here: Please read all instructions carefully before beginning Please leave your GT ID card on your desk until
More informationMath Linear Algebra Final Exam Review Sheet
Math 15-1 Linear Algebra Final Exam Review Sheet Vector Operations Vector addition is a component-wise operation. Two vectors v and w may be added together as long as they contain the same number n of
More informationAdvanced Engineering Mathematics Prof. Pratima Panigrahi Department of Mathematics Indian Institute of Technology, Kharagpur
Advanced Engineering Mathematics Prof. Pratima Panigrahi Department of Mathematics Indian Institute of Technology, Kharagpur Lecture No. #07 Jordan Canonical Form Cayley Hamilton Theorem (Refer Slide Time:
More informationMath 291-2: Lecture Notes Northwestern University, Winter 2016
Math 291-2: Lecture Notes Northwestern University, Winter 2016 Written by Santiago Cañez These are lecture notes for Math 291-2, the second quarter of MENU: Intensive Linear Algebra and Multivariable Calculus,
More informationMath 2331 Linear Algebra
5. Eigenvectors & Eigenvalues Math 233 Linear Algebra 5. Eigenvectors & Eigenvalues Shang-Huan Chiu Department of Mathematics, University of Houston schiu@math.uh.edu math.uh.edu/ schiu/ Shang-Huan Chiu,
More information33AH, WINTER 2018: STUDY GUIDE FOR FINAL EXAM
33AH, WINTER 2018: STUDY GUIDE FOR FINAL EXAM (UPDATED MARCH 17, 2018) The final exam will be cumulative, with a bit more weight on more recent material. This outline covers the what we ve done since the
More informationDM554 Linear and Integer Programming. Lecture 9. Diagonalization. Marco Chiarandini
DM554 Linear and Integer Programming Lecture 9 Marco Chiarandini Department of Mathematics & Computer Science University of Southern Denmark Outline 1. More on 2. 3. 2 Resume Linear transformations and
More informationMATH 20F: LINEAR ALGEBRA LECTURE B00 (T. KEMP)
MATH 20F: LINEAR ALGEBRA LECTURE B00 (T KEMP) Definition 01 If T (x) = Ax is a linear transformation from R n to R m then Nul (T ) = {x R n : T (x) = 0} = Nul (A) Ran (T ) = {Ax R m : x R n } = {b R m
More information