Linear algebra II Tutorial solutions #1


Linear algebra II Tutorial solutions #1

1. Find the eigenvalues and the eigenvectors of the matrix

    A = [5 2; 4 3].

Since tr A = 8 and det A = 15 − 8 = 7, the characteristic polynomial is

    f(λ) = λ² − (tr A)λ + det A = λ² − 8λ + 7 = (λ − 1)(λ − 7).

The eigenvectors with eigenvalue λ = 1 satisfy the system Av = v, namely

    (A − I)v = [4 2; 4 2][x; y] = [0; 0]  ⟹  4x + 2y = 0  ⟹  y = −2x.

This means that every eigenvector with eigenvalue λ = 1 must have the form

    v = [x; −2x] = x[1; −2],  x ≠ 0.

Similarly, the eigenvectors with eigenvalue λ = 7 are solutions of Av = 7v, so

    (A − 7I)v = [−2 2; 4 −4][x; y] = [0; 0]  ⟹  −2x + 2y = 0  ⟹  y = x

and every eigenvector with eigenvalue λ = 7 must have the form

    v = [x; x] = x[1; 1],  x ≠ 0.

2. Is the following matrix diagonalisable? Why or why not?

    A = [4 1; −1 2].

Since tr A = 6 and det A = 9, the characteristic polynomial of A is

    f(λ) = λ² − (tr A)λ + det A = λ² − 6λ + 9 = (λ − 3)²,

so the only eigenvalue is λ = 3. The eigenvectors satisfy the system (A − 3I)v = 0, namely

    [1 1; −1 −1][x; y] = [0; 0]  ⟹  x + y = 0  ⟹  v = [x; −x] = x[1; −1],  x ≠ 0.

Since A has only one linearly independent eigenvector, it is not diagonalisable.
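The two eigen-computations above are easy to confirm numerically. The sketch below is pure Python (the helper `matvec` is my own, not from the text); it checks that Av = λv holds for the claimed eigenpairs:

```python
# Numeric spot-check of Problems 1 and 2 in pure Python.
# The helper `matvec` is mine; the matrices are the ones from the solutions.

def matvec(A, v):
    """Multiply a 2x2 matrix (given as a list of rows) by a 2-vector."""
    return [A[0][0]*v[0] + A[0][1]*v[1], A[1][0]*v[0] + A[1][1]*v[1]]

# Problem 1: A = [5 2; 4 3] has eigenpairs (1, [1, -2]) and (7, [1, 1]).
A = [[5, 2], [4, 3]]
assert matvec(A, [1, -2]) == [1, -2]   # Av = 1*v
assert matvec(A, [1, 1]) == [7, 7]     # Av = 7*v

# Problem 2: A = [4 1; -1 2] has tr = 6 and det = 9, so f(lam) = (lam - 3)^2,
# and [1, -1] is (up to scaling) the only eigenvector.
B = [[4, 1], [-1, 2]]
assert B[0][0] + B[1][1] == 6
assert B[0][0]*B[1][1] - B[0][1]*B[1][0] == 9
assert matvec(B, [1, -1]) == [3, -3]   # Bv = 3*v
```

Running the block silently passes all assertions, which is exactly the Av = λv test the solutions perform by hand.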

3. Find a matrix A that has v₁ as an eigenvector with eigenvalue λ₁ = 2 and v₂ as an eigenvector with eigenvalue λ₂ = 5 when

    v₁ = [2; 1],  v₂ = [1; −1].

If B is the matrix whose columns are v₁ and v₂, then the general theory implies that

    B⁻¹AB = [λ₁ 0; 0 λ₂] = [2 0; 0 5].

Once we now solve this equation for A, we may conclude that

    A = [2 1; 1 −1][2 0; 0 5][2 1; 1 −1]⁻¹ = (1/3)[4 5; 2 −5][1 1; 1 −2] = [3 −2; −1 4].

4. Two square matrices A, C are said to be similar, if C = B⁻¹AB for some invertible matrix B. Show that similar matrices have the same characteristic polynomial and also the same eigenvalues. Hint: one has C − λI = B⁻¹(A − λI)B.

Using the identity in the hint and properties of the determinant, we get

    det(C − λI) = det B⁻¹ · det(A − λI) · det B = det(A − λI).

This shows that A, C have the same characteristic polynomial. The eigenvalues are merely the roots of this polynomial, so A, C have the same eigenvalues as well.
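Both solutions can be spot-checked in the same way. The snippet below (the helpers `matvec`, `matmul`, `charpoly` are mine, and the conjugating matrix B in the second check is an arbitrary choice of my own) verifies that the constructed A has the required eigenpairs, and that a similar matrix B⁻¹AB has the same characteristic polynomial:

```python
# Checks for Problem 3 (the constructed A) and Problem 4 (similarity
# preserves the characteristic polynomial), in pure Python.

def matvec(A, v):
    return [sum(A[i][j]*v[j] for j in range(2)) for i in range(2)]

def matmul(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def charpoly(A, lam):
    """Evaluate det(A - lam*I) for a 2x2 matrix A."""
    return (A[0][0] - lam)*(A[1][1] - lam) - A[0][1]*A[1][0]

# Problem 3: A should satisfy A v1 = 2 v1 and A v2 = 5 v2.
A = [[3, -2], [-1, 4]]
assert matvec(A, [2, 1]) == [4, 2]      # 2 * v1
assert matvec(A, [1, -1]) == [5, -5]    # 5 * v2

# Problem 4: C = B^{-1} A B has the same characteristic polynomial as A.
# B = [[1, 1], [0, 1]] is an arbitrary invertible example with integer inverse.
B, Binv = [[1, 1], [0, 1]], [[1, -1], [0, 1]]
C = matmul(matmul(Binv, A), B)
for lam in range(-3, 8):
    assert charpoly(C, lam) == charpoly(A, lam)
```

Two monic quadratics agreeing at more than two points are identical, so checking a range of λ values suffices here.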

Linear algebra II Tutorial solutions #2

1. Find a basis for both the null space and the column space of the matrix

    A = [1 1 3 4; 2 0 2 6; 1 1 3 4].

The reduced row echelon form of A is easily found to be

    R = [1 0 1 3; 0 1 2 1; 0 0 0 0].

Since the pivots appear in the first and the second columns, this implies that

    C(A) = Span{Ae₁, Ae₂} = Span{[1; 2; 1], [1; 0; 1]}.

The null space of A is the same as the null space of R. It can be expressed in the form

    N(A) = {[−x₃ − 3x₄; −2x₃ − x₄; x₃; x₄] : x₃, x₄ ∈ R} = Span{[−1; −2; 1; 0], [−3; −1; 0; 1]}.

2. Find the eigenvalues and the generalised eigenvectors of the matrix

    A = [4 −6 3; 0 −1 4; 1 −2 2].

The eigenvalues of A are the roots of the characteristic polynomial

    f(λ) = det(A − λI) = −λ³ + 5λ² − 7λ + 3 = (3 − λ)(λ − 1)².

When it comes to the eigenvalue λ = 3, one can easily check that

    N(A − 3I) = Span{[3; 1; 1]},  N(A − 3I)² = N(A − 3I).
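As a sanity check on Problem 1 and the first part of Problem 2, the snippet below (plain Python; the helper `matvec` is mine) confirms that the null-space basis vectors are killed by A, that the claimed column-space vectors are the first two columns, and that the stated eigenvalue data holds:

```python
# Sanity checks for Problems 1 and 2, using plain Python lists as matrices.

def matvec(A, v):
    return [sum(row[j]*v[j] for j in range(len(v))) for row in A]

# Problem 1: both null-space basis vectors are killed by A,
# and the pivot columns really are [1, 2, 1] and [1, 0, 1].
A = [[1, 1, 3, 4], [2, 0, 2, 6], [1, 1, 3, 4]]
assert matvec(A, [-1, -2, 1, 0]) == [0, 0, 0]
assert matvec(A, [-3, -1, 0, 1]) == [0, 0, 0]
assert [row[0] for row in A] == [1, 2, 1]
assert [row[1] for row in A] == [1, 0, 1]

# Problem 2: [3, 1, 1] is an eigenvector of A2 with eigenvalue 3, and the
# characteristic polynomial -lam^3 + 5*lam^2 - 7*lam + 3 vanishes at lam = 1, 3.
A2 = [[4, -6, 3], [0, -1, 4], [1, -2, 2]]
assert matvec(A2, [3, 1, 1]) == [9, 3, 3]    # = 3 * [3, 1, 1]
f = lambda lam: -lam**3 + 5*lam**2 - 7*lam + 3
assert f(1) == 0 and f(3) == 0
```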

This implies that N(A − 3I)^j = N(A − 3I) for all j ≥ 1, so we have found all generalised eigenvectors with λ = 3. When it comes to the eigenvalue λ = 1, one similarly has

    N(A − I) = Span{[3; 2; 1]},  N(A − I)² = N(A − I)³ = Span{[1; 1; 0], [1; 0; 1]}.

In view of the general theory, we must thus have N(A − I)^j = N(A − I)² for all j ≥ 2.

3. Find a relation between x, y, z such that the following matrix is diagonalisable.

    A = [1 x y; 0 2 z; 0 0 1].

Since A is upper triangular, its eigenvalues are its diagonal entries λ = 1, 1, 2. Let us now worry about the eigenvectors. When λ = 2, we can use row reduction to get

    A − 2I = [−1 x y; 0 0 z; 0 0 −1] → [−1 x 0; 0 0 −1; 0 0 0].

This gives 2 pivots and 1 linearly independent eigenvector. When λ = 1, we get

    A − I = [0 x y; 0 1 z; 0 0 0] → [0 1 z; 0 0 y − xz; 0 0 0].

This gives 2 pivots and 1 linearly independent eigenvector in the case that y ≠ xz, but it gives 1 pivot and 2 linearly independent eigenvectors in the case that y = xz. Thus, A is diagonalisable if and only if y = xz.

4. Suppose that λ is an eigenvalue of a square matrix A and let j be a positive integer. Show that the null space of (A − λI)^j is an A-invariant subspace of Cⁿ.

Assuming that v is in the null space of (A − λI)^j, one can easily check that

    (A − λI)^j Av = (A − λI)^j (A − λI + λI)v = (A − λI)^(j+1) v + λ(A − λI)^j v = 0.

This means that Av is also in the null space, so the null space is A-invariant.
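Problem 3's criterion can also be tested through the minimal polynomial: a matrix with eigenvalues 1, 1, 2 is diagonalisable exactly when (t − 1)(t − 2) annihilates it, and for the matrix above the product (A − I)(A − 2I) works out to have xz − y as its only potentially nonzero entry. A sketch, where the sample values of x, y, z are my own choices:

```python
# Problem 3 check: A (eigenvalues 1, 1, 2) is diagonalisable exactly when
# (A - I)(A - 2I) is the zero matrix, which happens exactly when y = xz.

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k]*Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def sub(X, Y):
    return [[X[i][j] - Y[i][j] for j in range(3)] for i in range(3)]

def make_A(x, y, z):
    return [[1, x, y], [0, 2, z], [0, 0, 1]]

I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
twoI = [[2, 0, 0], [0, 2, 0], [0, 0, 2]]
ZERO = [[0, 0, 0], [0, 0, 0], [0, 0, 0]]

# y = xz: the product vanishes, so A is diagonalisable.
A = make_A(2, 6, 3)          # y = 6 = 2*3 = xz
assert matmul(sub(A, I), sub(A, twoI)) == ZERO

# y != xz: the product is nonzero, so A is not diagonalisable.
A = make_A(2, 5, 3)          # y = 5 != 6
assert matmul(sub(A, I), sub(A, twoI)) != ZERO
```

This is the same dichotomy the row reduction detects: the entry y − xz is precisely the obstruction.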

Linear algebra II Tutorial solutions #3

1. Find the Jordan form and a Jordan basis for the matrix

    A = [−1 1 2; −7 5 3; −5 1 6].

The characteristic polynomial of the given matrix is

    f(λ) = det(A − λI) = −λ³ + 10λ² − 33λ + 36 = (4 − λ)(λ − 3)².

Thus, the eigenvalues are λ = 3, 3, 4 and one can easily find the null spaces

    N(A − 3I) = Span{[1; 2; 1]},  N(A − 4I) = Span{[1; 1; 2]}.

This implies that A is not diagonalisable and that its Jordan form is

    J = B⁻¹AB = [3 1 0; 0 3 0; 0 0 4].

To find a Jordan basis, we need to find a Jordan chain v₁, v₂ with eigenvalue λ = 3 and an eigenvector v₃ with eigenvalue λ = 4. In our case, we have

    N(A − 3I)² = Span{[1; 0; 1], [0; 1; 0]},

so it easily follows that a Jordan basis is provided by the vectors

    v₁ = [0; 1; 0],  v₂ = (A − 3I)v₁ = [1; 2; 1],  v₃ = [1; 1; 2].

2. Suppose that A is a 4×4 matrix whose column space is equal to its null space. Show that A² must be the zero matrix and find the Jordan form of A.

To show that A² is the zero matrix, we note that its columns are all zero, as

    Ae_i ∈ C(A) = N(A)  ⟹  A(Ae_i) = 0  ⟹  A²e_i = 0.
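The Jordan chain in Problem 1 can be verified directly: the sketch below (pure Python, helper `matvec` is mine) checks that (A − 3I)v₁ is the eigenvector [1; 2; 1], that this eigenvector is killed by A − 3I, and that v₃ is an eigenvector for λ = 4:

```python
# Check of Problem 1: the claimed Jordan-basis vectors satisfy
# (A - 3I)v1 = v2, (A - 3I)v2 = 0 and (A - 4I)v3 = 0.

def matvec(A, v):
    return [sum(A[i][j]*v[j] for j in range(3)) for i in range(3)]

A = [[-1, 1, 2], [-7, 5, 3], [-5, 1, 6]]
shift3 = [[A[i][j] - 3*(i == j) for j in range(3)] for i in range(3)]  # A - 3I
shift4 = [[A[i][j] - 4*(i == j) for j in range(3)] for i in range(3)]  # A - 4I

v1, v3 = [0, 1, 0], [1, 1, 2]
v2 = matvec(shift3, v1)
assert v2 == [1, 2, 1]                  # (A - 3I)v1 is the eigenvector [1, 2, 1]
assert matvec(shift3, v2) == [0, 0, 0]  # v2 lies in N(A - 3I)
assert matvec(shift4, v3) == [0, 0, 0]  # v3 is an eigenvector for lam = 4
```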

To find the Jordan form of A, we note that the null space of A is two-dimensional since

    dim C(A) + dim N(A) = 4  ⟹  2 dim N(A) = 4  ⟹  dim N(A) = 2.

Also, the null space of A² is four-dimensional because A² is the zero matrix by above. This gives a Jordan chain diagram with two chains of length 2 for the eigenvalue λ = 0, so the Jordan form is

    J = [0 1 0 0; 0 0 0 0; 0 0 0 1; 0 0 0 0].

3. Suppose that A is a 4×4 matrix whose only eigenvalue is λ = 1. Suppose also that the column space of A − I is one-dimensional. Find the Jordan form of A.

The eigenvalue λ = 1 has multiplicity 4 and the number of Jordan chains is

    dim N(A − I) = 4 − dim C(A − I) = 3.

In particular, the Jordan chain diagram consists of one chain of length 2 and two chains of length 1, and the Jordan form is

    J = [1 1 0 0; 0 1 0 0; 0 0 1 0; 0 0 0 1].

4. Suppose that v₁, v₂, ..., v_k form a basis for the null space of a square matrix A and that C = B⁻¹AB for some invertible matrix B. Find a basis for the null space of C.

We note that a vector v lies in the null space of C = B⁻¹AB if and only if

    B⁻¹ABv = 0  ⟺  ABv = 0  ⟺  Bv ∈ N(A)  ⟺  Bv = c₁v₁ + ... + c_k v_k

for some scalar coefficients c_i. In particular, the vectors B⁻¹v_i span the null space of C. To show that these vectors are also linearly independent, we note that

    c₁B⁻¹v₁ + ... + c_k B⁻¹v_k = 0  ⟹  c₁v₁ + ... + c_k v_k = 0  ⟹  c_i = 0 for all i.

Thus the vectors B⁻¹v₁, ..., B⁻¹v_k form a basis for the null space of C.
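Problem 2's conclusion is easy to illustrate with a concrete matrix. The example below is my own choice, not from the text: A maps e₃ → e₁, e₄ → e₂ and kills e₁, e₂, so C(A) = N(A) = span{e₁, e₂}, and the argument above then forces A² = 0.

```python
# Illustration of Problem 2: a 4x4 matrix with C(A) = N(A) = span{e1, e2}
# (a hypothetical example of my own) necessarily squares to zero.

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k]*Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

A = [[0, 0, 1, 0],
     [0, 0, 0, 1],
     [0, 0, 0, 0],
     [0, 0, 0, 0]]

A2 = matmul(A, A)
assert A2 == [[0]*4 for _ in range(4)]   # A squared is the zero matrix
```

This A is already in the Jordan form found above, up to reordering the basis into the two chains (e₁, e₃) and (e₂, e₄).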