Math 471 Spring 2011 Final Exam


Math 471 - Spring 2011 Final Exam

Instructions: The following exam consists of three problems, each with multiple parts. There are 150 points available on the exam. The highest possible score is 125. Your explanations should use complete English sentences. For full credit on a problem, please completely justify your answer, e.g., be sure to include all necessary computations (when in doubt, err on the side of more rather than fewer details).

The problems

1. (50 pts.) Let $A$ be the $3 \times 3$-matrix
\[
A := \begin{pmatrix} -1 & 1 & 0 \\ -4 & 3 & 0 \\ -5 & 1 & 2 \end{pmatrix}.
\]

i) (15 pts.) Determine the characteristic polynomial of $A$, the eigenvalues of $A$, and their algebraic multiplicities.

Solution. It is easiest to compute the determinant of $A - \lambda \mathrm{Id}_3$ by expanding along the last column. In that case, there is only one term that contributes, and we see
\[
\det(A - \lambda \mathrm{Id}_3) = (2 - \lambda)\bigl((-1 - \lambda)(3 - \lambda) - (-4) \cdot 1\bigr) = (2 - \lambda)(-3 - 3\lambda + \lambda + \lambda^2 + 4) = (2 - \lambda)(\lambda^2 - 2\lambda + 1) = (2 - \lambda)(1 - \lambda)^2.
\]
In other words, $A$ has eigenvalues $2$ (with algebraic multiplicity $1$) and $1$ (with algebraic multiplicity $2$).

ii) (15 pts.) Determine the geometric multiplicity of each eigenvalue, and use this to describe the Jordan normal form of $A$ and also its Jordan decomposition.

Solution. Now that we know the eigenvalues, to determine the geometric multiplicities we must find the dimensions of the corresponding eigenspaces. Since $2$ appears with algebraic multiplicity $1$, it follows that $2$ has geometric multiplicity $1$. Therefore, we just have to compute the geometric multiplicity of the eigenvalue $1$. To do this, we need to solve the equation $(A - \mathrm{Id}_3)x = 0$, i.e.,
\[
\begin{pmatrix} -2 & 1 & 0 \\ -4 & 2 & 0 \\ -5 & 1 & 1 \end{pmatrix} x = 0.
\]
Dividing the second equation by $2$, we see that the first and second rows are equal, and therefore by subtracting the second row from the first we get a row of zeros. By explicitly solving the remaining equations, we get $x_2 = 2x_1$ and $x_3 = 3x_1$. In other words, there is, up to scaling, only one eigenvector with eigenvalue $1$, namely the vector $x_1(1, 2, 3)^t$. Given this information, we know the Jordan normal form $J_A$ of $A$:
\[
J_A := \begin{pmatrix} 1 & 1 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 2 \end{pmatrix}.
\]

iii) (10 pts.) Find a matrix $P$ such that $P A P^{-1}$ is in Jordan normal form.

Solution. Remember that to put $A$ in Jordan normal form, we need a basis of $\mathbb{C}^3$ consisting of generalized eigenvectors (and these generalized eigenvectors need to be cyclic vectors). We know that the eigenvalue $2$ has a unique up-to-scaling eigenvector (which we will compute shortly), and $1$ has a unique up-to-scaling eigenvector (which we computed in the last step). Since the algebraic and geometric multiplicities of $1$ do not agree, we need to find a single generalized eigenvector corresponding to the eigenvalue $1$.

Let us first compute the eigenvector corresponding to the eigenvalue $2$; we thus need to solve the equation $(A - 2\mathrm{Id}_3)x = 0$. In other words, we consider the equation
\[
\begin{pmatrix} -3 & 1 & 0 \\ -4 & 1 & 0 \\ -5 & 1 & 0 \end{pmatrix} x = 0,
\]
which has solutions $x_3(0, 0, 1)^t$.

Given the Jordan form, the generalized eigenvector we are seeking must, first of all, be a solution of the equation $(A - \mathrm{Id}_3)^2 x = 0$. Computing the square of the matrix $A - \mathrm{Id}_3$ gives the matrix
\[
\begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 1 & -2 & 1 \end{pmatrix}.
\]
Solving the equation $(A - \mathrm{Id}_3)^2 x = 0$ therefore amounts to the single equation $x_1 - 2x_2 + x_3 = 0$, which has the solutions $x_1(1, 0, -1)^t + x_2(0, 1, 2)^t$. However, we also want the solution to be a cyclic vector, i.e., $(A - \mathrm{Id}_3)x$ should be the eigenvector corresponding to
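As a quick numerical sanity check (not part of the exam solution), the eigenvector and generalized-eigenvector claims above can be verified directly; the helper `mat_vec` below is a name introduced here for illustration:

```python
# Sanity check (illustration only): verify the eigenvectors and the
# generalized eigenvector found above for the matrix A of Problem 1.

def mat_vec(M, v):
    """Multiply a matrix (list of rows) by a column vector (list)."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

A = [[-1, 1, 0],
     [-4, 3, 0],
     [-5, 1, 2]]

# (1, 2, 3)^t is an eigenvector with eigenvalue 1: A v = v.
assert mat_vec(A, [1, 2, 3]) == [1, 2, 3]

# (0, 0, 1)^t is an eigenvector with eigenvalue 2: A v = 2 v.
assert mat_vec(A, [0, 0, 1]) == [0, 0, 2]

# (0, 1, 2)^t is a cyclic generalized eigenvector for eigenvalue 1:
# (A - Id) w is the eigenvector (1, 2, 3)^t, and (A - Id)^2 w = 0.
A_minus_I = [[-2, 1, 0],
             [-4, 2, 0],
             [-5, 1, 1]]
w = [0, 1, 2]
assert mat_vec(A_minus_I, w) == [1, 2, 3]
assert mat_vec(A_minus_I, mat_vec(A_minus_I, w)) == [0, 0, 0]

print("all eigenvector checks passed")
```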

the eigenvalue $1$. By explicitly computing $(A - \mathrm{Id}_3)(1, 0, -1)^t$ and $(A - \mathrm{Id}_3)(0, 1, 2)^t$, we see that the latter is precisely $(1, 2, 3)^t$. Define $B$ to be the matrix whose columns are the vectors we have just found, i.e.,
\[
B := \begin{pmatrix} 1 & 0 & 0 \\ 2 & 1 & 0 \\ 3 & 2 & 1 \end{pmatrix}.
\]
Since $B$ is lower triangular and has $1$s along the diagonal, its inverse is necessarily lower triangular and has $1$s along the diagonal. Thus, by matrix multiplication we see that
\[
B^{-1} = \begin{pmatrix} 1 & 0 & 0 \\ -2 & 1 & 0 \\ 1 & -2 & 1 \end{pmatrix},
\]
and by multiplying out the matrices, you see that $B J_A B^{-1} = A$. In other words, $J_A = B^{-1} A B$ is in Jordan normal form, so we may take $P = B^{-1}$.

iv) (10 pts.) Write a closed-form expression for $e^A$.

Solution. We know that $e^A := \sum_{n \ge 0} \frac{A^n}{n!}$. Since $A = B J_A B^{-1}$, we also know that $e^A = B e^{J_A} B^{-1}$. On the other hand, we know that $J_A = D + N$, where $D$ is the diagonal matrix $\operatorname{diag}(1, 1, 2)$ and $N$ is the matrix unit $e_{12}$. Since $D$ and $N$ commute, one can show that $e^{J_A} = e^{D+N} = e^D e^N$; since $N^2 = 0$, it follows that $e^N = \mathrm{Id} + N$.

If you didn't see this, here is a more direct way to proceed. We know that $e_{12}^2 = 0$. Therefore, suppose we want to expand $(D + N)^i$; we can do this by means of the binomial formula, since $D$ and $N$ commute. The first term is $D^i$, the second term is $i D^{i-1} N$, and all other terms contain powers of $N^2$ and are therefore $0$. Therefore
\[
e^{J_A} = \sum_{n \ge 0} \frac{D^n + n D^{n-1} N}{n!} = \sum_{n \ge 0} \frac{D^n}{n!} + N \sum_{n \ge 1} \frac{D^{n-1}}{(n-1)!} = e^D + N e^D.
\]
In other words,
\[
e^{J_A} = \begin{pmatrix} e & e & 0 \\ 0 & e & 0 \\ 0 & 0 & e^2 \end{pmatrix}.
\]
To compute $e^A$, we then need to multiply on the left by $B$ and on the right by $B^{-1}$; we therefore get
\[
e^A = \begin{pmatrix} -e & e & 0 \\ -4e & 3e & 0 \\ -7e + e^2 & 5e - 2e^2 & e^2 \end{pmatrix}.
\]
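The closed form for $e^A$ can be cross-checked against a truncated power series $\sum_n A^n/n!$. The sketch below uses plain Python; the helper `mat_mul` and the 30-term cutoff are choices made here for illustration, not part of the exam:

```python
# Sanity check (illustration only): confirm B J_A B^{-1} = A, and compare the
# closed form of e^A against a truncated power series sum_n A^n / n!.
import math

def mat_mul(X, Y):
    """Multiply two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[-1, 1, 0], [-4, 3, 0], [-5, 1, 2]]
B = [[1, 0, 0], [2, 1, 0], [3, 2, 1]]
B_inv = [[1, 0, 0], [-2, 1, 0], [1, -2, 1]]
J = [[1, 1, 0], [0, 1, 0], [0, 0, 2]]

# The change-of-basis claim: B J_A B^{-1} = A (exact integer arithmetic).
assert mat_mul(mat_mul(B, J), B_inv) == A

# Truncated exponential series: 30 terms is far beyond convergence here.
expA = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]  # A^0/0!
term = [row[:] for row in expA]
for n in range(1, 30):
    term = [[x / n for x in row] for row in mat_mul(term, A)]  # A^n / n!
    expA = [[expA[i][j] + term[i][j] for j in range(3)] for i in range(3)]

e = math.e
closed_form = [[-e, e, 0],
               [-4 * e, 3 * e, 0],
               [-7 * e + e**2, 5 * e - 2 * e**2, e**2]]
for i in range(3):
    for j in range(3):
        assert abs(expA[i][j] - closed_form[i][j]) < 1e-9

print("e^A matches the closed form")
```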

2. (50 pts.) Show that every $n \times n$-matrix is similar to its transpose by following the steps indicated below.

i) (25 pts.) First, suppose $A$ consists of a single Jordan block. Write down the transpose, and find an explicit matrix $P$ such that $P A P^{-1} = A^T$ (hint: guess the matrix in the $2 \times 2$ case and the $3 \times 3$ case and try to determine a pattern; additional hint: the matrix $P$ will have the property that $P^{-1} = P$, and all its entries will be either zeros or ones).

Solution. A single Jordan block is of the form
\[
J := \begin{pmatrix} \lambda & 1 & & \\ & \lambda & \ddots & \\ & & \ddots & 1 \\ & & & \lambda \end{pmatrix},
\]
i.e., it has $\lambda$ along the diagonal and ones along the superdiagonal. Its transpose $J^T$ therefore has $\lambda$ along the diagonal and ones along the subdiagonal. To get from $J$ to $J^T$, we think in terms of multiplication by elementary matrices: left multiplication corresponds to row operations, and right multiplication corresponds to column operations. To obtain $J^T$ from $J$, you can reverse the order of the rows and then reverse the order of the columns. To accomplish this in terms of matrices, let $P$ be the matrix with $1$s along the anti-diagonal and zeros elsewhere. Said differently, $P$ is the matrix with the property that $P_{ij} = 1$ if $i + j = n + 1$, and $P_{ij} = 0$ otherwise. Left multiplication by $P$ exactly reverses the order of the rows, and right multiplication by $P$ reverses the order of the columns. In other words, $J^T = P J P$; since $P^2 = \mathrm{Id}$, i.e., $P^{-1} = P$, this says exactly that $J$ and $J^T$ are similar.

ii) (15 pts.) Using what you learned in part (i), show that any matrix in Jordan normal form is similar to its transpose.

Solution. Any matrix $L$ in Jordan normal form consists of a collection of Jordan blocks $J_1, \ldots, J_r$. For each Jordan block $J_i$, take a matrix $P_i$ as in part (i) of size equal to the size of the corresponding Jordan block. Let $P$ be the block-diagonal matrix with diagonal blocks $P_i$. The transpose of the matrix $L$ is the block-diagonal matrix with diagonal blocks $J_1^T, \ldots, J_r^T$. By what we learned in part (i) and the formula for block multiplication, we conclude that $P L P = L^T$.
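The identity $J^T = P J P$ for a single Jordan block, together with $P^{-1} = P$, can be checked mechanically for small block sizes; the helper names below (`jordan_block`, `exchange`) are introduced here for illustration:

```python
# Sanity check (illustration only): for a single Jordan block J, conjugating
# by the exchange matrix P (ones on the anti-diagonal) yields J^T.

def jordan_block(lam, n):
    """n x n Jordan block: lam on the diagonal, ones on the superdiagonal."""
    return [[lam if i == j else (1 if j == i + 1 else 0) for j in range(n)]
            for i in range(n)]

def exchange(n):
    """Anti-diagonal permutation matrix: P_ij = 1 iff i + j = n + 1
    (1-based indexing), i.e. i + j = n - 1 in 0-based indexing."""
    return [[1 if i + j == n - 1 else 0 for j in range(n)] for i in range(n)]

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(M):
    return [list(col) for col in zip(*M)]

for n in range(1, 6):
    J = jordan_block(5, n)
    P = exchange(n)
    identity = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
    assert mat_mul(P, P) == identity          # P^2 = Id, so P^{-1} = P
    assert mat_mul(mat_mul(P, J), P) == transpose(J)  # P J P = J^T

print("P J P = J^T verified for block sizes 1 through 5")
```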
iii) (1 pts.)show that any matrix is similar to its transpose by using Jordan normal form. Solution. Suppose A is any matrix, then we know that there exists a matrix Q such that QAQ 1 is in Jordan normal form J A, or equivalently, A = QJ A Q 1. Therefore using the relationship between transpose and matrix multiplication A T = (Q 1 ) T JA T QT. By part (ii), we can write JA T = P J AP, i.e., A T = Q 1 T P JA P Q T. To simplify notation, observe that tranposing the equation QQ 1 = Id n gives the relationship Q 1 T Q T = Id n, i.e., Q 1 T = Q T 1. This gives the formula P Q T A T Q T 1 P = J A.,

5 Multiplying both sides on the left by Q and on the right by Q 1 gives the required similarity. 3. (5 pts.) An n n-matrix with complex entries A is called skew-hermitian if A = A. i) (15 pts.) Explain why skew-hermitian matrices are unitarily diagonalizable. Solution. If A = A, then AA = A 2 = A A, i.e., A is a normal matrix, and hence is unitarily diagonalizable. ii) (3 pts.) Show that a skew-hermitian matrix has purely imaginary eigenvalues. (Hint: a number λ is purely imaginary if λ = λ. Additional hint: think about how we proved that the eigenvalues of a Hermitian matrix are real.) Solution. Suppose λ is an eigenvalue of A, and let x be an eigenvector with this eigenvalue. According to the hint, we would like to show that λ = λ. Now, x is non-zero, and therefore, x, x is non-zero, where here the angle brackets denote the standard sesquilinear form on C n. Now, observe the following Ax, x = λx, x = λ x, x, where the last equality stems from the sesquilinearity of the form. On the other hand for any matrix A, we have Ax, x = x, A x. Thus, if A is skew-hermitian, we can rewrite this as x, Ax = x, λx = λx, x. Tracing through the various equalities, one gets λ x, x = λ x, x, and dividing both sides by x, x gives the required result. iii) (5 pts.) What can you say about the eigenvalues of a real skew-symmetric matrix? Solution. Real skew-symmetric matrices are themselves skew-hermitian and therefore have purely imaginary eigenvalues.