EXAM. Exam #3. Math 2360, Spring 2001. April 24, 2001. ANSWERS

40 pts. Problem 1.

In this problem we work in the vector space P_3 = { ax^2 + bx + c : a, b, c ∈ R }, the space of polynomials of degree less than 3. Let P be the ordered basis P = [x^2, x, 1] of P_3 and let Q be the ordered basis

    Q = [2x^2 + x + 1, x^2 + 2x + 1, x^2 + x + 1]

of P_3.

A. Find the change of basis matrices S_PQ and S_QP.

The defining equation of the change of basis matrix S_PQ is Q = P S_PQ. Reading off the coefficients, we have

    [2x^2 + x + 1, x^2 + 2x + 1, x^2 + x + 1] = [x^2, x, 1] \begin{bmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 1 \end{bmatrix}

(a matrix multiplication, with Q on the left and P on the right), so

    S_PQ = \begin{bmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 1 \end{bmatrix}.    (1)

To find S_QP, we use the fact that S_QP = S_PQ^{-1}. Thus,

    S_QP = \begin{bmatrix} 1 & 0 & -1 \\ 0 & 1 & -1 \\ -1 & -1 & 3 \end{bmatrix},

where we have used a calculator to find the inverse of the matrix in (1).

B. Let p(x) = 2x^2 - x + 1. Find [p(x)]_Q, the coordinate vector of p(x) with respect to the ordered basis Q. Show how p(x) can be written as a linear combination of the elements of Q.

First we find [p(x)]_P, since that's easy. The defining equation of [p(x)]_P is p(x) = P [p(x)]_P. We have

    p(x) = 2x^2 - x + 1 = [x^2, x, 1] \begin{bmatrix} 2 \\ -1 \\ 1 \end{bmatrix},

and so [p(x)]_P = (2, -1, 1)^T. To find [p(x)]_Q, we use the change of coordinates equation [p(x)]_Q = S_QP [p(x)]_P, so

    [p(x)]_Q = \begin{bmatrix} 1 & 0 & -1 \\ 0 & 1 & -1 \\ -1 & -1 & 3 \end{bmatrix} \begin{bmatrix} 2 \\ -1 \\ 1 \end{bmatrix} = \begin{bmatrix} 1 \\ -2 \\ 2 \end{bmatrix},

using our value of S_QP from above. The defining equation of [p(x)]_Q is p(x) = Q [p(x)]_Q. Thus, we have

    p(x) = Q [p(x)]_Q = [2x^2 + x + 1, x^2 + 2x + 1, x^2 + x + 1] \begin{bmatrix} 1 \\ -2 \\ 2 \end{bmatrix}
         = (2x^2 + x + 1) - 2(x^2 + 2x + 1) + 2(x^2 + x + 1).

Thus, we can write p(x) as a linear combination of the elements of Q as follows:

    p(x) = (2x^2 + x + 1) - 2(x^2 + 2x + 1) + 2(x^2 + x + 1).
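The arithmetic above is easy to check by machine. The following sketch is an addition to the original answer key (it assumes Python with SymPy is available) and recomputes S_QP and [p(x)]_Q from S_PQ.

from sympy import Matrix

# Change of basis matrix read off from Q = P * S_PQ
S_PQ = Matrix([[2, 1, 1],
               [1, 2, 1],
               [1, 1, 1]])

# S_QP is the inverse of S_PQ
S_QP = S_PQ.inv()
print(S_QP)            # Matrix([[1, 0, -1], [0, 1, -1], [-1, -1, 3]])

# Coordinates of p(x) = 2x^2 - x + 1 relative to P, then relative to Q
p_P = Matrix([2, -1, 1])
print(S_QP * p_P)      # Matrix([[1], [-2], [2]])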

40 pts. Problem 2.

In this problem we again work in the vector space P_3 = { ax^2 + bx + c : a, b, c ∈ R }, the space of polynomials of degree less than 3, and P = [x^2, x, 1] is the ordered basis of P_3 used above. Let A be the matrix

    A = 3 0, 2 2 8

which is invertible. Let Q be the ordered basis of P_3 defined by Q = PA. Let T : P_3 → P_3 be the linear transformation defined by T(p(x)) = p'(x) + p(x).

A. Find [T]_PP, the matrix of T with respect to the ordered basis P.

The defining equation of [T]_PP is

    T(P) = P [T]_PP,    (2)

where T(P) = [T(x^2), T(x), T(1)]. Since T(p(x)) = p'(x) + p(x), we have

    T(x^2) = (x^2)' + x^2 = 2x + x^2,
    T(x) = (x)' + x = 1 + x,
    T(1) = (1)' + 1 = 0 + 1 = 1,

so T(P) = [x^2 + 2x, x + 1, 1]. Reading off the coefficients, we have

    [x^2 + 2x, x + 1, 1] = [x^2, x, 1] \begin{bmatrix} 1 & 0 & 0 \\ 2 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix}

(a matrix multiplication). Comparing this with equation (2), we see that

    [T]_PP = \begin{bmatrix} 1 & 0 & 0 \\ 2 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix}.
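As a check of Part A (an added sketch, assuming SymPy), we can apply T(p) = p' + p to each basis polynomial and read off its P-coordinates; the columns of the resulting matrix are exactly [T]_PP.

from sympy import symbols, diff, Poly, Matrix

x = symbols('x')
P_basis = [x**2, x, 1]

def T(p):
    # T(p) = p' + p
    return diff(p, x) + p

cols = []
for q in P_basis:
    c = Poly(T(q), x).all_coeffs()        # coefficients from degree 2 down to degree 0
    c = [0] * (3 - len(c)) + list(c)      # pad to length 3
    cols.append(c)

# Each padded coefficient list is a column of [T]_PP.
print(Matrix(cols).T)    # Matrix([[1, 0, 0], [2, 1, 0], [0, 1, 1]])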

B. Find [T]_QQ, the matrix of T with respect to the basis Q.

We will use the change of basis equation for linear transformations, i.e.,

    [T]_QQ = S_QP [T]_PP S_PQ.

Since S_QP = S_PQ^{-1}, we can rewrite this equation as

    [T]_QQ = S_PQ^{-1} [T]_PP S_PQ.    (3)

The defining equation of S_PQ is

    Q = P S_PQ.    (4)

Since the basis Q is defined by Q = PA, where A is given in the problem, comparison with equation (4) shows that S_PQ = A. Hence, plugging into (3) shows that

    [T]_QQ = A^{-1} [T]_PP A.

Using the given value of A and the value of [T]_PP from the previous part of the problem, and a calculator for the matrix computations, we find

    [T]_QQ = 3 7 2 5/2 /2 0 /2 3/2

60 pts. Problem 3.

Let U = [u_1, u_2] be the ordered basis of R^2, where

    u_1 = \begin{bmatrix} 2 \\ 1 \end{bmatrix},  u_2 = \begin{bmatrix} 2 \\ 2 \end{bmatrix}.

Let T : R^2 → R^2 be the linear transformation whose matrix with respect to the standard basis E is

    A = \begin{bmatrix} 1 & 2 \\ 1 & 0 \end{bmatrix},

in other words, T(x) = Ax.

A. Find the change of basis matrices S_EU and S_UE.

The defining equation of S_EU is U = E S_EU. The corresponding matrix equation is mat(U) = mat(E) S_EU = I S_EU = S_EU. Thus, we have

    S_EU = mat(U) = \begin{bmatrix} 2 & 2 \\ 1 & 2 \end{bmatrix}.

We know that S_UE = S_EU^{-1}. Using a calculator for the computation of the inverse, we have

    S_UE = \begin{bmatrix} 1 & -1 \\ -1/2 & 1 \end{bmatrix}.

B. Find [T]_UU, the matrix of T with respect to U.

We are given the matrix of T with respect to the standard basis, so

    [T]_EE = A = \begin{bmatrix} 1 & 2 \\ 1 & 0 \end{bmatrix}.

To find [T]_UU, we use the change of basis equation for linear transformations, i.e.,

    [T]_UU = S_UE [T]_EE S_EU
           = \begin{bmatrix} 1 & -1 \\ -1/2 & 1 \end{bmatrix} \begin{bmatrix} 1 & 2 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} 2 & 2 \\ 1 & 2 \end{bmatrix}
           = \begin{bmatrix} 2 & 4 \\ 0 & -1 \end{bmatrix}.

Thus, the answer to this part of the problem is

    [T]_UU = \begin{bmatrix} 2 & 4 \\ 0 & -1 \end{bmatrix}.

C. Find the scalars c_1 and c_2 such that T(2u_1 - 3u_2) = c_1 u_1 + c_2 u_2.

Let v = 2u_1 - 3u_2. Then, of course,

    v = [u_1, u_2] \begin{bmatrix} 2 \\ -3 \end{bmatrix},

and the defining equation of [v]_U is v = U [v]_U, so we have

    [v]_U = \begin{bmatrix} 2 \\ -3 \end{bmatrix}.

The equation for the action of the linear transformation T in U-coordinates is [T(v)]_U = [T]_UU [v]_U. Thus, we have

    [T(v)]_U = \begin{bmatrix} 2 & 4 \\ 0 & -1 \end{bmatrix} \begin{bmatrix} 2 \\ -3 \end{bmatrix} = \begin{bmatrix} -8 \\ 3 \end{bmatrix}.

The defining equation of [T(v)]_U is T(v) = U [T(v)]_U. So we have

    T(2u_1 - 3u_2) = T(v) = [u_1, u_2] \begin{bmatrix} -8 \\ 3 \end{bmatrix} = -8u_1 + 3u_2.

So the answer is c_1 = -8, c_2 = 3.
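The matrix algebra in this problem can be verified with a short SymPy computation (an added check, not part of the original exam):

from sympy import Matrix

u1, u2 = Matrix([2, 1]), Matrix([2, 2])
S_EU = Matrix.hstack(u1, u2)       # columns are u1 and u2
S_UE = S_EU.inv()

A = Matrix([[1, 2], [1, 0]])       # [T]_EE
T_UU = S_UE * A * S_EU
print(T_UU)                        # Matrix([[2, 4], [0, -1]])

# Part C: U-coordinates of T(2*u1 - 3*u2)
print(T_UU * Matrix([2, -3]))      # Matrix([[-8], [3]]), so c1 = -8 and c2 = 3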

40 pts. Problem 4.

Let

    A = \begin{bmatrix} 1 & 3 \\ 2 & 0 \end{bmatrix}.

Find the characteristic polynomial of A and the eigenvalues of A.

We have

    A - λI = \begin{bmatrix} 1 & 3 \\ 2 & 0 \end{bmatrix} - λ \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} 1-λ & 3 \\ 2 & -λ \end{bmatrix}.

Thus, the characteristic polynomial p(λ) of A is

    p(λ) = det(A - λI) = \begin{vmatrix} 1-λ & 3 \\ 2 & -λ \end{vmatrix} = (1-λ)(-λ) - 6 = -λ + λ^2 - 6,

so

    p(λ) = λ^2 - λ - 6.

The eigenvalues of A are the roots of the characteristic polynomial. The characteristic polynomial factors as p(λ) = (λ - 3)(λ + 2), so the roots are 3 and -2. Thus,

    Eigenvalues of A = -2, 3.
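The same characteristic polynomial and eigenvalues come out of SymPy (an added check; note that SymPy's charpoly uses det(λI - A), which agrees with det(A - λI) here because A is 2 x 2):

from sympy import Matrix, symbols

lam = symbols('lambda')
A = Matrix([[1, 3], [2, 0]])

print(A.charpoly(lam).as_expr())   # lambda**2 - lambda - 6
print(A.eigenvals())               # {3: 1, -2: 1}  (eigenvalue: algebraic multiplicity)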

60 pts. Problem 5.

In each part you are given a matrix A and its eigenvalues. Find a basis for each of the eigenspaces of A. Determine if A is diagonalizable, and if it is, find a matrix P and a diagonal matrix D so that P^{-1}AP = D.

A.

    A = \begin{bmatrix} 1 & 1 & -1 \\ -2 & 2 & 1 \\ -1 & 1 & 1 \end{bmatrix},    Eigenvalues = 1, 2.

Consider first the eigenvalue λ = 1. We have

    A - λI = A - I = \begin{bmatrix} 0 & 1 & -1 \\ -2 & 1 & 1 \\ -1 & 1 & 0 \end{bmatrix}.

The RREF of this matrix is (by calculator)

    R = \begin{bmatrix} 1 & 0 & -1 \\ 0 & 1 & -1 \\ 0 & 0 & 0 \end{bmatrix}.

The eigenspace E_1 is the nullspace of A - I, which is the same as the nullspace of R, i.e., the solution space of Rx = 0. Calling the variables x_1, x_2 and x_3, we see that x_3 is a free variable, say x_3 = α. The second row of R gives the equation x_2 - x_3 = 0, so x_2 = α, and the first row gives the equation x_1 - x_3 = 0, so x_1 = α. Thus, the vectors in the nullspace of R are

    \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} α \\ α \\ α \end{bmatrix} = α \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}.

Thus, a basis of the eigenspace E_1 is

    Basis of E_1 = \left\{ \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} \right\},

so E_1 has dimension one.

Next, consider the eigenvalue λ = 2. We have

    A - λI = A - 2I = \begin{bmatrix} -1 & 1 & -1 \\ -2 & 0 & 1 \\ -1 & 1 & -1 \end{bmatrix},

which has the RREF

    R = \begin{bmatrix} 1 & 0 & -1/2 \\ 0 & 1 & -3/2 \\ 0 & 0 & 0 \end{bmatrix}.

The eigenspace E_2 is the nullspace of A - 2I, which is the same as the nullspace of R. Abbreviating the computation of the nullspace of R, we have

    x_3 = α,
    x_2 - (3/2)x_3 = 0  ⟹  x_2 = (3/2)α,
    x_1 - (1/2)x_3 = 0  ⟹  x_1 = (1/2)α,

so the nullspace is parametrized by

    \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} α/2 \\ 3α/2 \\ α \end{bmatrix} = α \begin{bmatrix} 1/2 \\ 3/2 \\ 1 \end{bmatrix}.

Thus, we have

    Basis of E_2 = \left\{ \begin{bmatrix} 1/2 \\ 3/2 \\ 1 \end{bmatrix} \right\}.

Alternatively, we could multiply this vector by 2 and say

    Basis of E_2 = \left\{ \begin{bmatrix} 1 \\ 3 \\ 2 \end{bmatrix} \right\},

and so avoid the agony of dealing with fractions.

We have only produced two independent eigenvectors, and there are no additional eigenvectors that are independent of these two. Thus, A does not have a basis of eigenvectors, and we conclude that A is not diagonalizable.
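A SymPy check of Part A (an added sketch, not part of the original solution):

from sympy import Matrix, eye

A = Matrix([[1, 1, -1],
            [-2, 2, 1],
            [-1, 1, 1]])

print((A - eye(3)).nullspace())     # [Matrix([[1], [1], [1]])]      basis of E_1
print((A - 2*eye(3)).nullspace())   # [Matrix([[1/2], [3/2], [1]])]  basis of E_2
print(A.is_diagonalizable())        # False: only two independent eigenvectors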

B.

    A = \begin{bmatrix} 10 & -3 & -6 \\ 12 & -5 & -6 \\ 12 & -3 & -8 \end{bmatrix},    Eigenvalues = -2, 1.

First consider the eigenvalue λ = -2. Then we have

    A - λI = A + 2I = \begin{bmatrix} 12 & -3 & -6 \\ 12 & -3 & -6 \\ 12 & -3 & -6 \end{bmatrix},

which has the RREF

    R = \begin{bmatrix} 1 & -1/4 & -1/2 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}.

For the abbreviated computation of the nullspace we have

    x_2 = α,  x_3 = β,
    x_1 - (1/4)x_2 - (1/2)x_3 = 0  ⟹  x_1 = (1/4)α + (1/2)β.

The parametrization of the nullspace is

    \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} α/4 + β/2 \\ α \\ β \end{bmatrix} = α \begin{bmatrix} 1/4 \\ 1 \\ 0 \end{bmatrix} + β \begin{bmatrix} 1/2 \\ 0 \\ 1 \end{bmatrix}.

Thus, we get

    Basis of E_{-2} = \left\{ \begin{bmatrix} 1/4 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 1/2 \\ 0 \\ 1 \end{bmatrix} \right\}.    (5)

As a matter of personal taste, I'll get rid of the fractions by multiplying the first vector by 4 and the second by 2. That gives me

    Basis of E_{-2} = \left\{ \begin{bmatrix} 1 \\ 4 \\ 0 \end{bmatrix}, \begin{bmatrix} 1 \\ 0 \\ 2 \end{bmatrix} \right\}.    (6)

We see that E_{-2} has dimension 2.

Next, consider the eigenvalue λ = 1. We have

    A - λI = A - I = \begin{bmatrix} 9 & -3 & -6 \\ 12 & -6 & -6 \\ 12 & -3 & -9 \end{bmatrix},

which has the RREF

    R = \begin{bmatrix} 1 & 0 & -1 \\ 0 & 1 & -1 \\ 0 & 0 & 0 \end{bmatrix}.

Note that we have already computed the nullspace of this matrix R in the first part of the problem. Thus, we know

    Basis of E_1 = \left\{ \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} \right\}.    (7)

Since we've found three linearly independent eigenvectors, A has a basis of eigenvectors and A is diagonalizable. Using the basis (6) of E_{-2} as a matter of taste (I could equally well use (5)), and the basis (7) of E_1, set

    P = \begin{bmatrix} 1 & 1 & 1 \\ 4 & 0 & 1 \\ 0 & 2 & 1 \end{bmatrix},    D = \begin{bmatrix} -2 & 0 & 0 \\ 0 & -2 & 0 \\ 0 & 0 & 1 \end{bmatrix};

then P is invertible, D is a diagonal matrix and P^{-1}AP = D (check it if you don't believe me).
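Here is that check, done in SymPy (an addition; diagonalize() may order or scale its eigenvectors differently from the hand computation, but the relation P^{-1}AP = D still holds):

from sympy import Matrix

A = Matrix([[10, -3, -6],
            [12, -5, -6],
            [12, -3, -8]])

P, D = A.diagonalize()
print(P)                      # columns form a basis of eigenvectors
print(D)                      # eigenvalues -2 (twice) and 1 on the diagonal (order may vary)
print(P.inv() * A * P == D)   # True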

40 pts. Problem 6.

In each part, determine if the given transformation T is linear. If it is linear, find a matrix A so that T(x) = Ax. If T is not linear, justify your answer.

A. T : R^2 → R^2 is given by

    T\left( \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \right) = \begin{bmatrix} x_1 x_2 \\ x_1^2 + x_2^2 \end{bmatrix}.

For T to be linear, it must preserve scalar multiplication, i.e., it must be true that T(αx) = αT(x) for all scalars α and all x ∈ R^2. We have

    T\left( α \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \right) = T\left( \begin{bmatrix} αx_1 \\ αx_2 \end{bmatrix} \right) = \begin{bmatrix} (αx_1)(αx_2) \\ (αx_1)^2 + (αx_2)^2 \end{bmatrix} = \begin{bmatrix} α^2 x_1 x_2 \\ α^2 (x_1^2 + x_2^2) \end{bmatrix},    (8)

    αT\left( \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \right) = α \begin{bmatrix} x_1 x_2 \\ x_1^2 + x_2^2 \end{bmatrix} = \begin{bmatrix} α x_1 x_2 \\ α (x_1^2 + x_2^2) \end{bmatrix}.    (9)

If T were linear, the right-hand sides of equations (8) and (9) would have to be equal for all values of α, x_1 and x_2. In particular, we would have to have α^2 x_1 x_2 = α x_1 x_2 for all values of α, x_1 and x_2. This is plainly not the case (e.g., α = 2, x_1 = x_2 = 1), so T is not linear.
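The counterexample α = 2, x_1 = x_2 = 1 can be seen concretely (an added illustration, using SymPy matrices):

from sympy import Matrix

def T(v):
    x1, x2 = v
    return Matrix([x1 * x2, x1**2 + x2**2])

x = Matrix([1, 1])
print(T(2 * x))    # Matrix([[4], [8]])
print(2 * T(x))    # Matrix([[2], [4]]); not equal, so T does not preserve scalar multiplication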

B. T : R^3 → R^2 is given by

    T\left( \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} \right) = \begin{bmatrix} 2x_1 - 3x_2 + 5x_3 \\ x_1 - x_3 \end{bmatrix}.

We can rewrite the right-hand side of the definition of T as a matrix multiplication:

    T\left( \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} \right) = \begin{bmatrix} 2 & -3 & 5 \\ 1 & 0 & -1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}.

Thus, T(x) = Ax for all x ∈ R^3, where A is the matrix

    A = \begin{bmatrix} 2 & -3 & 5 \\ 1 & 0 & -1 \end{bmatrix}.

Since T is given by matrix multiplication, T must be linear.

50 pts. Problem 7.

Let S be the subspace of R^4 spanned by the vectors

    2 8 v = 2, v 2 = 3, v 3 = 0 5, v 4 = 3 3 0

A. Cut down the list of vectors above to a basis for S. What is the dimension of S?

B. For each of the following vectors, determine if the vector is in S and, if so, express it as a linear combination of the basis vectors you found in the previous part of the problem:

    3 5 w = 5 2, w 2 = 3 4 5 0

We can do all the computation in one RREF calculation on the calculator. Make up a matrix containing all of these vectors as columns, with a bar separating the spanning vectors from w_1 and w_2:

    A = [ v_1  v_2  v_3  v_4 | w_1  w_2 ].

The RREF of this matrix is

    R = \begin{bmatrix} 1 & 0 & 3 & 0 & 1 & 0 \\ 0 & 1 & 2 & 0 & 2 & 0 \\ 0 & 0 & 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 & 1 \end{bmatrix}.

Columns 1, 2 and 4 of R form a basis for the span of the columns to the left of the bar, so the same is true of A. Thus, v_1, v_2 and v_4 form a basis of S. Since S has a basis with three elements,

    the dimension of S is 3.

In R, we have col_5(R) = col_1(R) + 2 col_2(R) + col_4(R). The same relation must hold among the columns of A, so we have

    w_1 = v_1 + 2v_2 + v_4.

Since w_1 is a linear combination of vectors in S, we conclude w_1 ∈ S. The last column of R is not a linear combination of the columns to the left of the bar (because of the last row), so the same is true of A. Thus, w_2 ∉ S.
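The "row reduce everything at once" technique used here is easy to reproduce in SymPy. The sketch below is an added illustration with made-up vectors in R^4 (not the vectors from this exam); rref() returns the reduced matrix together with the pivot column indices.

from sympy import Matrix

# Hypothetical spanning vectors and test vectors, chosen only to illustrate the method.
v1, v2, v4 = Matrix([1, 0, 1, 0]), Matrix([0, 1, 1, 1]), Matrix([1, 1, 0, 0])
v3 = 3*v1 + 2*v2               # deliberately dependent on v1 and v2
w1 = v1 + 2*v2 + v4            # lies in the span of v1, v2, v4
w2 = Matrix([0, 0, 0, 1])      # does not lie in that span (for these particular vectors)

A = Matrix.hstack(v1, v2, v3, v4, w1, w2)
R, pivots = A.rref()
print(pivots)   # (0, 1, 3, 5): columns 1, 2, 4 are pivot columns, and the pivot in
                # column 6 shows that w2 is not a combination of the columns before it
print(R)        # column 5 of R reads (1, 2, 1, 0), recording w1 = v1 + 2*v2 + v4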