MATH 304 Linear Algebra Lecture 34: Review for Test 2.

Topics for Test 2

Linear transformations (Leon 4.1-4.3)
- Matrix transformations
- Matrix of a linear mapping
- Similar matrices

Orthogonality (Leon 5.1-5.6)
- Inner products and norms
- Orthogonal complement
- Least squares problems
- The Gram-Schmidt orthogonalization process

Eigenvalues and eigenvectors (Leon 6.1, 6.3)
- Eigenvalues, eigenvectors, eigenspaces
- Characteristic polynomial
- Diagonalization

Sample problems for Test 2

Problem 1 (20 pts.) Let M_{2,2}(R) denote the vector space of 2x2 matrices with real entries. Consider the linear operator L : M_{2,2}(R) \to M_{2,2}(R) given by

L \begin{pmatrix} x & y \\ z & w \end{pmatrix} = \begin{pmatrix} x & y \\ z & w \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}.

Find the matrix of the operator L with respect to the basis

E_1 = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \quad E_2 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \quad E_3 = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \quad E_4 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}.

Problem 2 (30 pts.) Let V be the subspace of R^4 spanned by the vectors x_1 = (1, 1, 1, 1) and x_2 = (1, 0, 3, 0).
(i) Find an orthonormal basis for V.
(ii) Find an orthonormal basis for the orthogonal complement V^\perp.

Problem 3 (30 pts.) Let

A = \begin{pmatrix} 1 & 2 & 0 \\ 1 & 1 & 1 \\ 0 & 2 & 1 \end{pmatrix}.

(i) Find all eigenvalues of the matrix A.
(ii) For each eigenvalue of A, find an associated eigenvector.
(iii) Is the matrix A diagonalizable? Explain.
(iv) Find all eigenvalues of the matrix A^2.

Bonus Problem 4 (20 pts.) Find a linear polynomial which is the best least squares fit to the following data:

   x   | -2  -1   0   1   2
  f(x) | -3  -2   1   2   5

Bonus Problem 5 (20 pts.) Let L : V \to W be a linear mapping of a finite-dimensional vector space V to a vector space W. Show that dim Range(L) + dim ker(L) = dim V.

Problem 1 (20 pts.) Let M_{2,2}(R) denote the vector space of 2x2 matrices with real entries. Consider the linear operator L : M_{2,2}(R) \to M_{2,2}(R) given by

L \begin{pmatrix} x & y \\ z & w \end{pmatrix} = \begin{pmatrix} x & y \\ z & w \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}.

Find the matrix of the operator L with respect to the basis

E_1 = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \quad E_2 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \quad E_3 = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \quad E_4 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}.

Let M_L denote the desired matrix. By definition, M_L is the 4x4 matrix whose columns are the coordinates of the matrices L(E_1), L(E_2), L(E_3), L(E_4) with respect to the basis E_1, E_2, E_3, E_4.

L(E_1) = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 1 & 2 \\ 0 & 0 \end{pmatrix} = 1E_1 + 2E_2 + 0E_3 + 0E_4,

L(E_2) = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 3 & 4 \\ 0 & 0 \end{pmatrix} = 3E_1 + 4E_2 + 0E_3 + 0E_4,

L(E_3) = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 1 & 2 \end{pmatrix} = 0E_1 + 0E_2 + 1E_3 + 2E_4,

L(E_4) = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 3 & 4 \end{pmatrix} = 0E_1 + 0E_2 + 3E_3 + 4E_4.

It follows that

M_L = \begin{pmatrix} 1 & 3 & 0 & 0 \\ 2 & 4 & 0 & 0 \\ 0 & 0 & 1 & 3 \\ 0 & 0 & 2 & 4 \end{pmatrix}.

Thus the relation

\begin{pmatrix} x_1 & y_1 \\ z_1 & w_1 \end{pmatrix} = \begin{pmatrix} x & y \\ z & w \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}

is equivalent to the relation

\begin{pmatrix} x_1 \\ y_1 \\ z_1 \\ w_1 \end{pmatrix} = \begin{pmatrix} 1 & 3 & 0 & 0 \\ 2 & 4 & 0 & 0 \\ 0 & 0 & 1 & 3 \\ 0 & 0 & 2 & 4 \end{pmatrix} \begin{pmatrix} x \\ y \\ z \\ w \end{pmatrix}.
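The computed matrix is easy to double-check numerically; the sketch below (assuming NumPy) flattens each 2x2 matrix row by row, which is exactly the coordinate order E_1, E_2, E_3, E_4:

```python
import numpy as np

B = np.array([[1., 2.], [3., 4.]])  # L(X) = X B (right multiplication)

# The basis E1, E2, E3, E4 of 2x2 matrices, in row-major order
E = []
for i in range(2):
    for j in range(2):
        X = np.zeros((2, 2))
        X[i, j] = 1.0
        E.append(X)

# Column j of M_L holds the coordinates of L(E_j) in the basis E1..E4
M_L = np.column_stack([(X @ B).reshape(4) for X in E])
```

For any matrix X, the coordinates of L(X) are then M_L times the coordinates of X, matching the relation above.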

Problem 2 (30 pts.) Let V be the subspace of R^4 spanned by the vectors x_1 = (1, 1, 1, 1) and x_2 = (1, 0, 3, 0).
(i) Find an orthonormal basis for V.

First we apply the Gram-Schmidt orthogonalization process to the vectors x_1, x_2 and obtain an orthogonal basis v_1, v_2 for the subspace V:

v_1 = x_1 = (1, 1, 1, 1),
v_2 = x_2 - \frac{x_2 \cdot v_1}{v_1 \cdot v_1} v_1 = (1, 0, 3, 0) - \frac{4}{4}(1, 1, 1, 1) = (0, -1, 2, -1).

Then we normalize the vectors v_1, v_2 to obtain an orthonormal basis w_1, w_2 for V:

\|v_1\| = 2 \implies w_1 = \frac{v_1}{\|v_1\|} = \frac{1}{2}(1, 1, 1, 1),
\|v_2\| = \sqrt{6} \implies w_2 = \frac{v_2}{\|v_2\|} = \frac{1}{\sqrt{6}}(0, -1, 2, -1).
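The same Gram-Schmidt step can be reproduced numerically; a minimal sketch assuming NumPy:

```python
import numpy as np

x1 = np.array([1., 1., 1., 1.])
x2 = np.array([1., 0., 3., 0.])

# Gram-Schmidt: subtract from x2 its projection onto v1
v1 = x1
v2 = x2 - (x2 @ v1) / (v1 @ v1) * v1

# Normalize to get the orthonormal basis w1, w2 of V
w1 = v1 / np.linalg.norm(v1)
w2 = v2 / np.linalg.norm(v2)
```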

Problem 2 (30 pts.) Let V be the subspace of R^4 spanned by the vectors x_1 = (1, 1, 1, 1) and x_2 = (1, 0, 3, 0).
(ii) Find an orthonormal basis for the orthogonal complement V^\perp.

Since the subspace V is spanned by the vectors (1, 1, 1, 1) and (1, 0, 3, 0), it is the row space of the matrix

A = \begin{pmatrix} 1 & 1 & 1 & 1 \\ 1 & 0 & 3 & 0 \end{pmatrix}.

Then the orthogonal complement V^\perp is the nullspace of A. To find the nullspace, we convert the matrix A to reduced row echelon form:

\begin{pmatrix} 1 & 1 & 1 & 1 \\ 1 & 0 & 3 & 0 \end{pmatrix} \to \begin{pmatrix} 1 & 0 & 3 & 0 \\ 1 & 1 & 1 & 1 \end{pmatrix} \to \begin{pmatrix} 1 & 0 & 3 & 0 \\ 0 & 1 & -2 & 1 \end{pmatrix}.

Hence a vector (x_1, x_2, x_3, x_4) \in R^4 belongs to V^\perp if and only if

\begin{pmatrix} 1 & 0 & 3 & 0 \\ 0 & 1 & -2 & 1 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}
\iff \begin{cases} x_1 + 3x_3 = 0 \\ x_2 - 2x_3 + x_4 = 0 \end{cases}
\iff \begin{cases} x_1 = -3x_3 \\ x_2 = 2x_3 - x_4 \end{cases}

The general solution of the system is

(x_1, x_2, x_3, x_4) = (-3t, 2t - s, t, s) = t(-3, 2, 1, 0) + s(0, -1, 0, 1),

where t, s \in R. It follows that V^\perp is spanned by the vectors x_3 = (0, -1, 0, 1) and x_4 = (-3, 2, 1, 0).

The vectors x_3 = (0, -1, 0, 1) and x_4 = (-3, 2, 1, 0) form a basis for the subspace V^\perp. It remains to orthogonalize and normalize this basis:

v_3 = x_3 = (0, -1, 0, 1),
v_4 = x_4 - \frac{x_4 \cdot v_3}{v_3 \cdot v_3} v_3 = (-3, 2, 1, 0) - \frac{-2}{2}(0, -1, 0, 1) = (-3, 1, 1, 1),

\|v_3\| = \sqrt{2} \implies w_3 = \frac{v_3}{\|v_3\|} = \frac{1}{\sqrt{2}}(0, -1, 0, 1),
\|v_4\| = \sqrt{12} = 2\sqrt{3} \implies w_4 = \frac{v_4}{\|v_4\|} = \frac{1}{2\sqrt{3}}(-3, 1, 1, 1).

Thus the vectors w_3 = \frac{1}{\sqrt{2}}(0, -1, 0, 1) and w_4 = \frac{1}{2\sqrt{3}}(-3, 1, 1, 1) form an orthonormal basis for V^\perp.
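It is easy to confirm numerically that w_3, w_4 are unit vectors, mutually orthogonal, and orthogonal to both spanning vectors of V (a sketch assuming NumPy):

```python
import numpy as np

x1 = np.array([1., 1., 1., 1.])
x2 = np.array([1., 0., 3., 0.])
w3 = np.array([0., -1., 0., 1.]) / np.sqrt(2)
w4 = np.array([-3., 1., 1., 1.]) / (2 * np.sqrt(3))

# All of these dot products should vanish: w3 and w4 lie in the
# orthogonal complement of V = span(x1, x2) and are orthogonal to each other
dots = [w3 @ x1, w3 @ x2, w4 @ x1, w4 @ x2, w3 @ w4]
norms = [np.linalg.norm(w3), np.linalg.norm(w4)]
```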

Problem 2 (30 pts.) Let V be the subspace of R^4 spanned by the vectors x_1 = (1, 1, 1, 1) and x_2 = (1, 0, 3, 0).
(i) Find an orthonormal basis for V.
(ii) Find an orthonormal basis for the orthogonal complement V^\perp.

Alternative solution: First we extend the set x_1, x_2 to a basis x_1, x_2, x_3, x_4 for R^4. Then we orthogonalize and normalize the latter. This yields an orthonormal basis w_1, w_2, w_3, w_4 for R^4. By construction, w_1, w_2 is an orthonormal basis for V. It follows that w_3, w_4 is an orthonormal basis for V^\perp.

The set x_1 = (1, 1, 1, 1), x_2 = (1, 0, 3, 0) can be extended to a basis for R^4 by adding two vectors from the standard basis. For example, we can add the vectors e_3 = (0, 0, 1, 0) and e_4 = (0, 0, 0, 1). To show that x_1, x_2, e_3, e_4 is indeed a basis for R^4, we check that the matrix whose rows are these vectors is nonsingular. Expanding the determinant along the last two rows,

\begin{vmatrix} 1 & 1 & 1 & 1 \\ 1 & 0 & 3 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{vmatrix} = \begin{vmatrix} 1 & 1 \\ 1 & 0 \end{vmatrix} = -1 \ne 0.
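The nonsingularity check can be confirmed numerically (a sketch assuming NumPy):

```python
import numpy as np

# Rows: x1, x2, e3, e4
M = np.array([[1., 1., 1., 1.],
              [1., 0., 3., 0.],
              [0., 0., 1., 0.],
              [0., 0., 0., 1.]])

d = np.linalg.det(M)  # nonzero, so the four rows form a basis for R^4
```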

To orthogonalize the basis x_1, x_2, e_3, e_4, we apply the Gram-Schmidt process:

v_1 = x_1 = (1, 1, 1, 1),

v_2 = x_2 - \frac{x_2 \cdot v_1}{v_1 \cdot v_1} v_1 = (1, 0, 3, 0) - \frac{4}{4}(1, 1, 1, 1) = (0, -1, 2, -1),

v_3 = e_3 - \frac{e_3 \cdot v_1}{v_1 \cdot v_1} v_1 - \frac{e_3 \cdot v_2}{v_2 \cdot v_2} v_2 = (0, 0, 1, 0) - \frac{1}{4}(1, 1, 1, 1) - \frac{2}{6}(0, -1, 2, -1) = \left(-\frac{1}{4}, \frac{1}{12}, \frac{1}{12}, \frac{1}{12}\right) = \frac{1}{12}(-3, 1, 1, 1),

v_4 = e_4 - \frac{e_4 \cdot v_1}{v_1 \cdot v_1} v_1 - \frac{e_4 \cdot v_2}{v_2 \cdot v_2} v_2 - \frac{e_4 \cdot v_3}{v_3 \cdot v_3} v_3 = (0, 0, 0, 1) - \frac{1}{4}(1, 1, 1, 1) + \frac{1}{6}(0, -1, 2, -1) - \frac{1/12}{1/12} \cdot \frac{1}{12}(-3, 1, 1, 1) = \left(0, -\frac{1}{2}, 0, \frac{1}{2}\right) = \frac{1}{2}(0, -1, 0, 1).

It remains to normalize the vectors v_1 = (1, 1, 1, 1), v_2 = (0, -1, 2, -1), v_3 = \frac{1}{12}(-3, 1, 1, 1), v_4 = \frac{1}{2}(0, -1, 0, 1):

\|v_1\| = 2 \implies w_1 = \frac{v_1}{\|v_1\|} = \frac{1}{2}(1, 1, 1, 1),
\|v_2\| = \sqrt{6} \implies w_2 = \frac{v_2}{\|v_2\|} = \frac{1}{\sqrt{6}}(0, -1, 2, -1),
\|v_3\| = \frac{1}{\sqrt{12}} = \frac{1}{2\sqrt{3}} \implies w_3 = \frac{v_3}{\|v_3\|} = \frac{1}{2\sqrt{3}}(-3, 1, 1, 1),
\|v_4\| = \frac{1}{\sqrt{2}} \implies w_4 = \frac{v_4}{\|v_4\|} = \frac{1}{\sqrt{2}}(0, -1, 0, 1).

Thus w_1, w_2 is an orthonormal basis for V while w_3, w_4 is an orthonormal basis for V^\perp.
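The whole orthonormalization can be cross-checked against a QR factorization, which performs Gram-Schmidt on the columns in order (a sketch assuming NumPy; each column of Q may differ from the corresponding w_i by an overall sign, depending on the LAPACK convention):

```python
import numpy as np

# Columns, in Gram-Schmidt order: x1, x2, e3, e4
X = np.column_stack([[1., 1., 1., 1.],
                     [1., 0., 3., 0.],
                     [0., 0., 1., 0.],
                     [0., 0., 0., 1.]])

Q, R = np.linalg.qr(X)  # columns of Q are orthonormal, in the same nested spans
```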

Problem 3 (30 pts.) Let

A = \begin{pmatrix} 1 & 2 & 0 \\ 1 & 1 & 1 \\ 0 & 2 & 1 \end{pmatrix}.

(i) Find all eigenvalues of the matrix A.

The eigenvalues of A are the roots of the characteristic equation det(A - \lambda I) = 0. We obtain

det(A - \lambda I) = \begin{vmatrix} 1-\lambda & 2 & 0 \\ 1 & 1-\lambda & 1 \\ 0 & 2 & 1-\lambda \end{vmatrix} = (1-\lambda)^3 - 2(1-\lambda) - 2(1-\lambda) = (1-\lambda)\bigl((1-\lambda)^2 - 4\bigr) = (1-\lambda)\bigl((1-\lambda) - 2\bigr)\bigl((1-\lambda) + 2\bigr) = -(\lambda - 1)(\lambda + 1)(\lambda - 3).

Hence the matrix A has three eigenvalues: -1, 1, and 3.
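Both the eigenvalues and the characteristic polynomial can be checked numerically; a sketch assuming NumPy (note that np.poly returns the coefficients of det(\lambda I - A), which for a 3x3 matrix differs from det(A - \lambda I) by a sign):

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [1., 1., 1.],
              [0., 2., 1.]])

eigvals = np.sort(np.linalg.eigvals(A).real)  # expect -1, 1, 3
coeffs = np.poly(A)  # det(lambda*I - A) = lambda^3 - 3*lambda^2 - lambda + 3
```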

Problem 3 (30 pts.) Let

A = \begin{pmatrix} 1 & 2 & 0 \\ 1 & 1 & 1 \\ 0 & 2 & 1 \end{pmatrix}.

(ii) For each eigenvalue of A, find an associated eigenvector.

An eigenvector v = (x, y, z) of the matrix A associated with an eigenvalue \lambda is a nonzero solution of the vector equation

(A - \lambda I)v = 0 \iff \begin{pmatrix} 1-\lambda & 2 & 0 \\ 1 & 1-\lambda & 1 \\ 0 & 2 & 1-\lambda \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}.

To solve the equation, we convert the matrix A - \lambda I to reduced row echelon form.

First consider the case \lambda = -1. The row reduction yields

A + I = \begin{pmatrix} 2 & 2 & 0 \\ 1 & 2 & 1 \\ 0 & 2 & 2 \end{pmatrix} \to \begin{pmatrix} 1 & 1 & 0 \\ 1 & 2 & 1 \\ 0 & 2 & 2 \end{pmatrix} \to \begin{pmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \\ 0 & 2 & 2 \end{pmatrix} \to \begin{pmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{pmatrix} \to \begin{pmatrix} 1 & 0 & -1 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{pmatrix}.

Hence

(A + I)v = 0 \iff \begin{cases} x - z = 0, \\ y + z = 0. \end{cases}

The general solution is x = t, y = -t, z = t, where t \in R. In particular, v_1 = (1, -1, 1) is an eigenvector of A associated with the eigenvalue -1.

Secondly, consider the case \lambda = 1. The row reduction yields

A - I = \begin{pmatrix} 0 & 2 & 0 \\ 1 & 0 & 1 \\ 0 & 2 & 0 \end{pmatrix} \to \begin{pmatrix} 1 & 0 & 1 \\ 0 & 2 & 0 \\ 0 & 2 & 0 \end{pmatrix} \to \begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}.

Hence

(A - I)v = 0 \iff \begin{cases} x + z = 0, \\ y = 0. \end{cases}

The general solution is x = -t, y = 0, z = t, where t \in R. In particular, v_2 = (-1, 0, 1) is an eigenvector of A associated with the eigenvalue 1.

Finally, consider the case \lambda = 3. The row reduction yields

A - 3I = \begin{pmatrix} -2 & 2 & 0 \\ 1 & -2 & 1 \\ 0 & 2 & -2 \end{pmatrix} \to \begin{pmatrix} 1 & -1 & 0 \\ 1 & -2 & 1 \\ 0 & 2 & -2 \end{pmatrix} \to \begin{pmatrix} 1 & -1 & 0 \\ 0 & -1 & 1 \\ 0 & 2 & -2 \end{pmatrix} \to \begin{pmatrix} 1 & -1 & 0 \\ 0 & 1 & -1 \\ 0 & 0 & 0 \end{pmatrix} \to \begin{pmatrix} 1 & 0 & -1 \\ 0 & 1 & -1 \\ 0 & 0 & 0 \end{pmatrix}.

Hence

(A - 3I)v = 0 \iff \begin{cases} x - z = 0, \\ y - z = 0. \end{cases}

The general solution is x = t, y = t, z = t, where t \in R. In particular, v_3 = (1, 1, 1) is an eigenvector of A associated with the eigenvalue 3.
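Each eigenpair found above can be verified directly against the defining relation Av = \lambda v (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [1., 1., 1.],
              [0., 2., 1.]])

pairs = [(-1., np.array([1., -1., 1.])),
         ( 1., np.array([-1., 0., 1.])),
         ( 3., np.array([1., 1., 1.]))]

for lam, v in pairs:
    # A v must equal lambda * v for an eigenpair (lambda, v)
    assert np.allclose(A @ v, lam * v)
```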

Problem 3 (30 pts.) Let

A = \begin{pmatrix} 1 & 2 & 0 \\ 1 & 1 & 1 \\ 0 & 2 & 1 \end{pmatrix}.

(iii) Is the matrix A diagonalizable? Explain.

The matrix A is diagonalizable, i.e., there exists a basis for R^3 formed by its eigenvectors. Namely, the vectors v_1 = (1, -1, 1), v_2 = (-1, 0, 1), and v_3 = (1, 1, 1) are eigenvectors of the matrix A belonging to distinct eigenvalues. Therefore these vectors are linearly independent. It follows that v_1, v_2, v_3 is a basis for R^3.

Alternatively, the existence of a basis for R^3 consisting of eigenvectors of A already follows from the fact that the matrix A has three distinct eigenvalues.

Problem 3 (30 pts.) Let

A = \begin{pmatrix} 1 & 2 & 0 \\ 1 & 1 & 1 \\ 0 & 2 & 1 \end{pmatrix}.

(iv) Find all eigenvalues of the matrix A^2.

We have A = UBU^{-1}, where

B = \begin{pmatrix} -1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 3 \end{pmatrix}, \quad U = \begin{pmatrix} 1 & -1 & 1 \\ -1 & 0 & 1 \\ 1 & 1 & 1 \end{pmatrix}.

The columns of U are eigenvectors of A belonging to the eigenvalues -1, 1, and 3, respectively. Then

A^2 = UBU^{-1}UBU^{-1} = UB^2U^{-1},

that is, the matrix A^2 is similar to the diagonal matrix B^2 = diag(1, 1, 9). Similar matrices have the same characteristic polynomial, hence they have the same eigenvalues. Thus the eigenvalues of A^2 are the same as the eigenvalues of B^2: 1 and 9.
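The conclusion about A^2 can be cross-checked numerically (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [1., 1., 1.],
              [0., 2., 1.]])

# Eigenvalues of A^2 are the squares of the eigenvalues of A
vals = np.sort(np.linalg.eigvals(A @ A).real)  # expect 1, 1, 9
```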

Bonus Problem 4 (20 pts.) Find a linear polynomial which is the best least squares fit to the following data:

   x   | -2  -1   0   1   2
  f(x) | -3  -2   1   2   5

We are looking for a function f(x) = c_1 + c_2 x, where c_1, c_2 are unknown coefficients. The data of the problem give rise to an overdetermined system of linear equations in the variables c_1 and c_2:

c_1 - 2c_2 = -3,
c_1 - c_2 = -2,
c_1 = 1,
c_1 + c_2 = 2,
c_1 + 2c_2 = 5.

This system is inconsistent.

We can represent the system as a matrix equation Ac = y, where

A = \begin{pmatrix} 1 & -2 \\ 1 & -1 \\ 1 & 0 \\ 1 & 1 \\ 1 & 2 \end{pmatrix}, \quad c = \begin{pmatrix} c_1 \\ c_2 \end{pmatrix}, \quad y = \begin{pmatrix} -3 \\ -2 \\ 1 \\ 2 \\ 5 \end{pmatrix}.

The least squares solution c of the above system is a solution of the normal system A^T A c = A^T y:

\begin{pmatrix} 5 & 0 \\ 0 & 10 \end{pmatrix} \begin{pmatrix} c_1 \\ c_2 \end{pmatrix} = \begin{pmatrix} 3 \\ 20 \end{pmatrix} \implies c_1 = \frac{3}{5}, \; c_2 = 2.

Thus the function f(x) = \frac{3}{5} + 2x is the best least squares fit to the above data among linear polynomials.
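The normal-equation solution can be reproduced and cross-checked against NumPy's built-in least squares solver (a sketch using the same data):

```python
import numpy as np

xs = np.array([-2., -1., 0., 1., 2.])
ys = np.array([-3., -2., 1., 2., 5.])

# Design matrix for f(x) = c1 + c2*x: a column of ones and the x-values
A = np.column_stack([np.ones_like(xs), xs])

# Solve the normal equations A^T A c = A^T y
c = np.linalg.solve(A.T @ A, A.T @ ys)

# Cross-check with the built-in least squares routine
c_lstsq, *_ = np.linalg.lstsq(A, ys, rcond=None)
```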

Bonus Problem 5 (20 pts.) Let L : V \to W be a linear mapping of a finite-dimensional vector space V to a vector space W. Show that dim Range(L) + dim ker(L) = dim V.

The kernel ker(L) is a subspace of V. It is finite-dimensional since the vector space V is. Take a basis v_1, v_2, ..., v_k for the subspace ker(L), then extend it to a basis v_1, v_2, ..., v_k, u_1, u_2, ..., u_m for the entire space V.

Claim. The vectors L(u_1), L(u_2), ..., L(u_m) form a basis for the range of L.

Assuming the claim is proved, we obtain dim Range(L) = m, dim ker(L) = k, and dim V = k + m, so that dim Range(L) + dim ker(L) = dim V.

Claim. The vectors L(u_1), L(u_2), ..., L(u_m) form a basis for the range of L.

Proof (spanning): Any vector w \in Range(L) can be represented as w = L(v) for some v \in V. Then

v = \alpha_1 v_1 + \alpha_2 v_2 + ... + \alpha_k v_k + \beta_1 u_1 + \beta_2 u_2 + ... + \beta_m u_m

for some \alpha_i, \beta_j \in R. It follows that

w = L(v) = \alpha_1 L(v_1) + ... + \alpha_k L(v_k) + \beta_1 L(u_1) + ... + \beta_m L(u_m) = \beta_1 L(u_1) + ... + \beta_m L(u_m),

since L(v_i) = 0 for each v_i \in ker(L). Thus Range(L) is spanned by the vectors L(u_1), ..., L(u_m).

Claim. The vectors L(u_1), L(u_2), ..., L(u_m) form a basis for the range of L.

Proof (linear independence): Suppose that t_1 L(u_1) + t_2 L(u_2) + ... + t_m L(u_m) = 0 for some t_i \in R. Let u = t_1 u_1 + t_2 u_2 + ... + t_m u_m. Since

L(u) = t_1 L(u_1) + t_2 L(u_2) + ... + t_m L(u_m) = 0,

the vector u belongs to the kernel of L. Therefore u = s_1 v_1 + s_2 v_2 + ... + s_k v_k for some s_j \in R. It follows that

t_1 u_1 + t_2 u_2 + ... + t_m u_m - s_1 v_1 - s_2 v_2 - ... - s_k v_k = u - u = 0.

Linear independence of the vectors v_1, ..., v_k, u_1, ..., u_m implies that t_1 = ... = t_m = 0 (as well as s_1 = ... = s_k = 0). Thus the vectors L(u_1), L(u_2), ..., L(u_m) are linearly independent.
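The rank-nullity theorem just proved can also be illustrated numerically on a concrete matrix; in the sketch below (assuming NumPy) the example matrix is an arbitrary choice, made rank-deficient so that both terms of the identity are visible:

```python
import numpy as np

# A linear map R^3 -> R^3 of rank 2 (the second row is twice the first)
M = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [0., 1., 1.]])

rank = np.linalg.matrix_rank(M)          # dim Range(L)
s = np.linalg.svd(M, compute_uv=False)   # singular values
nullity = int(np.sum(s < 1e-10))         # dim ker(L) = number of zero singular values

# Rank-nullity: dim Range(L) + dim ker(L) = dimension of the domain
```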