MATH 304-505, Spring 2011. Sample problems for Test 2: Solutions

Any problem may be altered or replaced by a different one!

Problem 1 (15 pts). Let $M_{2,2}(\mathbb{R})$ denote the vector space of $2\times 2$ matrices with real entries. Consider the linear operator $L : M_{2,2}(\mathbb{R}) \to M_{2,2}(\mathbb{R})$ given by
$$L\begin{pmatrix} x & y \\ z & w \end{pmatrix} = \begin{pmatrix} x & y \\ z & w \end{pmatrix}\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}.$$
Find the matrix of the operator $L$ with respect to the basis
$$E_1 = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \quad E_2 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \quad E_3 = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \quad E_4 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}.$$

Let $M_L$ denote the desired matrix. By definition, $M_L$ is a $4\times 4$ matrix whose columns are the coordinates of the matrices $L(E_1), L(E_2), L(E_3), L(E_4)$ with respect to the basis $E_1, E_2, E_3, E_4$. We have that
$$L(E_1) = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 1 & 2 \\ 0 & 0 \end{pmatrix} = 1E_1 + 2E_2 + 0E_3 + 0E_4,$$
$$L(E_2) = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 3 & 4 \\ 0 & 0 \end{pmatrix} = 3E_1 + 4E_2 + 0E_3 + 0E_4,$$
$$L(E_3) = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 1 & 2 \end{pmatrix} = 0E_1 + 0E_2 + 1E_3 + 2E_4,$$
$$L(E_4) = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 3 & 4 \end{pmatrix} = 0E_1 + 0E_2 + 3E_3 + 4E_4.$$
It follows that
$$M_L = \begin{pmatrix} 1 & 3 & 0 & 0 \\ 2 & 4 & 0 & 0 \\ 0 & 0 & 1 & 3 \\ 0 & 0 & 2 & 4 \end{pmatrix}.$$
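The computation above can be cross-checked numerically. The following NumPy sketch is not part of the original solution; the helper `coords` and the coordinate convention (entries read in the order of $E_1,\dots,E_4$) are choices made here for illustration. It assembles $M_L$ column by column, exactly as in the definition.

```python
import numpy as np

B = np.array([[1., 2.], [3., 4.]])   # the fixed matrix from Problem 1

# The basis E1, E2, E3, E4 of M_{2,2}(R)
E = [np.array([[1., 0.], [0., 0.]]),
     np.array([[0., 1.], [0., 0.]]),
     np.array([[0., 0.], [1., 0.]]),
     np.array([[0., 0.], [0., 1.]])]

def coords(X):
    """Coordinates of a 2x2 matrix X in the basis E1..E4."""
    return np.array([X[0, 0], X[0, 1], X[1, 0], X[1, 1]])

# Columns of M_L are the coordinates of L(E_i) = E_i @ B
M_L = np.column_stack([coords(Ei @ B) for Ei in E])
print(M_L)
# expected:
# [[1. 3. 0. 0.]
#  [2. 4. 0. 0.]
#  [0. 0. 1. 3.]
#  [0. 0. 2. 4.]]
```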

Problem 2 (20 pts). Find a linear polynomial which is the best least squares fit to the following data:
$$\begin{array}{c|ccccc} x & -2 & -1 & 0 & 1 & 2 \\ \hline f(x) & -3 & -2 & 1 & 2 & 5 \end{array}$$

We are looking for a function $f(x) = c_1 + c_2 x$, where $c_1, c_2$ are unknown coefficients. The data of the problem give rise to an overdetermined system of linear equations in the variables $c_1$ and $c_2$:
$$c_1 - 2c_2 = -3, \quad c_1 - c_2 = -2, \quad c_1 = 1, \quad c_1 + c_2 = 2, \quad c_1 + 2c_2 = 5.$$
This system is inconsistent. We can represent it as a matrix equation $A\mathbf{c} = \mathbf{y}$, where
$$A = \begin{pmatrix} 1 & -2 \\ 1 & -1 \\ 1 & 0 \\ 1 & 1 \\ 1 & 2 \end{pmatrix}, \qquad \mathbf{c} = \begin{pmatrix} c_1 \\ c_2 \end{pmatrix}, \qquad \mathbf{y} = \begin{pmatrix} -3 \\ -2 \\ 1 \\ 2 \\ 5 \end{pmatrix}.$$
The least squares solution $\mathbf{c}$ of the above system is a solution of the normal system $A^T A\mathbf{c} = A^T\mathbf{y}$:
$$\begin{pmatrix} 1 & 1 & 1 & 1 & 1 \\ -2 & -1 & 0 & 1 & 2 \end{pmatrix}\begin{pmatrix} 1 & -2 \\ 1 & -1 \\ 1 & 0 \\ 1 & 1 \\ 1 & 2 \end{pmatrix}\begin{pmatrix} c_1 \\ c_2 \end{pmatrix} = \begin{pmatrix} 1 & 1 & 1 & 1 & 1 \\ -2 & -1 & 0 & 1 & 2 \end{pmatrix}\begin{pmatrix} -3 \\ -2 \\ 1 \\ 2 \\ 5 \end{pmatrix}$$
$$\iff \begin{pmatrix} 5 & 0 \\ 0 & 10 \end{pmatrix}\begin{pmatrix} c_1 \\ c_2 \end{pmatrix} = \begin{pmatrix} 3 \\ 20 \end{pmatrix} \iff \begin{cases} c_1 = 3/5 \\ c_2 = 2 \end{cases}$$
Thus the function $f(x) = \frac{3}{5} + 2x$ is the best least squares fit to the above data among linear polynomials.
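For readers who want to verify the arithmetic, here is a small NumPy sketch (an illustration, not part of the exam solution) that forms the normal equations $A^TA\mathbf{c} = A^T\mathbf{y}$ exactly as above and cross-checks the result with `numpy.linalg.lstsq`.

```python
import numpy as np

x = np.array([-2., -1., 0., 1., 2.])
y = np.array([-3., -2., 1., 2., 5.])

# Design matrix for f(x) = c1 + c2*x
A = np.column_stack([np.ones_like(x), x])

# Solve the normal equations A^T A c = A^T y
c = np.linalg.solve(A.T @ A, A.T @ y)
print(c)            # expected: [0.6 2. ], i.e. c1 = 3/5, c2 = 2

# Cross-check with the built-in least squares solver
c_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.allclose(c, c_lstsq))   # expected: True
```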

Problem 3 (25 pts). Let $V$ be a subspace of $\mathbb{R}^4$ spanned by the vectors $\mathbf{x}_1 = (1, 1, 1, 1)$ and $\mathbf{x}_2 = (1, 0, 3, 0)$.

(i) Find an orthonormal basis for $V$.

First we apply the Gram-Schmidt orthogonalization process to the vectors $\mathbf{x}_1, \mathbf{x}_2$ and obtain an orthogonal basis $\mathbf{v}_1, \mathbf{v}_2$ for the subspace $V$:
$$\mathbf{v}_1 = \mathbf{x}_1 = (1, 1, 1, 1), \qquad \mathbf{v}_2 = \mathbf{x}_2 - \frac{\mathbf{x}_2 \cdot \mathbf{v}_1}{\mathbf{v}_1 \cdot \mathbf{v}_1}\,\mathbf{v}_1 = (1, 0, 3, 0) - \frac{4}{4}(1, 1, 1, 1) = (0, -1, 2, -1).$$
Then we normalize the vectors $\mathbf{v}_1, \mathbf{v}_2$ to obtain an orthonormal basis $\mathbf{w}_1, \mathbf{w}_2$ for $V$:
$$\mathbf{w}_1 = \frac{\mathbf{v}_1}{\|\mathbf{v}_1\|} = \tfrac{1}{2}(1, 1, 1, 1), \qquad \mathbf{w}_2 = \frac{\mathbf{v}_2}{\|\mathbf{v}_2\|} = \tfrac{1}{\sqrt{6}}(0, -1, 2, -1).$$

(ii) Find an orthonormal basis for the orthogonal complement $V^\perp$.

Since the subspace $V$ is spanned by the vectors $(1, 1, 1, 1)$ and $(1, 0, 3, 0)$, it is the row space of the matrix
$$A = \begin{pmatrix} 1 & 1 & 1 & 1 \\ 1 & 0 & 3 & 0 \end{pmatrix}.$$
Then the orthogonal complement $V^\perp$ is the nullspace of $A$. To find the nullspace, we convert the matrix $A$ to reduced row echelon form:
$$\begin{pmatrix} 1 & 1 & 1 & 1 \\ 1 & 0 & 3 & 0 \end{pmatrix} \to \begin{pmatrix} 1 & 0 & 3 & 0 \\ 1 & 1 & 1 & 1 \end{pmatrix} \to \begin{pmatrix} 1 & 0 & 3 & 0 \\ 0 & 1 & -2 & 1 \end{pmatrix}.$$
Hence a vector $(x_1, x_2, x_3, x_4) \in \mathbb{R}^4$ belongs to $V^\perp$ if and only if
$$\begin{cases} x_1 + 3x_3 = 0 \\ x_2 - 2x_3 + x_4 = 0 \end{cases} \iff \begin{cases} x_1 = -3x_3 \\ x_2 = 2x_3 - x_4 \end{cases}$$
The general solution of the system is $(x_1, x_2, x_3, x_4) = (-3t, 2t - s, t, s) = t(-3, 2, 1, 0) + s(0, -1, 0, 1)$, where $t, s \in \mathbb{R}$. It follows that $V^\perp$ is spanned by the vectors $\mathbf{x}_3 = (0, -1, 0, 1)$ and $\mathbf{x}_4 = (-3, 2, 1, 0)$. It remains to orthogonalize and normalize this basis for $V^\perp$:
$$\mathbf{v}_3 = \mathbf{x}_3 = (0, -1, 0, 1), \qquad \mathbf{v}_4 = \mathbf{x}_4 - \frac{\mathbf{x}_4 \cdot \mathbf{v}_3}{\mathbf{v}_3 \cdot \mathbf{v}_3}\,\mathbf{v}_3 = (-3, 2, 1, 0) - \frac{-2}{2}(0, -1, 0, 1) = (-3, 1, 1, 1),$$
$$\mathbf{w}_3 = \frac{\mathbf{v}_3}{\|\mathbf{v}_3\|} = \tfrac{1}{\sqrt{2}}(0, -1, 0, 1), \qquad \mathbf{w}_4 = \frac{\mathbf{v}_4}{\|\mathbf{v}_4\|} = \tfrac{1}{2\sqrt{3}}(-3, 1, 1, 1).$$
Thus the vectors $\mathbf{w}_3 = \tfrac{1}{\sqrt{2}}(0, -1, 0, 1)$ and $\mathbf{w}_4 = \tfrac{1}{2\sqrt{3}}(-3, 1, 1, 1)$ form an orthonormal basis for $V^\perp$.

Alternative solution: Suppose that an orthonormal basis $\mathbf{w}_1, \mathbf{w}_2$ for the subspace $V$ has been extended to an orthonormal basis $\mathbf{w}_1, \mathbf{w}_2, \mathbf{w}_3, \mathbf{w}_4$ for $\mathbb{R}^4$. Then the vectors $\mathbf{w}_3, \mathbf{w}_4$ form an orthonormal basis for the orthogonal complement $V^\perp$.

We know that the vectors $\mathbf{v}_1 = (1, 1, 1, 1)$ and $\mathbf{v}_2 = (0, -1, 2, -1)$ form an orthogonal basis for $V$. This basis can be extended to a basis for $\mathbb{R}^4$ by adding two vectors from the standard basis. For example, we can add the vectors $\mathbf{e}_3 = (0, 0, 1, 0)$ and $\mathbf{e}_4 = (0, 0, 0, 1)$. The vectors $\mathbf{v}_1, \mathbf{v}_2, \mathbf{e}_3, \mathbf{e}_4$ do form a basis for $\mathbb{R}^4$ since the matrix whose rows are these vectors is nonsingular:
$$\begin{vmatrix} 1 & 1 & 1 & 1 \\ 0 & -1 & 2 & -1 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{vmatrix} = -1 \neq 0.$$
To orthogonalize the basis $\mathbf{v}_1, \mathbf{v}_2, \mathbf{e}_3, \mathbf{e}_4$, we apply the Gram-Schmidt process (note that the vectors $\mathbf{v}_1$ and $\mathbf{v}_2$ are already orthogonal):
$$\mathbf{v}_3 = \mathbf{e}_3 - \frac{\mathbf{e}_3 \cdot \mathbf{v}_1}{\mathbf{v}_1 \cdot \mathbf{v}_1}\,\mathbf{v}_1 - \frac{\mathbf{e}_3 \cdot \mathbf{v}_2}{\mathbf{v}_2 \cdot \mathbf{v}_2}\,\mathbf{v}_2 = (0, 0, 1, 0) - \frac{1}{4}(1, 1, 1, 1) - \frac{2}{6}(0, -1, 2, -1) = \tfrac{1}{12}(-3, 1, 1, 1),$$
$$\mathbf{v}_4 = \mathbf{e}_4 - \frac{\mathbf{e}_4 \cdot \mathbf{v}_1}{\mathbf{v}_1 \cdot \mathbf{v}_1}\,\mathbf{v}_1 - \frac{\mathbf{e}_4 \cdot \mathbf{v}_2}{\mathbf{v}_2 \cdot \mathbf{v}_2}\,\mathbf{v}_2 - \frac{\mathbf{e}_4 \cdot \mathbf{v}_3}{\mathbf{v}_3 \cdot \mathbf{v}_3}\,\mathbf{v}_3 = (0, 0, 0, 1) - \frac{1}{4}(1, 1, 1, 1) - \frac{-1}{6}(0, -1, 2, -1) - \frac{1/12}{1/12}\cdot\tfrac{1}{12}(-3, 1, 1, 1) = \tfrac{1}{2}(0, -1, 0, 1).$$
It remains to normalize the vectors $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3, \mathbf{v}_4$:
$$\mathbf{w}_1 = \frac{\mathbf{v}_1}{\|\mathbf{v}_1\|} = \tfrac{1}{2}(1, 1, 1, 1), \qquad \mathbf{w}_2 = \frac{\mathbf{v}_2}{\|\mathbf{v}_2\|} = \tfrac{1}{\sqrt{6}}(0, -1, 2, -1),$$
$$\mathbf{w}_3 = \frac{\mathbf{v}_3}{\|\mathbf{v}_3\|} = \sqrt{12}\,\mathbf{v}_3 = \tfrac{1}{2\sqrt{3}}(-3, 1, 1, 1), \qquad \mathbf{w}_4 = \frac{\mathbf{v}_4}{\|\mathbf{v}_4\|} = \sqrt{2}\,\mathbf{v}_4 = \tfrac{1}{\sqrt{2}}(0, -1, 0, 1).$$
We have obtained an orthonormal basis $\mathbf{w}_1, \mathbf{w}_2, \mathbf{w}_3, \mathbf{w}_4$ for $\mathbb{R}^4$ that extends the orthonormal basis $\mathbf{w}_1, \mathbf{w}_2$ for the subspace $V$. It follows that $\mathbf{w}_3 = \tfrac{1}{2\sqrt{3}}(-3, 1, 1, 1)$ and $\mathbf{w}_4 = \tfrac{1}{\sqrt{2}}(0, -1, 0, 1)$ form an orthonormal basis for $V^\perp$.
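Both parts can be checked numerically with a generic Gram-Schmidt routine. The sketch below is illustrative only; the function `gram_schmidt` is written here for this purpose, not taken from the exam. It orthonormalizes $\mathbf{x}_1, \mathbf{x}_2$ together with the spanning set found for $V^\perp$ and confirms that the four resulting vectors are orthonormal.

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal list obtained from `vectors` by Gram-Schmidt."""
    basis = []
    for x in vectors:
        v = x - sum((x @ b) * b for b in basis)   # subtract projections onto earlier vectors
        basis.append(v / np.linalg.norm(v))
    return basis

x1, x2 = np.array([1., 1., 1., 1.]), np.array([1., 0., 3., 0.])        # spanning set for V
x3, x4 = np.array([0., -1., 0., 1.]), np.array([-3., 2., 1., 0.])      # spanning set for V-perp

W = np.array(gram_schmidt([x1, x2, x3, x4]))   # rows should match w1..w4 up to sign
print(np.round(W, 4))
print(np.allclose(W @ W.T, np.eye(4)))         # expected: True (orthonormal rows)
```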

Problem 4 (30 pts). Let $A = \begin{pmatrix} 1 & 2 & 0 \\ 1 & 1 & 1 \\ 0 & 2 & 1 \end{pmatrix}$.

(i) Find all eigenvalues of the matrix $A$.

The eigenvalues of $A$ are the roots of the characteristic equation $\det(A - \lambda I) = 0$. We obtain that
$$\det(A - \lambda I) = \begin{vmatrix} 1-\lambda & 2 & 0 \\ 1 & 1-\lambda & 1 \\ 0 & 2 & 1-\lambda \end{vmatrix} = (1-\lambda)^3 - 2(1-\lambda) - 2(1-\lambda)$$
$$= (1-\lambda)\bigl((1-\lambda)^2 - 4\bigr) = (1-\lambda)\bigl((1-\lambda) - 2\bigr)\bigl((1-\lambda) + 2\bigr) = -(\lambda - 1)(\lambda + 1)(\lambda - 3).$$
Hence the matrix $A$ has three eigenvalues: $-1$, $1$, and $3$.

(ii) For each eigenvalue of $A$, find an associated eigenvector.

An eigenvector $\mathbf{v} = (x, y, z)$ of $A$ associated with an eigenvalue $\lambda$ is a nonzero solution of the vector equation $(A - \lambda I)\mathbf{v} = \mathbf{0}$. To solve the equation, we apply row reduction to the matrix $A - \lambda I$.

First consider the case $\lambda = -1$. The row reduction yields
$$A + I = \begin{pmatrix} 2 & 2 & 0 \\ 1 & 2 & 1 \\ 0 & 2 & 2 \end{pmatrix} \to \begin{pmatrix} 1 & 1 & 0 \\ 1 & 2 & 1 \\ 0 & 2 & 2 \end{pmatrix} \to \begin{pmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \\ 0 & 2 & 2 \end{pmatrix} \to \begin{pmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{pmatrix} \to \begin{pmatrix} 1 & 0 & -1 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{pmatrix}.$$
Hence
$$(A + I)\mathbf{v} = \mathbf{0} \iff \begin{pmatrix} 1 & 0 & -1 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{pmatrix}\begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} \iff \begin{cases} x - z = 0, \\ y + z = 0. \end{cases}$$
The general solution is $x = t$, $y = -t$, $z = t$, where $t \in \mathbb{R}$. In particular, $\mathbf{v}_1 = (1, -1, 1)$ is an eigenvector of $A$ associated with the eigenvalue $-1$.

Secondly, consider the case $\lambda = 1$. The row reduction yields
$$A - I = \begin{pmatrix} 0 & 2 & 0 \\ 1 & 0 & 1 \\ 0 & 2 & 0 \end{pmatrix} \to \begin{pmatrix} 1 & 0 & 1 \\ 0 & 2 & 0 \\ 0 & 2 & 0 \end{pmatrix} \to \begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}.$$
Hence
$$(A - I)\mathbf{v} = \mathbf{0} \iff \begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}\begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} \iff \begin{cases} x + z = 0, \\ y = 0. \end{cases}$$
The general solution is $x = -t$, $y = 0$, $z = t$, where $t \in \mathbb{R}$. In particular, $\mathbf{v}_2 = (-1, 0, 1)$ is an eigenvector of $A$ associated with the eigenvalue $1$.

Finally, consider the case $\lambda = 3$. The row reduction yields
$$A - 3I = \begin{pmatrix} -2 & 2 & 0 \\ 1 & -2 & 1 \\ 0 & 2 & -2 \end{pmatrix} \to \begin{pmatrix} 1 & -1 & 0 \\ 1 & -2 & 1 \\ 0 & 2 & -2 \end{pmatrix} \to \begin{pmatrix} 1 & -1 & 0 \\ 0 & -1 & 1 \\ 0 & 2 & -2 \end{pmatrix} \to \begin{pmatrix} 1 & -1 & 0 \\ 0 & 1 & -1 \\ 0 & 0 & 0 \end{pmatrix} \to \begin{pmatrix} 1 & 0 & -1 \\ 0 & 1 & -1 \\ 0 & 0 & 0 \end{pmatrix}.$$
Hence
$$(A - 3I)\mathbf{v} = \mathbf{0} \iff \begin{pmatrix} 1 & 0 & -1 \\ 0 & 1 & -1 \\ 0 & 0 & 0 \end{pmatrix}\begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} \iff \begin{cases} x - z = 0, \\ y - z = 0. \end{cases}$$
The general solution is $x = t$, $y = t$, $z = t$, where $t \in \mathbb{R}$. In particular, $\mathbf{v}_3 = (1, 1, 1)$ is an eigenvector of $A$ associated with the eigenvalue $3$.

(iii) Is the matrix $A$ diagonalizable? Explain.

The matrix $A$ is diagonalizable, i.e., there exists a basis for $\mathbb{R}^3$ formed by its eigenvectors. Namely, the vectors $\mathbf{v}_1 = (1, -1, 1)$, $\mathbf{v}_2 = (-1, 0, 1)$, and $\mathbf{v}_3 = (1, 1, 1)$ are eigenvectors of the matrix $A$ belonging to distinct eigenvalues. Therefore these vectors are linearly independent, and it follows that $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$ is a basis for $\mathbb{R}^3$. Alternatively, the existence of a basis for $\mathbb{R}^3$ consisting of eigenvectors of $A$ already follows from the fact that the matrix $A$ has three distinct eigenvalues.
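A quick numerical sanity check of parts (i)-(iii) (again illustrative, not part of the original solution) compares the hand-computed eigenpairs with `numpy.linalg.eig`.

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [1., 1., 1.],
              [0., 2., 1.]])

# Eigenvalues should be -1, 1, 3 (in some order)
vals, vecs = np.linalg.eig(A)
print(np.sort(vals))          # expected: [-1.  1.  3.]

# Check the hand-computed eigenvectors: A v = lambda v
for lam, v in [(-1, np.array([1., -1., 1.])),
               (1,  np.array([-1., 0., 1.])),
               (3,  np.array([1., 1., 1.]))]:
    print(lam, np.allclose(A @ v, lam * v))   # expected: True for each eigenvalue
```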

(iv) Find all eigenvalues of the matrix $A^2$.

Suppose that $\mathbf{v}$ is an eigenvector of the matrix $A$ associated with an eigenvalue $\lambda$, that is, $\mathbf{v} \neq \mathbf{0}$ and $A\mathbf{v} = \lambda\mathbf{v}$. Then
$$A^2\mathbf{v} = A(A\mathbf{v}) = A(\lambda\mathbf{v}) = \lambda(A\mathbf{v}) = \lambda(\lambda\mathbf{v}) = \lambda^2\mathbf{v}.$$
Therefore $\mathbf{v}$ is also an eigenvector of the matrix $A^2$ and the associated eigenvalue is $\lambda^2$. We already know that the matrix $A$ has eigenvalues $-1$, $1$, and $3$. It follows that $A^2$ has eigenvalues $1$ and $9$.

Since a $3\times 3$ matrix can have up to $3$ eigenvalues, we need an additional argument to show that $1$ and $9$ are the only eigenvalues of $A^2$. The matrix $A$ is diagonalizable. Namely, $A = UBU^{-1}$, where
$$B = \begin{pmatrix} -1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 3 \end{pmatrix}$$
and $U$ is the matrix whose columns are the eigenvectors $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$:
$$U = \begin{pmatrix} 1 & -1 & 1 \\ -1 & 0 & 1 \\ 1 & 1 & 1 \end{pmatrix}.$$
Then $A^2 = UBU^{-1}UBU^{-1} = UB^2U^{-1}$. It follows that
$$\det(A^2 - \lambda I) = \det(UB^2U^{-1} - \lambda I) = \det\bigl(UB^2U^{-1} - U(\lambda I)U^{-1}\bigr)$$
$$= \det\bigl(U(B^2 - \lambda I)U^{-1}\bigr) = \det(U)\det(B^2 - \lambda I)\det(U^{-1}) = \det(B^2 - \lambda I).$$
Thus the matrix $A^2$ has the same characteristic polynomial as the diagonal matrix
$$B^2 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 9 \end{pmatrix}.$$
Consequently, the matrices $A^2$ and $B^2$ have the same eigenvalues. The latter has eigenvalues $1$ and $9$.
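The diagonalization argument can also be checked numerically. The sketch below (illustrative only) builds $U$ and $B$ from the eigenvectors found in part (ii) and confirms that $A = UBU^{-1}$ and that $A^2$ has only the eigenvalues $1$ and $9$.

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [1., 1., 1.],
              [0., 2., 1.]])

U = np.array([[ 1., -1., 1.],
              [-1.,  0., 1.],
              [ 1.,  1., 1.]])       # columns are v1, v2, v3
B = np.diag([-1., 1., 3.])

print(np.allclose(A, U @ B @ np.linalg.inv(U)))   # expected: True
print(np.sort(np.linalg.eigvals(A @ A)))          # expected: [1. 1. 9.]
```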

Bonus Problem 5 (15 pts). Let $L : V \to W$ be a linear mapping of a finite-dimensional vector space $V$ to a vector space $W$. Show that
$$\dim \operatorname{Range}(L) + \dim \ker(L) = \dim V.$$

The kernel $\ker(L)$ is a subspace of $V$. Since the vector space $V$ is finite-dimensional, so is $\ker(L)$. Take a basis $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k$ for the subspace $\ker(L)$, then extend it to a basis $\mathbf{v}_1, \dots, \mathbf{v}_k, \mathbf{u}_1, \mathbf{u}_2, \dots, \mathbf{u}_m$ for the entire space $V$. We are going to prove that the vectors $L(\mathbf{u}_1), L(\mathbf{u}_2), \dots, L(\mathbf{u}_m)$ form a basis for the range $L(V)$. Then $\dim \operatorname{Range}(L) = m$, $\dim \ker(L) = k$, and $\dim V = k + m$, which gives the desired equality.

Spanning: Any vector $\mathbf{w} \in \operatorname{Range}(L)$ is represented as $\mathbf{w} = L(\mathbf{v})$, where $\mathbf{v} \in V$. We have
$$\mathbf{v} = \alpha_1\mathbf{v}_1 + \alpha_2\mathbf{v}_2 + \dots + \alpha_k\mathbf{v}_k + \beta_1\mathbf{u}_1 + \beta_2\mathbf{u}_2 + \dots + \beta_m\mathbf{u}_m$$
for some $\alpha_i, \beta_j \in \mathbb{R}$. It follows that
$$\mathbf{w} = L(\mathbf{v}) = \alpha_1 L(\mathbf{v}_1) + \dots + \alpha_k L(\mathbf{v}_k) + \beta_1 L(\mathbf{u}_1) + \dots + \beta_m L(\mathbf{u}_m) = \beta_1 L(\mathbf{u}_1) + \dots + \beta_m L(\mathbf{u}_m)$$
(here $L(\mathbf{v}_i) = \mathbf{0}$ since $\mathbf{v}_i \in \ker(L)$). Thus $\operatorname{Range}(L)$ is spanned by the vectors $L(\mathbf{u}_1), L(\mathbf{u}_2), \dots, L(\mathbf{u}_m)$.

Linear independence: Suppose that $t_1 L(\mathbf{u}_1) + t_2 L(\mathbf{u}_2) + \dots + t_m L(\mathbf{u}_m) = \mathbf{0}$ for some $t_i \in \mathbb{R}$. Let $\mathbf{u} = t_1\mathbf{u}_1 + t_2\mathbf{u}_2 + \dots + t_m\mathbf{u}_m$. Since
$$L(\mathbf{u}) = t_1 L(\mathbf{u}_1) + t_2 L(\mathbf{u}_2) + \dots + t_m L(\mathbf{u}_m) = \mathbf{0},$$
the vector $\mathbf{u}$ belongs to the kernel of $L$. Therefore $\mathbf{u} = s_1\mathbf{v}_1 + s_2\mathbf{v}_2 + \dots + s_k\mathbf{v}_k$ for some $s_j \in \mathbb{R}$. It follows that
$$t_1\mathbf{u}_1 + t_2\mathbf{u}_2 + \dots + t_m\mathbf{u}_m - s_1\mathbf{v}_1 - s_2\mathbf{v}_2 - \dots - s_k\mathbf{v}_k = \mathbf{u} - \mathbf{u} = \mathbf{0}.$$
Linear independence of the vectors $\mathbf{v}_1, \dots, \mathbf{v}_k, \mathbf{u}_1, \dots, \mathbf{u}_m$ implies that $t_1 = t_2 = \dots = t_m = 0$ (as well as $s_1 = s_2 = \dots = s_k = 0$). Thus the vectors $L(\mathbf{u}_1), L(\mathbf{u}_2), \dots, L(\mathbf{u}_m)$ are linearly independent.
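As a concrete numerical illustration of the theorem (not part of the exam; it assumes SciPy is available for `null_space`), one can take the matrix map $L(\mathbf{x}) = A\mathbf{x}$ with the $2\times 4$ matrix $A$ from Problem 3(ii) and check that the two dimensions add up to $\dim V = 4$.

```python
import numpy as np
from scipy.linalg import null_space   # assumes SciPy is installed

# L(x) = A x maps R^4 to R^2; Range(L) is the column space, ker(L) is the nullspace.
A = np.array([[1., 1., 1., 1.],
              [1., 0., 3., 0.]])

dim_range = np.linalg.matrix_rank(A)     # dimension of the column space
dim_kernel = null_space(A).shape[1]      # dimension of the nullspace
print(dim_range, dim_kernel, dim_range + dim_kernel)   # expected: 2 2 4
```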