LINEAR ALGEBRA 1, 2012-I PARTIAL EXAM 3 SOLUTIONS TO PRACTICE PROBLEMS


Problem (a) For each of the three matrices below, (i) determine whether it is diagonalizable, (ii) determine whether it is orthogonally diagonalizable, and (iii) if it is diagonalizable, find an invertible matrix P and a diagonal matrix D such that A = PDP⁻¹. If the matrix is orthogonally diagonalizable, then try to find such a P which is an orthogonal matrix. HINT: The eigenvalues for the matrix A below are and .

A = , B = 9 , E =

Solution: The matrix A is orthogonally diagonalizable since it is symmetric (see section 5). To diagonalize it, we first find eigenvectors of A. (In the following solution, I will find an orthonormal set of eigenvectors, to show you how to orthogonally diagonalize A; however, for the question as stated, you could use any set of three linearly independent eigenvectors to get a correct answer.)

For the eigenvalue , the associated eigenvectors are the solutions of the linear system corresponding to , whose solution is given by the equations x = x and x = x (with x as a free variable). So (,, ) is an eigenvector. We eventually want an orthonormal basis of R³, so we divide this eigenvector by its norm to get the unit eigenvector u₁.
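The entries of the exam's matrix A did not survive transcription, so as a sketch only, the orthogonal-diagonalization procedure above can be checked numerically on a stand-in symmetric matrix (the matrix below and the use of numpy are my assumptions, not part of the exam):

```python
import numpy as np

# Stand-in symmetric matrix (the exam's actual A was lost in transcription).
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

# eigh is for symmetric (Hermitian) matrices: it returns real eigenvalues in
# ascending order and an orthonormal set of eigenvectors as the columns of P.
eigenvalues, P = np.linalg.eigh(A)
D = np.diag(eigenvalues)

# Orthogonal diagonalization: A = P D P^T, with P^T = P^{-1} since P is orthogonal.
assert np.allclose(A, P @ D @ P.T)
assert np.allclose(P.T @ P, np.eye(3))
```

As in the exam problem, this stand-in matrix has a repeated eigenvalue, so the two eigenvectors spanning that eigenspace must be chosen orthonormal; `eigh` does this automatically, whereas by hand one applies Gram-Schmidt.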

For the eigenvalue , the associated eigenvectors are the solutions of the linear system corresponding to , whose solution is given by the equation x = x x with x and x both free. Choosing (, ) and (, ) for the values of the two free variables in the above solution, we get (,, ) and (,, ), two linearly independent eigenvectors of A with this eigenvalue. Since we want an orthonormal set of eigenvectors, we apply the Gram-Schmidt process to these two vectors to get an orthogonal set (see the solution to the Gram-Schmidt problem below for details), and then we make each of these vectors into a unit vector by dividing it by its norm, to get u₂ and u₃.

Finally, as explained in section 5, we can use the orthonormal set of eigenvectors {u₁, u₂, u₃} and the eigenvalues we found above to construct matrices P and D that orthogonally diagonalize A: P = [u₁ u₂ u₃], D = .

For B, we again start by finding its eigenvalues. B is upper-triangular, so its eigenvalues are the entries on its main diagonal, and . (Alternatively, it is easy to find the eigenvalues of B by solving the characteristic equation, which is 0 = ( − λ)( − λ).) To find an eigenvector corresponding to the first eigenvalue, we solve the system 9 9 . The general solution is that x = and x is free, so one possible eigenvector is (, ). To find an eigenvector corresponding to the second eigenvalue, we solve:

9 The general solution is that x = x and x is free, so one possible eigenvector is (, ). So B is diagonalizable, since the two eigenvectors we have found form a basis of R² (as explained in section 5). Using the technique in section 5, we can let P = , D = . (The columns of P are the two eigenvectors we have found, and the diagonal entries of D are the associated eigenvalues.) However, since the matrix B is not symmetric, it is not orthogonally diagonalizable.

Finally, we consider the matrix E. For this matrix, the characteristic equation is 0 = det(E − λI), and expanding the determinant along the first column, this becomes 0 = ( − λ)³. Therefore, the only eigenvalue of E is (a triple root of the characteristic polynomial). Note that there do exist matrices with only one eigenvalue which are diagonalizable. (For practice, you could try to find an example of such a matrix.) So we are not yet finished with this problem!

The eigenspace of E for this eigenvalue is the null space of the matrix . This null space is the set of all (x , x , x ) in R³ such that x = , which has as a basis the set {(,, ), (,, )}. So the dimension of this eigenspace is only 2. This means that the geometric multiplicity of the eigenvalue is 2, but its algebraic multiplicity is 3 (as we saw in the characteristic equation above). Therefore, E is not diagonalizable.

(b) Give an example of a matrix C which is not similar to the matrix A in part (a). (The definition from section 5 of the textbook says: C is similar to A if and only if there is an invertible matrix P such that C = P⁻¹AP.)

Solution: Any two similar matrices have the same set of eigenvalues, so we just have to find a matrix C whose set of eigenvalues is different from { , }; there are many possible correct answers here, such as C = (whose eigenvalues are , , and ).

Problem (a) Find an orthonormal basis for the subspace W of R³ spanned by , .

Solution: Call this subspace S. First, note that the given set is linearly independent (since neither vector is a multiple of the other), so it is a basis for S. So we can use the Gram-Schmidt process (section 4, Theorem ) to find an orthogonal basis of S: let v₁ = , v₂ = . Then v₁ and v₂ are orthogonal (since v₁ · v₂ = 0), but to get an orthonormal basis, we must divide each vector by its norm. The norm of v₁ is and the norm of v₂ is , so an orthonormal basis of S is { , }.

(b) Extend the basis you found in (a) to an orthonormal basis of R³.

Solution: This comes down to finding a vector in the orthogonal complement of S. As explained in section , the orthogonal complement of S is

equal to the nullspace of the matrix . This matrix row-reduces to , and the solution of the associated homogeneous linear system is given by the equations x = x x and x = x (with x free). Using the value of for x , we get the solution (,, ), and we can normalize this to get the unit vector v₃. So {v₁, v₂, v₃} is an orthonormal basis of R³.

(c) Is there a unique way to extend the basis you found in (a) to an orthonormal basis of R³? Explain.

No: the vector v₃ found above is not the only way to extend {v₁, v₂} to an orthonormal basis of R³, since −v₃ works too; it is also a unit vector orthogonal to v₁ and v₂. (But it turns out that v₃ and −v₃ are the only two possibilities.)

(d) Find the orthogonal projection of the vector (,, ) onto the subspace W.

Solution: We will use the orthonormal basis for W that we found in part (a); call this basis {w₁, w₂}, and write y = (,, ) for the vector to be projected. The orthogonal projection of y onto W is (y · w₁)w₁ + (y · w₂)w₂ = w₁ + w₂ (note that w₁ · w₁ = 1 and w₂ · w₂ = 1 because the basis is orthonormal) = ( / , / , / ).
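The vectors in Problem 2 did not survive transcription either, so the Gram-Schmidt and projection steps above can be sketched numerically on stand-in vectors (the spanning vectors and the projected vector y below are my assumptions, not the exam's data):

```python
import numpy as np

# Stand-in spanning set for W (the exam's actual vectors were lost).
a1 = np.array([1.0, 1.0, 0.0])
a2 = np.array([1.0, 0.0, 1.0])

# Gram-Schmidt: subtract from a2 its component along v1, then normalize both.
v1 = a1
v2 = a2 - (a2 @ v1) / (v1 @ v1) * v1
w1 = v1 / np.linalg.norm(v1)
w2 = v2 / np.linalg.norm(v2)
assert np.isclose(w1 @ w2, 0.0)          # {w1, w2} is an orthonormal basis of W

# Orthogonal projection of y onto W via the orthonormal basis:
#   proj_W(y) = (y . w1) w1 + (y . w2) w2
y = np.array([1.0, 2.0, 3.0])
proj = (y @ w1) * w1 + (y @ w2) * w2

# The residual y - proj lies in the orthogonal complement of W,
# exactly as in part (b) above.
assert np.isclose((y - proj) @ w1, 0.0)
assert np.isclose((y - proj) @ w2, 0.0)
```

Note that the projection formula with plain dot products is valid only because {w1, w2} is orthonormal; for a merely linearly independent basis one would have to divide each term by wᵢ · wᵢ, which is the point the solution's parenthetical remark makes.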