Final Exam Practice Problems Answers Math 24 Winter 2012


(1) The Jordan product of two n x n matrices is defined as A ∘ B = (1/2)(AB + BA), where the products inside the parentheses are standard matrix products. Is the set of all n x n matrices, with standard scalar multiplication and with vector addition defined as the Jordan product, a vector space?

No: we would need r(A ∘ B) = (rA) ∘ (rB), but the latter actually equals r^2 (A ∘ B).

(2) Let V_1 = ( , , , ), V_2 = ( , , , ), and V_3 = ( , , , ).

(a) Show that {V_1, V_2, V_3} is an orthogonal set.

Calculate the inner product of each pair of vectors, to show that each is 0.

(b) Find a V_4 so that {V_1, V_2, V_3, V_4} is an orthogonal basis for R^4.

To find such a vector, solve the system of linear equations obtained by setting the inner product of the unknown vector with each of the given vectors equal to zero. Since the constants are all zero, you may do this by row reducing the un-augmented coefficient matrix, whose reduced echelon form is

[ 1 0 0 -1 ]
[ 0 1 0  1 ]
[ 0 0 1  1 ]

This tells you that the sum of the second and fourth entries and the sum of the third and fourth entries must be zero, and the first and last entries must be equal. A vector with these properties is (1, -1, -1, 1); the possible answers are that vector and its nonzero scalar multiples.

(c) Turn your basis from (b) into an orthonormal basis.

Each of the vectors in the basis has inner product 4 with itself, and hence length 2, so divide each by 2.

(3) For each matrix below: (a) find the inverse or show that it does not exist; (b) find the characteristic polynomial, all eigenvalues, and their associated eigenspaces; and (c) diagonalize the matrix if possible, giving a basis relative to which it has that diagonal form.

6 4 3 6 0 ,    0 3 4 0 0 ,    [ 3 1 1 ]
                              [ 1 3 1 ]
                              [ 1 1 3 ]

Hint for the third matrix: 5 is an eigenvalue. Though the first matrix has an unpleasant characteristic polynomial computation, it factors without too much difficulty. In each case I will do part (b) first, since it tells me whether the matrix is invertible, but I will write out all of the work only for the first matrix.

The characteristic polynomial of the first matrix is

det(A - xI) = 9x - 8x^2 - x^3 = -x(x + 9)(x - 1).

The eigenvalues are 0, 9, and. Since 0 is an eigenvalue, the answer to (a) is that this is noninvertible. To find the eigenspace for eigenvalue 0 we row reduce the given matrix: 0 0 0 0 0 0 This tells us the eigenspace for 0 is the span of (, 0 ). For eigenvalue we row reduce the following matrix. 6 4 0 5/4 0 7/4 6 0 0 0 The eigenspace for is the span of ( 5/4, 7/4, ). Finally, for eigenvalue 9 we row reduce the following matrix. 6 8 4 0 0 0 / 6 0 0 0 The eigenspace for 9 is spanned by (0, /, ). The matrix is diagonalizable, with diagonal 0,, 9 relative to the basis {(, 0, ), ( 5, 7, 4), (0,, )}. For the second matrix, the characteristic polynomial is simply ( x)(4 x)( x), so the eigenvalues are, 4, and. The matrix is diagonalizable with the diagonal those numbers in order, and one basis relative to which it has that form is {(5, 3, 0), (0,, 0), (, 0, 3)}; those vectors in order span each of the eigenspaces. To invert the second matrix, row reduce the augmented matrix below. 0 0 0 0 0 0 3 4 0 0 0 4 3 0 0 0 0 0 0 0 0 0 / 0 0 0 / 0 4 0 3 0 0 0 0 / 0 0 0 / 0 0 3/4 /4 /4 0 0 0 0 / The last three columns of that last matrix are the inverse to the given matrix. Finally, for the third matrix the characteristic polynomial is 0 4x + 9x x 3 = (x ) (x 5). The eigenvalues are with multiplicity and 5 with multiplicity. Row reduction gives that the eigenspace for 5 is spanned by (,, ), and the eigenspace for is the plane x + y + z = 0, spanned by (among other options) {(,, 0), (, 0, )}. Since the geometric multiplicities match the algebraic multiplicities the matrix is diagonalizable. Furthermore, since 0 is not an eigenvalue, the matrix is invertible, though its inverse is a fractional nightmare. (4) For each pair of matrix properties below, there are three possible relationships: If you have x you must have y and vice-versa, if you have x you must have y but you may have y without x, or you can never have x and y simultaneously. Determine which one holds between each pair. 
Does your answer change if you consider only matrices that are neither the zero matrix nor the identity matrix?

(a) diagonal  (b) symmetric  (c) nilpotent  (d) idempotent  (e) invertible

Each of (b) through (e) may be satisfied without any of the others. Every matrix satisfying (a) satisfies (b), but not vice versa (a matrix, at least 2 x 2 in size, with all entries equal and nonzero is symmetric but not diagonal). Only the zero matrix satisfies both (a) and (c), and the matrices satisfying both (a) and (d) are exactly the projections written relative to a suitable basis (i.e., a basis relative to which they have a diagonal of 0s and 1s). A diagonal matrix whose diagonal entries are all nonzero is invertible, but not every diagonal matrix is invertible. The only matrix satisfying both (b) and (c) is again the zero matrix (symmetry ensures that any nonzero entries line up with each other and propagate under matrix multiplication). Of course any diagonal idempotent matrix is a symmetric idempotent matrix, and while the identity is a symmetric invertible matrix, so is the 3 x 3 matrix with 1s running from the lower left to the upper right and 0s elsewhere, which is in fact its own inverse. The only matrix satisfying both (c) and (d) is the zero matrix. There is no matrix satisfying both (c) and (e), and the only matrix satisfying both (d) and (e) is the identity matrix.

(5) Determine conditions on h and k such that the following system of linear equations has (i) infinitely many solutions, (ii) a unique solution, and (iii) no solutions.

x + 3y = k
4x + hy = 8

One step of row reduction turns the second equation into (h - 12)y = 8 - 4k. We get infinitely many solutions if both of those quantities are 0, leaving us with just x + 3y = k; that is, if h = 12 and k = 2. If h = 12 but k ≠ 2, we have no solutions, because y times 0 would have to equal something nonzero. If h ≠ 12, we have a unique solution, to wit:

x = k - 3(8 - 4k)/(h - 12),    y = (8 - 4k)/(h - 12).

(6) Assuming A, B, C, X are all n x n matrices and the first three are invertible, solve AX + B = CA for X.

We may subtract B from both sides and then multiply by A^(-1) on the left to solve for X:

X = A^(-1)(CA - B) = A^(-1)CA - A^(-1)B.

No further simplification is possible in the absence of additional information about the matrices.
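The algebra in (6) is easy to sanity-check with exact rational arithmetic. A minimal Python sketch (the matrices A, B, C below are arbitrary invertible 2 x 2 examples chosen for illustration, not taken from the problem):

```python
from fractions import Fraction as F

def matmul(X, Y):
    # product of two 2x2 matrices
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(X):
    # inverse of a 2x2 matrix via the adjugate formula
    det = X[0][0] * X[1][1] - X[0][1] * X[1][0]
    return [[X[1][1] / det, -X[0][1] / det],
            [-X[1][0] / det, X[0][0] / det]]

# arbitrary invertible examples
A = [[F(2), F(1)], [F(1), F(1)]]
B = [[F(0), F(3)], [F(1), F(2)]]
C = [[F(1), F(2)], [F(3), F(5)]]

Ainv = inv2(A)
# X = A^{-1}CA - A^{-1}B
AX = matmul(A, [[matmul(Ainv, matmul(C, A))[i][j] - matmul(Ainv, B)[i][j]
                 for j in range(2)] for i in range(2)])
lhs = [[AX[i][j] + B[i][j] for j in range(2)] for i in range(2)]

# X should satisfy AX + B = CA
assert lhs == matmul(C, A)
```

Because the solution was obtained by reversible steps, the check must succeed for any invertible A and any B, C of matching size.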

(7) Suppose the matrix A has eigenvalue -2, with eigenvector (1, -1), and eigenvalue 5, with eigenvector (1, 1). Use change of basis to find A.

To use change of basis, we need the matrices that map vectors to and from coordinates relative to the basis B = {(1, -1), (1, 1)}. Relative to B, A is a diagonal matrix with diagonal entries -2, 5. The columns of a change-of-basis matrix are always the kind of object the matrix is meant to produce: the vectors of B are the columns of the matrix that produces vectors from coordinates, and the B-coordinates of the standard basis vectors are the columns of the matrix that produces coordinates from vectors. To find the B-coordinates of the standard basis, consider the general linear combination of the B vectors: a(1, -1) + b(1, 1) = (a + b, -a + b). To obtain (1, 0) we need a = 1/2, b = 1/2, and to obtain (0, 1) we need a = -1/2, b = 1/2. Therefore, remembering that matrices are applied as transformations from right to left, the desired matrix is

[  1  1 ] [ -2  0 ] [ 1/2  -1/2 ]   [ 3/2  7/2 ]
[ -1  1 ] [  0  5 ] [ 1/2   1/2 ] = [ 7/2  3/2 ]

Note that we did essentially the same arithmetic as if we had found the linear combinations of B that give the standard basis and applied those linear combinations to the images of the B vectors; we have simply done both together, in a slightly different order.

(8) The augmented matrix of a system of three linear equations in three unknowns has been row reduced to the following form. What are the solutions to the system, if any?

5 0 4 8 0 0 0

You may finish the row reduction or do this by eye. The last row forces x_3 = 0. That gives 2x_2 = 8, so x_2 = 4, and hence x_1 + 20 = , so x_1 = .

(9) Let P : V -> V be a projection, and let {A_1, ..., A_k} be a basis for Im P. Suppose that this basis is extended to a basis for all of V, {A_1, ..., A_k, B_1, ..., B_l}.

(a) For each i, 1 <= i <= l, let C_i = B_i - P(B_i). Show that {A_1, ..., A_k, C_1, ..., C_l} is a basis for V.

Since P(B_i) lies in Im(P), it is a linear combination of the vectors A_1, ..., A_k. Therefore, from C_i and the A's we may recover B_i = C_i + P(B_i). Since the set {A_1, ..., A_k, C_1, ..., C_l} spans {A_1, ..., A_k, B_1, ..., B_l}, which in turn spans all of V, the set {A_1, ..., A_k, C_1, ..., C_l} also spans V, and since it has size dim(V) it is a basis.

(b) Find the matrix for P with respect to the basis in part (a).

This is exactly a basis relative to which P has the prototypical projection matrix: a diagonal matrix in which the first k diagonal entries are 1s and the remaining entries are 0s.
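The change-of-basis computation in (7) can be verified by carrying out the product exactly. A small Python sketch (this reads the partially garbled eigendata as eigenvalue -2 with eigenvector (1, -1) and eigenvalue 5 with eigenvector (1, 1), an assumption consistent with the printed answer):

```python
from fractions import Fraction as F

def matmul(X, Y):
    # product of two 2x2 matrices
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(M, v):
    # apply a 2x2 matrix to a vector
    return [M[0][0] * v[0] + M[0][1] * v[1], M[1][0] * v[0] + M[1][1] * v[1]]

# columns of P are the assumed basis vectors (1, -1) and (1, 1)
P = [[F(1), F(1)], [F(-1), F(1)]]
D = [[F(-2), F(0)], [F(0), F(5)]]                   # diagonal form of A
Pinv = [[F(1, 2), F(-1, 2)], [F(1, 2), F(1, 2)]]    # inverse of P

A = matmul(matmul(P, D), Pinv)
assert A == [[F(3, 2), F(7, 2)], [F(7, 2), F(3, 2)]]

# confirm the eigendata: A(1,-1) = -2(1,-1) and A(1,1) = 5(1,1)
assert apply(A, [F(1), F(-1)]) == [F(-2), F(2)]
assert apply(A, [F(1), F(1)]) == [F(5), F(5)]
```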

(10) How many isometries are there from R to R, with the standard inner product?

An isometry must map an orthonormal basis to an orthonormal basis. The only orthonormal bases of R are {(1)} and {(-1)}. Therefore there are two isometries on R: the map that takes each of these bases to itself and the map that swaps them.

(11) In a three-dimensional vector space V, two bases are M = {m_1, m_2, m_3} and N = {n_1, n_2, n_3}. Given the following relationships between M and N, find the change-of-basis matrices from M to N and from N to M.

m_1 = n + n_3
m_2 = n
m_3 = n - n_3

The change-of-basis matrix that takes M-coordinates to N-coordinates is the easy one, since its columns are the N-coordinates of the M-vectors, which are exactly the coefficients in the given linear equations. To find the other change-of-basis matrix you must solve the given equations to obtain the M-coordinates of the N-vectors. There are essentially two guises under which this process can be carried out: invert the first matrix, or work out the equations by hand and then put the results into a matrix. Either way you end up with the same matrix, the inverse of the first.

(12) Let V be a vector space and S ⊆ V be a spanning set for V. Suppose A ⊆ S is linearly independent, but A ∪ {x} is linearly dependent for all x in S - A. Prove A is a basis for V.

If A ∪ {x} is linearly dependent, there is a nontrivial linear combination of those vectors giving the zero vector, and since A itself is linearly independent, the coefficient of x in such a combination must be nonzero. Hence we may solve for x. This means A generates every vector in S - A, and since S spans all of V, so must A. Therefore A is a linearly independent spanning set, otherwise known as a basis.

(13) Determine whether the set {x + x, 3x +, x x + 3} is a basis for P_2(R).

It is probably easier to show that these polynomials form a linearly independent set than to show that they span (contrast #9(a)).

Take a linear combination of these three polynomials giving the zero polynomial; it breaks down into a system of three linear equations, one per degree of x: a c = 0, a + 3b c = 0, and a + b + 3c = 0. Solve these however you like, by row reducing a matrix or not, and you will see that each of a, b, c must be 0. Therefore the only linear combination giving the zero polynomial is the trivial one, and the given vectors are linearly independent. Since P_2(R) has dimension 3, three linearly independent vectors in it automatically form a basis.
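The independence check in (13) amounts to showing that the coefficient matrix of that linear system has nonzero determinant. A short Python sketch (the printed coefficients are partially garbled; the signs used below, a - c = 0, a + 3b - c = 0, and -a + b + 3c = 0, are one consistent reading and should be treated as an assumption):

```python
# rows are the three equations in the unknown coefficients (a, b, c):
#    a      - c = 0
#    a + 3b - c = 0
#   -a +  b + 3c = 0
system = [[1, 0, -1],
          [1, 3, -1],
          [-1, 1, 3]]

def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

# a nonzero determinant means the only solution is a = b = c = 0,
# i.e. the three polynomials are linearly independent
assert det3(system) != 0
```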

(14) Prove that if E_1 and E_2 are eigenspaces for T : V -> V, then either E_1 = E_2 or E_1 ∩ E_2 = {0}.

Let λ_1 and λ_2 denote the eigenvalues associated to E_1 and E_2, respectively. If λ_1 = λ_2, then clearly E_1 = E_2. If not, then eigenvectors for λ_1 are linearly independent from eigenvectors for λ_2, so no nonzero vector can lie in both eigenspaces; the only vector the two eigenspaces can have in common is the one non-eigenvector member of every eigenspace, 0.
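The dichotomy in (14) can be seen concretely in a small example. A Python sketch with a hypothetical matrix chosen for illustration (not from the text): its two eigenspaces are distinct lines, and the linear independence of their spanning vectors is exactly what forces the intersection to be {0}.

```python
# A has eigenvalues 3 and 5, each with a one-dimensional eigenspace
A = [[3, 1], [0, 5]]

def apply(M, v):
    # apply a 2x2 matrix to a vector
    return [M[0][0] * v[0] + M[0][1] * v[1], M[1][0] * v[0] + M[1][1] * v[1]]

v1 = [1, 0]   # spans the eigenspace for eigenvalue 3
v2 = [1, 2]   # spans the eigenspace for eigenvalue 5
assert apply(A, v1) == [3 * x for x in v1]
assert apply(A, v2) == [5 * x for x in v2]

# v1 and v2 are linearly independent (nonzero 2x2 determinant), so a
# vector lying in both spans must be the zero vector
assert v1[0] * v2[1] - v1[1] * v2[0] != 0
```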