Final Review
Written by Victoria Kala, vtkala@math.ucsb.edu, SH 6432u
Office Hours: R 12:30-1:30pm
Last Updated: 11/30/2015

Summary

This review contains notes on sections 4.4-4.7, 5.1-5.3, 6.1, 6.2, and 6.5. For your final, you should:

- Know the previous material we have covered
- Know how to find the coordinates of a vector relative to a given basis
- Know how to find a vector given its coordinate vector relative to a given basis
- Know how to perform a change of basis given two or more bases
- Know how to find the dimension of a vector space
- Know how to find the rank and nullity of a matrix
- Know how to identify whether a scalar is an eigenvalue of a matrix
- Know how to identify whether a vector is an eigenvector of a matrix
- Know how to find eigenvalues and eigenvectors
- Know how to diagonalize a matrix
- Know how to find a power of a diagonal matrix
- Know how to find the dot product of two vectors
- Know how to find the norm, or length, of a vector
- Know how to find the distance between two vectors
- Know how to determine whether two vectors are orthogonal to each other
- Know how to find the angle between two vectors
- Know how to determine whether a set is orthogonal or orthonormal
- Know how to determine whether a matrix is orthogonal
- Know how to find the orthogonal projection of one vector onto another
- Know how to find the coordinate vector of a vector with respect to an orthogonal basis using dot products
- Know how to find the least squares solution of a linear system

If you are not sure whether you know how to do any of the above, you should read the appropriate notes and do some practice problems from your homework and textbook.

Coordinate Systems and Change of Basis

See sections 4.4, 4.7 of your textbook.

Since a basis spans a space, every vector in that space can be written as a linear combination of the basis vectors; i.e., for a basis $B = \{v_1, v_2, \ldots, v_n\}$ of a vector space $V$ and any $x \in V$,
$$x = c_1 v_1 + \cdots + c_n v_n.$$
The vector of the weights $c_1, \ldots, c_n$ is said to be the coordinate vector of $x$ with respect to $B$, i.e.
$$[x]_B = \begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{pmatrix}.$$

We can find a way to change between bases. Let $B = \{b_1, b_2, \ldots, b_n\}$ and $C = \{c_1, c_2, \ldots, c_n\}$ be bases of a vector space $V$. Then there exists an $n \times n$ matrix $P_{B \to C}$ such that
$$[x]_C = P_{B \to C}\,[x]_B \quad \text{and} \quad [x]_B = P_{C \to B}\,[x]_C, \quad \text{where } P_{B \to C}^{-1} = P_{C \to B}.$$
$P_{B \to C}$ is called the change of coordinates matrix from $B$ to $C$. To find $P_{B \to C}$, set up the augmented matrix
$$\left( \begin{array}{ccc|ccc} c_1 & \cdots & c_n & b_1 & \cdots & b_n \end{array} \right)$$
and row reduce to get $\left( \begin{array}{c|c} I & P_{B \to C} \end{array} \right)$.

Dimension of a Vector Space

See sections 4.5, 4.6 of your textbook.

The dimension of a vector space is the number of elements in a basis for it. In particular, the dimension of $\mathrm{Null}(A)$, or nullity, is the number of free variables in the equation $Ax = 0$, and the dimension of $\mathrm{Col}(A)$, or rank, is the number of pivot columns of $A$. If $U$ is a subspace of a vector space $V$, then $\dim U \le \dim V$.

Theorem (Rank-Nullity Theorem). If $A$ is an $m \times n$ matrix, then
$$\mathrm{rank}(A) + \mathrm{nullity}(A) = n.$$
If $A$ is an $m \times n$ matrix with rank $r$, then

    dim of:  Col(A)   Row(A)   Null(A)   Null(A^T)
    is:      r        r        n - r     m - r
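As a quick worked illustration of the theorem (the matrix here is just a small example): if $A = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \end{pmatrix}$, row reducing gives $\begin{pmatrix} 1 & 2 & 3 \\ 0 & 0 & 0 \end{pmatrix}$, which has one pivot column and two free variables, so $\mathrm{rank}(A) = 1$ and $\mathrm{nullity}(A) = 2$, and indeed $1 + 2 = 3 = n$. Likewise $\dim \mathrm{Row}(A) = r = 1$ and $\dim \mathrm{Null}(A^T) = m - r = 2 - 1 = 1$.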

Eigenvalues and Eigenvectors

See sections 5.1, 5.2 of your textbook.

A scalar $\lambda$ is said to be an eigenvalue of a square matrix $A$ if there is a nonzero eigenvector $x$ such that
$$Ax = \lambda x.$$
We can rearrange this equation to get
$$(A - \lambda I)x = 0. \tag{1}$$
Since $x$ is nonzero, it must be that
$$\det(A - \lambda I) = 0. \tag{2}$$
We use equation (2) to find the eigenvalues of a matrix. After we find the eigenvalues, we then use equation (1) to find their eigenvectors. Equation (2) yields a polynomial equation; this is sometimes called the characteristic equation.

A matrix $A$ is said to be similar to a matrix $B$ if there exists an invertible matrix $P$ such that
$$A = PBP^{-1}.$$

Some nice facts:
- The eigenvalues of a triangular or diagonal matrix are the entries on its main diagonal.
- If a matrix has eigenvalue 0, then it is not invertible. (See practice problems.)
- If two matrices are similar, then they have the same eigenvalues. (See practice problems.)
- The eigenvectors corresponding to an eigenvalue, together with the zero vector, form a subspace called the eigenspace of that eigenvalue.

Diagonalization

See section 5.3 of your textbook.

Sometimes we wish to find powers of matrices, like $A^{100}$, but it would be a very difficult and tedious process to multiply the matrix $A$ out 100 times. To make this process easier we diagonalize the matrix: if $A$ is an $n \times n$ matrix with $n$ linearly independent eigenvectors, then
$$A = PDP^{-1}$$
where $P$ is the matrix whose columns are eigenvectors of $A$,
$$P = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix},$$
and $D$ is the diagonal matrix of the corresponding eigenvalues of $A$,
$$D = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{pmatrix}.$$
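For example, let $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$. Then $\det(A - \lambda I) = (2 - \lambda)^2 - 1 = (\lambda - 1)(\lambda - 3)$, so the eigenvalues are $\lambda = 3$ and $\lambda = 1$. Solving $(A - 3I)x = 0$ gives the eigenvector $v_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$, and solving $(A - I)x = 0$ gives $v_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$, so we may take $P = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$ and $D = \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix}$, with the eigenvalues in $D$ listed in the same order as their eigenvectors in $P$.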

If $A = PDP^{-1}$, then
$$A^k = A A \cdots A = (PDP^{-1})(PDP^{-1}) \cdots (PDP^{-1}) = PD(P^{-1}P)D(P^{-1}P) \cdots (P^{-1}P)DP^{-1} = P D D \cdots D P^{-1} = PD^kP^{-1}.$$
If
$$D = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{pmatrix}
\quad \text{then} \quad
D^k = \begin{pmatrix} \lambda_1^k & 0 & \cdots & 0 \\ 0 & \lambda_2^k & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n^k \end{pmatrix}.$$

Dot Product, Length, and Orthogonality

See section 6.1 of your textbook.

Let $u, v \in \mathbb{R}^n$. The dot product, or inner product, $u \cdot v$ is defined to be
$$u \cdot v = u^T v = \begin{pmatrix} u_1 & u_2 & \cdots & u_n \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix} = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n.$$

The norm, or length, of a vector $u \in \mathbb{R}^n$ is defined by
$$\|u\| = \sqrt{u \cdot u} = \sqrt{u_1^2 + u_2^2 + \cdots + u_n^2}.$$
Notice that from this definition we have $\|u\|^2 = u \cdot u$.

The distance between two vectors $u, v \in \mathbb{R}^n$ is given by $\|u - v\|$.

The vectors $u$ and $v$ are said to be orthogonal if and only if $u \cdot v = 0$.
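For example, if $u = (1, 2, 2)$ and $v = (2, -1, 0)$, then $u \cdot v = (1)(2) + (2)(-1) + (2)(0) = 0$, so $u$ and $v$ are orthogonal; also $\|u\| = \sqrt{1 + 4 + 4} = 3$, $\|v\| = \sqrt{5}$, and the distance between them is $\|u - v\| = \|(-1, 3, 2)\| = \sqrt{14}$.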

The dot product of two vectors $u, v \in \mathbb{R}^n$ is also given by the formula
$$u \cdot v = \|u\| \, \|v\| \cos\theta,$$
where $\theta$ is the angle between $u$ and $v$. We can rearrange this formula to find the angle between two vectors:
$$\theta = \cos^{-1}\left( \frac{u \cdot v}{\|u\| \, \|v\|} \right).$$

Properties:
- $u \cdot v = v \cdot u$ (this says order doesn't matter)
- $(u + v) \cdot w = u \cdot w + v \cdot w$
- $(cu) \cdot v = c(u \cdot v) = u \cdot (cv)$
- $u \cdot u \ge 0$, and $u \cdot u = 0$ if and only if $u = 0$
- $\|cv\| = |c| \, \|v\|$ for any $c \in \mathbb{R}$

Orthogonal Sets

See section 6.2 of your textbook.

A set of vectors $\{u_1, \ldots, u_n\}$ is said to be an orthogonal set if each pair of distinct vectors in the set is orthogonal, that is, $u_i \cdot u_j = 0$ for all $i \ne j$. A basis is said to be an orthogonal basis if it is also an orthogonal set.

Let $\{u_1, \ldots, u_n\}$ be an orthogonal basis for a subspace $W$ of $\mathbb{R}^n$. Each $y \in W$ can be written as
$$y = c_1 u_1 + \cdots + c_n u_n, \quad \text{where } c_j = \frac{y \cdot u_j}{u_j \cdot u_j}, \quad j = 1, \ldots, n.$$
The term
$$\frac{y \cdot u_j}{u_j \cdot u_j}\, u_j$$
is said to be the projection of $y$ onto $u_j$.

A unit vector $u$, sometimes called a normalized vector, is a vector that has length one, that is, $\|u\| = 1$. To normalize a vector $v$, we divide the vector by its length: $u = \dfrac{v}{\|v\|}$.

An orthonormal set is a set of orthogonal unit vectors. A square matrix $A$ is said to be orthogonal if and only if its column vectors are orthonormal; that is, $A^T A = I$.
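For example, $u_1 = (1, 1)$ and $u_2 = (1, -1)$ form an orthogonal basis of $\mathbb{R}^2$ since $u_1 \cdot u_2 = 0$. For $y = (3, 1)$ we get $c_1 = \frac{y \cdot u_1}{u_1 \cdot u_1} = \frac{4}{2} = 2$ and $c_2 = \frac{y \cdot u_2}{u_2 \cdot u_2} = \frac{2}{2} = 1$, so $y = 2u_1 + u_2$, and the projection of $y$ onto $u_1$ is $2u_1 = (2, 2)$. Dividing each basis vector by its length gives the orthonormal set $\left\{ \tfrac{1}{\sqrt{2}}(1, 1), \tfrac{1}{\sqrt{2}}(1, -1) \right\}$, and the matrix with these two vectors as columns is an orthogonal matrix.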

Least Squares Problems

See section 6.5 of your textbook.

A least squares solution of the system $Ax = b$ is a vector $\hat{x}$ such that
$$\|b - A\hat{x}\| \le \|b - Ax\| \quad \text{for all } x.$$
To find a least squares solution, we solve the system of normal equations
$$A^T A \hat{x} = A^T b$$
for $\hat{x}$. The orthogonal projection of $b$ onto the column space of $A$ is the product $A\hat{x}$.
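For example, the system $Ax = b$ with $A = \begin{pmatrix} 1 & 0 \\ 0 & 1 \\ 1 & 1 \end{pmatrix}$ and $b = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}$ has no exact solution. Since $A^T A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ and $A^T b = \begin{pmatrix} 2 \\ 2 \end{pmatrix}$, solving $A^T A \hat{x} = A^T b$ gives $\hat{x} = \begin{pmatrix} 2/3 \\ 2/3 \end{pmatrix}$, and the orthogonal projection of $b$ onto $\mathrm{Col}(A)$ is $A\hat{x} = \begin{pmatrix} 2/3 \\ 2/3 \\ 4/3 \end{pmatrix}$.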