Study Guide for Linear Algebra Exam 2

Vector space
A vector space is a nonempty set V of objects, on which are defined two operations, called addition and multiplication by scalars (real numbers), subject to the ten axioms (or rules) listed below. The axioms must hold for all vectors u, v, and w in V and for all scalars c and d.
1. The sum of u and v, denoted by u + v, is in V.
2. u + v = v + u
3. (u + v) + w = u + (v + w)
4. There is a zero vector 0 in V such that u + 0 = u
5. For each u in V, there is a vector -u in V such that u + (-u) = 0
6. The scalar multiple of u by c, denoted by cu, is in V
7. c(u + v) = cu + cv
8. (c + d)u = cu + du
9. c(du) = (cd)u
10. 1u = u

Subspace
A subspace of a vector space V is a subset H of V that has three properties:
a. The zero vector of V is in H
b. H is closed under vector addition. That is, for u and v in H, the sum u + v is in H
c. H is closed under multiplication by scalars. That is, for each u in H and each scalar c, the vector cu is in H

Null space
The null space of an m x n matrix A, written Nul A, is the set of all solutions to the homogeneous equation Ax = 0. In set notation: Nul A = { x : x is in R^n and Ax = 0 }

Column space
The column space of an m x n matrix A, written Col A, is the set of all linear combinations of the columns of A. If A = [ a_1, ..., a_n ], then Col A = Span{ a_1, ..., a_n }

Linear transformation
A linear transformation T from a vector space V into a vector space W is a rule that assigns to each vector x in V a unique vector T(x) in W, such that
i. T(u + v) = T(u) + T(v) for all u, v in V, and
ii. T(cu) = cT(u) for all u in V and all scalars c
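The subspace properties of Nul A can be checked numerically. A minimal numpy sketch, using a made-up 2 x 3 matrix A (not from the guide): two vectors in Nul A are built from the free variable, and closure under addition and scalar multiplication is verified directly.

```python
import numpy as np

# Hypothetical 2x3 matrix A; Nul A = { x in R^3 : Ax = 0 }
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])

# From the echelon form, free variable x3 = t gives x = t * (-1, -1, 1)
u = np.array([-1.0, -1.0, 1.0])
v = 2.0 * u  # another vector in Nul A

# Both vectors really solve Ax = 0
assert np.allclose(A @ u, 0)
assert np.allclose(A @ v, 0)

# Closure under addition and scalar multiplication (properties b and c)
assert np.allclose(A @ (u + v), 0)
assert np.allclose(A @ (5.0 * u), 0)
print("Nul A closure checks passed")
```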

Linear independence
An indexed set of vectors { v_1, ..., v_p } is said to be linearly independent if the vector equation c_1 v_1 + c_2 v_2 + ... + c_p v_p = 0 has only the trivial solution c_1 = 0, ..., c_p = 0

Linear dependence
An indexed set of vectors { v_1, ..., v_p } is said to be linearly dependent if the vector equation c_1 v_1 + c_2 v_2 + ... + c_p v_p = 0 has a nontrivial solution, that is, if there are some weights c_1, ..., c_p, not all zero, such that the equation holds

Linear dependence relation
In such a case (above), the equation c_1 v_1 + ... + c_p v_p = 0 is called a linear dependence relation among v_1, ..., v_p

Basis
Let H be a subspace of a vector space V. An indexed set of vectors B = { b_1, ..., b_p } in V is a basis for H if
i. B is a linearly independent set, and
ii. the subspace spanned by B coincides with H; that is, H = Span{ b_1, ..., b_p }

Coordinate vector
Suppose the set B = { b_1, ..., b_n } is a basis for V and x is in V. The coordinates of x relative to the basis B (or the B-coordinates of x) are the weights c_1, ..., c_n such that x = c_1 b_1 + ... + c_n b_n

Dimension
If V is spanned by a finite set, it is said to be finite-dimensional, and the dimension of V, written dim V, is the number of vectors in a basis for V. The dimension of the zero space { 0 } is defined to be zero. If V is not spanned by a finite set, it is said to be infinite-dimensional

Rank
The rank of A is the dimension of the column space of A

Change of coordinates matrix
P_B = [ b_1  b_2  ...  b_n ]; then x = P_B [x]_B
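Finding B-coordinates amounts to solving P_B [x]_B = x. A minimal numpy sketch, with a made-up basis for R^2 (the vectors b1, b2 and x are illustrative, not from the guide):

```python
import numpy as np

# Hypothetical basis B = {b1, b2} for R^2 and a vector x
b1 = np.array([1.0, 1.0])
b2 = np.array([1.0, -1.0])
P_B = np.column_stack([b1, b2])   # change-of-coordinates matrix

x = np.array([3.0, 1.0])

# Solve P_B [x]_B = x for the weights c1, c2
x_B = np.linalg.solve(P_B, x)
print(x_B)                        # → [2. 1.]

# Verify the unique representation x = c1*b1 + c2*b2
assert np.allclose(x, x_B[0] * b1 + x_B[1] * b2)
```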

Eigenvalue
A scalar λ is called an eigenvalue of A if there is a nontrivial solution x of Ax = λx; such an x is called an eigenvector corresponding to λ

Eigenvector
An eigenvector of an n x n matrix A is a nonzero vector x such that Ax = λx for some scalar λ

Eigenspace
The set of all solutions to (A - λI)x = 0, which is the null space of (A - λI). This is called the eigenspace of A corresponding to λ

Diagonalizable
A square matrix A is said to be diagonalizable if A is similar to a diagonal matrix, that is, A = PDP^-1 for some invertible matrix P and some diagonal matrix D

Similar matrices
We say that A is similar to B if there is an invertible matrix P such that P^-1 A P = B, or equivalently A = PBP^-1

Theorems (Chapter.TheoremNumber)

3.2 If A is a triangular matrix, then det A is the product of the entries on the main diagonal of A.

3.3 (row operations) Let A be a square matrix.
a. If a multiple of one row of A is added to another row to produce a matrix B, then det B = det A
b. If two rows of A are interchanged to produce B, then det B = -det A
c. If one row of A is multiplied by k to produce B, then det B = k * det A

3.4 A square matrix A is invertible if and only if det A != 0

3.5 If A is an n x n matrix, then det A^T = det A
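Theorem 5.1 below (eigenvalues of a triangular matrix are its diagonal entries) and the defining equation Ax = λx can both be checked with numpy. A minimal sketch with a made-up triangular matrix:

```python
import numpy as np

# Hypothetical triangular matrix; its eigenvalues should be the
# diagonal entries 3 and 2
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(sorted(eigvals))            # → [2.0, 3.0]

# Each column of eigvecs is an eigenvector: Ax = lambda * x
for lam, x in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ x, lam * x)
```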

3.6 (multiplicative property) If A and B are n x n matrices, then det AB = (det A)(det B)

Formula for det A as a product of pivots:
det A = (-1)^r * (product of pivots in U), when A is invertible
det A = 0, when A is not invertible
(where U is an echelon form of A obtained by row replacements and r row interchanges)

4.1 If v_1, ..., v_p are in a vector space V, then Span{ v_1, ..., v_p } is a subspace of V

4.2 The null space of an m x n matrix A is a subspace of R^n. Equivalently, the set of all solutions to a system Ax = 0 of m homogeneous linear equations in n unknowns is a subspace of R^n
PROOF: Nul A is a subset of R^n because A has n columns. Show that Nul A satisfies the three properties of subspaces:
1. The zero vector 0 is in Nul A. Now let u and v represent two vectors in Nul A, so Au = 0 and Av = 0
2. A(u + v) = Au + Av = 0 + 0 = 0 (using a property of matrix multiplication), so Nul A is closed under vector addition
3. A(cu) = c(Au) = c(0) = 0 (where c is any scalar), which shows that cu is in Nul A
Thus, Nul A is a subspace of R^n

4.3 The column space of an m x n matrix A is a subspace of R^m

4.4 An indexed set { v_1, ..., v_p } of two or more vectors, with v_1 != 0, is linearly dependent if and only if some v_j (with j > 1) is a linear combination of the preceding vectors v_1, ..., v_{j-1}

4.5 (spanning set theorem) Let S = { v_1, ..., v_p } be a set in V and let H = Span{ v_1, ..., v_p }.
a. If one of the vectors in S, say v_k, is a linear combination of the remaining vectors in S, then the set formed from S by removing v_k still spans H
b. If H != { 0 }, some subset of S is a basis for H

4.6 The pivot columns of a matrix A form a basis for Col A
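The multiplicative property and the pivot formula for det A can be illustrated numerically. A minimal sketch with made-up 2 x 2 matrices; the echelon form U here is reached with a single row replacement (no interchanges, so r = 0):

```python
import numpy as np

# Hypothetical matrices for Theorems 3.2, 3.6 and the pivot formula
A = np.array([[2.0, 1.0],
              [4.0, 5.0]])
B = np.array([[1.0, 0.0],
              [3.0, 2.0]])

# Multiplicative property: det(AB) = (det A)(det B)
assert np.isclose(np.linalg.det(A @ B),
                  np.linalg.det(A) * np.linalg.det(B))

# Row replacement R2 <- R2 - 2*R1 leaves det unchanged and yields U
# with pivots 2 and 3, so det A = (-1)^0 * 2 * 3 = 6
U = np.array([[2.0, 1.0],
              [0.0, 3.0]])
assert np.isclose(np.linalg.det(A), np.prod(np.diag(U)))
print("determinant identities verified")
```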

4.7 (unique representation theorem) Let B = { b_1, ..., b_n } be a basis for a vector space V. Then for each x in V, there exists a unique set of scalars c_1, ..., c_n such that x = c_1 b_1 + ... + c_n b_n
PROOF: Since B spans V, there exist scalars such that x = c_1 b_1 + ... + c_n b_n holds. Suppose x also has the representation x = d_1 b_1 + ... + d_n b_n for scalars d_1, ..., d_n. Then, subtracting, we have 0 = x - x = (c_1 - d_1) b_1 + ... + (c_n - d_n) b_n. Since B is linearly independent, the weights in this equation must all be zero. That is, c_j = d_j for 1 <= j <= n

4.8 Let B = { b_1, ..., b_n } be a basis for a vector space V. Then the coordinate mapping x -> [x]_B is a one-to-one linear transformation from V onto R^n

4.9 If a vector space V has a basis B = { b_1, ..., b_n }, then any set in V containing more than n vectors must be linearly dependent

4.10 If a vector space V has a basis of n vectors, then every basis of V must consist of exactly n vectors

4.12 (basis theorem) Let V be a p-dimensional vector space, p >= 1. Any linearly independent set of exactly p elements in V is automatically a basis for V. Any set of exactly p elements that spans V is automatically a basis for V.

4.13 If two matrices A and B are row equivalent, then their row spaces are the same. If B is in echelon form, the nonzero rows of B form a basis for the row space of A as well as of B

4.14 (the rank theorem) The dimensions of the column space and the row space of an m x n matrix A are equal. This common dimension, the rank of A, also equals the number of pivot positions in A and satisfies the equation rank A + dim Nul A = n

IMT (continued) Let A be an n x n matrix. Then the following statements are each equivalent to the statement that A is an invertible matrix.
m. The columns of A form a basis for R^n
n. Col A = R^n
o. dim Col A = n
p. rank A = n
q. Nul A = { 0 }
r. dim Nul A = 0

5.1 The eigenvalues of a triangular matrix are the entries on its main diagonal.

IMT (the end!) Let A be an n x n matrix. Then A is invertible if and only if:
s. The number 0 is not an eigenvalue of A

5.2 If v_1, ..., v_r are eigenvectors that correspond to distinct eigenvalues λ_1, ..., λ_r of an n x n matrix A, then the set { v_1, ..., v_r } is linearly independent

5.3 (properties of determinants) Let A and B be n x n matrices.
a. A is invertible if and only if det A != 0
b. det AB = (det A)(det B)
c. det A^T = det A
d. If A is triangular, then det A is the product of the entries on the main diagonal of A
e. A row replacement operation on A does not change the determinant. A row interchange changes the sign of the determinant. Scaling a row by k scales the determinant by the same factor k

5.4 If n x n matrices A and B are similar, then they have the same characteristic polynomial and hence the same eigenvalues (with the same multiplicities)

5.5 (the diagonalization theorem) An n x n matrix A is diagonalizable if and only if A has n linearly independent eigenvectors. In fact, A = PDP^-1, with D a diagonal matrix, if and only if the columns of P are n linearly independent eigenvectors of A. In this case, the diagonal entries of D are the eigenvalues of A that correspond, respectively, to the eigenvectors in P.

5.6 An n x n matrix with n distinct eigenvalues is diagonalizable
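Theorems 5.5 and 5.6 can be demonstrated together: a matrix with distinct eigenvalues is diagonalizable, and A factors as PDP^-1. A minimal numpy sketch with a made-up 2 x 2 matrix (eigenvalues 2 and 3):

```python
import numpy as np

# Hypothetical matrix with distinct eigenvalues 2 and 3, hence
# diagonalizable by Theorem 5.6
A = np.array([[4.0, -2.0],
              [1.0,  1.0]])

eigvals, P = np.linalg.eig(A)     # columns of P are eigenvectors
D = np.diag(eigvals)              # eigenvalues on the diagonal of D

# The Diagonalization Theorem: A = P D P^{-1}
assert np.allclose(A, P @ D @ np.linalg.inv(P))
print("A = PDP^-1 verified")
```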

5.7 Let A be an n x n matrix whose distinct eigenvalues are λ_1, ..., λ_p.
a. For 1 <= k <= p, the dimension of the eigenspace for λ_k is less than or equal to the multiplicity of the eigenvalue λ_k
b. The matrix A is diagonalizable if and only if the sum of the dimensions of the distinct eigenspaces equals n, and this happens if and only if the dimension of the eigenspace for each λ_k equals the multiplicity of λ_k
c. If A is diagonalizable and B_k is a basis for the eigenspace corresponding to λ_k for each k, then the total collection of vectors in the sets B_1, ..., B_p forms an eigenvector basis for R^n
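Part (a) of Theorem 5.7 can hold with strict inequality, in which case part (b) says A is not diagonalizable. A minimal sketch with a made-up defective matrix: the eigenvalue 2 has multiplicity 2, but its eigenspace, the null space of (A - 2I), is only one-dimensional:

```python
import numpy as np

# Hypothetical defective matrix: characteristic polynomial (λ - 2)^2,
# so λ = 2 has multiplicity 2
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

lam = 2.0
M = A - lam * np.eye(2)

# dim of the eigenspace = n - rank(A - λI)
dim_eigenspace = 2 - np.linalg.matrix_rank(M)
print(dim_eigenspace)             # → 1, strictly less than multiplicity 2

# By Theorem 5.7(b), A is therefore not diagonalizable
assert dim_eigenspace < 2
```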