Warm-up
True or false?
1. proj_u(proj_v(u)) = u
2. The system of normal equations for Ax = y has solutions iff Ax = y has solutions
3. The normal equations are always consistent
Baby proof
1. Let A be a nonzero matrix and set S = col(A). Prove that if Ax = y is consistent, then proj_S(y) = y.
[figure: the action of the matrix A, shown on the slide]
I. Without doing any computations, estimate det(A)
II. Without doing any computations, give an eigenvalue of A with an associated eigenvector
III. Write down the matrix A. Hint: think A(x_1, x_2)^T = ??
4.1: Subspaces
- Key examples: span{u_1, ..., u_k}, null(A), and range(A) = col(A)
- A is 1-to-1 iff ker(A) = {0}
- BIG Theorem addition?
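A quick numerical check of the "1-to-1 iff ker(A) = {0}" criterion, sketched with NumPy (the matrices are made-up examples, not from the notes): the kernel is trivial exactly when rank(A) equals the number of columns.

```python
import numpy as np

# A is 1-to-1 iff ker(A) = {0}, i.e. iff rank(A) equals the number of columns.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])   # rank 2 = # of columns, so 1-to-1

B = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # second column is twice the first: rank 1 < 2

def is_one_to_one(M):
    """A matrix is 1-to-1 exactly when its null space is {0}."""
    return np.linalg.matrix_rank(M) == M.shape[1]

print(is_one_to_one(A))  # True
print(is_one_to_one(B))  # False
```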
4.2: Basis and dimension
- Definitions
- Every basis has the same number of vectors
- Can always get a basis:
  - Given a linearly independent set, just add vectors not in the span
  - Given a spanning set, kick out vectors in the span of the others
- BIG Theorem addition?
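The "kick out vectors in the span" recipe can be sketched in NumPy (an illustrative helper, not from the notes): walk through the spanning set and keep a vector only if adding it increases the rank.

```python
import numpy as np

def basis_from_spanning_set(vectors):
    """Kick out vectors already in the span of the ones kept so far."""
    kept = []
    for v in vectors:
        candidate = kept + [v]
        # v is genuinely new exactly when adding it increases the rank
        if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
            kept.append(v)
    return kept

spanning = [np.array([1.0, 0.0, 1.0]),
            np.array([0.0, 1.0, 1.0]),
            np.array([1.0, 1.0, 2.0]),   # sum of the first two: kicked out
            np.array([0.0, 0.0, 1.0])]

basis = basis_from_spanning_set(spanning)
print(len(basis))  # 3
```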
4.3: Row and column space
- Definitions
- Same dimension! (# pivot columns = # pivot rows = rank(A))
- If A is m × n, nullity(A) + rank(A) = n
- BIG Theorem addition?
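A small NumPy sanity check of the rank-nullity theorem (the matrix is a made-up example): for an m × n matrix, rank plus nullity must equal n, the number of columns.

```python
import numpy as np

# Rank-nullity: for an m x n matrix, rank(A) + nullity(A) = n (# of columns).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])        # a 2 x 3 matrix

m, n = A.shape
rank = np.linalg.matrix_rank(A)        # 2
nullity = n - rank                     # 1

print(rank, nullity, rank + nullity == n)  # 2 1 True
```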
5.1: det: n × n matrices → R
- Cofactor expansion
- A invertible iff det(A) ≠ 0
- area(A(R)) = |det(A)| · area(R)
- det(A^T) = det(A), det(AB) = det(A) det(B)
- Determinant of a triangular matrix is the product of the diagonal entries
- BIG Theorem addition?
6.1: Eigenvalues and eigenvectors
- Ax = λx (with x ≠ 0)
- To find eigenvalues, find roots of det(A − λI) = 0
- Multiplicity: e.g. in (λ − 4)(λ + 5)(λ − 7)^3, the eigenvalue λ = 7 has multiplicity 3
- To find eigenvectors, solve (A − λI)x = 0; the solution set gives the eigenspace
- BIG Theorem addition?
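The eigenvalue recipe, sketched with NumPy on a small made-up matrix: for A below, det(A − λI) = (2 − λ)^2 − 1 = 0 gives λ = 1 and λ = 3, and each computed eigenpair satisfies Ax = λx.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvalues are the roots of det(A - lambda*I) = 0;
# here (2 - lambda)^2 - 1 = 0, so lambda = 1 and lambda = 3.
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, v in zip(eigenvalues, eigenvectors.T):
    # Each eigenpair satisfies A v = lambda v
    assert np.allclose(A @ v, lam * v)

print(np.sort(eigenvalues))
```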
6.3: Change of basis
- Let B = {v_1, ..., v_n} be a basis. Then [x]_B = (c_1, c_2, ..., c_n)^T means x = c_1 v_1 + c_2 v_2 + ... + c_n v_n
- So if U = [v_1 ... v_n], then x = U[x]_B and U^{-1}x = [x]_B
- The formula to change between two bases follows
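The two conversions x = U[x]_B and [x]_B = U^{-1}x, sketched in NumPy with a made-up basis of R^2 (solving a linear system instead of explicitly inverting U):

```python
import numpy as np

# Basis B = {v1, v2}; U has the basis vectors as its columns.
v1 = np.array([1.0, 1.0])
v2 = np.array([1.0, -1.0])
U = np.column_stack([v1, v2])

x_B = np.array([3.0, 2.0])   # coordinates of x relative to B
x = U @ x_B                  # x = 3*v1 + 2*v2 = [5, 1]

# Going back: [x]_B = U^{-1} x  (solve the system rather than invert U)
recovered = np.linalg.solve(U, x)
assert np.allclose(recovered, x_B)

print(x)  # [5. 1.]
```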
8.1: Orthogonality
- Orthogonal vectors, sets, bases, and subspaces (S^⊥)
- ||x|| = sqrt(x · x)
- If x · y = 0, then ||x||^2 + ||y||^2 = ||x + y||^2
- If x ∈ span{s_1, ..., s_k}, an orthogonal basis, then x = Σ_{j=1}^{k} ((x · s_j)/(s_j · s_j)) s_j = Σ_{j=1}^{k} proj_{s_j}(x)
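The coordinate formula for an orthogonal basis, checked in NumPy on a made-up orthogonal pair in R^3: each coefficient is just (x · s_j)/(s_j · s_j), no system solving needed.

```python
import numpy as np

# An orthogonal basis for a plane in R^3 (s1 . s2 = 0)
s1 = np.array([1.0, 1.0, 0.0])
s2 = np.array([1.0, -1.0, 0.0])

x = 2.0 * s1 + 5.0 * s2   # some vector in span{s1, s2}

# Coordinates come straight from dot products: (x . sj) / (sj . sj)
coords = [np.dot(x, s) / np.dot(s, s) for s in (s1, s2)]
reconstruction = coords[0] * s1 + coords[1] * s2
assert np.allclose(reconstruction, x)

print(coords[0], coords[1])  # 2.0 5.0
```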
8.2: Projections and Gram-Schmidt
- Definition, properties
- Key observation: x − proj_S(x) is orthogonal to S
- Gram-Schmidt on s_1, ..., s_k:
  v_1 = s_1
  v_2 = s_2 − proj_{v_1}(s_2)
  v_3 = s_3 − proj_{v_1}(s_3) − proj_{v_2}(s_3)
  ...
  v_j = s_j − proj_{span{v_1, v_2, ..., v_{j−1}}}(s_j)
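The Gram-Schmidt recursion above translates almost line-by-line into NumPy; a minimal sketch (helper name and test vectors are made up) that subtracts each projection proj_{v_i}(s_j) in turn:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthogonalize: v_j = s_j minus its projections onto earlier v's."""
    basis = []
    for s in vectors:
        v = s.astype(float).copy()
        for u in basis:
            v -= (np.dot(s, u) / np.dot(u, u)) * u   # subtract proj_u(s)
        basis.append(v)
    return basis

vs = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                   np.array([1.0, 0.0, 1.0]),
                   np.array([0.0, 1.0, 1.0])])

# The output vectors are pairwise orthogonal
for i in range(len(vs)):
    for j in range(i + 1, len(vs)):
        assert abs(np.dot(vs[i], vs[j])) < 1e-12
```

Dividing each v_j by its norm afterwards would give an orthonormal basis.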
8.5: Least-squares regression
- How to solve an inconsistent system Ax = y: replace y with proj_{col(A)}(y) =: ŷ
- The solution x̂ of Ax = ŷ is the least-squares solution; it satisfies ||Ax̂ − y|| ≤ ||Ax − y|| for all x
- Normal equations: the least-squares solution set of Ax = y is identical to the solution set of A^T A x = A^T y
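A small NumPy sketch of the normal equations on a made-up inconsistent system (fitting a line a + b·t to three points that are not collinear), cross-checked against NumPy's built-in least-squares solver:

```python
import numpy as np

# An inconsistent system: no line a + b*t passes through all three points
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
y = np.array([0.0, 1.0, 1.0])

# Normal equations: A^T A x = A^T y (always consistent)
x_hat = np.linalg.solve(A.T @ A, A.T @ y)

# Same answer from NumPy's least-squares solver
x_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)
assert np.allclose(x_hat, x_lstsq)

# The residual A x_hat - y is orthogonal to col(A)
assert np.allclose(A.T @ (A @ x_hat - y), 0)

print(x_hat)
```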