A Review of Linear Algebra
Mohammad Emtiyaz Khan
CS, UBC
A Review of Linear Algebra - p.1/13
Basics
Column vector x ∈ R^n, row vector x^T, matrix A ∈ R^{m×n}.
Matrix multiplication: (m×n)(n×k) → (m×k); in general AB ≠ BA.
Transpose A^T, with (AB)^T = B^T A^T; A is symmetric if A = A^T.
Inverse A^{-1} does not always exist; when it does, (AB)^{-1} = B^{-1} A^{-1}.
x^T x is a scalar, x x^T is a matrix.
Ax = b, three ways of expressing it:
1. sum_{j=1}^{n} a_{ij} x_j = b_i, for each row i.
2. r_i^T x = b_i for each i, where r_i is the i-th row of A.
3. x_1 a_1 + x_2 a_2 + ... + x_n a_n = b, a linear combination (l.c.) of the columns of A.
System of equations: non-singular (unique solution) or singular (no solution, or infinitely many solutions).
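The identities above can be checked in a short numpy sketch (numpy and the particular matrices are assumptions, not part of the slides):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])
x = np.array([2.0, -1.0])

# Matrix multiplication is not commutative in general.
assert not np.allclose(A @ B, B @ A)

# Transpose of a product reverses the order.
assert np.allclose((A @ B).T, B.T @ A.T)

# Ax is a linear combination of the columns of A.
b = A @ x
assert np.allclose(b, x[0] * A[:, 0] + x[1] * A[:, 1])
```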
LU factorization
Example: A = [2 1 1; 4 -6 0; -2 7 2], b = (5, -2, 9)^T.
Elimination steps: E subtracts 2×(row 1) from row 2, F adds row 1 to row 3, G adds row 2 to row 3. Applying them in order gives
GFEA = U = [2 1 1; 0 -8 -2; 0 0 1], GFEb = c = (5, -12, 2)^T.
So Ax = b becomes Ux = GFEb, solved by back-substitution: x = (1, 1, 2)^T.
Undoing the steps, A = LU with L = E^{-1} F^{-1} G^{-1} (lower triangular).
LU factorization
(First non-singular case) If no row exchanges are required, then A = LU (and the factorization is unique). Solve Lc = b, then Ux = c. Another form: A = LDU.
(Second non-singular case) There exists a permutation matrix P that reorders the rows so that PA = LU.
(Singular case) No such P exists.
(Cholesky decomposition) If A is symmetric and A = LDU can be found without any row exchanges, then U = L^T; if the pivots in D are also positive, A = LL^T (also called the square root of a matrix). (proof) A positive definite matrix always has a Cholesky decomposition.
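As a small numerical check (numpy and the example matrix are assumptions), the Cholesky factor of a positive definite matrix gives both the "square root" and a two-step triangular solve:

```python
import numpy as np

# A symmetric positive definite matrix (hypothetical example).
A = np.array([[4.0, 2.0], [2.0, 3.0]])
L = np.linalg.cholesky(A)          # lower-triangular factor
assert np.allclose(L @ L.T, A)     # A = L L^T, the "square root"

# Solve Ax = b in two triangular steps: Lc = b, then L^T x = c.
b = np.array([2.0, 5.0])
c = np.linalg.solve(L, b)
x = np.linalg.solve(L.T, c)
assert np.allclose(A @ x, b)
```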
Vector Space, Subspace and Matrix
(Real vector space) A set of "vectors" with rules for vector addition and multiplication by real numbers, e.g. R^1, R^2, ..., R^∞, Hilbert space. (8 conditions) Includes a zero vector; closed under addition and scalar multiplication, etc.
(Subspace) A subset of a vector space that is itself closed under addition and scalar multiplication (so it must contain the zero vector).
The subspace "spanned" by a matrix (outline the concept):
x_1 (1, 5, 2)^T + x_2 (0, 4, 4)^T = (b_1, b_2, b_3)^T.
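Whether a given b lies in the subspace spanned by the two columns above can be checked numerically via the least-squares residual (numpy assumed; a sketch, not part of the slides):

```python
import numpy as np

# Columns from the slide's example span a 2-d subspace of R^3.
A = np.array([[1.0, 0.0], [5.0, 4.0], [2.0, 4.0]])

# A b built as a linear combination of the columns is in the span.
b_in = A @ np.array([1.0, 1.0])
x, *_ = np.linalg.lstsq(A, b_in, rcond=None)
assert np.allclose(A @ x, b_in)

# A generic b in R^3 is usually outside this 2-d subspace.
b_out = np.array([1.0, 0.0, 0.0])
x2, *_ = np.linalg.lstsq(A, b_out, rcond=None)
assert not np.allclose(A @ x2, b_out)
```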
Linear Independence, Basis, Dimension
(Linear independence, l.i.) If x_1 a_1 + x_2 a_2 + ... + x_n a_n = 0 only happens when x_1 = x_2 = ... = x_n = 0, then {a_k} are called linearly independent. A set of n vectors in R^m cannot be l.i. if n > m. (proof)
(Span) If every vector v in V can be expressed as a l.c. of {a_k}, then {a_k} are said to span V.
(Basis) {a_k} are called a basis of V if they are l.i. and span V. (Not too many, not too few: every v in V then has a unique expansion.)
(Dimension) The number of vectors in any basis is called the dimension of V (and is the same for every basis).
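Linear independence can be tested numerically via the matrix rank (numpy assumed; the vectors are hypothetical examples):

```python
import numpy as np

# Three vectors in R^2 can never be linearly independent (n > m).
V = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])   # columns are the vectors
assert np.linalg.matrix_rank(V) < 3

# Two independent vectors: rank equals the number of vectors.
W = np.array([[1.0, 1.0],
              [0.0, 1.0]])
assert np.linalg.matrix_rank(W) == 2
```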
Four Fundamental Spaces
Fundamental Theorem of Linear Algebra, Part I:
1. R(A) = column space of A; all l.c. of the columns; dimension r.
2. N(A) = nullspace of A; all x with Ax = 0; dimension n - r.
3. R(A^T) = row space of A; all l.c. of the rows; dimension r.
4. N(A^T) = left nullspace of A; all y with A^T y = 0; dimension m - r.
(Rank) r is called the rank of the matrix. The inverse exists iff the rank is as large as possible (r = m = n).
Question: what is the rank of uv^T?
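The question above can be explored numerically (numpy assumed; u and v are hypothetical): every column of uv^T is a multiple of u, which suggests the answer.

```python
import numpy as np

u = np.array([[1.0], [2.0], [3.0]])   # nonzero column vector in R^3
v = np.array([[4.0], [5.0]])          # nonzero column vector in R^2

A = u @ v.T                           # 3x2 outer product
# Each column of uv^T is a scalar multiple of u, so the rank is 1
# (provided u and v are both nonzero).
assert np.linalg.matrix_rank(A) == 1
```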
Orthogonality
(Norm) ‖x‖^2 = x^T x = x_1^2 + ... + x_n^2.
(Inner product) x^T y = x_1 y_1 + ... + x_n y_n.
(Orthogonal) x^T y = 0. Orthogonal ⇒ l.i. (proof)
(Orthonormal basis) Orthogonal vectors with norm 1.
(Orthogonal subspaces) V ⊥ W if v ⊥ w for all v ∈ V, w ∈ W.
(Orthogonal complement) The space of all vectors orthogonal to V, denoted V^⊥.
The row space is orthogonal to the nullspace (in R^n), and the column space is orthogonal to the left nullspace (in R^m). (proof)
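The last claim can be checked on a small example (numpy assumed; the rank-1 matrix is hypothetical): a nullspace vector is orthogonal to every row.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])      # rank 1: second row = 2 x first

# A vector in the nullspace: Ax = 0.
x = np.array([1.0, 1.0, -1.0])
assert np.allclose(A @ x, 0)

# Every row of A is orthogonal to x.
assert np.isclose(A[0] @ x, 0) and np.isclose(A[1] @ x, 0)
```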
Finally...
Fundamental Theorem of Linear Algebra, Part II:
1. R(A^T) = N(A)^⊥ (in R^n).
2. R(A) = N(A^T)^⊥ (in R^m).
Any vector can be expressed as
x = (x_1 b_1 + ... + x_r b_r) + (x_{r+1} b_{r+1} + ... + x_n b_n) = x_r + x_n,
where x_r lies in the row space and x_n in the nullspace.
Every matrix transforms its row space to its column space. (Comments about pseudo-inverse and invertibility.)
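The decomposition x = x_r + x_n can be computed with the pseudo-inverse (numpy assumed; a sketch): A⁺A projects onto the row space, and the remainder lands in the nullspace.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])      # rank 1
x = np.array([1.0, 0.0, 0.0])

# Projection onto the row space of A via the pseudo-inverse.
P_row = np.linalg.pinv(A) @ A
x_r = P_row @ x                      # row-space component
x_n = x - x_r                        # nullspace component
assert np.allclose(A @ x_n, 0)       # x_n is in N(A)
assert np.allclose(A @ x, A @ x_r)   # A only "sees" the row-space part
```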
Gram-Schmidt Orthogonalization
(Projection) The projection of b on a is (a^T b / a^T a) a; for a unit vector a, it is (a^T b) a.
(Schwarz inequality) |a^T b| ≤ ‖a‖ ‖b‖.
(Orthogonal matrix) Q = [q_1 ... q_n], with Q^T Q = I. (proof)
(Length preservation) ‖Qx‖ = ‖x‖. (proof)
Given vectors {a_k}, construct orthogonal vectors {q_k}:
1. q_1 = a_1 / ‖a_1‖.
2. For each j: a_j' = a_j - (q_1^T a_j) q_1 - ... - (q_{j-1}^T a_j) q_{j-1}.
3. q_j = a_j' / ‖a_j'‖.
QR decomposition. (Example)
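The three steps above can be sketched in numpy (assumed; the example columns are hypothetical, and the function is a naive textbook version, not a numerically robust one):

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A (assumed linearly independent)."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float).copy()
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]   # subtract projections
        Q[:, j] = v / np.linalg.norm(v)          # normalize
    return Q

A = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
Q = gram_schmidt(A)
assert np.allclose(Q.T @ Q, np.eye(2))           # orthonormal columns
```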
Eigenvalues and Eigenvectors
(Invariance) Ax = λx.
(Characteristic equation) (A - λI)x = 0, so x lies in the nullspace of A - λI and det(A - λI) = 0.
λ_1 + ... + λ_n = a_11 + ... + a_nn (the trace); λ_1 ... λ_n = det(A).
(A = SΛS^{-1}) Suppose there exist n linearly independent eigenvectors of A. If S is the matrix whose columns are those eigenvectors, then A = SΛS^{-1}, where Λ = diag(λ_1, ..., λ_n).
Diagonalizability is concerned with the eigenvectors; invertibility is concerned with the eigenvalues.
(Real symmetric matrix) Eigenvectors are orthogonal, so A = QΛQ^T. (Spectral theorem)
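These facts can be verified numerically on a small symmetric example (numpy assumed; a sketch):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])    # real symmetric
lam, S = np.linalg.eig(A)

# Sum of eigenvalues = trace; product = determinant.
assert np.isclose(lam.sum(), np.trace(A))
assert np.isclose(lam.prod(), np.linalg.det(A))

# Diagonalization A = S Lambda S^{-1}.
assert np.allclose(S @ np.diag(lam) @ np.linalg.inv(S), A)

# Symmetric case: the eigenvectors are orthonormal, so S^T S = I.
assert np.allclose(S.T @ S, np.eye(2))
```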
Singular Value Decomposition
Any matrix can be factorized as A = UΣV^T. Insightful? Finish.
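A quick numerical check of the factorization (numpy assumed; the 2×3 matrix is a hypothetical example):

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0], [0.0, 1.0, 1.0]])  # any 2x3 matrix
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# U and V have orthonormal columns; Sigma holds the singular values.
assert np.allclose(U.T @ U, np.eye(2))
assert np.allclose(Vt @ Vt.T, np.eye(2))
assert np.allclose(U @ np.diag(s) @ Vt, A)
```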
Finish
Thanks to Maria (Marisol Flores Gorrido) for helping me with this tutorial.