Mathematics for Computer Science
w11 - Algebra of Matrices
matrix definition, geometric interpretation, determinant, inverse, orthogonal matrices, systems of linear equations

Summary and figures from: Linear Algebra (5th Ed): 612 Solved Problems,
by Seymour Lipschutz and Marc Lipson, Schaum's Outline, ISBN-10: 0071794565

Definition: A matrix A over a field K (usually R) is a rectangular array of scalars, usually presented as:

        (c1)  (c2)  ...  (cn)    (columns)
(r1)    a11   a12   ...  a1n
(r2)    a21   a22   ...  a2n
 ...    ...   ...        ...     (rows)
(rm)    am1   am2   ...  amn

A = [aij], i = 1...m, j = 1...n

aij is the element in the i-th row and j-th column.
A has m rows and n columns; (m, n) is the size of A.
A = B if they have the same size and aij = bij for all i, j.
O = [0ij] is the zero matrix (all entries zero).

Matrix Addition and Scalar Multiplication (similar to vectors in R^n)

Addition
A = [aij], B = [bij]
A + B = [aij + bij]   (same size only!)
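The entrywise definition of addition can be sketched in plain Python (a minimal sketch using lists of lists; the helper name `mat_add` is my own, not from the text):

```python
def mat_add(A, B):
    """Entrywise sum A + B; both matrices must have the same size."""
    assert len(A) == len(B) and len(A[0]) == len(B[0]), "same size only!"
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

A = [[1, 2, 3],
     [4, 5, 6]]
B = [[6, 5, 4],
     [3, 2, 1]]
print(mat_add(A, B))  # [[7, 7, 7], [7, 7, 7]]
```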
Scalar Multiplication
A = [aij], k in R
kA = [k*aij]
also: -A = (-1)A and A - B = A + (-B)

Linear combination
A, B matrices of the same size, k1, k2 in R
C = k1*A + k2*B   (a linear combination of A and B)

THEOREM 2.1: Consider any matrices A, B, C (with the same size) and any scalars k and k1. Then (similar to vectors in R^n):
(i)   (A + B) + C = A + (B + C)      (v)    k(A + B) = kA + kB
(ii)  A + 0 = 0 + A = A              (vi)   (k + k1)A = kA + k1*A
(iii) A + (-A) = (-A) + A = 0        (vii)  (k*k1)A = k(k1*A)
(iv)  A + B = B + A                  (viii) 1A = A

Summation symbol (sigma)

sum_{k=1}^{n} f(k) = f(1) + f(2) + ... + f(n)

k is the index; 1 and n are the lower and upper limits, respectively.
More generally, with limits n1 and n2:

sum_{k=n1}^{n2} f(k) = f(n1) + f(n1+1) + ... + f(n2)

Matrix Multiplication (simple case)
A = [a11 a12 ... a1m]      (a 1 x m row matrix)
B = [b11 b21 ... bm1]^T    (an m x 1 column matrix)
AB = a11*b11 + a12*b21 + ... + a1m*bm1   (a scalar, or 1x1 matrix)
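Scalar multiplication and the simple (row times column) product can be sketched as follows (helper names `scalar_mul` and `row_times_col` are my own):

```python
def scalar_mul(k, A):
    """kA = [k * aij]: multiply every entry of A by the scalar k."""
    return [[k * a for a in row] for row in A]

def row_times_col(row, col):
    """Simple case: a (1 x m) row times an (m x 1) column gives a scalar."""
    assert len(row) == len(col)
    return sum(r * c for r, c in zip(row, col))

A = [[1, -2], [0, 3]]
print(scalar_mul(2, A))                      # [[2, -4], [0, 6]]
print(row_times_col([1, 2, 3], [4, 5, 6]))   # 4 + 10 + 18 = 32
```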
(general case)
A = [aij], an (m x p) matrix
B = [bij], a (p x n) matrix
(note: A has the same number of columns, p, as B has rows!)
C = AB is the (m x n) matrix C = [cij], where

cij = ai1*b1j + ai2*b2j + ... + aip*bpj = sum_{k=1}^{p} aik*bkj

(the i-th row of A multiplied by the j-th column of B).

Note that in general AB != BA (matrix multiplication is not commutative).

THEOREM 2.2: Let A, B, C be matrices. Then, whenever the products and sums are defined:
(i)   A(BC) = (AB)C              (associative law)
(ii)  A(B + C) = AB + AC         (left distributive law)
(iii) (A + B)C = AC + BC         (right distributive law)
(iv)  k(AB) = (kA)B = A(kB), k a scalar

Transpose of A
A = [aij] (m x n matrix)
A^T = [aji] (n x m matrix)   (rows <-> columns)

THEOREM 2.3: Let A and B be matrices and let k be a scalar. Then, whenever the sum and product are defined:
(i)   (A + B)^T = A^T + B^T
(ii)  (A^T)^T = A
(iii) (kA)^T = k*A^T
(iv)  (AB)^T = B^T A^T

Note that, by (iv), the transpose of a product is the product of the transposes, but in the reverse order.

Square Matrices
A = [aij], (n x n)

Diagonal and Trace
diagonal of A = a11, a22, ..., ann
tr(A) = trace(A) = a11 + a22 + ... + ann

THEOREM 2.4: Suppose A and B are n-square matrices and k is a scalar. Then
(i)  tr(A + B) = tr(A) + tr(B)      (iii) tr(A^T) = tr(A)
(ii) tr(kA) = k*tr(A)               (iv)  tr(AB) = tr(BA)
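The general product, transpose, and trace, plus non-commutativity and the identities (AB)^T = B^T A^T and tr(AB) = tr(BA), can be sketched as follows (helper names are my own):

```python
def mat_mul(A, B):
    """cij = sum_k aik * bkj; A is (m x p), B is (p x n)."""
    m, p, n = len(A), len(B), len(B[0])
    assert len(A[0]) == p, "columns of A must equal rows of B"
    return [[sum(A[i][k] * B[k][j] for k in range(p)) for j in range(n)]
            for i in range(m)]

def transpose(A):
    """Swap rows and columns."""
    return [list(col) for col in zip(*A)]

def trace(A):
    """Sum of the diagonal elements of a square matrix."""
    return sum(A[i][i] for i in range(len(A)))

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
print(mat_mul(A, B))   # [[2, 1], [4, 3]]
print(mat_mul(B, A))   # [[3, 4], [1, 2]]  -- AB != BA
# (AB)^T = B^T A^T, and tr(AB) = tr(BA):
assert transpose(mat_mul(A, B)) == mat_mul(transpose(B), transpose(A))
assert trace(mat_mul(A, B)) == trace(mat_mul(B, A))
```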
Identity Matrix, Scalar Matrices
I = [aij] with aii = 1 and aij = 0 for i != j
AI = IA = A
kI : a k-scalar matrix (k times the identity)

Powers of Matrices, Polynomials in Matrices
A^2 = AA, A^3 = AAA, ...

polynomial: f(x) = a0 + a1*x + a2*x^2 + ... + an*x^n
polynomial of A: f(A) = a0*I + a1*A + a2*A^2 + ... + an*A^n
If f(A) = 0, then we say that A is a root (or zero) of f.

Example:
A = [ 1  2 ]     f(x) = 2x^2 - 3x + 5     g(x) = x^2 + 3x - 10
    [ 3 -4 ]

f(A) = [  16 -18 ]        g(A) = [ 0 0 ]
       [ -27  61 ]               [ 0 0 ]

so A is a zero of g.

Invertible (Non-singular) Matrices
If, for a given square matrix A, there exists a (square) matrix B such that:
AB = BA = I
we say that A is invertible and B is its inverse, denoted A^-1.

Example:
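Evaluating a polynomial at a matrix can be sketched as below; the helper `mat_poly` (my own name) reproduces the worked example, confirming that A is a zero of g:

```python
def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_poly(coeffs, A):
    """f(A) = a0*I + a1*A + ... + an*A^n, with coeffs = [a0, a1, ..., an]."""
    n = len(A)
    result = [[0] * n for _ in range(n)]
    power = [[1 if i == j else 0 for j in range(n)] for i in range(n)]  # A^0 = I
    for c in coeffs:
        result = [[result[i][j] + c * power[i][j] for j in range(n)]
                  for i in range(n)]
        power = mat_mul(power, A)  # next power of A
    return result

A = [[1, 2], [3, -4]]
print(mat_poly([5, -3, 2], A))    # f(x) = 2x^2 - 3x + 5 -> [[16, -18], [-27, 61]]
print(mat_poly([-10, 3, 1], A))   # g(x) = x^2 + 3x - 10 -> [[0, 0], [0, 0]]
```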
A = [ 2 5 ]     B = [  3 -5 ]     AB = [ 1 0 ]
    [ 1 3 ]         [ -1  2 ]          [ 0 1 ]

If A and B are invertible, then AB is also invertible and:
(AB)^-1 = B^-1 A^-1

Inverse of a 2 x 2 matrix A
A = [ a b ]     A^-1 = 1/det(A) * [  d -b ]
    [ c d ]                       [ -c  a ]

det(A) = a*d - b*c   (the determinant of A)
If det(A) = 0, then A is not invertible.

Special Types of Square Matrices

Diagonal and Triangular Matrices

Diagonal matrices
D = [dij] with dij = 0 for every i != j
D = diag(d11, d22, ..., dnn)

Triangular (upper/lower) matrices
T = [tij] with tij = 0 for every i > j   (upper triangular)
T = [tij] with tij = 0 for every i < j   (lower triangular)

THEOREM 2.5: Suppose A = [aij] and B = [bij] are n x n (upper) triangular matrices. Then:
(i) A + B, kA and AB are (upper) triangular with respective diagonals:
    (a11+b11, a22+b22, ..., ann+bnn)
    (ka11, ka22, ..., kann)
    (a11*b11, a22*b22, ..., ann*bnn)
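The 2 x 2 inverse formula above can be sketched directly (the helper name `inv2x2` is my own); applied to the invertibility example, it recovers B from A:

```python
def inv2x2(A):
    """Inverse of a 2x2 matrix via the adjugate formula; requires det(A) != 0."""
    (a, b), (c, d) = A
    det = a * d - b * c  # det(A) = a*d - b*c
    assert det != 0, "det(A) = 0: A is not invertible"
    return [[d / det, -b / det],
            [-c / det, a / det]]

A = [[2, 5], [1, 3]]
print(inv2x2(A))  # [[3.0, -5.0], [-1.0, 2.0]] -- matches B in the example
```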
(ii) For any polynomial f(x), the matrix f(A) is (upper) triangular with diagonal
     (f(a11), f(a22), ..., f(ann))
(iii) A is invertible if and only if each diagonal element aii != 0, and when A^-1 exists it is also (upper) triangular.

Symmetric and Skew-Symmetric Matrices
A = [aij], (n x n)
If A = A^T, then A is symmetric, i.e. aij = aji for every i, j.
If A = -A^T, then A is skew-symmetric, i.e. aij = -aji for every i, j
(NB: the diagonal elements must then all be zero!).

Orthogonal Matrices
A^T = A^-1, or equivalently AA^T = A^T A = I

Examples:

A = 1/sqrt(5) * [ 1 -2 ]     A^T = 1/sqrt(5) * [  1 2 ]
                [ 2  1 ]                       [ -2 1 ]

A = 1/9 * [ 1  8 -4 ]        A^T = 1/9 * [  1  4  8 ]
          [ 4 -4 -7 ]                    [  8 -4  1 ]
          [ 8  1  4 ]                    [ -4 -7  4 ]

In both cases AA^T = I.

THEOREM 2.6: Let A be a real matrix. Then the following are equivalent:
(a) A is orthogonal.
(b) The rows of A form an orthonormal set.
(c) The columns of A form an orthonormal set.
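An orthogonality check based on AA^T = I can be sketched as follows (the helper name `is_orthogonal` and the float tolerance are my own choices); it confirms the 3 x 3 example above:

```python
def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def is_orthogonal(A, tol=1e-12):
    """A is orthogonal iff A A^T = I (equivalently, the rows are orthonormal)."""
    At = [list(col) for col in zip(*A)]
    P = mat_mul(A, At)
    n = len(A)
    return all(abs(P[i][j] - (1 if i == j else 0)) < tol
               for i in range(n) for j in range(n))

A = [[x / 9 for x in row] for row in [[1, 8, -4], [4, -4, -7], [8, 1, 4]]]
print(is_orthogonal(A))               # True
print(is_orthogonal([[1, -2], [2, 1]]))  # False: AA^T = 5I, not I
```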
THEOREM 2.7: Let A be a real 2 x 2 orthogonal matrix. Then, for some real number theta:

A = [  cos(theta)  sin(theta) ]     or     A = [ cos(theta)   sin(theta) ]
    [ -sin(theta)  cos(theta) ]                [ sin(theta)  -cos(theta) ]

Normal Matrices
A real matrix A is normal if it commutes with its transpose A^T, that is:
AA^T = A^T A
If A is symmetric, orthogonal, or skew-symmetric, then A is normal. There are also other normal matrices, for example:

A = [ 1 -2 ]     A^T = [  1 2 ]
    [ 2  1 ]           [ -2 1 ]

AA^T = A^T A = 5I
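The normality condition can be checked directly (the helper name `is_normal` is my own); the example matrix commutes with its transpose even though it is not orthogonal:

```python
def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def is_normal(A):
    """A real matrix is normal iff A A^T = A^T A."""
    At = [list(col) for col in zip(*A)]
    return mat_mul(A, At) == mat_mul(At, A)

A = [[1, -2], [2, 1]]
At = [[1, 2], [-2, 1]]
print(is_normal(A))      # True
print(mat_mul(A, At))    # [[5, 0], [0, 5]] -- that is, AA^T = 5I
```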