Appendix D: MATRIX CALCULUS

In this Appendix we collect some useful formulas of matrix calculus that often appear in finite element derivations.

D.1 THE DERIVATIVES OF VECTOR FUNCTIONS

Let \(\mathbf{x}\) and \(\mathbf{y}\) be vectors of orders \(n\) and \(m\), respectively:

\[
\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}, \qquad
\mathbf{y} = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_m \end{bmatrix},
\tag{D.1}
\]

where each component \(y_i\) may be a function of all the \(x_j\), a fact represented by saying that \(\mathbf{y}\) is a function of \(\mathbf{x}\), or

\[
\mathbf{y} = \mathbf{y}(\mathbf{x}).
\tag{D.2}
\]

If \(n = 1\), \(\mathbf{x}\) reduces to a scalar, which we call \(x\). If \(m = 1\), \(\mathbf{y}\) reduces to a scalar, which we call \(y\). Various applications are studied in the following subsections.

D.1.1 Derivative of Vector with Respect to Vector

The derivative of the vector \(\mathbf{y}\) with respect to vector \(\mathbf{x}\) is the \(n \times m\) matrix

\[
\frac{\partial \mathbf{y}}{\partial \mathbf{x}} \overset{\mathrm{def}}{=}
\begin{bmatrix}
\dfrac{\partial y_1}{\partial x_1} & \dfrac{\partial y_2}{\partial x_1} & \cdots & \dfrac{\partial y_m}{\partial x_1} \\
\dfrac{\partial y_1}{\partial x_2} & \dfrac{\partial y_2}{\partial x_2} & \cdots & \dfrac{\partial y_m}{\partial x_2} \\
\vdots & \vdots & & \vdots \\
\dfrac{\partial y_1}{\partial x_n} & \dfrac{\partial y_2}{\partial x_n} & \cdots & \dfrac{\partial y_m}{\partial x_n}
\end{bmatrix}
\tag{D.3}
\]

D.1.2 Derivative of a Scalar with Respect to Vector

If \(y\) is a scalar,

\[
\frac{\partial y}{\partial \mathbf{x}} \overset{\mathrm{def}}{=}
\begin{bmatrix}
\dfrac{\partial y}{\partial x_1} \\ \dfrac{\partial y}{\partial x_2} \\ \vdots \\ \dfrac{\partial y}{\partial x_n}
\end{bmatrix}
\tag{D.4}
\]

D.1.3 Derivative of Vector with Respect to Scalar

If \(x\) is a scalar,

\[
\frac{\partial \mathbf{y}}{\partial x} \overset{\mathrm{def}}{=}
\begin{bmatrix} \dfrac{\partial y_1}{\partial x} & \dfrac{\partial y_2}{\partial x} & \cdots & \dfrac{\partial y_m}{\partial x} \end{bmatrix}
\tag{D.5}
\]
REMARK D.1

Many authors, notably in statistics and economics, define the derivatives as the transposes of those given above.¹ This has the advantage of better agreement of matrix products with composition schemes such as the chain rule. Evidently the notation is not yet stable.

EXAMPLE D.1

Given

\[
\mathbf{y} = \begin{bmatrix} y_1 \\ y_2 \end{bmatrix}, \qquad
\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}
\tag{D.6}
\]

and

\[
y_1 = x_1^2 - x_2, \qquad y_2 = x_3^2 + 3 x_2,
\tag{D.7}
\]

the partial derivative matrix \(\partial \mathbf{y}/\partial \mathbf{x}\) is computed as follows:

\[
\frac{\partial \mathbf{y}}{\partial \mathbf{x}} =
\begin{bmatrix}
\dfrac{\partial y_1}{\partial x_1} & \dfrac{\partial y_2}{\partial x_1} \\
\dfrac{\partial y_1}{\partial x_2} & \dfrac{\partial y_2}{\partial x_2} \\
\dfrac{\partial y_1}{\partial x_3} & \dfrac{\partial y_2}{\partial x_3}
\end{bmatrix}
=
\begin{bmatrix}
2x_1 & 0 \\
-1 & 3 \\
0 & 2x_3
\end{bmatrix}
\tag{D.8}
\]

D.1.4 Jacobian of a Variable Transformation

In multivariate analysis, if \(\mathbf{x}\) and \(\mathbf{y}\) are of the same order, the determinant of the square matrix \(\partial \mathbf{y}/\partial \mathbf{x}\), that is

\[
J = \left| \frac{\partial \mathbf{y}}{\partial \mathbf{x}} \right|,
\tag{D.9}
\]

is called the Jacobian of the transformation determined by \(\mathbf{y} = \mathbf{y}(\mathbf{x})\). The inverse determinant is

\[
J^{-1} = \left| \frac{\partial \mathbf{x}}{\partial \mathbf{y}} \right|.
\tag{D.10}
\]

¹ One author puts it this way: "When one does matrix calculus, one quickly finds that there are two kinds of people in this world: those who think the gradient is a row vector, and those who think it is a column vector."
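The layout convention of (D.3) and the result (D.8) can be checked numerically. The following sketch (the helper name `dydx` is ours, not part of the text) builds the derivative matrix by central finite differences using numpy:

```python
import numpy as np

def y(x):
    # Example D.1: y1 = x1^2 - x2, y2 = x3^2 + 3*x2
    return np.array([x[0]**2 - x[1], x[2]**2 + 3.0*x[1]])

def dydx(f, x, h=1e-6):
    """n x m matrix with (i, j) entry dy_j/dx_i, following the convention (D.3)."""
    n, m = len(x), len(f(x))
    J = np.zeros((n, m))
    for i in range(n):
        e = np.zeros(n); e[i] = h
        J[i, :] = (f(x + e) - f(x - e)) / (2.0*h)   # central difference, row i
    return J

x0 = np.array([1.0, 2.0, 3.0])
analytic = np.array([[2.0*x0[0], 0.0],
                     [-1.0,      3.0],
                     [0.0,       2.0*x0[2]]])       # eq. (D.8) evaluated at x0
assert np.allclose(dydx(y, x0), analytic, atol=1e-5)
```

Note that the rows are indexed by the components of \(\mathbf{x}\) and the columns by those of \(\mathbf{y}\); the transposed convention mentioned in Remark D.1 would swap the two loops.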
EXAMPLE D.2

The transformation from spherical to Cartesian coordinates is defined by

\[
x = r \sin\theta \cos\psi, \qquad y = r \sin\theta \sin\psi, \qquad z = r \cos\theta,
\tag{D.11}
\]

where \(r > 0\), \(0 < \theta < \pi\) and \(0 \le \psi < 2\pi\). To obtain the Jacobian of the transformation, let

\[
x = x_1, \quad y = x_2, \quad z = x_3, \qquad
r = y_1, \quad \theta = y_2, \quad \psi = y_3.
\tag{D.12}
\]

Then

\[
J = \left| \frac{\partial \mathbf{x}}{\partial \mathbf{y}} \right| =
\begin{vmatrix}
\sin y_2 \cos y_3 & \sin y_2 \sin y_3 & \cos y_2 \\
y_1 \cos y_2 \cos y_3 & y_1 \cos y_2 \sin y_3 & -y_1 \sin y_2 \\
-y_1 \sin y_2 \sin y_3 & y_1 \sin y_2 \cos y_3 & 0
\end{vmatrix}
= y_1^2 \sin y_2 = r^2 \sin\theta.
\tag{D.13}
\]

The foregoing definitions can be used to obtain derivatives of many frequently used expressions, including quadratic and bilinear forms.

EXAMPLE D.3

Consider the quadratic form

\[
y = \mathbf{x}^T \mathbf{A} \mathbf{x},
\tag{D.14}
\]

where \(\mathbf{A}\) is a square matrix of order \(n\). Using the definition (D.3) one obtains

\[
\frac{\partial y}{\partial \mathbf{x}} = \mathbf{A}\mathbf{x} + \mathbf{A}^T \mathbf{x},
\tag{D.15}
\]

and if \(\mathbf{A}\) is symmetric,

\[
\frac{\partial y}{\partial \mathbf{x}} = 2\mathbf{A}\mathbf{x}.
\tag{D.16}
\]

We can of course continue the differentiation process:

\[
\frac{\partial^2 y}{\partial \mathbf{x}^2} = \frac{\partial}{\partial \mathbf{x}}\left( \frac{\partial y}{\partial \mathbf{x}} \right) = \mathbf{A} + \mathbf{A}^T,
\tag{D.17}
\]

and if \(\mathbf{A}\) is symmetric,

\[
\frac{\partial^2 y}{\partial \mathbf{x}^2} = 2\mathbf{A}.
\tag{D.18}
\]

The following table collects several useful vector derivative formulas.

    y            ∂y/∂x
    ------------------------
    Ax           Aᵀ
    xᵀA          A
    xᵀx          2x
    xᵀAx         Ax + Aᵀx
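Both examples above lend themselves to a quick numerical check. The sketch below (helper names `cart` and `fd_jac` are ours) verifies the Jacobian determinant \(r^2 \sin\theta\) of (D.13) and the gradient \(\mathbf{A}\mathbf{x} + \mathbf{A}^T\mathbf{x}\) of (D.15) by central finite differences:

```python
import numpy as np

# Example D.2: Cartesian coordinates as a function of y = (r, theta, psi)
def cart(y):
    r, th, ps = y
    return np.array([r*np.sin(th)*np.cos(ps),
                     r*np.sin(th)*np.sin(ps),
                     r*np.cos(th)])

def fd_jac(f, y, h=1e-6):
    """Square matrix with (i, j) entry d f_j / d y_i, per the convention (D.3)."""
    n = len(y)
    J = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n); e[i] = h
        J[i, :] = (f(y + e) - f(y - e)) / (2.0*h)
    return J

y0 = np.array([2.0, 0.7, 1.3])                       # (r, theta, psi)
assert np.isclose(np.linalg.det(fd_jac(cart, y0)),   # J of (D.13)
                  y0[0]**2 * np.sin(y0[1]), atol=1e-5)

# Example D.3: gradient of the quadratic form x^T A x is A x + A^T x
A = np.array([[1.0, 2.0, 0.0], [3.0, -1.0, 1.0], [0.5, 0.0, 2.0]])
x = np.array([0.4, -1.2, 2.0])
quad = lambda v: v @ A @ v
h = 1e-6
grad = np.array([(quad(x + h*e) - quad(x - h*e)) / (2.0*h) for e in np.eye(3)])
assert np.allclose(grad, A @ x + A.T @ x, atol=1e-5)
```

Because the quadratic form has no third derivatives, the central difference reproduces (D.15) essentially to machine precision.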
D.2 THE CHAIN RULE FOR VECTOR FUNCTIONS

Let

\[
\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}, \qquad
\mathbf{y} = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_r \end{bmatrix}, \qquad
\mathbf{z} = \begin{bmatrix} z_1 \\ z_2 \\ \vdots \\ z_m \end{bmatrix},
\tag{D.19}
\]

where \(\mathbf{z}\) is a function of \(\mathbf{y}\), which is in turn a function of \(\mathbf{x}\). Using the definition (D.3), we can write

\[
\left( \frac{\partial \mathbf{z}}{\partial \mathbf{x}} \right)^T =
\begin{bmatrix}
\dfrac{\partial z_1}{\partial x_1} & \dfrac{\partial z_1}{\partial x_2} & \cdots & \dfrac{\partial z_1}{\partial x_n} \\
\dfrac{\partial z_2}{\partial x_1} & \dfrac{\partial z_2}{\partial x_2} & \cdots & \dfrac{\partial z_2}{\partial x_n} \\
\vdots & \vdots & & \vdots \\
\dfrac{\partial z_m}{\partial x_1} & \dfrac{\partial z_m}{\partial x_2} & \cdots & \dfrac{\partial z_m}{\partial x_n}
\end{bmatrix}
\tag{D.20}
\]

Each entry of this matrix may be expanded as

\[
\frac{\partial z_i}{\partial x_j} = \sum_{q=1}^{r} \frac{\partial z_i}{\partial y_q} \frac{\partial y_q}{\partial x_j},
\qquad i = 1, 2, \ldots, m, \quad j = 1, 2, \ldots, n.
\tag{D.21}
\]

Then the matrix (D.20) factors into the product

\[
\left( \frac{\partial \mathbf{z}}{\partial \mathbf{x}} \right)^T =
\begin{bmatrix}
\dfrac{\partial z_1}{\partial y_1} & \cdots & \dfrac{\partial z_1}{\partial y_r} \\
\vdots & & \vdots \\
\dfrac{\partial z_m}{\partial y_1} & \cdots & \dfrac{\partial z_m}{\partial y_r}
\end{bmatrix}
\begin{bmatrix}
\dfrac{\partial y_1}{\partial x_1} & \cdots & \dfrac{\partial y_1}{\partial x_n} \\
\vdots & & \vdots \\
\dfrac{\partial y_r}{\partial x_1} & \cdots & \dfrac{\partial y_r}{\partial x_n}
\end{bmatrix}
= \left( \frac{\partial \mathbf{z}}{\partial \mathbf{y}} \right)^T \left( \frac{\partial \mathbf{y}}{\partial \mathbf{x}} \right)^T.
\tag{D.22}
\]

On transposing both sides, we finally obtain

\[
\frac{\partial \mathbf{z}}{\partial \mathbf{x}} = \frac{\partial \mathbf{y}}{\partial \mathbf{x}} \, \frac{\partial \mathbf{z}}{\partial \mathbf{y}},
\tag{D.23}
\]

which is the chain rule for vectors. If all vectors reduce to scalars,

\[
\frac{\partial z}{\partial x} = \frac{\partial y}{\partial x} \frac{\partial z}{\partial y} = \frac{\partial z}{\partial y} \frac{\partial y}{\partial x},
\tag{D.24}
\]
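The leftward build-up of the chain (D.23) under this layout convention can be verified numerically. This is a sketch under our own choice of test functions (`y_of_x`, `z_of_y`, and `fd_jac` are illustrative names, not from the text):

```python
import numpy as np

def fd_jac(f, x, h=1e-6):
    """(i, j) entry = d f_j / d x_i -- the layout convention of (D.3)."""
    fx = f(x)
    J = np.zeros((len(x), len(fx)))
    for i in range(len(x)):
        e = np.zeros(len(x)); e[i] = h
        J[i, :] = (f(x + e) - f(x - e)) / (2.0*h)
    return J

# x has order n = 3, y order r = 3, z order m = 2
y_of_x = lambda x: np.array([x[0]*x[1], np.sin(x[2]), x[0] + x[2]**2])
z_of_y = lambda y: np.array([y[0]**2 + y[1], y[1]*y[2]])

x0 = np.array([0.3, -1.1, 0.8])
lhs = fd_jac(lambda x: z_of_y(y_of_x(x)), x0)           # dz/dx directly, n x m
rhs = fd_jac(y_of_x, x0) @ fd_jac(z_of_y, y_of_x(x0))   # chain builds to the LEFT
assert np.allclose(lhs, rhs, atol=1e-5)
```

With the transposed (statistics) convention of Remark D.1, the same identity would read as the more familiar right-to-left product of Jacobians.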
which is the conventional chain rule of calculus. Note, however, that when we are dealing with vectors, the chain of matrices builds toward the left. For example, if \(\mathbf{w}\) is a function of \(\mathbf{z}\), which is a function of \(\mathbf{y}\), which is a function of \(\mathbf{x}\),

\[
\frac{\partial \mathbf{w}}{\partial \mathbf{x}} =
\frac{\partial \mathbf{y}}{\partial \mathbf{x}} \,
\frac{\partial \mathbf{z}}{\partial \mathbf{y}} \,
\frac{\partial \mathbf{w}}{\partial \mathbf{z}}.
\tag{D.25}
\]

On the other hand, in the ordinary chain rule one can build the product equally well to the right or to the left, because scalar multiplication is commutative.

D.3 THE DERIVATIVE OF SCALAR FUNCTIONS OF A MATRIX

Let \(\mathbf{X} = (x_{ij})\) be a matrix of order \(m \times n\) and let

\[
y = f(\mathbf{X})
\tag{D.26}
\]

be a scalar function of \(\mathbf{X}\). The derivative of \(y\) with respect to \(\mathbf{X}\), denoted by

\[
\frac{\partial y}{\partial \mathbf{X}},
\tag{D.27}
\]

is defined as the following matrix of order \(m \times n\):

\[
\mathbf{G} = \frac{\partial y}{\partial \mathbf{X}} =
\begin{bmatrix}
\dfrac{\partial y}{\partial x_{11}} & \dfrac{\partial y}{\partial x_{12}} & \cdots & \dfrac{\partial y}{\partial x_{1n}} \\
\dfrac{\partial y}{\partial x_{21}} & \dfrac{\partial y}{\partial x_{22}} & \cdots & \dfrac{\partial y}{\partial x_{2n}} \\
\vdots & \vdots & & \vdots \\
\dfrac{\partial y}{\partial x_{m1}} & \dfrac{\partial y}{\partial x_{m2}} & \cdots & \dfrac{\partial y}{\partial x_{mn}}
\end{bmatrix}
= \left[ \frac{\partial y}{\partial x_{ij}} \right]
= \sum_{i,j} \mathbf{E}_{ij} \frac{\partial y}{\partial x_{ij}},
\tag{D.28}
\]

where \(\mathbf{E}_{ij}\) denotes the elementary matrix* of order \(m \times n\). This matrix \(\mathbf{G}\) is also known as a gradient matrix.

EXAMPLE D.4

Find the gradient matrix if \(y\) is the trace of a square matrix \(\mathbf{X}\) of order \(n\), that is

\[
y = \mathrm{tr}(\mathbf{X}) = \sum_{i=1}^{n} x_{ii}.
\tag{D.29}
\]

Obviously all non-diagonal partials vanish whereas the diagonal partials equal one, thus

\[
\mathbf{G} = \frac{\partial y}{\partial \mathbf{X}} = \mathbf{I},
\tag{D.30}
\]

where \(\mathbf{I}\) denotes the identity matrix of order \(n\).

* The elementary matrix \(\mathbf{E}_{ij}\) of order \(m \times n\) has all zero entries except for the \((i, j)\) entry, which is one.
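Definition (D.28) translates directly into a finite-difference loop over the entries of X, which we can use to confirm (D.30). This sketch assumes numpy; the helper name `grad_matrix` is ours:

```python
import numpy as np

def grad_matrix(f, X, h=1e-6):
    """Finite-difference version of (D.28): G[i, j] = d f / d X[i, j]."""
    G = np.zeros_like(X)
    for i in range(X.shape[0]):
        for j in range(X.shape[1]):
            E = np.zeros_like(X); E[i, j] = h   # h times the elementary matrix E_ij
            G[i, j] = (f(X + E) - f(X - E)) / (2.0*h)
    return G

X = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 4.0]])
assert np.allclose(grad_matrix(np.trace, X), np.eye(3), atol=1e-6)   # eq. (D.30)
```

Since the trace is linear in each entry, the central difference reproduces the identity matrix up to rounding.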
D.3.1 Functions of a Matrix Determinant

An important family of derivatives with respect to a matrix involves functions of the determinant of a matrix, for example \(y = |\mathbf{X}|\) or \(y = |\mathbf{A}\mathbf{X}|\). Suppose that we have a matrix \(\mathbf{Y} = [y_{ij}]\) whose components are functions of a matrix \(\mathbf{X} = [x_{rs}]\), that is \(y_{ij} = f_{ij}(x_{rs})\), and set out to build the matrix

\[
\frac{\partial |\mathbf{Y}|}{\partial \mathbf{X}}.
\tag{D.31}
\]

Using the chain rule we can write

\[
\frac{\partial |\mathbf{Y}|}{\partial x_{rs}} = \sum_{i} \sum_{j} \frac{\partial |\mathbf{Y}|}{\partial y_{ij}} \frac{\partial y_{ij}}{\partial x_{rs}}.
\tag{D.32}
\]

But expanding the determinant by cofactors along row \(i\),

\[
|\mathbf{Y}| = \sum_{j} y_{ij} Y_{ij},
\tag{D.33}
\]

where \(Y_{ij}\) is the cofactor of the element \(y_{ij}\) in \(|\mathbf{Y}|\). Since the cofactors \(Y_{i1}, Y_{i2}, \ldots\) are independent of the element \(y_{ij}\), we have

\[
\frac{\partial |\mathbf{Y}|}{\partial y_{ij}} = Y_{ij}.
\tag{D.34}
\]

It follows that

\[
\frac{\partial |\mathbf{Y}|}{\partial x_{rs}} = \sum_{i} \sum_{j} Y_{ij} \frac{\partial y_{ij}}{\partial x_{rs}}.
\tag{D.35}
\]

There is an alternative form of this result which is occasionally useful. Define

\[
a_{ij} = Y_{ij}, \quad \mathbf{A} = [a_{ij}], \qquad
b_{ij} = \frac{\partial y_{ij}}{\partial x_{rs}}, \quad \mathbf{B} = [b_{ij}].
\tag{D.36}
\]

Then it can be shown that

\[
\frac{\partial |\mathbf{Y}|}{\partial x_{rs}} = \mathrm{tr}(\mathbf{A}\mathbf{B}^T) = \mathrm{tr}(\mathbf{B}^T \mathbf{A}).
\tag{D.37}
\]

EXAMPLE D.5

If \(\mathbf{X}\) is a nonsingular square matrix and \(\mathbf{Z} = |\mathbf{X}|\, \mathbf{X}^{-1}\) its adjugate (the transpose of the cofactor matrix),

\[
\mathbf{G} = \frac{\partial |\mathbf{X}|}{\partial \mathbf{X}} = \mathbf{Z}^T.
\tag{D.38}
\]

If \(\mathbf{X}\) is also symmetric,

\[
\mathbf{G} = \frac{\partial |\mathbf{X}|}{\partial \mathbf{X}} = 2\mathbf{Z}^T - \mathrm{diag}(\mathbf{Z}^T).
\tag{D.39}
\]
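Formula (D.38) is easy to confirm numerically by differentiating the determinant entry by entry, since the determinant is linear in each individual entry. The following sketch reuses the finite-difference gradient idea from (D.28) (the helper name `grad_matrix` is ours):

```python
import numpy as np

def grad_matrix(f, X, h=1e-6):
    """G[i, j] = d f / d X[i, j] by central differences, as in (D.28)."""
    G = np.zeros_like(X)
    for i in range(X.shape[0]):
        for j in range(X.shape[1]):
            E = np.zeros_like(X); E[i, j] = h
            G[i, j] = (f(X + E) - f(X - E)) / (2.0*h)
    return G

X = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 4.0]])                  # |X| = 25, well conditioned
Z = np.linalg.det(X) * np.linalg.inv(X)          # Z = |X| X^{-1}, the adjugate
G = grad_matrix(np.linalg.det, X)
assert np.allclose(G, Z.T, atol=1e-6)            # eq. (D.38): G = Z^T
```

Note that \(\mathbf{Z}^T = |\mathbf{X}|\,\mathbf{X}^{-T}\) is exactly the matrix of cofactors, consistent with (D.34).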
D.4 THE MATRIX DIFFERENTIAL

For a scalar function \(f(\mathbf{x})\), where \(\mathbf{x}\) is an \(n\)-vector, the ordinary differential of multivariate calculus is defined as

\[
df = \sum_{i=1}^{n} \frac{\partial f}{\partial x_i} \, dx_i.
\tag{D.40}
\]

In harmony with this formula, we define the differential of an \(m \times n\) matrix \(\mathbf{X} = [x_{ij}]\) to be

\[
d\mathbf{X} \overset{\mathrm{def}}{=}
\begin{bmatrix}
dx_{11} & dx_{12} & \cdots & dx_{1n} \\
dx_{21} & dx_{22} & \cdots & dx_{2n} \\
\vdots & \vdots & & \vdots \\
dx_{m1} & dx_{m2} & \cdots & dx_{mn}
\end{bmatrix}
\tag{D.41}
\]

This definition complies with the rules of scalar multiplication and matrix addition:

\[
d(\alpha \mathbf{X}) = \alpha \, d\mathbf{X}, \qquad
d(\mathbf{X} + \mathbf{Y}) = d\mathbf{X} + d\mathbf{Y}.
\tag{D.42}
\]

If \(\mathbf{X}\) and \(\mathbf{Y}\) are product-conforming matrices, it can be verified that the differential of their product is

\[
d(\mathbf{X}\mathbf{Y}) = (d\mathbf{X})\,\mathbf{Y} + \mathbf{X}\,(d\mathbf{Y}),
\tag{D.43}
\]

which is an extension of the well-known rule \(d(xy) = y\,dx + x\,dy\) for scalar functions.

EXAMPLE D.6

If \(\mathbf{X} = [x_{ij}]\) is a square nonsingular matrix of order \(n\), and \(\mathbf{Z} = |\mathbf{X}|\,\mathbf{X}^{-1}\) denotes its adjugate as before, find the differential of the determinant of \(\mathbf{X}\):

\[
d|\mathbf{X}| = \sum_{i,j} \frac{\partial |\mathbf{X}|}{\partial x_{ij}} \, dx_{ij}
= \sum_{i,j} X_{ij} \, dx_{ij}
= \mathrm{tr}\big( (|\mathbf{X}|\,\mathbf{X}^{-1}) \, d\mathbf{X} \big)
= \mathrm{tr}(\mathbf{Z}\, d\mathbf{X}),
\tag{D.44}
\]

where \(X_{ij}\) denotes the cofactor of \(x_{ij}\) in \(|\mathbf{X}|\).

EXAMPLE D.7

With the same assumptions as above, find \(d(\mathbf{X}^{-1})\). The quickest derivation follows by differentiating both sides of the identity \(\mathbf{X}^{-1}\mathbf{X} = \mathbf{I}\):

\[
d(\mathbf{X}^{-1})\,\mathbf{X} + \mathbf{X}^{-1}\,d\mathbf{X} = \mathbf{0},
\tag{D.45}
\]

from which

\[
d(\mathbf{X}^{-1}) = -\mathbf{X}^{-1}\, d\mathbf{X}\, \mathbf{X}^{-1}.
\tag{D.46}
\]

If \(\mathbf{X}\) reduces to the scalar \(x\) we have

\[
d\left( \frac{1}{x} \right) = -\frac{dx}{x^2}.
\tag{D.47}
\]
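The two differential formulas (D.44) and (D.46) can be checked by comparing the exact change produced by a small increment \(d\mathbf{X}\) against the first-order prediction. A sketch with numpy (the test matrices are our own choice):

```python
import numpy as np

X = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 4.0]])               # |X| = 25, well conditioned
dX = 1e-7 * np.array([[1.0, 2.0, -1.0],
                      [0.5, -1.0, 1.0],
                      [2.0, 1.0, 0.5]])       # a small increment

# Example D.6: d|X| ~ tr(Z dX) with Z = |X| X^{-1}
Z = np.linalg.det(X) * np.linalg.inv(X)
d_det = np.linalg.det(X + dX) - np.linalg.det(X)
assert np.isclose(d_det, np.trace(Z @ dX), atol=1e-10)

# Example D.7: d(X^{-1}) ~ -X^{-1} dX X^{-1}
Xinv = np.linalg.inv(X)
d_inv = np.linalg.inv(X + dX) - Xinv
assert np.allclose(d_inv, -Xinv @ dX @ Xinv, atol=1e-10)
```

The discrepancy in both checks is of second order in \(\|d\mathbf{X}\|\), here around \(10^{-14}\), so the first-order differential formulas match to well within the tolerance.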