Lecture 14: Orthogonality and general vector spaces

1 Symmetric matrices

Recall the definition of the transpose A^T in Lecture note 9.

Definition 1.1. If a square matrix S satisfies

S = S^T, (1.1)

then we say S is a symmetric matrix.

Useful properties:

1. For any square matrix A,

S_1 = A + A^T, (1.2)
S_2 = A^T A, (1.3)

are two symmetric matrices.

2. If S is symmetric and invertible, then S^{-1} is also symmetric.

Remark 1.1. If S^T = -S, then we say S is a skew-symmetric (or antisymmetric or antimetric) matrix.

2 Orthogonal vectors, spaces and matrices

2.1 Orthogonal vectors and spaces

Definition 2.1. The dot product of two vectors in R^n is defined by

u · v = sum_{k=1}^{n} u_k v_k,

for any two vectors u = [u_1, u_2, ..., u_n]^T and v = [v_1, v_2, ..., v_n]^T in R^n.

Definition 2.2. Two vectors u and v are called orthogonal (or perpendicular) to each other, denoted as u ⊥ v, if and only if u · v = 0.

Copyright reserved by Yingwei Wang
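The symmetry properties above are easy to check numerically. A minimal NumPy sketch (the matrix A below is an arbitrary illustrative example, not from the lecture; A happens to be invertible, so S_2 = A^T A is invertible too):

```python
import numpy as np

# An arbitrary (non-symmetric) square matrix.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 4.0, 5.0],
              [7.0, 8.0, 9.0]])

# Property 1: A + A^T and A^T A are both symmetric.
S1 = A + A.T
S2 = A.T @ A
print(np.allclose(S1, S1.T))  # True
print(np.allclose(S2, S2.T))  # True

# Property 2: the inverse of a symmetric (invertible) matrix is symmetric.
S_inv = np.linalg.inv(S2)
print(np.allclose(S_inv, S_inv.T))  # True
```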
Definition 2.3. Let S = {v_1, v_2, ..., v_k} be a basis for a vector space V. If the vectors in S are mutually orthogonal (that is, every two of them are orthogonal), then S is called an orthogonal basis for V.

For example, the standard basis is an orthogonal basis for R^n:

e_1 = [1, 0, ..., 0]^T,  e_2 = [0, 1, ..., 0]^T,  ...,  e_n = [0, 0, ..., 1]^T.

Definition 2.4. Two vector spaces V and U are called orthogonal (or perpendicular) to each other, denoted as V ⊥ U, if and only if

u · v = 0,

for any u ∈ U and v ∈ V.

Remark 2.1. Suppose the basis sets for the vector spaces V and U are S_v and S_u respectively, i.e.,

V = span(S_v),  U = span(S_u).

Then V ⊥ U is equivalent to S_v ⊥ S_u.

For example, any two of the following subspaces of R^3 are perpendicular/orthogonal to each other:

X = span{[1, 0, 0]^T},  Y = span{[0, 1, 0]^T},  Z = span{[0, 0, 1]^T},
X ⊥ Y,  X ⊥ Z,  Y ⊥ Z.

Another example:

U = span{[1, 0, 0]^T, [0, 1, 0]^T},  Z = span{[0, 0, 1]^T},
U ⊥ Z.
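The two examples above can be verified directly from the dot-product definition; a small NumPy illustration (the coefficients 2 and -5 are arbitrary):

```python
import numpy as np

# Standard basis of R^3 (the orthogonal basis example above).
e1, e2, e3 = np.eye(3)

# Pairwise dot products vanish, so the basis is orthogonal.
print(e1 @ e2, e1 @ e3, e2 @ e3)  # 0.0 0.0 0.0

# U = span{e1, e2} is orthogonal to Z = span{e3}:
# any u = a*e1 + b*e2 satisfies u . e3 = 0.
u = 2.0 * e1 - 5.0 * e2
print(u @ e3)  # 0.0
```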
Proposition 2.1 (orthogonal relation between row space and null space). Given a matrix A, its null space Null(A) and its row space Row(A) are perpendicular/orthogonal to each other.

Proof. Let the j-th row of the matrix A_{m×n} be r_j = [a_{j1}, a_{j2}, ..., a_{jn}], i.e.,

A_{m×n} = [ a_11 a_12 ... a_1n ]   [ r_1 ]
          [ a_21 a_22 ... a_2n ] = [ r_2 ]
          [ ...  ...  ... ...  ]   [ ... ]
          [ a_m1 a_m2 ... a_mn ]   [ r_m ].

Then Ax = 0 is equivalent to the following:

r_1 · x = 0,
r_2 · x = 0,
...
r_m · x = 0,

and hence

(c_1 r_1 + c_2 r_2 + ... + c_m r_m) · x = 0

for any scalars c_1, c_2, ..., c_m. It implies that for any r ∈ Row(A) and x ∈ Null(A), we have

r · x = 0.

It follows that Null(A) ⊥ Row(A).

Remark 2.2. For any matrix A, its left null space and column space are also perpendicular/orthogonal to each other, i.e.,

Null(A^T) ⊥ Col(A).

2.2 Orthogonal matrices

Definition 2.5. If a square matrix Q satisfies

Q^{-1} = Q^T, (2.1)

then we say Q is an orthogonal matrix.
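Both Proposition 2.1 and Definition 2.5 can be sanity-checked numerically. A NumPy sketch (the matrices A and Q below are arbitrary examples; the null space is extracted from the SVD, a standard numerical technique not covered in these notes):

```python
import numpy as np

# --- Proposition 2.1: Null(A) is orthogonal to Row(A) ---
# A rank-2 matrix: the third row is the sum of the first two.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])

# A null-space basis: right singular vectors whose singular
# values are (numerically) zero.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
x = Vt[rank]                      # here rank = 2, so one null vector
print(np.allclose(A @ x, 0))      # True: x is in Null(A)

# Any row-space vector c1*r1 + c2*r2 + c3*r3 is orthogonal to x.
r = 3.0 * A[0] - 2.0 * A[1] + 1.5 * A[2]
print(np.isclose(r @ x, 0.0))     # True: Row(A) is orthogonal to Null(A)

# --- Definition 2.5: for an orthogonal matrix, Q^{-1} = Q^T ---
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # a rotation matrix
print(np.allclose(np.linalg.inv(Q), Q.T))        # True
```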
Consider the columns of an orthogonal matrix Q = [q_1, q_2]. Then

Q^T Q = [ q_1^T ] [ q_1 q_2 ] = [ q_1^T q_1  q_1^T q_2 ] = [ 1 0 ]
        [ q_2^T ]               [ q_2^T q_1  q_2^T q_2 ]   [ 0 1 ]. (2.2)

It implies that the columns of Q, i.e., {q_1, q_2}, are mutually orthogonal unit vectors. By using QQ^T = I, we can also show that the rows of Q are mutually orthogonal. That is why we have Definition 2.5.

Several 2-by-2 examples of orthogonal matrices (an identity, a permutation, a rotation, and a reflection):

I = [ 1 0 ],  P = [ 0 1 ],  Q = [ cos θ  -sin θ ],  R = [ cos θ   sin θ ]. (2.3)
    [ 0 1 ]       [ 1 0 ]       [ sin θ   cos θ ]       [ sin θ  -cos θ ]

Remark 2.3. Please check the equality (2.1) for P, Q, R defined by (2.3).

Useful properties: Let Q be an orthogonal matrix.

1. The solution to the linear system Qx = b is x = Q^T b.

2. For any column vector x ∈ R^n, we have ||Qx|| = ||x||, where ||x|| = sqrt(x_1^2 + x_2^2 + ... + x_n^2).

3. If both Q_1 and Q_2 are orthogonal, then Q_1 Q_2 is also orthogonal.

3 More general vector spaces (other than R^n)

Matrix spaces. Consider the space M_2:

M_2 = {all 2-by-2 matrices} = { A = [ a b ], a, b, c, d ∈ R }.
                                    [ c d ]

1. Zero vector: the zero matrix [ 0 0 ].
                                [ 0 0 ]

2. Basis set: { [ 1 0 ], [ 0 1 ], [ 0 0 ], [ 0 0 ] }.
                [ 0 0 ]  [ 0 0 ]  [ 1 0 ]  [ 0 1 ]

3. Dimension: dim(M_2) = 4.

4. Some useful subspaces of M_2:
(a) Diagonal matrix space.

D_2 = { A = [ a 0 ], a, d ∈ R } = span{ [ 1 0 ], [ 0 0 ] }.
            [ 0 d ]                     [ 0 0 ]  [ 0 1 ]

(b) Symmetric matrix space.

S_2 = { A ∈ M_2 : A = A^T } = { A = [ a b ], b = c }
                                    [ c d ]
    = span{ [ 1 0 ], [ 0 1 ], [ 0 0 ] }.
            [ 0 0 ]  [ 1 0 ]  [ 0 1 ]

(c) Zero trace matrix space.

T_2 = { A ∈ M_2 : trace(A) = 0 } = { A = [ a b ], a + d = 0 }
                                         [ c d ]
    = span{ [ 1  0 ], [ 0 1 ], [ 0 0 ] }.
            [ 0 -1 ]  [ 0 0 ]  [ 1 0 ]

See Examples 1, 2 in Chapter 4.7 of your textbook.

Polynomial spaces. Consider the space P_2:

P_2 = {all polynomials with degree at most 2} = { p(x) = a + bx + cx^2, a, b, c ∈ R }.

1. Zero vector: p(x) = 0.

2. Basis set: {1, x, x^2}.

3. Dimension: dim(P_2) = 3.

4. Subspaces: P_1 = {p(x) = a + bx, a, b ∈ R} and P_0 = {p(x) = a, a ∈ R}.

See Example 4 in Chapter 4.4 and Examples 4, 5 in Chapter 4.7 of your textbook.
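Identifying each 2-by-2 matrix [[a, b], [c, d]] with the vector (a, b, c, d) in R^4, the dimensions of the three subspaces above can be computed as the ranks of their spanning sets. An illustrative NumPy sketch (the helper `vec` and the variable names are ours, not from the textbook):

```python
import numpy as np

# Identify a 2-by-2 matrix [[a, b], [c, d]] with (a, b, c, d) in R^4.
def vec(M):
    return np.asarray(M, dtype=float).ravel()

# Spanning sets of the three subspaces above, flattened into R^4.
D2 = [vec([[1, 0], [0, 0]]), vec([[0, 0], [0, 1]])]
S2 = [vec([[1, 0], [0, 0]]), vec([[0, 1], [1, 0]]), vec([[0, 0], [0, 1]])]
T2 = [vec([[1, 0], [0, -1]]), vec([[0, 1], [0, 0]]), vec([[0, 0], [1, 0]])]

# dim(span(S)) = rank of the matrix whose rows are the vectors of S.
dims = {name: int(np.linalg.matrix_rank(np.array(S)))
        for name, S in [("D2", D2), ("S2", S2), ("T2", T2)]}
print(dims)  # {'D2': 2, 'S2': 3, 'T2': 3}
```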
Remark 3.1. Actually, the matrix space M_2 is isomorphic to R^4 and the polynomial space P_2 is isomorphic to R^3.

Solution spaces of linear differential equations.

First order differential equations with zero right hand side.

V_1 = {all solutions to y' + y = 0}
    = { y(t) = c e^{-t}, c ∈ R }
    = span{ e^{-t} }.

It implies that dim(V_1) = 1.

Second order differential equations with zero right hand side.

V_2 = {all solutions to y'' = 0}
    = { y(t) = c_1 t + c_2, c_1, c_2 ∈ R }
    = span{ t, 1 };

V_3 = {all solutions to y'' + y = 0}
    = { y(t) = c_1 sin t + c_2 cos t, c_1, c_2 ∈ R }
    = span{ sin t, cos t };

V_4 = {all solutions to y'' - y = 0}
    = { y(t) = c_1 e^t + c_2 e^{-t}, c_1, c_2 ∈ R }
    = span{ e^t, e^{-t} }.

It implies that dim(V_2) = dim(V_3) = dim(V_4) = 2.

In Chapter 5, we will learn how to find the linearly independent solutions to the second order linear differential equation:

A y'' + B y' + C y = 0.
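The solution formulas above can be sanity-checked numerically by approximating the derivatives with finite differences. A sketch (the constants c_1, c_2, c and the sample points are arbitrary; the tolerance 1e-4 reflects the discretization error):

```python
import numpy as np

# V_3: check that y(t) = c1*sin(t) + c2*cos(t) satisfies y'' + y = 0,
# approximating y'' with a central difference.
c1, c2, h = 3.0, -1.5, 1e-4
y = lambda t: c1 * np.sin(t) + c2 * np.cos(t)

t = np.linspace(0.0, 5.0, 11)
ypp = (y(t + h) - 2.0 * y(t) + y(t - h)) / h**2   # approximates y''(t)
print(np.max(np.abs(ypp + y(t))) < 1e-4)          # True: y'' + y ~ 0

# V_1: check that z(t) = c*exp(-t) satisfies y' + y = 0.
c = 2.0
z = lambda t: c * np.exp(-t)
zp = (z(t + h) - z(t - h)) / (2.0 * h)            # approximates z'(t)
print(np.max(np.abs(zp + z(t))) < 1e-4)           # True: z' + z ~ 0
```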
4 Summary of square matrices

Suppose A is an n × n square matrix. The following statements are equivalent to each other:

1. A is nonsingular;

2. A is invertible (i.e., there exists A^{-1} such that A A^{-1} = A^{-1} A = I_n);

3. det(A) ≠ 0;

4. There are no zero rows after Gaussian elimination;

5. A is row equivalent to I_n;

6. The nonhomogeneous equation Ax = b has a unique solution for any right hand side b;

7. The homogeneous equation Ax = 0 has only the trivial solution (zero solution);

8. The null space of A is {0};

9. The columns (rows) of A are linearly independent (i.e., they form a basis for R^n) and Col(A) = Row(A) = R^n;

10. Rank(A) = dim(Col(A)) = dim(Row(A)) = n;

11. A^T is also nonsingular (invertible).

Similarly, the following statements are also equivalent to each other:

1. A is singular;

2. A is non-invertible;

3. det(A) = 0;

4. There is at least one zero row after Gaussian elimination;

5. A is not row equivalent to I_n;

6. The nonhomogeneous equation Ax = b has either infinitely many solutions or no solution;
7. The homogeneous equation Ax = 0 has nontrivial solutions;

8. There exists x ≠ 0 such that x ∈ Null(A);

9. The columns (rows) of A are linearly dependent;

10. Rank(A) = dim(Col(A)) = dim(Row(A)) < n;

11. A^T is also singular (non-invertible).
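A few of these equivalences can be observed numerically. A small NumPy sketch with one nonsingular and one singular 3-by-3 matrix (both chosen arbitrarily for illustration):

```python
import numpy as np

# A nonsingular matrix and a singular one (row3 = row1 + row2).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
B = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])

# Nonsingular: det(A) != 0, rank n, and Ax = b is uniquely solvable.
print(abs(np.linalg.det(A)) > 1e-10)       # True
print(np.linalg.matrix_rank(A) == 3)       # True
b = np.array([1.0, 0.0, 2.0])
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))               # True

# Singular: det(B) = 0, rank < n, and B^T is singular as well.
print(abs(np.linalg.det(B)) < 1e-8)        # True
print(np.linalg.matrix_rank(B) < 3)        # True
print(np.linalg.matrix_rank(B.T) < 3)      # True
```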