Linear Algebra
Paul Yiu
Department of Mathematics, Florida Atlantic University
Fall 2011

6A: Inner products

In this chapter, the field F = ℝ or ℂ. We regard F as equipped with a conjugation χ : F → F. If F = ℝ, χ is simply the identity. If F = ℂ, χ(λ) is the complex conjugate of λ. In both cases, we write χ(λ) = λ̄.
Inner product

Let V be a vector space over F. An inner product on V is a map ⟨·, ·⟩ : V × V → F with the following properties.

Symmetry: ⟨x, y⟩ = χ(⟨y, x⟩) for x, y ∈ V; in the real case this reads ⟨x, y⟩ = ⟨y, x⟩.

Linearity: (i) ⟨λx + λ′x′, y⟩ = λ⟨x, y⟩ + λ′⟨x′, y⟩; (ii) ⟨x, μy + μ′y′⟩ = μ̄⟨x, y⟩ + μ̄′⟨x, y′⟩. Here (ii) follows from (i) and symmetry.

Positive definiteness: for every x ∈ V, ⟨x, x⟩ ≥ 0, and equality holds if and only if x = 0.
Standard inner product on ℝⁿ

Consider ℝⁿ with the standard basis B = {e_1, e_2, ..., e_n} (where e_i has 1 in its i-th position, and 0 elsewhere). The standard inner product on ℝⁿ is defined by

⟨Σ_{i=1}^n a_i e_i, Σ_{i=1}^n b_i e_i⟩ = Σ_{i=1}^n a_i b_i.

For x = Σ_{i=1}^n x_i e_i,

‖x‖² = ⟨x, x⟩ = Σ_{i=1}^n x_i².
Unitary space ℂⁿ

Consider ℂⁿ with the standard basis B = {e_1, e_2, ..., e_n} (where e_i has 1 in its i-th position, and 0 elsewhere). The Hermitian inner product on ℂⁿ is defined by

⟨Σ_{i=1}^n a_i e_i, Σ_{i=1}^n b_i e_i⟩ = Σ_{i=1}^n a_i b̄_i.

For x = Σ_{i=1}^n x_i e_i,

‖x‖² = ⟨x, x⟩ = Σ_{i=1}^n x_i x̄_i = Σ_{i=1}^n |x_i|².
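As a quick numerical illustration (a sketch using NumPy, not part of the notes), both standard inner products can be computed directly. With the convention above, the conjugate falls on the second argument:

```python
import numpy as np

# Real case: the standard inner product on R^n is the dot product.
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, -1.0, 2.0])
real_ip = np.dot(a, b)  # sum of a_i * b_i

# Complex case: with the convention used here (conjugate-linear in the
# second slot), <u, v> = sum of u_i * conj(v_i).
u = np.array([1 + 2j, 3 - 1j])
v = np.array([2 - 1j, 1j])
herm_ip = np.sum(u * np.conj(v))

# <x, x> is always real and nonnegative, and equals ||x||^2.
norm_sq = np.sum(u * np.conj(u)).real
```

Note that NumPy's own `np.vdot` conjugates its *first* argument, so the explicit `np.conj` above keeps the computation aligned with the convention of these notes.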
Examples

(1) The space C[a, b] of continuous complex-valued functions on a closed interval [a, b], with inner product

⟨f, g⟩ = ∫_a^b f(x) ḡ(x) dx.

(2) The space ℓ². Let F = ℝ or ℂ. An infinite sequence (x_n), n = 0, 1, 2, ..., in F is square summable if Σ_{n=0}^∞ |x_n|² < ∞ (convergent). The square summable sequences form the space ℓ², with inner product

⟨(x_n), (y_n)⟩ = Σ_{n=0}^∞ x_n ȳ_n.

This latter series converges (absolutely) by the comparison test: since (|x_n| − |y_n|)² ≥ 0,

|x_n ȳ_n| = |x_n| |y_n| ≤ ½(|x_n|² + |y_n|²).
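The inner product on C[a, b] can be approximated numerically. The following sketch (the helper name `ip_Cab` and the midpoint-rule quadrature are illustrative choices, not from the notes) checks that sin and cos are orthogonal on [−π, π]:

```python
import numpy as np

def ip_Cab(f, g, a, b, n=20000):
    """Approximate <f, g> = integral_a^b f(x) * conj(g(x)) dx by a midpoint sum."""
    x = a + (np.arange(n) + 0.5) * (b - a) / n
    return np.sum(f(x) * np.conj(g(x))) * (b - a) / n

# sin and cos are orthogonal on [-pi, pi]; <sin, sin> = pi there.
ip_sc = ip_Cab(np.sin, np.cos, -np.pi, np.pi)
ip_ss = ip_Cab(np.sin, np.sin, -np.pi, np.pi)
```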
Norms

The norm of x ∈ V is defined by ‖x‖ = √⟨x, x⟩. A vector x ∈ V is a unit vector if ‖x‖ = 1. A nonzero vector x in a positive definite inner product space can be normalized into a unit vector:

x̂ := x / √⟨x, x⟩,  i.e., ‖x̂‖ = 1.
Some inequalities

The norm of x ∈ V is defined by ‖x‖ = √⟨x, x⟩.

(1) ‖x‖ ≥ 0, and ‖x‖ = 0 if and only if x = 0.
(2) ‖λx‖ = |λ| ‖x‖ for λ ∈ F and x ∈ V.
(3) Cauchy-Schwarz inequality: |⟨x, y⟩| ≤ ‖x‖ ‖y‖. Equality holds if and only if x and y are linearly dependent.
(4) Triangle inequality: ‖x + y‖ ≤ ‖x‖ + ‖y‖. Equality holds if and only if one of x, y is a nonnegative real multiple of the other.
(5) Parallelogram law: ‖x + y‖² + ‖x − y‖² = 2(‖x‖² + ‖y‖²).
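These inequalities are easy to check numerically. The following sketch (assuming NumPy; the random vectors are arbitrary) verifies (3), (4), and (5) for a pair of complex vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(5) + 1j * rng.standard_normal(5)
y = rng.standard_normal(5) + 1j * rng.standard_normal(5)

def ip(u, v):
    # Inner product with the conjugate on the second slot, as in the notes.
    return np.sum(u * np.conj(v))

def norm(u):
    return np.sqrt(ip(u, u).real)

# (3) Cauchy-Schwarz: |<x, y>| <= ||x|| ||y||
cs_ok = abs(ip(x, y)) <= norm(x) * norm(y)
# (4) Triangle inequality: ||x + y|| <= ||x|| + ||y||
tri_ok = norm(x + y) <= norm(x) + norm(y)
# (5) Parallelogram law holds with equality.
par_gap = abs(norm(x + y)**2 + norm(x - y)**2 - 2 * (norm(x)**2 + norm(y)**2))
```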
Cauchy-Schwarz inequality

Proof. We may assume x and y nonzero. For any λ ∈ F,

0 ≤ ‖x − λy‖² = ⟨x − λy, x − λy⟩
  = ⟨x, x⟩ − λ̄⟨x, y⟩ − λ⟨y, x⟩ + |λ|²⟨y, y⟩
  = ‖x‖² − λ̄⟨x, y⟩ − λ(⟨y, x⟩ − λ̄‖y‖²).

Choosing λ = ⟨x, y⟩ / ‖y‖², so that λ̄ = ⟨y, x⟩ / ‖y‖² and the last bracket vanishes, this becomes

0 ≤ ‖x‖² − (⟨y, x⟩ / ‖y‖²)⟨x, y⟩ = (‖x‖² ‖y‖² − |⟨x, y⟩|²) / ‖y‖².

Therefore, |⟨x, y⟩| ≤ ‖x‖ ‖y‖. Clearly, equality holds if and only if x − λy = 0 for some λ ∈ F, i.e., x and y are linearly dependent. ∎
Polarization identities

(1) The real case:

⟨x, y⟩ = ¼(‖x + y‖² − ‖x − y‖²).

(2) The complex case:

⟨x, y⟩ = ¼(‖x + y‖² − ‖x − y‖²) + (i/4)(‖x + iy‖² − ‖x − iy‖²).
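The complex polarization identity can be verified numerically. This sketch (assuming NumPy and the conjugate-on-the-second-slot convention of these notes) recovers ⟨x, y⟩ from norms alone:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)

def ip(u, v):
    return np.sum(u * np.conj(v))  # conjugate-linear in the second slot

def nsq(u):
    return ip(u, u).real  # ||u||^2

# Complex polarization identity: <x, y> recovered from four norms.
recovered = (0.25 * (nsq(x + y) - nsq(x - y))
             + 0.25j * (nsq(x + 1j * y) - nsq(x - 1j * y)))
```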
6B: Orthogonality
Orthogonality

Let V be an inner product space. Two vectors x, y ∈ V are orthogonal if ⟨x, y⟩ = 0. Notation: x ⊥ y.
Orthogonal ⇒ linearly independent

Proposition. Mutually orthogonal nonzero vectors are linearly independent.

Proof. Let u_1, ..., u_s be mutually orthogonal nonzero vectors in an inner product space V. Suppose Σ_{i=1}^s λ_i u_i = 0. Then for each j = 1, ..., s,

0 = ⟨Σ_{i=1}^s λ_i u_i, u_j⟩ = Σ_{i=1}^s λ_i ⟨u_i, u_j⟩ = λ_j ⟨u_j, u_j⟩,

so λ_j = 0 since ⟨u_j, u_j⟩ ≠ 0. Therefore, the vectors are linearly independent. ∎
Orthogonal projection of a vector onto a subspace

Let V be a positive definite inner product space, and W ⊆ V a subspace. For x ∈ V, the orthogonal projection of x onto W is the vector y ∈ W for which x − y ⊥ x′ for every x′ ∈ W, i.e., ⟨x′, x⟩ = ⟨x′, y⟩.

To find y, let B = {u_1, ..., u_s} be a basis of W, and write y = Σ_{i=1}^s a_i u_i. We require, for each i = 1, ..., s,

⟨u_i, x⟩ = ⟨u_i, Σ_{j=1}^s a_j u_j⟩ = Σ_{j=1}^s ⟨u_i, u_j⟩ a_j.

This amounts to solving the system of linear equations

M (a_1, ..., a_s)ᵀ = (⟨u_1, x⟩, ..., ⟨u_s, x⟩)ᵀ,

where M = (⟨u_i, u_j⟩). We shall justify below that M is nonsingular. Granting this, the above equation has a unique solution a_1, ..., a_s. This gives the orthogonal projection y of x onto W.
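The Gram-matrix system above can be set up and solved directly. A minimal real-valued sketch (assuming NumPy; the vectors are an arbitrary example, and in ℝⁿ the Gram matrix is simply UᵀU):

```python
import numpy as np

# Projection of x onto W = span{u1, u2} in R^3 via the Gram system M a = rhs,
# where M[i][j] = <u_i, u_j> and rhs[i] = <u_i, x>.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([0.0, 1.0, 1.0])
x = np.array([1.0, 2.0, 3.0])

U = np.column_stack([u1, u2])
M = U.T @ U          # Gram matrix (<u_i, u_j>)
rhs = U.T @ x        # (<u_i, x>)
a = np.linalg.solve(M, rhs)
y = U @ a            # orthogonal projection of x onto W

# x - y is orthogonal to every basis vector of W.
res = U.T @ (x - y)
```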
(⟨u_i, u_j⟩) is a nonsingular matrix

Proposition. Let {u_1, ..., u_n} be a basis of a positive definite inner product space. The matrix M = (⟨u_i, u_j⟩) is nonsingular.

Proof. Suppose, for a contradiction, that M is singular. Then there exist b_1, ..., b_n ∈ F, not all zero, such that

M (b_1, ..., b_n)ᵀ = 0.

Let y = Σ_{j=1}^n b_j u_j. It follows from the above that ⟨x, y⟩ = 0 for every x ∈ V. This is clearly impossible, since ⟨y, y⟩ > 0. ∎
Orthogonal complement

Let V be an inner product space over F, and W a subspace of V. The orthogonal complement of W is the subspace

W^⊥ := {x ∈ V : ⟨x, y⟩ = 0 for every y ∈ W}.

Theorem. If V is finite dimensional and positive definite, V = W ⊕ W^⊥.
Orthogonal decomposition

Theorem. V = W ⊕ W^⊥.

Proof. (1) W ∩ W^⊥ = {0}, since any x in both satisfies ⟨x, x⟩ = 0.
(2) For every x ∈ V, x = y + (x − y), where y is the orthogonal projection of x onto W, and x − y ∈ W^⊥.

Therefore, we have an orthogonal direct sum decomposition V = W ⊕ W^⊥. ∎
Orthogonal decomposition: infinite dimensional case

A decomposition V = W ⊕ W^⊥ may not hold if V is infinite dimensional.

Example. Let V = ℓ², and let W be the span of all e_n = (0, ..., 0, 1, 0, ...), with 1 in the n-th position and 0 elsewhere. The sequence e_0, e_1, ..., e_n, ... is orthonormal. Note that W is a proper subspace of ℓ², since each element of W, being a finite linear combination of the e_n, has only finitely many nonzero terms. If x = (x_n) ∈ W^⊥, then for each integer n,

0 = ⟨x, e_n⟩ = x_n.

This shows that x = 0, and W^⊥ = {0}. Clearly, then, ℓ² ≠ W = W ⊕ W^⊥.
Orthogonal and orthonormal bases

Let V be an inner product space. An orthogonal basis of V is a basis consisting of mutually orthogonal vectors, i.e., a basis B = {u_1, ..., u_n} of V satisfying ⟨u_i, u_j⟩ = 0 for distinct i, j = 1, ..., n. An orthonormal basis is an orthogonal basis of unit vectors, i.e., a basis B = {u_1, ..., u_n} of V satisfying ⟨u_i, u_j⟩ = δ_{ij} for i, j = 1, ..., n.
Existence of orthogonal bases

Theorem. A finite dimensional positive definite inner product space contains an orthogonal basis.

Proof. (Induction on dimension.) This is clearly true for 1-dimensional inner product spaces: every nonzero vector constitutes an orthogonal basis. Assume it true for all inner product spaces of dimension < n, and let V be an inner product space of dimension n. Choose a nonzero v ∈ V. Then V = Span(v) ⊕ v^⊥, and dim_F v^⊥ = n − 1. By the inductive hypothesis, v^⊥ contains an orthogonal basis B. Then {v} ∪ B is an orthogonal basis of V. ∎

Corollary. A finite dimensional positive definite inner product space contains an orthonormal basis.
6C: Gram-Schmidt orthogonalization

In this section, V is a positive definite inner product space over ℝ or ℂ.
Orthogonal projection onto a vector

Lemma. The orthogonal projection of u ∈ V along v is the vector (⟨u, v⟩ / ⟨v, v⟩) v.

Proof. Write the orthogonal projection as av for some a ∈ F. We require ⟨u − av, v⟩ = 0. From this, ⟨u, v⟩ − a⟨v, v⟩ = 0, and a = ⟨u, v⟩ / ⟨v, v⟩. ∎
Orthogonal projection onto a subspace (again)

(1) Let W ⊆ V be a subspace of V, with an orthogonal basis u_1, ..., u_s. For x ∈ V, the orthogonal projection of x onto W is the vector x̂ ∈ W such that x − x̂ ⊥ w for every w ∈ W.

Write x̂ = Σ_{i=1}^s λ_i u_i. For each i = 1, 2, ..., s, we require ⟨x − x̂, u_i⟩ = 0. This means ⟨x, u_i⟩ − λ_i ⟨u_i, u_i⟩ = 0, so λ_i = ⟨x, u_i⟩ / ‖u_i‖², and

x̂ = Σ_{i=1}^s (⟨x, u_i⟩ / ‖u_i‖²) u_i.

(2) If e_1, ..., e_s is an orthonormal basis of W ⊆ V, then

x̂ = Σ_{i=1}^s ⟨x, e_i⟩ e_i.
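The formula in (1) translates directly into code. A minimal real-valued sketch (assuming NumPy; the function name `project` and the example basis are illustrative):

```python
import numpy as np

def project(x, basis):
    """Orthogonal projection of x onto the span of mutually orthogonal vectors."""
    xhat = np.zeros_like(x, dtype=float)
    for u in basis:
        # lambda_i = <x, u_i> / ||u_i||^2
        xhat += (np.dot(x, u) / np.dot(u, u)) * u
    return xhat

# (1, 0, 0) and (0, 1, 1) are orthogonal in R^3.
basis = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 1.0])]
x = np.array([2.0, 3.0, 1.0])
xhat = project(x, basis)
```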
Bessel's inequality: ‖x̂‖ ≤ ‖x‖

Proof. From x = x̂ + (x − x̂), with ⟨x̂, x − x̂⟩ = 0, we have

‖x‖² = ‖x̂‖² + ‖x − x̂‖² ≥ ‖x̂‖².

Therefore, ‖x̂‖ ≤ ‖x‖. Equality holds if and only if x − x̂ = 0, i.e., x ∈ W. ∎
Best approximation

x̂ is the best approximation to x in W, in the sense that ‖x − x̂‖ < ‖x − w‖ for every w ∈ W \ {x̂}.

Proof. For w ∈ W, x − w = (x − x̂) + (x̂ − w). Note that x̂ − w ∈ W and is orthogonal to x − x̂. Therefore,

‖x − w‖² = ‖x − x̂‖² + ‖x̂ − w‖² ≥ ‖x − x̂‖².

Equality holds if and only if x̂ − w = 0. Therefore, ‖x − x̂‖ < ‖x − w‖ if w ∈ W \ {x̂}. ∎

Remark. If V is infinite dimensional, we also call

x̂ = Σ_{i=1}^n ⟨x, e_i⟩ e_i

the Fourier expansion of x with respect to the orthonormal set e_1, ..., e_n. The best approximation property and Bessel's inequality are still valid.
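Both Bessel's inequality and the best approximation property can be checked numerically. This sketch (assuming NumPy; the subspace, random vector, and trial points are arbitrary choices) projects a vector onto a coordinate plane of ℝ⁴ and compares against random competitors in W:

```python
import numpy as np

rng = np.random.default_rng(2)
# Orthonormal basis of a 2-dimensional subspace W of R^4.
e1 = np.array([1.0, 0.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0, 0.0])
x = rng.standard_normal(4)
xhat = np.dot(x, e1) * e1 + np.dot(x, e2) * e2   # Fourier expansion in W

# Bessel: ||xhat|| <= ||x||.
bessel_ok = np.linalg.norm(xhat) <= np.linalg.norm(x)
# Best approximation: ||x - xhat|| <= ||x - w|| for sampled w in W.
best_ok = all(
    np.linalg.norm(x - xhat) <= np.linalg.norm(x - (a * e1 + b * e2)) + 1e-12
    for a, b in rng.standard_normal((100, 2))
)
```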
Example: the space ℓ²

For each integer n ≥ 0, let e_n be the infinite sequence which has 1 in its n-th position, and 0 elsewhere. The set e_0, e_1, ..., e_n, ... is an orthonormal set; however, its span is not the whole space ℓ². Let W_n be the span of e_0, e_1, ..., e_n. The orthogonal projection of x = (x_i) ∈ ℓ² onto W_n is the truncation of x at position n:

x̂ = (x_0, x_1, ..., x_n, 0, 0, ...).
Example: V = C[−π, π]

The functions 1, cos nx, sin nx, n = 1, 2, ..., form an orthogonal set:

∫_{−π}^{π} cos mx cos nx dx = δ_{mn} π,  m, n = 1, 2, ...,
∫_{−π}^{π} sin mx sin nx dx = δ_{mn} π,  m, n = 1, 2, ...,
∫_{−π}^{π} cos mx sin nx dx = 0,  m, n = 0, 1, ....

These lead to the Fourier expansion of f(x) ∈ C[−π, π]:

f̂ = a_0/2 + Σ_{k=1}^∞ (a_k cos kx + b_k sin kx),

where

a_0 = (1/π) ∫_{−π}^{π} f(x) dx,
a_k = (1/π) ∫_{−π}^{π} f(x) cos kx dx,  k = 1, 2, ...,
b_k = (1/π) ∫_{−π}^{π} f(x) sin kx dx,  k = 1, 2, ....
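The coefficient formulas can be evaluated numerically. As a sketch (not from the notes; the midpoint-rule quadrature and the test function f(x) = x are illustrative), for f(x) = x the a_k vanish by oddness, while classically b_k = 2(−1)^{k+1}/k:

```python
import numpy as np

# Numerical Fourier coefficients of f(x) = x on [-pi, pi] via a midpoint sum.
n = 20000
xs = -np.pi + (np.arange(n) + 0.5) * 2 * np.pi / n
f = xs                      # sample values of f(x) = x
dx = 2 * np.pi / n

def a(k):
    return np.sum(f * np.cos(k * xs)) * dx / np.pi

def b(k):
    return np.sum(f * np.sin(k * xs)) * dx / np.pi
```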
Gram-Schmidt orthogonalization

Let v_1, ..., v_n be given vectors in a positive definite inner product space V. Define a sequence u_1, ..., u_n as follows:

(i) u_1 = v_1;
(ii) for k = 2, ..., n,

u_k = v_k − Σ_{i=1}^{k−1} λ_{k,i} u_i,  where  λ_{k,i} = ⟨v_k, u_i⟩ / ‖u_i‖² if u_i ≠ 0, and λ_{k,i} = 0 if u_i = 0.

Then u_1, ..., u_n is a sequence of mutually orthogonal vectors satisfying

Span(u_1, ..., u_k) = Span(v_1, ..., v_k)

for each k = 1, ..., n.

Proposition. If v_1, ..., v_n is a basis of V, then u_1, ..., u_n is an orthogonal basis of V.
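The recursion above can be sketched as follows (assuming NumPy and the standard inner product on ℝⁿ; the function name `gram_schmidt` and the tolerance for detecting a zero vector are implementation choices):

```python
import numpy as np

def gram_schmidt(vectors):
    """Gram-Schmidt: return mutually orthogonal vectors with the same spans.

    Linearly dependent inputs produce zero vectors, matching the
    lambda_{k,i} = 0 convention when u_i = 0."""
    ortho = []
    for v in vectors:
        u = v.astype(float)
        for w in ortho:
            if np.dot(w, w) > 1e-14:           # skip u_i = 0
                u = u - (np.dot(v, w) / np.dot(w, w)) * w
        ortho.append(u)
    return ortho

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
us = gram_schmidt(vs)
```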
QR factorization of a matrix

Given an m × n matrix over F = ℝ or ℂ with linearly independent columns,

A = (v_1 v_2 ... v_n),

by normalizing the vectors obtained in the Gram-Schmidt orthogonalization process, we obtain a factorization A = QR in which

(i) Q = (ũ_1 ũ_2 ... ũ_n) consists of the vectors u_1, ..., u_n normalized: ũ_j = u_j / ‖u_j‖;

(ii) R is the upper triangular matrix

R = [ ‖u_1‖   ⟨v_2, u_1⟩/‖u_1‖   ...   ⟨v_n, u_1⟩/‖u_1‖ ]
    [   0          ‖u_2‖         ...   ⟨v_n, u_2⟩/‖u_2‖ ]
    [   ...                       ...        ...        ]
    [   0            0           ...        ‖u_n‖       ]
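A minimal sketch of this construction in the real case (assuming NumPy and full column rank; the function name `qr_gs` is an illustrative choice, and library routines such as `np.linalg.qr` may differ in sign conventions on R's diagonal):

```python
import numpy as np

def qr_gs(A):
    """QR factorization of a full-column-rank real A via classical Gram-Schmidt."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        u = A[:, j].astype(float)
        for i in range(j):
            # R[i, j] = <v_j, u_i> / ||u_i||, computed against the unit vector.
            R[i, j] = np.dot(A[:, j], Q[:, i])
            u = u - R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(u)   # ||u_j||
        Q[:, j] = u / R[j, j]
    return Q, R

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = qr_gs(A)
# Q has orthonormal columns, R is upper triangular, and A = Q R.
```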