Orthonormal Bases; Gram-Schmidt Process; QR-Decomposition. MATH 322, Linear Algebra I. J. Robert Buchanan, Department of Mathematics. Spring 2015
Motivation: When working with an inner product space, the most convenient bases are those that 1. consist of orthogonal vectors, and 2. consist of vectors of length 1.
Orthogonal Sets. Definition: A set of vectors in an inner product space is an orthogonal set if the vectors are pairwise orthogonal. An orthogonal set in which each vector has norm (length) 1 is called an orthonormal set. Example: The standard basis {e_1, e_2, ..., e_n} is an orthonormal set in R^n. Recall: if v ∈ V and v ≠ 0, then u = v/‖v‖ has norm 1.
Example: Create an orthonormal set of vectors from: v_1 = (0, 1, 0), v_2 = (1, 0, 1), v_3 = (1, 0, −1). Normalizing each vector yields u_1 = (0, 1, 0), u_2 = (1/√2, 0, 1/√2), u_3 = (1/√2, 0, −1/√2).
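The normalization step in this example can be sketched in code. A minimal NumPy sketch assuming the Euclidean inner product on R³; the helper name `normalize` is our own.

```python
# Normalizing an orthogonal set into an orthonormal set: each vector is
# divided by its norm, which preserves pairwise orthogonality.
import numpy as np

def normalize(v):
    """Return the unit vector v / ||v||."""
    return v / np.linalg.norm(v)

vs = [np.array([0.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([1.0, 0.0, -1.0])]
us = [normalize(v) for v in vs]   # each u_i has norm 1
```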
Orthogonality and Linear Independence. Theorem: If S = {v_1, v_2, ..., v_n} is an orthogonal set of nonzero vectors in an inner product space, then S is linearly independent.
Proof: Suppose there are scalars k_1, k_2, ..., k_n such that k_1 v_1 + k_2 v_2 + ... + k_n v_n = 0. Each scalar k_i = 0 since ⟨k_1 v_1 + k_2 v_2 + ... + k_n v_n, v_i⟩ = ⟨0, v_i⟩ implies k_1⟨v_1, v_i⟩ + k_2⟨v_2, v_i⟩ + ... + k_n⟨v_n, v_i⟩ = 0, which reduces to k_i⟨v_i, v_i⟩ = 0, i.e., k_i‖v_i‖² = 0; since v_i ≠ 0 this forces k_i = 0. Since this is true for all i we have k_1 = k_2 = ... = k_n = 0.
Orthonormal Basis. Definition: A basis for an inner product space consisting of orthonormal vectors is called an orthonormal basis. A basis consisting of orthogonal vectors is called an orthogonal basis. Theorem: If B = {v_1, v_2, ..., v_n} is an orthonormal basis for an inner product space V, and if u ∈ V, then u = ⟨u, v_1⟩v_1 + ⟨u, v_2⟩v_2 + ... + ⟨u, v_n⟩v_n. Remark: The coordinates of u relative to B are ⟨u, v_1⟩, ⟨u, v_2⟩, ..., ⟨u, v_n⟩.
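The coordinate formula in this theorem can be verified numerically. A minimal sketch assuming the Euclidean inner product on R³; the orthonormal basis below is from the earlier example, and the test vector u is our own.

```python
# Coordinates relative to an orthonormal basis: c_i = <u, v_i>, and the
# vector is recovered as u = c_1 v_1 + c_2 v_2 + c_3 v_3.
import numpy as np

r2 = 1 / np.sqrt(2)
B = [np.array([0.0, 1.0, 0.0]),
     np.array([r2, 0.0, r2]),
     np.array([r2, 0.0, -r2])]        # an orthonormal basis of R^3

u = np.array([3.0, -1.0, 2.0])
coords = [u @ v for v in B]           # the coordinates <u, v_i>
reconstructed = sum(c * v for c, v in zip(coords, B))
```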
Example: Let u_1 = (1, 1, 1), u_2 = (2, 1, −3), u_3 = (−4, 5, −1). Verify that {u_1, u_2, u_3} is an orthogonal basis for R³ and find an orthonormal basis for R³. Express u = (1, 0, 1) in terms of this orthonormal basis.
Solution: Since ‖u_1‖ = √3, ‖u_2‖ = √14, and ‖u_3‖ = √42, the vectors v_1 = (1/√3)(1, 1, 1), v_2 = (1/√14)(2, 1, −3), v_3 = (1/√42)(−4, 5, −1) are orthonormal. Then u = ⟨u, v_1⟩v_1 + ⟨u, v_2⟩v_2 + ⟨u, v_3⟩v_3 = (2/√3)v_1 − (1/√14)v_2 − (5/√42)v_3.
Coordinates Relative to Orthonormal Bases. Theorem: If B is an orthonormal basis for an n-dimensional inner product space, and if (u)_B = (u_1, u_2, ..., u_n) and (v)_B = (v_1, v_2, ..., v_n), then ‖u‖ = √(u_1² + u_2² + ... + u_n²), d(u, v) = √((u_1 − v_1)² + (u_2 − v_2)² + ... + (u_n − v_n)²), and ⟨u, v⟩ = u_1 v_1 + u_2 v_2 + ... + u_n v_n. Remark: computing norms, distances, and inner products with coordinates relative to orthonormal bases is equivalent to computing them in Euclidean coordinates.
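This equivalence can be illustrated numerically. A sketch assuming the Euclidean inner product; the orthonormal basis is the one constructed in the preceding example, and the vectors u and w are our own.

```python
# Norms, distances, and inner products computed from coordinates relative
# to an orthonormal basis agree with the direct Euclidean computations.
import numpy as np

v1 = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
v2 = np.array([2.0, 1.0, -3.0]) / np.sqrt(14)
v3 = np.array([-4.0, 5.0, -1.0]) / np.sqrt(42)
B = np.column_stack([v1, v2, v3])     # orthogonal matrix of basis vectors

u = np.array([1.0, 0.0, 1.0])
w = np.array([2.0, -1.0, 3.0])
cu, cw = B.T @ u, B.T @ w             # coordinate vectors (u)_B and (w)_B
```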
Coordinates Relative to Orthogonal Bases: If B = {v_1, v_2, ..., v_n} is merely an orthogonal basis for V, then B' = {v_1/‖v_1‖, v_2/‖v_2‖, ..., v_n/‖v_n‖} is an orthonormal basis for V. Thus if u ∈ V, u = (⟨u, v_1⟩/‖v_1‖²)v_1 + (⟨u, v_2⟩/‖v_2‖²)v_2 + ... + (⟨u, v_n⟩/‖v_n‖²)v_n.
Orthogonal Projections. Question: how do you create an orthogonal basis for an arbitrary finite-dimensional inner product space? Theorem (Projection Theorem): If W is a finite-dimensional subspace of an inner product space V, then every vector u ∈ V can be expressed uniquely as u = w_1 + w_2, where w_1 ∈ W and w_2 ∈ W⊥. Remark: w_1 is called the orthogonal projection of u on W; w_2 is called the component of u orthogonal to W.
Illustration: u = w_1 + w_2 = proj_W u + proj_{W⊥} u. Thus proj_{W⊥} u = u − proj_W u, so u = proj_W u + (u − proj_W u). [Figure: u decomposed into proj_W u lying in W and the perpendicular component u − proj_W u.]
Calculating Orthogonal Projections. Theorem: Suppose W is a finite-dimensional subspace of an inner product space V. 1. If B = {v_1, v_2, ..., v_r} is an orthonormal basis for W and u ∈ V, then proj_W u = ⟨u, v_1⟩v_1 + ⟨u, v_2⟩v_2 + ... + ⟨u, v_r⟩v_r. 2. If B = {v_1, v_2, ..., v_r} is an orthogonal basis for W and u ∈ V, then proj_W u = (⟨u, v_1⟩/‖v_1‖²)v_1 + (⟨u, v_2⟩/‖v_2‖²)v_2 + ... + (⟨u, v_r⟩/‖v_r‖²)v_r.
Example: B = {v_1, v_2, v_3} = {(1/√3)(1, 1, 1), (1/√14)(2, 1, −3), (1/√42)(−4, 5, −1)} is an orthonormal basis for R³. Let W = span{v_1, v_2}. Calculate proj_W (1, 0, 1).
Solution: proj_W (1, 0, 1) = ⟨(1, 0, 1), v_1⟩v_1 + ⟨(1, 0, 1), v_2⟩v_2 = (2/√3)v_1 − (1/√14)v_2 = (2/3)(1, 1, 1) − (1/14)(2, 1, −3) = (11/21, 25/42, 37/42).
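This computation can be checked numerically. A NumPy sketch of the numbers above, assuming the Euclidean inner product; it also confirms that the residual u − proj_W u is orthogonal to W.

```python
# proj_W u = <u,v1> v1 + <u,v2> v2 for the orthonormal pair {v1, v2};
# the residual u - proj_W u must be orthogonal to both basis vectors.
import numpy as np

v1 = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
v2 = np.array([2.0, 1.0, -3.0]) / np.sqrt(14)
u = np.array([1.0, 0.0, 1.0])

proj = (u @ v1) * v1 + (u @ v2) * v2
residual = u - proj
```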
Gram-Schmidt Process Remark: we know that every nonzero finite-dimensional vector space has a basis. We can develop a stronger statement for inner product spaces. Theorem Every nonzero finite-dimensional inner product space has an orthonormal basis. Proof. The proof is an algorithm known as the Gram-Schmidt Process.
Gram-Schmidt Algorithm: Let S = {v_1, v_2, ..., v_n} be a basis for V. To construct an orthonormal basis for V we perform the following steps. 1. Let u_1 = v_1 and define W_1 = span{u_1}. 2. Let u_2 = v_2 − proj_{W_1} v_2 = v_2 − (⟨v_2, u_1⟩/‖u_1‖²)u_1 and define W_2 = span{u_1, u_2}. 3. Let u_3 = v_3 − proj_{W_2} v_3 = v_3 − (⟨v_3, u_1⟩/‖u_1‖²)u_1 − (⟨v_3, u_2⟩/‖u_2‖²)u_2 and define W_3 = span{u_1, u_2, u_3}. 4. Continue in this manner until we have defined B = {u_1, u_2, ..., u_n}, an orthogonal basis for V. 5. B' = {u_1/‖u_1‖, u_2/‖u_2‖, ..., u_n/‖u_n‖} is an orthonormal basis for V.
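The steps above can be sketched in code. A minimal NumPy implementation assuming the Euclidean inner product; it normalizes at each step, so the projection coefficients simplify to ⟨v, q⟩. The function name `gram_schmidt` and the input basis are our own.

```python
# Gram-Schmidt: subtract from each vector its projections onto the
# previously constructed unit vectors, then normalize the remainder.
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis for the span of `vectors`."""
    basis = []
    for v in vectors:
        u = np.array(v, dtype=float)
        for q in basis:                # q is a unit vector, so ||q||^2 = 1
            u = u - (u @ q) * q
        basis.append(u / np.linalg.norm(u))
    return basis

qs = gram_schmidt([[1, 2, -2], [4, 3, 2], [1, 2, 1]])
```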
Example: Given the basis {(1, 2, −2), (4, 3, 2), (1, 2, 1)}, find an orthonormal basis for R³.
Solution: Step 1: v_1 = (1, 2, −2)/‖(1, 2, −2)‖ = (1/3, 2/3, −2/3). Step 2: v_2 = (4, 3, 2) − (⟨(4, 3, 2), v_1⟩/‖v_1‖²)v_1 = (4, 3, 2) − ⟨(4, 3, 2), v_1⟩v_1 since v_1 is a unit vector. Thus v_2 = (4, 3, 2) − 2v_1 = (10/3, 5/3, 10/3). We will replace v_2 with a unit vector in the same direction, so v_2 = (2/3, 1/3, 2/3). Step 3: v_3 = (1, 2, 1) − ⟨(1, 2, 1), v_1⟩v_1 − ⟨(1, 2, 1), v_2⟩v_2 (since v_1 and v_2 are unit vectors). v_3 = (1, 2, 1) − (1)v_1 − (2)v_2 = (−2/3, 2/3, 1/3). Vector v_3 is already a unit vector.
Example: Let vector space P_2 have the inner product ⟨p, q⟩ = ∫_{−1}^{1} p(x)q(x) dx. Given the basis {1, x, x²}, find an orthonormal basis for P_2.
Solution (1 of 2): ⟨1, 1⟩ = ∫_{−1}^{1} (1)² dx = 2, so v_1 = 1/√2. Next, u_2 = x − ⟨x, 1/√2⟩(1/√2) = x, since ⟨x, 1⟩ = ∫_{−1}^{1} x dx = 0. Then ⟨x, x⟩ = ∫_{−1}^{1} x² dx = 2/3, so v_2 = x/‖x‖ = √(3/2) x.
Solution (2 of 2): With v_1 = 1/√2 and v_2 = √(3/2) x, u_3 = x² − ⟨x², 1/√2⟩(1/√2) − ⟨x², √(3/2) x⟩√(3/2) x = x² − (√2/3)(1/√2) − √(3/2)(0)√(3/2) x = x² − 1/3. Then v_3 = (x² − 1/3)/‖x² − 1/3‖ = √(5/8)(3x² − 1).
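The same computation can be sketched with NumPy's polynomial class, carrying out Gram-Schmidt under the integral inner product ⟨p, q⟩ = ∫_{−1}^{1} p(x)q(x) dx. The helper names `inner` and `poly_gs` are our own.

```python
# Gram-Schmidt in P_2 with the inner product <p, q> = integral of p*q
# over [-1, 1]; the result should reproduce the slide's orthonormal basis.
import numpy as np
from numpy.polynomial import Polynomial as P

def inner(p, q):
    """Integral of p(x) q(x) over [-1, 1]."""
    antideriv = (p * q).integ()
    return antideriv(1.0) - antideriv(-1.0)

def poly_gs(polys):
    basis = []
    for p in polys:
        for q in basis:                # subtract projection onto q
            p = p - inner(p, q) * q
        basis.append(p / np.sqrt(inner(p, p)))
    return basis

v1, v2, v3 = poly_gs([P([1]), P([0, 1]), P([0, 0, 1])])
# v3 should match sqrt(5/8) * (3x^2 - 1) from the slide.
```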
Extending Orthonormal Sets Recall: earlier we stated a theorem which said that a linearly independent set in a finite-dimensional vector space can be enlarged to a basis by adding appropriate vectors. Theorem If W is a finite-dimensional inner product space, then: Every orthogonal set of nonzero vectors in W can be enlarged to an orthogonal basis for W. Every orthonormal set of nonzero vectors in W can be enlarged to an orthonormal basis for W.
QR-Decomposition Remark: the Gram-Schmidt Process can be used to factor certain matrices. Theorem (QR-Decomposition) If A is an m n matrix with linearly independent column vectors, then A can be factored as A = QR where Q is an m n matrix with orthonormal column vectors, and R is an n n invertible upper triangular matrix.
QR-Decomposition (continued). Proof: Suppose A = [u_1 u_2 ··· u_n] and rank(A) = n. Apply the Gram-Schmidt process to {u_1, u_2, ..., u_n} to produce the orthonormal basis {q_1, q_2, ..., q_n} and define the matrix Q = [q_1 q_2 ··· q_n].
QR-Decomposition (continued). Proof: Note that
u_1 = ⟨u_1, q_1⟩q_1 + ⟨u_1, q_2⟩q_2 + ... + ⟨u_1, q_n⟩q_n
u_2 = ⟨u_2, q_1⟩q_1 + ⟨u_2, q_2⟩q_2 + ... + ⟨u_2, q_n⟩q_n
...
u_n = ⟨u_n, q_1⟩q_1 + ⟨u_n, q_2⟩q_2 + ... + ⟨u_n, q_n⟩q_n.
Writing this system of equations in matrix form yields
A = Q [ ⟨u_1, q_1⟩ ⟨u_2, q_1⟩ ··· ⟨u_n, q_1⟩ ; ⟨u_1, q_2⟩ ⟨u_2, q_2⟩ ··· ⟨u_n, q_2⟩ ; ... ; ⟨u_1, q_n⟩ ⟨u_2, q_n⟩ ··· ⟨u_n, q_n⟩ ] = QR.
QR-Decomposition (continued). Proof: By the Gram-Schmidt construction (up to normalization), q_i = u_i − Σ_{j=1}^{i−1} ⟨u_i, q_j⟩q_j, so u_i = q_i + Σ_{j=1}^{i−1} ⟨u_i, q_j⟩q_j, which means u_i ∈ span{q_1, q_2, ..., q_i}. Thus for j ≥ 2 the vector q_j is orthogonal to u_1, u_2, ..., u_{j−1}; that is, ⟨u_i, q_j⟩ = 0 whenever j > i.
QR-Decomposition (continued). Proof: Thus
R = [ ⟨u_1, q_1⟩ ⟨u_2, q_1⟩ ··· ⟨u_n, q_1⟩ ; 0 ⟨u_2, q_2⟩ ··· ⟨u_n, q_2⟩ ; ... ; 0 0 ··· ⟨u_n, q_n⟩ ].
We must still show that ⟨u_i, q_i⟩ ≠ 0 for i = 1, 2, ..., n in order for R to be nonsingular.
QR-Decomposition (continued). Proof: Recall u_i = q_i + Σ_{j=1}^{i−1} ⟨u_i, q_j⟩q_j, so ⟨u_i, q_i⟩ = ⟨q_i + Σ_{j=1}^{i−1} ⟨u_i, q_j⟩q_j, q_i⟩ = ⟨q_i, q_i⟩ + Σ_{j=1}^{i−1} ⟨u_i, q_j⟩⟨q_j, q_i⟩ = ‖q_i‖² ≠ 0, since each ⟨q_j, q_i⟩ = 0 for j < i.
QR-Decomposition (continued). Proof: Thus (finally)
R = [ ⟨u_1, q_1⟩ ⟨u_2, q_1⟩ ··· ⟨u_n, q_1⟩ ; 0 ⟨u_2, q_2⟩ ··· ⟨u_n, q_2⟩ ; ... ; 0 0 ··· ⟨u_n, q_n⟩ ],
an upper triangular matrix with nonzero diagonal entries, which is invertible.
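The construction in this proof can be sketched in code: Gram-Schmidt on the columns of A produces Q, and the entries of R are the inner products ⟨u_j, q_i⟩. A NumPy sketch; the function name and the full-rank example matrix are our own.

```python
# QR by Gram-Schmidt: orthonormalize the columns of A to get Q, then
# R = Q^T A has entries <u_j, q_i> and is upper triangular.
import numpy as np

def qr_by_gram_schmidt(A):
    m, n = A.shape
    Q = np.zeros((m, n))
    for j in range(n):
        u = A[:, j].astype(float)
        for i in range(j):             # subtract projections onto q_1..q_{j-1}
            u = u - (u @ Q[:, i]) * Q[:, i]
        Q[:, j] = u / np.linalg.norm(u)
    R = Q.T @ A
    return Q, R

A = np.array([[1.0, 2.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q, R = qr_by_gram_schmidt(A)
```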
Example Example Find the QR-decomposition of A = 2 2 3 2 6. Hint: an orthonormal basis for the column space of A is 2 2, 2, 0 0 0 0 2. 0
Proof of Projection Theorem. Theorem (Projection Theorem): If W is a finite-dimensional subspace of an inner product space V, then every vector u ∈ V can be expressed uniquely as u = w_1 + w_2, where w_1 ∈ W and w_2 ∈ W⊥. Proof: There are two steps: 1. Find w_1 and w_2 with the stated properties. 2. Show that w_1 and w_2 are unique.
Homework: Read Section 6.3. Exercises: 1, 3, 7, 9, 15, 19, 27, 29.