Orthonormal Bases; Gram-Schmidt Process; QR-Decomposition

Orthonormal Bases; Gram-Schmidt Process; QR-Decomposition
MATH 322, Linear Algebra I
J. Robert Buchanan
Department of Mathematics
Spring 2015

Motivation
When working with an inner product space, the most convenient bases are those that
1. consist of orthogonal vectors, and
2. consist of vectors of length 1.

Orthogonal Sets
Definition: A set of vectors in an inner product space is an orthogonal set if all the vectors are pairwise orthogonal. An orthogonal set in which each vector has norm (length) 1 is called an orthonormal set.
Example: The standard basis $\{e_1, e_2, \ldots, e_n\}$ is an orthonormal set in $\mathbb{R}^n$.
Recall: if $v \in V$ and $v \neq 0$ then $u = \dfrac{v}{\|v\|}$ has norm 1.

Example
Create an orthonormal set of vectors from:
$v_1 = (0, 1, 0)$, $v_2 = (1, 0, 1)$, $v_3 = (1, 0, -1)$.
These vectors are already pairwise orthogonal, so it suffices to normalize each one:
$u_1 = (0, 1, 0)$, $u_2 = (1/\sqrt{2}, 0, 1/\sqrt{2})$, $u_3 = (1/\sqrt{2}, 0, -1/\sqrt{2})$.
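As a quick numerical sanity check (an illustration added to this transcription, not part of the original slides), the following NumPy sketch normalizes the three vectors and confirms the results are pairwise orthogonal unit vectors:

```python
import numpy as np

# The orthogonal set from the example above.
vs = [np.array([0.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([1.0, 0.0, -1.0])]

# Normalize each vector: u = v / ||v||.
us = [v / np.linalg.norm(v) for v in vs]

# Dot products should be 1 on the diagonal and 0 off it.
for i, ui in enumerate(us):
    for j, uj in enumerate(us):
        print(i, j, round(float(ui @ uj), 10))
```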

Orthogonality and Linear Independence
Theorem: If $S = \{v_1, v_2, \ldots, v_n\}$ is an orthogonal set of nonzero vectors in an inner product space, then $S$ is linearly independent.

Proof
Suppose there are scalars $k_1, k_2, \ldots, k_n$ such that $k_1 v_1 + k_2 v_2 + \cdots + k_n v_n = 0$. For each $i$, taking the inner product with $v_i$ gives
$\langle k_1 v_1 + k_2 v_2 + \cdots + k_n v_n, v_i \rangle = \langle 0, v_i \rangle$
$k_1 \langle v_1, v_i \rangle + k_2 \langle v_2, v_i \rangle + \cdots + k_n \langle v_n, v_i \rangle = 0$
$k_i \langle v_i, v_i \rangle = 0$
$k_i \|v_i\|^2 = 0$
$k_i = 0$
where the last step uses $v_i \neq 0$. Since this is true for all $i$ we have $k_1 = k_2 = \cdots = k_n = 0$.

Orthonormal Basis
Definition: A basis for an inner product space consisting of orthonormal vectors is called an orthonormal basis. A basis for an inner product space consisting of orthogonal vectors is called an orthogonal basis.
Theorem: If $B = \{v_1, v_2, \ldots, v_n\}$ is an orthonormal basis for an inner product space $V$, and if $u \in V$, then
$u = \langle u, v_1 \rangle v_1 + \langle u, v_2 \rangle v_2 + \cdots + \langle u, v_n \rangle v_n.$
Remark: The coordinates of $u$ relative to $B$ are $\langle u, v_1 \rangle, \langle u, v_2 \rangle, \ldots, \langle u, v_n \rangle$.

Example
Let $u_1 = (1, 1, 1)$, $u_2 = (2, 1, -3)$, $u_3 = (4, -5, 1)$. Verify that $\{u_1, u_2, u_3\}$ is an orthogonal basis for $\mathbb{R}^3$ and find an orthonormal basis for $\mathbb{R}^3$. Express $u = (1, 0, 1)$ in terms of this orthonormal basis.

Solution
Since $\|u_1\| = \sqrt{3}$, $\|u_2\| = \sqrt{14}$, and $\|u_3\| = \sqrt{42}$, the vectors
$v_1 = \frac{1}{\sqrt{3}}(1, 1, 1)$, $v_2 = \frac{1}{\sqrt{14}}(2, 1, -3)$, $v_3 = \frac{1}{\sqrt{42}}(4, -5, 1)$
are orthonormal. Then
$u = \langle u, v_1 \rangle v_1 + \langle u, v_2 \rangle v_2 + \langle u, v_3 \rangle v_3 = \frac{2}{\sqrt{3}}\, v_1 - \frac{1}{\sqrt{14}}\, v_2 + \frac{5}{\sqrt{42}}\, v_3.$

Coordinates Relative to Orthonormal Bases
Theorem: If $B$ is an orthonormal basis for an $n$-dimensional inner product space, and if $(u)_B = (u_1, u_2, \ldots, u_n)$ and $(v)_B = (v_1, v_2, \ldots, v_n)$, then
$\|u\| = \sqrt{u_1^2 + u_2^2 + \cdots + u_n^2}$
$d(u, v) = \sqrt{(u_1 - v_1)^2 + (u_2 - v_2)^2 + \cdots + (u_n - v_n)^2}$
$\langle u, v \rangle = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n$
Remark: computing norms, distances, and inner products with coordinates relative to an orthonormal basis is equivalent to computing them in Euclidean coordinates.

Coordinates Relative to Orthogonal Bases
If $B = \{v_1, v_2, \ldots, v_n\}$ is merely an orthogonal basis for $V$, then
$B' = \left\{ \frac{v_1}{\|v_1\|}, \frac{v_2}{\|v_2\|}, \ldots, \frac{v_n}{\|v_n\|} \right\}$
is an orthonormal basis for $V$. Thus if $u \in V$,
$u = \frac{\langle u, v_1 \rangle}{\|v_1\|^2}\, v_1 + \frac{\langle u, v_2 \rangle}{\|v_2\|^2}\, v_2 + \cdots + \frac{\langle u, v_n \rangle}{\|v_n\|^2}\, v_n.$
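To make the formula concrete, here is a small NumPy sketch (added to this transcription, not from the original slides) that computes coordinates relative to an orthogonal basis via $\langle u, v_i \rangle / \|v_i\|^2$, using the orthogonal basis from the preceding example:

```python
import numpy as np

def coords_orthogonal(u, basis):
    """Coordinates of u relative to an orthogonal basis of R^n.

    The i-th coordinate is <u, v_i> / ||v_i||^2; for an orthonormal
    basis every denominator is 1.
    """
    return [float(u @ v) / float(v @ v) for v in basis]

# The orthogonal basis from the earlier slides.
basis = [np.array([1.0, 1.0, 1.0]),
         np.array([2.0, 1.0, -3.0]),
         np.array([4.0, -5.0, 1.0])]
u = np.array([1.0, 0.0, 1.0])

c = coords_orthogonal(u, basis)     # [2/3, -1/14, 5/42]
print(c)
# Reconstruct u from its coordinates as a check.
print(sum(ci * vi for ci, vi in zip(c, basis)))  # [1. 0. 1.]
```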

Orthogonal Projections
Question: how do you create an orthogonal basis for an arbitrary finite-dimensional inner product space?
Theorem (Projection Theorem): If $W$ is a finite-dimensional subspace of an inner product space $V$, then every vector $u \in V$ can be expressed uniquely as
$u = w_1 + w_2$
where $w_1 \in W$ and $w_2 \in W^{\perp}$.
Remark: $w_1$ is called the orthogonal projection of $u$ on $W$, and $w_2$ is called the component of $u$ orthogonal to $W$.

Illustration
$u = w_1 + w_2 = \mathrm{proj}_W u + \mathrm{proj}_{W^{\perp}} u$
Thus $\mathrm{proj}_{W^{\perp}} u = u - \mathrm{proj}_W u$ and
$u = \mathrm{proj}_W u + (u - \mathrm{proj}_W u).$
[Figure: the vector $u$ decomposed into $w_1 = \mathrm{proj}_W u$ lying in the plane $W$ and $w_2 = u - \mathrm{proj}_W u$ orthogonal to $W$.]

Calculating Orthogonal Projections
Theorem: Suppose $W$ is a finite-dimensional subspace of an inner product space $V$.
1. If $B = \{v_1, v_2, \ldots, v_r\}$ is an orthonormal basis for $W$ and $u \in V$, then
$\mathrm{proj}_W u = \langle u, v_1 \rangle v_1 + \langle u, v_2 \rangle v_2 + \cdots + \langle u, v_r \rangle v_r.$
2. If $B = \{v_1, v_2, \ldots, v_r\}$ is an orthogonal basis for $W$ and $u \in V$, then
$\mathrm{proj}_W u = \frac{\langle u, v_1 \rangle}{\|v_1\|^2}\, v_1 + \frac{\langle u, v_2 \rangle}{\|v_2\|^2}\, v_2 + \cdots + \frac{\langle u, v_r \rangle}{\|v_r\|^2}\, v_r.$

Example
$B = \{v_1, v_2, v_3\} = \left\{ \frac{1}{\sqrt{3}}(1, 1, 1),\; \frac{1}{\sqrt{14}}(2, 1, -3),\; \frac{1}{\sqrt{42}}(4, -5, 1) \right\}$
is an orthonormal basis for $\mathbb{R}^3$. Let $W = \mathrm{span}\{v_1, v_2\}$. Calculate $\mathrm{proj}_W (1, 0, 1)$.

Solution
$\mathrm{proj}_W (1, 0, 1) = \langle (1, 0, 1), v_1 \rangle v_1 + \langle (1, 0, 1), v_2 \rangle v_2 = \frac{2}{\sqrt{3}}\, v_1 - \frac{1}{\sqrt{14}}\, v_2 = \frac{2}{3}(1, 1, 1) - \frac{1}{14}(2, 1, -3) = \left( \frac{22}{42}, \frac{25}{42}, \frac{37}{42} \right)$
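A short NumPy check of this computation (added for illustration; not part of the original slides):

```python
import numpy as np

# Orthonormal vectors v1, v2 spanning the plane W.
v1 = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
v2 = np.array([2.0, 1.0, -3.0]) / np.sqrt(14)
u = np.array([1.0, 0.0, 1.0])

# proj_W u = <u, v1> v1 + <u, v2> v2 for an orthonormal basis.
proj = (u @ v1) * v1 + (u @ v2) * v2
print(proj)                         # [0.5238... 0.5952... 0.8809...]
print(np.array([22, 25, 37]) / 42)  # the same values
```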

Gram-Schmidt Process
Remark: we know that every nonzero finite-dimensional vector space has a basis. We can develop a stronger statement for inner product spaces.
Theorem: Every nonzero finite-dimensional inner product space has an orthonormal basis.
Proof. The proof is an algorithm known as the Gram-Schmidt Process.

Gram-Schmidt Algorithm
Let $S = \{v_1, v_2, \ldots, v_n\}$ be a basis for $V$. To construct an orthonormal basis for $V$ we perform the following steps (see the sketch after this list).
1. Let $u_1 = v_1$ and define $W_1 = \mathrm{span}\{u_1\}$.
2. Let $u_2 = v_2 - \mathrm{proj}_{W_1} v_2 = v_2 - \frac{\langle v_2, u_1 \rangle}{\|u_1\|^2}\, u_1$ and define $W_2 = \mathrm{span}\{u_1, u_2\}$.
3. Let $u_3 = v_3 - \mathrm{proj}_{W_2} v_3 = v_3 - \frac{\langle v_3, u_1 \rangle}{\|u_1\|^2}\, u_1 - \frac{\langle v_3, u_2 \rangle}{\|u_2\|^2}\, u_2$ and define $W_3 = \mathrm{span}\{u_1, u_2, u_3\}$.
4. Continue in this manner until we have defined $B = \{u_1, u_2, \ldots, u_n\}$, an orthogonal basis for $V$.
5. $B' = \left\{ \frac{u_1}{\|u_1\|}, \frac{u_2}{\|u_2\|}, \ldots, \frac{u_n}{\|u_n\|} \right\}$ is an orthonormal basis for $V$.
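The steps above translate directly into code. Here is a minimal NumPy sketch (an illustration added to this transcription, not part of the original slides), assuming the Euclidean inner product on $\mathbb{R}^n$:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors in R^n."""
    ortho = []  # the orthogonal (not yet normalized) vectors u_1, ..., u_n
    for v in vectors:
        u = np.array(v, dtype=float)
        for w in ortho:
            # Subtract the component of u along w: (<u, w>/||w||^2) w.
            u = u - (u @ w) / (w @ w) * w
        ortho.append(u)
    # Step 5: normalize each orthogonal vector.
    return [u / np.linalg.norm(u) for u in ortho]
```

Subtracting from the running vector u rather than from the original v is the "modified" Gram-Schmidt variant; it agrees with the algorithm above in exact arithmetic and is numerically more stable.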

Example
Given the basis $\{(1, 2, -2), (4, 3, 2), (1, 2, 1)\}$, find an orthonormal basis for $\mathbb{R}^3$.

Solution
Step 1: $v_1 = \dfrac{(1, 2, -2)}{\|(1, 2, -2)\|} = \left( \frac{1}{3}, \frac{2}{3}, -\frac{2}{3} \right)$
Step 2: $v_2 = (4, 3, 2) - \frac{\langle (4, 3, 2), v_1 \rangle}{\|v_1\|^2}\, v_1 = (4, 3, 2) - \langle (4, 3, 2), v_1 \rangle v_1$ since $v_1$ is a unit vector, so
$v_2 = (4, 3, 2) - 2 v_1 = \left( \frac{10}{3}, \frac{5}{3}, \frac{10}{3} \right).$
We replace $v_2$ with a unit vector in the same direction, so $v_2 = (2/3, 1/3, 2/3)$.
Step 3: $v_3 = (1, 2, 1) - \langle (1, 2, 1), v_1 \rangle v_1 - \langle (1, 2, 1), v_2 \rangle v_2$ (since $v_1$ and $v_2$ are unit vectors), so
$v_3 = (1, 2, 1) - (1) v_1 - (2) v_2 = \left( -\frac{2}{3}, \frac{2}{3}, \frac{1}{3} \right).$
Vector $v_3$ is already a unit vector.
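Running the gram_schmidt sketch from the previous slide on this basis reproduces the same three vectors (up to floating-point rounding):

```python
us = gram_schmidt([[1, 2, -2], [4, 3, 2], [1, 2, 1]])
# us[0] ~ [ 1/3,  2/3, -2/3]
# us[1] ~ [ 2/3,  1/3,  2/3]
# us[2] ~ [-2/3,  2/3,  1/3]
```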

Example
Let the vector space $P_2$ have the inner product $\langle p, q \rangle = \int_{-1}^{1} p(x)\, q(x)\, dx$. Given $\{1, x, x^2\}$, find an orthonormal basis for $P_2$.

Solution (1 of 2)
$\|1\| = \sqrt{\langle 1, 1 \rangle} = \sqrt{\int_{-1}^{1} (1)^2\, dx} = \sqrt{2} \quad\Longrightarrow\quad v_1 = \frac{1}{\sqrt{2}}$
$u_2 = x - \langle x, v_1 \rangle v_1 = x - \left( \int_{-1}^{1} \frac{x}{\sqrt{2}}\, dx \right) \frac{1}{\sqrt{2}} = x$
$\|x\| = \sqrt{\int_{-1}^{1} x^2\, dx} = \sqrt{\frac{2}{3}} \quad\Longrightarrow\quad v_2 = \sqrt{\frac{3}{2}}\, x$

Solution (2 of 2)
With $v_1 = \frac{1}{\sqrt{2}}$ and $v_2 = \sqrt{\frac{3}{2}}\, x$,
$u_3 = x^2 - \left\langle x^2, \frac{1}{\sqrt{2}} \right\rangle \frac{1}{\sqrt{2}} - \left\langle x^2, \sqrt{\frac{3}{2}}\, x \right\rangle \sqrt{\frac{3}{2}}\, x = x^2 - \frac{1}{3} - \sqrt{\frac{3}{2}}\,(0)\, x = x^2 - \frac{1}{3}$
$v_3 = \frac{x^2 - 1/3}{\sqrt{\left\langle x^2 - \frac{1}{3}, x^2 - \frac{1}{3} \right\rangle}} = \sqrt{\frac{5}{8}}\,(3x^2 - 1)$
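The same computation can be checked symbolically. The following SymPy sketch (an added illustration, not part of the original slides) runs the Gram-Schmidt process on $\{1, x, x^2\}$ with the inner product $\langle p, q \rangle = \int_{-1}^{1} p\,q\, dx$; printed forms may vary by SymPy version:

```python
import sympy as sp

x = sp.symbols('x')

def ip(p, q):
    # Inner product on P2: integrate p(x) q(x) over [-1, 1].
    return sp.integrate(p * q, (x, -1, 1))

basis, ortho = [1, x, x**2], []
for p in basis:
    u = p - sum(ip(p, q) * q for q in ortho)        # subtract projections
    ortho.append(sp.simplify(u / sp.sqrt(ip(u, u))))  # normalize

print(ortho)
# [sqrt(2)/2, sqrt(6)*x/2, sqrt(10)*(3*x**2 - 1)/4]
# These equal 1/sqrt(2), sqrt(3/2) x, and sqrt(5/8)(3x^2 - 1).
```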

Extending Orthonormal Sets
Recall: earlier we stated a theorem which said that a linearly independent set in a finite-dimensional vector space can be enlarged to a basis by adding appropriate vectors.
Theorem: If $W$ is a finite-dimensional inner product space, then:
1. Every orthogonal set of nonzero vectors in $W$ can be enlarged to an orthogonal basis for $W$.
2. Every orthonormal set of vectors in $W$ can be enlarged to an orthonormal basis for $W$.

QR-Decomposition
Remark: the Gram-Schmidt Process can be used to factor certain matrices.
Theorem (QR-Decomposition): If $A$ is an $m \times n$ matrix with linearly independent column vectors, then $A$ can be factored as $A = QR$, where $Q$ is an $m \times n$ matrix with orthonormal column vectors and $R$ is an $n \times n$ invertible upper triangular matrix.

QR-Decomposition (continued)
Proof. Suppose $A = [\, u_1 \;\; u_2 \;\; \cdots \;\; u_n \,]$ and $\mathrm{rank}(A) = n$. Apply the Gram-Schmidt process to $\{u_1, u_2, \ldots, u_n\}$ to produce the orthonormal basis $\{q_1, q_2, \ldots, q_n\}$ and define the matrix $Q = [\, q_1 \;\; q_2 \;\; \cdots \;\; q_n \,]$.

QR-Decomposition (continued)
Proof. Note that
$u_1 = \langle u_1, q_1 \rangle q_1 + \langle u_1, q_2 \rangle q_2 + \cdots + \langle u_1, q_n \rangle q_n$
$u_2 = \langle u_2, q_1 \rangle q_1 + \langle u_2, q_2 \rangle q_2 + \cdots + \langle u_2, q_n \rangle q_n$
$\vdots$
$u_n = \langle u_n, q_1 \rangle q_1 + \langle u_n, q_2 \rangle q_2 + \cdots + \langle u_n, q_n \rangle q_n$
Writing this system of equations in matrix form yields
$$A = Q \begin{bmatrix} \langle u_1, q_1 \rangle & \langle u_2, q_1 \rangle & \cdots & \langle u_n, q_1 \rangle \\ \langle u_1, q_2 \rangle & \langle u_2, q_2 \rangle & \cdots & \langle u_n, q_2 \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle u_1, q_n \rangle & \langle u_2, q_n \rangle & \cdots & \langle u_n, q_n \rangle \end{bmatrix} = QR$$

QR-Decomposition (continued)
Proof. By definition of the Gram-Schmidt process,
$q_i = u_i - \sum_{j=1}^{i-1} \langle u_i, q_j \rangle q_j \quad\Longleftrightarrow\quad u_i = q_i + \sum_{j=1}^{i-1} \langle u_i, q_j \rangle q_j,$
so $u_i \in \mathrm{span}\{q_1, q_2, \ldots, q_i\}$. Thus for $j \geq 2$ the vector $q_j$ is orthogonal to $u_1, u_2, \ldots, u_{j-1}$.

QR-Decomposition (continued)
Proof. Thus
$$R = \begin{bmatrix} \langle u_1, q_1 \rangle & \langle u_2, q_1 \rangle & \cdots & \langle u_n, q_1 \rangle \\ 0 & \langle u_2, q_2 \rangle & \cdots & \langle u_n, q_2 \rangle \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \langle u_n, q_n \rangle \end{bmatrix}$$
We must still show that $\langle u_i, q_i \rangle \neq 0$ for $i = 1, 2, \ldots, n$ in order for $R$ to be nonsingular.

QR-Decomposition (continued)
Proof. Recall $u_i = q_i + \sum_{j=1}^{i-1} \langle u_i, q_j \rangle q_j$, so
$\langle u_i, q_i \rangle = \left\langle q_i + \sum_{j=1}^{i-1} \langle u_i, q_j \rangle q_j,\; q_i \right\rangle = \langle q_i, q_i \rangle + \sum_{j=1}^{i-1} \langle u_i, q_j \rangle \underbrace{\langle q_j, q_i \rangle}_{=0} = \|q_i\|^2 \neq 0$

QR-Decomposition (continued)
Proof. Thus (finally)
$$R = \begin{bmatrix} \langle u_1, q_1 \rangle & \langle u_2, q_1 \rangle & \cdots & \langle u_n, q_1 \rangle \\ 0 & \langle u_2, q_2 \rangle & \cdots & \langle u_n, q_2 \rangle \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \langle u_n, q_n \rangle \end{bmatrix}$$
which is invertible.
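Because the columns of $Q$ are orthonormal, the entries $\langle u_j, q_i \rangle$ of $R$ are exactly the entries of $Q^T A$, so once Gram-Schmidt has produced $Q$, the factor $R$ follows in one step. A NumPy sketch of the whole factorization along the lines of the proof (an added illustration; gram_schmidt repeats the earlier sketch so this block stands alone):

```python
import numpy as np

def gram_schmidt(vectors):
    # Same sketch as in the Gram-Schmidt Algorithm section above.
    ortho = []
    for v in vectors:
        u = np.array(v, dtype=float)
        for w in ortho:
            u = u - (u @ w) / (w @ w) * w
        ortho.append(u)
    return [u / np.linalg.norm(u) for u in ortho]

def qr_decomposition(A):
    """A = QR: Q via Gram-Schmidt on A's columns, R = Q^T A.

    R's (i, j) entry is <u_j, q_i>, which vanishes for i > j by the
    orthogonality argument in the proof, so R is upper triangular.
    """
    Q = np.column_stack(gram_schmidt(list(A.T)))
    return Q, Q.T @ A
```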

Example
Find the QR-decomposition of a given $3 \times 3$ matrix $A$ with linearly independent columns. Hint: first apply the Gram-Schmidt process to the columns of $A$; the resulting orthonormal basis for the column space of $A$ supplies the columns of $Q$.
[The specific entries of $A$ and of the hinted orthonormal basis did not survive transcription.]
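In practice one rarely hand-rolls this factorization; NumPy provides it directly. A brief sketch with an arbitrary illustrative matrix (chosen for this example, not taken from the slides); note that library routines may negate some columns of $Q$ and the corresponding rows of $R$, since the signs in a QR-decomposition are only determined up to that choice:

```python
import numpy as np

# An arbitrary 3x3 matrix with linearly independent columns.
A = np.array([[1.0, 2.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

Q, R = np.linalg.qr(A)
print(Q)          # orthonormal columns: Q^T Q = I
print(R)          # upper triangular
print(Q @ R - A)  # ~0: the factorization reproduces A
```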

Proof of Projection Theorem
Theorem (Projection Theorem): If $W$ is a finite-dimensional subspace of an inner product space $V$, then every vector $u \in V$ can be expressed uniquely as
$u = w_1 + w_2$
where $w_1 \in W$ and $w_2 \in W^{\perp}$.
Proof. There are two steps:
1. Find $w_1$ and $w_2$ with the stated properties.
2. Show that $w_1$ and $w_2$ are unique.

Homework
Read Section 6.3
Exercises: 1, 3, 7, 9, 15, 19, 27, 29