Orthogonality and the Gram-Schmidt Process

In Chapter 4, we spent a great deal of time studying the problem of finding a basis for a vector space. We know that a basis for a vector space can potentially be chosen in many different ways: for example, the sets

B_1 = { (1, 0), (0, 1) }   and   B_2 = { (1, 3), (2, 2) }

both form bases for R^2. The vectors from the first basis are graphed below in red, and the vectors from the second basis are graphed in blue:

[Figure: the vectors of B_1 (red) and B_2 (blue) graphed in the plane.]

There are two things that are particularly nice about the first basis B_1 as compared with the second basis:

- The vectors in B_1 are orthogonal with respect to the usual inner product (dot product) on R^2.
- Both of the vectors in B_1 are unit vectors (length 1) with respect to the dot product.

Since a basis gives us a means for representing the coordinates or location of any vector in the space, bases with the two properties listed above are especially advantageous; in this section, we will describe a method for locating bases of a vector space that have these two properties.
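Both claims are easy to verify numerically. Here is a quick sketch using NumPy (an illustration added here, not part of the original notes), checking the dot products and lengths for the two bases above:

```python
import numpy as np

# The two bases discussed above.
B1 = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
B2 = [np.array([1.0, 3.0]), np.array([2.0, 2.0])]

for name, (a, b) in [("B1", B1), ("B2", B2)]:
    print(name,
          "dot product:", np.dot(a, b),                      # 0 means orthogonal
          "lengths:", np.linalg.norm(a), np.linalg.norm(b))  # 1 means unit vector

# B1: dot product 0, lengths 1 and 1 (orthogonal unit vectors).
# B2: dot product 8, lengths sqrt(10) and sqrt(8) (neither property holds).
```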

Orthogonal and Orthonormal Sets

It is now a convenient time to develop some vocabulary to describe the types of basis vectors that we are looking for:

Definition. A set of vectors in a (real) inner product space V equipped with the inner product ⟨·, ·⟩ is called orthogonal if each distinct pair of vectors is orthogonal with respect to ⟨·, ·⟩. If in addition each vector in the set has norm 1 with respect to the inner product, the set is called orthonormal.

Returning to our example of two different bases for R^2, it is easy to see that the red vectors form an orthonormal set: they are both orthogonal and length 1, and we say that they form an orthonormal basis for R^2. However, the blue vectors are neither orthogonal nor length 1; thus they form neither an orthonormal nor an orthogonal basis for R^2.

Fortunately, even if a vector v is not a unit vector, there is a simple algorithm for finding a unit vector preserving the direction of v: if v is not the zero vector, so that ‖v‖ ≠ 0, then the vector

u_v = v / ‖v‖

is a unit vector, and has the same direction as v.

We can quickly check that each of these claims is true. First, let's calculate the length of u_v:

‖u_v‖ = ‖v / ‖v‖‖ = ‖v‖ / ‖v‖ = 1,

since ‖v‖ is a nonnegative scalar; so u_v is indeed a unit vector. Let's check that u_v and v have the same direction by calculating the angle θ between them: recall that

θ = cos⁻¹( ⟨u_v, v⟩ / (‖u_v‖ ‖v‖) )
  = cos⁻¹( ⟨v/‖v‖, v⟩ / ‖v‖ )          (since ‖u_v‖ = 1)
  = cos⁻¹( ⟨v, v⟩ / ‖v‖^2 )
  = cos⁻¹( ⟨v, v⟩ / ⟨v, v⟩ )
  = cos⁻¹(1)
  = 0.

Since the angle θ between u_v and v is 0, the vectors clearly have the same direction. The process of finding a unit vector with the same direction as v is called normalizing v.

Key Point. We can normalize any nonzero vector v by calculating u_v = v / ‖v‖. The normalized vector u_v is the unit vector in the same direction as v.

Regardless of their length, orthogonal vectors are extremely important for the following reason:

Theorem 6.3.1. If S = {v_1, v_2, ..., v_n} is an orthogonal set of (nonzero) vectors in an inner product space V, then S is a linearly independent set.
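The normalization recipe translates directly into code. Below is a minimal sketch in NumPy (an added illustration, not from the original notes) that normalizes a vector and then repeats the two checks made above, unit length and angle 0:

```python
import numpy as np

def normalize(v):
    """Return the unit vector v/||v|| in the same direction as a nonzero v."""
    length = np.linalg.norm(v)
    if length == 0:
        raise ValueError("cannot normalize the zero vector")
    return v / length

v = np.array([1.0, 3.0])
u_v = normalize(v)

print(np.linalg.norm(u_v))   # 1.0, so u_v is a unit vector
cos_theta = np.dot(u_v, v) / (np.linalg.norm(u_v) * np.linalg.norm(v))
print(np.arccos(np.clip(cos_theta, -1.0, 1.0)))   # 0.0, so the angle is 0
```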

Theorem 6.3.1 leads to a helpful observation:

Key Point. A set of n orthogonal vectors in an n-dimensional inner product space V is a basis for V.

Example. The vectors

f = f(x) = 1 + x^2,   g = g(x) = 2x,   and   h = h(x) = -1 + x^2

form a basis for P_2.

1. Is the basis an orthogonal basis under the usual inner product on P_2?
2. Is the basis an orthonormal basis?
3. If it is orthogonal but not orthonormal, use the vectors above to find a basis for P_2 that is orthonormal.

Recall that the standard inner product on P_2 is defined on vectors

f = f(x) = a_0 + a_1 x + a_2 x^2   and   g = g(x) = b_0 + b_1 x + b_2 x^2

in P_2 by

⟨f, g⟩ = a_0 b_0 + a_1 b_1 + a_2 b_2.

To answer all of the questions above, we'll need to calculate the lengths of each vector and the angles between each pair of vectors; since both calculations involve inner products, we record each pairwise inner product in the chart below:

         f     g     h
   f     2     0     0
   g     0     4     0
   h     0     0     2

Since the length of a vector f is given by ‖f‖ = √⟨f, f⟩, we see that

‖f‖ = √2,   ‖g‖ = 2,   and   ‖h‖ = √2.

The angle between a pair of vectors is given by

θ = cos⁻¹( ⟨f, g⟩ / (‖f‖ ‖g‖) );

however, inspecting the chart above, we see that

⟨f, g⟩ = ⟨g, h⟩ = ⟨f, h⟩ = 0.

Since cos⁻¹(0) = π/2, the three vectors are clearly mutually orthogonal. We are now ready to answer each of the questions posed above:

1. Is the basis an orthogonal basis under the usual inner product on P_2? Since the basis vectors are mutually orthogonal, the basis is an orthogonal basis.

2. Is the basis an orthonormal basis? None of the basis vectors are unit vectors, so the basis is orthogonal but not orthonormal.

3. If it is orthogonal but not orthonormal, use the vectors above to find a basis for P_2 that is orthonormal. We can use the vectors above to create unit vectors using the formula u_v = v / ‖v‖. Since ‖f‖ = √2, ‖g‖ = 2, and ‖h‖ = √2, the vectors

u_f = (1/√2)(1 + x^2),   u_g = x,   and   u_h = (1/√2)(-1 + x^2)

are unit vectors. Since they preserve the directions of f, g, and h respectively, u_f, u_g, and u_h are mutually orthogonal (and so are linearly independent, as well as a basis for the three-dimensional P_2). Since they are also unit vectors, they form an orthonormal basis for P_2.

Examples of Orthonormal Bases

Standard Basis for R^n. The standard basis vectors

e_1 = (1, 0, ..., 0),   e_2 = (0, 1, ..., 0),   ...,   e_n = (0, 0, ..., 1)

of R^n are mutually orthogonal unit vectors under the standard inner product (dot product) on R^n, and thus form an orthonormal basis for R^n.

Standard Basis for P_n. The standard basis vectors 1, x, x^2, ..., x^n of the vector space P_n of all polynomials of degree no more than n are mutually orthogonal unit vectors under the standard inner product on P_n defined by

⟨f, g⟩ = a_0 b_0 + a_1 b_1 + ... + a_n b_n.
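Because the standard inner product on P_2 is just the dot product of coefficient vectors, the whole example can be checked numerically. A sketch (an added illustration; polynomials are stored here as coefficient vectors (a_0, a_1, a_2)):

```python
import numpy as np

# f = 1 + x^2, g = 2x, h = -1 + x^2 as coefficient vectors (a0, a1, a2).
f = np.array([1.0, 0.0, 1.0])
g = np.array([0.0, 2.0, 0.0])
h = np.array([-1.0, 0.0, 1.0])

print(f @ g, g @ h, f @ h)                     # 0.0 0.0 0.0 -> mutually orthogonal
print([np.linalg.norm(p) for p in (f, g, h)])  # [sqrt(2), 2, sqrt(2)] -> not unit

# Normalizing produces the orthonormal basis u_f, u_g, u_h found above.
u_f, u_g, u_h = (p / np.linalg.norm(p) for p in (f, g, h))
print(u_f, u_g, u_h)   # [0.707.. 0 0.707..] [0 1 0] [-0.707.. 0 0.707..]
```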

An Orthonormal Basis for M_n. Let e_ij denote the n × n matrix whose (i, j) entry is 1 and all of whose other entries are 0. For example, in M_4, e_13 is given by

e_13 = [ 0  0  1  0 ]
       [ 0  0  0  0 ]
       [ 0  0  0  0 ]
       [ 0  0  0  0 ]

Under the standard inner product on M_n,

⟨U, V⟩ = tr(Uᵀ V),

the set of all matrices of the form e_ij for i, j = 1, ..., n forms an orthonormal basis for M_n.

Coordinates and Orthogonal/Orthonormal Bases

In Section 4.4, we learned that if the (ordered) set S = {v_1, ..., v_n} of vectors in V forms a basis for V, so that any vector u in V is a linear combination of the basis vectors, say

u = c_1 v_1 + c_2 v_2 + ... + c_n v_n,

then we can use the coefficients c_1, c_2, etc. of the basis vectors to write the coordinates of u under the basis as the column matrix

(u)_S = (c_1, c_2, ..., c_n)ᵀ.

As we saw in Section 4.4, it can often be complicated to find the desired coordinates for a given vector. However, if the basis being used for the coordinate representation is orthogonal or orthonormal, there is a simple formula for finding the coordinates of a given vector:

Theorem 6.3.2. Let S = {v_1, ..., v_n} be a basis for an inner product space V under the inner product ⟨·, ·⟩.

1. If S is an orthogonal basis and u is any vector in V, then u may be written as the linear combination

u = (⟨u, v_1⟩/‖v_1‖^2) v_1 + (⟨u, v_2⟩/‖v_2‖^2) v_2 + ... + (⟨u, v_n⟩/‖v_n‖^2) v_n,

and the coordinates of u under this basis are given by the column matrix

(u)_S = ( ⟨u, v_1⟩/‖v_1‖^2, ⟨u, v_2⟩/‖v_2‖^2, ..., ⟨u, v_n⟩/‖v_n‖^2 )ᵀ.

2. If S is an orthonormal basis and u is any vector in V, then u may be written as the linear combination

u = ⟨u, v_1⟩ v_1 + ⟨u, v_2⟩ v_2 + ... + ⟨u, v_n⟩ v_n,

and the coordinates of u under this basis are given by the column matrix

(u)_S = ( ⟨u, v_1⟩, ⟨u, v_2⟩, ..., ⟨u, v_n⟩ )ᵀ.

Example. Find the coordinates of p = p(x) = 2 + x + 4x^2 under the orthonormal basis

{ u_f = (1/√2)(1 + x^2),   u_g = x,   u_h = (1/√2)(-1 + x^2) }.

Since the given basis is orthonormal, the coordinates of p are given by the column matrix

( ⟨p, u_f⟩, ⟨p, u_g⟩, ⟨p, u_h⟩ )ᵀ.

Let's make the calculations:

⟨p, u_f⟩ = ⟨2 + x + 4x^2, (1/√2)(1 + x^2)⟩ = (1/√2)(2) + 0 + (1/√2)(4) = 6/√2 = 3√2;

⟨p, u_g⟩ = ⟨2 + x + 4x^2, x⟩ = 0 + 1 + 0 = 1;

⟨p, u_h⟩ = ⟨2 + x + 4x^2, (1/√2)(-1 + x^2)⟩ = (1/√2)(-2) + 0 + (1/√2)(4) = 2/√2 = √2.

Thus the coordinates of p with respect to this basis are given by

( ⟨p, u_f⟩, ⟨p, u_g⟩, ⟨p, u_h⟩ )ᵀ = ( 3√2, 1, √2 )ᵀ.
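Theorem 6.3.2 reduces the coordinate computation to three inner products, which makes it easy to confirm this example numerically. A sketch (added illustration, using the same coefficient-vector encoding as before):

```python
import numpy as np

# The orthonormal basis of P_2 from the earlier example, as coefficient vectors.
u_f = np.array([1.0, 0.0, 1.0]) / np.sqrt(2)    # (1/sqrt(2))(1 + x^2)
u_g = np.array([0.0, 1.0, 0.0])                 # x
u_h = np.array([-1.0, 0.0, 1.0]) / np.sqrt(2)   # (1/sqrt(2))(-1 + x^2)

p = np.array([2.0, 1.0, 4.0])                   # p(x) = 2 + x + 4x^2

# The coordinates are just the inner products <p, u_i>.
coords = np.array([p @ u_f, p @ u_g, p @ u_h])
print(coords)   # [4.2426.. 1. 1.4142..], i.e. (3*sqrt(2), 1, sqrt(2))

# Sanity check: the linear combination reconstructs p.
print(coords[0] * u_f + coords[1] * u_g + coords[2] * u_h)   # [2. 1. 4.]
```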

Finding an Orthonormal/Orthogonal Basis for an Inner Product Space

The next theorem describes a truly beautiful fact about inner product spaces:

Theorem 6.3.3. Every (finite-dimensional) inner product space has an orthonormal basis.

Of course, the theorem leads to a question: given an inner product space V, how do we find an orthonormal basis for it? The following algorithm (one of the more important and well-known algorithms in the field), called the Gram-Schmidt Process, answers this question. Given any basis for the inner product space of interest, the Gram-Schmidt Process converts it into an orthogonal (or orthonormal, if desired) basis.

The Gram-Schmidt Process. Let {u_1, u_2, ..., u_n} be any basis for an inner product space V with inner product ⟨·, ·⟩. The following process uses this basis to build an orthogonal basis {v_1, v_2, ..., v_n} for V:

1. v_1 = u_1
2. v_2 = u_2 - (⟨u_2, v_1⟩/‖v_1‖^2) v_1
3. v_3 = u_3 - (⟨u_3, v_1⟩/‖v_1‖^2) v_1 - (⟨u_3, v_2⟩/‖v_2‖^2) v_2
4 through n. Repeat the pattern up to v_n.

To convert this basis to an orthonormal one, normalize each of the basis vectors v_1, v_2, ..., v_n.

Essentially, the algorithm works by projecting each old basis vector u_i onto the orthogonal complement of the subspace spanned by the previously constructed vectors v_1, ..., v_{i-1}, as the code sketch below illustrates.
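The process translates directly into code. Here is a minimal sketch for vectors in R^n under the dot product (an added illustration, not from the original notes); running it on the basis from the next example reproduces the hand computation:

```python
import numpy as np

def gram_schmidt(basis, normalize=False):
    """Run the Gram-Schmidt Process on a list of vectors in R^n (dot-product
    inner product). Returns an orthogonal basis, orthonormal if requested."""
    vs = []
    for u in basis:
        v = u.astype(float)
        # Subtract the projection of u onto each previously built v.
        for w in vs:
            v -= (np.dot(u, w) / np.dot(w, w)) * w
        vs.append(v)
    if normalize:
        vs = [v / np.linalg.norm(v) for v in vs]
    return vs

v1, v2 = gram_schmidt([np.array([1, 3]), np.array([2, 2])])
print(v1, v2)   # [1. 3.] [ 1.2 -0.4], i.e. (1, 3) and (6/5, -2/5)

q1, q2 = gram_schmidt([np.array([1, 3]), np.array([2, 2])], normalize=True)
print(q1, q2)   # (1, 3)/sqrt(10) and (3, -1)/sqrt(10)
```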

Example. Convert the basis {(1, 3), (2, 2)} into an orthonormal one.

With u_1 = (1, 3) and u_2 = (2, 2), we begin the Gram-Schmidt Process by setting

v_1 = u_1 = (1, 3).

Next, we calculate v_2 using the formula

v_2 = u_2 - (⟨u_2, v_1⟩/‖v_1‖^2) v_1;

we need to find ⟨u_2, v_1⟩ and ‖v_1‖^2:

⟨u_2, v_1⟩ = 2·1 + 2·3 = 8   and   ‖v_1‖^2 = 1^2 + 3^2 = 10.

Thus we have

v_2 = u_2 - (⟨u_2, v_1⟩/‖v_1‖^2) v_1 = (2, 2) - (8/10)(1, 3) = (2 - 4/5, 2 - 12/5) = (6/5, -2/5).

Thus the set

{ (1, 3), (6/5, -2/5) }

forms an orthogonal basis for R^2; however, as we were asked to find an orthonormal basis, we need to normalize each of the vectors above by multiplying them by the reciprocals of their lengths. We have

v_1/‖v_1‖ = (1/√10)(1, 3) = (1/√10, 3/√10),

and, since ‖v_2‖ = √(36/25 + 4/25) = √(40/25) = (2/5)√10,

v_2/‖v_2‖ = (5/(2√10))(6/5, -2/5) = (3/√10, -1/√10).

Thus the set

{ (1/√10)(1, 3), (1/√10)(3, -1) }

is an orthonormal basis for R^2.
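As an aside (added here, not in the original notes): NumPy's QR factorization performs essentially this orthonormalization on the columns of a matrix, so its Q factor should match the basis we just found, up to sign:

```python
import numpy as np

A = np.column_stack([[1.0, 3.0], [2.0, 2.0]])   # basis vectors as columns
Q, R = np.linalg.qr(A)
print(Q)
# The columns of Q are +/-(1, 3)/sqrt(10) and +/-(3, -1)/sqrt(10),
# the orthonormal basis computed above, up to sign.
```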

The two vectors from the original basis are graphed below:

[Figure: the original basis vectors (1, 3) and (2, 2).]

As we saw earlier, they are neither orthogonal nor orthonormal. The new basis vectors produced by the Gram-Schmidt Process are graphed in red:

[Figure: the new basis vectors (1/√10)(1, 3) and (1/√10)(3, -1), in red.]

Notice that they are orthogonal, and that they appear to be the same length (unit vectors). Finally, the original basis and the new basis are graphed together:

[Figure: the original and new basis vectors graphed together.]

Notice that one of the new basis vectors is just a scaled-down version of one of the old vectors, whereas the second new basis vector has been moved so that it is orthogonal to the first.