Math 1180, Notes, 14: Orthogonality


We now start using the dot product a lot. Recall that if

$$v = \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix} \quad \text{and} \quad w = \begin{pmatrix} w_1 \\ w_2 \\ \vdots \\ w_n \end{pmatrix}, \quad \text{then} \quad v \cdot w = \sum_{i=1}^{n} v_i w_i.$$

Using this definition, we define the "norm", or length, of a vector $v$ by

$$\|v\| = \sqrt{v \cdot v} = \sqrt{\sum_{i=1}^{n} v_i^2}.$$

If $v \neq 0$, then we can "normalize" $v$ by letting $u = \frac{1}{\|v\|} v$. This is a vector of length $1$ pointing in the same direction as $v$. As examples of vectors of length $1$ in $\mathbb{R}^3$ we have the canonical basis vectors

$$e_1 = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \quad e_2 = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}, \quad \text{and} \quad e_3 = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}.$$

Recall also the definition of orthogonality:

Definition. Two vectors, $v$ and $w$, in $\mathbb{R}^n$ are called "orthogonal" if the dot product $v \cdot w = 0$.
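For concreteness, here is a minimal Python/NumPy sketch of these three definitions; the sample vectors are my own choices, not from the notes.

import numpy as np

v = np.array([3.0, 4.0, 0.0])
w = np.array([1.0, 2.0, 2.0])

dot = np.sum(v * w)              # v . w = sum_i v_i w_i
norm_v = np.sqrt(np.dot(v, v))   # ||v|| = sqrt(v . v)
u = v / norm_v                   # normalized vector in the direction of v

print(dot)             # 11.0
print(norm_v)          # 5.0
print(np.dot(u, u))    # 1.0 (up to rounding), since u has length 1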

We see that each of the canonical basis vectors $e_1$, $e_2$, and $e_3$ is orthogonal to the other two. This turns out to be an important and useful property, which leads to a definition:

Definition. An "orthonormal basis" for $\mathbb{R}^n$ is a basis $\{u_1, \dots, u_n\}$ which has the additional two properties that (a) $u_i \cdot u_j = 0$ if $i \neq j$, and (b) $\|u_i\| = 1$ for $i = 1, \dots, n$. Note that these can be combined into the statement that

$$u_i \cdot u_j = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \neq j. \end{cases}$$

An important fact is that if $V$ is a subspace of dimension $k$, and if $\{v_1, \dots, v_k\}$ is a set of mutually orthogonal nonzero vectors ($v_i \cdot v_j = 0$ if $i \neq j$), then this set of vectors is automatically linearly independent. The trick for proving this is very important in this subject, so I give it here:

Theorem 5.5. If $\{v_1, \dots, v_k\}$ is a set of nonzero mutually orthogonal vectors in a subspace $V$, then it is a basis for its span.

Proof: We only need to show that the vectors are linearly independent. Suppose that

$$\sum_{i=1}^{k} c_i v_i = 0.$$

Then

$$0 = v_1 \cdot \sum_{i=1}^{k} c_i v_i = \sum_{i=1}^{k} c_i \, v_1 \cdot v_i = c_1 \, v_1 \cdot v_1,$$

because $v_1$ is orthogonal to $v_i$ if $i \neq 1$. Since $v_1$ is nonzero, we must have $c_1 = 0$. Similarly, taking the inner product of each $v_j$ with $\sum_{i=1}^{k} c_i v_i$, we obtain $c_j = 0$ for each $j$. Hence $\{v_1, \dots, v_k\}$ is a linearly independent set and so a basis of whatever vector space it spans. By normalizing each vector we could get an orthonormal basis.
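Theorem 5.5 is easy to sanity-check numerically: stack a mutually orthogonal set as the columns of a matrix and confirm that the rank equals the number of vectors, which is exactly linear independence. A small sketch, with vectors chosen by me for illustration:

import numpy as np

# Three mutually orthogonal, nonzero vectors in R^4 (illustrative choice).
v1 = np.array([1.0, 1.0, 1.0, 1.0])
v2 = np.array([1.0, -1.0, 1.0, -1.0])
v3 = np.array([1.0, 1.0, -1.0, -1.0])

A = np.column_stack([v1, v2, v3])

# The pairwise dot products vanish ...
print(np.dot(v1, v2), np.dot(v1, v3), np.dot(v2, v3))  # 0.0 0.0 0.0
# ... so the set is linearly independent: rank = number of vectors.
print(np.linalg.matrix_rank(A))                        # 3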

For example, in $\mathbb{R}^2$ we have the standard basis, which is orthonormal:

$$\left\{ \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \end{pmatrix} \right\}.$$

But another orthonormal basis is

$$\left\{ \begin{pmatrix} 1/\sqrt{2} \\ 1/\sqrt{2} \end{pmatrix}, \begin{pmatrix} 1/\sqrt{2} \\ -1/\sqrt{2} \end{pmatrix} \right\}.$$

More generally, suppose that $\theta$ is any number. Then

$$\left\{ \begin{pmatrix} \cos\theta \\ \sin\theta \end{pmatrix}, \begin{pmatrix} -\sin\theta \\ \cos\theta \end{pmatrix} \right\}$$

is an orthonormal basis. Sometimes we don't care much if the vectors in the basis are normalized. For example, we would say that

$$\left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ -1 \end{pmatrix} \right\}$$

is an "orthogonal basis".

Now suppose that $V = \mathbb{R}^n$, and suppose that $\{v_1, \dots, v_n\}$ is an orthonormal basis for $\mathbb{R}^n$. Then associated with this basis we have an $n \times n$ matrix $Q$ whose columns are the vectors $v_1, \dots, v_n$. The matrix $Q^T$ has the same vectors as its rows (actually, the transposes of these vectors). Since the vectors form an orthonormal set, it follows from the properties of matrix multiplication that $Q^T Q = I$. Equivalently, we could say that $Q^T = Q^{-1}$.

Definition. A matrix $Q$ such that $Q^T Q = I$ is called an "orthogonal" matrix.

Remark: You might think that such a $Q$ should be called "orthonormal". You might be right, but it is not.

Advantage of orthonormal bases. Here is an example to show why orthonormal bases are useful. One important orthogonal (not orthonormal) basis of $\mathbb{R}^n$ is called the "wavelet" basis. The wavelet basis of $\mathbb{R}^4$ is

$$\left\{ \begin{pmatrix} 1 \\ 1 \\ 1 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ 1 \\ -1 \\ -1 \end{pmatrix}, \begin{pmatrix} 1 \\ -1 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 0 \\ 1 \\ -1 \end{pmatrix} \right\}.$$
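The identity $Q^T Q = I$ is easy to verify numerically. A minimal sketch, using the rotation basis above as the columns of $Q$ (the particular angle is an arbitrary choice of mine):

import numpy as np

theta = 0.7  # any angle works
# Columns of Q are the orthonormal basis (cos t, sin t), (-sin t, cos t).
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Q^T Q should be the identity, i.e. Q^T = Q^{-1}.
print(np.allclose(Q.T @ Q, np.eye(2)))      # True
print(np.allclose(Q.T, np.linalg.inv(Q)))   # True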

We can easily make this into an orthonormal basis $\{u_1, u_2, u_3, u_4\}$, where

$$u_1 = \frac{1}{2}\begin{pmatrix} 1 \\ 1 \\ 1 \\ 1 \end{pmatrix}, \quad u_2 = \frac{1}{2}\begin{pmatrix} 1 \\ 1 \\ -1 \\ -1 \end{pmatrix}, \quad u_3 = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ -1 \\ 0 \\ 0 \end{pmatrix}, \quad u_4 = \frac{1}{\sqrt{2}}\begin{pmatrix} 0 \\ 0 \\ 1 \\ -1 \end{pmatrix}.$$

Suppose we want to express some other vector $v$ in terms of these vectors. We want to have

$$v = c_1 u_1 + c_2 u_2 + c_3 u_3 + c_4 u_4.$$

One method for finding the $c_i$ is to set up four equations in four unknowns and solve, using Gaussian elimination. If the basis were not orthonormal, this is what we would have to do. But since it is orthonormal, we can take the inner product $\langle v, u_j \rangle$ using the above sum. As we saw earlier, we get

$$v \cdot u_j = c_j \, (u_j \cdot u_j) = c_j.$$

So finding the four $c_j$ requires evaluating four dot products, which is much quicker than Gaussian elimination. Essentially, a lot of the work of the Gaussian elimination was done in showing that the basis was orthonormal, and we don't have to repeat that each time we have a new $v$.
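In code the contrast between the two methods is visible directly: with an orthonormal basis the coefficients are just dot products, with no system to solve. A sketch using the wavelet basis above (the test vector $v$ is an arbitrary example of mine):

import numpy as np

# Orthonormal wavelet basis of R^4, stored as the columns of U.
U = np.column_stack([
    np.array([1, 1, 1, 1]) / 2,
    np.array([1, 1, -1, -1]) / 2,
    np.array([1, -1, 0, 0]) / np.sqrt(2),
    np.array([0, 0, 1, -1]) / np.sqrt(2),
])

v = np.array([4.0, 2.0, 5.0, -1.0])  # arbitrary example vector

# Orthonormal shortcut: c_j = v . u_j (all four dot products at once).
c = U.T @ v

# General method: solve the 4x4 system U c = v by elimination.
c_solve = np.linalg.solve(U, v)

print(np.allclose(c, c_solve))   # True: same coefficients
print(np.allclose(U @ c, v))     # True: v = sum_j c_j u_j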

Orthogonal projection. Suppose that $u$ and $v$ are vectors in $\mathbb{R}^n$. For convenience, you can think of $\mathbb{R}^2$, for example. Then we define the "projection of $v$ onto $u$" as follows:

$$p = \operatorname{proj}_u v = \left( \frac{u \cdot v}{\|u\|^2} \right) u.$$

Notice that the quantity in parentheses is a scalar, so $p$ is some multiple of $u$. To see why we choose this particular multiple, we consider the vector $v - p$. Recall that this runs from the end of $p$ to the end of $v$.

Proposition. The vector $v - p$ is orthogonal to $u$.

Proof. We take the dot product of $u$ and $v - p$:

$$u \cdot (v - p) = u \cdot v - u \cdot p = u \cdot v - \frac{u \cdot v}{\|u\|^2} \, u \cdot u.$$

But $u \cdot u = \|u\|^2$, so there is cancellation and we get $0$ on the right. This proves that $v - p$ is orthogonal to $u$.

In the definition and in this proof, I did not assume that $u$ was a unit vector. But if it is, the formula becomes simpler, because $\|u\| = 1$. We then get

$$p = (u \cdot v) \, u.$$

Now let's discuss a projection onto a plane in $\mathbb{R}^3$. Suppose that $u_1$ and $u_2$ are unit vectors in $\mathbb{R}^3$. Suppose also that they are orthogonal, so that $u_1 \cdot u_2 = 0$. Their span, $\operatorname{span}\{u_1, u_2\}$, is a plane. Suppose that $v$ is another vector. We want to find a point $p$ in $\operatorname{span}\{u_1, u_2\}$ such that $v - p$ is orthogonal to both $u_1$ and $u_2$. This will mean that it is orthogonal to every vector in the plane spanned by $u_1$ and $u_2$. Since we are taking $u_1$ and $u_2$ to be unit vectors, the formula is fairly simple. Set

$$p = (v \cdot u_1) \, u_1 + (v \cdot u_2) \, u_2.$$

Note that this is a linear combination of $u_1$ and $u_2$, because the dot products are scalars. We check:

$$u_1 \cdot (v - p) = u_1 \cdot v - u_1 \cdot p.$$

Also,

$$u_1 \cdot p = (v \cdot u_1)(u_1 \cdot u_1) + (v \cdot u_2)(u_1 \cdot u_2),$$

but $u_1 \cdot u_1 = 1$ and $u_1 \cdot u_2 = 0$. This means that $u_1 \cdot p = u_1 \cdot v$, or $u_1 \cdot (v - p) = 0$. Hence $v - p$ is orthogonal to $u_1$, and in the same way we can show that it is orthogonal to $u_2$. The point $p$ is the "orthogonal projection" of $v$ onto $\operatorname{span}\{u_1, u_2\}$.
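Numerically, both constructions are one-liners. Here is a minimal Python sketch (the vectors $u$, $v$, $u_1$, $u_2$ are arbitrary choices of mine) that computes both projections and verifies the orthogonality claims proved above:

import numpy as np

# Projection onto a single vector u (u need not be a unit vector).
u = np.array([2.0, 1.0, 0.0])
v = np.array([1.0, 3.0, 4.0])
p = (np.dot(u, v) / np.dot(u, u)) * u
print(np.isclose(np.dot(u, v - p), 0.0))   # True: v - p is orthogonal to u

# Projection onto the plane span{u1, u2}, with u1, u2 orthonormal.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 1.0]) / np.sqrt(2)
p_plane = np.dot(v, u1) * u1 + np.dot(v, u2) * u2
r = v - p_plane
print(np.isclose(np.dot(r, u1), 0.0), np.isclose(np.dot(r, u2), 0.0))  # True True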

Gram-Schmidt procedure for finding an orthonormal basis. In an earlier section we saw that it is important to find orthonormal bases. How do we find them? Here is the problem:

Given a basis $\{v_1, \dots, v_n\}$ of some subspace $V$ of $\mathbb{R}^m$ (where $m \geq n$), find an orthonormal basis $\{u_1, \dots, u_n\}$.

The process for doing this is very famous and important. It is called the "Gram-Schmidt process." (The text explains that these authors were not the first to invent it, however.) It is based on the orthogonal projection described above.

The process is inductive. Set $u_1 = \frac{v_1}{\|v_1\|}$. Now use $u_1$ and $v_2$ to get a vector $\tilde{u}_2$ which is nonzero and orthogonal to $u_1$, and then set $u_2 = \frac{\tilde{u}_2}{\|\tilde{u}_2\|}$. Once we see how to get $\tilde{u}_2$, the rest will be easy to explain.

The idea is to use projection. As we discussed earlier, we set

$$\tilde{u}_2 = v_2 - \operatorname{proj}_{u_1} v_2 = v_2 - (v_2 \cdot u_1) \, u_1,$$

and then

$$u_2 = \frac{\tilde{u}_2}{\|\tilde{u}_2\|}.$$

It is then clear that $u_2$ is orthogonal to $u_1$ and also has length $1$.

The next step is similar to what we did above in discussing a projection onto the span of two orthonormal vectors. We need to "project" $v_3$ onto $\operatorname{span}\{u_1, u_2\}$ in such a way that $v_3$ minus the projection is orthogonal to $u_1$ and $u_2$. We saw that the formula is

$$\tilde{u}_3 = v_3 - (v_3 \cdot u_1) \, u_1 - (v_3 \cdot u_2) \, u_2.$$

The rest of the procedure continues in this way, as described in the text, and some examples are worked out there; a code sketch of the whole procedure follows below.
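The whole inductive process fits in a few lines of Python. This is a minimal sketch of the procedure just described; the function name `gram_schmidt` is my own, not from the notes.

import numpy as np

def gram_schmidt(vectors):
    """Given a basis v_1, ..., v_n (a list of 1-D arrays), return the
    orthonormal basis u_1, ..., u_n produced by the Gram-Schmidt process."""
    basis = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        u_tilde = v.copy()
        for u in basis:
            # Subtract the projection of v onto each u_j found so far.
            u_tilde -= np.dot(v, u) * u
        # For linearly independent input, u_tilde is nonzero; normalize it.
        basis.append(u_tilde / np.linalg.norm(u_tilde))
    return basis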

Here is an example. Suppose the three vectors $v_1$, $v_2$, $v_3$ form a basis for $\mathbb{R}^3$. Starting from this basis, find an orthonormal basis using the Gram-Schmidt method.

Solution: Let these vectors be $v_1$, $v_2$, $v_3$, in the order given. Then

$$u_1 = \frac{v_1}{\|v_1\|}.$$

Next,

$$\tilde{u}_2 = v_2 - (v_2 \cdot u_1) \, u_1.$$

Normalizing, we find that

$$u_2 = \frac{\tilde{u}_2}{\|\tilde{u}_2\|}.$$

Finally,

$$\tilde{u}_3 = v_3 - (v_3 \cdot u_1) \, u_1 - (v_3 \cdot u_2) \, u_2, \qquad u_3 = \frac{\tilde{u}_3}{\|\tilde{u}_3\|}.$$

Hence the orthonormal basis is $\{u_1, u_2, u_3\}$. You should check that these vectors are orthogonal to each other and of length $1$.
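A computation like this can be checked by running the `gram_schmidt` sketch above on the basis at hand. The three input vectors below are placeholder choices of mine (substitute the basis from the example); the orthonormality check at the end is the point:

import numpy as np

# Placeholder basis of R^3; any basis works here.
v1 = np.array([1.0, 1.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([0.0, 0.0, 1.0])

u1, u2, u3 = gram_schmidt([v1, v2, v3])  # from the sketch above

U = np.column_stack([u1, u2, u3])
print(np.allclose(U.T @ U, np.eye(3)))   # True: mutually orthogonal, length 1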

Homework:

1. (9 pts.) (a) Verify that the given set of vectors is a basis for the following subspace of $\mathbb{R}^4$:

$$V = \left\{ \begin{pmatrix} x \\ y \\ z \\ w \end{pmatrix} \;:\; x + y + z + w = 0 \right\}.$$

(b) Use the Gram-Schmidt process to find an orthonormal basis for $V$. Hint: Make sure that the vectors in your answer are all orthogonal to each other, all of length $1$, and all satisfy $x + y + z + w = 0$.

2. pg. 47, #. (Follow the procedure in the notes.)

3. pg. 48, #4. Find an orthonormal basis for the orthogonal complement of

$$\left\{ \begin{pmatrix} x \\ y \\ z \\ w \end{pmatrix} \;:\; x + z = 0 \text{ and } y - w = 0 \right\}.$$