Math 261 Lecture Notes: Sections 6.1, 6.2, 6.3 and 6.4 Orthogonal Sets and Projections


We will not cover general inner product spaces. We will, however, focus on a particular inner product space: the inner product space Rⁿ with the dot product.

Notation. When discussing vectors in an inner product space, and not strictly dot products, a slightly different notation is used: a general inner product is denoted ⟨u, v⟩. The inner product we primarily use will be the dot product, so for us u · v = ⟨u, v⟩. Like the operations in a vector space, this operation can be defined differently; most of the time, however, we won't.

Definition. Two vectors are orthogonal if their dot product is zero. That is, vectors u and v are orthogonal if u · v = 0. Note that for two-dimensional and three-dimensional vectors this means the angle between the two vectors is 90°.

A set of vectors {v_1, v_2, ..., v_n} is orthogonal if the vectors are mutually orthogonal; that is, v_i · v_j = 0 for every i ≠ j.

A vector is normal if it has unit length; that is, ‖v_i‖ = 1 for i = 1, ..., n.

A set of vectors {v_1, v_2, ..., v_n} is orthonormal if the vectors are mutually orthogonal and each vector is normal.

Example 1. Show that the basis B for R² below is an orthogonal set. Construct an orthonormal basis from this set.

B = { [ … ], [ … ] }

[Figure: the two basis vectors plotted in the xy-plane]
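The definitions above are easy to check numerically. A minimal NumPy sketch, using vectors chosen for illustration (they are not the vectors from Example 1):

```python
import numpy as np

# Illustrative vectors (not from the notes): check mutual orthogonality.
v1 = np.array([3.0, 1.0])
v2 = np.array([-1.0, 3.0])

# Orthogonal: the dot product is zero.
assert np.isclose(v1 @ v2, 0.0)

# Normalize each vector to build an orthonormal set.
u1 = v1 / np.linalg.norm(v1)
u2 = v2 / np.linalg.norm(v2)

# Orthonormal: mutually orthogonal and each vector has length 1.
assert np.isclose(np.linalg.norm(u1), 1.0)
assert np.isclose(np.linalg.norm(u2), 1.0)
assert np.isclose(u1 @ u2, 0.0)
```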

Example 2. Show that the set B below is orthogonal. If this set is a basis for R³, construct an orthonormal basis from it.

B = { [ … ], [ … ], [ … ] }

Instructor: A.E.Cary

Recall from Calculus III that the scalar projection of u onto v is given by

    comp_v u = (u · v)/‖v‖

and the vector projection of u onto v (or the orthogonal projection of u onto v) is given by

    proj_v u = ((u · v)/(v · v)) v.

Equivalently, comp_v u = u · (v/‖v‖) and proj_v u = (comp_v u)(v/‖v‖).

[Figure: u, v, and comp_v u in the xy-plane]

[Figure: u, v, and proj_v u in the xy-plane]

Example 3. Derive the above formulas for comp_v u and proj_v u.
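The two projection formulas are related: the scalar projection is the signed length of the vector projection. A quick NumPy sketch with illustrative vectors (not from the notes):

```python
import numpy as np

# Illustrative vectors (not from the notes).
u = np.array([4.0, 2.0])
v = np.array([3.0, 0.0])

comp = (u @ v) / np.linalg.norm(v)   # scalar projection of u onto v
proj = ((u @ v) / (v @ v)) * v       # vector projection of u onto v

# comp is the signed length of proj.
assert np.isclose(abs(comp), np.linalg.norm(proj))
# proj is a scalar multiple of v, so it points along v.
assert np.isclose(abs(proj @ v), np.linalg.norm(proj) * np.linalg.norm(v))
```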

Example 4. What can we say about proj_v u and u − proj_v u? The two are orthogonal! Verify this.

Example 5. Use the standard inner product on R² (the dot product) to find proj_v u and u − proj_v u for u = [ … ] and v = [ … ]. Confirm that they are orthogonal.

[Figure: u, v, proj_v u, and u − proj_v u in the xy-plane]
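The orthogonality of proj_v u and u − proj_v u can be verified numerically for any choice of vectors; a short sketch with arbitrary illustrative vectors (not the ones in Example 5):

```python
import numpy as np

# Arbitrary illustrative vectors (not from Example 5).
u = np.array([1.0, 2.0, 3.0])
v = np.array([1.0, 1.0, 0.0])

proj = ((u @ v) / (v @ v)) * v
residual = u - proj

# The residual u - proj_v u is orthogonal to v, and hence to proj_v u.
assert np.isclose(residual @ v, 0.0)
assert np.isclose(residual @ proj, 0.0)
```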

Orthogonal Decomposition and the Gram-Schmidt Process

Definition. Let W be a subspace of Rⁿ. The set of all vectors z that are orthogonal to every vector in W is called the orthogonal complement of W and is denoted by W⊥.

Theorem. A vector x is in W⊥ if and only if x is orthogonal to every vector in a set that spans W. Furthermore, W⊥ is itself a subspace of Rⁿ.

Theorem (The Orthogonal Decomposition Theorem). Let W be a subspace of Rⁿ. Then each y in Rⁿ can be written uniquely in the form y = ŷ + z, where ŷ is in W and z is in W⊥. Furthermore, if {u_1, ..., u_p} is an orthogonal basis for W, then

    ŷ = ((y · u_1)/(u_1 · u_1)) u_1 + ⋯ + ((y · u_p)/(u_p · u_p)) u_p

and z = y − ŷ.

Definition. Let W be a subspace of Rⁿ and let {u_1, ..., u_p} be an orthogonal basis for W. The orthogonal projection of y onto W, denoted by ŷ or proj_W y, is

    ŷ = ((y · u_1)/(u_1 · u_1)) u_1 + ⋯ + ((y · u_p)/(u_p · u_p)) u_p.
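The decomposition theorem can be checked numerically: compute ŷ as the sum of the projections onto an orthogonal basis, set z = y − ŷ, and verify that z is orthogonal to the basis. A sketch with an illustrative orthogonal basis for a plane W in R³ (these vectors are not from the notes):

```python
import numpy as np

# Illustrative orthogonal basis for a plane W in R^3 (not from the notes).
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
assert np.isclose(u1 @ u2, 0.0)   # the basis must be orthogonal

y = np.array([2.0, 5.0, 7.0])

# y_hat = sum of the projections of y onto each orthogonal basis vector.
y_hat = ((y @ u1) / (u1 @ u1)) * u1 + ((y @ u2) / (u2 @ u2)) * u2
z = y - y_hat

# y = y_hat + z, with y_hat in W and z in W-perp.
assert np.allclose(y_hat + z, y)
assert np.isclose(z @ u1, 0.0) and np.isclose(z @ u2, 0.0)
```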

Example 6. Verify that {u_1, u_2} is an orthogonal set and then find the orthogonal projection of y onto Span{u_1, u_2}.

u_1 = [ … ],  u_2 = [ … ],  y = [ … ]

Theorem (The Best Approximation Theorem). Let W be a subspace of Rⁿ, let y be any vector in Rⁿ, and let ŷ be the orthogonal projection of y onto W. Then ŷ is the closest point in W to y, in the sense that

    ‖y − ŷ‖ < ‖y − v‖

for all v in W distinct from ŷ.

Example 7. State the closest point in W to the vector y from the previous example.
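The Best Approximation Theorem can be probed numerically by sampling random points of W and comparing their distances to y against ‖y − ŷ‖. A sketch with an illustrative subspace (not the W of Example 6):

```python
import numpy as np

# Illustrative orthogonal basis for W = the xy-plane in R^3 (not from Example 6).
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
y = np.array([1.0, 2.0, 5.0])

# Orthogonal projection of y onto W.
y_hat = ((y @ u1) / (u1 @ u1)) * u1 + ((y @ u2) / (u2 @ u2)) * u2

# Random points v in W are all strictly farther from y than y_hat is.
rng = np.random.default_rng(0)
for _ in range(100):
    a, b = rng.normal(size=2)
    v = a * u1 + b * u2
    if not np.allclose(v, y_hat):
        assert np.linalg.norm(y - y_hat) < np.linalg.norm(y - v)
```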

Theorem. Every finite-dimensional inner product space has an orthogonal basis.

Gram-Schmidt Process

This process is an algorithm that converts any basis of a finite-dimensional inner product space into an orthogonal basis, from which an orthonormal basis is then obtained.

Gram-Schmidt Process.

(1) Let B = {v_1, v_2, ..., v_n} be any basis for the inner product space V.

(2) Use B to define a set of n vectors {w_1, w_2, ..., w_n} as follows:

    w_1 = v_1
    w_2 = v_2 − proj_{w_1} v_2
    w_3 = v_3 − proj_{w_1} v_3 − proj_{w_2} v_3
    w_4 = v_4 − proj_{w_1} v_4 − proj_{w_2} v_4 − proj_{w_3} v_4
    ⋮
    w_n = v_n − proj_{w_1} v_n − proj_{w_2} v_n − ⋯ − proj_{w_{n−1}} v_n

(3) The set B′ = {w_1, w_2, ..., w_n} is an orthogonal basis for V.

(4) To obtain an orthonormal basis, divide each vector in B′ by its length. The basis B″ below is an orthonormal basis for V:

    B″ = { w_1/‖w_1‖, w_2/‖w_2‖, ..., w_n/‖w_n‖ }

Example 8. Use the standard inner product on R³, the basis B = { [ … ], [ … ], [ … ] }, and the Gram-Schmidt process to find an orthonormal basis for R³.
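The steps above translate directly into code. A minimal NumPy sketch of the process, run on an illustrative basis for R³ (not the basis of Example 8):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormal basis for span(vectors) via the Gram-Schmidt process."""
    orthogonal = []
    for v in vectors:
        w = v.astype(float).copy()
        for w_prev in orthogonal:
            # Subtract proj_{w_prev} v = ((v . w_prev)/(w_prev . w_prev)) w_prev
            w -= ((v @ w_prev) / (w_prev @ w_prev)) * w_prev
        orthogonal.append(w)
    # Step (4): divide each vector by its length.
    return [w / np.linalg.norm(w) for w in orthogonal]

# Illustrative basis for R^3 (not the one in Example 8).
basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, 0.0, 1.0]),
         np.array([0.0, 1.0, 1.0])]
Q = np.column_stack(gram_schmidt(basis))

# The columns are orthonormal: Q^T Q = I.
assert np.allclose(Q.T @ Q, np.eye(3))
```

This is the classical form of the algorithm, projecting each original v_k onto the previously computed w_j exactly as in step (2); in floating-point practice the "modified" variant, which projects the running remainder instead, is numerically more stable.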


Example 9. Find an orthogonal basis for the column space of the matrix A below. Then use it to find an orthonormal basis for the column space of A.

A = [ … 6 6 8 6 … ]
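Applying Gram-Schmidt to the columns of a matrix gives an orthonormal basis for its column space (this is how the Q of a QR factorization arises). A sketch with an illustrative matrix, since the entries of A above did not survive transcription:

```python
import numpy as np

# Illustrative matrix (not the A of Example 9).
A = np.array([[1.0, 2.0],
              [1.0, 0.0],
              [0.0, 2.0]])

# Gram-Schmidt on the columns of A.
w1 = A[:, 0]
w2 = A[:, 1] - ((A[:, 1] @ w1) / (w1 @ w1)) * w1   # subtract proj_{w1} of column 2
q1 = w1 / np.linalg.norm(w1)
q2 = w2 / np.linalg.norm(w2)

Q = np.column_stack([q1, q2])
assert np.allclose(Q.T @ Q, np.eye(2))   # orthonormal columns
# {q1, q2} spans Col A: each column of A is a combination of q1 and q2.
assert np.allclose(Q @ (Q.T @ A), A)
```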