Math 3191 Applied Linear Algebra

Math 3191 Applied Linear Algebra. Lecture: Orthogonal Projections, Gram-Schmidt. Stephen Billups, University of Colorado at Denver.

Orthonormal Sets. A set of vectors $\{u_1, u_2, \ldots, u_p\}$ in $\mathbb{R}^n$ is called an orthonormal set if (1) it is orthogonal, and (2) each vector has length 1. If the orthonormal set $\{u_1, u_2, \ldots, u_p\}$ spans a vector space $W$, then $\{u_1, u_2, \ldots, u_p\}$ is called an orthonormal basis for $W$.

Orthogonal Matrices. Recall that $v$ is a unit vector if $\|v\| = \sqrt{v \cdot v} = \sqrt{v^T v} = 1$. Suppose $U = [u_1\; u_2\; u_3]$ where $\{u_1, u_2, u_3\}$ is an orthonormal set. Then
$$U^T U = \begin{bmatrix} u_1^T \\ u_2^T \\ u_3^T \end{bmatrix} [u_1\; u_2\; u_3] = \begin{bmatrix} u_1^T u_1 & u_1^T u_2 & u_1^T u_3 \\ u_2^T u_1 & u_2^T u_2 & u_2^T u_3 \\ u_3^T u_1 & u_3^T u_2 & u_3^T u_3 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} = I.$$
It can be shown that $UU^T = I$ also. So $U^{-1} = U^T$ (such a matrix is called an orthogonal matrix). (NOTE: $U$ must be square to be orthogonal.)
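As a quick numerical sketch (not part of the slides), the identity $U^TU = I$ can be checked for a small matrix with orthonormal columns; the rotation-style columns below are chosen purely for illustration:

```python
import math

# Two orthonormal columns in R^2: a rotation by 45 degrees (illustrative choice).
c = math.cos(math.pi / 4)
s = math.sin(math.pi / 4)
U = [[c, -s],
     [s,  c]]  # columns u1 = (c, s), u2 = (-s, c)

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

UtU = matmul(transpose(U), U)
# Entries match the 2x2 identity, up to floating-point rounding.
print([[round(x, 10) for x in row] for row in UtU])  # -> [[1.0, 0.0], [0.0, 1.0]]
```

The diagonal entries are $u_i \cdot u_i = 1$ (unit length) and the off-diagonal entries are $u_1 \cdot u_2 = 0$ (orthogonality), exactly as in the block computation above.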

THEOREM 6. An $m \times n$ matrix $U$ has orthonormal columns if and only if $U^T U = I$.

THEOREM 7. Let $U$ be an $m \times n$ matrix with orthonormal columns, and let $x$ and $y$ be in $\mathbb{R}^n$. Then
a. $\|Ux\| = \|x\|$
b. $(Ux) \cdot (Uy) = x \cdot y$
c. $(Ux) \cdot (Uy) = 0$ if and only if $x \cdot y = 0$.

Proof of part b: $(Ux) \cdot (Uy) = (Ux)^T (Uy) = x^T U^T U y = x^T I y = x^T y = x \cdot y$.

Section 6.2: Orthogonal Sets. Review: $\hat{y} = \dfrac{y \cdot u}{u \cdot u}\,u$ is the orthogonal projection of $y$ onto $u$. Suppose $\{u_1, \ldots, u_p\}$ is an orthogonal basis for $W$ in $\mathbb{R}^n$. For each $y$ in $W$,
$$y = \frac{y \cdot u_1}{u_1 \cdot u_1}\,u_1 + \cdots + \frac{y \cdot u_p}{u_p \cdot u_p}\,u_p.$$

EXAMPLE. Suppose $\{u_1, u_2, u_3\}$ is an orthogonal basis for $\mathbb{R}^3$ and let $W = \mathrm{Span}\{u_1, u_2\}$. Write $y$ in $\mathbb{R}^3$ as the sum of a vector $\hat{y}$ in $W$ and a vector $z$ in $W^\perp$. [Figure: $y$ decomposed into $\hat{y}$ in the plane $W$ spanned by $u_1$ and $u_2$, plus $z$ in $W^\perp$.]

Solution: Write
$$y = \frac{y \cdot u_1}{u_1 \cdot u_1}\,u_1 + \frac{y \cdot u_2}{u_2 \cdot u_2}\,u_2 + \frac{y \cdot u_3}{u_3 \cdot u_3}\,u_3,$$
where
$$\hat{y} = \frac{y \cdot u_1}{u_1 \cdot u_1}\,u_1 + \frac{y \cdot u_2}{u_2 \cdot u_2}\,u_2, \qquad z = \frac{y \cdot u_3}{u_3 \cdot u_3}\,u_3.$$
To show that $z$ is orthogonal to every vector in $W$, show that $z$ is orthogonal to the vectors in $\{u_1, u_2\}$. Since the basis is orthogonal, $u_3 \cdot u_1 = 0$ and $u_3 \cdot u_2 = 0$, so
$$z \cdot u_1 = \frac{y \cdot u_3}{u_3 \cdot u_3}\,(u_3 \cdot u_1) = 0, \qquad z \cdot u_2 = \frac{y \cdot u_3}{u_3 \cdot u_3}\,(u_3 \cdot u_2) = 0.$$

THEOREM 8 (The Orthogonal Decomposition Theorem). Let $W$ be a subspace of $\mathbb{R}^n$. Then each $y$ in $\mathbb{R}^n$ can be uniquely represented in the form $y = \hat{y} + z$, where $\hat{y}$ is in $W$ and $z$ is in $W^\perp$. In fact, if $\{u_1, \ldots, u_p\}$ is any orthogonal basis of $W$, then
$$\hat{y} = \frac{y \cdot u_1}{u_1 \cdot u_1}\,u_1 + \cdots + \frac{y \cdot u_p}{u_p \cdot u_p}\,u_p$$
and $z = y - \hat{y}$. The vector $\hat{y}$ is called the orthogonal projection of $y$ onto $W$.
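The decomposition $y = \hat{y} + z$ of Theorem 8 can be sketched in code. The helper names and the specific vectors below are invented for illustration; the formula is exactly the one in the theorem:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def scale(c, v):
    return [c * x for x in v]

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def proj_onto_basis(y, basis):
    """Orthogonal projection of y onto Span(basis); basis must be orthogonal."""
    yhat = [0.0] * len(y)
    for u in basis:
        coef = dot(y, u) / dot(u, u)   # the weight (y.u)/(u.u) from Theorem 8
        yhat = [a + b for a, b in zip(yhat, scale(coef, u))]
    return yhat

# Hypothetical orthogonal basis of a plane W in R^3, and a vector y to decompose.
u1 = [1.0, 1.0, 0.0]
u2 = [1.0, -1.0, 0.0]
y  = [3.0, 1.0, 5.0]

yhat = proj_onto_basis(y, [u1, u2])    # component in W
z = sub(y, yhat)                       # component in W-perp
print(yhat, z)                         # -> [3.0, 1.0, 0.0] [0.0, 0.0, 5.0]
print(dot(z, u1), dot(z, u2))          # -> 0.0 0.0
```

The last line confirms the defining property of the decomposition: $z$ is orthogonal to every basis vector of $W$.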

[Figure: $y$ with $\hat{y} = \mathrm{proj}_W y$ in $W$ and $z = y - \hat{y}$ orthogonal to $W$.]

EXAMPLE. Let $u_1$, $u_2$, and $y$ be the given vectors. Observe that $\{u_1, u_2\}$ is an orthogonal basis for $W = \mathrm{Span}\{u_1, u_2\}$. Write $y$ as the sum of a vector in $W$ and a vector orthogonal to $W$. Solution:
$$\mathrm{proj}_W y = \hat{y} = \frac{y \cdot u_1}{u_1 \cdot u_1}\,u_1 + \frac{y \cdot u_2}{u_2 \cdot u_2}\,u_2, \qquad z = y - \hat{y}.$$

Geometric Interpretation of Orthogonal Projections. [Figure: $y$ expressed as the sum of its projections $\frac{y \cdot u_1}{u_1 \cdot u_1}\,u_1$ and $\frac{y \cdot u_2}{u_2 \cdot u_2}\,u_2$ onto the orthogonal directions $u_1$ and $u_2$.]

THEOREM 9 (The Best Approximation Theorem). Let $W$ be a subspace of $\mathbb{R}^n$, $y$ any vector in $\mathbb{R}^n$, and $\hat{y}$ the orthogonal projection of $y$ onto $W$. Then $\hat{y}$ is the point in $W$ closest to $y$, in the sense that
$$\|y - \hat{y}\| < \|y - v\|$$
for all $v$ in $W$ distinct from $\hat{y}$. [Figure: $y$, $\hat{y} = \mathrm{proj}_W y$, and $z = y - \hat{y}$.]

Outline of Proof. Let $v$ be in $W$ distinct from $\hat{y}$. Then $v - \hat{y}$ is also in $W$ (why?). The vector $z = y - \hat{y}$ is orthogonal to $W$, so in particular $y - \hat{y}$ is orthogonal to $\hat{y} - v$. Writing
$$y - v = (y - \hat{y}) + (\hat{y} - v),$$
the Pythagorean Theorem gives
$$\|y - v\|^2 = \|y - \hat{y}\|^2 + \|\hat{y} - v\|^2.$$
Since $\|\hat{y} - v\|^2 > 0$, we get $\|y - v\|^2 > \|y - \hat{y}\|^2$, and hence $\|y - \hat{y}\| < \|y - v\|$.

EXAMPLE. Find the closest point to $y$ in $\mathrm{Span}\{u_1, u_2\}$ for the given vectors $y$, $u_1$, and $u_2$. Solution:
$$\hat{y} = \frac{y \cdot u_1}{u_1 \cdot u_1}\,u_1 + \frac{y \cdot u_2}{u_2 \cdot u_2}\,u_2.$$

Another View of Matrix Multiplication. Part of Theorem 10 below is based on another way to view matrix multiplication, where $A$ is $m \times p$ and $B$ is $p \times n$:
$$AB = \begin{bmatrix} \mathrm{col}_1 A & \mathrm{col}_2 A & \cdots & \mathrm{col}_p A \end{bmatrix} \begin{bmatrix} \mathrm{row}_1 B \\ \mathrm{row}_2 B \\ \vdots \\ \mathrm{row}_p B \end{bmatrix} = (\mathrm{col}_1 A)(\mathrm{row}_1 B) + \cdots + (\mathrm{col}_p A)(\mathrm{row}_p B).$$

For example,
$$\begin{bmatrix} a & b \\ c & d \end{bmatrix} \begin{bmatrix} e & f \\ g & h \end{bmatrix} = \begin{bmatrix} a \\ c \end{bmatrix} \begin{bmatrix} e & f \end{bmatrix} + \begin{bmatrix} b \\ d \end{bmatrix} \begin{bmatrix} g & h \end{bmatrix} = \begin{bmatrix} ae + bg & af + bh \\ ce + dg & cf + dh \end{bmatrix}.$$
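The column-row expansion can be checked against the usual row-by-column product; the small matrices here are invented for the demonstration, not taken from the slides:

```python
def matmul(A, B):
    """Usual row-by-column matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def outer(col, row):
    """Outer product (col_k A)(row_k B): a column vector times a row vector."""
    return [[c * r for r in row] for c in col]

def madd(M, N):
    return [[a + b for a, b in zip(r1, r2)] for r1, r2 in zip(M, N)]

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]

# Sum of (col_k A)(row_k B) over k = 1..p.
S = [[0, 0], [0, 0]]
for k in range(2):
    col_k = [A[i][k] for i in range(2)]
    S = madd(S, outer(col_k, B[k]))

print(S)             # -> [[19, 22], [43, 50]]
print(matmul(A, B))  # -> [[19, 22], [43, 50]]  (same result)
```

Both computations agree, illustrating that $AB$ is the sum of $p$ rank-one column-row products.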

So if $U = [u_1\; u_2\; \cdots\; u_p]$, then $U^T = \begin{bmatrix} u_1^T \\ u_2^T \\ \vdots \\ u_p^T \end{bmatrix}$, so
$$UU^T = u_1 u_1^T + u_2 u_2^T + \cdots + u_p u_p^T.$$
Therefore
$$(UU^T)\,y = (u_1 u_1^T + \cdots + u_p u_p^T)\,y = u_1(u_1^T y) + \cdots + u_p(u_p^T y) = (y \cdot u_1)\,u_1 + (y \cdot u_2)\,u_2 + \cdots + (y \cdot u_p)\,u_p.$$

THEOREM 10. If $\{u_1, \ldots, u_p\}$ is an orthonormal basis for a subspace $W$ of $\mathbb{R}^n$, then
$$\mathrm{proj}_W y = (y \cdot u_1)\,u_1 + \cdots + (y \cdot u_p)\,u_p.$$
If $U = [u_1\; u_2\; \cdots\; u_p]$, then $\mathrm{proj}_W y = UU^T y$ for all $y$ in $\mathbb{R}^n$. Outline of Proof:
$$\mathrm{proj}_W y = \frac{y \cdot u_1}{u_1 \cdot u_1}\,u_1 + \cdots + \frac{y \cdot u_p}{u_p \cdot u_p}\,u_p = (y \cdot u_1)\,u_1 + \cdots + (y \cdot u_p)\,u_p = UU^T y,$$
since each $u_i \cdot u_i = 1$.
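A short sketch of this projection formula for an orthonormal basis: because each $u_i$ has unit length, no division is needed. The basis vectors and function name below are invented for the example:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def proj_w(y, U_cols):
    """proj_W y = (y.u1)u1 + ... + (y.up)up, i.e. U U^T y, for orthonormal columns."""
    p = [0.0] * len(y)
    for u in U_cols:
        w = dot(y, u)                       # no division: each u has unit length
        p = [a + w * b for a, b in zip(p, u)]
    return p

# Hypothetical orthonormal basis of a plane W in R^3.
r = 1 / math.sqrt(2)
u1 = [r, r, 0.0]
u2 = [0.0, 0.0, 1.0]
y = [1.0, 2.0, 3.0]

print([round(t, 10) for t in proj_w(y, [u1, u2])])  # -> [1.5, 1.5, 3.0]
```

The result is the closest point to $y$ in $W$, consistent with Theorem 9.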

Section 6.4: The Gram-Schmidt Process. Goal: form an orthogonal basis for a subspace $W$. EXAMPLE: Suppose $W = \mathrm{Span}\{x_1, x_2\}$ for the given vectors $x_1$ and $x_2$. Find an orthogonal basis $\{v_1, v_2\}$ for $W$.

Let $v_1 = x_1$. Then
$$\hat{y} = \mathrm{proj}_{v_1} x_2 = \frac{x_2 \cdot v_1}{v_1 \cdot v_1}\,v_1$$
and
$$v_2 = x_2 - \hat{y} = x_2 - \frac{x_2 \cdot v_1}{v_1 \cdot v_1}\,v_1$$
(the component of $x_2$ orthogonal to $x_1$).

EXAMPLE. Suppose $\{x_1, x_2, x_3\}$ is a basis for a subspace $W$ of $\mathbb{R}^n$. Describe an orthogonal basis for $W$. Solution: Let
$$v_1 = x_1 \quad \text{and} \quad v_2 = x_2 - \frac{x_2 \cdot v_1}{v_1 \cdot v_1}\,v_1.$$
Then $\{v_1, v_2\}$ is an orthogonal basis for $\mathrm{Span}\{x_1, x_2\}$. Let
$$v_3 = x_3 - \frac{x_3 \cdot v_1}{v_1 \cdot v_1}\,v_1 - \frac{x_3 \cdot v_2}{v_2 \cdot v_2}\,v_2$$
(the component of $x_3$ orthogonal to $\mathrm{Span}\{x_1, x_2\}$). Note that $v_3$ is in $W$. Why? Then $\{v_1, v_2, v_3\}$ is an orthogonal basis for $W$.

Theorem 11 (The Gram-Schmidt Process). Given a basis $\{x_1, \ldots, x_p\}$ for a subspace $W$ of $\mathbb{R}^n$, define
$$v_1 = x_1$$
$$v_2 = x_2 - \frac{x_2 \cdot v_1}{v_1 \cdot v_1}\,v_1$$
$$v_3 = x_3 - \frac{x_3 \cdot v_1}{v_1 \cdot v_1}\,v_1 - \frac{x_3 \cdot v_2}{v_2 \cdot v_2}\,v_2$$
$$\vdots$$
$$v_p = x_p - \frac{x_p \cdot v_1}{v_1 \cdot v_1}\,v_1 - \frac{x_p \cdot v_2}{v_2 \cdot v_2}\,v_2 - \cdots - \frac{x_p \cdot v_{p-1}}{v_{p-1} \cdot v_{p-1}}\,v_{p-1}.$$
Then $\{v_1, \ldots, v_p\}$ is an orthogonal basis for $W$, and
$$\mathrm{Span}\{x_1, \ldots, x_k\} = \mathrm{Span}\{v_1, \ldots, v_k\} \quad \text{for } 1 \le k \le p.$$
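The recursion above translates directly into code. This is a minimal classical Gram-Schmidt sketch in plain Python; the example vectors are invented for illustration:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def gram_schmidt(xs):
    """Return an orthogonal basis for Span(xs), following the Theorem 11 recursion.

    xs: list of linearly independent vectors (lists of floats).
    """
    vs = []
    for x in xs:
        v = list(x)
        for u in vs:
            c = dot(x, u) / dot(u, u)          # subtract projection onto each earlier v
            v = [a - c * b for a, b in zip(v, u)]
        vs.append(v)
    return vs

# Hypothetical basis of a 2-dimensional subspace of R^3.
xs = [[1.0, 1.0, 0.0],
      [1.0, 0.0, 1.0]]
v1, v2 = gram_schmidt(xs)
print(v1, v2)        # -> [1.0, 1.0, 0.0] [0.5, -0.5, 1.0]
print(dot(v1, v2))   # -> 0.0 (the output basis is orthogonal)
```

Note the optional rescaling step from the worked example below is omitted here; any nonzero scalar multiple of a $v_k$ preserves orthogonality.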

EXAMPLE. Suppose $\{x_1, x_2, x_3\}$, with $x_1$, $x_2$, $x_3$ as given, is a basis for a subspace $W$ of $\mathbb{R}^n$. Describe an orthogonal basis for $W$. Solution: $v_1 = x_1$, and

(cont.) $v_2 = x_2 - \dfrac{x_2 \cdot v_1}{v_1 \cdot v_1}\,v_1$. Optionally replace $v_2$ with a scalar multiple that clears the fractions, to make $v_2$ easier to work with in the next step.

(cont.) $v_3 = x_3 - \dfrac{x_3 \cdot v_1}{v_1 \cdot v_1}\,v_1 - \dfrac{x_3 \cdot v_2}{v_2 \cdot v_2}\,v_2$ (using the rescaled $v_2$).

(cont.) Rescale $v_3$ (optional). Orthogonal basis for $W$: $\{v_1, v_2, v_3\}$.