Worksheet for Lecture 25 Section 6.4 Gram-Schmidt Process


Worksheet for Lecture 25. Name: ____________
Section 6.4: Gram-Schmidt Process

Goal: For a subspace $W = \mathrm{Span}\{v_1, \dots, v_n\}$, we want to find an orthonormal basis of $W$.

Example. Let $W = \mathrm{Span}\{x_1, x_2\}$, where $x_1, x_2$ are linearly independent. Give an orthogonal basis of $W$.

Solution.
Step I. Take $v_1 = x_1$.
Step II. Take $v_2 = x_2 - \mathrm{proj}_{v_1} x_2$. More precisely, take
$$v_2 = x_2 - \frac{x_2 \cdot v_1}{v_1 \cdot v_1}\, v_1.$$
Then $\{v_1, v_2\}$ is an orthogonal basis of $W$.

Theorem (The Gram-Schmidt Process). Given a basis $\{x_1, \dots, x_p\}$ for a nonzero subspace $W$ of $\mathbb{R}^n$, define
$$v_1 = x_1,$$
$$v_2 = x_2 - \frac{x_2 \cdot v_1}{v_1 \cdot v_1}\, v_1,$$
$$v_3 = x_3 - \frac{x_3 \cdot v_1}{v_1 \cdot v_1}\, v_1 - \frac{x_3 \cdot v_2}{v_2 \cdot v_2}\, v_2,$$
$$\vdots$$
$$v_p = x_p - \frac{x_p \cdot v_1}{v_1 \cdot v_1}\, v_1 - \cdots - \frac{x_p \cdot v_{p-1}}{v_{p-1} \cdot v_{p-1}}\, v_{p-1}.$$
Then $\{v_1, \dots, v_p\}$ is an orthogonal basis for $W$.

Example. Let
$$x_1 = \begin{pmatrix} 1\\1\\1\\1 \end{pmatrix}, \quad x_2 = \begin{pmatrix} 0\\1\\1\\1 \end{pmatrix}, \quad x_3 = \begin{pmatrix} 0\\0\\1\\1 \end{pmatrix}.$$
Give an orthogonal basis of $W = \mathrm{Span}\{x_1, x_2, x_3\}$, and then give an orthonormal basis of $W$.

Solution.
Step I. Set $v_1 = x_1 = (1, 1, 1, 1)$.

Step II. Set
$$v_2 = x_2 - \frac{x_2 \cdot v_1}{v_1 \cdot v_1}\, v_1 = \begin{pmatrix} 0\\1\\1\\1 \end{pmatrix} - \frac{0+1+1+1}{1+1+1+1} \begin{pmatrix} 1\\1\\1\\1 \end{pmatrix} = \begin{pmatrix} -3/4\\ 1/4\\ 1/4\\ 1/4 \end{pmatrix}.$$
To make our computation later easier, we can use $v_2' = 4v_2 = (-3, 1, 1, 1)$.

Step III. Set
$$v_3 = x_3 - \frac{x_3 \cdot v_1}{v_1 \cdot v_1}\, v_1 - \frac{x_3 \cdot v_2'}{v_2' \cdot v_2'}\, v_2' = \begin{pmatrix} 0\\0\\1\\1 \end{pmatrix} - \frac{0+0+1+1}{1+1+1+1} \begin{pmatrix} 1\\1\\1\\1 \end{pmatrix} - \frac{0+0+1+1}{(-3)^2+1+1+1} \begin{pmatrix} -3\\1\\1\\1 \end{pmatrix} = \begin{pmatrix} 0\\ -2/3\\ 1/3\\ 1/3 \end{pmatrix}.$$
It might be better to use $v_3' = 3v_3 = (0, -2, 1, 1)$.

Therefore, an orthogonal basis of $W$ is
$$\left\{ \begin{pmatrix} 1\\1\\1\\1 \end{pmatrix}, \begin{pmatrix} -3\\1\\1\\1 \end{pmatrix}, \begin{pmatrix} 0\\-2\\1\\1 \end{pmatrix} \right\}.$$
(Remark: it is not necessary to use $v_1, v_2', v_3'$; the basis $\{v_1, v_2, v_3\}$ with $v_2 = (-3/4, 1/4, 1/4, 1/4)$ and $v_3 = (0, -2/3, 1/3, 1/3)$ is also a correct answer.)
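The computation above can be checked by machine. Below is a minimal sketch of the Gram-Schmidt process in Python with NumPy; it is not part of the worksheet, and the helper name `gram_schmidt` is my own. It implements exactly the theorem's formulas, so it reproduces the unscaled vectors $v_1, v_2, v_3$.

```python
import numpy as np

def gram_schmidt(X):
    """Apply the Gram-Schmidt process to the columns of X.

    Returns a matrix whose columns are the orthogonal vectors
    v_k = x_k - sum_j ((x_k . v_j)/(v_j . v_j)) v_j from the theorem.
    """
    V = []
    for x in X.T:                          # iterate over the columns x_1, ..., x_p
        v = x.astype(float)
        for u in V:                        # subtract the projection onto each earlier v_j
            v = v - (x @ u) / (u @ u) * u
        V.append(v)
    return np.column_stack(V)

X = np.array([[1, 0, 0],
              [1, 1, 0],
              [1, 1, 1],
              [1, 1, 1]])                  # columns are x_1, x_2, x_3 from the example
V = gram_schmidt(X)
print(V)                                   # columns: (1,1,1,1), (-3/4,1/4,1/4,1/4), (0,-2/3,1/3,1/3)
print(np.round(V.T @ V, 10))               # diagonal matrix, so the columns are mutually orthogonal
```

Rescaling the second and third columns by 4 and 3 gives the basis $\{(1,1,1,1), (-3,1,1,1), (0,-2,1,1)\}$ found above.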

Step IV. Normalize them:
$$u_1 = \frac{v_1}{\|v_1\|} = \frac{1}{\sqrt{1+1+1+1}} \begin{pmatrix} 1\\1\\1\\1 \end{pmatrix} = \begin{pmatrix} 1/2\\1/2\\1/2\\1/2 \end{pmatrix},$$
$$u_2 = \frac{v_2'}{\|v_2'\|} = \frac{1}{\sqrt{(-3)^2+1+1+1}} \begin{pmatrix} -3\\1\\1\\1 \end{pmatrix} = \begin{pmatrix} -3/\sqrt{12}\\ 1/\sqrt{12}\\ 1/\sqrt{12}\\ 1/\sqrt{12} \end{pmatrix},$$
$$u_3 = \frac{v_3'}{\|v_3'\|} = \frac{1}{\sqrt{0+(-2)^2+1+1}} \begin{pmatrix} 0\\-2\\1\\1 \end{pmatrix} = \begin{pmatrix} 0\\ -2/\sqrt{6}\\ 1/\sqrt{6}\\ 1/\sqrt{6} \end{pmatrix}.$$
Then $\{u_1, u_2, u_3\}$ is an orthonormal basis of $W$.

QR Factorization. If $A$ is an $m \times n$ matrix with linearly independent columns, then $A$ can be factored as $A = QR$, where $Q$ is an $m \times n$ matrix whose columns form an orthonormal basis for $\mathrm{Col}(A)$, and $R$ is an $n \times n$ upper triangular invertible matrix with positive entries on its diagonal.

Example. We use the example above, with
$$A = \begin{pmatrix} 1&0&0\\ 1&1&0\\ 1&1&1\\ 1&1&1 \end{pmatrix}.$$

Solution. We use the computation above, with
$$u_1 = \begin{pmatrix} 1/2\\1/2\\1/2\\1/2 \end{pmatrix}, \quad u_2 = \begin{pmatrix} -3/\sqrt{12}\\ 1/\sqrt{12}\\ 1/\sqrt{12}\\ 1/\sqrt{12} \end{pmatrix}, \quad u_3 = \begin{pmatrix} 0\\ -2/\sqrt{6}\\ 1/\sqrt{6}\\ 1/\sqrt{6} \end{pmatrix}.$$
Then the needed matrix $Q$ is
$$Q = \begin{pmatrix} 1/2 & -3/\sqrt{12} & 0\\ 1/2 & 1/\sqrt{12} & -2/\sqrt{6}\\ 1/2 & 1/\sqrt{12} & 1/\sqrt{6}\\ 1/2 & 1/\sqrt{12} & 1/\sqrt{6} \end{pmatrix}.$$
Since $Q^T Q = I$, we must have
$$Q^T A = Q^T Q R = R, \qquad \text{that is,} \quad R = Q^T A.$$

So we have
$$R = Q^T A = \begin{pmatrix} 1/2 & 1/2 & 1/2 & 1/2\\ -3/\sqrt{12} & 1/\sqrt{12} & 1/\sqrt{12} & 1/\sqrt{12}\\ 0 & -2/\sqrt{6} & 1/\sqrt{6} & 1/\sqrt{6} \end{pmatrix} \begin{pmatrix} 1&0&0\\ 1&1&0\\ 1&1&1\\ 1&1&1 \end{pmatrix} = \begin{pmatrix} 2 & 3/2 & 1\\ 0 & 3/\sqrt{12} & 2/\sqrt{12}\\ 0 & 0 & 2/\sqrt{6} \end{pmatrix}.$$

Application of QR decomposition. In practical problems, we sometimes want to solve a linear system $Ax = b$ that is inconsistent. For example, we may expect a certain quantity $y$ to be of the form $a \sin x + b \cos x$ for some numbers $a$ and $b$, so we conduct experiments to get many data points $(x_i, y_i)$. Then we hope to solve the equations
$$y_1 = a \sin x_1 + b \cos x_1,\quad y_2 = a \sin x_2 + b \cos x_2,\quad \dots,\quad y_n = a \sin x_n + b \cos x_n.$$
To solve for $a$ and $b$, we need to solve a linear system
$$\begin{pmatrix} \sin x_1 & \cos x_1\\ \sin x_2 & \cos x_2\\ \vdots & \vdots\\ \sin x_n & \cos x_n \end{pmatrix} \begin{pmatrix} a\\ b \end{pmatrix} = \begin{pmatrix} y_1\\ y_2\\ \vdots\\ y_n \end{pmatrix}.$$
But we all know that experiments come with errors, and in practice there is almost no way for this system to be consistent once $n$ is large.

Least-squares solution. We hope to get a pair $(a, b)$ that is closest to being a solution. Here is the theoretical meaning of this: the system is consistent exactly when the column vector $y = (y_1, \dots, y_n)$ lies in the span of $(\sin x_1, \dots, \sin x_n)$ and $(\cos x_1, \dots, \cos x_n)$, a.k.a. $\mathrm{Col}(A)$. In general this is not the case, so we choose the projection of $y$ onto $\mathrm{Col}(A)$. The projection, which can be written as
$$a \begin{pmatrix} \sin x_1\\ \vdots\\ \sin x_n \end{pmatrix} + b \begin{pmatrix} \cos x_1\\ \vdots\\ \cos x_n \end{pmatrix},$$
is the vector in $\mathrm{Col}(A)$ that is closest to the experimental result $y$; the pair $(a, b)$ that produces it is called the least-squares solution. In terms of a formula, if the matrix $A$ has a QR factorization $A = QR$, the least-squares solution is
$$\hat{x} = R^{-1} Q^T b.$$
(Heuristically, we are doing $x = A^{-1}b = (QR)^{-1}b = R^{-1}Q^{-1}b$ and then replacing $Q^{-1}$ by $Q^T$. Strictly speaking this does not make sense, as $Q$ is not a square matrix: even though $Q^T Q = I$, that does not make $Q^{-1}$ exist.)
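To make the recipe $\hat{x} = R^{-1} Q^T b$ concrete, here is a minimal sketch in Python with NumPy (my own illustration, not part of the worksheet); the sample size, the true values $a = 2$, $b = -1$, and the noise level are made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 50)          # measurement points x_1, ..., x_n
y = (2.0 * np.sin(x) - np.cos(x)
     + 0.1 * rng.standard_normal(x.size))      # noisy data y_i ~ a sin x_i + b cos x_i

A = np.column_stack([np.sin(x), np.cos(x)])    # the n x 2 coefficient matrix above
Q, R = np.linalg.qr(A)                         # thin QR: Q is n x 2, R is 2 x 2
x_hat = np.linalg.solve(R, Q.T @ y)            # least-squares solution R^{-1} Q^T y
print(x_hat)                                   # approximately (2, -1)
```

Solving $R\hat{x} = Q^T y$ rather than forming $R^{-1}$ explicitly is the usual practice.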

True/False Questions.
(1) If $\{v_1, v_2, v_3\}$ is an orthogonal basis for $W$, then multiplying $v_3$ by a nonzero scalar $c$ gives a new orthogonal basis $\{v_1, v_2, cv_3\}$.
True.
(2) If $A = QR$, where $Q$ has orthonormal columns, then $R = Q^T A$.
True. From $A = QR$, we deduce that $Q^T A = Q^T Q R = R$.
(3) If $W = \mathrm{Span}\{x_1, x_2, x_3\}$ with $\{x_1, x_2, x_3\}$ linearly independent, and if $\{v_1, v_2, v_3\}$ is an orthogonal set of nonzero vectors in $W$, then $\{v_1, v_2, v_3\}$ is a basis for $W$.
True. The condition says that $\dim W = 3$. An orthogonal set of nonzero vectors is always linearly independent; so $\{v_1, v_2, v_3\}$ is a basis for $W$.
(4) If $x$ is not in a subspace $W$, then $x - \mathrm{proj}_W x$ is not zero.
True. $x$ is in the subspace $W$ if and only if $x = \mathrm{proj}_W x$.
(5) In a QR factorization, say $A = QR$ (when $A$ has linearly independent columns), the columns of $Q$ form an orthonormal basis for the column space of $A$.
True.

Exercise. Use the Gram-Schmidt process to produce an orthogonal basis for $W = \mathrm{Span}\{x_1, x_2\}$.

Solution. Set $v_1 = x_1$. Then set
$$v_2 = x_2 - \frac{x_2 \cdot v_1}{v_1 \cdot v_1}\, v_1.$$

So an orthogonal basis of $W$ is $\{v_1, v_2\}$.

Exercise. Use the Gram-Schmidt process to produce an orthonormal basis for $W = \mathrm{Span}\{x_1, x_2\}$.

Solution. Set $v_1 = x_1$. Then set
$$v_2 = x_2 - \frac{x_2 \cdot v_1}{v_1 \cdot v_1}\, v_1.$$
Normalize them as follows:
$$u_1 = \frac{v_1}{\|v_1\|}, \qquad u_2 = \frac{v_2}{\|v_2\|}.$$
So an orthonormal basis of $W$ is $\{u_1, u_2\}$.

Exercise. Give the QR factorization for a matrix $A$ with linearly independent columns $x_1$ and $x_2$.

Solution. Set $v_1 = x_1$. Then set
$$v_2 = x_2 - \frac{x_2 \cdot v_1}{v_1 \cdot v_1}\, v_1.$$
We normalize them as follows:
$$u_1 = \frac{v_1}{\|v_1\|}, \qquad u_2 = \frac{v_2}{\|v_2\|}.$$
So the matrix $Q$ is the matrix with columns $u_1$ and $u_2$,
$$Q = \begin{pmatrix} u_1 & u_2 \end{pmatrix},$$
and the corresponding matrix $R$ is
$$R = Q^T A.$$
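In practice, answers to exercises like the last one can be checked with a library QR routine. The sketch below (not part of the worksheet) runs `numpy.linalg.qr` on the matrix $A$ from the worked example; NumPy does not promise a positive diagonal in $R$, so a sign flip may be needed before comparing with a hand computation.

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])
Q, R = np.linalg.qr(A)                   # thin QR: Q is 4 x 3, R is 3 x 3 upper triangular
print(np.allclose(Q @ R, A))             # True: the factors reproduce A
print(np.allclose(Q.T @ Q, np.eye(3)))   # True: the columns of Q are orthonormal

# Flip signs so that R has a positive diagonal, matching the convention above.
S = np.diag(np.sign(np.diag(R)))
Q, R = Q @ S, S @ R
print(np.round(R, 4))  # compare with [[2, 3/2, 1], [0, 3/sqrt(12), 2/sqrt(12)], [0, 0, 2/sqrt(6)]]
```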