
Orthogonality and QR

The 'linear algebra way' of talking about "angle" and "similarity" between two vectors is called "inner product". We'll define this next.

So, what is an inner product? An inner product $\langle \cdot, \cdot \rangle$ is a function of two vector arguments (where both vectors must be from the same vector space $V$) with the following properties:

(1) $\langle x + y, z \rangle = \langle x, z \rangle + \langle y, z \rangle$
(2) $\langle \alpha x, y \rangle = \alpha \langle x, y \rangle$
(3) $\langle x, y \rangle = \langle y, x \rangle$
(4) $\langle x, x \rangle \ge 0$
(5) $\langle x, x \rangle = 0 \iff x = 0$

(1) and (2) are called "linearity in the first argument", (3) is called "symmetry", and (4) and (5) are called "positive definiteness".
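To make the axioms concrete, here is a small numpy sanity check (my own illustration, not one of the course demos) verifying properties (1) through (4) for the dot product on random vectors:

```python
import numpy as np

# Check the inner product axioms numerically for the dot product.
# Vector size and the scalar value are arbitrary illustration choices.
rng = np.random.default_rng(0)
x, y, z = rng.standard_normal((3, 4))
alpha = 2.5

ip = lambda u, v: u @ v  # candidate inner product: the dot product

print(np.isclose(ip(x + y, z), ip(x, z) + ip(y, z)))   # (1) additivity
print(np.isclose(ip(alpha * x, y), alpha * ip(x, y)))  # (2) homogeneity
print(np.isclose(ip(x, y), ip(y, x)))                  # (3) symmetry
print(ip(x, x) >= 0)                                   # (4) nonnegativity
```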

Can you give an example? The most important example is the dot product on $\mathbb{R}^n$:
$$\langle x, y \rangle = x \cdot y = \sum_{i=1}^n x_i y_i = x^T y.$$

Is any old inner product also "linear" in the second argument? Check, using symmetry and linearity in the first argument:
$$\langle x, y + z \rangle = \langle y + z, x \rangle = \langle y, x \rangle + \langle z, x \rangle = \langle x, y \rangle + \langle x, z \rangle.$$

Do inner products relate to norms at all? Yes, very much so in fact. Any inner product gives rise to a norm:
$$\|x\| = \sqrt{\langle x, x \rangle}.$$
For the dot product specifically, we have $\|x\|_2 = \sqrt{x^T x}$.
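A quick numerical illustration (vectors chosen arbitrarily) of the dot product and its induced norm:

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])
y = np.array([3.0, 0.0, 4.0])

print(x @ y)                 # dot product: 1*3 + 2*0 + 2*4 = 11.0
print(np.sqrt(x @ x))        # induced norm: sqrt(9) = 3.0
print(np.linalg.norm(x, 2))  # agrees with numpy's built-in 2-norm
```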

Tell me about orthogonality. Two vectors x and y are called orthogonal if their inner product is 0:
$$\langle x, y \rangle = 0.$$
In this case, the two vectors are also often called perpendicular: $x \perp y$. If $\langle \cdot, \cdot \rangle$ is the dot product, then this means that the two vectors form an angle of 90 degrees. For orthogonal vectors, we have the Pythagorean theorem:
$$\|x + y\|^2 = \|x\|^2 + \|y\|^2.$$
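A tiny check of the Pythagorean theorem for two (obviously orthogonal) example vectors:

```python
import numpy as np

x = np.array([2.0, 0.0])
y = np.array([0.0, 3.0])

print(x @ y)                                        # 0.0: orthogonal
print(np.linalg.norm(x + y)**2)                     # 13.0
print(np.linalg.norm(x)**2 + np.linalg.norm(y)**2)  # 13.0 as well
```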

What if I've got two vectors that are not orthogonal, but I'd like them to be?

Given: two vectors $x$ and $v$.
Want: a vector $y = x - \alpha v$ with $\langle y, v \rangle = 0$. Solving for $\alpha$ gives $\alpha = \langle x, v \rangle / \langle v, v \rangle$, also written as:
$$y = x - \frac{\langle x, v \rangle}{\langle v, v \rangle}\, v.$$

Demo: Orthogonalizing vectors

This process is called orthogonalization. Note: The expression above becomes even simpler if $\|v\| = 1$: then $y = x - \langle x, v \rangle\, v$.
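A minimal sketch of this formula in numpy (my own example, not the course demo):

```python
import numpy as np

def orthogonalize(x, v):
    """Return y = x - (<x, v>/<v, v>) v, which satisfies <y, v> = 0."""
    return x - (x @ v) / (v @ v) * v

x = np.array([1.0, 1.0])
v = np.array([1.0, 0.0])
y = orthogonalize(x, v)
print(y)      # [0. 1.]
print(y @ v)  # 0.0: y is now orthogonal to v
```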

In this situation, where is the closest point to x on the line? It's the point pointed to by a (the component of x along v). By the Pythagorean theorem, all other points on the line would have a larger distance to x: the difference would contain b and additionally a component along v.

How does the point-normal form (of a line) work?
1. Find a vector n orthogonal to the vector v spanning the line. (For example by orthogonalization of another vector against v.)
2. Scale n to have norm 1: n is called the (unit) normal.
3. Find a point p on the line.
4. Compute $c = \langle n, p \rangle$.
5. The equation $\langle n, x \rangle = c$ is satisfied exactly by points x on the line.
6. For all other points, $\langle n, x \rangle - c$ gives the (signed) distance from the line.

Demo: Point-normal form and signed distance
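Here is a short numpy sketch of those steps for one hypothetical line (the line through p with direction v is my own example):

```python
import numpy as np

v = np.array([1.0, 1.0])   # direction spanning the line
p = np.array([0.0, 2.0])   # a point on the line

# Steps 1-2: orthogonalize some vector against v, then normalize.
n = np.array([1.0, 0.0])
n = n - (n @ v) / (v @ v) * v
n = n / np.linalg.norm(n)

# Step 4: c = <n, p>.
c = n @ p

def signed_distance(x):
    return n @ x - c  # zero exactly for points on the line

print(signed_distance(p))                      # 0.0
print(signed_distance(np.array([0.0, 0.0])))   # ~1.414, i.e. sqrt(2)
```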

What's an orthogonal basis? A basis $(q_1, \dots, q_n)$ of a vector space that is also pairwise orthogonal: $\langle q_i, q_j \rangle = 0$ if $i \ne j$.

What's an orthonormal basis (or "ONB")? An orthogonal basis where each basis vector has norm 1.

For some given vector x, how do I find coefficients with respect to an ONB? Much easier than finding coefficients by solving a linear system! Since $x = \sum_i \alpha_i q_i$, taking the inner product with $q_j$ gives $\alpha_j = \langle x, q_j \rangle$. Also much cheaper: $O(n^2)$ instead of $O(n^3)$.

Can we build a matrix that computes those coefficients for us? Yes: the matrix $Q^T$ whose rows are $q_1^T, \dots, q_n^T$, since $(Q^T x)_j = \langle q_j, x \rangle$. A square matrix whose columns are orthonormal is called orthogonal; it satisfies $Q^T Q = I$, i.e. $Q^{-1} = Q^T$.
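A small illustration with one particular ONB of $\mathbb{R}^2$ (my own choice of basis):

```python
import numpy as np

# An ONB of R^2, stored as the columns of Q.
Q = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

x = np.array([3.0, 4.0])
alpha = Q.T @ x   # coefficients alpha_i = <x, q_i>, no linear solve needed
print(alpha)      # [ 4.9497..., -0.7071...]
print(Q @ alpha)  # [3. 4.]: reconstructs x
print(Q.T @ Q)    # identity (up to roundoff), i.e. Q^{-1} = Q^T
```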

What else is true for an orthogonal matrix Q?
$$\|Qx\|_2^2 = (Qx)^T (Qx) = x^T Q^T Q x = x^T x = \|x\|_2^2.$$
In words: Qx has the same 2-norm as x.

What if Q contains a few zero columns instead of orthonormal vectors? Say the orthonormal columns that remain are $q_1, \dots, q_k$. Define $P = Q Q^T$ and compute:
$$Px = Q Q^T x = \sum_{i=1}^k \langle x, q_i \rangle\, q_i,$$
which is x projected onto $\operatorname{span}\{q_1, \dots, q_k\}$. P is called the orthogonal projector onto that span. Observe: $P^2 = Q Q^T Q Q^T = Q Q^T = P$, so projecting a second time changes nothing.

Demo: Orthogonal Projection
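A concrete projector, for a Q whose columns span the x-y plane in $\mathbb{R}^3$ (illustration of mine):

```python
import numpy as np

# Two orthonormal columns spanning the x-y plane of R^3.
Q = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

P = Q @ Q.T   # orthogonal projector onto colspan(Q)
x = np.array([1.0, 2.0, 3.0])
print(P @ x)                  # [1. 2. 0.]: the x-y components survive
print(np.allclose(P @ P, P))  # True: P^2 = P
```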

Orthonormal vectors seem very useful. How can we make more than two? Need pairwise orthogonality: each new vector needs to be orthogonalized against all prior vectors!

Demo: Orthogonalizing three vectors
Demo: Gram-Schmidt--The Movie
Demo: Gram-Schmidt and Modified Gram-Schmidt
Demo: Keeping track of the coefficients in Gram-Schmidt (QR factorization)

So, what is the QR factorization? The QR factorization is given by
$$A = QR,$$
where A is any matrix, Q is orthogonal, and R is upper triangular.

If life were consistent, shouldn't this be called the QU factorization? Yes.
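As a sketch of keeping track of the coefficients, here is classical Gram-Schmidt QR in numpy (my own teaching sketch, not the course demo; the demos note that modified Gram-Schmidt behaves better numerically):

```python
import numpy as np

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt QR of a full-column-rank matrix A."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        q = A[:, j].copy()
        for i in range(j):  # orthogonalize against all prior vectors
            R[i, j] = Q[:, i] @ A[:, j]
            q -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(q)   # normalize; this lands on R's diagonal
        Q[:, j] = q / R[j, j]
    return Q, R

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q, R = gram_schmidt_qr(A)
print(np.allclose(Q @ R, A))              # True: A = QR
print(np.allclose(Q.T @ Q, np.eye(2)))    # True: orthonormal columns
```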

What is the cost of QR factorization (for an n×n matrix A)? For each of n columns of A: orthogonalize against every one of (at most n) previous vectors, by computing a dot product at a cost of O(n). Altogether: $O(n^3)$.

Demo: Complexity of LU and QR

Does QR work for non-square matrices? Yes. This is very similar to LU factorization. For an $m \times n$ matrix A with $m \ge n$, either:
- "full"/"complete": Q is square ($m \times m$) and R is $m \times n$, or
- "thin"/"reduced": Q is $m \times n$ with orthonormal columns and R is square ($n \times n$).

Is Q still orthogonal in a "thin" QR factorization? No--its columns do not form a full basis of $\mathbb{R}^m$. That's why it is sometimes reasonable to ask for a "full"/"complete" QR.
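numpy exposes both variants through the mode parameter of np.linalg.qr; a quick shape comparison (random matrix chosen for illustration):

```python
import numpy as np

A = np.random.rand(5, 3)   # tall matrix: m = 5, n = 3

Q_thin, R_thin = np.linalg.qr(A, mode='reduced')   # the default
Q_full, R_full = np.linalg.qr(A, mode='complete')
print(Q_thin.shape, R_thin.shape)   # (5, 3) (3, 3)
print(Q_full.shape, R_full.shape)   # (5, 5) (5, 3)

# Thin Q has orthonormal columns, but is not orthogonal as a matrix:
print(np.allclose(Q_thin.T @ Q_thin, np.eye(3)))   # True
print(np.allclose(Q_thin @ Q_thin.T, np.eye(5)))   # False
```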

If I have a "thin" QR, can I obtain a "full" QR from it? Add zeros here. Add more orthogonal columns here. How? (1) Randomly choose m-n vectors. (2) Use Gram-Schmidt to orthogonalize them against all previous columns. Can QR fail? If two columns are exactly linearly dependent, we may obtain a zero vector when orthogonalizing. -> Zero on diagonal in R. Not much of an issue: Does not affect existence of QR. Pivoted QR exists, but is not very common.