Lecture 10: Vector Algebra: Orthogonal Basis


Lecture 10: Vector Algebra: Orthogonal Basis. Topics: the orthogonal basis of a subspace, and computing an orthogonal basis for a subspace using the Gram-Schmidt Orthogonalization Process.

Orthogonal Set: Any set of vectors that are mutually orthogonal (every pair has dot product zero) is an orthogonal set.

Orthonormal Set: Any set of unit vectors that are mutually orthogonal is an orthonormal set. In other words, an orthogonal set is an orthonormal set if all the vectors in the set are unit vectors. Example: $\{u_1, u_2, u_3\}$ is an orthonormal set, where

$u_1 = \frac{1}{\sqrt{11}}(3, 1, 1), \quad u_2 = \frac{1}{\sqrt{6}}(-1, 2, 1), \quad u_3 = \frac{1}{\sqrt{66}}(-1, -4, 7).$
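A quick numerical check of this example (a sketch using NumPy):

```python
import numpy as np

u1 = np.array([3.0, 1.0, 1.0]) / np.sqrt(11)
u2 = np.array([-1.0, 2.0, 1.0]) / np.sqrt(6)
u3 = np.array([-1.0, -4.0, 7.0]) / np.sqrt(66)

U = np.array([u1, u2, u3])
print(np.round(U @ U.T, 10))   # identity matrix: unit lengths, mutually orthogonal
```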

An orthogonal set of nonzero vectors is linearly independent: if $c_1 u_1 + \cdots + c_p u_p = 0$, taking the dot product of both sides with $u_j$ leaves $c_j (u_j \cdot u_j) = 0$, so every $c_j = 0$.

Projection of vector b on vector a: $a \cdot b = \|a\|\,\|b\|\cos\theta$. Vector c is the image (perpendicular projection) of b on a. The direction of c is the same as a, and its magnitude is $\|c\| = \|b\|\cos\theta = \frac{a \cdot b}{\|a\|}$. If $\hat{a} = \frac{a}{\|a\|}$ is the unit vector of a, then

$c = \|c\|\,\hat{a} = \frac{a \cdot b}{\|a\|}\,\frac{a}{\|a\|} = \frac{a \cdot b}{a \cdot a}\, a.$

(In the figure, b decomposes into a component of length $\|b\|\cos\theta$ along a and a component of length $\|b\|\sin\theta$ perpendicular to it.)
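A small sketch of the projection formula in NumPy; the vectors a and b here are arbitrary illustrations:

```python
import numpy as np

# Perpendicular projection of b onto a:  c = (a . b) / (a . a) * a
a = np.array([2.0, 0.0, 0.0])
b = np.array([1.0, 3.0, 0.0])

c = (a @ b) / (a @ a) * a    # component of b along a
print(c)                     # [1. 0. 0.]
print((b - c) @ a)           # 0.0: the residual is perpendicular to a
```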

Orthogonal Basis: An orthogonal basis for a subspace W of $\mathbb{R}^n$ is a basis for W that is also an orthogonal set. Example: $(1,0,0)$, $(0,1,0)$, $(0,0,1)$ is basically the x, y, and z axes. It is an orthogonal basis of $\mathbb{R}^3$: it spans the whole space, and it is an orthogonal set.

Orthogonal Basis: We know that, given a basis of a subspace, any vector in that subspace is a linear combination of the basis vectors. For example, if u and v are linearly independent and form a basis for a subspace S, then any vector y in S can be expressed as $y = c_1 u + c_2 v$. But computing $c_1$ and $c_2$ is not straightforward in general. On the other hand, if u and v form an orthogonal basis, then

$c_1 = \frac{y \cdot u}{u \cdot u} \quad \text{and} \quad c_2 = \frac{y \cdot v}{v \cdot v}.$
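A sketch of the coordinate formulas with an illustrative orthogonal pair u, v:

```python
import numpy as np

# Coordinates of y relative to an orthogonal basis {u, v} of a plane in R^3.
u = np.array([1.0, 1.0, 0.0])
v = np.array([1.0, -1.0, 0.0])   # u . v = 0, so {u, v} is an orthogonal set
y = 2 * u + 3 * v                # a vector known to lie in Span{u, v}

c1 = (y @ u) / (u @ u)           # 2.0
c2 = (y @ v) / (v @ v)           # 3.0
print(c1, c2)
```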

This does not work if the basis is not orthogonal. Again $y = c_1 u + c_2 v$, and computing $c_1$ and $c_2$ is not straightforward (yet). What the dot-product formulas compute instead is

$d_1 = \frac{y \cdot u}{u \cdot u} \quad \text{and} \quad d_2 = \frac{y \cdot v}{v \cdot v},$

and, as the figure shows, $d_1 \neq c_1$ and $d_2 \neq c_2$ when u and v are not orthogonal.
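A small demonstration of the failure with a non-orthogonal pair (illustrative vectors):

```python
import numpy as np

# With a NON-orthogonal basis, the same dot-product formulas give the
# wrong coefficients.
u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])            # u . v = 1, not orthogonal
y = 2 * u + 3 * v                   # true coordinates: c1 = 2, c2 = 3

d1 = (y @ u) / (u @ u)              # 5.0, not 2
d2 = (y @ v) / (v @ v)              # 4.0, not 3
print(d1, d2)
print(np.allclose(d1 * u + d2 * v, y))   # False
```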

Orthogonal Decomposition Theorem: Let W be a subspace of $\mathbb{R}^n$. Then each y in $\mathbb{R}^n$ can be written uniquely as $y = \hat{y} + z$, where $\hat{y}$ is in W and z is in $W^\perp$. In fact, if $\{u_1, \ldots, u_p\}$ is any orthogonal basis of W, then

$\hat{y} = \frac{y \cdot u_1}{u_1 \cdot u_1} u_1 + \cdots + \frac{y \cdot u_p}{u_p \cdot u_p} u_p, \qquad z = y - \hat{y}.$

The vector $\hat{y}$ is called the orthogonal projection of y onto W.
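A worked illustration of the theorem in NumPy; the basis and the vector y below are chosen for demonstration:

```python
import numpy as np

# Orthogonal decomposition y = yhat + z with respect to W = Span{u1, u2}.
u1 = np.array([2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0, 1.0])   # u1 . u2 = 0: an orthogonal basis of W
y  = np.array([1.0, 2.0, 3.0])

yhat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
z = y - yhat                      # the component of y orthogonal to W

print(yhat)                       # [-0.4  2.   0.2], the projection onto W
print(z @ u1, z @ u2)             # both 0.0: z is perpendicular to W
```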

Projection of a vector on a subspace: Let u and v be orthogonal 3D vectors; they span a plane in 3D. Let y be an arbitrary 3D vector out of the plane. The projection of y onto the plane is

$\hat{y} = \frac{y \cdot u}{u \cdot u} u + \frac{y \cdot v}{v \cdot v} v.$

The point $\hat{y}$ is also the closest point to y on the plane. The vector $y - \hat{y}$ is perpendicular to $\hat{y}$ and to Span{u, v}, and hence to u and v.

Closest point of a vector to a span does not depend on the basis of that span. y is an arbitrary 3D vector out of the plane, and $\hat{y}$ is the projection of y onto the plane. We can write $\hat{y} = x_1 u_1 + x_2 u_2$, or $\hat{y} = x_3 u_3 + x_4 u_4$, or $\hat{y} = x_5 u_1 + x_6 u_3$. The coordinates of $\hat{y}$ change if the basis changes, but the vector $\hat{y}$ itself does not change. Hence the closest point to y on the plane does not change. In every case, $y - \hat{y}$ is perpendicular to $\hat{y}$ and to the span.

Closest point of a vector to a span does not depend on the basis of that span. FINDING THE CLOSEST POINT OF A VECTOR TO A SPAN means finding the coordinates of the projection of the vector. So, if you want to compute the closest point of a vector to a span, find a basis with which the coordinates of the projection are easy to compute. What would that basis be? Answer: an orthogonal basis!
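A numerical confirmation that the closest point is basis-independent; the two bases below are illustrative, and np.linalg.lstsq is used to find each set of projection coefficients:

```python
import numpy as np

# Two different bases for the same plane in R^3 (illustrative choice).
B1 = np.column_stack(([1.0, 0.0, 1.0], [0.0, 1.0, 1.0]))
B2 = np.column_stack(([1.0, 1.0, 2.0], [1.0, -1.0, 0.0]))  # same column space

y = np.array([3.0, 1.0, 4.0])

# Least-squares coefficients give the projection of y onto the span.
yhat1 = B1 @ np.linalg.lstsq(B1, y, rcond=None)[0]
yhat2 = B2 @ np.linalg.lstsq(B2, y, rcond=None)[0]

print(np.allclose(yhat1, yhat2))   # True: the closest point does not change
```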

Projection on a span of non-orthogonal vectors: How do we find the projection of an arbitrary 3D vector onto the span of two non-orthogonal, linearly independent vectors? Suppose $u_1$ and $u_2$ are not orthogonal but are linearly independent vectors in 3D, and y is an arbitrary 3D vector. To find the projection of y onto the space spanned by $u_1$ and $u_2$: (a) first, find an orthogonal set of vectors $v_1$ and $v_2$ that span the same subspace as $u_1$ and $u_2$; in other words, find an orthogonal basis; (b) project y onto the space spanned by the orthogonal vectors $v_1$ and $v_2$, as we did earlier.

How to find an orthogonal basis? Assume the first vector $u_1$ is in the orthogonal basis; the other vector(s) of the basis are computed to be perpendicular to $u_1$. Let $v_1 = u_1$ and

$v_2 = u_2 - \frac{u_2 \cdot v_1}{v_1 \cdot v_1} v_1.$

We know that $v_2$ is perpendicular to $v_1$. Also, $v_2$ is in Span{$u_1, u_2$} (why? it is a linear combination of $u_1$ and $u_2$). So Span{$u_1, u_2$} = Span{$v_1, v_2$}, and {$v_1, v_2$} is an orthogonal basis. The projection of y onto Span{$u_1, u_2$} is then

$\hat{y} = \frac{y \cdot v_1}{v_1 \cdot v_1} v_1 + \frac{y \cdot v_2}{v_2 \cdot v_2} v_2.$
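The same construction in code, with illustrative vectors:

```python
import numpy as np

# Two-vector Gram-Schmidt step.
u1 = np.array([3.0, 6.0, 0.0])
u2 = np.array([1.0, 2.0, 2.0])           # not orthogonal to u1

v1 = u1
v2 = u2 - (u2 @ v1) / (v1 @ v1) * v1     # subtract the component along v1

print(v2)                                # [0. 0. 2.]
print(v1 @ v2)                           # 0.0: {v1, v2} is orthogonal
```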

Gram-Schmidt Orthogonalization Process: Given a basis $\{u_1, \ldots, u_p\}$ for a subspace W, set $v_1 = u_1$ and, for $k = 2, \ldots, p$,

$v_k = u_k - \frac{u_k \cdot v_1}{v_1 \cdot v_1} v_1 - \cdots - \frac{u_k \cdot v_{k-1}}{v_{k-1} \cdot v_{k-1}} v_{k-1}.$

Then $\{v_1, \ldots, v_p\}$ is an orthogonal basis for W, and Span{$v_1, \ldots, v_k$} = Span{$u_1, \ldots, u_k$} for every k.

Gram-Schmidt Orthogonalization Process Example:
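A minimal sketch of the full process in code; the input vectors are illustrative choices, not necessarily the original example's:

```python
import numpy as np

def gram_schmidt(U):
    """Return an orthogonal basis for the span of the rows of U."""
    V = []
    for u in U:
        v = u.astype(float)
        for w in V:                        # subtract components along earlier v's
            v -= (u @ w) / (w @ w) * w
        V.append(v)
    return np.array(V)

U = np.array([[1.0, 1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0, 1.0],
              [0.0, 0.0, 1.0, 1.0]])
V = gram_schmidt(U)
print(np.round(V @ V.T, 10))               # off-diagonal entries are all 0
```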

Solving Inconsistent Systems: When a system has more equations than unknowns, there may be no exact solution. The strategy is to replace the unreachable right-hand side with the closest point in the span of the columns of the coefficient matrix, that is, with its orthogonal projection onto that span.

Solving Inconsistent Systems, Example 1: A trader buys and/or sells tomatoes and potatoes. (A negative number means he buys, a positive number means he sells.) In the process, he either makes a profit (positive number) or a loss (negative number). A week's transactions are shown; find the approximate cost of tomatoes and potatoes.

Tomatoes (tons)   Potatoes (tons)   Profit/Loss (in thousands)
      1                -6                     -1
      1                -2                      2
      1                 1                      1
      1                 7                      6

Solving Inconsistent Systems: With t and p the prices per ton of tomatoes and potatoes, the week's transactions give

$t - 6p = -1$
$t - 2p = 2$
$t + p = 1$
$t + 7p = 6$

or, in vector form,

$\begin{bmatrix} 1 \\ 1 \\ 1 \\ 1 \end{bmatrix} t + \begin{bmatrix} -6 \\ -2 \\ 1 \\ 7 \end{bmatrix} p = \begin{bmatrix} -1 \\ 2 \\ 1 \\ 6 \end{bmatrix}.$

The above equation might not have a solution (values of t and p that satisfy every row). So the best we can do is to find the values of t and p that result in a left-hand-side vector that is as close as possible to the desired right-hand-side vector.
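A sketch of the least-squares computation in NumPy; np.linalg.lstsq returns the $(t, p)$ minimizing the distance $\|Ax - b\|$:

```python
import numpy as np

# Trader system: columns are tons of tomatoes and potatoes, b is profit/loss.
A = np.array([[1.0, -6.0],
              [1.0, -2.0],
              [1.0,  1.0],
              [1.0,  7.0]])
b = np.array([-1.0, 2.0, 1.0, 6.0])

# Least-squares solution: the (t, p) whose A @ (t, p) is closest to b.
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_hat)                                # [2.  0.5]

# Equivalent via the normal equations A^T A x = A^T b:
print(np.linalg.solve(A.T @ A, A.T @ b))    # [2.  0.5]
```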

Solving Inconsistent Systems Example 2: