Math 3191 Applied Linear Algebra


Math 3191 Applied Linear Algebra
Lecture 1: Inner Products, Length, Orthogonality
Stephen Billups
University of Colorado at Denver

Motivation

Not all linear systems have solutions.

EXAMPLE: A system Ax = b can fail to have a solution: every vector of the form Ax lies on a line through the origin, while b is not on that line, so Ax ≠ b for all x.

[Figure: the line of points Ax in the x_1–x_2 plane, with b lying off the line.]

Approximate Solutions

Instead, find x̂ so that Ax̂ lies closest to b. Using techniques from this chapter, we will be able to compute this x̂ and the corresponding vector Ax̂.

[Figure: the point b, the line of points Ax, and the closest point Ax̂.]

Observation: The segment joining Ax̂ and b is perpendicular (orthogonal) to the set of all points of the form Ax.

We need to develop the fundamental ideas of length, orthogonality, and orthogonal projections. The key to all of these concepts is the inner product.
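
As a concrete illustration of the motivation above, here is a minimal NumPy sketch (the matrix A and vector b are made-up stand-ins, not the example from the slide): np.linalg.lstsq returns the x̂ that makes Ax̂ closest to b, and the residual b − Ax̂ comes out orthogonal to the column of A.

import numpy as np

# A made-up inconsistent system: b does not lie on the line of points Ax.
A = np.array([[1.0], [2.0]])       # every Ax lies on the line spanned by (1, 2)
b = np.array([3.0, 1.0])           # b is not on that line, so Ax = b has no solution

# Least-squares "approximate solution": the x_hat making A @ x_hat closest to b.
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
residual = b - A @ x_hat

print(x_hat)              # the best multiple of the column of A (here 1.0)
print(A.T @ residual)     # approximately 0: the segment from A x_hat to b is orthogonal to Col A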

The Inner Product

For u = [u_1; u_2; ...; u_n] and v = [v_1; v_2; ...; v_n] in R^n, the inner product (or dot product) of u and v is

u · v = u^T v = [u_1 u_2 ... u_n][v_1; v_2; ...; v_n] = u_1 v_1 + u_2 v_2 + ... + u_n v_n.

Note that v · u = v_1 u_1 + ... + v_n u_n = u_1 v_1 + ... + u_n v_n = u · v.

THEOREM 1. Let u, v, and w be vectors in R^n, and let c be any scalar. Then

a. u · v = v · u
b. (u + v) · w = u · w + v · w
c. (cu) · v = c(u · v) = u · (cv)
d. u · u ≥ 0, and u · u = 0 if and only if u = 0.

Combining parts b and c, one can show that

(c_1 u_1 + ... + c_p u_p) · w = c_1(u_1 · w) + ... + c_p(u_p · w).
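
The definition and the properties in Theorem 1 are easy to check numerically; here is a minimal NumPy sketch with arbitrarily chosen vectors:

import numpy as np

u = np.array([1.0, 2.0, -1.0])
v = np.array([3.0, 0.0, 4.0])
w = np.array([-2.0, 5.0, 1.0])
c = 2.5

print(u @ v)                                   # u . v = 1*3 + 2*0 + (-1)*4 = -1
print(np.isclose(u @ v, v @ u))                # a. symmetry
print(np.isclose((u + v) @ w, u @ w + v @ w))  # b. additivity in the first argument
print(np.isclose((c * u) @ v, c * (u @ v)))    # c. scalars factor out
print(u @ u >= 0)                              # d. u . u is nonnegative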

Length of a Vector

For v = [v_1; v_2; ...; v_n], the length (or norm) of v is the nonnegative scalar ‖v‖ defined by

‖v‖ = √(v · v) = √(v_1² + v_2² + ... + v_n²),   so that   ‖v‖² = v · v.

For example, if v = [a; b], then ‖v‖ = √(a² + b²) (the distance between 0 and v).

Observation: For any scalar c, ‖cv‖ = |c| ‖v‖.

Distance in R^n

The distance between u and v in R^n is dist(u, v) = ‖u − v‖.

This agrees with the usual formulas for R² and R³. Let u = (u_1, u_2) and v = (v_1, v_2). Then u − v = (u_1 − v_1, u_2 − v_2) and

dist(u, v) = ‖u − v‖ = ‖(u_1 − v_1, u_2 − v_2)‖ = √((u_1 − v_1)² + (u_2 − v_2)²).
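
A small NumPy sketch of length and distance with arbitrarily chosen vectors: np.linalg.norm computes ‖·‖, and dist(u, v) is the norm of the difference.

import numpy as np

v = np.array([3.0, 4.0])
u = np.array([7.0, 1.0])
c = -2.0

print(np.linalg.norm(v))                        # sqrt(3^2 + 4^2) = 5.0
print(np.isclose(np.linalg.norm(v)**2, v @ v))  # ||v||^2 = v . v
print(np.isclose(np.linalg.norm(c * v), abs(c) * np.linalg.norm(v)))  # ||cv|| = |c| ||v||
print(np.linalg.norm(u - v))                    # dist(u, v) = ||(4, -3)|| = 5.0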

Orthogonal Vectors

[dist(u, v)]² = ‖u − v‖² = (u − v) · (u − v)
             = u · (u − v) + (−v) · (u − v)
             = u · u − u · v − v · u + v · v
             = ‖u‖² + ‖v‖² − 2 u · v

So [dist(u, v)]² = ‖u‖² + ‖v‖² − 2 u · v.

The previous slide showed that [dist(u, v)]² = ‖u‖² + ‖v‖² − 2 u · v. Similarly, we can show that [dist(u, −v)]² = ‖u‖² + ‖v‖² + 2 u · v. So [dist(u, v)]² = [dist(u, −v)]² exactly when u · v = 0.

Two vectors u and v are said to be orthogonal (to each other) if u · v = 0.

Also note that if u and v are orthogonal, then ‖u + v‖² = ‖u‖² + ‖v‖².

THEOREM 2 (THE PYTHAGOREAN THEOREM). Two vectors u and v are orthogonal if and only if ‖u + v‖² = ‖u‖² + ‖v‖².
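
A quick numerical illustration of orthogonality and the Pythagorean theorem, as a minimal NumPy sketch with an arbitrarily chosen orthogonal pair:

import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([-2.0, -1.0, 2.0])

print(u @ v)   # 1*(-2) + 2*(-1) + 2*2 = 0, so u and v are orthogonal

# Pythagorean theorem: since u . v = 0, ||u + v||^2 equals ||u||^2 + ||v||^2.
lhs = np.linalg.norm(u + v)**2
rhs = np.linalg.norm(u)**2 + np.linalg.norm(v)**2
print(np.isclose(lhs, rhs))   # True (both equal 18)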

Orthogonal Complements

If a vector z is orthogonal to every vector in a subspace W of R^n, then z is said to be orthogonal to W. The set of all vectors z that are orthogonal to W is called the orthogonal complement of W and is denoted by W^⊥ (read as "W perp").

Row, Null, and Column Spaces

THEOREM 3. Let A be an m × n matrix. Then the orthogonal complement of the row space of A is the null space of A, and the orthogonal complement of the column space of A is the null space of A^T:

(Row A)^⊥ = Nul A,   (Col A)^⊥ = Nul A^T.

Why? (See the complete proof in the text.) Consider Ax = 0. If r_1, ..., r_m denote the rows of A, then

Ax = [r_1 · x; r_2 · x; ...; r_m · x] = [0; 0; ...; 0].

Thus x is orthogonal to each row of A, so x is orthogonal to Row A.
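
The relationship (Row A)^⊥ = Nul A is easy to verify numerically; here is a minimal NumPy sketch using an arbitrary matrix, with a null-space basis extracted from the SVD:

import numpy as np

# An arbitrary 2 x 3 matrix of rank 1, so Nul A is a plane in R^3.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

# Right singular vectors for (near-)zero singular values span Nul A.
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:].T           # columns form a basis for Nul A

# Every null-space vector is orthogonal to every row of A.
print(np.allclose(A @ null_basis, 0))   # True: Nul A = (Row A)^perp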

EXAMPLE: Let A be a 2 × 3 matrix of rank 1. A basis for Nul A contains two vectors, and therefore Nul A is a plane in R³. A basis for Row A contains one vector, and therefore Row A is a line in R³. A basis for Col A contains one vector, and therefore Col A is a line in R². A basis for Nul A^T contains one vector, and therefore Nul A^T is a line in R².

Section. Orthogonal Sets A set of vectors {u 1, u,..., u p } in R n is called an orthogonal set if u i u j = 0 whenever i j. EXAMPLE: Is 8 >< >: 1 1 0 5, 1 1 0 5, 0 0 1 9 >= 5 >; an orthogonal set? Solution: Label the vectors u 1, u, and u respectively. Then u 1 u =, u 1 u =, u u = Therefore, {u 1, u, u } is an orthogonal set. Math 191Applied Linear Algebra p.1/

THEOREM 4. Suppose S = {u_1, u_2, ..., u_p} is an orthogonal set of nonzero vectors in R^n and W = span{u_1, u_2, ..., u_p}. Then S is a linearly independent set and is therefore a basis for W.

Partial Proof: Suppose c_1 u_1 + c_2 u_2 + ... + c_p u_p = 0. Taking the dot product of both sides with u_1,

(c_1 u_1 + c_2 u_2 + ... + c_p u_p) · u_1 = 0 · u_1 = 0
c_1(u_1 · u_1) + c_2(u_2 · u_1) + ... + c_p(u_p · u_1) = 0
c_1(u_1 · u_1) = 0.

Since u_1 ≠ 0, u_1 · u_1 > 0, which means c_1 = 0. In a similar manner, c_2, ..., c_p can be shown to all be 0. So S is a linearly independent set.

Orthogonal Basis

An orthogonal basis for a subspace W of R^n is a basis for W that is also an orthogonal set.

Question: Why would we want an orthogonal basis?
Answer: It makes it easy to calculate the coordinates of a vector relative to the basis.

EXAMPLE: Suppose S = {u_1, u_2, ..., u_p} is an orthogonal basis for a subspace W of R^n and suppose y is in W. Find c_1, ..., c_p so that

y = c_1 u_1 + c_2 u_2 + ... + c_p u_p.

Solution: Take the dot product of both sides with u_1:

y · u_1 = (c_1 u_1 + c_2 u_2 + ... + c_p u_p) · u_1
        = c_1(u_1 · u_1) + c_2(u_2 · u_1) + ... + c_p(u_p · u_1)
        = c_1(u_1 · u_1),

so c_1 = (y · u_1)/(u_1 · u_1). Similarly, c_2 = (y · u_2)/(u_2 · u_2), ..., c_p = (y · u_p)/(u_p · u_p).

THEOREM 5. Let {u_1, u_2, ..., u_p} be an orthogonal basis for a subspace W of R^n. Then each y in W has a unique representation as a linear combination of u_1, u_2, ..., u_p. In fact, if

y = c_1 u_1 + c_2 u_2 + ... + c_p u_p,

then

c_j = (y · u_j)/(u_j · u_j)   (j = 1, ..., p).
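
A minimal NumPy sketch of Theorem 5, using the orthogonal basis of R^3 from the earlier example and an arbitrarily chosen y: each coordinate is a single quotient of dot products, so no linear system has to be solved.

import numpy as np

# Orthogonal basis of R^3 from the earlier example.
basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, -1.0, 0.0]),
         np.array([0.0, 0.0, 1.0])]

y = np.array([4.0, 2.0, 7.0])        # an arbitrary vector to expand in this basis

# c_j = (y . u_j) / (u_j . u_j)
coeffs = [(y @ u) / (u @ u) for u in basis]
print(coeffs)                        # c_1 = 3, c_2 = 1, c_3 = 7

# Rebuild y from its coordinates to confirm the expansion.
y_rebuilt = sum(c * u for c, u in zip(coeffs, basis))
print(np.allclose(y_rebuilt, y))     # True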

EXAMPLE: Express a given vector y in R³ as a linear combination of the orthogonal basis {[1; 1; 0], [1; −1; 0], [0; 0; 1]}.

Solution:

(y · u_1)/(u_1 · u_1) = ____,   (y · u_2)/(u_2 · u_2) = ____,   (y · u_3)/(u_3 · u_3) = ____

Hence y = ____ u_1 + ____ u_2 + ____ u_3.

Orthogonal Projections

For a nonzero vector u in R^n, suppose we want to write y in R^n as

y = (a multiple of u) + (a vector orthogonal to u),

say y = αu + z with z orthogonal to u. Then

(y − αu) · u = 0
y · u − α(u · u) = 0
⟹ α = (y · u)/(u · u).

So

ŷ = ((y · u)/(u · u)) u   (the orthogonal projection of y onto u)

and

z = y − ((y · u)/(u · u)) u   (the component of y orthogonal to u).

Example

Let y and u be given vectors in R². Compute the distance from y to the line through 0 and u.

Solution:

ŷ = ((y · u)/(u · u)) u = ____

Distance from y to the line through 0 and u = distance from ŷ to y = ‖ŷ − y‖ = ____.
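
A minimal NumPy sketch of the projection formula and the distance computation (the vectors y and u below are arbitrary stand-ins, not the ones from the slide):

import numpy as np

y = np.array([7.0, 6.0])
u = np.array([4.0, 2.0])

# Orthogonal projection of y onto the line through 0 and u.
y_hat = (y @ u) / (u @ u) * u       # ((y . u)/(u . u)) u
z = y - y_hat                       # component of y orthogonal to u

print(y_hat)                        # [8. 4.]
print(z @ u)                        # 0: z is orthogonal to u
print(np.linalg.norm(y - y_hat))    # distance from y to the line = sqrt(5)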

Orthonormal Sets

A set of vectors {u_1, u_2, ..., u_p} in R^n is called an orthonormal set if

1. it is orthogonal, and
2. each vector has length 1.

If the orthonormal set {u_1, u_2, ..., u_p} spans a vector space W, then {u_1, u_2, ..., u_p} is called an orthonormal basis for W.

Orthogonal Matrices

Recall that v is a unit vector if ‖v‖² = v · v = v^T v = 1.

Suppose U = [u_1 u_2 u_3], where {u_1, u_2, u_3} is an orthonormal set. Then

U^T U = [u_1^T; u_2^T; u_3^T][u_1 u_2 u_3]
      = [u_1^T u_1  u_1^T u_2  u_1^T u_3;
         u_2^T u_1  u_2^T u_2  u_2^T u_3;
         u_3^T u_1  u_3^T u_2  u_3^T u_3]
      = [1 0 0; 0 1 0; 0 0 1] = I.

It can be shown that UU^T = I also. So U^{−1} = U^T; such a matrix is called an orthogonal matrix. (NOTE: U must be square to be orthogonal.)

THEOREM 6. An m × n matrix U has orthonormal columns if and only if U^T U = I.

THEOREM 7. Let U be an m × n matrix with orthonormal columns, and let x and y be in R^n. Then

a. ‖Ux‖ = ‖x‖
b. (Ux) · (Uy) = x · y
c. (Ux) · (Uy) = 0 if and only if x · y = 0.

Proof of part b: (Ux) · (Uy) = (Ux)^T(Uy) = x^T U^T U y = x^T y = x · y.
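
A minimal NumPy sketch of Theorems 6 and 7, building a matrix with orthonormal columns via a reduced QR factorization of an arbitrary random matrix:

import numpy as np

rng = np.random.default_rng(0)

# Q from a reduced QR factorization has orthonormal columns.
A = rng.standard_normal((5, 3))
Q, _ = np.linalg.qr(A)              # Q is 5 x 3 with orthonormal columns

x = rng.standard_normal(3)
y = rng.standard_normal(3)

print(np.allclose(Q.T @ Q, np.eye(3)))                       # Theorem 6: Q^T Q = I
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # a. ||Qx|| = ||x||
print(np.isclose((Q @ x) @ (Q @ y), x @ y))                  # b. (Qx) . (Qy) = x . y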