March 27 Math 3260 sec. 56 Spring 2018


Section 4.6: Rank

Definition: The row space, denoted Row $A$, of an $m \times n$ matrix $A$ is the subspace of $\mathbb{R}^n$ spanned by the rows of $A$. We now have three vector spaces associated with an $m \times n$ matrix $A$: its column space, null space, and row space.

Theorem: If two matrices $A$ and $B$ are row equivalent, then their row spaces are the same. In particular, if $B$ is an echelon form of the matrix $A$, then the nonzero rows of $B$ form a basis for Row $B$ and also for Row $A$, since these are the same space.

Example: A matrix $A$ along with its rref is shown.
$$A = \begin{bmatrix} -2 & -5 & 8 & 0 & -17 \\ 1 & 3 & -5 & 1 & 5 \\ 3 & 11 & -19 & 7 & 1 \\ 1 & 7 & -13 & 5 & -3 \end{bmatrix} \sim \begin{bmatrix} 1 & 0 & 1 & 0 & 1 \\ 0 & 1 & -2 & 0 & 3 \\ 0 & 0 & 0 & 1 & -5 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix}$$
(a) Find a basis for Row $A$ and state the dimension dim Row $A$.
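A sketch of part (a), using the theorem above: the nonzero rows of the rref form a basis for Row $A$, so one basis is
$$\{(1,0,1,0,1),\ (0,1,-2,0,3),\ (0,0,0,1,-5)\},$$
and dim Row $A = 3$.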

Example continued... (b) Find a basis for Col $A$ and state its dimension.
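A sketch of part (b): the pivots of the rref sit in columns 1, 2, and 4, so the corresponding columns of $A$ itself form a basis for Col $A$, namely $(-2, 1, 3, 1)$, $(-5, 3, 11, 7)$, and $(0, 1, 7, 5)$ regarded as columns, and dim Col $A = 3$.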

Example continued... (c) Find a basis for Nul $A$ and state its dimension.
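A sketch of part (c): the rref gives the equations $x_1 + x_3 + x_5 = 0$, $x_2 - 2x_3 + 3x_5 = 0$, and $x_4 - 5x_5 = 0$, with free variables $x_3$ and $x_5$. Every solution has the form
$$x = x_3(-1, 2, 1, 0, 0) + x_5(-1, -3, 0, 5, 1),$$
so these two vectors form a basis for Nul $A$ and dim Nul $A = 2$.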


Remarks: We can naturally associate three vector spaces with an $m \times n$ matrix $A$. Row $A$ and Nul $A$ are subspaces of $\mathbb{R}^n$, and Col $A$ is a subspace of $\mathbb{R}^m$. Careful! The rows of the rref do span Row $A$, but we go back to the columns of the original matrix to get vectors that span Col $A$. (Get a basis for Col $A$ from $A$ itself!) Careful again! Just because the first three rows of the rref span Row $A$ does not mean the first three rows of $A$ span Row $A$. (Get a basis for Row $A$ from the rref!)

Remarks: Row operations preserve the row space but change the linear dependence relations among the rows. Row operations change the column space but preserve the linear dependence relations among the columns. Another way to obtain a basis for Row $A$ is to take the transpose $A^T$ and do row operations. We have the relationships Col $A$ = Row $A^T$ and Row $A$ = Col $A^T$. The dimension of the null space is called the nullity.

Rank. Definition: The rank of a matrix $A$ (denoted rank $A$) is the dimension of the column space of $A$.
Theorem: For an $m \times n$ matrix $A$, dim Col $A$ = dim Row $A$ = rank $A$. Moreover, rank $A$ + dim Nul $A$ = $n$.
Note: This theorem states the rather obvious fact that
$$\{\text{number of pivot columns}\} + \{\text{number of non-pivot columns}\} = \{\text{total number of columns}\}.$$

Examples: (1) $A$ is a $5 \times 4$ matrix with rank $A = 4$. What is dim Nul $A$? (2) If $A$ is $7 \times 5$ and dim Col $A = 2$, determine the nullity of $A$ and rank $A^T$.
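A sketch of the answers, using the rank theorem: (1) dim Nul $A = n - \text{rank } A = 4 - 4 = 0$. (2) The nullity of $A$ is $5 - 2 = 3$, and rank $A^T$ = dim Col $A^T$ = dim Row $A$ = rank $A = 2$.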

Addendum to the Invertible Matrix Theorem: Let $A$ be an $n \times n$ matrix. The following are equivalent to the statement that $A$ is invertible.
(m) The columns of $A$ form a basis for $\mathbb{R}^n$.
(n) Col $A = \mathbb{R}^n$.
(o) dim Col $A = n$.
(p) rank $A = n$.
(q) Nul $A = \{0\}$.
(r) dim Nul $A = 0$.

Section 6.1: Inner Product, Length, and Orthogonality. Recall: A vector $u$ in $\mathbb{R}^n$ can be considered an $n \times 1$ matrix. It follows that $u^T$ is a $1 \times n$ matrix, $u^T = [u_1\ u_2\ \cdots\ u_n]$.
Definition: For vectors $u$ and $v$ in $\mathbb{R}^n$ we define the inner product of $u$ and $v$ (also called the dot product) by the matrix product
$$u^T v = [u_1\ u_2\ \cdots\ u_n]\begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix} = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n.$$
Note that this product produces a scalar. It is sometimes called a scalar product.
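A quick numerical illustration (not a lecture example): for $u = (1, 2, 3)$ and $v = (4, -5, 6)$,
$$u^T v = (1)(4) + (2)(-5) + (3)(6) = 12.$$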

Theorem (Properties of the Inner Product). We'll use the notation $u \cdot v = u^T v$. Theorem: For $u$, $v$, and $w$ in $\mathbb{R}^n$ and real scalar $c$:
(a) $u \cdot v = v \cdot u$
(b) $(u + v) \cdot w = u \cdot w + v \cdot w$
(c) $c(u \cdot v) = (cu) \cdot v = u \cdot (cv)$
(d) $u \cdot u \geq 0$, with $u \cdot u = 0$ if and only if $u = 0$.

The Norm. The property $u \cdot u \geq 0$ means that $\sqrt{u \cdot u}$ always exists as a real number.
Definition: The norm of the vector $v$ in $\mathbb{R}^n$ is the nonnegative number denoted and defined by
$$\|v\| = \sqrt{v \cdot v} = \sqrt{v_1^2 + v_2^2 + \cdots + v_n^2},$$
where $v_1, v_2, \ldots, v_n$ are the components of $v$. For a vector regarded as a directed line segment, the norm is the same as the length.

Norm and Length. Figure: In $\mathbb{R}^2$ or $\mathbb{R}^3$, the norm corresponds to the classic geometric property of length.

Unit Vectors and Normalizing. Theorem: For a vector $v$ in $\mathbb{R}^n$ and scalar $c$, $\|cv\| = |c|\,\|v\|$.
Definition: A vector $u$ in $\mathbb{R}^n$ for which $\|u\| = 1$ is called a unit vector.
Remark: Given any nonzero vector $v$ in $\mathbb{R}^n$, we can obtain a unit vector $u$ in the same direction as $v$:
$$u = \frac{v}{\|v\|}.$$
This process of dividing out the norm is called normalizing the vector $v$.

Example: Show that $v/\|v\|$ is a unit vector.
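A sketch of the argument, using the theorem above with $c = 1/\|v\|$: since $v \neq 0$ we have $\|v\| > 0$, and
$$\left\| \frac{v}{\|v\|} \right\| = \frac{1}{\|v\|}\,\|v\| = 1.$$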

Example: Find a unit vector in the direction of $v = (1, 3, 2)$.
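One possible computation: $\|v\| = \sqrt{1 + 9 + 4} = \sqrt{14}$, so
$$u = \frac{1}{\sqrt{14}}(1, 3, 2) = \left(\tfrac{1}{\sqrt{14}},\ \tfrac{3}{\sqrt{14}},\ \tfrac{2}{\sqrt{14}}\right).$$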

Distance in $\mathbb{R}^n$. Definition: For vectors $u$ and $v$ in $\mathbb{R}^n$, the distance between $u$ and $v$ is denoted and defined by $\text{dist}(u, v) = \|u - v\|$.
Example: Find the distance between $u = (4, 0, 1, 1)$ and $v = (0, 0, 2, 7)$.
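Taking $u$ and $v$ as written: $u - v = (4, 0, -1, -6)$, so $\text{dist}(u, v) = \|u - v\| = \sqrt{16 + 0 + 1 + 36} = \sqrt{53}$.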

Orthogonality. Definition: Two vectors $u$ and $v$ are orthogonal if $u \cdot v = 0$.
Figure: Note that two vectors are perpendicular if $\|u - v\| = \|u + v\|$.

Orthogonal and Perpendicular. Show that $\|u - v\| = \|u + v\|$ if and only if $u \cdot v = 0$.
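A sketch of the argument: expanding with the inner product properties,
$$\|u - v\|^2 = \|u\|^2 - 2\,u \cdot v + \|v\|^2 \quad\text{and}\quad \|u + v\|^2 = \|u\|^2 + 2\,u \cdot v + \|v\|^2.$$
Since norms are nonnegative, $\|u - v\| = \|u + v\|$ exactly when these squares agree, which happens if and only if $4\,u \cdot v = 0$, i.e. $u \cdot v = 0$.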


The Pythagorean Theorem. Theorem: Two vectors $u$ and $v$ are orthogonal if and only if $\|u + v\|^2 = \|u\|^2 + \|v\|^2$.

Orthogonal Complement. Definition: Let $W$ be a subspace of $\mathbb{R}^n$. A vector $z$ in $\mathbb{R}^n$ is said to be orthogonal to $W$ if $z$ is orthogonal to every vector in $W$.
Definition: Given a subspace $W$ of $\mathbb{R}^n$, the set of all vectors orthogonal to $W$ is called the orthogonal complement of $W$ and is denoted by $W^{\perp}$.

Theorem: $W^{\perp}$ is a subspace of $\mathbb{R}^n$.
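A sketch of the proof: $0 \cdot w = 0$ for every $w$ in $W$, so $0 \in W^{\perp}$. If $z_1, z_2 \in W^{\perp}$ and $c$ is a scalar, then for every $w$ in $W$,
$$(z_1 + z_2) \cdot w = z_1 \cdot w + z_2 \cdot w = 0 \quad\text{and}\quad (c z_1) \cdot w = c(z_1 \cdot w) = 0,$$
so $W^{\perp}$ contains the zero vector and is closed under addition and scalar multiplication.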


Example: Let $W = \text{Span}\{(1, 0, 0),\ (0, 0, 1)\}$. Show that $W^{\perp} = \text{Span}\{(0, 1, 0)\}$. Give a geometric interpretation of $W$ and $W^{\perp}$ as subspaces of $\mathbb{R}^3$.
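A sketch: a vector $z = (z_1, z_2, z_3)$ is orthogonal to $W$ exactly when it is orthogonal to both spanning vectors, i.e. $z \cdot (1,0,0) = z_1 = 0$ and $z \cdot (0,0,1) = z_3 = 0$. Then $z = z_2 (0, 1, 0)$, so $W^{\perp} = \text{Span}\{(0,1,0)\}$. Geometrically, $W$ is the $xz$-plane and $W^{\perp}$ is the $y$-axis.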


Example: Let $A = \begin{bmatrix} 1 & 3 & 2 \\ 2 & 0 & 4 \end{bmatrix}$. Show that if $x$ is in $\text{Nul}(A)$, then $x$ is in $[\text{Row}(A)]^{\perp}$.
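A sketch of the argument (it does not depend on the particular entries): if $Ax = 0$, then each row $r_i$ of $A$ satisfies $r_i \cdot x = 0$, since that dot product is the $i$-th entry of $Ax$. Any $w$ in $\text{Row}(A)$ has the form $w = c_1 r_1 + c_2 r_2$, so $w \cdot x = c_1 (r_1 \cdot x) + c_2 (r_2 \cdot x) = 0$. Hence $x$ is orthogonal to every vector in $\text{Row}(A)$, i.e. $x \in [\text{Row}(A)]^{\perp}$.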


Theorem: Let $A$ be an $m \times n$ matrix. The orthogonal complement of the row space of $A$ is the null space of $A$; that is, $[\text{Row}(A)]^{\perp} = \text{Nul}(A)$. The orthogonal complement of the column space of $A$ is the null space of $A^T$, i.e. $[\text{Col}(A)]^{\perp} = \text{Nul}(A^T)$.

Example: Find the orthogonal complement of $\text{Col}(A)$ for the matrix $A$ with entries 5 2 1 3 3 0 2 4 1 2 2 9 0 1 1.


Section 6.2: Orthogonal Sets. Remark: We know that if $B = \{b_1, \ldots, b_p\}$ is a basis for a subspace $W$ of $\mathbb{R}^n$, then each vector $x$ in $W$ can be realized (uniquely) as a sum $x = c_1 b_1 + \cdots + c_p b_p$. If $n$ is very large, the computations needed to determine the coefficients $c_1, \ldots, c_p$ may require a lot of time (and machine memory).
Question: Can we seek a basis whose nature simplifies this task? And what properties should such a basis possess?

Orthogonal Sets. Definition: An indexed set $\{u_1, \ldots, u_p\}$ in $\mathbb{R}^n$ is said to be an orthogonal set provided each pair of distinct vectors in the set is orthogonal; that is, provided $u_i \cdot u_j = 0$ whenever $i \neq j$.
Example: Show that the set $\{(3, 1, 1),\ (-1, 2, 1),\ (-1, -4, 7)\}$ is an orthogonal set.
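A sketch of the check, labeling the vectors $u_1 = (3,1,1)$, $u_2 = (-1,2,1)$, $u_3 = (-1,-4,7)$:
$$u_1 \cdot u_2 = -3 + 2 + 1 = 0, \quad u_1 \cdot u_3 = -3 - 4 + 7 = 0, \quad u_2 \cdot u_3 = 1 - 8 + 7 = 0,$$
so each pair of distinct vectors is orthogonal.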


Orthogonal Basis. Definition: An orthogonal basis for a subspace $W$ of $\mathbb{R}^n$ is a basis that is also an orthogonal set.
Theorem: Let $\{u_1, \ldots, u_p\}$ be an orthogonal basis for a subspace $W$ of $\mathbb{R}^n$. Then each vector $y$ in $W$ can be written as the linear combination
$$y = c_1 u_1 + c_2 u_2 + \cdots + c_p u_p, \quad\text{where the weights}\quad c_j = \frac{y \cdot u_j}{u_j \cdot u_j}.$$

Example: The set $\{(3, 1, 1),\ (-1, 2, 1),\ (-1, -4, 7)\}$ is an orthogonal basis of $\mathbb{R}^3$. Express the vector $y = (2, 3, 0)$ as a linear combination of the basis vectors.
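A sketch of the computation, taking $y = (2, 3, 0)$ and $u_1, u_2, u_3$ as above:
$$c_1 = \frac{y \cdot u_1}{u_1 \cdot u_1} = \frac{9}{11}, \quad c_2 = \frac{y \cdot u_2}{u_2 \cdot u_2} = \frac{4}{6} = \frac{2}{3}, \quad c_3 = \frac{y \cdot u_3}{u_3 \cdot u_3} = \frac{-14}{66} = -\frac{7}{33},$$
so $y = \frac{9}{11}u_1 + \frac{2}{3}u_2 - \frac{7}{33}u_3$.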


Projection. Given a nonzero vector $u$, suppose we wish to decompose another nonzero vector $y$ into a sum of the form $y = \hat{y} + z$ in such a way that $\hat{y}$ is parallel to $u$ and $z$ is perpendicular to $u$.

Projection. Since $\hat{y}$ is parallel to $u$, there is a scalar $\alpha$ such that $\hat{y} = \alpha u$.
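A sketch of how $\alpha$ can be determined: with $z = y - \hat{y} = y - \alpha u$ required to be orthogonal to $u$,
$$0 = u \cdot (y - \alpha u) = u \cdot y - \alpha\,(u \cdot u), \quad\text{so}\quad \alpha = \frac{y \cdot u}{u \cdot u},$$
which leads to the projection formula below.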

Projection onto the subspace $L = \text{Span}\{u\}$. Notation: $\hat{y} = \text{proj}_L\, y = \left(\dfrac{y \cdot u}{u \cdot u}\right) u$.
Example: Let $y = (7, 6)$ and $u = (4, 2)$. Write $y = \hat{y} + z$ where $\hat{y}$ is in $\text{Span}\{u\}$ and $z$ is orthogonal to $u$.
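One possible computation: $y \cdot u = 28 + 12 = 40$ and $u \cdot u = 16 + 4 = 20$, so $\hat{y} = \frac{40}{20}\,u = (8, 4)$ and $z = y - \hat{y} = (-1, 2)$; indeed $z \cdot u = -4 + 4 = 0$.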

Example continued... Determine the distance between the point $(7, 6)$ and the line $\text{Span}\{u\}$.
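A sketch, using the decomposition above: the distance is the length of the component of $y$ orthogonal to the line, namely $\|z\| = \|(-1, 2)\| = \sqrt{5}$.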

Orthonormal Sets. Definition: A set $\{u_1, \ldots, u_p\}$ is called an orthonormal set if it is an orthogonal set of unit vectors.
Definition: An orthonormal basis of a subspace $W$ of $\mathbb{R}^n$ is a basis that is also an orthonormal set.
Example: Show that $\left\{\left(\tfrac{3}{5}, \tfrac{4}{5}\right),\ \left(-\tfrac{4}{5}, \tfrac{3}{5}\right)\right\}$ is an orthonormal basis for $\mathbb{R}^2$.
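A sketch of the check, with $u_1 = (3/5, 4/5)$ and $u_2 = (-4/5, 3/5)$: $u_1 \cdot u_2 = -\frac{12}{25} + \frac{12}{25} = 0$, while $\|u_1\|^2 = \frac{9}{25} + \frac{16}{25} = 1$ and $\|u_2\|^2 = \frac{16}{25} + \frac{9}{25} = 1$. Two orthogonal nonzero vectors in $\mathbb{R}^2$ are linearly independent, so they form a basis.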


Orthogonal Matrix. Consider the matrix $U = \begin{bmatrix} 3/5 & -4/5 \\ 4/5 & 3/5 \end{bmatrix}$ whose columns are the vectors in the last example. Compute the product $U^T U$. What does this say about $U^{-1}$?
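A sketch of the computation:
$$U^T U = \begin{bmatrix} 3/5 & 4/5 \\ -4/5 & 3/5 \end{bmatrix}\begin{bmatrix} 3/5 & -4/5 \\ 4/5 & 3/5 \end{bmatrix} = \begin{bmatrix} \tfrac{9+16}{25} & \tfrac{-12+12}{25} \\ \tfrac{-12+12}{25} & \tfrac{16+9}{25} \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix},$$
so $U$ is invertible with $U^{-1} = U^T$.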

Orthogonal Matrix. Definition: A square matrix $U$ is called an orthogonal matrix if $U^T = U^{-1}$.
Theorem: An $n \times n$ matrix $U$ is orthogonal if and only if its columns form an orthonormal basis of $\mathbb{R}^n$.
The linear transformation associated to an orthogonal matrix preserves lengths and angles in the following sense:

Theorem: Orthogonal Matrices. Let $U$ be an $n \times n$ orthogonal matrix and $x$ and $y$ vectors in $\mathbb{R}^n$. Then
(a) $\|Ux\| = \|x\|$,
(b) $(Ux) \cdot (Uy) = x \cdot y$, and in particular
(c) $(Ux) \cdot (Uy) = 0$ if and only if $x \cdot y = 0$.
Proof (of (a)):
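A sketch of the argument for (a), using $U^T U = I$:
$$\|Ux\|^2 = (Ux) \cdot (Ux) = (Ux)^T (Ux) = x^T U^T U x = x^T x = \|x\|^2,$$
and taking square roots gives $\|Ux\| = \|x\|$.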
