
5 Orthogonality

Goals: We use scalar products to find the length of a vector, the angle between two vectors, projections, and orthogonal relations between vectors and subspaces. Then we study some applications in vector spaces and linear systems, including orthonormal bases, the least squares problem, and the Gram-Schmidt orthogonalization process.

5.1 The Scalar Product in R^n

1. Def. Let x, y ∈ R^n.

   (a) The scalar product of x and y is x^T y = x_1 y_1 + ... + x_n y_n.

   (b) The Euclidean length of the vector x is ||x|| = sqrt(x_1^2 + ... + x_n^2) = sqrt(x^T x).

   (c) The distance between x and y is ||x - y|| = sqrt((x_1 - y_1)^2 + ... + (x_n - y_n)^2).

   Ex. In R^2: compute the scalar product, the lengths, and the distance for x = [3, 4]^T and y = [1, 7]^T.

   Ex. Given x ≠ 0, the vector (1/||x||) x is a unit vector. It has the same direction as x.

2. Thm 5.1. If x and y are nonzero vectors in R^n and θ is the angle between them, then x^T y = ||x|| ||y|| cos θ.

   Proof. The proof in the textbook (p. 212 in the 7th ed.) carries over to R^n.
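These definitions translate directly into NumPy. A minimal sketch, using the vectors x = (3, 4) and y = (1, 7) from the example above:

```python
import numpy as np

x = np.array([3.0, 4.0])
y = np.array([1.0, 7.0])

dot = x @ y                    # scalar product x^T y = 3*1 + 4*7 = 31
length = np.sqrt(x @ x)        # ||x|| = sqrt(x^T x) = 5  (= np.linalg.norm(x))
dist = np.linalg.norm(x - y)   # ||x - y|| = sqrt(13)

# Thm 5.1 rearranged: cos(theta) = x^T y / (||x|| ||y||)
cos_theta = dot / (np.linalg.norm(x) * np.linalg.norm(y))
print(dot, length, dist, np.degrees(np.arccos(cos_theta)))
```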

Remark: Let x and y be two nonzero vectors in R^n.

(1) The angle θ between x and y satisfies cos θ = x^T y / (||x|| ||y||).

(2) (Cauchy-Schwarz Inequality, with graph) |x^T y| ≤ ||x|| ||y||.

(3) x and y are orthogonal (x ⊥ y) iff x^T y = 0.

(4) x and y are parallel iff |x^T y| = ||x|| ||y||.

Ex. The standard basis vectors in R^3.

Ex. Ex 4 (p. 213 in the 7th ed.).

3. (Show by graphs) Given x, y, the scalar projection of x onto y is

       α = x^T (y / ||y||) = x^T y / ||y||,

   and the vector projection of x onto y is

       p = α (y / ||y||) = (x^T y / y^T y) y.

Ex. (Ex 5, p. 214 in the 7th ed.; Fig 5.1.3) Let Q be the point on the line y = (1/3)x that is closest to the point (1, 4). Determine the coordinates of Q.

Ex. (Ex 6, p. 215 in the 7th ed.) Find the equation of the plane passing through (2, 1, 3) and normal to N = [2, 3, 4]^T.

Ex. (Ex 7, p. 215 in the 7th ed.) Find the distance from (2, 0, 0) to the plane x + 2y + 2z = 0.

4. Applications of orthogonality: (skip)

Ex. Application 1 (p. 217 in the 7th ed.): Information Retrieval Revisited.

Ex. Application 2 (p. 219 in the 7th ed.): Statistics, Correlation and Covariance Matrices. (detail)

Ex. Application 3 (p. 222 in the 7th ed.): Psychology, Factor Analysis and Principal Component Analysis.
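A sketch of the two projection formulas applied to Ex 5; the only assumption beyond the text is that the line y = (1/3)x is written as the span of the direction vector d = (3, 1):

```python
import numpy as np

x = np.array([1.0, 4.0])   # the given point
d = np.array([3.0, 1.0])   # direction vector of the line y = (1/3)x

alpha = (x @ d) / np.linalg.norm(d)   # scalar projection of x onto d
Q = (x @ d) / (d @ d) * d             # vector projection: ((x^T d)/(d^T d)) d

print(Q)               # [2.1 0.7], so Q = (21/10, 7/10)
print((x - Q) @ d)     # ≈ 0: the residual x - Q is orthogonal to the line
```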

5.1.1 Homework

Sect 5.1: 1ac, 2ac, 3a, 7, 9.

5.2 Orthogonal Subspaces

1. Def. Two subspaces X, Y of R^n are orthogonal to each other, written X ⊥ Y, iff x^T y = 0 for all x ∈ X, y ∈ Y.

   Ex. Example 2 in the textbook (p. 226 in the 7th ed.).

2. Def. Let Y be a subspace of R^n. The set of vectors orthogonal to all vectors of Y forms a subspace Y^⊥, called the orthogonal complement of Y. Namely,

       Y^⊥ = { x ∈ R^n : x^T y = 0 for every y ∈ Y }.

   Remark. If X ⊥ Y, then X ∩ Y = {0}.

   Ex. In R^3, let Y = span(e_1 + e_2). Find Y^⊥.

3. Def. Given A ∈ R^{m×n}, denote the range of A by R(A). Then

       R(A) = { Ax : x ∈ R^n } = L_A(R^n) = the column space of A,
       R(A^T) = { A^T y : y ∈ R^m } = the row space of A.

   Thm 5.2 (Fundamental Subspace Theorem). For A ∈ R^{m×n},

       N(A) = R(A^T)^⊥   and   N(A^T) = R(A)^⊥.

   (It is one of the most important theorems in this chapter.)

   Proof. x ∈ N(A) ⟺ Ax = 0 ⟺ x is orthogonal to every row vector of A ⟺ x is orthogonal to the row space of A ⟺ x ∈ R(A^T)^⊥. Similarly for the other identity.
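Thm 5.2 is easy to spot-check numerically. A minimal sketch (the matrix A here is made up for illustration); a basis of the null space comes from the right singular vectors of A:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])        # rank 1

# Right singular vectors belonging to (numerically) zero singular values
# span N(A).
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-12))
N = Vt[rank:].T                        # columns form a basis of N(A)

print(np.allclose(A @ N, 0))           # True: null vectors kill every row,
                                       # i.e. N(A) is orthogonal to R(A^T)
print(rank + N.shape[1] == A.shape[1]) # True: rank A + dim N(A) = n
```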

Ex. (HW 1(b)) For the given matrix A, determine bases for R(A^T), N(A), R(A), and N(A^T).

4. Properties of orthogonal complements:

   Thm 5.3. Let S be a subspace of R^n.

   (a) dim S + dim S^⊥ = n.

   (b) R^n = S ⊕ S^⊥. (That is, every vector v ∈ R^n can be uniquely expressed as v = u + w for some u ∈ S and w ∈ S^⊥.)

   (c) (S^⊥)^⊥ = S.

   Remark on (a): If S is the row space of a matrix A ∈ R^{m×n}, then S^⊥ = N(A) and dim S = rank A. So dim S + dim S^⊥ = rank A + dim N(A) = n.

   Ex. (HW 9, p. 233 in the 7th ed.) If A ∈ R^{m×n} has rank r, what are the dimensions of N(A) and N(A^T)? (Two ways: 1. rank A + dim N(A) = n; 2. the Fundamental Subspace Theorem.)

5.2.1 Homework

p. 247: 1ad, 2, 5.

5.3 Least Squares Problems

Thm 5.4. Given a subspace S of R^n and a point b ∈ R^n, there is a unique point p ∈ S such that ||b - p|| is minimal. Indeed, p is the projection of b onto S, and (b - p) ∈ S^⊥. (graph)

For a system Ax = b with A ∈ R^{m×n}:

1. When b ∈ { Ax : x ∈ R^n } = R(A) (the column space of A in R^m), the system Ax = b has solution(s).

2. When b ∉ R(A), the system Ax = b has no solution. However, we can always find a unique p ∈ R(A) such that the distance ||b - p|| is minimal (graph). Since p ∈ R(A), there exists x̂ ∈ R^n such that p = Ax̂.

Def. A vector x̂ making ||b - Ax̂|| minimal is called a least squares solution of Ax = b.

A least squares solution x̂ of Ax = b always exists. Moreover, it coincides with a solution of Ax = b when the latter is consistent.
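The decomposition in Thm 5.3(b) can be computed with an orthogonal projector. A minimal sketch, assuming S is given as the column space of a full-column-rank matrix B (the projector formula P = B (B^T B)^{-1} B^T is justified by the normal equations derived on the next page):

```python
import numpy as np

B = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])        # S = R(B), a 2-dimensional subspace of R^3
v = np.array([3.0, -1.0, 4.0])

P = B @ np.linalg.inv(B.T @ B) @ B.T   # orthogonal projector onto S
u = P @ v                              # component in S
w = v - u                              # component in S^perp

print(np.allclose(u + w, v))     # True: v = u + w
print(np.allclose(B.T @ w, 0))   # True: w is orthogonal to every column of B
```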

Question: How do we find x̂ to minimize ||b - Ax̂||? (graph)

||b - Ax̂|| is minimal
   ⟺ (b - Ax̂) ⊥ R(A) = { Ax : x ∈ R^n }
   ⟺ (b - Ax̂) ∈ R(A)^⊥ = N(A^T)
   ⟺ A^T (b - Ax̂) = 0
   ⟺ A^T A x̂ = A^T b.

Thm 5.5. The least squares solution(s) x̂ of the system Ax = b form the solution set of A^T A x = A^T b (obtained by multiplying both sides of the original system by A^T).

Remarks:

1. p = Ax̂ is the projection of b onto R(A).

2. The vector b - Ax̂ is called the residual.

To get the least squares solutions of Ax = b:

1. Multiply both sides by A^T to get A^T A x = A^T b;

2. Reduce the augmented matrix [A^T A | A^T b] to RREF;

3. Solve the resulting system to get x̂.

Cor 5.6. If A ∈ R^{m×n} has full column rank n, then A^T A is nonsingular. So the least squares problem Ax = b has the unique solution

       A^T A x̂ = A^T b  ⟹  x̂ = (A^T A)^{-1} A^T b.

Ex. (Application 2, p. 238 in the 7th ed.) Spring constants.

Ex. (Ex 1, p. 238) Find the least squares solution of the system

        x_1 +  x_2 = 3
      -2x_1 + 3x_2 = 1
       2x_1 -  x_2 = 2.

Ex. The driving data of a vehicle is tabulated as local miles, highway miles, and gallons of gas used per trip. Find the local and highway mileages.
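A sketch of the recipe above on the system of Ex 1, solving the normal equations directly rather than by row reduction:

```python
import numpy as np

A = np.array([[ 1.0,  1.0],
              [-2.0,  3.0],
              [ 2.0, -1.0]])
b = np.array([3.0, 1.0, 2.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # A has full column rank (Cor 5.6)
residual = b - A @ x_hat

print(x_hat)                           # [1.66 1.42], i.e. (83/50, 71/50)
print(np.allclose(A.T @ residual, 0))  # True: A^T (b - A x̂) = 0
```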

Suppose the vehicle gets p mi/gal locally and q mi/gal on the highway. A trip with ℓ local miles, h highway miles, and g gallons of gas gives the equation

       ℓ/p + h/q = g,

which is linear in the unknowns 1/p and 1/q. So we solve the least squares problem for (1/p, 1/q): form the normal equations

       A^T A [1/p; 1/q] = A^T b

from the data, solve for (1/p, 1/q), and invert. For the data above this gives p ≈ 21 mi/gal (local) and q ≈ 30 mi/gal (highway). (Refer to the textbook.)

Given a table of data

       x : x_1, x_2, ..., x_m
       y : y_1, y_2, ..., y_m,

we try to find a linear function y = c_0 + c_1 x that best fits the data. If we set

       y_i = c_0 + c_1 x_i   for i = 1, ..., m,

then we obtain the system

       [ 1  x_1 ]             [ y_1 ]
       [ 1  x_2 ]  [ c_0 ]    [ y_2 ]
       [   ...  ]  [ c_1 ]  = [ ... ]
       [ 1  x_m ]             [ y_m ]

The least squares solution of this system produces the linear function y = c_0 + c_1 x that best fits the data.

Ex. (Ex 2, p. 240 in the 7th ed.) Find the least squares fit of the given data by a linear function.

5.3.1 Homework

Sect 5.3: 1c, 2c, 3, 4, 5.
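The straight-line fit is the same normal-equations computation with coefficient rows [1, x_i]. A minimal sketch; the data points are made up for illustration (Ex 2's table is in the textbook):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])     # made-up sample data
y = np.array([1.1, 1.9, 3.2, 3.8])

A = np.column_stack([np.ones_like(x), x])    # rows are [1, x_i]
c0, c1 = np.linalg.solve(A.T @ A, A.T @ y)   # least squares for (c0, c1)

print(f"best fit: y = {c0:.3f} + {c1:.3f} x")
```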

5.4 Inner Product Spaces

1. Examples: the standard inner product on R^n (the scalar product); the standard inner product on R^{m×n}. (3 conditions)

2. Let V be a vector space. An inner product is a map ⟨·, ·⟩ : V × V → R such that for every x, y ∈ V there is a real number ⟨x, y⟩ satisfying

   (a) ⟨x, x⟩ ≥ 0, with equality iff x = 0;

   (b) ⟨x, y⟩ = ⟨y, x⟩ for all x, y ∈ V;

   (c) ⟨αx + βy, z⟩ = α⟨x, z⟩ + β⟨y, z⟩ for all x, y, z ∈ V and all α, β ∈ R.

   Inner product space: a vector space with an inner product.

   Orthogonality: u ⊥ v iff ⟨u, v⟩ = 0.

   Norm: ||v|| = sqrt(⟨v, v⟩).

   Caution: Different inner products induce different norms and different orthogonality relations.

   Ex. In R^3, define ⟨x, y⟩ = 3 x_1 y_1 + 2 x_2 y_2 + x_3 y_3. It is an inner product (check it!). The norm of x = (1, 0, 0)^T is ||x|| = sqrt(⟨x, x⟩) = sqrt(3).

   Ex. Example 2 (p. 248 in the 7th ed.): the standard basis for Fourier analysis.

   Thm 5.7 (The Pythagorean Law). If u and v are orthogonal in an inner product space (i.e. ⟨u, v⟩ = 0), then ||u + v||^2 = ||u||^2 + ||v||^2.

   Thm 5.8 (The Cauchy-Schwarz Inequality). In an inner product space, |⟨u, v⟩| ≤ ||u|| ||v||.

3. (skip) A vector space V is a normed linear space iff for every v ∈ V there is a real number ||v|| ∈ R (called the norm of v) such that:

   (a) ||v|| ≥ 0, with equality iff v = 0;

   (b) ||αv|| = |α| ||v|| for all α ∈ R;

   (c) ||v + w|| ≤ ||v|| + ||w|| for all v, w ∈ V.
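The weighted inner product of the example can be coded as x^T W y with a diagonal weight matrix W. A sketch checking the norm computed above and illustrating the caution about orthogonality:

```python
import numpy as np

W = np.diag([3.0, 2.0, 1.0])    # weights of <x, y> = 3x1y1 + 2x2y2 + x3y3

def inner(x, y):
    return x @ W @ y

def norm(x):
    return np.sqrt(inner(x, x))

print(norm(np.array([1.0, 0.0, 0.0])))   # sqrt(3) ≈ 1.732, not 1

# (1,1,0) and (1,-1,0) are orthogonal in the scalar product but not here:
u = np.array([1.0, 1.0, 0.0])
v = np.array([1.0, -1.0, 0.0])
print(inner(u, v))                       # 3 - 2 = 1, not 0
```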

5.5 Orthonormal Sets

1. Let (V, ⟨·, ·⟩) be an inner product space and {v_1, ..., v_n} ⊂ V.

   Def. {v_1, ..., v_n} is an orthogonal set of vectors if ⟨v_i, v_j⟩ = 0 whenever i ≠ j.

   Def. {v_1, ..., v_n} is an orthonormal set (ON set) of vectors if

       ⟨v_i, v_j⟩ = 0 when i ≠ j,   and   ⟨v_i, v_j⟩ = 1 when i = j.

   So an orthonormal set is an orthogonal set of unit vectors.

   Ex. Define ⟨x, y⟩ := x^T y. The standard basis {e_1, e_2, e_3} and the set {v_1, v_2, v_3} from the lecture example are both orthogonal sets; however, only {e_1, e_2, e_3} is an orthonormal set. The set {v_1, v_2, v_3} leads to the orthonormal set { v_1/||v_1||, v_2/||v_2||, v_3/||v_3|| }.

   When {v_1, ..., v_n} is an orthogonal set of nonzero vectors, we get an orthonormal set {u_1, ..., u_n} by u_i := v_i / ||v_i||.

   R^n with the scalar product: write U = [u_1, ..., u_k]. Then

   (a) {u_1, ..., u_k} is an orthogonal set iff U^T U is a diagonal matrix;

   (b) {u_1, ..., u_k} is an orthonormal set iff U^T U = I_k.

   Ex. Show that { [cos θ, sin θ]^T, [-sin θ, cos θ]^T } is an orthonormal set for any θ.

2. Def. A basis that is an orthonormal set is an orthonormal basis (ON basis).

   An ON basis is like the standard basis in R^n: vectors in an ON set are linearly independent, and every vector can easily be written as a linear combination of the ON basis vectors.

   Thm 5.9. If [u_1, ..., u_n] is an ON basis of V, then every b ∈ V can be expressed as

       b = Σ_{i=1}^{n} ⟨b, u_i⟩ u_i.
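A quick check of the criterion U^T U = I and of the expansion in Thm 5.9, using the rotated basis from the example above:

```python
import numpy as np

theta = 0.7    # any angle works
u1 = np.array([np.cos(theta), np.sin(theta)])
u2 = np.array([-np.sin(theta), np.cos(theta)])
U = np.column_stack([u1, u2])

print(np.allclose(U.T @ U, np.eye(2)))   # True: {u1, u2} is an ON set

b = np.array([2.0, -1.0])
coeffs = U.T @ b                         # the coefficients <b, u_i>
print(np.allclose(U @ coeffs, b))        # True: b = sum_i <b, u_i> u_i
```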

It is a special case of the next theorem.

Projection of a vector onto a subspace:

Thm 5.10. Let [u_1, ..., u_m] be an ON basis of a subspace S of V. The projection of b ∈ V onto S is (Fig 5.5.2)

       p = Σ_{i=1}^{m} ⟨b, u_i⟩ u_i.

Proof. p ∈ S = span(u_1, ..., u_m). Check that (b - p) ⊥ u_i for i = 1, ..., m.

ON sets help solve least squares problems.

Thm 5.11. If the columns of A are orthonormal, then the least squares solution of Ax = b is x̂ = A^T b.

With an ON basis, the inner product works like the scalar product.

Thm 5.12. If [u_1, ..., u_n] is an ON basis, then for u = Σ_{i=1}^{n} a_i u_i and v = Σ_{i=1}^{n} b_i u_i, the inner products satisfy

       ⟨u, v⟩ = Σ_{i=1}^{n} a_i b_i,   ||u||^2 = ⟨u, u⟩ = Σ_{i=1}^{n} a_i^2.

(proof)

3. Orthogonal Matrices

   Def. A square matrix Q ∈ R^{n×n} is called an orthogonal matrix if Q^T Q = I_n.

   Equivalent definitions:

   (a) QQ^T = I_n.

   (b) Q^T = Q^{-1}.

   (c) The column vectors of Q form an ON basis of R^n (w.r.t. the scalar product).

   (d) The row vectors of Q form an ON basis of R^n.

   Ex. Q = [cos θ, -sin θ; sin θ, cos θ] is an orthogonal matrix. The linear transformation L(x) = Qx is the rotation of R^2 through the angle θ.

   Ex. Permutation matrices are orthogonal matrices.

5.5.1 Homework

Sect 5.5: 1, 2, 3, 6, 11.
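A sketch of Thm 5.11: for a matrix with orthonormal columns, the least squares solution needs only a matrix-vector product. The test matrix here is made up, with orthonormal columns manufactured by a QR factorization:

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.linalg.qr(rng.standard_normal((4, 2)))[0]   # 4x2, orthonormal columns
b = rng.standard_normal(4)

x_hat = A.T @ b    # Thm 5.11: no linear system to solve

print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))  # True
print(np.allclose(A.T @ (b - A @ x_hat), 0))   # residual is orthogonal to R(A)
```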

5.6 The Gram-Schmidt Orthogonalization Process

1. Goal: Given a basis [x_1, ..., x_n] of an inner product space, we construct an orthonormal basis [u_1, ..., u_n] such that

       span(u_1, ..., u_k) = span(x_1, ..., x_k)   for k = 1, ..., n.

2. Methodology: Given an ON set [u_1, ..., u_k] and a vector v outside the subspace S := span(u_1, ..., u_k), we construct the projection p = Σ_{i=1}^{k} ⟨v, u_i⟩ u_i of v onto S, and the unit vector

       u_{k+1} := (v - p) / ||v - p||,

   which is orthogonal to S. In this way we extend the ON set [u_1, ..., u_k] to a new ON set [u_1, ..., u_k, u_{k+1}]. (Show by figure.)

   Thm 5.13 (Gram-Schmidt Process). Let [x_1, ..., x_n] be a basis of the inner product space V. Let

       u_1 = x_1 / ||x_1||,

   and for k = 1, ..., n - 1:

       p_k = ⟨x_{k+1}, u_1⟩ u_1 + ... + ⟨x_{k+1}, u_k⟩ u_k,
       u_{k+1} = (x_{k+1} - p_k) / ||x_{k+1} - p_k||.

   Then [u_1, ..., u_n] is an ON basis of V such that

       span(u_1, ..., u_k) = span(x_1, ..., x_k)   for k = 1, ..., n.

   Ex. Example 2 (p. 276 in the 7th ed.).

   Ex. HW 5(a) (p. 282 in the 7th ed.).

   Thm 5.14 (QR Factorization). If A ∈ R^{m×n} has full column rank n, then A = QR, where the columns of Q ∈ R^{m×n} are orthonormal and R ∈ R^{n×n} is a nonsingular upper triangular matrix.

   Proof. Apply the Gram-Schmidt process to the columns a_1, ..., a_n of A. We get an ON set [u_1, ..., u_n] with span(u_1, ..., u_k) = span(a_1, ..., a_k) for k = 1, ..., n. So each a_k is a linear combination of u_1, ..., u_k, say

       a_1 = r_{11} u_1,
       a_2 = r_{12} u_1 + r_{22} u_2,
       ...
       a_n = r_{1n} u_1 + ... + r_{nn} u_n.
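A minimal implementation of the process in Thm 5.13 for R^m with the scalar product; it returns both the ON basis (as the columns of Q) and the coefficient matrix R of Thm 5.14:

```python
import numpy as np

def gram_schmidt(A):
    """Classical Gram-Schmidt on the columns of A (assumed full column rank)."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for k in range(n):
        v = A[:, k].copy()
        for i in range(k):
            R[i, k] = Q[:, i] @ A[:, k]   # r_ik = <a_k, u_i>
            v -= R[i, k] * Q[:, i]        # subtract the projection onto u_i
        R[k, k] = np.linalg.norm(v)       # positive, by full column rank
        Q[:, k] = v / R[k, k]             # u_k = (a_k - p_{k-1}) / ||a_k - p_{k-1}||
    return Q, R

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = gram_schmidt(A)
print(np.allclose(Q @ R, A))             # True: A = QR
print(np.allclose(Q.T @ Q, np.eye(2)))   # True: columns of Q are orthonormal
```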

Then

                            [ r_{11}  r_{12}  ...  r_{1n} ]
   A = [u_1, ..., u_n]  ·   [   0     r_{22}  ...  r_{2n} ]   = QR,
                            [  ...     ...    ...   ...   ]
                            [   0      0      ...  r_{nn} ]

where Q = [u_1, ..., u_n] and R is the upper triangular coefficient matrix above. The Gram-Schmidt process guarantees that the diagonal entries of R are positive.

Ex. HW 5(b) (p. 282).

Ex. Example 3 (p. 278 in the 7th ed.). Brief introduction only.

3. (skip) Use the Gram-Schmidt QR factorization to solve the least squares problem.

   Thm 5.15 (5.6.3, p. 295). If A = QR is the QR factorization, then the least squares solution of Ax = b is given by

       x̂ = R^{-1} Q^T b.

   Ex. Example 4 (cf. Example 3).

   Ex. HW 5(c).

5.6.1 Homework

Sect 5.6: 3, 8.
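A sketch of Thm 5.15 on Ex 1's system from Section 5.3; np.linalg.qr plays the role of Gram-Schmidt here (its R may carry different signs on the diagonal, but x̂ is the same):

```python
import numpy as np

A = np.array([[ 1.0,  1.0],
              [-2.0,  3.0],
              [ 2.0, -1.0]])
b = np.array([3.0, 1.0, 2.0])

Q, R = np.linalg.qr(A)               # reduced QR: Q is 3x2, R is 2x2
x_hat = np.linalg.solve(R, Q.T @ b)  # x̂ = R^{-1} Q^T b (in practice a back
                                     # substitution, since R is triangular)

print(x_hat)                         # matches the normal-equations answer
```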
