Linear Algebra 18 Orthogonality


Wei-Shi Zheng, wszheng@ieee.org, 2011

1 What Do You Learn from This Note

We return to the unit vectors introduced in Chapter 1:

    e_1 = (1, 0, 0)^T,  e_2 = (0, 1, 0)^T,  e_3 = (0, 0, 1)^T.    (1)

Question: have you tried to compute inner products such as e_1 · e_2, e_1 · e_3 and e_2 · e_3? You will find that

    e_1 · e_2 = 0,  e_1 · e_3 = 0,  e_2 · e_3 = 0.

That is, the three standard basis vectors are orthogonal, and more precisely orthonormal. A basis of this type has a special name: an orthogonal (orthonormal) basis. We introduce it comprehensively in this lecture note.

Basic concepts: orthogonal set (正交集), orthogonal basis (正交基), orthogonal matrix (正交矩阵), orthogonal projection (正交投影), Gram-Schmidt process (格拉姆-施密特正交化).

2 Orthogonal Basis

2.1 Definition

Definition 1 (orthogonal set (正交集)). Let S = {v_1, ..., v_r} ⊆ R^n \ {0}. We say S is an orthogonal set iff for any i, j = 1, ..., r with i ≠ j we have v_i · v_j = 0. Furthermore, if v_1, ..., v_r are all unit vectors, then S is called an orthonormal set.
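
A quick numerical check of this observation, as a minimal NumPy sketch (the code is illustrative and not part of the original note):

```python
import numpy as np

# The standard basis of R^3: the rows (equivalently columns) of the identity.
e1, e2, e3 = np.eye(3)

# All pairwise dot products vanish, so {e1, e2, e3} is an orthogonal set...
print(e1 @ e2, e1 @ e3, e2 @ e3)                    # 0.0 0.0 0.0

# ...and every vector has unit length, so the set is in fact orthonormal.
print([np.linalg.norm(e) for e in (e1, e2, e3)])    # [1.0, 1.0, 1.0]
```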

Example: Textbook P384.

Definition 2 (orthogonal basis (正交基)). A basis B of R^n which is also an orthogonal set is called an orthogonal basis. Furthermore, B is called an orthonormal basis if it is an orthonormal set.

Example: The standard basis E = {e_1, ..., e_n} is an orthonormal basis of R^n.

Example: Textbook P.

2.2 Some Properties

Connection between orthogonal sets and linear independence:

Theorem 3. Let S = {v_1, ..., v_r} be an orthogonal set. Then S is linearly independent.

Proof. Let c_1, ..., c_r be scalars and suppose that 0 = c_1 v_1 + ... + c_r v_r. Then for each i = 1, ..., r, we have

    0 = v_i · 0 = v_i · (c_1 v_1 + ... + c_r v_r) = c_1 (v_i · v_1) + ... + c_r (v_i · v_r) = c_i (v_i · v_i).

Since v_i · v_i ≠ 0, we must have c_i = 0 and the result follows.

Orthogonal matrix (正交矩阵):

Theorem 4. Let U = (v_1 ... v_r) ∈ R^{n×r}. Then

1. {v_1, ..., v_r} is orthogonal iff U^T U is invertible and diagonal;
2. {v_1, ..., v_r} is orthonormal iff U^T U = I_r.

Proof. 1. {v_1, ..., v_r} is orthogonal iff

    [U^T U]_ij = v_i^T v_j = v_i · v_j = { ||v_i||^2 ≠ 0  if i = j;  0  otherwise },

which holds iff U^T U is invertible and diagonal.

2. Similar to 1.
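
The following NumPy sketch illustrates Theorem 4 on a small example (the vectors are chosen here purely for illustration):

```python
import numpy as np

# Two orthogonal (but not unit-length) vectors in R^3, as columns of U.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 2.0])
U = np.column_stack([v1, v2])

# Theorem 4, part 1: orthogonal columns give a diagonal, invertible U^T U.
print(U.T @ U)                        # [[2. 0.], [0. 6.]]

# Theorem 4, part 2: after normalising each column, U^T U = I_2.
Un = U / np.linalg.norm(U, axis=0)
print(np.round(Un.T @ Un, 12))        # the 2x2 identity
```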

Definition 5 (orthogonal matrix). A matrix U ∈ R^{n×n} is said to be orthogonal iff U^T U = I_n (or equivalently, U^T = U^{-1}).

Example: For any θ ∈ R, the rotation matrix

    ( cos θ   -sin θ
      sin θ    cos θ )

is orthogonal.

Theorem 6.

1. I_n is orthogonal.
2. If U ∈ R^{n×n} is orthogonal then so is U^{-1}.
3. If U_1, U_2 ∈ R^{n×n} are orthogonal then so is U_1 U_2.

(So the set of orthogonal matrices of size n is a group under multiplication.)

Proof. Easy, left as an exercise.

Why is an orthogonal basis useful?

Theorem 7. Let B = {b_1, ..., b_n} be an orthogonal basis of R^n. Then for any v ∈ R^n we have

    [v]_B = ( (v · b_1)/(b_1 · b_1), ..., (v · b_n)/(b_n · b_n) )^T.

So if B is orthonormal, then

    [v]_B = ( v · b_1, ..., v · b_n )^T.

Proof. Suppose [v]_B = (c_1, ..., c_n)^T. Then for i = 1, ..., n, we have

    v · b_i = (c_1 b_1 + ... + c_n b_n) · b_i = c_1 (b_1 · b_i) + ... + c_n (b_n · b_i) = c_i (b_i · b_i).

So c_i = (v · b_i)/(b_i · b_i).

Example: Textbook P385.
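
Theorem 7 says that with an orthogonal basis, coordinates are read off from dot products instead of solving a linear system. A short NumPy check (the basis and target vector are made up for illustration):

```python
import numpy as np

# An orthogonal (not orthonormal) basis of R^2 and a target vector v.
b1 = np.array([1.0, 1.0])
b2 = np.array([1.0, -1.0])
v  = np.array([3.0, 5.0])

# Theorem 7: each coordinate is (v . b_i) / (b_i . b_i).
c = np.array([(v @ b1) / (b1 @ b1), (v @ b2) / (b2 @ b2)])
print(c)                                 # [ 4. -1.]

# Cross-check by solving B [v]_B = v directly.
B = np.column_stack([b1, b2])
print(np.linalg.solve(B, v))             # [ 4. -1.]
```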

Theorem 8. Let B = {b_1, ..., b_n} be an orthonormal basis of R^n. Then for any u, v ∈ R^n,

1. u · v = [u]_B · [v]_B;
2. ||v|| = ||[v]_B||;
3. d(u, v) = d([u]_B, [v]_B).

This means that the coordinate isomorphism with respect to B preserves the dot product, norm and distance; in other words, the geometric structure of R^n is preserved.

Proof. 1.

    [u]_B · [v]_B = (u · b_1)(v · b_1) + ... + (u · b_n)(v · b_n)
                  = u · (v · b_1) b_1 + ... + u · (v · b_n) b_n
                  = u · ((v · b_1) b_1 + ... + (v · b_n) b_n)
                  = u · v,

where the last step uses v = (v · b_1) b_1 + ... + (v · b_n) b_n, by Theorem 7.

2 and 3 are derived from 1 directly.

Theorem 9. (u_1 ... u_n) is an orthogonal matrix iff {u_1, ..., u_n} is an orthonormal basis of R^n.

Theorem 10. Let B be an orthonormal basis. Then B' is an orthonormal basis iff [B']_B is orthogonal.

Proof. Suppose that B' = {b'_1, ..., b'_n}. Then

    B' is orthonormal ⟺ {b'_1, ..., b'_n} is orthonormal
                      ⟺ {[b'_1]_B, ..., [b'_n]_B} is orthonormal   [by Theorem 8]
                      ⟺ ([b'_1]_B ... [b'_n]_B) is orthogonal.
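
A numerical illustration of Theorem 8, using a random orthonormal basis obtained from a QR factorisation (a sketch; any orthonormal basis would do):

```python
import numpy as np

rng = np.random.default_rng(0)

# The columns of Q form a random orthonormal basis B of R^4.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
u = rng.standard_normal(4)
v = rng.standard_normal(4)

# For an orthonormal basis, [v]_B = Q^T v (Theorem 7).
uB, vB = Q.T @ u, Q.T @ v

# Theorem 8: dot product, norm and distance are all preserved.
print(np.isclose(u @ v, uB @ vB))                                  # True
print(np.isclose(np.linalg.norm(v), np.linalg.norm(vB)))           # True
print(np.isclose(np.linalg.norm(u - v), np.linalg.norm(uB - vB)))  # True
```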

3 Orthogonal Projection (正交投影)

(Note: you are expected to fully master the proofs below.)

Theorem 11 (The Orthogonal Decomposition Theorem). Let W be a subspace of R^n. Then each y in R^n can be written uniquely in the form

    y = ŷ + z,

where ŷ is in W and z is in W^⊥. In fact, if {u_1, ..., u_p} is any orthogonal basis of W, then

    ŷ = (y · u_1)/(u_1 · u_1) u_1 + ... + (y · u_p)/(u_p · u_p) u_p   and   z = y - ŷ.

Proof. Sketch of the proof:

1. Prove ŷ ∈ W (it is a linear combination of the basis vectors of W).
2. Prove z ∈ W^⊥ (z · u_i = 0 for all u_i).
3. Prove the uniqueness of the decomposition.

Definition 12 (orthogonal projection). The orthogonal projection of y onto a subspace W is

    proj_W y = (y · u_1)/(u_1 · u_1) u_1 + ... + (y · u_p)/(u_p · u_p) u_p,

where {u_1, ..., u_p} is any orthogonal basis of W.

Theorem 13. If {u_1, ..., u_p} is any orthonormal basis of a subspace W of R^n, then

    proj_W y = (y · u_1) u_1 + ... + (y · u_p) u_p.

If U = [u_1, ..., u_p], then proj_W y = U U^T y.

If we replace the subspace W in the above theorem by the special subspace L = Span{u}, we still have:
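
Theorem 13 in NumPy: with orthonormal columns, projecting onto W = Col(U) is a single matrix product U U^T y (a minimal sketch with a hand-picked basis):

```python
import numpy as np

# An orthonormal basis {u1, u2} of a plane W in R^3.
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([0.0, 0.0, 1.0])
U = np.column_stack([u1, u2])

y = np.array([1.0, 2.0, 3.0])

# Orthogonal decomposition y = y_hat + z, with y_hat in W and z in W-perp.
y_hat = U @ (U.T @ y)                  # proj_W y = U U^T y  (Theorem 13)
z = y - y_hat

print(y_hat)                           # [1.5 1.5 3. ]
print(z)                               # [-0.5  0.5  0. ]
print(np.round(U.T @ z, 12))           # [0. 0.]: z is orthogonal to W
```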

Theorem 14. For any vector y ∈ R^n, we have

    proj_L y = (y · u)/(u · u) u,

and z = y - proj_L y is orthogonal to u in R^n. We call proj_L y the orthogonal projection of y onto L.

Theorem 15 (The Best Approximation Theorem). Let W be a subspace of R^n and let y be any vector in R^n. Then proj_W y is the closest point in W to y, in the sense that

    ||y - proj_W y|| < ||y - v||

for all v in W distinct from ŷ.

Proof. Since y - v = (y - proj_W y) + (proj_W y - v), with y - proj_W y ∈ W^⊥ and proj_W y - v ∈ W, the Pythagorean theorem gives

    ||y - v||^2 = ||y - proj_W y||^2 + ||proj_W y - v||^2.

That is, ||y - v||^2 ≥ ||y - proj_W y||^2, with equality only when v = proj_W y.
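
The theorem can also be probed numerically: among many randomly sampled points of W, none comes closer to y than the projection (an illustrative Monte Carlo check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)

# W = Col(U) for a random U with orthonormal columns.
U, _ = np.linalg.qr(rng.standard_normal((5, 2)))
y = rng.standard_normal(5)

y_hat = U @ (U.T @ y)                      # proj_W y
best = np.linalg.norm(y - y_hat)

# Sample 1000 points v = U c of W; the projection always wins.
dists = [np.linalg.norm(y - U @ rng.standard_normal(2)) for _ in range(1000)]
print(best <= min(dists))                  # True
```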

4 Where to use the Best Approximation Theorem

Question: Do we simply do nothing when the matrix equation A x = b has no solution, as often happens in real scenarios? No!

Let A ∈ R^{m×n} and b ∈ R^m. If there is no solution to the matrix equation A x = b, we are still required to find x ∈ R^n such that A x is the best approximation of b in some sense. A feasible approach to this problem is to find x such that the distance between b and A x is as small as possible. We formulate this idea as follows:

Definition 16 (the least squares solution, version 1). Let A ∈ R^{m×n} and b ∈ R^m. Then a least squares solution of A x = b is an x̂ ∈ R^n such that

    ||b - A x̂|| ≤ ||b - A x||   for all x ∈ R^n.

The distance d(b, A x̂) = ||b - A x̂|| is called the least squares error of A x = b.

Define W = Col(A). Notice that w = A x for some x ∈ R^n iff w ∈ W, and by Theorem 15 the closest vector to b among all vectors in W is the orthogonal projection b_W of b onto W, which is unique. So Definition 16 can be recast equivalently as:

Definition 17 (the least squares solution, version 2). Let A ∈ R^{m×n} and b ∈ R^m. Then a least squares solution of A x = b is an x̂ ∈ R^n such that A x̂ = b_W, where W = Col(A).
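
For a concrete feel, here is an inconsistent 3×2 system and its least squares solution computed with NumPy's built-in solver (the numbers are an illustrative example, not taken from the textbook):

```python
import numpy as np

# An inconsistent system: b does not lie in Col(A).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# np.linalg.lstsq returns a least squares solution x_hat.
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_hat)                             # [ 5. -3.]
print(np.linalg.norm(b - A @ x_hat))     # least squares error, sqrt(6)
```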

Question: How to compute the least squares solution?

The least squares solution set of a matrix equation over R is identified completely as follows:

Theorem 18. Let A ∈ R^{m×n} and b ∈ R^m. Then the least squares solution set of the equation A x = b is precisely the solution set of the equation

    A^T A x = A^T b.

Proof. Define W = Col(A). Then

    A x̂ = b_W ⟺ b - A x̂ ∈ W^⊥
              ⟺ b - A x̂ ∈ Nul(A^T)   [since W^⊥ = Nul(A^T)]
              ⟺ A^T (b - A x̂) = 0
              ⟺ A^T A x̂ = A^T b.

So the result follows.

Examples: Textbook P411, P412, P413.

Remark: If {a_1, ..., a_n} is an orthogonal set, then

    b_W = (b · a_1)/(a_1 · a_1) a_1 + ... + (b · a_n)/(a_n · a_n) a_n.

So x̂ can be written down directly as

    x̂ = ( (b · a_1)/(a_1 · a_1), ..., (b · a_n)/(a_n · a_n) )^T.

Example: Textbook P414.
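
Theorem 18 applied to the system above: solving the normal equations reproduces the least squares solution, and the residual is orthogonal to Col(A) (a minimal sketch):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Normal equations A^T A x = A^T b (Theorem 18).
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print(x_hat)                                 # [ 5. -3.]

# The residual b - A x_hat lies in Nul(A^T) = Col(A)-perp.
print(np.round(A.T @ (b - A @ x_hat), 12))   # [0. 0.]
```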

Question: What is the case when A^T A is invertible?

If A^T A is invertible, then the matrix equation A x = b has a unique least squares solution, namely (A^T A)^{-1} A^T b. But this is not always the case. The next theorem gives a necessary and sufficient condition for A^T A to be invertible.

Lemma 19. Let A ∈ R^{m×n}. Then

1. Nul(A) = Nul(A^T A);
2. rank(A) = rank(A^T A).

Proof. 1. Suppose that x ∈ Nul(A), namely A x = 0. Then A^T A x = A^T 0 = 0. So x ∈ Nul(A^T A), that is, Nul(A) ⊆ Nul(A^T A). Conversely, suppose that x ∈ Nul(A^T A), namely A^T A x = 0. Then

    ||A x||^2 = (A x) · (A x) = (A x)^T (A x) = x^T (A^T A x) = 0,

which forces A x = 0. So x ∈ Nul(A) and Nul(A^T A) ⊆ Nul(A).

2. By the Rank-Nullity Theorem,

    rank(A) = n - dim Nul(A) = n - dim Nul(A^T A) = rank(A^T A).

Theorem 20. Let A = (a_1 ... a_n) ∈ R^{m×n}. Then A^T A is invertible iff {a_1, ..., a_n} is linearly independent.

Proof. Notice that A^T A ∈ R^{n×n}. Then

    A^T A is invertible ⟺ rank(A^T A) = n
                        ⟺ rank(A) = n   [by Lemma 19]
                        ⟺ dim Col(A) = dim Span{a_1, ..., a_n} = n
                        ⟺ {a_1, ..., a_n} is linearly independent.
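
Lemma 19(2) and Theorem 20 can be observed directly (a small NumPy sketch; the matrices are random or constructed purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Independent columns: rank(A) = rank(A^T A) = n, so A^T A is invertible.
A = rng.standard_normal((5, 3))
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A.T @ A))   # 3 3

# Force a dependent column: A^T A becomes singular.
A[:, 2] = A[:, 0] + A[:, 1]
print(np.linalg.matrix_rank(A.T @ A))                # 2
print(np.isclose(np.linalg.det(A.T @ A), 0.0))       # True
```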

5 Constructing an Orthogonal Basis by the Gram-Schmidt Process (格拉姆-施密特正交化)

Question: We have seen the usefulness of orthonormal/orthogonal bases. But how can we generate them?

(Note: this section shows how to construct an orthogonal basis from a given set of vectors.)

We shall show this by means of the Gram-Schmidt process.

Theorem 21 (the Gram-Schmidt process). Let {w_1, ..., w_r} ⊆ R^n be a linearly independent set. Then we can construct an orthogonal set {v_1, ..., v_r} ⊆ R^n such that

    Span{v_1, ..., v_r} = Span{w_1, ..., w_r}.

Furthermore, by normalising {v_1, ..., v_r}, we obtain an orthonormal set {u_1, ..., u_r} ⊆ R^n such that Span{u_1, ..., u_r} = Span{w_1, ..., w_r}.

Proof. We shall prove the first part of the theorem by induction on r.

1. STEP 1: r = 1. Then we can take v_1 = w_1 and the result is trivial.

2. STEP 2: Inductive Hypothesis. Assume the result holds for r - 1.

3. STEP 3: Inductive Step. Since {w_1, ..., w_{r-1}} is linearly independent, by the Inductive Hypothesis we can construct an orthogonal set {v_1, ..., v_{r-1}} such that Span{v_1, ..., v_{r-1}} = Span{w_1, ..., w_{r-1}}. Now v_i ≠ 0 for all i = 1, ..., r - 1, so

    v_r = w_r - (w_r · v_1)/(v_1 · v_1) v_1 - ... - (w_r · v_{r-1})/(v_{r-1} · v_{r-1}) v_{r-1}

is well defined. It is obvious that

    Span{v_1, ..., v_r} = Span{w_1, ..., w_r},

so v_r ≠ 0. Also, for i = 1, ..., r - 1, we have

    v_r · v_i = w_r · v_i - (w_r · v_1)/(v_1 · v_1) (v_1 · v_i) - ... - (w_r · v_{r-1})/(v_{r-1} · v_{r-1}) (v_{r-1} · v_i)
              = w_r · v_i - (w_r · v_i)/(v_i · v_i) (v_i · v_i)
              = 0.

Therefore {v_1, ..., v_r} is an orthogonal set. Finally, for i = 1, ..., r, define u_i = v_i / ||v_i||. Then {u_1, ..., u_r} is orthonormal and Span{u_1, ..., u_r} = Span{w_1, ..., w_r}.

The Gram-Schmidt Process: According to the proof of the above theorem, we can write down v_1, ..., v_r and u_1, ..., u_r directly as follows:

    v_1 = w_1,
    v_2 = w_2 - (w_2 · v_1)/(v_1 · v_1) v_1,
    ...
    v_i = w_i - (w_i · v_1)/(v_1 · v_1) v_1 - ... - (w_i · v_{i-1})/(v_{i-1} · v_{i-1}) v_{i-1},
    ...
    v_r = w_r - (w_r · v_1)/(v_1 · v_1) v_1 - ... - (w_r · v_{r-1})/(v_{r-1} · v_{r-1}) v_{r-1},

and

    u_1 = v_1/||v_1||,  u_2 = v_2/||v_2||,  ...,  u_r = v_r/||v_r||.

This process of writing down v_1, ..., v_r and u_1, ..., u_r is called the Gram-Schmidt process, which can be used to create an orthonormal basis for any subspace of R^n.

Examples: Textbook P402, P405.
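
The formulas above translate directly into code. A minimal NumPy implementation of the process (assuming the input columns are linearly independent; no re-orthogonalisation or pivoting, as a production routine would have):

```python
import numpy as np

def gram_schmidt(W):
    """Return U whose columns are the orthonormalised columns of W."""
    n, r = W.shape
    V = np.zeros((n, r))
    for i in range(r):
        # v_i = w_i minus its projections onto v_1, ..., v_{i-1}.
        v = W[:, i].copy()
        for j in range(i):
            v -= (W[:, i] @ V[:, j]) / (V[:, j] @ V[:, j]) * V[:, j]
        V[:, i] = v
    # Normalise: u_i = v_i / ||v_i||.
    return V / np.linalg.norm(V, axis=0)

rng = np.random.default_rng(3)
W = rng.standard_normal((5, 3))
U = gram_schmidt(W)
print(np.round(U.T @ U, 12))    # the 3x3 identity: columns are orthonormal
```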

QR Decomposition (an application of Gram-Schmidt): An application of Gram-Schmidt is to decompose a matrix A whose columns are linearly independent into the form

    A = QR,

where Q is a matrix whose columns form an orthonormal basis for Col(A) and R is an upper triangular invertible matrix with positive entries on its diagonal. More specifically, we have the following theorem:

Theorem 22 (QR factorization (QR 分解)). Let A = (w_1 ... w_r) ∈ R^{m×r} with rank(A) = r. Then A can be factorised as A = QR, where Q ∈ R^{m×r} is such that the columns of Q form an orthonormal basis of Col(A), and R ∈ R^{r×r} is an upper triangular matrix whose diagonal entries are positive.

Proof. Since rank(A) = r, the set {w_1, ..., w_r} is linearly independent. So by Theorem 21 we can construct an orthogonal set {v_1, ..., v_r} and an orthonormal set {u_1, ..., u_r} such that for i = 1, ..., r,

    w_i = (w_i · v_1)/(v_1 · v_1) v_1 + ... + (w_i · v_{i-1})/(v_{i-1} · v_{i-1}) v_{i-1} + v_i,    u_i = v_i / ||v_i||.

That is,

    (w_1 ... w_r) = (v_1 ... v_r) [ 1   (w_2 · v_1)/(v_1 · v_1)   ...   (w_r · v_1)/(v_1 · v_1)
                                    0   1                         ...   (w_r · v_2)/(v_2 · v_2)
                                    ...                                 ...
                                    0   0                         ...   1 ]

and

    (v_1 ... v_r) = (u_1 ... u_r) diag(||v_1||, ||v_2||, ..., ||v_r||).

So

    (w_1 ... w_r) = (u_1 ... u_r) [ ||v_1||   w_2 · u_1   ...   w_r · u_1
                                    0         ||v_2||     ...   w_r · u_2
                                    ...                         ...
                                    0         0           ...   ||v_r|| ].

Take Q = (u_1 ... u_r) and

    R = [ ||v_1||   w_2 · u_1   ...   w_r · u_1
          0         ||v_2||     ...   w_r · u_2
          ...                         ...
          0         0           ...   ||v_r|| ],

and the result follows.

Example: Textbook P406.
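
In practice QR factorisations are computed with library routines. The sketch below uses np.linalg.qr and flips signs so that R has a positive diagonal, matching the convention of Theorem 22 (NumPy itself does not guarantee positive diagonal entries):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 3))       # full column rank with probability 1

Q, R = np.linalg.qr(A)
# Normalise the signs so that diag(R) > 0, as in Theorem 22.
s = np.sign(np.diag(R))
Q, R = Q * s, s[:, None] * R

print(np.allclose(A, Q @ R))              # True: A = QR
print(np.allclose(Q.T @ Q, np.eye(3)))    # True: orthonormal columns
print(np.all(np.diag(R) > 0))             # True: positive diagonal
```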
