6. Orthogonality and Least Squares


 Kerrie McCarthy
 11 months ago
Linear Algebra: 6. Orthogonality and Least Squares (CSIE NCU)

6.1 Inner product, length, and orthogonality
6.2 Orthogonal sets
6.3 Orthogonal projections
6.4 The Gram-Schmidt process
6.5 Least-squares problems

6.1 Inner product, length, and orthogonality

Definition
The inner product of two vectors u and v in R^n is written u·v. If u = [u_1 ... u_n]^T and v = [v_1 ... v_n]^T, then
    u·v = u_1 v_1 + u_2 v_2 + ... + u_n v_n.

Theorem 1
For u, v, w in R^n and any scalar c:
(a) u·v = v·u
(b) (u + v)·w = u·w + v·w
(c) (cu)·v = c(u·v) = u·(cv)
(d) u·u ≥ 0, and u·u = 0 if and only if u = 0.
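The four properties in Theorem 1 can be checked numerically. A short NumPy sketch, with arbitrarily chosen vectors (not from the slides):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 2.0])
w = np.array([0.5, 0.5, -2.0])
c = 3.0

# u·v = u1*v1 + u2*v2 + u3*v3
assert np.isclose(u @ v, 1*4 + 2*(-1) + 3*2)
# (a) symmetry
assert np.isclose(u @ v, v @ u)
# (b) additivity in the first argument
assert np.isclose((u + v) @ w, u @ w + v @ w)
# (c) homogeneity
assert np.isclose((c * u) @ v, c * (u @ v))
assert np.isclose((c * u) @ v, u @ (c * v))
# (d) positivity: u·u ≥ 0, and 0·0 = 0
assert u @ u >= 0
assert np.isclose(np.zeros(3) @ np.zeros(3), 0.0)
```

Here `@` is NumPy's dot product for 1-D arrays, matching the component formula above.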
Definition
The length (or norm) of v is the nonnegative scalar ‖v‖ defined by
    ‖v‖ = √(v·v) = √(v_1² + v_2² + ... + v_n²).
Note: ‖cv‖ = |c| ‖v‖.

Definition
The distance between u and v, written dist(u, v), is the length of the vector u − v. That is, dist(u, v) = ‖u − v‖.

Orthogonal vectors

Definition
Two vectors u and v in R^n are orthogonal if u·v = 0.

Theorem 2 (The Pythagorean Theorem, 畢氏定理)
Two vectors u and v are orthogonal if and only if ‖u + v‖² = ‖u‖² + ‖v‖².
(Figure: a right triangle with hypotenuse L, showing (L cos θ)² + (L sin θ)² = L².)
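The norm, distance, and Pythagorean identities can be verified on a simple 3-4-5 right triangle (vectors chosen for illustration):

```python
import numpy as np

u = np.array([3.0, 0.0])
v = np.array([0.0, 4.0])

norm_u = np.sqrt(u @ u)                  # ‖u‖ = sqrt(u·u)
assert np.isclose(norm_u, np.linalg.norm(u))
# ‖cv‖ = |c| ‖v‖
assert np.isclose(np.linalg.norm(-2.0 * u), 2.0 * np.linalg.norm(u))

dist_uv = np.linalg.norm(u - v)          # dist(u, v) = ‖u − v‖
assert np.isclose(dist_uv, 5.0)          # the 3-4-5 triangle

# u ⊥ v, so the Pythagorean identity holds (Theorem 2)
assert np.isclose(u @ v, 0.0)
assert np.isclose(np.linalg.norm(u + v)**2,
                  np.linalg.norm(u)**2 + np.linalg.norm(v)**2)
```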
Orthogonal complements

Definition
The set of all vectors x that are orthogonal to every vector w in W is called the orthogonal complement of W, denoted W⊥.

Notes
1. A vector x is in W⊥ if and only if x is orthogonal to every vector in a set that spans the subspace W.
2. W⊥ is a subspace of R^n for any subset W of R^n.
3. The zero vector is orthogonal to every vector.

Theorem 3
Let A be an m × n matrix. Then
(i) (Row A)⊥ = Nul A
(ii) (Col A)⊥ = Nul A^T.
({x : A^T x = 0} = {x : x^T A = 0}, the left null space of A.)

(Figure: in R^n, W⊥ = {x ∈ R^n : x ⊥ w_i for every w_i in W}.)
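Theorem 3(i) can be checked numerically: a null-space basis computed from the SVD is automatically orthogonal to every row of A. The matrix below is my own rank-1 example, not from the slides:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # rank 1, so Nul A has dimension 3 − 1 = 2

# Right singular vectors for (near-)zero singular values span Nul A
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
N = Vt[rank:].T                   # columns form a basis of Nul A

assert np.allclose(A @ N, 0.0)    # each basis vector is orthogonal to each row,
                                  # i.e. Nul A ⊆ (Row A)⊥
# dimensions agree: rank A + dim Nul A = n
assert rank + N.shape[1] == A.shape[1]
```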
Why W⊥ is a subspace of R^n for any subset W of R^n: let w be an arbitrary vector in W. Then
1. 0 ∈ W⊥, since 0·w = 0.
2. If u ∈ W⊥, then (cu)·w = c(u·w) = 0; thus cu ∈ W⊥.
3. If u, v ∈ W⊥, then (u + v)·w = u·w + v·w = 0; thus u + v ∈ W⊥.

Questions: If W is a subset of R^n, is (W⊥)⊥ = W? If W is a subspace of R^n, is (W⊥)⊥ = W?

Exercises of Section 6.1.

6.2 Orthogonal sets

A set of vectors {u_1, ..., u_p} in R^n is said to be an orthogonal set if u_i·u_j = 0 for all i ≠ j.

Theorem 4
If S = {u_1, ..., u_p} is an orthogonal set of nonzero vectors in R^n, then S is linearly independent and hence is a basis for the subspace spanned by S.

An orthogonal basis for a subspace W of R^n is a basis for W that is also an orthogonal set.
Theorem 5
Let {u_1, ..., u_p} be an orthogonal basis for a subspace W of R^n. Then each y in W has a unique representation as a linear combination of u_1, ..., u_p. In fact, if
    y = c_1 u_1 + ... + c_p u_p,
then
    c_j = (y·u_j) / (u_j·u_j),  j = 1, 2, ..., p.

Note (投影分量)
(1) ŷ_j = c_j u_j = ((y·u_j) / (u_j·u_j)) u_j is the orthogonal projection of y onto u_j.
(2) y − ŷ_j is the component of y orthogonal to u_j.

A set {u_1, ..., u_p} is an orthonormal set if it is an orthogonal set of unit vectors. To normalize a vector u = [u_1 ... u_n]^T, replace it by u / ‖u‖.

Theorem 6
An m × n matrix U has orthonormal columns if and only if U^T U = I_n.
Proof. The (i, j) entry of U^T U is the inner product u_i·u_j, which equals 1 when i = j and 0 when i ≠ j exactly when the columns are orthonormal.
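Theorems 5 and 6 together can be demonstrated with a small orthogonal basis of R^2 (vectors are my own illustrative choice):

```python
import numpy as np

# An orthogonal (not yet orthonormal) basis for R^2
u1 = np.array([3.0, 1.0])
u2 = np.array([-1.0, 3.0])
assert np.isclose(u1 @ u2, 0.0)

y = np.array([7.0, 6.0])

# Theorem 5: c_j = (y·u_j) / (u_j·u_j)
c1 = (y @ u1) / (u1 @ u1)
c2 = (y @ u2) / (u2 @ u2)
assert np.allclose(c1 * u1 + c2 * u2, y)   # y is recovered exactly

# Normalizing gives an orthonormal set, and U^T U = I (Theorem 6)
U = np.column_stack([u1 / np.linalg.norm(u1), u2 / np.linalg.norm(u2)])
assert np.allclose(U.T @ U, np.eye(2))
```

No system of equations is solved: each coefficient comes from one inner-product quotient, which is the practical payoff of an orthogonal basis.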
Theorem 7
Let U be an m × n matrix with orthonormal columns, and let x and y be in R^n. Then
(a) ‖Ux‖ = ‖x‖  (preserves length)
(b) (Ux)·(Uy) = x·y  (preserves inner products)
(c) (Ux)·(Uy) = 0 if and only if x·y = 0  (preserves orthogonality).

Proof of (b). Writing U = [u_1 ... u_n],
    (Ux)·(Uy) = (x_1 u_1 + ... + x_n u_n)·(y_1 u_1 + ... + y_n u_n)
              = x_1 y_1 + x_2 y_2 + ... + x_n y_n = x·y,
since u_i·u_j = 0 for i ≠ j and u_i·u_i = 1. Parts (a) and (c) follow from (b).

Exercises of Section 6.2.
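A quick numerical check of Theorem 7: build a matrix with orthonormal columns (here via a QR factorization of a random matrix, an arbitrary construction) and confirm lengths and inner products survive:

```python
import numpy as np

rng = np.random.default_rng(0)
# A 5×3 matrix with orthonormal columns, from the QR of a random matrix
U, _ = np.linalg.qr(rng.standard_normal((5, 3)))
assert np.allclose(U.T @ U, np.eye(3))

x = rng.standard_normal(3)
y = rng.standard_normal(3)

assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))   # (a) length preserved
assert np.isclose((U @ x) @ (U @ y), x @ y)                   # (b) inner products preserved
```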
6.3 Orthogonal projections

Purpose
To find the projection of a vector onto a subspace.

Theorem 8 (The Orthogonal Decomposition Theorem)
Let W be a subspace of R^n. Then each y in R^n can be written uniquely in the form
    y = ŷ + z,                                                      (1)
where ŷ is in W and z is in W⊥. In fact, if {u_1, ..., u_p} is any orthogonal basis of W, then
    ŷ = ((y·u_1)/(u_1·u_1)) u_1 + ... + ((y·u_p)/(u_p·u_p)) u_p,    (2)
and z = y − ŷ. The vector ŷ in Eq. (1) is called the orthogonal projection of y onto W and is often written proj_W y.

(Figure: W = Span{u_1, u_2}; ŷ lies in W, and z = y − ŷ is orthogonal to W.)

Ex.2. Let u_1 =, u_2 =, and y =. Observe that {u_1, u_2} is an orthogonal basis for W = Span{u_1, u_2}. Write y as the sum of a vector in W and a vector orthogonal to W.
Solution. The orthogonal projection of y onto W is
    ŷ = ((y·u_1)/(u_1·u_1)) u_1 + ((y·u_2)/(u_2·u_2)) u_2,
and the component of y orthogonal to W is z = y − ŷ.

Properties of orthogonal projections
If {u_1, u_2, ..., u_p} is an orthogonal basis for W and if y is in W = Span{u_1, u_2, ..., u_p}, then proj_W y = y.

Theorem 9 (The Best Approximation Theorem)
Let W be a subspace of R^n, let y be any vector in R^n, and let ŷ be the orthogonal projection of y onto W. Then ŷ is the closest point in W to y, in the sense that
    ‖y − ŷ‖ < ‖y − v‖                                               (3)
for all v in W distinct from ŷ.
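Theorems 8 and 9 can be exercised together on a plane in R^3. The basis and the vector y below are my own illustrative choices, not the data of Ex.2:

```python
import numpy as np

# Orthogonal basis for a plane W in R^3
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
assert np.isclose(u1 @ u2, 0.0)

y = np.array([2.0, 3.0, 5.0])

# ŷ = proj_W y and z = y − ŷ (Theorem 8, Eq. (2))
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
z = y - y_hat
assert np.allclose(y_hat + z, y)
assert np.isclose(z @ u1, 0.0) and np.isclose(z @ u2, 0.0)   # z ∈ W⊥

# Best approximation (Theorem 9): ŷ beats any other point of W
v = 0.3 * u1 - 1.7 * u2                                      # some other point in W
assert np.linalg.norm(y - y_hat) < np.linalg.norm(y - v)
```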
Ex.4. The distance from a point y in R^n to a subspace W is defined as the distance from y to the nearest point in W. Find the distance from y to W = Span{u_1, u_2}, where y =, u_1 =, u_2 =.

Solution. By the Best Approximation Theorem, the distance from y to W is ‖y − ŷ‖, where ŷ = proj_W y. Since {u_1, u_2} is an orthogonal basis for W,
    ŷ = ((y·u_1)/(u_1·u_1)) u_1 + ((y·u_2)/(u_2·u_2)) u_2.

Theorem 10
If {u_1, ..., u_p} is an orthonormal basis for a subspace W of R^n, then
    proj_W y = (y·u_1) u_1 + (y·u_2) u_2 + ... + (y·u_p) u_p.       (4)
If U = [u_1 u_2 ... u_p], then
    proj_W y = U U^T y  for all y in R^n.                           (5)
Proof. By Eq. (4), proj_W y = (y·u_1) u_1 + ... + (y·u_p) u_p. Since y·u_j = u_j^T y, these coefficients are exactly the entries of the vector U^T y, so
    proj_W y = U (U^T y) = U U^T y.

Exercises of Section 6.3.
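Equations (4) and (5) give the same projection, which is easy to confirm numerically (basis and y chosen for illustration):

```python
import numpy as np

# Orthonormal basis for a plane W in R^3
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([0.0, 0.0, 1.0])
U = np.column_stack([u1, u2])
assert np.allclose(U.T @ U, np.eye(2))     # orthonormal columns (Theorem 6)

y = np.array([4.0, -2.0, 7.0])

proj_sum = (y @ u1) * u1 + (y @ u2) * u2   # Eq. (4): sum of 1-D projections
proj_mat = U @ U.T @ y                     # Eq. (5): one matrix product
assert np.allclose(proj_sum, proj_mat)
```

Note that U U^T here is a 3 × 3 projection matrix, not the 2 × 2 identity U^T U.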
6.4 The Gram-Schmidt process

Purpose
The Gram-Schmidt process is a simple algorithm for producing an orthogonal or orthonormal basis for any nonzero subspace of R^n.

Theorem 11 (The Gram-Schmidt process)
Given a basis {x_1, ..., x_p} for a subspace W of R^n (a linearly independent set of nonzero vectors), define
    v_1 = x_1
    v_2 = x_2 − ((x_2·v_1)/(v_1·v_1)) v_1
    v_3 = x_3 − ((x_3·v_1)/(v_1·v_1)) v_1 − ((x_3·v_2)/(v_2·v_2)) v_2
    ...
    v_p = x_p − ((x_p·v_1)/(v_1·v_1)) v_1 − ... − ((x_p·v_{p−1})/(v_{p−1}·v_{p−1})) v_{p−1}.
Then {v_1, ..., v_p} is an orthogonal basis for W. In addition,
    Span{v_1, ..., v_k} = Span{x_1, ..., x_k}  for 1 ≤ k ≤ p.

The geometric meaning of the Gram-Schmidt process (a three-dimensional case): v_1 = x_1; v_2 is the component of x_2 orthogonal to v_1; v_3 is the component of x_3 orthogonal to Span{v_1, v_2}. Note that the angle between x_i and v_i is less than 90°.
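Theorem 11 translates almost line for line into code. A minimal sketch (the input matrix is an arbitrary example with linearly independent columns):

```python
import numpy as np

def gram_schmidt(X):
    """Orthogonal basis from the columns of X (assumed linearly independent)."""
    V = []
    for x in X.T:                         # process x_1, x_2, ..., x_p in order
        v = x.astype(float).copy()
        for u in V:                       # subtract the projection onto each earlier v_k
            v -= (x @ u) / (u @ u) * u
        V.append(v)
    return np.column_stack(V)

X = np.array([[1.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 3.0]])
V = gram_schmidt(X)

# The columns of V are mutually orthogonal: V^T V is diagonal
G = V.T @ V
assert np.allclose(G - np.diag(np.diag(G)), 0.0)
```

Normalizing each column of V afterwards would give an orthonormal basis.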
Orthonormal bases
An orthonormal basis is constructed easily from an orthogonal basis {v_1, ..., v_p}: simply normalize each v_k, replacing it by v_k / ‖v_k‖.

QR factorization of matrices
This factorization is widely used in computer algorithms for various computations, such as solving equations and finding eigenvalues. (Exercise 23 of Section 5.2.)

Theorem 12 (The QR factorization)
If A is an m × n matrix with linearly independent columns, then A can be factored as A = QR, where Q is an m × n matrix whose columns form an orthonormal basis for Col A and R is an n × n upper triangular invertible matrix with positive entries on its diagonal.

Ex.4.
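NumPy's built-in (reduced) QR factorization illustrates Theorem 12; the matrix below is an arbitrary example. One caveat: `np.linalg.qr` does not promise positive diagonal entries in R, but flipping the signs of matching columns of Q and rows of R enforces the convention of the theorem without changing the product:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])               # linearly independent columns

Q, R = np.linalg.qr(A)                   # reduced QR: Q is 3×2, R is 2×2

# Enforce the positive-diagonal convention of Theorem 12
signs = np.sign(np.diag(R))
Q, R = Q * signs, (R.T * signs).T

assert np.allclose(Q.T @ Q, np.eye(2))   # orthonormal columns
assert np.allclose(A, Q @ R)             # A = QR
assert np.allclose(R, np.triu(R))        # R upper triangular
assert np.all(np.diag(R) > 0)            # positive diagonal
```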
(b) To find R: since A = QR and Q^T Q = I (Theorem 6),
    Q^T A = Q^T Q R = R.

Exercises of Section 6.4.

Problem
From A = QR and Q^T Q = I we obtained Q^T A = R, and hence Q Q^T A = QR = A. Does that mean Q Q^T = I? No! When Q is m × n with m > n, the m × m matrix Q Q^T is the orthogonal projection onto Col A, not the identity; Q Q^T A = A holds only because every column of A already lies in Col A.
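The distinction between Q^T Q = I and Q Q^T ≠ I is easy to see numerically (same arbitrary example matrix as above):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
Q, R = np.linalg.qr(A)                 # Q is 3×2 with orthonormal columns

P = Q @ Q.T                            # 3×3 but rank 2: NOT the identity
assert not np.allclose(P, np.eye(3))
assert np.allclose(P @ A, A)           # yet P fixes everything already in Col A
assert np.allclose(P @ P, P)           # P is idempotent: the projection onto Col A
```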
6.5 Least-squares problems

For a linear system Ax = b, a solution may be demanded even when none exists. The best one can do is to find an x that makes Ax as close as possible to b. The general least-squares problem is to find an x that makes ‖b − Ax‖ as small as possible. The term "least squares" arises from the fact that ‖b − Ax‖ is the square root of a sum of squares.

Definition
If A is an m × n matrix and b is in R^m, a least-squares solution of Ax = b is an x̂ in R^n such that
    ‖b − Ax̂‖ ≤ ‖b − Ax‖  for all x in R^n.
(Compare Theorem 9.)

Note
No matter which x is selected, the vector Ax necessarily lies in the column space Col A. So we seek an x that makes Ax the closest point in Col A to b. If b̂ is the orthogonal projection of b onto Col A, then the equation Ax = b̂ is consistent, and there is a solution x̂ in R^n. By the Orthogonal Decomposition Theorem, the projection b̂ has the property that b − b̂ is orthogonal to Col A, so b − Ax̂ is orthogonal to each column of A. If a_j is any column of A, then a_j·(b − Ax̂) = 0, i.e. a_j^T (b − Ax̂) = 0. Since each a_j^T is a row of A^T,
    A^T (b − Ax̂) = 0  ⟹  A^T b − A^T A x̂ = 0  ⟹  A^T A x̂ = A^T b.
Theorem 13
The set of least-squares solutions of Ax = b coincides with the nonempty set of solutions of the normal equations
    A^T A x̂ = A^T b.
If A^T A is invertible, then x̂ = (A^T A)^{-1} A^T b.

Ex.1. Find a least-squares solution of the inconsistent system Ax = b. In many calculations A^T A is invertible, but this is not always the case.

Ex.2. Find a least-squares solution of Ax = b in a case where A^T A is not invertible.
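The normal equations are a one-liner in NumPy. The inconsistent system below is a hypothetical example (the entries of the textbook's Ex.1 are not reproduced in this transcription):

```python
import numpy as np

# A hypothetical inconsistent system: three equations, two unknowns
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Normal equations: A^T A x̂ = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# The residual b − Ax̂ is orthogonal to every column of A ...
assert np.allclose(A.T @ (b - A @ x_hat), 0.0)
# ... and the answer matches NumPy's built-in least-squares solver
assert np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0])
```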
Theorem 14
The matrix A^T A is invertible if and only if the columns of A are linearly independent. In this case, the equation Ax = b has only one least-squares solution, and it is given by x̂ = (A^T A)^{-1} A^T b.

Note
The distance from b to Ax̂ is called the least-squares error.

Alternative calculations of least-squares solutions
Meaning: in some cases, the normal equations for a least-squares problem can be ill-conditioned; that is, small errors in the calculation of the entries of A^T A can sometimes cause relatively large errors in the solution x̂. If the columns of A are linearly independent, the least-squares solution can often be computed more reliably through a QR factorization of A.

Ill-conditioned
The condition number associated with the linear equation Ax = b gives a bound on how inaccurate the computed solution x will be. Conditioning is a property of the matrix, not of the algorithm or of the floating-point accuracy of the computer used to solve the corresponding system. It measures the rate at which the solution x changes with respect to a change in b: if the condition number is large, even a small error in b may cause a large error in x.
The condition number is the maximum ratio of the relative error in x to the relative error in b,
    κ(A) = max ( (‖δx‖ / ‖x‖) / (‖δb‖ / ‖b‖) ),
where ‖·‖ is the 2-norm. The larger the condition number, the more ill-conditioned the coefficient matrix is.

Theorem 15
Given an m × n matrix A with linearly independent columns, let A = QR be a QR factorization of A. Then for each b in R^m, the equation Ax = b has a unique least-squares solution, given by
    x̂ = R^{-1} Q^T b.
Proof.
    x̂ = (A^T A)^{-1} A^T b
      = (R^T Q^T Q R)^{-1} R^T Q^T b        (since A = QR)
      = (R^T R)^{-1} R^T Q^T b              (since Q^T Q = I)
      = R^{-1} (R^T)^{-1} R^T Q^T b
      = R^{-1} Q^T b.
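Theorem 15 can be checked against the normal equations; the example system is the same hypothetical one used above. Note in passing why the QR route is preferred: the condition number of A^T A is the square of that of A, so forming A^T A amplifies conditioning problems:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)          # x̂ = R^{-1} Q^T b (Theorem 15)

# Agrees with the normal-equations solution A^T A x̂ = A^T b
x_ne = np.linalg.solve(A.T @ A, A.T @ b)
assert np.allclose(x_qr, x_ne)

# Conditioning: κ(A^T A) = κ(A)² ≥ κ(A) in the 2-norm
assert np.linalg.cond(A.T @ A) >= np.linalg.cond(A)
```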
Note
Since R is an upper triangular matrix, x̂ should be computed from the equation R x̂ = Q^T b. It is much faster to solve this equation by back-substitution or row operations than to compute R^{-1} and apply x̂ = R^{-1} Q^T b directly.

Ex.5. Find the least-squares solution of Ax = b for

Exercises for Section 6.5.
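Back-substitution for an upper triangular system is only a few lines; a minimal sketch, applied to the same hypothetical example system (in practice a library routine such as a triangular solver would be used):

```python
import numpy as np

def back_substitute(R, c):
    """Solve R x = c for upper triangular, invertible R without forming R^{-1}."""
    n = R.shape[0]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):           # last unknown first
        x[i] = (c[i] - R[i, i+1:] @ x[i+1:]) / R[i, i]
    return x

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])
Q, R = np.linalg.qr(A)

x_hat = back_substitute(R, Q.T @ b)          # solves R x̂ = Q^T b
assert np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0])
```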
More informationMath 3191 Applied Linear Algebra
Math 9 Applied Linear Algebra Lecture : Null and Column Spaces Stephen Billups University of Colorado at Denver Math 9Applied Linear Algebra p./8 Announcements Study Guide posted HWK posted Math 9Applied
More informationLecture 10: Vector Algebra: Orthogonal Basis
Lecture 0: Vector Algebra: Orthogonal Basis Orthogonal Basis of a subspace Computing an orthogonal basis for a subspace using GramSchmidt Orthogonalization Process Orthogonal Set Any set of vectors that
More informationReview Notes for Linear Algebra True or False Last Updated: February 22, 2010
Review Notes for Linear Algebra True or False Last Updated: February 22, 2010 Chapter 4 [ Vector Spaces 4.1 If {v 1,v 2,,v n } and {w 1,w 2,,w n } are linearly independent, then {v 1 +w 1,v 2 +w 2,,v n
More informationSolving a system by backsubstitution, checking consistency of a system (no rows of the form
MATH 520 LEARNING OBJECTIVES SPRING 2017 BROWN UNIVERSITY SAMUEL S. WATSON Week 1 (23 Jan through 27 Jan) Definition of a system of linear equations, definition of a solution of a linear system, elementary
More informationMATH 31  ADDITIONAL PRACTICE PROBLEMS FOR FINAL
MATH 3  ADDITIONAL PRACTICE PROBLEMS FOR FINAL MAIN TOPICS FOR THE FINAL EXAM:. Vectors. Dot product. Cross product. Geometric applications. 2. Row reduction. Null space, column space, row space, left
More informationMATH 304 Linear Algebra Lecture 34: Review for Test 2.
MATH 304 Linear Algebra Lecture 34: Review for Test 2. Topics for Test 2 Linear transformations (Leon 4.1 4.3) Matrix transformations Matrix of a linear mapping Similar matrices Orthogonality (Leon 5.1
More information18.06 Quiz 2 April 7, 2010 Professor Strang
18.06 Quiz 2 April 7, 2010 Professor Strang Your PRINTED name is: 1. Your recitation number or instructor is 2. 3. 1. (33 points) (a) Find the matrix P that projects every vector b in R 3 onto the line
More informationWarmup. True or false? Baby proof. 2. The system of normal equations for A x = y has solutions iff A x = y has solutions
Warmup True or false? 1. proj u proj v u = u 2. The system of normal equations for A x = y has solutions iff A x = y has solutions 3. The normal equations are always consistent Baby proof 1. Let A be
More informationWI1403LR Linear Algebra. Delft University of Technology
WI1403LR Linear Algebra Delft University of Technology Year 2013 2014 Michele Facchinelli Version 10 Last modified on February 1, 2017 Preface This summary was written for the course WI1403LR Linear
More informationMATH 235: Inner Product Spaces, Assignment 7
MATH 235: Inner Product Spaces, Assignment 7 Hand in questions 3,4,5,6,9, by 9:3 am on Wednesday March 26, 28. Contents Orthogonal Basis for Inner Product Space 2 2 InnerProduct Function Space 2 3 Weighted
More information1. Diagonalize the matrix A if possible, that is, find an invertible matrix P and a diagonal
. Diagonalize the matrix A if possible, that is, find an invertible matrix P and a diagonal 3 9 matrix D such that A = P DP, for A =. 3 4 3 (a) P = 4, D =. 3 (b) P = 4, D =. (c) P = 4 8 4, D =. 3 (d) P
More information(v, w) = arccos( < v, w >
MA322 Sathaye Notes on Inner Products Notes on Chapter 6 Inner product. Given a real vector space V, an inner product is defined to be a bilinear map F : V V R such that the following holds: For all v
More informationORTHOGONALITY AND LEASTSQUARES [CHAP. 6]
ORTHOGONALITY AND LEASTSQUARES [CHAP. 6] Inner products and Norms Inner product or dot product of 2 vectors u and v in R n : u.v = u 1 v 1 + u 2 v 2 + + u n v n Calculate u.v when u = 1 2 2 0 v = 1 0
More informationThe GramSchmidt Process 1
The GramSchmidt Process In this section all vector spaces will be subspaces of some R m. Definition.. Let S = {v...v n } R m. The set S is said to be orthogonal if v v j = whenever i j. If in addition
More information1. General Vector Spaces
1.1. Vector space axioms. 1. General Vector Spaces Definition 1.1. Let V be a nonempty set of objects on which the operations of addition and scalar multiplication are defined. By addition we mean a rule
More informationChapter 6. Orthogonality and Least Squares
Chapter 6 Orthogonality and Least Squares Section 6.1 Inner Product, Length, and Orthogonality Orientation Recall: This course is about learning to: Solve the matrix equation Ax = b Solve the matrix equation
More informationExercises for Unit I (Topics from linear algebra)
Exercises for Unit I (Topics from linear algebra) I.0 : Background Note. There is no corresponding section in the course notes, but as noted at the beginning of Unit I these are a few exercises which involve
More informationThe 'linear algebra way' of talking about "angle" and "similarity" between two vectors is called "inner product". We'll define this next.
Orthogonality and QR The 'linear algebra way' of talking about "angle" and "similarity" between two vectors is called "inner product". We'll define this next. So, what is an inner product? An inner product
More informationDefinitions for Quizzes
Definitions for Quizzes Italicized text (or something close to it) will be given to you. Plain text is (an example of) what you should write as a definition. [Bracketed text will not be given, nor does
More informationExercises for Unit I (Topics from linear algebra)
Exercises for Unit I (Topics from linear algebra) I.0 : Background This does not correspond to a section in the course notes, but as noted at the beginning of Unit I it contains some exercises which involve
More information