Chapter 6 - Orthogonality


1 Chapter 6 - Orthogonality. Maggie Myers and Robert A. van de Geijn, The University of Texas at Austin. Orthogonality, Fall.

2 Orthogonal Vectors and Subspaces

3 Observations. Let x, y ∈ R^n be linearly independent. The subspace of all vectors αx + βy, α, β ∈ R (the space spanned by x and y) forms a plane. All three vectors x, y, and (y − x) lie in this plane and they form a triangle (figure: a triangle with sides labeled x, y, and z = y − x).

4 Orthogonality of two vectors. x and y are orthogonal/perpendicular if they meet at a right angle. The (Euclidean) length of x is given by ‖x‖_2 = sqrt(χ_0^2 + χ_1^2 + ... + χ_{n-1}^2) = sqrt(x^T x). Pythagorean Theorem: the angle at which x and y meet is a right angle if and only if ‖z‖_2^2 = ‖x‖_2^2 + ‖y‖_2^2. In this case,
‖x − y‖_2^2 = ‖z‖_2^2 = ‖y − x‖_2^2
            = (y − x)^T (y − x)
            = (y^T − x^T)(y − x)
            = (y^T − x^T) y − (y^T − x^T) x
            = y^T y − (x^T y + y^T x) + x^T x
            = ‖y‖_2^2 − 2 x^T y + ‖x‖_2^2
            = ‖x‖_2^2 − 2 x^T y + ‖y‖_2^2.

5 Soooo... Let x and y be orthogonal/perpendicular, i.e., they meet at a right angle. Then ‖x‖_2^2 + ‖y‖_2^2 = ‖x‖_2^2 − 2 x^T y + ‖y‖_2^2. Cancelling terms on the left and right of the equality, this implies that x^T y = 0. Definition: Two vectors x, y ∈ R^n are said to be orthogonal if x^T y = 0. Notation: Sometimes we will use the notation x ⊥ y to indicate that x is perpendicular to y.
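
A small numerical illustration (not part of the original slides; the helper name are_orthogonal and the tolerance are my own choices) of the definition x^T y = 0 and of the Pythagorean identity, using numpy:

```python
import numpy as np

def are_orthogonal(x, y, tol=1e-12):
    # x^T y == 0 up to round-off, scaled by the vector lengths.
    return abs(np.dot(x, y)) <= tol * max(np.linalg.norm(x) * np.linalg.norm(y), 1.0)

x = np.array([1.0, 2.0, 0.0])
y = np.array([-2.0, 1.0, 5.0])
print(are_orthogonal(x, y))                    # True: x^T y = -2 + 2 + 0 = 0

# Pythagorean check: ||x - y||^2 equals ||x||^2 + ||y||^2 exactly when x is perpendicular to y.
print(np.isclose(np.linalg.norm(x - y)**2,
                 np.linalg.norm(x)**2 + np.linalg.norm(y)**2))
```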

6 Orthogonality of subspaces. Let V, W ⊆ R^n be subspaces. Then V and W are said to be orthogonal if v ∈ V and w ∈ W implies that v^T w = 0. In other words, two subspaces are orthogonal if all vectors in the first subspace are orthogonal to all vectors in the second subspace. Notation: V ⊥ W indicates that subspace V is orthogonal to subspace W.

7 Example. Let V = Span{ (1, 0, 0)^T, ... } and W = Span{ ... }. Show that V ⊥ W.

8 Definition. Given a subspace V ⊆ R^n, the set of all vectors in R^n that are orthogonal to V is denoted by V^⊥ (pronounced "V-perp"). Example: Let V = Span{ (1, 0, 0)^T, ... }. What is V^⊥?

9 Exercise. Let V, W ⊆ R^n be subspaces. Show that if V ⊥ W then V ∩ W = {0}, the set containing only the zero vector. Definition: Whenever V ∩ W = {0} we will sometimes call this the trivial intersection. Trivial in the sense that it contains only the zero vector.

10 Show that if V ⊆ R^n is a subspace, then V^⊥ is a subspace.

11 Definitions (Recap). Let A ∈ R^{m×n}.
Column space of A, C(A): the set of all vectors in R^m that can be written as Ax: C(A) = { y | y = Ax }.
Null space of A, N(A): the set of all vectors in R^n that map to the zero vector: N(A) = { x | Ax = 0 }.
(New) The row space of A: R(A) = { y | y = A^T x }.
(New) The left null space of A: N(A^T) = { x | x^T A = 0 }.
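
A hedged numpy sketch (not from the slides; the function name four_subspaces and the rank tolerance are my own) that extracts orthonormal bases for these four subspaces from the SVD of A:

```python
import numpy as np

def four_subspaces(A, tol=1e-12):
    # Full SVD: columns of U span R^m, rows of Vt span R^n.
    U, s, Vt = np.linalg.svd(A)
    r = int(np.sum(s > tol * (s[0] if s.size else 1.0)))   # numerical rank
    return {
        "column space C(A)":      U[:, :r],     # subspace of R^m, dim r
        "left null space N(A^T)": U[:, r:],     # subspace of R^m, dim m - r
        "row space R(A)":         Vt[:r, :].T,  # subspace of R^n, dim r
        "null space N(A)":        Vt[r:, :].T,  # subspace of R^n, dim n - r
    }

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])                 # rank 1
for name, basis in four_subspaces(A).items():
    print(name, basis.shape)
```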

12 Exercise. Show that the left null space of a matrix A ∈ R^{m×n} equals N(A^T).

13 Theorem. Let A ∈ R^{m×n}. Then the null space of A is orthogonal to the row space of A: R(A) ⊥ N(A).
Proof. Assume that y ∈ R(A) and x ∈ N(A). Then there exists a vector z ∈ R^m such that y = A^T z. (Why?) Now, y^T x = (A^T z)^T x = (z^T A) x = z^T (Ax) = 0. (Why?)

14 Theorem. The dimension of R(A) equals the dimension of C(A).
Proof. The dimension of the row space equals the number of linearly independent rows, which equals the number of nonzero rows that result from the Gauss-Jordan method, which equals the number of pivots that show up in that method, which equals the number of linearly independent columns, which equals the dimension of the column space.
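
A quick numerical check (not from the slides) that numpy reports the same rank for A and A^T, i.e., row rank equals column rank:

```python
import numpy as np

# The row rank and the column rank of a matrix coincide.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])   # third row = first row + second row
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A.T))   # 2 2
```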

15 Theorem. Let A ∈ R^{m×n}. Then every x ∈ R^n can be written as x = x_r + x_n where x_r ∈ R(A) and x_n ∈ N(A).
Proof. The dimension of N(A) and the dimension of C(A), r, add to the number of columns, n. (Why?) Thus, the dimension of R(A) equals r and the dimension of N(A) equals n − r. If x ∈ R^n cannot be written as x_r + x_n as indicated, then consider the set of vectors that consists of the union of a basis of R(A) and a basis of N(A) (this union is linearly independent since R(A) ∩ N(A) = {0}), plus the vector x. This set is linearly independent and has n + 1 vectors, contradicting the fact that R^n has dimension n.
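
A minimal sketch (not from the slides; it assumes numpy and uses the pseudoinverse-based projector onto the row space) of the decomposition x = x_r + x_n:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
x = np.array([1.0, 1.0, 1.0])

P_row = np.linalg.pinv(A) @ A        # projector onto the row space R(A)
x_r = P_row @ x                      # row-space component
x_n = x - x_r                        # null-space component

print(np.allclose(A @ x_n, 0))       # x_n lies in N(A)
print(np.allclose(x, x_r + x_n))     # the two pieces reconstruct x
```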

16 Theorem. Let A ∈ R^{m×n}. Then A is a one-to-one, onto mapping from R(A) to C(A).
Proof. We need to show that A maps R(A) to C(A). This is trivial, since any vector x ∈ R^n maps to C(A). Uniqueness (one-to-one): We need to show that if x, y ∈ R(A) and Ax = Ay then x = y. Notice that Ax = Ay implies that A(x − y) = 0, which means that (x − y) is both in R(A) (since it is a linear combination of x and y, both of which are in R(A)) and in N(A). Since we just showed that these two spaces are orthogonal, we conclude that (x − y) = 0, the zero vector. Thus x = y.

17 Theorem. Let A ∈ R^{m×n}. Then A is a one-to-one, onto mapping from R(A) to C(A).
Proof (continued). Onto: We need to show that for any b ∈ C(A) there exists x_r ∈ R(A) such that A x_r = b. Notice that if b ∈ C(A), then there exists x ∈ R^n such that Ax = b. We know that x = x_r + x_n where x_r ∈ R(A) and x_n ∈ N(A). Then b = Ax = A(x_r + x_n) = A x_r + A x_n = A x_r.

18 Theorem. Let A ∈ R^{m×n}. Then the left null space of A is orthogonal to the column space of A; and the dimension of the left null space of A equals m − r, where r is the dimension of the column space of A.
Proof. This follows trivially by applying the previous theorems to A^T.

19 Summarizing... (figure: the four fundamental subspaces) In R^n: the row space of A, of dimension r, containing x_r, and the null space, of dimension n − r, containing x_n, with x = x_r + x_n. In R^m: the column space of A, of dimension r, containing b, and the left null space, of dimension m − r. The map A sends x_r to b (A x_r = b) and x to b (Ax = b).

20 Motivating Example

21 Example. Let us consider the following set of points: (χ_0, ψ_0) = (1, 1.97), (χ_1, ψ_1) = (2, 6.97), (χ_2, ψ_2) = (3, 8.89), (χ_3, ψ_3) = (4, 10.01). Find a line that best fits these points.

22 (figure)

23 (figure)

24 The Problem. Find a line that interpolates these points as nearly as possible. "Near": the sum of the squares of the distances to the line is minimized. Express this with matrices and vectors:
x = (χ_0, χ_1, χ_2, χ_3)^T = (1, 2, 3, 4)^T and y = (ψ_0, ψ_1, ψ_2, ψ_3)^T = (1.97, 6.97, 8.89, 10.01)^T.
The equation of the line we want is y = γ_0 + γ_1 x. IF this line COULD go through all these points THEN
ψ_0 = γ_0 + γ_1 χ_0
ψ_1 = γ_0 + γ_1 χ_1
ψ_2 = γ_0 + γ_1 χ_2
ψ_3 = γ_0 + γ_1 χ_3
or, specifically,
1.97 = γ_0 + γ_1
6.97 = γ_0 + 2 γ_1
8.89 = γ_0 + 3 γ_1
10.01 = γ_0 + 4 γ_1.

25 In Matrix Notation... We would like
[ ψ_0 ]   [ 1  χ_0 ]
[ ψ_1 ] = [ 1  χ_1 ] [ γ_0 ]
[ ψ_2 ]   [ 1  χ_2 ] [ γ_1 ]
[ ψ_3 ]   [ 1  χ_3 ]
or, specifically,
[  1.97 ]   [ 1  1 ]
[  6.97 ] = [ 1  2 ] [ γ_0 ]
[  8.89 ]   [ 1  3 ] [ γ_1 ]
[ 10.01 ]   [ 1  4 ]
Just looking at the graph it is obvious that these points do not lie on the same line. How do we say that mathematically? Therefore all these equations cannot be simultaneously satisfied. Why? So, what do we do?

26 How does it relate to column spaces? For what right-hand sides could we have solved all four equations simultaneously? We would have had to choose y so that Ac = y, where
A = [ 1  χ_0 ]   [ 1  1 ]
    [ 1  χ_1 ] = [ 1  2 ]   and   c = [ γ_0 ]
    [ 1  χ_2 ]   [ 1  3 ]             [ γ_1 ].
    [ 1  χ_3 ]   [ 1  4 ]
This means that y must be in the column space of A! It must be possible to express it as y = γ_0 a_0 + γ_1 a_1, where A = ( a_0  a_1 )! What does this mean if we relate this back to the picture? Only if {ψ_0, ..., ψ_3} have the property that {(1, ψ_0), ..., (4, ψ_3)} lie on a line can we find coefficients γ_0 and γ_1 such that Ac = y.
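
A small numerical check (not from the slides): y lies in C(A) exactly when appending y as an extra column does not increase the rank. A numpy sketch for the example data:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
y = np.array([1.97, 6.97, 8.89, 10.01])

# y is in the column space of A iff rank([A | y]) == rank(A).
in_col_space = (np.linalg.matrix_rank(np.column_stack([A, y]))
                == np.linalg.matrix_rank(A))
print(in_col_space)   # False: the four points do not lie on one line
```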

27 What are the fundamental questions? When does Ax = b have a solution? When does Ax = b have more than one solution? How do we characterize all these solutions? If Ax = b does not have a solution, how do we find the best approximate solution?

28 How does this problem relate to orthogonality and projection? The problem: b does not lie in the column space of A. The question is which vector z, one that does lie in the column space, to pick so that we can solve Ac = z instead. Now, if c solves Ac = z exactly, then z = ( a_0  a_1 ) (γ_0, γ_1)^T = γ_0 a_0 + γ_1 a_1. Well DAH! z is in the column space of A. What we want is y = z + w, where w is as small (in length) as possible. This happens when w is orthogonal to the column space of A! So, y = γ_0 a_0 + γ_1 a_1 + w, with a_0^T w = a_1^T w = 0. The vector z ∈ C(A) that is closest to y is known as the projection of y onto the column space of A. We need a way to compute this projection.

29 Solving a Linear Least-Squares Problem

30 Observations. The last problem motivated the following general problem: Given m equations in n unknowns, we end up with a system Ax = b where A ∈ R^{m×n}, x ∈ R^n, and b ∈ R^m. This system of equations may have no solutions. This happens when b is not in the column space of A. This system may have a unique solution. This happens only when r = m = n, where r is the rank of the matrix (the dimension of the column space of A). Another way of saying this is that it happens only if A is square and nonsingular (it has an inverse). This system may have many solutions. This happens when b is in the column space of A and r < n (the columns of A are linearly dependent, so that the null space of A is nontrivial).

31 Overdetermined systems. (Approximately) solve Ax = b where b is not in the column space of A. What we want is an approximate solution x̂ such that A x̂ = z, where z is the vector in the column space of A that is closest to b. In other words, b = z + w where w^T v = 0 for all v ∈ C(A). From the figure we conclude that this means that w is in the left null space of A. So, A^T w = 0. But that means that 0 = A^T w = A^T (b − z) = A^T (b − A x̂), which we can rewrite as A^T A x̂ = A^T b.  (1)
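
A hedged numpy sketch (not from the slides) that solves the normal equations (1) for the line-fit example:

```python
import numpy as np

chi = np.array([1.0, 2.0, 3.0, 4.0])
psi = np.array([1.97, 6.97, 8.89, 10.01])

A = np.column_stack([np.ones_like(chi), chi])   # columns: all ones, chi_i
b = psi

xhat = np.linalg.solve(A.T @ A, A.T @ b)        # normal equations (1)
print(xhat)                                     # [gamma_0, gamma_1]
print(np.allclose(A.T @ (b - A @ xhat), 0))     # residual w satisfies A^T w = 0
```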

32 Lemma. If A ∈ R^{m×n} has linearly independent columns, then A^T A is nonsingular (equivalently, it has an inverse, A^T A x̂ = A^T b has a solution for all b, etc.).
Proof. Proof by contradiction. Assume that A ∈ R^{m×n} has linearly independent columns and A^T A is singular. Then there exists x ≠ 0 such that A^T A x = 0. Hence, y = Ax satisfies A^T y = 0. (Why?) This means y is in the left null space of A. But y is also in the column space of A, since Ax = y. Thus, y = 0, since the intersection of the column space of A and the left null space of A only contains the zero vector. So Ax = 0 with x ≠ 0, which contradicts that A has linearly independent columns.

33 What does this mean? If A has linearly independent columns, then: The desired x̂ that is the best solution to Ax = b is given by x̂ = (A^T A)^{-1} A^T b. The vector z ∈ C(A) closest to b is given by z = A x̂ = A (A^T A)^{-1} A^T b. Thus z = A (A^T A)^{-1} A^T b is the vector in the column space closest to b. The matrix A (A^T A)^{-1} A^T projects a vector onto the column space of A.
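
A minimal sketch (not from the slides), assuming A has linearly independent columns, of the projector A (A^T A)^{-1} A^T applied to the example data:

```python
import numpy as np

A = np.column_stack([np.ones(4), np.array([1.0, 2.0, 3.0, 4.0])])
b = np.array([1.97, 6.97, 8.89, 10.01])

P = A @ np.linalg.inv(A.T @ A) @ A.T   # projects onto the column space of A
z = P @ b                              # the vector in C(A) closest to b
w = b - z                              # what is left over

print(np.allclose(P @ P, P))           # a projector is idempotent
print(np.allclose(A.T @ w, 0))         # w lies in the left null space of A
```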

34 Theorem. Let A ∈ R^{m×n}, b ∈ R^m, and x ∈ R^n, and assume that A has linearly independent columns. Then the solution of min_x ‖b − Ax‖_2 is given by x̂ = (A^T A)^{-1} A^T b.

35 Definition. Let A ∈ R^{m×n}. If A has linearly independent columns, then (A^T A)^{-1} A^T is called the (left) pseudo inverse. Note that this means m ≥ n and (A^T A)^{-1} A^T A = I. If A has linearly independent rows, then A^T (A A^T)^{-1} is called the right pseudo inverse. Note that this means m ≤ n and A A^T (A A^T)^{-1} = I.
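
A short check (not from the slides) that the left pseudo inverse built as (A^T A)^{-1} A^T behaves as stated and agrees with numpy's pinv for a matrix with linearly independent columns:

```python
import numpy as np

A = np.column_stack([np.ones(4), np.array([1.0, 2.0, 3.0, 4.0])])

left_pinv = np.linalg.inv(A.T @ A) @ A.T
print(np.allclose(left_pinv @ A, np.eye(2)))        # (A^T A)^{-1} A^T A = I
print(np.allclose(left_pinv, np.linalg.pinv(A)))    # agrees with numpy's pinv
```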

36 Why Least-Squares? Notice that we are trying to find x̂ that solves min_x ‖b − Ax‖_2. If x̂ minimizes ‖b − Ax‖_2, it also minimizes the function F(x) = ‖b − Ax‖_2^2 = (b − Ax)^T (b − Ax) = b^T b − 2 b^T A x + x^T A^T A x. Recall how one would find the minimum of a function f : R → R, f(x) = α^2 x^2 − 2βαx + β^2: take the derivative and set it to zero. Here F : R^n → R. Compute the gradient (essentially the derivative) and set it to zero: −2 A^T b + 2 A^T A x = 0, or, A^T A x = A^T b. We are looking for x̂ that solves A^T A x = A^T b.
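
A closing sanity check (not from the slides): the normal-equations solution matches numpy's least-squares solver, and the gradient 2 A^T A x − 2 A^T b vanishes at that solution:

```python
import numpy as np

A = np.column_stack([np.ones(4), np.array([1.0, 2.0, 3.0, 4.0])])
b = np.array([1.97, 6.97, 8.89, 10.01])

x_normal = np.linalg.solve(A.T @ A, A.T @ b)       # from the normal equations
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)    # numpy's least-squares solver

print(np.allclose(x_normal, x_lstsq))                        # same minimizer
print(np.allclose(2 * A.T @ A @ x_normal - 2 * A.T @ b, 0))  # gradient vanishes
```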
