L2-1. Of space and subspace 10 Oct 2015


Sisko: "Ahh, subspace compression."
Dax: "Do you know what that is?"
Sisko: "Just a guess here. Technobabble?"
-- Star Trek: Deep Space Nine

Primary concepts: vector space, subspace, basis vectors, span, dimension, linear independence (again), columnspace.

Online textbook (Hefferon): the chapter on vector spaces. Review L-0 if needed.

Note: Two of these labs (2-1 and 2-2) build up terms and ideas that are essential for understanding the blend of algebra and geometry in the labs that follow (2-4 through 2-7). Those later labs produce some really useful stuff (so you have to wade through the preliminaries). Lab 2-3 is mostly a summary of the good stuff from 2-1 and 2-2, with a lovely problem set at the end.

Recall: any matrix can be viewed as a set of column vectors or as a set of row vectors. We can use that idea to build a relationship between matrices, vectors, equations and geometry. Prepare yourself for lots of new words.

In the following discussion, it is assumed that all vectors are located; this means they have their little tails squarely parked at the origin. Also recall that the notations R^1, R^2, R^3, etc. stand for spaces of real numbers, which can be represented geometrically: a line, a plane, 3-space, hyperspace, and whatever higher-dimensional spaces are called.

Truths:

A. Any vector V by itself (including the 0 vector) defines a location in space relative to the origin, aka a point. A linear combination of any two vectors is also a vector and therefore also represents a point.

B. Taking all possible scalar multiples of one nonzero vector defines a line: the vector defines the orientation of the line, and the scalars stretch/squeeze the vector in either that direction or the opposite direction. Thus line L = cV, where c may be positive or negative. To move the same line around, just add a second vector: L = P + cV. However, the line itself has only one degree of freedom, represented by the scalar c.

C.
If two vectors are nonzero and point in the same direction (the vectors are parallel, i.e. dependent), all possible linear combinations of the two still define the same line. Thus for parallel V and W, line L = cV + dW. It's as if W adds nothing new.

D. All possible linear combinations of any two independent nonzero vectors V and W in any R^n define a plane in that space: P = cV + dW is a plane.

E. Taking all linear combinations of any three independent nonzero vectors defines a three-dimensional space, which could be all of R^3 or some subset of a larger-dimensioned space.

Example: The expression c v1 + d v2 = b, where v1 and v2 are two independent vectors in R^2, defines a linear combination of the two vectors. Given any vector b, we can get there from the origin with the appropriate choices of c and d. In other words, b is in the plane defined by the two vectors, and the equation c v1 + d v2 = b has a solution consisting of an ordered pair (c, d). In other other words, this is the matrix problem Ax = b, where x = (c, d) and the two columns of A are v1 and v2, which are independent (because the two vectors they came from are independent). Thus there must be a solution for every vector b!

All this talk about vectors leads to an important definition:

Vector space: a set of vectors together with the two algebraic operations required for linear combination. A vector space has the following useful properties:

1. The sum of any two elements in the space is also in the space.

2. The product of any element in the space with a scalar is also in the space.

3. The algebraic properties of vector addition and scalar multiplication hold for any vector in the vector space.

Once you're in a vector space, you can't get out. Oh, great, there are existentialist video games. No wonder adolescent depression is epidemic.

We could have said all that with a single, somewhat formal statement:

Vector spaces are closed with respect to linear combination.

Or: Any linear combination of vectors defines a vector space.

Examples: the zero vector by itself defines a very dull vector space. However, a set of vectors excluding the zero vector is NOT a vector space: since it is possible to subtract one vector from itself (and to multiply any vector by the scalar 0), the zero vector MUST be an element of ALL vector spaces. Vector spaces cannot have holes; nor can they be just a bounded portion of another space (like taking one quadrant out of a plane).

If the vectors in the examples above are all in R^3, we say that the vector space defined in each case is a subspace of R^3. For example, a line and a plane (both perfectly good vector spaces in their own right) are subspaces of R^3.

Read carefully examples 1.1 through 1.8 in Hefferon's chapter (pages 80-84). Practice the exercises on p. 88.

More truths, aka wonderful properties of vector spaces and their subspaces:

F. A subspace, like any vector space, contains every linear combination of the vectors in the subspace. Vectors that can be combined to make the subspace are said to span that subspace.
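To make "closed with respect to linear combination" concrete, here is a small Python sketch (an illustration, not part of the lab) contrasting a line through the origin, which is a subspace, with a shifted line, which is not. The two lines, y = 2x and y = 2x + 1, are made-up examples:

```python
# Membership tests for two candidate subsets of R^2.
def on_line(p):
    """The line y = 2x, which passes through the origin."""
    return p[1] == 2 * p[0]

def on_shifted(p):
    """The line y = 2x + 1, which misses the origin."""
    return p[1] == 2 * p[0] + 1

u, v = (1, 2), (3, 6)                  # both lie on y = 2x
s = (u[0] + v[0], u[1] + v[1])         # their sum
print(on_line(s))                      # True: sums stay on the line
print(on_line((5 * u[0], 5 * u[1])))   # True: scalar multiples stay too
print(on_line((0, 0)))                 # True: the zero vector is there

a, b = (0, 1), (1, 3)                  # both lie on y = 2x + 1
t = (a[0] + b[0], a[1] + b[1])
print(on_shifted(t))                   # False: the sum (1, 4) falls off the line
print(on_shifted((0, 0)))              # False: no zero vector, so no subspace
```

Exact integer arithmetic keeps the equality tests honest; with floating-point vectors you would compare within a tolerance instead.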

G. The representation of any vector as a linear combination of two nonparallel vectors is unique: in the example c v1 + d v2 = b above, for every vector b there is one and only one pair of scalars c and d. The two vectors are said to be basis vectors for the vector space so defined. Together, the basis vectors span the subspace. In a slightly confusing manner, a set of basis vectors is often called a basis. In a more confusing manner, the plural of basis is bases (pronounced bay-sees, which has nothing to do with baseball). The answer to "how many bases span a baseball field?" is not 4.

Example: The natural basis for R^2 consists of the unit vectors along the familiar coordinate axes, (1, 0) and (0, 1); but that is really just an arbitrary convention. In the bygone days of more puffy nomenclature, this would also be called the canonical basis. What is the natural basis of R^3? Of R^7? If we lumped the canonical basis vectors for a given space together as columns, what kind of matrix would we obtain?

Since the basis vectors (by definition) span their vector space, the minimum number of basis vectors necessary to define the entire vector space is known as the dimension of that space.

Question: How many different bases are there for a given nonzero vector space?

Here is the vector way of looking at linear independence: for any two vectors V and W in a given vector space, the 0 vector in that space, and scalars x and y: if the only solution to the equation xV + yW = 0 is x = y = 0, then V and W are linearly independent. Otherwise, they are linearly dependent.
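For two vectors in R^2, this test can be automated: xV + yW = 0 forces x = y = 0 exactly when the 2x2 matrix with columns V and W has a nonzero determinant. A minimal Python sketch (the particular vectors are made-up examples):

```python
def independent_2d(v, w):
    """True when xV + yW = 0 has only the trivial solution x = y = 0,
    i.e. when the determinant of the 2x2 matrix [V W] is nonzero."""
    return v[0] * w[1] - v[1] * w[0] != 0

print(independent_2d((1, 0), (0, 1)))   # True: the natural basis
print(independent_2d((1, 2), (3, 6)))   # False: W = 3V, so dependent
# For the dependent pair, x = 3, y = -1 works: 3*(1,2) + (-1)*(3,6) = (0,0).
```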

Why this is extremely important: if the equation Ax = 0 has only the trivial solution (x = the zero vector), the matrix A consists of a set of linearly independent column vectors. This begets another formal statement:

If the matrix A has linearly independent columns, the equation Ax = 0 has only the trivial solution.

Example: Are two vectors V and W, where W is a scalar multiple of V, linearly independent? If so, what is the dimension of the subspace they span? The answer is yes if and only if the unique solution to xV + yW = 0 is the trivial x = y = 0. However, if W = kV for some nonzero scalar k, then kV + (-1)W = 0, so V and W must be dependent. In other words, V and W lie along the same line: they are scalar multiples of one another. Thus the dimension of the subspace spanned by V and W is 1 (that subspace is a line). The matrix A formed by using V and W as columns therefore has a nonzero solution to Ax = 0. What is that solution vector x?

We all need a little more space

An m by n matrix consists of n column vectors in R^m. Taking all linear combinations of the columns defines a subspace of R^m known as the columnspace of the matrix. Note that the columnspace is not necessarily the same as R^m, because there may only be up to n basis vectors and m might not equal n. Confusing? How about this example:

The 4x3 matrix A consists of 3 column vectors (n = 3, the number of columns in A) in R^4 (m = 4, the number of rows in A). In short: the columnspace of an m by n matrix is a subspace of R^m. The columnspace of A is often written as C(A). By definition, C(A) = all linear combinations of the column vectors of A. Can we write this columnspace as Ax = x1 c1 + x2 c2 + x3 c3, where c1, c2, c3 are the columns of A, for the vector x = (x1, x2, x3) in R^3? And doesn't that look familiar?

But wait a bit -- are the columns of A independent of one another? In other words, can we combine two of them to get the other one (i.e., can we remove one of the columns from the matrix and still get the same subspace)? Yes! Here the third column is the sum of the other two; equivalently, column 1 is the difference between columns 3 and 2, etc. By convention, we keep the leftmost columns; here we throw column 3 out of our basis for the columnspace. So C(A) is actually a two-dimensional subspace of R^4, defined by C(A) = {x1 c1 + x2 c2}, and that represents a two-dimensional object (a plane) in 4-dimensional hyperspace. Which always brings to mind the all-important words of Han Solo in Star Wars (episode 4): "Traveling through hyperspace ain't like dusting crops, boy!"

Question: What would happen if you row reduced this matrix A? How many pivots would you find?
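Finding the pivots by machine makes the answer easy to check. The sketch below implements Gauss-Jordan elimination with exact fractions and applies it to a made-up 4x3 matrix with the same structure as the one above (third column = first column + second column); the pivot columns of the original matrix form the basis for C(A):

```python
from fractions import Fraction

def rref(M):
    """Return (reduced row echelon form of M, list of pivot column indices)."""
    A = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(A), len(A[0])
    pivots, r = [], 0
    for c in range(cols):
        pr = next((i for i in range(r, rows) if A[i][c] != 0), None)
        if pr is None:
            continue                              # no pivot in this column
        A[r], A[pr] = A[pr], A[r]                 # swap the pivot row up
        pivot = A[r][c]
        A[r] = [x / pivot for x in A[r]]          # scale the pivot to 1
        for i in range(rows):                     # clear the rest of the column
            if i != r:
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
    return A, pivots

# Made-up 4x3 example: column 3 = column 1 + column 2.
A = [[1, 0, 1],
     [2, 1, 3],
     [0, 4, 4],
     [1, 1, 2]]
R, pivots = rref(A)
print(pivots)        # [0, 1] -- two pivots, so C(A) is two-dimensional
basis = [[row[c] for row in A] for c in pivots]
print(basis)         # the first two columns of A form a basis for C(A)
```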

Example: Find a basis for the columnspace of a given matrix A. Hmm -- which columns are dependent, if any? All we need do is row reduce the matrix A; the basis vectors we seek will be the columns of A that are pivot columns. Enjoy some GJE -- then write out the basis for C(A).

Let them eat space

If the columns of a matrix define its columnspace C(A), the rows of the same matrix can define its rowspace, R(A). The rows of A are a basis for the rowspace only when they are independent.

Rowspace example, using row reduction again: reduce a 3x4 matrix A to its reduced form R. (The last step includes a multiplication of a row by -1.) There were two pivots; note the emergence of a 2x2 identity matrix in the pivot positions at the end. The two basis vectors for the rowspace are the first two rows of the final matrix R; the row full of zeros is not part of the basis.

All these operations were row operations, and that gives us row equivalence; the rowspace of the final matrix R is the same as the rowspace of A. The steps are reversible, so the rows of A are in the rowspace of R. However, the columnspace of A is NOT the same as the columnspace of R.

For an mxn matrix A, the columnspace of A consists of vectors in R^m: columnspace vectors have m elements, corresponding to the m rows of A. On the other hand, the rowspace of A consists of row vectors of length n, corresponding to R^n. In the A matrix above, the rowspace is in R^4 while the columnspace is in R^3.

We prefer column vectors to row vectors. So let's take the transpose of A and find its columnspace. Thus R(A) = C(A^T), or: the rowspace is all combinations of the columns of A^T. If A is mxn, then A^T is nxm and C(A^T) is a subspace of R^n.
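The relationship R(A) = C(A^T) can be checked numerically: the nonzero rows of the reduced form of A and the pivot columns of A^T give bases for the same subspace of R^n. Here is a Python sketch using exact-fraction Gauss-Jordan elimination on a made-up 3x4 matrix (third row = first row + second row):

```python
from fractions import Fraction

def rref(M):
    """Return (reduced row echelon form of M, list of pivot column indices)."""
    A = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(A), len(A[0])
    pivots, r = [], 0
    for c in range(cols):
        pr = next((i for i in range(r, rows) if A[i][c] != 0), None)
        if pr is None:
            continue                              # no pivot in this column
        A[r], A[pr] = A[pr], A[r]                 # swap the pivot row up
        pivot = A[r][c]
        A[r] = [x / pivot for x in A[r]]          # scale the pivot to 1
        for i in range(rows):                     # clear the rest of the column
            if i != r:
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
    return A, pivots

# Made-up 3x4 example: row 3 = row 1 + row 2.
A = [[1, 2, 0, 1],
     [0, 1, 1, 0],
     [1, 3, 1, 1]]
R, _ = rref(A)
row_basis = [row for row in R if any(x != 0 for x in row)]
print(len(row_basis))    # 2 -- the two nonzero rows of R span R(A)

At = [list(col) for col in zip(*A)]          # transpose of A (4x3)
_, pivots = rref(At)
col_basis = [[row[c] for row in At] for c in pivots]
print(col_basis)         # the pivot columns of A^T are rows 1 and 2 of A
print(len(row_basis) == len(col_basis))      # True: both report the rank, 2
```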

Practice: Find a basis for the columnspace of the given 5-column matrix A, whose bottom row is all zeros.

Hint: Can any of the columns of A be expressed as a linear combination of other columns? What does that say about that column's membership in the basis?

Ans: What happens when you row reduce this matrix? There are 3 independent columns: the first, the second, and the last. The other two can be written as linear combinations of those. The rank of this matrix is 3.

Now consider the matrix B in relation to the matrix A above. Because of the last row of A, there's no operation we can do on the columns of A to obtain B. But how do the rows of A relate to the rows of B? And how do we operate on the rows of a matrix? Use this to find a basis for Col(B).

Ans: A and B row reduce to exactly the same thing. How does that help?
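The answer leans on a key fact: row-equivalent matrices have the same rowspace, and the RREF is a canonical form, so A and B have the same rowspace exactly when they reduce to the same RREF. A Python sketch with made-up 2x3 matrices, where each row of B is a combination of the rows of A:

```python
from fractions import Fraction

def rref(M):
    """Return (reduced row echelon form of M, list of pivot column indices)."""
    A = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(A), len(A[0])
    pivots, r = [], 0
    for c in range(cols):
        pr = next((i for i in range(r, rows) if A[i][c] != 0), None)
        if pr is None:
            continue                              # no pivot in this column
        A[r], A[pr] = A[pr], A[r]                 # swap the pivot row up
        pivot = A[r][c]
        A[r] = [x / pivot for x in A[r]]          # scale the pivot to 1
        for i in range(rows):                     # clear the rest of the column
            if i != r:
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
    return A, pivots

A = [[1, 0, 2],
     [0, 1, 1]]
B = [[1, 1, 3],      # = row 1 of A + row 2 of A
     [2, 1, 5]]      # = 2*(row 1 of A) + row 2 of A
Ra, _ = rref(A)
Rb, _ = rref(B)
print(Ra == Rb)      # True: identical RREFs, hence identical rowspaces
```

Because the RREF is unique for a given rowspace, this comparison works for any pair of matrices of the same shape, not just this example.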