Chapter 3. Vector spaces

Lecture notes for MA1111. P. Karageorgis, pete@maths.tcd.ie

Linear combinations

Suppose that v1, v2, ..., vn and v are vectors in R^m.

Definition 3.1 (Linear combination). We say that v is a linear combination of v1, v2, ..., vn if there exist scalars x1, x2, ..., xn such that

    v = x1 v1 + x2 v2 + ... + xn vn.

Geometrically, the linear combinations of a nonzero vector form a line. The linear combinations of two nonzero vectors form a plane, unless the two vectors are collinear, in which case they form a line.

Theorem 3.2 (Expressing a vector as a linear combination). Let B denote the matrix whose columns are the vectors v1, v2, ..., vn. Expressing v = x1 v1 + x2 v2 + ... + xn vn as a linear combination of the given vectors is then equivalent to solving the linear system Bx = v.

Linear combinations: Example

We express v as a linear combination of v1, v2, v3 in the case that

    v1 = (1, 1, 2),   v2 = (1, 2, 4),   v3 = (3, 1, 2),   v = (3, 2, 4).

Let B denote the matrix whose columns are v1, v2, v3 and consider the linear system Bx = v. Using row reduction, we then get

    [B v] = [ 1  1  3 | 3 ]        [ 1  0  5 |  4 ]
            [ 1  2  1 | 2 ]  -->   [ 0  1 -2 | -1 ].
            [ 2  4  2 | 4 ]        [ 0  0  0 |  0 ]

In particular, the system has infinitely many solutions given by x1 = 4 - 5x3 and x2 = -1 + 2x3. Let us merely pick one solution, say, the one with x3 = 0. Then we get x1 = 4 and x2 = -1, so

    v = x1 v1 + x2 v2 + x3 v3 = 4v1 - v2.
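The row reduction above is easy to double-check by machine. The following sketch (not part of the original notes) uses SymPy's exact-arithmetic rref to reproduce the computation:

```python
from sympy import Matrix

# Columns of B are v1, v2, v3 from the example; v is the target vector.
B = Matrix([[1, 1, 3],
            [1, 2, 1],
            [2, 4, 2]])
v = Matrix([3, 2, 4])

# Row reduce the augmented matrix [B | v].
R, pivots = B.row_join(v).rref()
assert R == Matrix([[1, 0, 5, 4],
                    [0, 1, -2, -1],
                    [0, 0, 0, 0]])
assert pivots == (0, 1)   # x3 is free, so there are infinitely many solutions

# Pick the solution with x3 = 0, namely x1 = 4 and x2 = -1.
assert 4 * B[:, 0] - B[:, 1] == v   # v = 4*v1 - v2
```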

Linear independence

Suppose that v1, v2, ..., vn are vectors in R^m.

Definition 3.3 (Linear independence). We say that v1, v2, ..., vn are linearly independent if none of them is a linear combination of the others. This is actually equivalent to saying that

    x1 v1 + x2 v2 + ... + xn vn = 0   implies   x1 = x2 = ... = xn = 0.

Theorem 3.4 (Checking linear independence in R^m). The following statements are equivalent for each m x n matrix B.

1. The columns of B are linearly independent.
2. The system Bx = 0 has only the trivial solution x = 0.
3. The reduced row echelon form of B has a pivot in every column.

Linear independence: Example

We check v1, v2, v3, v4 for linear independence in the case that

    v1 = (1, 1, 2),   v2 = (1, 2, 3),   v3 = (4, 5, 9),   v4 = (3, 1, 2).

Letting B be the matrix whose columns are these vectors, we get

    B = [ 1  1  4  3 ]        [ 1  0  3  0 ]
        [ 1  2  5  1 ]  -->   [ 0  1  1  0 ].
        [ 2  3  9  2 ]        [ 0  0  0  1 ]

Since the third column does not contain a pivot, we conclude that the given vectors are not linearly independent. On the other hand, the 1st, 2nd and 4th columns contain pivots, so the vectors v1, v2, v4 are linearly independent. As for v3, it can be expressed as a linear combination of the other three vectors.
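The pivot test of Theorem 3.4 can be carried out mechanically. This sketch (SymPy, not part of the notes) reproduces the example:

```python
from sympy import Matrix

# Columns are v1, v2, v3, v4 from the example above.
B = Matrix([[1, 1, 4, 3],
            [1, 2, 5, 1],
            [2, 3, 9, 2]])

R, pivots = B.rref()
assert pivots == (0, 1, 3)   # no pivot in the third column
assert B.rank() < B.cols     # hence the four vectors are linearly dependent

# The pivot columns v1, v2, v4 are independent, while v3 = 3*v1 + v2.
assert Matrix.hstack(B[:, 0], B[:, 1], B[:, 3]).rank() == 3
assert B[:, 2] == 3 * B[:, 0] + B[:, 1]
```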

Span and complete sets

Suppose that v1, v2, ..., vn are vectors in R^m.

Definition 3.5 (Span and completeness). The set of all linear combinations of v1, v2, ..., vn is called the span of these vectors and it is denoted by Span{v1, v2, ..., vn}. If the span is all of R^m, we say that v1, v2, ..., vn form a complete set for R^m.

Theorem 3.6 (Checking completeness in R^m). The following statements are equivalent for each m x n matrix B.

1. The columns of B form a complete set for R^m.
2. The system Bx = y has a solution for every vector y in R^m.
3. The reduced row echelon form of B has a pivot in every row.
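Condition 3 of Theorem 3.6 says that completeness is a rank condition: the columns of B span R^m exactly when rank(B) = m. A small sketch (SymPy; the helper name is my own):

```python
from sympy import Matrix, eye

def is_complete_for_Rm(B):
    # The rref of B has a pivot in every row iff rank(B) equals
    # the number of rows m, i.e. iff the columns span R^m.
    return B.rank() == B.rows

# Two vectors can never span R^3: at most two pivots for three rows.
assert not is_complete_for_Rm(Matrix([[1, 0], [0, 1], [0, 0]]))

# The columns of the 3x3 identity matrix clearly span R^3.
assert is_complete_for_Rm(eye(3))
```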

Bases of R^m

Definition 3.7 (Basis of R^m). A basis of R^m is a set of linearly independent vectors which form a complete set for R^m. Every basis of R^m consists of m vectors.

Theorem 3.8 (Basis criterion). The following statements are equivalent for each m x m matrix B.

1. The columns of B form a complete set for R^m.
2. The columns of B are linearly independent.
3. The columns of B form a basis of R^m.
4. The matrix B is invertible.

Theorem 3.9 (Reduction/extension to a basis). A complete set of vectors can always be reduced to a basis, and a set of linearly independent vectors can always be extended to a basis.
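For a square matrix, Theorem 3.8 lets one test all four conditions at once, for instance via invertibility. A quick sketch in SymPy (the matrices are illustrative choices, not from the notes):

```python
from sympy import Matrix

B = Matrix([[1, 1],
            [1, 2]])
assert B.det() != 0    # B is invertible ...
assert B.rank() == 2   # ... so its columns form a basis of R^2

C = Matrix([[1, 2],
            [2, 4]])   # second column is twice the first
assert C.det() == 0    # not invertible: columns are neither independent
assert C.rank() == 1   # nor complete, hence not a basis
```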

Subspaces of R^m

Definition 3.10 (Subspace of R^m). A subspace V of R^m is a nonempty subset of R^m which is closed under addition and scalar multiplication. In other words, it satisfies

1. v in V and w in V implies v + w in V,
2. v in V and c in R implies cv in V.

Every subspace of R^m must contain the zero vector. Moreover, lines and planes through the origin are easily seen to be subspaces of R^m.

Definition 3.11 (Basis and dimension). A basis of a subspace V is a set of linearly independent vectors whose span is equal to V. If a subspace has a basis consisting of n vectors, then every basis of the subspace must consist of n vectors. We usually refer to n as the dimension of the subspace.

Column space and null space

When it comes to subspaces of R^m, there are three important examples.

1. Span. If we are given some vectors v1, v2, ..., vn in R^m, then their span is easily seen to be a subspace of R^m.
2. Null space. The null space of a matrix A is the set of all vectors x such that Ax = 0. It is usually denoted by N(A).
3. Column space. The column space of a matrix A is the span of the columns of A. It is usually denoted by C(A).

Definition 3.12 (Standard basis). Let e_i denote the vector whose ith entry is equal to 1, all other entries being zero. Then e_1, e_2, ..., e_m form a basis of R^m which is known as the standard basis of R^m. Given any matrix A, we have A e_i = ith column of A.

Finding bases for the column/null space

Theorem 3.13 (A useful characterisation). A vector x is in the column space of a matrix A if and only if x = Ay for some vector y. It is in the null space of A if and only if Ax = 0.

To find a basis for the column space of a matrix A, we first compute its reduced row echelon form R. Then the columns of R that contain pivots form a basis for the column space of R, and the corresponding columns of A form a basis for the column space of A.

The null space of a matrix A is equal to the null space of its reduced row echelon form R. To find a basis for the latter, we write down the equations for the system Rx = 0, eliminate the leading variables and express the solutions of this system in terms of the free variables.

The dimension of the column space is equal to the number of pivots and the dimension of the null space is equal to the number of free variables. The sum of these dimensions is the number of columns.
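The final dimension count (pivots plus free variables equals the number of columns) is the rank-nullity theorem, and it is easy to spot-check. A sketch in SymPy (the matrix is an illustrative choice, not from the notes):

```python
from sympy import Matrix

# An illustrative matrix: rank 1, so two of its three columns are free.
A = Matrix([[1, 2, 3],
            [2, 4, 6]])

n_pivots = len(A.rref()[1])     # number of pivots = dim of the column space
n_free = len(A.nullspace())     # number of free variables = dim of the null space
assert n_pivots == 1
assert n_free == 2
assert n_pivots + n_free == A.cols   # the sum is the number of columns
```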

Column space: Example

We find a basis for the column space of the matrix

    A = [ 1  2  4  5  4 ]
        [ 3  1  7  2  3 ].
        [ 2  1  5  1  5 ]

The reduced row echelon form of this matrix is given by

    R = [ 1  0  2  0  0 ]
        [ 0  1  1  0  7 ].
        [ 0  0  0  1 -2 ]

Since the pivots of R appear in the 1st, 2nd and 4th columns, a basis for the column space of A is formed by the corresponding columns

    v1 = (1, 3, 2),   v2 = (2, 1, 1),   v4 = (5, 2, 1).
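The same computation can be checked in SymPy (a sketch, not part of the notes): rref reports the pivot columns, and columnspace applies exactly the rule above, returning the corresponding columns of A.

```python
from sympy import Matrix

A = Matrix([[1, 2, 4, 5, 4],
            [3, 1, 7, 2, 3],
            [2, 1, 5, 1, 5]])

R, pivots = A.rref()
assert pivots == (0, 1, 3)   # pivots in the 1st, 2nd and 4th columns

# The corresponding columns of A (not of R!) form a basis of C(A).
basis = [A[:, j] for j in pivots]
assert basis[0] == Matrix([1, 3, 2])
assert A.columnspace() == basis
```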

Null space: Example, page 1

To find a basis for the null space of a matrix A, one needs to find a basis for the null space of its reduced row echelon form R. Suppose, for instance, that the reduced row echelon form is

    R = [ 1  0 -2  3 ]
        [ 0  1 -4  1 ].

Its null space consists of the vectors x such that Rx = 0. Writing the corresponding equations explicitly, we must then solve the system

    x1 - 2x3 + 3x4 = 0,
    x2 - 4x3 + x4 = 0.

Once we now eliminate the leading variables, we conclude that

    x = (x1, x2, x3, x4) = (2x3 - 3x4, 4x3 - x4, x3, x4).

Null space: Example, page 2

This determines the vectors x which lie in the null space of R. They can all be expressed in terms of the free variables by writing

    x = (2x3 - 3x4, 4x3 - x4, x3, x4)
      = x3 (2, 4, 1, 0) + x4 (-3, -1, 0, 1)
      = x3 v + x4 w

for some particular vectors v, w. Since the variables x3, x4 are both free, this means that the null space of R is the span of v, w.

To show that v, w form a basis for the null space, we also need to check linear independence. Suppose that x3 v + x4 w = 0 for some scalars x3, x4. We must then have x = 0 by the above. Looking at the last two entries of x, we conclude that x3 = x4 = 0.
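SymPy's nullspace method automates these two pages: it sets each free variable to 1 in turn and returns the resulting basis vectors. A sketch checking the example (not part of the notes):

```python
from sympy import Matrix

# The reduced row echelon form from the example above.
R = Matrix([[1, 0, -2, 3],
            [0, 1, -4, 1]])

basis = R.nullspace()   # one basis vector per free variable
assert len(basis) == 2
assert basis[0] == Matrix([2, 4, 1, 0])     # v: take x3 = 1, x4 = 0
assert basis[1] == Matrix([-3, -1, 0, 1])   # w: take x3 = 0, x4 = 1

for b in basis:
    assert R * b == Matrix([0, 0])   # each basis vector solves Rx = 0
```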

Vector spaces

Definition 3.14 (Vector space). A set V is called a vector space if it is equipped with the operations of addition and scalar multiplication in such a way that the usual rules of arithmetic hold. The elements of V are generally regarded as vectors.

We assume that addition is commutative and associative with a zero element 0 which satisfies 0 + v = v for all v in V. We also assume that the distributive laws hold and that 1v = v for all v in V. The rules of arithmetic are usually either obvious or easy to check.

We call V a real vector space if the scalars are real numbers. The scalars could actually belong to any field such as Q, R and C. A field is equipped with addition and multiplication; it contains two elements that play the roles of 0 and 1, and every nonzero element has a multiplicative inverse. For instance, Q, R and C are fields, but Z is not.

Examples of vector spaces

1. The zero vector space {0} consisting of the zero vector alone.
2. The vector space R^m consisting of all vectors in R^m.
3. The space M_mn of all m x n matrices.
4. The space of all (continuous) functions.
5. The space of all polynomials.
6. The space P_n of all polynomials of degree at most n.

On the other hand, the set of all matrices (of arbitrary sizes) is not a vector space, since matrices of different sizes cannot be added. The set of polynomials of degree exactly n is not a vector space either, since the sum of two such polynomials may have smaller degree.

Concepts such as linear combination, span and subspace are defined in terms of vector addition and scalar multiplication, so one may naturally extend these concepts to any vector space.

Vector space concepts

- Linear combination: v = x1 v1 + ... + xn vn for some scalars xi.
- Span: the set of all linear combinations of some vectors.
- Complete set for V: a set of vectors whose span is equal to V.
- Subspace: a nonempty subset which is closed under addition and scalar multiplication. For instance, the span is always a subspace.
- Linearly independent: x1 v1 + ... + xn vn = 0 implies that xi = 0 for all i.
- Basis of V: a set of linearly independent vectors that span V.
- Basis of a subspace W: linearly independent vectors that span W.
- Dimension of a subspace: the number of vectors in a basis.

The zero element 0 could stand for a vector, a matrix or a polynomial. In these cases, we require that all entries/coefficients are zero.

Example: Linear combinations in P2

We express f = 2x^2 + 4x - 5 as a linear combination of

    f1 = x^2 - 1,   f2 = x - 1,   f3 = x.

This amounts to finding scalars c1, c2, c3 such that f = c1 f1 + c2 f2 + c3 f3. In other words, we need to find scalars c1, c2, c3 such that

    2x^2 + 4x - 5 = c1 x^2 - c1 + c2 x - c2 + c3 x.

Comparing coefficients, we end up with the system

    c1 = 2,   c2 + c3 = 4,   -c1 - c2 = -5.

This has a unique solution, namely c1 = 2, c2 = 3 and c3 = 1. One may thus express f as the linear combination f = 2f1 + 3f2 + f3.
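Comparing coefficients is itself a linear system, so it can be solved mechanically. A sketch in SymPy (not part of the notes) reproducing the computation:

```python
from sympy import symbols, expand, solve

x, c1, c2, c3 = symbols('x c1 c2 c3')

f = 2*x**2 + 4*x - 5
f1, f2, f3 = x**2 - 1, x - 1, x

# c1*f1 + c2*f2 + c3*f3 - f must vanish identically in x,
# so every coefficient of the expanded difference must be zero.
residual = expand(c1*f1 + c2*f2 + c3*f3 - f)
equations = [residual.coeff(x, k) for k in range(3)]

sol = solve(equations, [c1, c2, c3])
assert sol == {c1: 2, c2: 3, c3: 1}   # f = 2*f1 + 3*f2 + f3
assert expand(2*f1 + 3*f2 + f3 - f) == 0
```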

Example: Linear independence in P2

We show that f1, f2, f3 are linearly independent in the case that

    f1 = x^2 - x,   f2 = x^2 - 1,   f3 = x + 1.

Suppose c1 f1 + c2 f2 + c3 f3 = 0 for some scalars c1, c2, c3. Then

    c1 x^2 - c1 x + c2 x^2 - c2 + c3 x + c3 = 0

and we may compare coefficients to find that

    c1 + c2 = 0,   c3 - c1 = 0,   c3 - c2 = 0.

This gives a system that can be easily solved. The last two equations give c1 = c3 = c2, so the first equation reduces to 2c1 = 0. Thus, the three coefficients are all zero and f1, f2, f3 are linearly independent.

Example: Basis for a subspace in P2

Let U be the subset of P2 which consists of all polynomials f(x) such that f(1) = 0. To find the elements of this subset, we note that

    f(x) = ax^2 + bx + c   ==>   f(1) = a + b + c.

Thus, f(x) lies in U if and only if a + b + c = 0. Using this equation to eliminate a, we conclude that all elements of U have the form

    f(x) = ax^2 + bx + c = (-b - c)x^2 + bx + c = b(x - x^2) + c(1 - x^2).

Since b, c are arbitrary, we have now expressed U as the span of two polynomials. To check those are linearly independent, suppose that b(x - x^2) + c(1 - x^2) = 0 for some constants b, c. One may then compare coefficients to find that b = c = 0. Thus, the two polynomials form a basis of U.
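Identifying ax^2 + bx + c with the coefficient vector (a, b, c) turns this into a null space computation, since the constraint f(1) = 0 reads a + b + c = 0. A sketch in SymPy (the identification is my own, not from the notes):

```python
from sympy import Matrix

# U corresponds to the null space of the 1x3 matrix [1 1 1],
# acting on coefficient vectors (a, b, c) of f(x) = a*x^2 + b*x + c.
M = Matrix([[1, 1, 1]])

basis = M.nullspace()
assert len(basis) == 2                  # U is two-dimensional
assert basis[0] == Matrix([-1, 1, 0])   # coefficients of x - x^2
assert basis[1] == Matrix([-1, 0, 1])   # coefficients of 1 - x^2
```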

Example: Basis for a subspace in M22

Let U be the subset of M22 which consists of all 2x2 matrices whose diagonal entries are equal. Then every element of U has the form

    A = [ a  b ],   a, b, c in R.
        [ c  a ]

The entries a, b, c are all arbitrary and we can always write

    A = a [ 1  0 ] + b [ 0  1 ] + c [ 0  0 ].
          [ 0  1 ]     [ 0  0 ]     [ 1  0 ]

This equation expresses U as the span of three matrices. To check these are linearly independent, suppose that

    a [ 1  0 ] + b [ 0  1 ] + c [ 0  0 ] = 0
      [ 0  1 ]     [ 0  0 ]     [ 1  0 ]

for some scalars a, b, c. Then the matrix A above must be zero, so its entries a, b, c are all zero. Thus, the three matrices form a basis of U.

Coordinate vectors in a vector space

Definition 3.15 (Coordinate vectors). Suppose V is a vector space that has v1, v2, ..., vn as a basis. Then every element v in V can be expressed as a linear combination

    v = x1 v1 + x2 v2 + ... + xn vn

for some unique coefficients x1, x2, ..., xn. The unique vector x in R^n that consists of these coefficients is called the coordinate vector of v with respect to the given basis.

By definition, the coordinate vector of v_i is the standard vector e_i. Identifying each element v in V with its coordinate vector x in R^n, one may identify the vector space V with the vector space R^n.

Coordinate vectors depend on the chosen basis. If we decide to use a different basis, then the coordinate vectors will change as well.

Coordinate vectors in R^n

Theorem 3.16 (Coordinate vectors in R^n). Suppose that v1, v2, ..., vn form a basis of R^n and let B denote the matrix whose columns are these vectors. To find the coordinate vector of v with respect to this basis, one needs to solve the system Bx = v. Thus, the coordinate vector of v is given explicitly by x = B^(-1) v.

In practice, the explicit formula is only useful when the inverse of B is easy to compute. This is the case when B is 2x2, for instance. The coordinate vector of w is usually found using the row reduction

    [ v1 v2 ... vn | w ]  -->  [ I_n | x ].

One may similarly deal with several vectors w_i using the row reduction

    [ v1 v2 ... vn | w1 ... wm ]  -->  [ I_n | x1 ... xm ].

The vector v is generally different from its coordinate vector x. These two coincide, however, when one is using the standard basis of R^n.
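A final sketch (SymPy; the basis and vector are illustrative choices, not from the notes) comparing the explicit formula x = B^(-1) v with direct solving:

```python
from sympy import Matrix, eye

# An illustrative basis v1 = (1, 1), v2 = (1, 2) of R^2 and a target v.
B = Matrix([[1, 1],
            [1, 2]])
v = Matrix([3, 5])

x = B.solve(v)              # coordinate vector: the solution of Bx = v
assert x == B.inv() * v     # agrees with the explicit formula
assert x == Matrix([1, 2])  # so v = 1*v1 + 2*v2
assert B * x == v

# With the standard basis, a vector coincides with its coordinate vector.
assert eye(2).solve(v) == v
```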