MATH 110: LINEAR ALGEBRA HOMEWORK #2


1.5: Linear Dependence and Linear Independence

Problem 1.
(a) False. The set {(1, 0), (0, 1), (0, -1)} is linearly dependent, but (1, 0) is not a linear combination of the other two vectors.
(b) False. If the zero vector 0 is in the set, then 1·0 = 0 is a nontrivial linear relation, so the set is linearly dependent.
(c) False. Without any vectors in the set, we cannot form any linear relations.
(d) False. Take the set in (a), and look at the subset {(1, 0), (0, 1)}.
(e) True, by the corollary to Theorem 1.6 on page 39.
(f) True. This is precisely the definition of linear independence.

Problem 2.
(b) Linearly independent. For suppose that

    [ 0  0 ]     [  1  -2 ]     [ -1   1 ]   [  a - b   -2a + b  ]
    [ 0  0 ]  =  a [ -1   4 ]  +  b [  2  -4 ]  =  [ -a + 2b  4a - 4b ]

for some a, b in R. The corresponding four linear equations are 0 = a - b, 0 = -2a + b, 0 = -a + 2b, and 0 = 4a - 4b. By the first equation, a = b, and so, by the second equation, 0 = -b. Therefore a = b = 0.
(d) Linearly dependent, because -2(x^3 - x) + (3/2)(2x^2 + 4) - 1(-2x^3 + 3x^2 + 2x + 6) = (-2x^3 + 2x + 3x^2 + 6) - (-2x^3 + 3x^2 + 2x + 6) = 0.

Problem 8.
(a) Suppose that a(1, 1, 0) + b(1, 0, 1) + c(0, 1, 1) = 0 for some a, b, c in R. Then (a + b, a + c, b + c) = 0, and so a, b, and c must satisfy the following system of linear equations: a + b = 0, a + c = 0, b + c = 0. Subtracting the second equation from the first, we obtain b - c = 0. Adding this to the third equation, we see that 2b = 0. Multiplying this equation by 1/2 on both sides, we obtain b = 0. Consequently, a = c = 0 as well. Hence S must be linearly independent.
(b) If F has characteristic 2, then we can no longer multiply the above equation 2b = 0 by 1/2 to conclude that b = 0. In fact, in this case, 2b = 0 is satisfied by all b in F. So if we let b be any nonzero element of F and take a = c = -b (= b), we will have a solution to a(1, 1, 0) + b(1, 0, 1) + c(0, 1, 1) = 0 where the scalars are nonzero. In particular, 1(1, 1, 0) + 1(1, 0, 1) + 1(0, 1, 1) = 0, and so S is linearly dependent.

Problem 9. Suppose that {u, v} is linearly dependent. Then au + bv = 0 for some a, b in F, where at least one of a and b is nonzero. Without loss of generality, we may assume that a ≠ 0. Since F is a field, a must have a multiplicative inverse a^(-1) in F. So u = (-a^(-1) b)v, i.e., u is a multiple of v.
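(Aside: the concrete computations in Problem 2 above can be checked mechanically. The sketch below uses only the Python standard library; the matrix and polynomial coefficients are transcribed from the solution above, and the small search grid for (a, b) is just a sanity check, not a proof.)

```python
from fractions import Fraction as F

# Problem 2(b): the two 2x2 matrices, flattened row by row into 4-vectors.
M1 = [F(1), F(-2), F(-1), F(4)]   # [[1, -2], [-1, 4]]
M2 = [F(-1), F(1), F(2), F(-4)]   # [[-1, 1], [2, -4]]

# a*M1 + b*M2 = 0 forces a - b = 0 and -2a + b = 0, hence a = b = 0.
# Confirm on a small integer grid: the only solution found is (0, 0).
solutions = [(a, b)
             for a in range(-3, 4) for b in range(-3, 4)
             if all(a * x + b * y == 0 for x, y in zip(M1, M2))]
assert solutions == [(0, 0)]

# Problem 2(d): polynomials as coefficient lists [x^0, x^1, x^2, x^3].
p1 = [F(0), F(-1), F(0), F(1)]    # x^3 - x
p2 = [F(4), F(0), F(2), F(0)]     # 2x^2 + 4
p3 = [F(6), F(2), F(3), F(-2)]    # -2x^3 + 3x^2 + 2x + 6

# -2*p1 + (3/2)*p2 - 1*p3 should be the zero polynomial.
combo = [F(-2) * c1 + F(3, 2) * c2 - c3 for c1, c2, c3 in zip(p1, p2, p3)]
assert combo == [F(0)] * 4
print("Problem 2 checks pass")
```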

Conversely, suppose that u or v is a multiple of the other. Without loss of generality, we may assume that u is a multiple of v, that is, u = av for some a in F. Hence 1u - av = 0, and so {u, v} is linearly dependent, by definition of linear dependence.

Problem 12. For Theorem 1.6, suppose S_1 is linearly dependent. Hence we can find elements s_1, s_2, ..., s_n in S_1 and scalars c_1, ..., c_n (not all 0) such that c_1 s_1 + c_2 s_2 + ... + c_n s_n = 0. Since S_1 ⊆ S_2, each s_i is also an element of S_2. Thus the above linear relation in S_1 gives us a linear relation in S_2, and we see that S_2 is linearly dependent. The corollary follows since it is precisely the contrapositive statement of Theorem 1.6. In other words, saying "A implies B" is the same as saying "not B implies not A".

Problem 13.
(a) Suppose that {u, v} is linearly independent and that a(u + v) + b(u - v) = 0 for some a, b in F. Then (a + b)u + (a - b)v = 0. Since {u, v} is linearly independent, this means that a + b = 0 and a - b = 0. Adding these two equations together, we get 2a = 0. Since the characteristic of F is not equal to two, we conclude that a = 0. Consequently, b = 0 as well. Hence {u + v, u - v} is linearly independent.
Conversely, suppose that {u + v, u - v} is linearly independent. Let s = u + v and t = u - v. Since {s, t} is linearly independent, by the previous paragraph, {s + t, s - t} is linearly independent. But s + t = 2u and s - t = 2v. Now, the fact that {2u, 2v} is linearly independent implies that {u, v} is linearly independent. (Since if au + bv = 0 for some a, b in F, then 2au + 2bv = a(2u) + b(2v) = 0, implying that a = b = 0.)
(b) Suppose that {u, v, w} is linearly independent and that a(u + v) + b(u + w) + c(v + w) = 0 for some a, b, c in F. Then (a + b)u + (a + c)v + (b + c)w = 0. Since {u, v, w} is linearly independent, this means that a + b = 0, a + c = 0, and b + c = 0. Subtracting the second equation from the first, we get b - c = 0. Adding this to the third equation, we get 2b = 0. Since the characteristic of F is not equal to two, we conclude that b = 0. Consequently, a = c = 0 as well. Hence {u + v, u + w, v + w} is linearly independent.
Conversely, suppose that {u + v, u + w, v + w} is linearly independent. Note that u = (1/2)(u + v) + (1/2)(u + w) - (1/2)(v + w), v = (1/2)(u + v) - (1/2)(u + w) + (1/2)(v + w), and w = -(1/2)(u + v) + (1/2)(u + w) + (1/2)(v + w). (We can divide by 2, since the characteristic of F is not equal to 2.) Suppose that au + bv + cw = 0 for some a, b, c in F. Then a((1/2)(u + v) + (1/2)(u + w) - (1/2)(v + w)) + b((1/2)(u + v) - (1/2)(u + w) + (1/2)(v + w)) + c(-(1/2)(u + v) + (1/2)(u + w) + (1/2)(v + w)) = 0, and so ((a + b - c)/2)(u + v) + ((a - b + c)/2)(u + w) + ((-a + b + c)/2)(v + w) = 0. Since {u + v, u + w, v + w} is linearly independent, we have that (a + b - c)/2 = 0, (a - b + c)/2 = 0, and (-a + b + c)/2 = 0. Clearly, the only solution to this system of equations is a = b = c = 0. So {u, v, w} is linearly independent.

Problem 17. If M is an n × n upper triangular matrix with nonzero diagonal entries, then

        [ a_11  a_12  a_13  ...  a_1n ]
        [  0    a_22  a_23  ...  a_2n ]
    M = [  0     0    a_33  ...  a_3n ]
        [  :     :     :          :   ]
        [  0     0     0    ...  a_nn ]

where each a_ij is in F, and a_11, a_22, ..., a_nn are not zero. To show that the columns are linearly independent, suppose that

        [ a_11 ]       [ a_12 ]             [ a_1n ]
        [  0   ]       [ a_22 ]             [ a_2n ]
    r_1 [  :   ] + r_2 [  0   ] + ... + r_n [  :   ] = 0
        [  0   ]       [  :   ]             [ a_nn ]

for some r_1, r_2, ..., r_n in F. This leads to a system of n linear equations:

    r_1 a_11 + r_2 a_12 + r_3 a_13 + ... + r_n a_1n = 0
               r_2 a_22 + r_3 a_23 + ... + r_n a_2n = 0
                          r_3 a_33 + ... + r_n a_3n = 0
                              ...
                 r_{n-1} a_{n-1,n-1} + r_n a_{n-1,n} = 0
                                           r_n a_nn = 0.

Now, since a_nn ≠ 0, the last equation implies that r_n = 0. Substituting this into the next-to-last equation, we see that r_{n-1} = 0. Continuing in this fashion, we conclude that r_1 = r_2 = ... = r_n = 0. Hence the columns are linearly independent.

1.6: Basis and Dimension

Problem (2). (Not from the book.) Recall that {E_ij : 1 ≤ i ≤ n, 1 ≤ j ≤ n} forms a basis for M_{n×n}(F), where E_ij is the matrix whose only nonzero entry is a 1 in the ith row and jth column. (See Example 1.6.3 or the lecture notes.) So any matrix M in M_{n×n}(F) can be written as M = Σ_{i,j=1}^n a_ij E_ij for some elements a_ij in F. Also, recall that M^t = Σ_{i,j=1}^n a_ij (E_ij)^t (see p. 17 and Exercise 1.3.3). Now (E_ij)^t = E_ji, so M^t = Σ_{i,j=1}^n a_ij E_ji = Σ_{i,j=1}^n a_ji E_ij. If M is symmetric, then M = M^t, and so 0 = M - M^t = Σ_{i,j=1}^n (a_ij - a_ji) E_ij. Since {E_ij : 1 ≤ i ≤ n, 1 ≤ j ≤ n} is linearly independent, this implies that a_ij = a_ji for all i, j (1 ≤ i ≤ n, 1 ≤ j ≤ n). Hence M = Σ_{i=1}^n a_ii E_ii + Σ_{i<j} a_ij (E_ij + E_ji), i.e., M can be written as a diagonal matrix plus a linear combination of matrices of the form E_ij + E_ji. Therefore, the set S = {E_ii : 1 ≤ i ≤ n} ∪ {E_ij + E_ji : 1 ≤ i < j ≤ n} spans the subspace W of symmetric matrices.
We claim that S is actually a basis for W. To show this, we need to prove that S is linearly independent. Suppose that a linear combination of the elements of this set is 0: Σ_{i=1}^n a_ii E_ii + Σ_{i<j} a_ij (E_ij + E_ji) = 0 for some elements a_ij in F. Then 0 = Σ_{i=1}^n a_ii E_ii + Σ_{i<j} a_ij E_ij + Σ_{i<j} a_ij E_ji. Since {E_ij : 1 ≤ i ≤ n, 1 ≤ j ≤ n} is linearly independent, this implies that each a_ij = 0. Hence S is linearly independent, and therefore a basis for W. Since S consists of n + (n^2 - n)/2 = (n^2 + n)/2 elements, the dimension of W is (n^2 + n)/2.

Problem 1.
(a) False. The empty set is a basis for the zero vector space. (See Example 1 in Sect. 1.6.)
(b) True. See Theorem 1.9.
(c) False. An infinite-dimensional vector space has no finite basis (e.g., P(F)).
(d) False. See Examples 2 and 15 in Sect. 1.6 for two different bases for V = R^4.
(e) True. See Corollary 1 in Sect. 1.6.

(f) False. By Example 1 in Sect. 1.6, P_n(F) has dimension n + 1.
(g) False. By Example 9 in Sect. 1.6, M_{m×n}(F) has dimension mn.
(h) True. See Theorem 1.10.
(i) False. For example, if we set v_1 = (1, 0), v_2 = (0, 1), and v_3 = (1, 1), then {v_1, v_2, v_3} generates R^2, but (1, 1) can be written both as 1·v_1 + 1·v_2 + 0·v_3 and as 0·v_1 + 0·v_2 + 1·v_3.
(j) True. See Theorem 1.11.
(k) True. By Theorem 1.11, if a subspace of V has dimension n, then that subspace is equal to V. The only vector space that has dimension 0 is the zero vector space.
(l) True. If S is linearly independent, then S spans V, by Corollary 2(b) (Sect. 1.6). Conversely, if S spans V, then S is linearly independent, by part (a) of that corollary.

Problem 5. No. By Example 8 (Sect. 1.6), the dimension of R^3 is 3, and it is thus generated by a 3-element set. So, by Theorem 1.10, any linearly independent subset of R^3 can have at most 3 elements. Therefore, {(1, 4, -6), (1, 5, 8), (2, 1, 1), (0, 1, 0)} cannot be linearly independent. Or, more directly, -15(1, 4, -6) - 13(1, 5, 8) + 14(2, 1, 1) + 111(0, 1, 0) = 0.

Problem 11. Since {u, v} is a basis, V must have dimension 2. So, by Corollary 2(b) (Sect. 1.6), to show that {u + v, au} and {au, bv} are bases, it is enough to show that they are linearly independent.
Suppose that c(u + v) + d(au) = 0 for some c, d in F. Then (c + da)u + cv = 0, and so c + da = 0 and c = 0, by the linear independence of {u, v}. But since a ≠ 0, d must also be zero. So {u + v, au} is linearly independent.
Suppose that c(au) + d(bv) = 0 for some c, d in F. Then ca = 0 = db, since {u, v} is linearly independent. Since a and b are nonzero, c = d = 0. So {au, bv} is linearly independent.

Problem 12. As in the previous problem, it is enough to show that {u + v + w, v + w, w} is linearly independent. Suppose that a(u + v + w) + b(v + w) + cw = 0 for some a, b, c in F. Then au + (a + b)v + (a + b + c)w = 0. Since {u, v, w} is linearly independent, a = 0, a + b = 0, and a + b + c = 0. Solving this linear system, we see that a = b = c = 0, and so {u + v + w, v + w, w} is linearly independent.

Problem 13. Subtracting the first equation from the second, we see that x_1 = x_2. Plugging this back into the first equation, we see that x_2 = x_3. Hence the solutions to this system are precisely the triples (x_1, x_2, x_3) of the form (x, x, x) = x(1, 1, 1). So {(1, 1, 1)} spans the subspace of R^3 consisting of solutions to the given system. Also, the set {(1, 1, 1)} is clearly linearly independent. Thus {(1, 1, 1)} is a basis for the subspace in question.

Problem 29.
(a) Let {u_1, u_2, ..., u_k} be a basis for W_1 ∩ W_2. Since {u_1, u_2, ..., u_k} is a linearly independent set that is contained in W_1, it can be extended to a basis {u_1, u_2, ..., u_k, v_1, v_2, ..., v_m} for W_1, by Corollary 2(c) (Sect. 1.6). Similarly, {u_1, u_2, ..., u_k} can be extended to a basis {u_1, u_2, ..., u_k, w_1, w_2, ..., w_p} for W_2. Now, each element of W_1 can be written as a linear combination of {u_1, ..., u_k, v_1, ..., v_m}, and each element of W_2 can be written as a linear combination of {u_1, ..., u_k, w_1, ..., w_p}. So each element of W_1 + W_2 can be written as a linear combination of {u_1, ..., u_k, v_1, ..., v_m, w_1, ..., w_p}, by definition of the sum. In other words, {u_1, ..., u_k, v_1, ..., v_m, w_1, ..., w_p} spans W_1 + W_2.

Also, note that span(u_1, ..., u_k, v_1, ..., v_m) ∩ span(w_1, ..., w_p) = {0}, since if there is a nonzero element t in both sets, then it is contained in W_1 ∩ W_2, and hence must be a linear combination of {u_1, ..., u_k}. Writing t = a_1 u_1 + ... + a_k u_k and t = b_1 w_1 + ... + b_p w_p for some a_1, ..., a_k, b_1, ..., b_p in F, we get a_1 u_1 + ... + a_k u_k - b_1 w_1 - ... - b_p w_p = 0. Since we assumed that t is nonzero, this gives a nontrivial linear combination of {u_1, ..., u_k, w_1, ..., w_p}, contradicting the assumption that this is a linearly independent set. Hence there can be no nonzero element in span(u_1, ..., u_k, v_1, ..., v_m) ∩ span(w_1, ..., w_p).
Now, suppose that a_1 u_1 + a_2 u_2 + ... + a_k u_k + b_1 v_1 + b_2 v_2 + ... + b_m v_m + c_1 w_1 + c_2 w_2 + ... + c_p w_p = 0 for some a_1, ..., a_k, b_1, ..., b_m, c_1, ..., c_p in F. Then a_1 u_1 + ... + a_k u_k + b_1 v_1 + ... + b_m v_m = -c_1 w_1 - ... - c_p w_p, and so each side of this expression must equal zero, by the previous paragraph. Since {u_1, ..., u_k, v_1, ..., v_m} and {w_1, ..., w_p} are linearly independent sets, this means that a_1 = ... = a_k = b_1 = ... = b_m = c_1 = ... = c_p = 0. Hence {u_1, ..., u_k, v_1, ..., v_m, w_1, ..., w_p} is linearly independent and thus is a basis for W_1 + W_2. In particular, W_1 + W_2 is finite-dimensional, and dim(W_1 + W_2) = k + m + p. Now, looking at our bases for W_1 ∩ W_2, W_1, and W_2, we see that dim(W_1 ∩ W_2) = k, dim(W_1) = k + m, and dim(W_2) = k + p. So dim(W_1) + dim(W_2) - dim(W_1 ∩ W_2) = k + m + p, and hence dim(W_1 + W_2) = dim(W_1) + dim(W_2) - dim(W_1 ∩ W_2).
(b) Since V = W_1 + W_2, by part (a), dim(V) = dim(W_1) + dim(W_2) - dim(W_1 ∩ W_2). Therefore dim(V) = dim(W_1) + dim(W_2) if and only if dim(W_1 ∩ W_2) = 0, if and only if W_1 ∩ W_2 = {0}, if and only if V is the direct sum of W_1 and W_2.

Problem 31.
(a) W_1 ∩ W_2 ⊆ W_2, so dim(W_1 ∩ W_2) ≤ dim(W_2) = n, by Theorem 1.11.
(b) By Problem 29, dim(W_1 + W_2) = dim(W_1) + dim(W_2) - dim(W_1 ∩ W_2) = m + n - dim(W_1 ∩ W_2) ≤ m + n, since dim(W_1 ∩ W_2) is a nonnegative integer.
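The dimension formula of Problem 29 can also be illustrated on a concrete example. The sketch below uses plain Python (no external libraries); the subspaces W1 = span{e1, e2} and W2 = span{e2, e3} of R^4 are chosen here purely for illustration and are not taken from the homework. Dimensions are computed as matrix ranks via Gaussian elimination over the rationals.

```python
from fractions import Fraction

def rank(rows):
    """Rank of a list of row vectors, via Gaussian elimination over Q."""
    rows = [[Fraction(x) for x in r] for r in rows]
    r = 0  # number of pivots found so far
    for col in range(len(rows[0])):
        pivot = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if pivot is None:
            continue
        rows[r], rows[pivot] = rows[pivot], rows[r]
        for i in range(r + 1, len(rows)):
            factor = rows[i][col] / rows[r][col]
            rows[i] = [a - factor * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

# W1 = span{e1, e2} and W2 = span{e2, e3} inside R^4.
W1 = [[1, 0, 0, 0], [0, 1, 0, 0]]
W2 = [[0, 1, 0, 0], [0, 0, 1, 0]]

dim_W1, dim_W2 = rank(W1), rank(W2)
dim_sum = rank(W1 + W2)  # list concatenation: a spanning set of W1 + W2
dim_intersection = 1     # W1 ∩ W2 = span{e2}, read off directly for this example

# dim(W1 + W2) = dim(W1) + dim(W2) - dim(W1 ∩ W2), i.e. 3 = 2 + 2 - 1.
assert dim_sum == dim_W1 + dim_W2 - dim_intersection
print("dim(W1 + W2) =", dim_sum)
```

Note that computing dim(W1 ∩ W2) from ranks alone would just reuse the formula being checked, which is why the intersection dimension is read off from the explicit example instead.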