
Economics 204 Summer/Fall 2011
Lecture 8, Wednesday August 3, 2011

Chapter 3. Linear Algebra

Section 3.1. Bases

Definition 1 Let X be a vector space over a field F. A linear combination of x_1, ..., x_n ∈ X is a vector of the form

  y = Σ_{i=1}^n α_i x_i  where α_1, ..., α_n ∈ F

α_i is the coefficient of x_i in the linear combination. If V ⊆ X, the span of V, denoted span V, is the set of all linear combinations of elements of V. The set V ⊆ X spans X if span V = X.

Definition 2 A set V ⊆ X is linearly dependent if there exist v_1, ..., v_n ∈ V and α_1, ..., α_n ∈ F, not all zero, such that

  Σ_{i=1}^n α_i v_i = 0

A set V ⊆ X is linearly independent if it is not linearly dependent. Thus V ⊆ X is linearly independent if and only if

  Σ_{i=1}^n α_i v_i = 0, v_i ∈ V ∀i  ⟹  α_i = 0 ∀i

Definition 3 A Hamel basis (often just called a basis) of a vector space X is a linearly independent set of vectors in X that spans X.

Example: {(1,0), (0,1)} is a basis for R² (this is the standard basis). {(1,1), (−1,1)} is another basis for R²: Suppose (x, y) = α(1,1) + β(−1,1) for some α, β ∈ R. Then

  x = α − β
  y = α + β
  x + y = 2α  ⟹  α = (x + y)/2

  y − x = 2β  ⟹  β = (y − x)/2

so

  (x, y) = ((x + y)/2)(1,1) + ((y − x)/2)(−1,1)

Since (x, y) is an arbitrary element of R², {(1,1), (−1,1)} spans R². If (x, y) = (0,0), then α = (0 + 0)/2 = 0 and β = (0 − 0)/2 = 0, so the coefficients are all zero, so {(1,1), (−1,1)} is linearly independent. Since it is linearly independent and spans R², it is a basis.

Example: {(1,0,0), (0,1,0)} is not a basis of R³, because it does not span R³.

Example: {(1,0), (0,1), (1,1)} is not a basis for R²: since

  1(1,0) + 1(0,1) + (−1)(1,1) = (0,0)

the set is not linearly independent.

Theorem 4 (Thm. 1.2)¹ Let V be a Hamel basis for X. Then every vector x ∈ X has a unique representation as a linear combination of a finite number of elements of V (with all coefficients nonzero).

Proof: Let x ∈ X. Since V spans X, we can write

  x = Σ_{s ∈ S₁} α_s v_s

where S₁ is finite, α_s ∈ F, α_s ≠ 0, and v_s ∈ V for each s ∈ S₁. Now, suppose

  x = Σ_{s ∈ S₁} α_s v_s = Σ_{s ∈ S₂} β_s v_s

where S₂ is finite, β_s ∈ F, β_s ≠ 0, and v_s ∈ V for each s ∈ S₂. Let S = S₁ ∪ S₂, and define

  α_s = 0 for s ∈ S \ S₁
  β_s = 0 for s ∈ S \ S₂

Then

  0 = x − x
    = Σ_{s ∈ S₁} α_s v_s − Σ_{s ∈ S₂} β_s v_s
    = Σ_{s ∈ S} α_s v_s − Σ_{s ∈ S} β_s v_s
    = Σ_{s ∈ S} (α_s − β_s) v_s

¹ See Corrections handout. The unique representation of 0 is the empty sum 0 = Σ_{i ∈ ∅} α_i b_i.

Since V is linearly independent, we must have α_s − β_s = 0, so α_s = β_s, for all s ∈ S. Then

  s ∈ S₁  ⟺  α_s ≠ 0  ⟺  β_s ≠ 0  ⟺  s ∈ S₂

so S₁ = S₂ and α_s = β_s for s ∈ S₁ = S₂, so the representation is unique.

Theorem 5 Every vector space has a Hamel basis.

Proof: The proof uses the Axiom of Choice. Indeed, the theorem is equivalent to the Axiom of Choice.

A closely related result, from which you can derive the previous result, shows that any linearly independent set V in a vector space X can be extended to a basis of X.

Theorem 6 If X is a vector space and V ⊆ X is linearly independent, then there exists a linearly independent set W ⊆ X such that

  V ⊆ W  and  span W = X

Theorem 7 Any two Hamel bases of a vector space X have the same cardinality (are numerically equivalent).

Proof: The proof depends on the so-called Exchange Lemma, whose idea we sketch. Suppose that V = {v_λ : λ ∈ Λ} and W = {w_γ : γ ∈ Γ} are Hamel bases of X. Remove one vector v_{λ₀} from V, so that it no longer spans (if it did still span, then v_{λ₀} would be a linear combination of other elements of V, and V would not be linearly independent). If w_γ ∈ span(V \ {v_{λ₀}}) for every γ ∈ Γ, then since W spans, V \ {v_{λ₀}} would also span, a contradiction. Thus, we can choose γ₀ ∈ Γ such that

  w_{γ₀} ∉ span(V \ {v_{λ₀}})

Because w_{γ₀} ∈ span V, we can write

  w_{γ₀} = Σ_{i=0}^n α_i v_{λ_i}

where α₀, the coefficient of v_{λ₀}, is not zero (if it were, then we would have w_{γ₀} ∈ span(V \ {v_{λ₀}})). Since α₀ ≠ 0, we can solve for v_{λ₀} as a linear combination of w_{γ₀} and v_{λ₁}, ..., v_{λ_n}, so

  span((V \ {v_{λ₀}}) ∪ {w_{γ₀}}) ⊇ span V = X

so (V \ {v_{λ₀}}) ∪ {w_{γ₀}}

spans X. From the fact that w_{γ₀} ∉ span(V \ {v_{λ₀}}) one can show that (V \ {v_{λ₀}}) ∪ {w_{γ₀}} is linearly independent, so it is a basis of X. Repeat this process to exchange every element of V with an element of W (when V is infinite, this is done by a process called transfinite induction). At the end, we obtain a bijection from V to W, so that V and W are numerically equivalent.

Definition 8 The dimension of a vector space X, denoted dim X, is the cardinality of any basis of X.

Definition 9 Let X be a vector space. If dim X = n for some n ∈ N, then X is finite-dimensional. Otherwise, X is infinite-dimensional.

Recall that for V ⊆ X, |V| denotes the cardinality of the set V.³

Example: The set of all m × n real-valued matrices is a vector space over R. A basis is given by

  {E_ij : 1 ≤ i ≤ m, 1 ≤ j ≤ n}

where

  (E_ij)_kl = 1 if k = i and l = j, and 0 otherwise.

The dimension of the vector space of m × n matrices is mn.

Theorem 10 (Thm. 1.4) Suppose dim X = n ∈ N. If V ⊆ X and |V| > n, then V is linearly dependent.

Proof: If not, so V is linearly independent, then there is a basis W for X that contains V. But |W| ≥ |V| > n = dim X, a contradiction.

Theorem 11 (Thm. 1.5) Suppose dim X = n ∈ N and V ⊆ X, |V| = n.

  1. If V is linearly independent, then V spans X, so V is a Hamel basis.
  2. If V spans X, then V is linearly independent, so V is a Hamel basis.

Proof: (Sketch)

³ See the Appendix to Lecture 2 for some facts about cardinality.

If V does not span X, then there is a basis W for X that contains V as a proper subset. Then |W| > |V| = n = dim X, a contradiction.

If V is not linearly independent, then there is a proper subset V′ of V that is linearly independent and for which span V′ = span V = X. But then |V′| < |V| = n = dim X, a contradiction.

Note: Read the material on Affine Spaces on your own.

Section 3.2. Linear Transformations

Definition 12 Let X and Y be two vector spaces over the field F. We say T : X → Y is a linear transformation if

  T(α₁x₁ + α₂x₂) = α₁T(x₁) + α₂T(x₂)  ∀x₁, x₂ ∈ X, α₁, α₂ ∈ F

Let L(X, Y) denote the set of all linear transformations from X to Y.

Theorem 13 L(X, Y) is a vector space over F.

The hard part of proving this theorem is figuring out what you are being asked to prove. Once you figure that out, it is completely trivial, although writing out a complete proof that checks all the vector space axioms is rather tedious. The key is to define scalar multiplication and vector addition, and show that a linear combination of linear transformations is a linear transformation.

Proof: First, define linear combinations in L(X, Y) as follows. For T₁, T₂ ∈ L(X, Y) and α, β ∈ F, define αT₁ + βT₂ by

  (αT₁ + βT₂)(x) = αT₁(x) + βT₂(x)

We need to show that αT₁ + βT₂ ∈ L(X, Y):

  (αT₁ + βT₂)(γx₁ + δx₂)
    = αT₁(γx₁ + δx₂) + βT₂(γx₁ + δx₂)
    = α(γT₁(x₁) + δT₁(x₂)) + β(γT₂(x₁) + δT₂(x₂))
    = γ(αT₁(x₁) + βT₂(x₁)) + δ(αT₁(x₂) + βT₂(x₂))
    = γ(αT₁ + βT₂)(x₁) + δ(αT₁ + βT₂)(x₂)

so αT₁ + βT₂ ∈ L(X, Y). The rest of the proof involves straightforward checking of the vector space axioms.
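The pointwise definition of αT₁ + βT₂ in the proof of Theorem 13 is easy to check numerically. A minimal sketch in Python, assuming two maps on R² of my own choosing (the names `T1`, `T2`, `lin_comb` are not from the notes):

```python
# Two linear maps on R^2 (my own example maps, written as plain functions).
def T1(v):
    return (v[0] + v[1], v[0])

def T2(v):
    return (2 * v[0], -v[1])

def lin_comb(a, T, b, S):
    """(a*T + b*S)(x) = a*T(x) + b*S(x), the definition used in the proof."""
    return lambda v: tuple(a * t + b * s for t, s in zip(T(v), S(v)))

U = lin_comb(3, T1, -1, T2)   # U = 3*T1 - T2

# Spot-check the linearity computed in the proof:
# U(g*x1 + d*x2) should equal g*U(x1) + d*U(x2).
g, d = 2, 5
x1, x2 = (1, 4), (-2, 3)
lhs = U((g * x1[0] + d * x2[0], g * x1[1] + d * x2[1]))
rhs = tuple(g * u + d * w for u, w in zip(U(x1), U(x2)))
print(lhs, rhs)  # (61, -1) (61, -1)
```

A single numeric check does not prove linearity, of course; it only illustrates the algebraic identity the proof establishes for all γ, δ, x₁, x₂.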

Composition of Linear Transformations

Given R ∈ L(X, Y) and S ∈ L(Y, Z), S ∘ R : X → Z. We will show that S ∘ R ∈ L(X, Z), that is, the composition of two linear transformations is also linear:

  (S ∘ R)(αx₁ + βx₂)
    = S(R(αx₁ + βx₂))
    = S(αR(x₁) + βR(x₂))
    = αS(R(x₁)) + βS(R(x₂))
    = α(S ∘ R)(x₁) + β(S ∘ R)(x₂)

so S ∘ R ∈ L(X, Z).

Definition 14 Let T ∈ L(X, Y).

  The image of T is Im T = T(X).
  The kernel of T is ker T = {x ∈ X : T(x) = 0}.
  The rank of T is Rank T = dim(Im T).

Theorem 15 (Thms. 2.9, 2.7, 2.6: The Rank-Nullity Theorem) Let X be a finite-dimensional vector space and T ∈ L(X, Y). Then Im T and ker T are vector subspaces of Y and X, respectively, and

  dim X = dim ker T + Rank T

Proof: (Sketch) First show that Im T and ker T are vector subspaces of Y and X, respectively (exercise). Then let V = {v₁, ..., v_k} be a basis for ker T (note that ker T ⊆ X, so dim ker T ≤ dim X = n). If ker T = {0}, take k = 0, so V = ∅. Extend V to a basis W for X with W = {v₁, ..., v_k, w₁, ..., w_r}. Then {T(w₁), ..., T(w_r)} is a basis for Im T (do this as an exercise). By definition, dim ker T = k and dim Im T = r. Since W is a basis for X, k + r = |W| = dim X, that is,

  dim X = dim ker T + Rank T

Theorem 16 (Thm. 2.13) T ∈ L(X, Y) is one-to-one if and only if ker T = {0}.

Proof: Suppose T is one-to-one, and suppose x ∈ ker T. Then T(x) = 0. Since T is linear, T(0) = T(0 · 0) = 0 · T(0) = 0, so T(x) = T(0). Since T is one-to-one, x = 0, so ker T = {0}.
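Theorems 15 and 16 can be illustrated numerically for a map T : R⁴ → R³ given by a matrix. The matrix `A`, the helper `rank`, and the kernel vector below are my own illustrative choices, not from the notes:

```python
from fractions import Fraction

def rank(A):
    """Rank via Gaussian elimination with exact rational arithmetic."""
    M = [[Fraction(x) for x in row] for row in A]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def apply(A, x):
    """Matrix-vector product: the linear map T(x) = Ax."""
    return tuple(sum(a * b for a, b in zip(row, x)) for row in A)

# T : R^4 -> R^3; the third row is the sum of the first two, so Rank T = 2
# and dim ker T = 4 - 2 = 2, as the Rank-Nullity Theorem requires.
A = [[1, 0, 2, 1],
     [0, 1, 1, -1],
     [1, 1, 3, 0]]
print(rank(A), len(A[0]) - rank(A))  # 2 2

# A nonzero kernel vector: ker T != {0}, so T is not one-to-one (Theorem 16).
x = (-2, -1, 1, 0)
print(apply(A, x))  # (0, 0, 0)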

Conversely, suppose that ker T = {0}, and suppose T(x₁) = T(x₂). Then

  T(x₁ − x₂) = T(x₁) − T(x₂) = 0

which says x₁ − x₂ ∈ ker T, so x₁ − x₂ = 0, or x₁ = x₂. Thus, T is one-to-one.

Definition 17 T ∈ L(X, Y) is invertible if there is a function S : Y → X such that

  S(T(x)) = x  ∀x ∈ X
  T(S(y)) = y  ∀y ∈ Y

In other words, S ∘ T = id_X and T ∘ S = id_Y, where id denotes the identity map. In this case we denote S by T⁻¹.

Note that T is invertible if and only if it is one-to-one and onto. This is just the condition for the existence of an inverse function. The linearity of the inverse follows from the linearity of T.

Theorem 18 (Thm. 2.11) If T ∈ L(X, Y) is invertible, then T⁻¹ ∈ L(Y, X), i.e. T⁻¹ is linear.

Proof: Suppose α, β ∈ F and v′, w′ ∈ Y. Since T is invertible, there exist unique v, w ∈ X such that

  T(v) = v′, so T⁻¹(v′) = v
  T(w) = w′, so T⁻¹(w′) = w

Then

  T⁻¹(αv′ + βw′)
    = T⁻¹(αT(v) + βT(w))
    = T⁻¹(T(αv + βw))
    = αv + βw
    = αT⁻¹(v′) + βT⁻¹(w′)

so T⁻¹ ∈ L(Y, X).

Theorem 19 (Thm. 3.2) Let X, Y be two vector spaces over the same field F, and let V = {v_λ : λ ∈ Λ} be a basis for X. Then a linear transformation T ∈ L(X, Y) is completely determined by its values on V, that is:

  1. Given any set {y_λ : λ ∈ Λ} ⊆ Y, there exists T ∈ L(X, Y) such that T(v_λ) = y_λ ∀λ ∈ Λ.
  2. If S, T ∈ L(X, Y) and S(v_λ) = T(v_λ) for all λ ∈ Λ, then S = T.

Proof:

1. If x ∈ X, x has a unique representation of the form

  x = Σ_{i=1}^n α_i v_{λ_i}  with α_i ≠ 0, i = 1, ..., n

(Recall that if x = 0, then n = 0.) Define

  T(x) = Σ_{i=1}^n α_i y_{λ_i}

Then T(x) ∈ Y. The verification that T is linear is left as an exercise.

2. Suppose S(v_λ) = T(v_λ) for all λ ∈ Λ. Given x ∈ X,

  S(x) = S(Σ_{i=1}^n α_i v_{λ_i})
    = Σ_{i=1}^n α_i S(v_{λ_i})
    = Σ_{i=1}^n α_i T(v_{λ_i})
    = T(Σ_{i=1}^n α_i v_{λ_i})
    = T(x)

so S = T.

Section 3.3. Isomorphisms

Definition 20 Two vector spaces X, Y over a field F are isomorphic if there is an invertible T ∈ L(X, Y). T ∈ L(X, Y) is an isomorphism if it is invertible (one-to-one and onto).

Isomorphic vector spaces are essentially indistinguishable as vector spaces.

Theorem 21 (Thm. 3.3) Two vector spaces X, Y over the same field are isomorphic if and only if dim X = dim Y.

Proof: Suppose X, Y are isomorphic, and let T ∈ L(X, Y) be an isomorphism. Let

  U = {u_λ : λ ∈ Λ}

be a basis of X, and let

  v_λ = T(u_λ),  V = {v_λ : λ ∈ Λ}

Since T is one-to-one, U and V have the same cardinality. If y ∈ Y, then there exists x ∈ X such that

  y = T(x)
    = T(Σ_{i=1}^n α_{λ_i} u_{λ_i})
    = Σ_{i=1}^n α_{λ_i} T(u_{λ_i})
    = Σ_{i=1}^n α_{λ_i} v_{λ_i}

which shows that V spans Y. To see that V is linearly independent, suppose

  0 = Σ_{i=1}^m β_i v_{λ_i}
    = Σ_{i=1}^m β_i T(u_{λ_i})
    = T(Σ_{i=1}^m β_i u_{λ_i})

Since T is one-to-one, ker T = {0}, so

  Σ_{i=1}^m β_i u_{λ_i} = 0

Since U is a basis, we have β₁ = ··· = β_m = 0, so V is linearly independent. Thus, V is a basis of Y; since U and V are numerically equivalent, dim X = dim Y.

Now suppose dim X = dim Y. Let U = {u_λ : λ ∈ Λ} and V = {v_λ : λ ∈ Λ} be bases of X and Y; note we can use the same index set Λ for both because dim X = dim Y. By Theorem 3.2, there is a unique T ∈ L(X, Y) such that T(u_λ) = v_λ for all λ ∈ Λ. If T(x) = 0, then

  0 = T(x)
    = T(Σ_{i=1}^n α_i u_{λ_i})

    = Σ_{i=1}^n α_i T(u_{λ_i})
    = Σ_{i=1}^n α_i v_{λ_i}
  ⟹ α₁ = ··· = α_n = 0 since V is a basis
  ⟹ x = 0
  ⟹ ker T = {0}

so T is one-to-one. If y ∈ Y, write y = Σ_{i=1}^m β_i v_{λ_i}, and let

  x = Σ_{i=1}^m β_i u_{λ_i}

Then

  T(x) = T(Σ_{i=1}^m β_i u_{λ_i})
    = Σ_{i=1}^m β_i T(u_{λ_i})
    = Σ_{i=1}^m β_i v_{λ_i}
    = y

so T is onto, hence T is an isomorphism and X, Y are isomorphic.
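Theorem 21 can be made concrete with the earlier matrix-space example: the 2 × 2 real matrices have dimension 4 = dim R⁴, so the two spaces are isomorphic, and the coordinate (flattening) map is an explicit isomorphism. A minimal sketch, with function names of my own:

```python
# An explicit isomorphism between 2 x 2 real matrices and R^4 (Theorem 21):
# send a matrix to its tuple of entries. This map is linear, one-to-one, and onto.
def flatten(A):
    return (A[0][0], A[0][1], A[1][0], A[1][1])

def unflatten(v):
    return [[v[0], v[1]], [v[2], v[3]]]

A = [[1, 2], [3, 4]]
print(flatten(A))                  # (1, 2, 3, 4)
print(unflatten(flatten(A)) == A)  # True: unflatten inverts flatten

# Linearity check: flatten(a*A + b*B) == a*flatten(A) + b*flatten(B).
B = [[0, 1], [1, 0]]
a, b = 2, -3
comb = [[a * A[i][j] + b * B[i][j] for j in range(2)] for i in range(2)]
lhs = flatten(comb)
rhs = tuple(a * x + b * y for x, y in zip(flatten(A), flatten(B)))
print(lhs == rhs)  # True
```

Note that `flatten` sends the standard matrix basis {E_11, E_12, E_21, E_22} to the standard basis of R⁴, which is exactly the basis-to-basis construction used in the proof of Theorem 21.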