Linear Algebra Lecture Notes-I

Similar documents
Linear Algebra Lecture Notes-II

Vector Spaces and Linear Transformations

Linear Algebra Lecture Notes

Math 110: Worksheet 3

LINEAR ALGEBRA BOOT CAMP WEEK 1: THE BASICS

Definition 3 A Hamel basis (often just called a basis) of a vector space X is a linearly independent set of vectors in X that spans X.

Test 1 Review Problems Spring 2015

2: LINEAR TRANSFORMATIONS AND MATRICES

Linear Algebra Notes. Lecture Notes, University of Toronto, Fall 2016

Chapter 1 Vector Spaces

Linear Algebra Practice Problems

A Do It Yourself Guide to Linear Algebra

8 General Linear Transformations

Math 110, Spring 2015: Midterm Solutions

Lecture Summaries for Linear Algebra M51A

Math 113 Midterm Exam Solutions

Ir O D = D = ( ) Section 2.6 Example 1. (Bottom of page 119) dim(v ) = dim(l(v, W )) = dim(v ) dim(f ) = dim(v )

MA106 Linear Algebra lecture notes

Honors Algebra II MATH251 Course Notes by Dr. Eyal Goren McGill University Winter 2007

Definition Suppose S R n, V R m are subspaces. A map U : S V is linear if

Chapter 2. General Vector Spaces. 2.1 Real Vector Spaces

Abstract Vector Spaces

Elementary linear algebra

2.2. Show that U 0 is a vector space. For each α 0 in F, show by example that U α does not satisfy closure.

Solution to Homework 1

LINEAR ALGEBRA MICHAEL PENKAVA

Linear Algebra (Math-324) Lecture Notes

Math 353, Practice Midterm 1

Algebra Workshops 10 and 11

GRE Subject test preparation Spring 2016 Topic: Abstract Algebra, Linear Algebra, Number Theory.

Linear Vector Spaces

1 Invariant subspaces

MATH 323 Linear Algebra Lecture 12: Basis of a vector space (continued). Rank and nullity of a matrix.

Lecture 2: Linear Algebra

LINEAR ALGEBRA (PMTH213) Tutorial Questions

Mathematics Department Stanford University Math 61CM/DM Vector spaces and linear maps

Lineaire algebra 1 najaar Oefenopgaven. Stephen H. Friedberg, Arnold J. Insel, Lawrence E. Spence, Linear Algebra

Chapter 2: Linear Independence and Bases

4 Vector Spaces. 4.1 Basic Definition and Examples. Lecture 10

2.3. VECTOR SPACES 25

MAT 242 CHAPTER 4: SUBSPACES OF R n

AFFINE AND PROJECTIVE GEOMETRY, E. Rosado & S.L. Rueda 4. BASES AND DIMENSION

Math 113 Winter 2013 Prof. Church Midterm Solutions

1. General Vector Spaces

MATH 110: LINEAR ALGEBRA FALL 2007/08 PROBLEM SET 6 SOLUTIONS

LINEAR ALGEBRA REVIEW

MATH 235. Final ANSWERS May 5, 2015

Linear Algebra. Chapter 5

August 23, 2017 Let us measure everything that is measurable, and make measurable everything that is not yet so. Galileo Galilei. 1.

1. Select the unique answer (choice) for each problem. Write only the answer.

IFT 6760A - Lecture 1 Linear Algebra Refresher

Math 321: Linear Algebra

Algebra II. Paulius Drungilas and Jonas Jankauskas

Review 1 Math 321: Linear Algebra Spring 2010

Linear Algebra. Session 8

MATH 304 Linear Algebra Lecture 10: Linear independence. Wronskian.

LECTURE 6: VECTOR SPACES II (CHAPTER 3 IN THE BOOK)

4.6 Bases and Dimension

Then since v is an eigenvector of T, we have (T λi)v = 0. Then

CSL361 Problem set 4: Basic linear algebra

MATH 320: PRACTICE PROBLEMS FOR THE FINAL AND SOLUTIONS

Footnotes to Linear Algebra (MA 540 fall 2013), T. Goodwillie, Bases

Econ Lecture 8. Outline. 1. Bases 2. Linear Transformations 3. Isomorphisms

Linear Algebra, 4th day, Thursday 7/1/04 REU Info:

LINEAR ALGEBRA REVIEW

Lecture 7: Positive Semidefinite Matrices

Typical Problem: Compute.

Advanced Linear Algebra Math 4377 / 6308 (Spring 2015) March 5, 2015

Linear Algebra Highlights

Fall TMA4145 Linear Methods. Exercise set Given the matrix 1 2

Foundations of Matrix Analysis

Review of Linear Algebra

MAT Linear Algebra Collection of sample exams

4 Chapter 4 Lecture Notes. Vector Spaces and Subspaces

Exercise Sheet 1.

Definitions for Quizzes

NONCOMMUTATIVE POLYNOMIAL EQUATIONS. Edward S. Letzter. Introduction

Abstract Vector Spaces and Concrete Examples

OHSx XM511 Linear Algebra: Solutions to Online True/False Exercises

Online Exercises for Linear Algebra XM511

1. Let V, W be two vector spaces over F and let T : V W be a set theoretic map. Prove that the following are equivalent: T (cu + v) = ct (u) + T (v)

Lecture notes - Math 110 Lec 002, Summer The reference [LADR] stands for Axler s Linear Algebra Done Right, 3rd edition.

MATH 423 Linear Algebra II Lecture 12: Review for Test 1.

Linear Algebra II Lecture 8

x y + z = 3 2y z = 1 4x + y = 0

Further Mathematical Methods (Linear Algebra)

1. Linear systems of equations. Chapters 7-8: Linear Algebra. Solution(s) of a linear system of equations (continued)

Linear Algebra (part 1) : Vector Spaces (by Evan Dummit, 2017, v. 1.07) 1.1 The Formal Denition of a Vector Space

a (b + c) = a b + a c

Linear Algebra. Workbook

Normed Vector Spaces and Double Duals

Vector Spaces. (1) Every vector space V has a zero vector 0 V

5 Linear Transformations

Review of Some Concepts from Linear Algebra: Part 2

(a) II and III (b) I (c) I and III (d) I and II and III (e) None are true.

Math Linear Algebra II. 1. Inner Products and Norms

Dimension. Eigenvalue and eigenvector

(f + g)(s) = f(s) + g(s) for f, g V, s S (cf)(s) = cf(s) for c F, f V, s S

LINEAR ALGEBRA W W L CHEN

EXERCISE SET 5.1. = (kx + kx + k, ky + ky + k ) = (kx + kx + 1, ky + ky + 1) = ((k + )x + 1, (k + )y + 1)

Transcription:

Linear Algebra Lecture Notes-I Vikas Bist Department of Mathematics Panjab University, Chandigarh-6004 email: bistvikas@gmail.com Last revised on February 9, 208 This text is based on the lectures delivered for the B.Tech students of IIT Bhilai. These lecture notes are basics of Linear Algebra, and can be treated as a first course in Linear Algebra. The students are suggested to go through the examples carefully and attempt the exercises appearing in the text. CONTENTS Definitions 2 Basis and dimension 3 3 Linear Transformation 6 4 Matrix of a linear transformation 9 5 Linear Functionals 3

DEFINITIONS DEFINITIONS DEFINITION.. A vector space V over a field K: V a non-empty set, its elements are called vectors and has two operations: (a) V V V, (x, y) x + y, a binary operation called addition; (b) K V V, (α, v) αv called scalar multiplication, such that the following properties hold: (i) (x + y) + z = x + (y + z) for all x, y, z V ; (ii) (x + y) = (y + x) for all x, y, z V ; (iii) There is a vector u V such that x + u = x for all x V ; (iv) For each x V there is a vector y V such that x + y = u. (v) x = x for every x V ; (vi) α(βx) = (αβ)x for all α, β K and x V. (vii) (α + β)x = αx + βx for all α, β K and x V. (viii) α(x + y) = αx + αy for all α K and x, y V. The following are the consequences of definition: () Element u defined above is unique: indeed if u, u are satisfying (iii), then u = u + u = u! The element u is called the zero vector, denoted by 0. (2) Now (iv) reads for a given vector x V there is a vector y such that x + y = 0. Next this vector y is unique for x. Indeed if x + y = 0 = x + y, then y = y + 0 = y + (x + y ) = (y + x) + y = 0 + y = y. The vector y is called the inverse of x and is denoted by x. (3) 0x = 0 for all x V. Note the the 0 of left side is the 0 of the field and the the 0 on the right is the zero vector. 0x = (0 + 0)x = 0x + 0x. Thus 0x = 0. (4) x = ( )x, the scalar multiplication of x with. Since x + ( )x = ( + ( ))x = 0x = x. EXERCISE.2. Show that if αx = 0 for α K and x V, then either α = 0 or v = 0. EXAMPLE.3. Following are standard examples of vector spaces.. The set of m n matrices K m n with entries from filed K. with usual addition and scalar multiplication. 2. The set of polynomials K[x] with coefficients in K. 3. Let X be a non-empty set and V an arbitrary vector space over K. The space of all functions from X to V is a vector space over K under pointwise addition and multiplication. That is, for f, g : X V and for α K we define: (f + g)(x) = f(x) + g(x) (αf)(x) = αf(x), where the operations on the right hand side are those in V. This space is also denoted V X. If you are not familiar with fields yo can assume K = Q or R or C. These are examples of fields.

DEFINITIONS From here onwards V will denote a vector space over a field K. DEFINITION.4. A subset W of a vector space V is a subspace if W itself a vector space under the operations defined on V. Note that the subspace must have the zero element. Moreover if x, y W then αx W for all α K and x + y W. In fact, we have the following. PROPOSITION.5. A subset W of V is a subspace if and only if W is non-empty and αx + βy W for all α, β K whenever x, y W. Proof. Exercise EXERCISE.6. Verify that the following subsets.. The set of all symmetric matrices in K n n. 2. The set of all solutions of the the system Ax = 0, where A is an n n matrix. 3. The set of all polynomials of degree at-most n. 4. The set of all m n matrices with the first column zero. NOTATION.7. K n is for K n, the space of all column vectors with coordinates in K. EXERCISE.8. (i) Let {W i : i I} be a family of subspaces of a vector space V. Prove that i I W i is a subspace of V. The intersection of a family of subspaces is a subspace. (ii) Give an example to show that the union of two subspaces may not be a subspace. Let X V. Let C be the set of all those subspaces of V that contain X. Then {W : W C} is a subspace of V. In fact, this will be the smallest subspace that contain X. We call this subspace, the subspace generated by X, denoted by X. Let X V. An expression x X α xx such that α x K and all but finitely many α x are non-zero, is called a linear combination of (the elements) of X. In particular, if v,... v k are finitely many then α v + + α k v k where α i K is a linear combination of v,..., v k. EXERCISE.9. Let X V. Define LC(X) the set of all linear combination of X. (i) Show that LC(X) is a subspace of V. (ii) Show that LC(X) = X. If X is an empty set we define X = 0, the zero subspace. The subspace X is also called the linear span of X or the subspace spanned by X. 2

2 BASIS AND DIMENSION 3 EXAMPLE.0. Decide if 2 is in the subspace of R 3 spanned by vector and. 0 3 3 2, if and only if 2 = α + β for some α, β R. 0 0 0 No such α, β exist. 2 BASIS AND DIMENSION DEFINITION 2.. A vector space V is called finite dimensional if there is a finite set X such that V = X. Otherwise infinite dimensional. EXAMPLE 2.2. K m n is a finite dimensional vector space and K[x] is an infinite dimensional. If E i,j is an m n matrix whose (i, j)-th entry is and all other entries zero, then K m n = E i,j : i =,..., m; j =,... n. EXAMPLE 2.3. K[x] is infinite dimensional as K[x] =, x, x 2,.... We can prove there K[x] is infinite dimensional. Assume otherwise, and let K[x] = p (x),..., p k (x). If d is the highest degree of these polynomials, then x d+ is not a linear combination of these polynomials. We now always assume that all vector spaces are finite dimensional unless otherwise is stated. DEFINITION 2.4. Vectors v,..., v k V are linearly dependent if there are scalars α,..., α k not all zeros such that α v +... + α k v k = 0. If no such scalars exist then v,..., v k are called linearly independent, that is, if α,..., α k F, are such that α v +... + α k v k = 0, then α =... = α k = 0. A subset X of V is a linearly dependent set if there are vectors x,..., x k in X those are linearly dependent, otherwise linearly independent set. Thus X is a linearly independent set if every finite subset of X is linearly independent. Note that if v V is non-zero, then v is linearly independent as αv = 0 α = 0. Zero vector is linearly dependent as α0 = 0 for and α K. EXERCISE 2.5. Let X = {v,..., v k } V. Verify the following statements: (i) If X is a linearly independent set then every subset Y of X is linearly independent 2. 2 Empty set is linearly independent as there is no vector of V that can be written as linear combination of the elements of empty set!! 3

2 BASIS AND DIMENSION (ii) If X is linearly dependent, then every subset Y of V that contains X is also linearly dependent Thus, if 0 X, then X is linearly dependent. If X is linearly independent then every element of X is non-zero. DEFINITION 2.6. A linearly independent subset X of V such that X = V is called a basis of V. EXERCISE 2.7. Let X = {v,..., v k } be a subset of V. Show that if v is a linear combination of {v 2,..., v k }, then v,..., v k = v 2,..., v k. Deduce that if X is a set of non-zero vectors and is linearly dependent then there is v i X such that X = X \ {v i }. PROPOSITION 2.8. Let X = {x,..., x n } and Y = {y,..., y m } such that V = X and Y is linearly independent. Then m n. Proof. Assume that m > n. Then y = α x + + α n x n. Since y 0, there is α i 0. Without any loss we can assume that α 0. ( Change the order of the elements of X so that this is the first vector.) Thus if X = X \ {x }) {y }, then V = X. Now y 2 is a linear combination of the elements of X : y 2 = β y + α 2 x 2 + + α n x n. Since y 2 0, the linear combination is non trivial. Moreover if each α i = 0, then y 2 β y = 0, this is not possible as vectors y and y 2 are linearly independent. Thus there is one α i 0. Again we can assume that α 2 0. Let X 2 = (X \ {x 2 }) {y 2 }. Then V = X = X 2. Continuing this argument we finally have X n = {y,..., y n } and V = X n. This implies that y n+ is a linear combination of y,..., y n, a contradiction as y,..., y m are linearly independent. Hence m n. PROPOSITION 2.9. If V is finite dimensional then V every basis with finite number of elements. Proof. Immediate from the above Proposition 3.8. if V is a linear span of n non-zero vectors, then any linearly independent set cannot have more than n elements. PROPOSITION 2.0. Let V be a finite dimensional vector space. Then any two bases of V have the same cardinality, that is, the same number of elements. Proof. Suppose that X = {x,..., x n } and Y = {y,..., y m } are two bases of V. Then as V = X and Y is linearly independent, by Proposition 2.8, m n. y m = α x + + α n x n. Now interchanging the roles of X and Y we get m n. Hence m = n. The above Proposition also holds for infinite dimensional vector spaces. We thus have the following definition. 4

2 BASIS AND DIMENSION DEFINITION 2.. The number of elements in a basis of a vector space V is the dimension of V. This we denote by dim V. PROPOSITION 2.2. A subset X = {v,..., v k } of V is a basis if and only every element of V is a unique linear combination of the elements of X. Proof. Suppose X is a basis and v = α v + + α k v k = β v +... β k v k. Then (α β )v + + (α k β k )v k = 0. PROPOSITION 2.3. Let X = {x,..., x k } is a linearly independent set, then there is a basis of V that contain X. In other words, a linearly independent set can be extended to form a basis of V. Thus every vector space has a basis. by dim V. Proof. Let W = X. If V = W, then X is a basis. Otherwise, choose x k+ V \ W. Then vectors x,..., x k.x k+ are linearly independent. If α x + + α k x k + α k+ x k+ = 0, and α k+ 0, then x k+ inw. Thus α k+ = 0 and so all α i = 0. (Why?) Next either V = x,..., x k.x k+, so we get a basis containing X, or there is x k+ V \ x,..., x k.x k+. Continuing this manner, we must stop at some stage as V is finite dimensional. PROPOSITION 2.4. Let V be a vector space over K. Then X = {x,..., x n } is a basis if and only if X is a maximal linearly independent set, that is for any y V, then set X {y} is linearly dependent. Proof. Let X be a maximal linearly independent set and y V. If α x + + α n x n + βy = 0, then β = 0 implies that α = = α k = 0 (why?). This means that X {y} is linearly independent set, contradicting that X is maximal linearly independent. Thus β 0, and so y X. Therefore V = X and X is a basis. Conversely. if X is a basis then each y V is a linear combination of the x,..., x k. Hence Xis a maximal linearly independent set. We now summarize the equivalent statements for a basis of a vector space. PROPOSITION 2.5. Let B = {v,... v n } be a subset of V. Then the following are equivalent. (i) B is a basis. (ii) Every element of v is a unique linear combination of the elements of B. (iii) If v V \ B, then B {v} is a linearly dependent set. (iv) No proper subset of B cannot span V. EXAMPLE 2.6. {E i,j : i =,..., m; j =,..., n} is a basis of K m n. Let e j be a column vector of length n with j-th coordinate and all other coordinates 0. Then {e,..., e n } is a basis of K n. We call this basis as the standard basis of K n. In this text we will always denote elements of standard basis by e j and E i,j will represent a matrix with (i, j)-th entry and all other entries 0. 5

3 LINEAR TRANSFORMATION EXAMPLE 2.7. {, x, x 2,... } is a basis of K[x]. EXERCISE 2.8. Prove that dim x,..., x k k. EXERCISE 2.9. Let W and W 2 are subspaces of V. (i) If W W 2 then dim W dim W 2. (ii) If W W 2, dim W = dim W 2 and dim W 2 is finite, then W = W 2. The above exercise shows that if V is finite dimensional vector space and W is a subspace of V such that dim W < dim V, then W V Thus W is a subspace of V and V is finite dimensional, then W = V if and only if dimw = dimv. W However, if V is an infinite dimensional, then V can have a proper subspace whose dimension equal to dim V. Let U and W be subspaces of V. We have seen that U W need not be a subspace of V. Define U + V = U V, called the sum of U and V. Note that if U = u,..., u r, and W = w,..., w s, then U + W = u,..., u r, w,..., w s. EXERCISE 2.20. Show that U + W = {u + w : u U, w W }. PROPOSITION 2.2. Let U and W be subspaces of V. Then dim(u + W ) = dimu + dimw dim(u W ). Proof. Let A be a basis of U W and extend it to form a basis of U as well as a basis of W. Let B and C be the sets such that A B is a basis of U and A C is a basis of W. Then A, B, C are pair-wise disjoint sets. The proof is over if A B C is a basis U + W. Thus we need to show that this set is actually a basis of U + W. Clearly A B C = U + W. Thus it is enough to show that the set A B C is linearly independent. Write A = {v,..., v t } B = {u,..., u r } and C = {w,..., w s }. Suppose that a linear combination of A B C is zero: t i= α iv i + r i= β iu i + s i= γ iw i = 0. Then t i= α iv i + r i= β iu i = s i= γ iw i U W. Therefore s i= γ iw i = t i= δ iv i, and so s i= γ iw i + t i= δ iv i = 0. Now the linear independence of A C implies that γ i = 0 for all i, and then the linear independence of A B implies the linear independence of A B C. 3 LINEAR TRANSFORMATION DEFINITION 3.. Let V and W be vector spaces over K. A mapping T : V W is a linear transformation if T (αx + βy) = αt (x) + βt (y). EXAMPLE 3.2. Let X R n p. Then the mapping R X : R m n R m p given by R X (A) = AX is a linear transformation. Similarly, if Y R p m. Then L Y : R m n R p n given by L Y (A) = Y A is a linear transformation. 6

3 LINEAR TRANSFORMATION EXAMPLE 3.3. The mapping tr : R n n R given by tr(a) = trace(a) is a linear transformation 3. EXERCISE 3.4. Show that the determinant map R n n R given by A det(a) is not a linear transformation. EXAMPLE 3.5. The derivative mapping D : K[x] K[x] given by D(p(x)) = p () (x) is linear transformation. EXERCISE 3.6. Let T : V W be a linear transformation. Show that T (0) = 0. Let T : V W be a linear transformation. Associated to T there are two important subspaces, the image of T : imt = {T x : x V }, a subspace of W, also denoted by T (V ), and the kernel of T : kert = {x V : T x = 0}, a subspace of V. The rank of T is the dimension imt, denoted by rankt. The nullity of T, is the dimension of kert, denoted by nullityt EXERCISE 3.7. let T : V W be a linear transformation. Show that kert is a subspace of V and imt is a subspace of W. If {v,..., v n } is a basis of V, verify that imt = T (v ),..., T (v n ). PROPOSITION 3.8. Let T : V W be a linear transformation. Then T is one-one if and only if kert = {0}. Proof. Assume that T is one-one. If v kert, then T (v) = 0 = T (0). Hence v = 0 and kert = {0}. Conversely, if kert = {0} and for u, v V, T (u) = T (v), then T (u v) = 0. Thus u v kert = {0} and so u = v and T is one-one. PROPOSITION 3.9 (Rank Nullity Theorem). Let T : V W be a linear transformation. Then () dim V = rankt + nullityt Proof. Let {u,... u p } be a basis of kert. Extend this to form a basis of V. Let {v,... v q } be such that {u,... u p } {v,... v q } is a basis of V. Then n = p + q. To prove the statement we need to check that the rank of T is q. 3 The trace of a matrix is the sum of its diagonal entries. 7

3 LINEAR TRANSFORMATION We show that {T v,..., T v q } is a basis of imt. First note that as v i / kert, so T v i 0. Now if α T v +... + α q T v q = 0, then α v +... + α q v q kert, and so α v +... + α q v q = β u +... + β p u p. Now the linear independence of these elements imply that α i = 0 for all i, and so {T v,..., T v q } is a linearly independent set. Next if w = T v for some v V, Then as v = α v +... + α q v q + β u +... + β p u p so w = T v = α T v +... + α q T v q. Hence imt = T v,..., T v q and {T v,..., T v q } is a basis of imt. PROPOSITION 3.0. Let T : V W be a linear transformation. If T is one-one then dim V dim W. If T is onto then dim V dim W. Thus if T is one-one and onto then dim V = dim W. Proof. If T is one-one then by Proposition 3.8, nullity(t ) = 0 and so by Eq., dim V = dim imt dim W. If T is onto then imt = W. Thus Eq implies that dim V rank(t ) = dim im(t ) = dim W. EXERCISE 3.. Let T : V V. Show using rank nullity theorem that T is one-one if and only if T is onto. a a This theorem is not true for infinite dimensional vector spaces. DEFINITION 3.2. A one-one onto linear transformation T : V W is called an isomorphism. Vector spaces V and W are isomorphic if there is an isomorphism between V and W. If V and W are isomorphic we write V W. PROPOSITION 3.3. Vector spaces V and W are isomorphic if and only if dim V = dim W. Proof. If V and W are isomorphic then there is a one-one onto linear transformation from V to W. Thus by Proposition 3.0, dim V = dim W. Conversely, assume that dim V = dim W = n. Let {v,..., v n } and {w,..., w n } be bases of V and W. Then the map given by α v + + α n v n α w + + α n w n is an isomorphism. The following is immediate. PROPOSITION 3.4. An n dimensional vector space over K is isomorphic to K n. EXAMPLE 3.5. Let A be an m n matrix. Define τ A : R n R m by τ A (u) = Au, where the right side is the usual matrix multiplication. Then τ A is a linear transformation. If {e,... e n } is the standard basis of R n, then im(τ A ) = Ae,... Ae n, the subspace generated by the columns of A. (Recall that Ae j is the j-th column of 8

4 MATRIX OF A LINEAR TRANSFORMATION A.) Thus rank(τ A ) = rank(a). It is because of this reason we consider a matrix as a linear transformation. In fact, we seee in the next section that every linear transformation can be seen as a matrix. EXERCISE 3.6. let V be a vector space of dimension n. Let {v,..., v n } be an ordered basis of V and let {w,..., w n } be an ordered vectors in W (not necessarily distinct, not necessarily linearly independent). Show that the mapping T : V W given by T (α v + + α n v n ) = α w + + α n w n is an isomorphism. 0 EXAMPLE 3.7. If u =, u = 0 and u 3 = 0, then the mapping 0 T : R 3 R 4 given by T (α e + α 2 e 2 + α 3 e 3 ) = α u + α 2 u 2 + α 3 u 3 is a linear transformation. Find its rank and nullity. EXERCISE 3.8. Let V and W be vector spaces over K. Show that L(V, W ), the set of all linear transformations from V to W form a vector space under the usual addition and scalar multiplication of mappings. EXAMPLE 3.9. Let B = {v,..., v n } and B = {w,..., w m } be bases for V and W respectively. Define T i,j { : V W a linear transformation whose action on w i for k = j bases is given by: T i,j (v k ) =. If T L(V, W ), then 0 otherwise T (v j ) = a,j w + + a m,j w m, for j =,..., n. Thus T = m i= n j= a i,jt i,j. Also as B and B are bases the linear transformations T i,j : i =,..., m; j =,..., n are linearly independent also. Let B = {v,..., v n } and B = {w,..., w m } be bases for V and W respectively. Define T i,j : V W a linear transformation whose action on bases is given by: { w i for k = j T i,j (v k ) = 0 otherwise. Then {T i,j : i =,..., m; j =,..., n} is a basis of L(V, W ). In particular dim(l(v, W )) = mn = dim W dim V. 4 MATRIX OF A LINEAR TRANSFORMATION Let B = {v,..., v n } be an ordered basis of V. Then for each v V there are unique α,..., α n K such that v = α v +... + α n v n. This defines a mapping 9

[. ] B : V K n defined by v 4 MATRIX OF A LINEAR TRANSFORMATION α. α n = [v] B. It is easy to verify that this map is a bijective linear transformation, that is, an isomorphism between V and K n. EXAMPLE 4.. Consider an ordered basis B = {q (x) = x +, q 2 (x) = x, q 3 (x) = x 3 + x 2, q 4 (x) = x 3 x 2 } of R 3 [x]. Then 3x 3 x 2 + 5x + 3 = 4q (x) + q 2 (x) + q 3 (x) + 2q 4 (x). Therefore 4 [3x 3 x 2 + 5x + 3] B =. 2 If we change the order of basis elements and let B = {p (x) = x 3 + x 2, p 2 (x) = x 3 x 2, p 3 (x) = x, p 4 (x) = x + }. Then [3x 3 x 2 + 5x + 3] B = 2. 4 EXAMPLE 4.2. In R 4 if we take ordered basis B =,, 0, 0 0 0 0 0 4 then for u = 3 2 we have [u] B =. Let V and W be vector spaces over K of dimensions n and m respectively. Let B = {v,..., v n } and B = {w,..., w m } be ordered bases of V and W respectively and let T : V W be a linear transformation. Then for all j =,..., n : T (v j ) = α,j w + α 2,j w 2 + + α m.j w m. Therefore, [T v j ] B = α j. α mj K m. Define an m n matrix B [T ] B whose j-th column is [T v j ] B, that is, B [T ] B = [[T v ] B... [T v n ] B ]. 0

4 MATRIX OF A LINEAR TRANSFORMATION The matrix B [T ] B is called the matrix of linear transformation T with respect to ordered bases B and B. If v = n i= β ijv i, then [T v] B = n i= β ij[t v i ] B = B [T ] B [v] B. Hence (2) [T v] B = B [T ] B [v] B. Thus the image of v under T can be seen as the multiplication of a column vector [v] B by the matrix of linear transformation B [T ] B that is determined by the choice of ordered bases. If T : V V a linear operator 4 and B is a basis, then we write [T ] B for B [T ] B EXAMPLE 4.3. Consider the space R 3 [x]. The differential operator is a linear operator on R 3 [x]. The matrix of D with respect to basis B = {, x, x 2, x 3 } is 0 0 0 [D] B = 0 0 2 0 0 0 0 3. 0 0 0 0 If B = {p (x) = + x, p 2 (x) = x, p 3 (x) = x 2 + x 3, p 4 (x) = x 2 x 3 }, then Dp (x) = = 2 p (x) + 2 p 2(x) Dp 2 (x) = = 2 p (x) 2 p 2(x) Dp 3 (x) = 2x + 3x 2 = p (x) p 2 (x) + 3 2 p 3(x) + 3 2 p 4(x) Dp 4 (x) = 2x 3x 2 = p (x) p 2 (x) 3 2 p 3(x) 3 2 p 4(x) /2 /2 Thus [D] B = /2 /2 0 0 3/2 3/2. 0 0 3/2 3/2 x EXAMPLE 4.4. Let T : R 4 R 3 given by T x 2 x 3x 2 + x 3 + 2x 4 x 3 = x 2x 2 + x 3 x 4. 2x x 5x 2 + x 4 4 Let B = u =, u 2 =, u 3 = 0, u 4 = 0 0, a basis of R 4 and let 0 0 0 4 A linear transformation from a vector space V to itself is called a linear operator.

4 MATRIX OF A LINEAR TRANSFORMATION B = w = 0, w 2 =, w 3 =, a basis of R3. Now we find B [T ] B. 0 0 T (u ) = = 3w 2 2w 3, T (u 2 ) = 0 = w + 3w 2 3w 3 2 3 2 T (u 3 ) = = w + 2w 2 3w 3, T (u 4 ) = = w 2 + 2w 3 3 2 Therefore B [T ] B = 0 0 3 3 2 2 3 3 2 EXERCISE 4.5. Show that if I : V V is the identity linear operator and B is an ordered basis of V, then [I] B = I n, then n n identity matrix. EXERCISE 4.6. Let V and W be vector space with bases B = {v,..., v n } and B = {w,..., w m }. Show that the mapping T B [T ] B is an isomorphism from L(V, W ) to K m n. Also show that the matrix of T i,j (See Example 3.9) is E i,j Let T : V W be a linear transformation. Let B, B 2 be two ordered bases of V and B, B 2 be ordered bases for W. Then for T we have two different matrices: B [T ] B and B [T ] B. It can be proved that there are invertible matrices P and Q such that B [T ] B = P B [T ] B Q. It is because of this reason the rank of matrix will not change and so we have the following. PROPOSITION 4.7. Let T : V W be a linear transformation and let B and B be beases for V and W. Then the rank of T is exactly the rank of the matrix B [T ] B. EXAMPLE 4.8. Let T : R 3 R 4 given by T x 2 = x 3 matrix of T with respect to the standard bases of R 3 and R 4 is. x x + x 2 + 3x 3 x 2x 2 x 2 + x 3 4x 3x 2. The 2 3 2 0 0 4 3 0 This matrix has rank 3. Therefore the rank of linear transformation is 3 and nullity is zero (Why?). 2

5 LINEAR FUNCTIONALS 5 LINEAR FUNCTIONALS DEFINITION 5.. A (linear) functional on a vector space V over a field K is a linear map from V to K. If f : V K is a non-zero functional, then f is always onto and the rank of f is. The vector space L(V, K) is called the dual space of V and is denoted by V. Note that dim V = dim V.. Let e j : R R given by e j (a) = a j, the j-th coordinate of a R n, is a linear functional. EXAMPLE 5.2. 2. d : R[x] R[x] given by dp(x) = f () (0) is a functional. 3. j : R[x] R[x] given by jp(x) = 0 p(x) dx is a functional. 4. The trace map from the space of n n matrices is a functional. 5. For fixed y R n, the mapping R n R given by x y t x is a functional. PROPOSITION 5.3. Let V be a vector space with basis B = {v,..., v n }. Define a functional vi on V whose action on basis is given by: { vi if j = i (v j ) = 0 if j i. Then B = {v,..., v n} is a basis of Vi, called the basis dual to B. Proof. Let f V i, then f(v j) = α j K, for all j =,..., n. Thus f = α v + + α nv n. It is an easy exercise to check that B is a linearly independent set. EXAMPLE 5.4. Consider the vector space R 3. Find the dual basis for B = x =, x 2 =, x 3 = 0. The basis dual to the given basis is a functional {x, x 2, x 3 }, Since these are maps from R3 R, with respect to the standard 0 basis these are matrices of size 3 (Why?) Thus for i =, 2, 3 x i (R3 ) is such that x i (x i) =, x i (x j) = 0 for j i. Therefore if x = [ a b c ], then we have the following equations: a + b + c =, a + b = 0, a + c = 0 a =, b = = c. Thus x = [ ]. Similarly, x 2 = [ 0 ] and x 3 = [ 0 ] 3

5 LINEAR FUNCTIONALS EXERCISE 5.5. Let X V. Define X = {f V : f(s) = 0 for all s S}. Show that S is a subspace of V. Let W be a subspace of V with a basis {x,..., x k }. Extend this basis to {x,..., x k, x k+,..., x n } so that this is a basis for V. If {x,..., x k, x k+,..., x n} is the dual basis, then {x k+,..., x n} is a basis of W. Indeed if f V, then f = n i= a ix i. Thus if f W, then f(x ) = 0 = f(x 2 ) = = f(x k ). Hence f = n i=k+ a ix i x k+,..., x n. Since {x k+,..., x n} W, it follows that W = x k+,..., x n. Also as {x k+,..., x n} is linearly independent, it is a basis for W. Therefore we have the following: dim V = dim W + dim W. Let T : V W be linear transformation. Define T : W V by T (f) = f T. This mapping is called the transpose of linear transformation. Let X = {x,..., x n } and Y = {y,..., y m } be bases for V and W. Write X and Y be dual bases for V and W. Assume that m (3) T (x j ) = a i,j y j Then T (yj ) : V K. The map is called the transpose of the linear transformation T because the matrix of T with respect to the dual basis is the transpose of the matrix of T. More precise statement is the following theorem. PROPOSITION 5.6. Let X = {x,..., x n } and Y = {y,..., y m } be bases for V and W. and let T : V W be a linear transformation. Then i= X [T ] Y = ( Y [T ] X ) t, that is if A is the matrix of T with respect to given given basis then the matrix of T with respect to the corresponding dual basis is A t. Proof. Let A = Y [T ] X and B = X [T ] Y. Therefore: Therefore n n T (yt ) = b s,t x s and T (x j ) = a i,j y j. s= i= n a p,q = yp(t (x q )) = T (yp)(x q ) = ( b s,p x s)(x q ) = b q,p s= Hence b q,p = a p,q or B = A t. 4

5 LINEAR FUNCTIONALS x EXAMPLE 5.7. Let T : R 3 R 4 given by T x 2 = x 3 x x + x 2 x 2 + x 3 x + x 2 + x 3 x x 2 + x 3 f x 2 = x + x 2 + x 3 + x 4 be a linear functional on R 4. Find T (f). x 3 T (f) is a functional on R 3. Now for y = y 2 R 3 : y 3 y T (f) y 2 = y T (f)(e ) + y 2 T (f)(e 2 ) + y 3 T (f)(e 3 ). y 3 y T (f)(e ) = f(t (e )) = f 0 = 3 T (f)(e 2 ) = f(t (e 2 )) = f 0 = 0 T (f)(e 3 ) = f(t (e 3 )) = f = 3. Let Hence f(y) = 3y + y 2 + y 3. Now compare with this method. The matrix of f with respect to the standard basis is [ ] Let y = y e + y 2 e 2 + y 3 e 3 R 3. Then T (f)(y) = f(t (y)). Writing the matrix representations we have: f(t (y)) = [ ] 0 0 0 y y 2 = 3y + y 2 + y 3. y 3 EXERCISE 5.8. Find kert for T defined in the above example. Decide if T is onto. 5