Vector Space Basics


(Remark: these notes are highly formal and may be a useful reference for some students; however, I am also posting Ray Heitmann's notes to Canvas for students interested in a direct computational approach. Chapter 3 of Braun also covers most of this material. I will assume basic knowledge about matrices, matrix multiplication, matrix inversion, row reduction, etc.)

1 Abstract Vector Spaces

Definition 1. A (real) vector space is a set $V$ with two binary operations $+ : V \times V \to V$ and $\cdot : \mathbb{R} \times V \to V$, called vector addition and scalar multiplication respectively, such that all of the following properties hold:

1. (commutativity of vector addition) $\forall u, v \in V$, $u + v = v + u$

2. (associativity of vector addition) $\forall u, v, w \in V$, $u + (v + w) = (u + v) + w$

3. (existence of additive identity) $\exists z \in V$ such that $\forall y \in V$, $z + y = y + z = y$

   (a) (corollary to properties (1)-(3): there is only one such $z$; call it $0_V$)

4. (existence of additive inverse) $\forall u \in V$ $\exists w \in V$ such that $u + w = w + u = 0_V$

   (a) (we say that $w$ is an additive inverse of $u$ if $w + u = u + w = 0_V$; property (4) says that every $u \in V$ has at least one additive inverse)

5. (scalar multiplication distributes over vector addition) $\forall u, v \in V$, $\forall c \in \mathbb{R}$, $c \cdot (u + v) = (c \cdot u) + (c \cdot v)$

6. (scalar multiplication distributes over addition in $\mathbb{R}$) $\forall u \in V$, $\forall a, b \in \mathbb{R}$, $(a + b) \cdot u = (a \cdot u) + (b \cdot u)$

7. (compatibility of scalar multiplication with multiplication in $\mathbb{R}$) $\forall u \in V$, $\forall a, b \in \mathbb{R}$, $(ab) \cdot u = a \cdot (b \cdot u)$

8. (identity law for scalar multiplication) $\forall u \in V$, $1 \cdot u = u$
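
As a quick numerical sanity check (not a proof), one can verify several of these axioms for $\mathbb{R}^3$ with its usual componentwise operations; the sketch below uses NumPy and arbitrarily chosen vectors and scalars.

```python
import numpy as np

# Arbitrary test vectors and scalars in R^3 (componentwise operations).
u = np.array([1.0, -2.0, 3.0])
v = np.array([0.5, 4.0, -1.0])
w = np.array([2.0, 2.0, 2.0])
a, b = 3.0, -0.5

assert np.allclose(u + v, v + u)                 # axiom 1: commutativity
assert np.allclose(u + (v + w), (u + v) + w)     # axiom 2: associativity
assert np.allclose(u + np.zeros(3), u)           # axiom 3: additive identity
assert np.allclose(u + (-u), np.zeros(3))        # axiom 4: additive inverse
assert np.allclose(a * (u + v), a * u + a * v)   # axiom 5
assert np.allclose((a + b) * u, a * u + b * u)   # axiom 6
assert np.allclose((a * b) * u, a * (b * u))     # axiom 7
assert np.allclose(1.0 * u, u)                   # axiom 8
print("all eight axioms hold for these sample vectors")
```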

This list of eight assumptions is essentially minimal to obtain the full power of linear algebra; the price for this generality is that several obvious results are not actually completely obvious. We will show now that all the usual behavior we desire does follow from the above assumptions.

Lemma 2. Let $u \in V$; then $0 \cdot u = 0_V$. Also, if $c \in \mathbb{R}$ then $c \cdot 0_V = 0_V$.

Proof. By property (6) and the fact that $0 + 0 = 0$ holds in $\mathbb{R}$, we have
$$0 \cdot u = (0 + 0) \cdot u = (0 \cdot u) + (0 \cdot u).$$
Let $w \in V$ be an additive inverse of $0 \cdot u$; such an element of $V$ does exist by property (4). In particular we have
$$(0 \cdot u) + w = w + (0 \cdot u) = 0_V.$$
Hence
$$0_V = w + (0 \cdot u) = w + [(0 \cdot u) + (0 \cdot u)].$$
Using the associativity of vector addition, this implies
$$0_V = [w + (0 \cdot u)] + (0 \cdot u).$$
But $w$ is an additive inverse of $0 \cdot u$, so this implies (using property (3))
$$0_V = 0_V + (0 \cdot u) = 0 \cdot u.$$
In particular we have $0 \cdot u = 0_V$. Taking $u = 0_V$ yields $0 \cdot 0_V = 0_V$; therefore, using property (7), we have
$$c \cdot 0_V = c \cdot (0 \cdot 0_V) = (c \cdot 0) \cdot 0_V = 0 \cdot 0_V = 0_V.$$

Lemma 3. Let $u \in V$ and let $v$ be an additive inverse of $u$; then $v = (-1) \cdot u$.

Proof. First, by the previous lemma, $(-1) \cdot u$ is itself an additive inverse of $u$, because for instance we have
$$u + ((-1) \cdot u) = (1 \cdot u) + ((-1) \cdot u) = (1 + (-1)) \cdot u = 0 \cdot u = 0_V$$
(which properties have been used?). On the other hand, since $v$ is an additive inverse of $u$ we have $v + u = u + v = 0_V$; therefore
$$v = v + 0_V = v + [u + ((-1) \cdot u)] = (v + u) + ((-1) \cdot u) = 0_V + ((-1) \cdot u) = (-1) \cdot u,$$
so $v = (-1) \cdot u$ as desired.

We can conclude from the above lemmas that each $u \in V$ has a unique additive inverse, and moreover it is equal to $(-1) \cdot u$. This allows us to define vector subtraction in the following way: if $u, v \in V$,
$$u - v = u + ((-1) \cdot v).$$
In particular, vector subtraction $- : V \times V \to V$ is now its own binary operation, distinct from vector addition, and we can check that the usual rules of subtraction are obeyed.

Here are a number of examples of vector spaces, ranging from very concrete to highly abstract:

Example 4. $\mathbb{R}^n$ is a vector space if, for every $x = (x_1, x_2, \ldots, x_n) \in \mathbb{R}^n$ and $y = (y_1, y_2, \ldots, y_n) \in \mathbb{R}^n$, and $c \in \mathbb{R}$, we define
$$x + y = (x_1 + y_1, x_2 + y_2, \ldots, x_n + y_n)$$
$$c \cdot x = (cx_1, cx_2, \ldots, cx_n).$$

Example 5. The set of all $m \times n$ real matrices, $M_{mn}$, is a vector space under entrywise addition and entrywise scalar multiplication (the usual addition and scalar multiplication operations for matrices).

Example 6. Let $A \in M_{mn}$ be a fixed matrix. Then the set of all $x \in \mathbb{R}^n$ such that $Ax = 0 \in \mathbb{R}^m$ is a vector space under the usual vector addition and scalar multiplication operations of $\mathbb{R}^n$.

Example 7. Let $A \in M_{mn}$ be a fixed matrix and let $b \in \mathbb{R}^m$, $b \neq 0$, be a nonzero vector such that the inhomogeneous equation $Ax = b$ has at least one solution, say $x_0$. Then the set of all $x \in \mathbb{R}^n$ such that $Ax = b$ is not a vector space under the usual operations of $\mathbb{R}^n$. However, it is a vector space under the following operations, call them $\oplus$ and $\odot$, where $x, y \in \mathbb{R}^n$ and $c \in \mathbb{R}$:
$$x \oplus y = (x - x_0) + (y - x_0) + x_0$$
$$c \odot x = c(x - x_0) + x_0$$
(It is not at all obvious that these operations define a vector space; this has to be checked carefully.) Note that a different choice of $x_0$ would result in different vector space operations on the set of solutions of the inhomogeneous system, because for instance $x \oplus x = x$ if and only if $x = x_0$.
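
A quick numerical illustration of Example 7 (a sketch, with an arbitrarily chosen system $Ax = b$ and particular solution $x_0$): under the shifted operations, $x_0$ plays the role of the zero vector, and applying $\oplus$ and $\odot$ to solutions of $Ax = b$ produces solutions again.

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])          # a 2x3 matrix, chosen arbitrarily
b = np.array([1.0, 2.0])

x0 = np.array([1.0, 0.0, 2.0])           # one particular solution: A @ x0 == b
x  = np.array([0.0, 1.0, 1.0])           # another solution
y  = np.array([-1.0, 2.0, 0.0])          # and another

def oplus(x, y):                          # x (+) y = (x - x0) + (y - x0) + x0
    return (x - x0) + (y - x0) + x0

def odot(c, x):                           # c (.) x = c (x - x0) + x0
    return c * (x - x0) + x0

for z in (x, y, x0, oplus(x, y), odot(-3.0, x)):
    assert np.allclose(A @ z, b)          # every combination is still a solution

assert np.allclose(oplus(x, x0), x)       # x0 acts as the additive identity
print("shifted operations stay inside the solution set, with x0 as the zero vector")
```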

Example 8. The set of all polynomials with degree at most 5, call it $P_5$, is a vector space under the following operations: for any two polynomials $p, q$ and $c \in \mathbb{R}$,
$$(p + q)(t) = p(t) + q(t)$$
$$(c \cdot p)(t) = c\,p(t).$$
Note that $P_5$ is in some sense the same as $\mathbb{R}^6$, because the addition and scalar multiplication of fifth-degree polynomials is the same as addition and scalar multiplication of their coefficients, which are six in number.

Example 9. The set of all polynomials (in one variable) with degree at most $n$, written $P_n$, is a vector space under the same operations as in Example 8. (Note that $P_n$ is somehow the same as $\mathbb{R}^{n+1}$.)

Example 10. The set $P$ of all polynomials (in one variable) is a vector space under the same operations as in Example 8. These can be expressed as $p(t) = \sum_{n=0}^{\infty} a_n t^n$ where all but finitely many $a_n$'s are zero.

Example 11. The set of all functions $f : \mathbb{R} \to \mathbb{R}$ is a vector space under the following operations: for two functions $f, g : \mathbb{R} \to \mathbb{R}$ and $c \in \mathbb{R}$,
$$(f + g)(t) = f(t) + g(t)$$
$$(c \cdot f)(t) = c\,f(t).$$
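
The identification of $P_5$ with $\mathbb{R}^6$ in Example 8 can be made concrete: a polynomial $a_0 + a_1 t + \cdots + a_5 t^5$ corresponds to its coefficient vector $(a_0, \ldots, a_5)$, and the polynomial operations become componentwise operations on those vectors. A small NumPy sketch (the particular polynomials are arbitrary):

```python
import numpy as np

# Coefficient vectors in R^6: index k holds the coefficient of t**k.
p = np.array([1.0, 0.0, -2.0, 0.0, 0.0, 3.0])   # 1 - 2t^2 + 3t^5
q = np.array([0.0, 4.0, 1.0, 0.0, -1.0, 0.0])   # 4t + t^2 - t^4

def evaluate(coeffs, t):
    """Evaluate the polynomial with the given coefficients at t."""
    return sum(a * t**k for k, a in enumerate(coeffs))

t = 1.7   # an arbitrary evaluation point
# Adding/scaling coefficient vectors agrees with adding/scaling the polynomials.
assert np.isclose(evaluate(p + q, t), evaluate(p, t) + evaluate(q, t))
assert np.isclose(evaluate(2.5 * p, t), 2.5 * evaluate(p, t))
print("operations on P_5 match componentwise operations on R^6")
```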

Example 12. The set of all power series with non-zero radius of convergence is a vector space under the usual addition and scalar multiplication. (This requires us to prove that the sum of two power series with non-zero radius of convergence again has non-zero radius of convergence.)

Example 13. The set of all formal power series (which may have radius of convergence equal to zero) is a vector space under the following operations:
$$\sum_{n=0}^{\infty} a_n X^n + \sum_{n=0}^{\infty} b_n X^n = \sum_{n=0}^{\infty} (a_n + b_n) X^n$$
$$c \cdot \left( \sum_{n=0}^{\infty} a_n X^n \right) = \sum_{n=0}^{\infty} (c a_n) X^n$$
Note that in this definition we cannot generally substitute any real value for $X$ (except perhaps $X = 0$) because we are not guaranteed any convergence. For example, $\sum_{n=0}^{\infty} n! X^n$ is a valid formal power series but it does not converge for any real $X \neq 0$. The formal power series is a useful construction when you want to prove abstract results which would hold for any convergent series, without actually considering the detailed convergence process in your proof.
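
One way to make Example 13 computational (a sketch; the representation is just one possible choice) is to store a formal power series as the function $n \mapsto a_n$ returning its $n$-th coefficient. Addition and scalar multiplication are then termwise, and convergence never enters the picture:

```python
import math

# A formal power series is represented by a function n -> a_n (its n-th coefficient).
def add(f, g):
    return lambda n: f(n) + g(n)           # termwise addition

def scale(c, f):
    return lambda n: c * f(n)              # termwise scalar multiplication

divergent = lambda n: math.factorial(n)    # sum of n! X^n: radius of convergence 0
geometric = lambda n: 1                    # sum of X^n: radius of convergence 1

h = add(scale(3, divergent), geometric)    # 3 * (sum n! X^n) + (sum X^n)
print([h(n) for n in range(5)])            # first few coefficients: [4, 4, 7, 19, 73]
```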

1.1 Subspaces

Checking that you have defined a vector space is usually a two-step process. First you have to show that the vector addition and scalar multiplication operations actually map into $V$ (and not into some bigger set, or nowhere at all), and second you have to verify all eight properties stated in the definition of a vector space. This is extremely painful and tedious in general, which is why we introduce the following definition:

Definition 14. Let $V$ be a vector space, with vector addition $+$ and scalar multiplication $\cdot$, and let $S$ be a subset of $V$ such that $0_V \in S$. Suppose that the following two properties hold:
$$\forall x, y \in S, \quad x + y \in S$$
$$\forall x \in S, \forall c \in \mathbb{R}, \quad c \cdot x \in S$$
Then it follows from the definition of vector spaces that $S$ is itself a vector space with the operations $+$ and $\cdot$ inherited from $V$. We say that the set $S$, equipped with the operations $+$ and $\cdot$ from $V$, is a subspace of $V$.

Remark 15. Note that usually we only know that $x + y \in V$ and $c \cdot x \in V$; thus only very special subsets of $V$ can be subspaces. The convenience of this definition is the following: we already have a number of mathematical objects which we know are vector spaces (such as $\mathbb{R}^n$, or the set of all polynomials in one variable). Hence if we have a set $S$ which is embedded in some vector space $V$, and we can show that $S$ is closed under the operations of $V$, then $S$ is automatically a vector space under those same operations. (If we want to put different operations on $S$ then this trick does not work!)

Remark 16. Clearly an equivalent definition is obtained if we let $S$ be a nonempty subset of $V$ which is closed under addition and scalar multiplication (then it automatically follows that $0_V \in S$). The empty set is never a vector space since it does not contain a zero element (because it does not contain any elements!). On the other hand, if $V$ is any vector space then the singleton set $\{0_V\}$ is actually a subspace of $V$, and a vector space in its own right. Most theorems we will prove will either hold trivially or fail trivially for the trivial vector space $\{0_V\}$. When theorems fail for the trivial vector space we will try to say "for any non-trivial vector space $V$..."; when theorems hold trivially for the trivial vector space we will not provide a separate proof for this case (since it is trivial).

From now on (unless stated otherwise) we will write $cx$ instead of $c \cdot x$ for scalar multiplication, and $+$ will be understood as vector addition as above; additionally, $0_V$ will simply be written $0$.

Example 17. Let $A \in M_{mn}$; then the set
$$S = \{ x \in \mathbb{R}^n \text{ such that } Ax = 0 \}$$
is a subspace of $\mathbb{R}^n$ (under the usual addition and scalar multiplication in $\mathbb{R}^n$). To prove this, note that if $Ax = 0$ and $Ay = 0$ and $c \in \mathbb{R}$, then $A(x + y) = 0$ and $A(cx) = 0$; and clearly $0 \in S$.
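
Example 17 is easy to probe computationally. In the sketch below (an illustration with an arbitrarily chosen matrix), SymPy produces vectors spanning the solution set of $Ax = 0$, and closure under addition and scaling is checked directly:

```python
import sympy as sp

A = sp.Matrix([[1, 2, -1, 0],
               [0, 1,  1, 2]])           # an arbitrary 2x4 matrix

basis = A.nullspace()                     # vectors spanning {x : A x = 0}
x, y = basis[0], basis[1]
c = sp.Rational(7, 2)

# Closure: sums and scalar multiples of solutions are again solutions.
assert A * (x + y) == sp.zeros(2, 1)
assert A * (c * x) == sp.zeros(2, 1)
assert A * sp.zeros(4, 1) == sp.zeros(2, 1)   # and 0 is in S
print("the solution set of A x = 0 is closed under + and scalar multiplication")
```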

2 Linear Transformations

Definition 18. Let $V, W$ be vector spaces and suppose $T : V \to W$ is a map. (This means that for every $v \in V$ there is an assignment $T(v) \in W$; the word "map" is interchangeable with the word "function".) We say that $T$ is a linear map, or a linear transformation, if both of the following properties hold:
$$\forall u, v \in V, \quad T(u + v) = T(u) + T(v)$$
$$\forall u \in V, \forall c \in \mathbb{R}, \quad T(cu) = cT(u)$$
(Note that in either formula, the operations on the left-hand side occur in $V$ whereas the operations on the right-hand side occur in $W$.) We sometimes abbreviate $Tu$ in place of $T(u)$ when $T$ is a linear transformation.

Proposition 19. Let $U, V, W$ be vector spaces and suppose $T : U \to V$ and $S : V \to W$ are linear transformations. Then the composition $S \circ T : U \to W$ is also a linear transformation.

Definition 20. Let $V$ be a vector space; then we define the identity transformation $\mathrm{Id}_V : V \to V$ by $\mathrm{Id}_V(v) = v$.

Lemma 21. The identity transformation on any vector space $V$ is a linear transformation.

Lemma 22. Let $V, W$ be vector spaces and let $T : V \to W$ be a linear transformation. Then $T \circ \mathrm{Id}_V = T$ and $\mathrm{Id}_W \circ T = T$.

Example 23. Let $A \in M_{mn}$ be any $m \times n$ real matrix, and define the map $T : \mathbb{R}^n \to \mathbb{R}^m$ by
$$Tx = Ax.$$
In other words, $Tx$ is what we get if we view $x \in \mathbb{R}^n$ as a column vector and multiply by $A$ from the left. That this defines a linear transformation follows from the properties of matrix multiplication.

Example 24. Let $P_n$ denote the space of polynomials with degree at most $n$. Define the map $T : P_n \to P_{n-1}$ by
$$Tp = p'$$
where $p'(t) = \sum_{k=0}^{n} k a_k t^{k-1}$ is the derivative of $p(t) = \sum_{k=0}^{n} a_k t^k$. By the properties of differentiation, this defines a linear transformation.

Example 25. Define the map $T : M_{mn} \to M_{nm}$ by
$$TA = A^T,$$
that is, $T$ takes $A$ to the transpose of $A$. Then $T$ is a linear transformation by the properties of the transpose.

Example 26. Fix any nonsingular matrix $B \in M_{nn}$ and define the map $T : M_{nn} \to M_{nn}$ by
$$TA = B^{-1} A B.$$
Then $T$ is a linear transformation by the properties of matrix multiplication.
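
As an illustration of Example 24 (a sketch using the coefficient-vector representation of $P_n$ from earlier; the particular polynomials are arbitrary), the differentiation map can be written as an operation on coefficient arrays, and its linearity checked numerically:

```python
import numpy as np

def deriv(coeffs):
    """Differentiate: the coefficient of t**k contributes k * coeffs[k] to t**(k-1)."""
    n = len(coeffs)
    return np.array([k * coeffs[k] for k in range(1, n)])

p = np.array([2.0, 0.0, 1.0, -4.0])   # 2 + t^2 - 4t^3   (an element of P_3)
q = np.array([1.0, 3.0, 0.0, 5.0])    # 1 + 3t + 5t^3
c = -2.5

# T(p + q) = T(p) + T(q)  and  T(c p) = c T(p)
assert np.allclose(deriv(p + q), deriv(p) + deriv(q))
assert np.allclose(deriv(c * p), c * deriv(p))
print("differentiation P_3 -> P_2 is linear:", deriv(p))   # [0, 2, -12]
```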

Sometimes we are interested in linear transformations that completely identify two spaces, so that (at least as far as vector space structure is concerned) the two spaces are the same.

Definition 27. Let $V, W$ be vector spaces and let $T : V \to W$ be a linear map. We say that $T$ is one-to-one if the following property holds:
$$\forall u, v \in V, \quad T(u) = T(v) \implies u = v$$
Equivalently (by contrapositive), $T$ is one-to-one if $u \neq v$ implies $T(u) \neq T(v)$; that is, distinct points map to distinct points.

Definition 28. Let $V, W$ be vector spaces and let $T : V \to W$ be a linear map. We say that $T$ is onto if
$$\forall w \in W \ \exists v \in V \text{ such that } T(v) = w.$$
In other words, $T$ is onto if its range is all of $W$.

Definition 29. Let $V, W$ be vector spaces and let $T : V \to W$ be a linear map. We say that $T$ is a linear isomorphism if it is one-to-one and onto. If there exists a linear isomorphism $T : V \to W$ then we say that $V$ and $W$ are linearly isomorphic.

These definitions are highly abstract, so we will try to make them more concrete with some examples.

Example 30. Let $P_5$ denote the space of polynomials with degree at most five. Define the map $T : \mathbb{R}^6 \to P_5$ by
$$T((a_0, \ldots, a_5)) = p_{(a_0, \ldots, a_5)}$$
where
$$p_{(a_0, \ldots, a_5)}(t) = \sum_{k=0}^{5} a_k t^k.$$
Then $T : \mathbb{R}^6 \to P_5$ is a linear isomorphism.

Definition 31. Let $V, W$ be vector spaces and let $T : V \to W$ be a linear transformation. We say that $T$ is invertible if there exists a linear transformation $S : W \to V$ such that $S \circ T = \mathrm{Id}_V$ and $T \circ S = \mathrm{Id}_W$. Such a transformation $S$ is called an inverse of $T$.

Remark 32. If $T : V \to W$ is a linear transformation of vector spaces $V, W$ and there exists a map $S : W \to V$ (not assumed linear) such that $S \circ T = \mathrm{Id}_V$ and $T \circ S = \mathrm{Id}_W$, then it automatically follows that $S$ is a linear transformation. The proof is two lines:
$$Sw_1 + Sw_2 = S(T(Sw_1 + Sw_2)) = S(TSw_1 + TSw_2) = S(w_1 + w_2)$$
$$cS(w) = S(T(cS(w))) = S(cTSw) = S(cw)$$

Proposition 33. A linear transformation has at most one inverse.

Proof. Let $T : V \to W$ be a linear transformation of vector spaces $V, W$; furthermore, suppose that $T$ has two inverses, $S_1 : W \to V$ and $S_2 : W \to V$. Then we have
$$S_1 = S_1 \circ \mathrm{Id}_W = S_1 \circ (T \circ S_2) = (S_1 \circ T) \circ S_2 = \mathrm{Id}_V \circ S_2 = S_2,$$
hence $S_1 = S_2$.

Since a linear transformation $T$ can have at most one inverse, when it has one we call it the inverse of $T$ and we write it $T^{-1}$.

Example 34. Let $A \in M_{nn}$ be a nonsingular matrix with inverse matrix $A^{-1}$. Define the linear transformations $S, T : \mathbb{R}^n \to \mathbb{R}^n$ by
$$Sx = Ax, \qquad Tx = A^{-1}x.$$
Then $S, T$ are both invertible linear transformations; furthermore, $S^{-1} = T$ and $T^{-1} = S$.

Theorem 35. Let $V, W$ be vector spaces and let $T : V \to W$ be a linear transformation. Then $T$ is invertible if and only if $T$ is a linear isomorphism.

Proof. Assume $T$ is invertible, with inverse $T^{-1} : W \to V$. We see that $T$ is one-to-one because
$$v_1 - v_2 = T^{-1}(T(v_1 - v_2)) = T^{-1}(Tv_1 - Tv_2),$$
so if $Tv_1 = Tv_2$ then $v_1 = v_2$. Additionally, for any $w \in W$ we have $w = T(T^{-1}w)$, so $w = Tv$ where $v = T^{-1}w$; hence $T$ is onto. Since $T$ is one-to-one and onto, $T$ is a linear isomorphism.

Now suppose instead that $T$ is a linear isomorphism. Since $T$ is onto, for any $w \in W$ there is some $v \in V$ such that $Tv = w$; moreover, since $T$ is one-to-one, there can be at most one such $v$. Therefore we can define a map $S : W \to V$ so that $Sw$ is the unique vector $v \in V$ such that $Tv = w$. We have $\forall w \in W$, $T(Sw) = w$ by definition of $S$, and therefore $T \circ S = \mathrm{Id}_W$. Additionally, $\forall v \in V$, $S(Tv) = v$, and again this follows from the definition of $S$, so $S \circ T = \mathrm{Id}_V$. We easily show that $S$ is a linear transformation (this is exactly the situation of Remark 32); altogether we can conclude that $S$ is an inverse of $T$, and in particular $T$ is invertible.
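
Example 34 in code (a sketch with an arbitrarily chosen nonsingular matrix): the maps $x \mapsto Ax$ and $x \mapsto A^{-1}x$ undo each other.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])               # nonsingular (det = 1)
A_inv = np.linalg.inv(A)

S = lambda x: A @ x                       # S x = A x
T = lambda x: A_inv @ x                   # T x = A^{-1} x

x = np.array([3.0, -4.0])                 # an arbitrary test vector
assert np.allclose(T(S(x)), x)            # (T o S) = Id
assert np.allclose(S(T(x)), x)            # (S o T) = Id
print("S and T are inverse to one another on this test vector")
```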

Remark 36. If $T : V \to W$ is a linear isomorphism then $T^{-1} : W \to V$ is also a linear isomorphism.

3 Linear Independence, Bases, Dimension

Definition 37. Let $V$ be a vector space. A subset $E \subseteq V$ is said to be linearly dependent if there exists a finite collection of distinct elements $v_1, v_2, \ldots, v_N \in E$, and scalars $c_1, c_2, \ldots, c_N \in \mathbb{R}$, such that at least one $c_i \neq 0$ and
$$c_1 v_1 + c_2 v_2 + \cdots + c_N v_N = 0.$$

Definition 38. Let $V$ be a vector space. A subset $E \subseteq V$ is said to be linearly independent if it is not linearly dependent.
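
For finitely many vectors in $\mathbb{R}^n$, Definitions 37 and 38 can be tested by computation: the vectors are linearly independent exactly when the only solution of $c_1 v_1 + \cdots + c_N v_N = 0$ is $c = 0$, i.e. when the matrix with those vectors as columns has full column rank. A sketch (the vectors are arbitrary examples):

```python
import numpy as np

def independent(vectors):
    """True if the given vectors in R^n are linearly independent."""
    M = np.column_stack(vectors)                 # the vectors become columns of M
    return np.linalg.matrix_rank(M) == len(vectors)

v1, v2, v3 = np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0]), np.array([1.0, 1.0, 2.0])
print(independent([v1, v2]))        # True
print(independent([v1, v2, v3]))    # False: v3 = v1 + v2 is a linear combination
```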

Remark 39. The empty set is linearly independent.

Definition 40. Let $V$ be a vector space; we say that $v \in V$ is a linear combination (or finite linear combination) of the vectors $v_1, v_2, \ldots, v_N \in V$ if there exist scalars $c_1, c_2, \ldots, c_N \in \mathbb{R}$ such that
$$v = c_1 v_1 + c_2 v_2 + \cdots + c_N v_N.$$

Lemma 41. If $V$ is a vector space and $E \subseteq V$ is a subset, then $E$ is linearly dependent if and only if there exists a vector $v \in E$ which is a linear combination of other elements $v_1, v_2, \ldots, v_N \in E$.

Definition 42. Let $V$ be a vector space and let $E \subseteq V$ be a subset. Then we define $\operatorname{span} E$ to be the set of all (finite) linear combinations of elements of $E$. We also define $\operatorname{span} \emptyset = \{0\} \subseteq V$.

Lemma 43. If $V$ is a vector space and $E \subseteq V$ then $\operatorname{span} E$ is a subspace of $V$. Moreover, if $E \subseteq W \subseteq V$ and $W$ is a subspace of $V$, then $\operatorname{span} E \subseteq W$. Thus $\operatorname{span} E$ is the smallest subspace of $V$ containing every element of $E$.

Definition 44. Let $V$ be a vector space and let $B \subseteq V$ be a subset. We say that $B$ is a basis of $V$ if $B$ is linearly independent and $\operatorname{span} B = V$.
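
In $\mathbb{R}^n$ a finite set is a basis exactly when the square matrix having those vectors as columns is nonsingular (the set is then independent and spanning). A small sketch extending the rank check above (the candidate set is an arbitrary example):

```python
import numpy as np

def is_basis_of_Rn(vectors):
    """True if the vectors form a basis of R^n, where n is their common length."""
    M = np.column_stack(vectors)
    n = M.shape[0]
    return M.shape[1] == n and np.linalg.matrix_rank(M) == n

B = [np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0]), np.array([1.0, 1.0, 0.0])]
print(is_basis_of_Rn(B))    # True: independent and spanning, so every x in R^3
                            # has unique coordinates with respect to B
```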

Definition 45. Let $V$ be a vector space; if there exists a subset $B \subseteq V$ such that $B$ is a basis of $V$ and $B$ is a finite set, then we say that $V$ is finite dimensional. If $V$ is not finite dimensional then we say that $V$ is infinite dimensional.

Example 46. The space $P_n$ of polynomials (in one variable) with degree at most $n$ is finite dimensional, because the set $\{1, t, t^2, \ldots, t^n\}$ is a basis of $P_n$. The space $P$ of all polynomials (in one variable) is infinite dimensional.

Lemma 47. Let $V, W$ be vector spaces and let $T : V \to W$ be a linear transformation. If $T$ is a linear isomorphism and $V$ is finite dimensional then $W$ is also finite dimensional.

Lemma 48. Let $V$ be an infinite dimensional vector space; then, for every $n \in \mathbb{N}$ there exists a linearly independent subset $E \subseteq V$ such that $E$ has exactly $n$ elements.

Proof. Use induction. For $n = 1$ this is trivial: let $E = \{x_0\}$ for any $x_0 \in V$ with $x_0 \neq 0$ (such an $x_0$ exists because the trivial vector space is finite dimensional). Suppose now that for some $n \in \mathbb{N}$ there exists a linearly independent subset $E \subseteq V$ such that $E$ has exactly $n$ elements. We claim that $\operatorname{span} E \neq V$; indeed, if this were not the case then $V$ would be finite dimensional. Therefore there exists a vector $z \in V$ such that $z \notin \operatorname{span} E$. Then $E \cup \{z\}$ is a linearly independent subset of $V$ having exactly $n + 1$ elements (a dependence relation with a nonzero coefficient on $z$ would express $z$ as a linear combination of elements of $E$, while one not involving $z$ would contradict the independence of $E$).

Lemma 49. Let $V$ be a finite dimensional vector space, with a finite basis $B$ having exactly $N$ elements. Then every linearly independent subset of $V$ has at most $N$ elements.

Proof. Let $E \subseteq V$ be a linearly independent subset; we will assume $E$ has at least $N + 1$ elements to reach a contradiction, hence proving the lemma. Let $v_1, v_2, \ldots, v_{N+1} \in E$ be $N + 1$ distinct elements of $E$. Since $B$ is a basis of $V$, each $v \in V$ is a linear combination of elements of $B$. Denoting the elements of $B$ as $w_1, w_2, \ldots, w_N$, we have numbers $c_{i,j} \in \mathbb{R}$ such that
$$v_1 = c_{1,1} w_1 + c_{2,1} w_2 + \cdots + c_{N,1} w_N$$
$$v_2 = c_{1,2} w_1 + c_{2,2} w_2 + \cdots + c_{N,2} w_N$$
$$\vdots$$
$$v_{N+1} = c_{1,N+1} w_1 + c_{2,N+1} w_2 + \cdots + c_{N,N+1} w_N$$
Arrange the numbers $c_{i,j}$ as the following $N \times (N+1)$ matrix:
$$C = \begin{pmatrix} c_{1,1} & c_{1,2} & \cdots & c_{1,N+1} \\ c_{2,1} & c_{2,2} & \cdots & c_{2,N+1} \\ \vdots & \vdots & & \vdots \\ c_{N,1} & c_{N,2} & \cdots & c_{N,N+1} \end{pmatrix}$$
We can solve the equation $Cx = 0$ (with $x \in \mathbb{R}^{N+1}$) by row reduction. Now, in reduced row echelon form (RREF) each row can have at most one pivot; since there are $N$ rows, there can be at most $N$ pivots in the RREF. Hence there is at least one free variable, which can take on any real value. Therefore there are infinitely many solutions to the equation $Cx = 0$, and this certainly implies that there exists some $x \neq 0$ such that $Cx = 0$. Call this vector $\tilde{x} = (\tilde{x}_1, \tilde{x}_2, \ldots, \tilde{x}_{N+1}) \neq 0$. Now consider the following vector:
$$\tilde{v} = \tilde{x}_1 v_1 + \tilde{x}_2 v_2 + \cdots + \tilde{x}_{N+1} v_{N+1}$$
Using the more compact summation notation, this can be written
$$\tilde{v} = \sum_{k=1}^{N+1} \tilde{x}_k v_k.$$
But $v_k = \sum_{j=1}^{N} c_{j,k} w_j$, therefore
$$\tilde{v} = \sum_{k=1}^{N+1} \sum_{j=1}^{N} c_{j,k} \tilde{x}_k w_j.$$
Re-arranging, this says
$$\tilde{v} = \sum_{j=1}^{N} \left( \sum_{k=1}^{N+1} c_{j,k} \tilde{x}_k \right) w_j.$$
But $\sum_{k=1}^{N+1} c_{j,k} \tilde{x}_k$ is just the $j$th entry of the vector $C\tilde{x}$, and by construction $C\tilde{x} = 0$. Therefore $\sum_{k=1}^{N+1} c_{j,k} \tilde{x}_k = 0$ for each $j \in \{1, 2, \ldots, N\}$, and we have
$$\tilde{v} = \sum_{j=1}^{N} 0 \cdot w_j = 0.$$
Hence $\tilde{v} = 0$. Then again we have $\tilde{v} = \sum_{k=1}^{N+1} \tilde{x}_k v_k$, hence
$$\tilde{x}_1 v_1 + \tilde{x}_2 v_2 + \cdots + \tilde{x}_{N+1} v_{N+1} = 0.$$
Since $v_1, v_2, \ldots, v_{N+1}$ are distinct elements of $E$, and the numbers $\tilde{x}_k$ are not all zero, this implies that the set $E$ is not linearly independent, so we have a contradiction.

Theorem 50. Let $V$ be a vector space and let $W \subseteq V$ be a subspace. If $V$ is finite dimensional then $W$ is finite dimensional.

Proof. Suppose $W$ is infinite dimensional. By Lemma 48, for each natural number $n$ there is a linearly independent subset $E$ of $W$ having exactly $n$ elements. But a linearly independent subset of $W$ is also a linearly independent subset of $V$. Therefore, for each natural number $n$ there is a linearly independent subset $E$ of $V$ having exactly $n$ elements. On the other hand, $V$ is finite dimensional, so it has a finite basis $B$. Let $N$ be the number of elements of $B$. Then by Lemma 49, any linearly independent subset of $V$ has at most $N$ elements. But we just said that, for every $n \in \mathbb{N}$, $V$ has a linearly independent subset $E$ having exactly $n$ elements; choosing $n = N + 1$ yields the contradiction.
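
The pivot-counting step in the proof of Lemma 49 is easy to see computationally: a homogeneous system with more unknowns than equations always has a nonzero solution. A sketch with an arbitrary $3 \times 4$ matrix $C$ (SymPy's nullspace does the row reduction):

```python
import sympy as sp

C = sp.Matrix([[1, 2, 0, 3],
               [0, 1, 1, 1],
               [2, 0, 1, 4]])            # N = 3 equations, N + 1 = 4 unknowns

null_vectors = C.nullspace()              # nonempty, since there must be a free variable
x_tilde = null_vectors[0]
assert x_tilde != sp.zeros(4, 1)          # a nonzero solution of C x = 0
assert C * x_tilde == sp.zeros(3, 1)
print("nonzero solution x~ =", list(x_tilde))
```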

Remark 51. Note that in Theorem 50 we have proven that if $W$ is a subspace of the finite dimensional space $V$ then $W$ is finite dimensional; in particular, $W$ has a finite basis. However, we did not actually construct any particular basis for $W$; indeed, given a basis $B$ of $V$, it is entirely possible that $B \cap W = \emptyset$.

Due to Theorem 50, we do not always have to exhibit a finite basis to show that a vector space $W$ is finite dimensional; it is sufficient to show that $W$ is linearly isomorphic to a subspace of a finite dimensional space.

Example 52. Let $V$ be the set of all smooth functions $f : \mathbb{R} \to \mathbb{R}$ such that
$$\forall t \in \mathbb{R}, \quad f''(t) - f(t) = 0.$$
Now $V$ is a vector space, and it is a subspace of the space of all smooth functions on $\mathbb{R}$, but that larger vector space is not finite dimensional. To show that $V$ is finite dimensional, we can define the following map $T : \mathbb{R}^2 \to V$:
$$T \begin{pmatrix} c_1 \\ c_2 \end{pmatrix} = c_1 f_1 + c_2 f_2$$
where $f_1(t) = e^t$ and $f_2(t) = e^{-t}$. Then $T$ is one-to-one because the Wronskian $W[f_1, f_2] = -2 \neq 0$; also, $T$ is onto because all solutions of the ODE are of the form $c_1 f_1 + c_2 f_2$ for some constants $c_1, c_2$. Hence $V$ is linearly isomorphic to the finite dimensional space $\mathbb{R}^2$, so $V$ is itself finite dimensional. (Note that we could equally well observe that $\{f_1, f_2\}$ is a basis of $V$ in order to conclude that $V$ is finite dimensional.)
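
The Wronskian computation in Example 52 can be verified symbolically; the sketch below checks that $f_1, f_2$ solve the ODE and computes $W[f_1, f_2] = f_1 f_2' - f_2 f_1'$ directly:

```python
import sympy as sp

t = sp.symbols('t')
f1, f2 = sp.exp(t), sp.exp(-t)

# Both functions solve f'' - f = 0 ...
assert sp.simplify(sp.diff(f1, t, 2) - f1) == 0
assert sp.simplify(sp.diff(f2, t, 2) - f2) == 0

# ... and their Wronskian f1 f2' - f2 f1' is the nonzero constant -2,
# so c1 f1 + c2 f2 = 0 forces c1 = c2 = 0 (T is one-to-one).
W = sp.simplify(f1 * sp.diff(f2, t) - f2 * sp.diff(f1, t))
print(W)    # -2
```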

Theorem 53. Let $V$ be a finite dimensional vector space; furthermore, let $B_1$ be a basis of $V$, and suppose $B_2$ is also a basis of $V$. Then $B_1$ and $B_2$ are both finite sets and they have the same number of elements.

Proof. Since $V$ is finite dimensional, there is a finite basis of $V$; call it $B_0$. Let $N_0 \in \mathbb{N} \cup \{0\}$ be the number of elements of $B_0$. By Lemma 49, since $B_0$ is a finite basis of $V$ and $B_1$ is a linearly independent subset of $V$, we find that $B_1$ has at most $N_0$ elements. Moreover, again by Lemma 49, since $B_0$ is a finite basis of $V$ and $B_2$ is a linearly independent subset of $V$, we find that $B_2$ has at most $N_0$ elements. In particular, both $B_1$ and $B_2$ are finite bases. Now since $B_1, B_2$ are finite sets, let $N_1, N_2 \in \mathbb{N} \cup \{0\}$ denote (respectively) the sizes of $B_1, B_2$. By Lemma 49, since $B_1$ is a finite basis of $V$ and $B_2$ is a linearly independent subset of $V$, we have that $N_2 \leq N_1$. Then again, since $B_2$ is a finite basis of $V$ and $B_1$ is a linearly independent subset of $V$, we have that $N_1 \leq N_2$. Therefore $N_1 = N_2$.

Definition 54. Let $V$ be a finite dimensional vector space (then $V$ has a finite basis, because that is what it means to be finite dimensional). The dimension of $V$ is defined to be the number of elements in a basis of $V$; by Theorem 53, it does not matter which basis we choose. We write the dimension of $V$ as $\dim V$. If $V$ is infinite dimensional we may write $\dim V = \infty$ as a convenient (but nonrigorous) shorthand.

Theorem 55. Let $V, W$ be vector spaces and let $T : V \to W$ be a linear transformation. If $T$ is a linear isomorphism, and either $V$ or $W$ is finite dimensional, then both $V$ and $W$ are finite dimensional and $\dim V = \dim W$.

Proof. Simply observe, if $V$ is finite dimensional, that the image of any basis of $V$ under $T$ is a basis of $W$. Similarly, if $W$ is finite dimensional, then the image of any basis of $W$ under $T^{-1}$ is a basis of $V$.

Example 56. (Euclidean space) $\dim \mathbb{R}^n = n$

Example 57. (polynomials of degree at most $n$) $\dim P_n = n + 1$

Example 58. (all $m \times n$ real matrices) $\dim M_{mn} = mn$

Example 59. (all polynomials) $\dim P = \infty$

Theorem 60. Let $V$ be a finite-dimensional vector space with $\dim V = n$. Then $V$ is linearly isomorphic to $\mathbb{R}^n$.

Proof. Let $B = \{v_1, v_2, \ldots, v_n\}$ be a finite basis of $V$. Define a map $T : \mathbb{R}^n \to V$ as follows:
$$T \begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{pmatrix} = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n$$
It is trivial to check that $T$ is a linear map. Clearly $T$ is onto, since any $v \in V$ can be written as a linear combination of the vectors $v_1, v_2, \ldots, v_n$ (since $B$ is a basis). So it only remains to show that $T$ is one-to-one. Suppose there are numbers $c_1, c_2, \ldots, c_n$ and $c_1', c_2', \ldots, c_n'$ such that
$$T \begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{pmatrix} = T \begin{pmatrix} c_1' \\ c_2' \\ \vdots \\ c_n' \end{pmatrix}$$
Then by the definition of $T$ we have
$$c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = c_1' v_1 + c_2' v_2 + \cdots + c_n' v_n.$$
Therefore
$$(c_1 - c_1') v_1 + (c_2 - c_2') v_2 + \cdots + (c_n - c_n') v_n = 0.$$
But $B$ is a basis, hence linearly independent, so we conclude $c_1 - c_1' = 0$, $c_2 - c_2' = 0$, ..., $c_n - c_n' = 0$. Hence
$$\begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{pmatrix} = \begin{pmatrix} c_1' \\ c_2' \\ \vdots \\ c_n' \end{pmatrix},$$
so $T$ is one-to-one.
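
Theorem 60's isomorphism is just "take coordinates with respect to a basis." For $\mathbb{R}^3$, with the basis used in the earlier sketch, $T(c) = c_1 v_1 + c_2 v_2 + c_3 v_3$ is multiplication by the matrix whose columns are the $v_i$, and its inverse recovers the coordinates (an illustrative sketch with arbitrary numbers):

```python
import numpy as np

# A basis of R^3, written as the columns of M.
M = np.column_stack([[1.0, 0.0, 1.0], [0.0, 1.0, 1.0], [1.0, 1.0, 0.0]])

T = lambda c: M @ c                       # c -> c1 v1 + c2 v2 + c3 v3
T_inv = lambda v: np.linalg.solve(M, v)   # v -> its coordinates w.r.t. the basis

c = np.array([2.0, -1.0, 0.5])            # arbitrary coordinates
v = T(c)
assert np.allclose(T_inv(v), c)           # coordinates are recovered uniquely
print("v =", v, "has coordinates", T_inv(v))
```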

4 Matrix Representation of Linear Transformations

We have seen in Theorem 60 that, just by choosing a basis, any finite-dimensional vector space can be regarded as equivalent (in the sense of linear isomorphism) to a copy of $\mathbb{R}^n$. (Note carefully that extra structures, such as dot products, are not necessarily preserved even for linear isomorphisms from $\mathbb{R}^n$ to itself.) We have also seen that if $A \in M_{mn}$ is a matrix then $A$ defines a linear transformation $\mathbb{R}^n \to \mathbb{R}^m$ by left-multiplication of any (column) vector $x \in \mathbb{R}^n$. What we are going to show is that any linear transformation $\mathbb{R}^n \to \mathbb{R}^m$ arises as left-multiplication by some $m \times n$ matrix. Though we will not go into all the details (which you can find in any linear algebra textbook), by combining this result with Theorem 60, any linear transformation of finite-dimensional vector spaces $V$ and $W$ can be represented by a matrix. Of course, the matrix will depend on your choice of bases for $V$ and $W$; there is a standard rule (written in any linear algebra textbook) for transforming the matrix of a linear transformation from one pair of bases to another pair. We will not discuss those details.

Theorem 61. Let $T : \mathbb{R}^n \to \mathbb{R}^m$ be a linear transformation (where elements of $\mathbb{R}^n$ and $\mathbb{R}^m$ are regarded as column vectors). Then there exists a unique $m \times n$ matrix $A \in M_{mn}$ such that
$$\forall x \in \mathbb{R}^n, \quad Tx = Ax,$$
where $Ax$ is the usual matrix-vector product.

Proof. Let us first prove the uniqueness. Suppose that $A, B \in M_{mn}$ are two matrices that both coincide with $T$; in that case, we clearly have
$$\forall x \in \mathbb{R}^n, \quad Ax = Bx.$$
Therefore, taking $x = e_j$ (with $1 \leq j \leq n$) and dotting both sides against $e_i$ (with $1 \leq i \leq m$), we have
$$\forall\, 1 \leq j \leq n, \ \forall\, 1 \leq i \leq m, \quad e_i^T A e_j = e_i^T B e_j.$$
But this is equivalent to the following statement:
$$\forall\, 1 \leq j \leq n, \ \forall\, 1 \leq i \leq m, \quad a_{ij} = b_{ij};$$
in particular, $A = B$.

Now we turn to the existence. Let us write $y_j = T e_j \in \mathbb{R}^m$ for $1 \leq j \leq n$; furthermore, let us define the numbers $a_{ij}$, with $1 \leq i \leq m$ and $1 \leq j \leq n$, by the following formula:
$$a_{ij} = e_i^T y_j.$$
Define the matrix $A$ as follows:
$$A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}$$
Clearly $Ae_j = y_j$. Let $x \in \mathbb{R}^n$; then we can write $x = c_1 e_1 + c_2 e_2 + \cdots + c_n e_n = \sum_{k=1}^{n} c_k e_k$. Therefore
$$Tx = \sum_{k=1}^{n} c_k T e_k = \sum_{k=1}^{n} c_k y_k.$$
On the other hand,
$$Ax = \sum_{k=1}^{n} c_k A e_k = \sum_{k=1}^{n} c_k y_k.$$
Since both $Tx$ and $Ax$ are equal to $\sum_{k=1}^{n} c_k y_k$, it follows that $Tx = Ax$. But $x \in \mathbb{R}^n$ was arbitrary, so we conclude that
$$\forall x \in \mathbb{R}^n, \quad Tx = Ax.$$
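
The proof of Theorem 61 is also a recipe: the $j$-th column of $A$ is $T(e_j)$. A sketch (the linear map below is an arbitrary example, written first without reference to any matrix):

```python
import numpy as np

def T(x):
    """An arbitrary linear map R^3 -> R^2, written without a matrix."""
    return np.array([2 * x[0] - x[2], x[0] + 3 * x[1]])

n = 3
E = np.eye(n)
# Column j of A is T(e_j), exactly as in the existence part of the proof.
A = np.column_stack([T(E[:, j]) for j in range(n)])

x = np.array([1.0, -2.0, 4.0])            # arbitrary test vector
assert np.allclose(T(x), A @ x)           # T x = A x
print("A =\n", A)
```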
