Elementary linear algebra

Chapter 1  Elementary linear algebra

1.1 Vector spaces

Vector spaces owe their importance to the fact that so many models arising in the solutions of specific problems turn out to be vector spaces. The fundamental concepts are linear combination, linear dependence, basis, and subspace. The material of this chapter can be found in Zhang (Chapter 1) and Halmos (Chapters I-III).

Let F be a given field; usually we consider F = R or C.

Example. The set of all rational numbers, Q, is a field.

Definition. A vector space V over the field F is a set of elements (vectors) with two operations + (addition) and · (scalar multiplication) such that

(A) (V, +) is an Abelian group, i.e.,
1. commutative: x + y = y + x for all x, y ∈ V.
2. associative: x + (y + z) = (x + y) + z for all x, y, z ∈ V.
3. neutral element: there exists in V a unique vector 0 such that x + 0 = x for all x ∈ V.
4. inverse: for every x ∈ V there is a unique vector −x (called the inverse of x) such that x + (−x) = 0.

(B) Scalar multiplication α · x, or simply αx, for α ∈ F and x ∈ V, satisfies
1. α(βx) = (αβ)x for all α, β ∈ F, x ∈ V.
2. 1x = x for all x ∈ V.

(C) Distributive laws (scalar multiplication over addition):
1. α(x + y) = αx + αy for all α ∈ F, x, y ∈ V.
2. (α + β)x = αx + βx for all α, β ∈ F, x ∈ V.

Example. R^n over R; C^n over C or over R; P = the set of all polynomials in the indeterminate t; P_n = the set of polynomials in t of degree no greater than n − 1; and ℱ = the set of real-valued functions.

Remark. If the field F is replaced by a ring, V is called a module (→ Group Representation Theory).

Definition (linear combination). Let x_1, …, x_n ∈ V. The vector x = α_1 x_1 + ⋯ + α_n x_n, where α_1, …, α_n ∈ F, is called a linear combination of x_1, …, x_n.

Example. 1. The vector x = (1, 3) is a linear combination of x_1 = (1, 0) and x_2 = (1, 2), since (1, 3) = −(1/2)(1, 0) + (3/2)(1, 2), by direct observation or by solving a 2 × 2 system of linear equations.
2. If A is an m × n matrix and x = (x_1, …, x_n)^T is an n × 1 vector, the product Ax is indeed x_1 a_1 + ⋯ + x_n a_n, a linear combination of the columns a_1, …, a_n of A.

Definition (linear dependence and independence). A finite set of vectors S = {x_1, …, x_n} is linearly dependent if α_1 x_1 + ⋯ + α_n x_n = 0 where α_1, …, α_n ∈ F are not all zero, i.e., there is a nontrivial zero combination among x_1, …, x_n. Otherwise the set is linearly independent, i.e., α_1 x_1 + ⋯ + α_n x_n = 0 implies that α_1, …, α_n are all zero. Sometimes we say that x_1, …, x_n are linearly independent or dependent instead of the set S. In case S ⊆ V is infinite, we say that S is linearly independent if every finite subset of S is such; otherwise S is linearly dependent.

Remark. 1. Every set containing a linearly dependent set is also linearly dependent. Every subset of a linearly independent set is also linearly independent.
2. The empty set of vectors is linearly independent: "α_1 x_1 + ⋯ + α_n x_n = 0 implies that α_1, …, α_n are all zero" can be interpreted as "if α_1 x_1 + ⋯ + α_n x_n = 0, then there is no index i for which α_i ≠ 0."

Exercise. Show that the three vectors x, x_1, x_2 in the last example are linearly dependent but any two of them are linearly independent.

Theorem (Halmos p.9). The set of nonzero vectors x_1, …, x_n is linearly dependent if and only if some x_k, 2 ≤ k ≤ n, is a linear combination of the preceding ones.

Proof. (⇒) Let k be the first integer between 2 and n for which x_1, …, x_k are linearly dependent (if worst comes to worst, k = n will do). Then α_1 x_1 + ⋯ + α_k x_k = 0 for some α_1, …, α_k not all zero. Now we cannot have α_k = 0, otherwise we should have a linear dependence relation among x_1, …, x_{k−1}, contrary to the definition of k. Hence
x_k = −(α_1/α_k) x_1 − ⋯ − (α_{k−1}/α_k) x_{k−1}.
Thus we have the necessity.
(⇐) Since every set containing a linearly dependent set is also linearly dependent, we have the sufficiency.

Corollary. The set of nonzero vectors x_1, …, x_n is linearly dependent if and only if some x_k is a linear combination of the others.

Definition. Let S ⊆ V be a subset of vectors. The span of S, denoted by span S, is the set of all linear combinations of finitely many vectors of S. The vector space V is spanned by S if span S = V.

Example. Let V = R^3, S_1 = {(1, 0, 0), (0, 1, 0)}, and S_2 = {(−1, 2, 0), (0, 1, 0)}. Then span S_1 and span S_2 are both the xy-plane, and V is spanned by the set S_3 = {(1, 0, 0), (0, 1, 0), (0, 0, 1)}.

Definition. A basis for a vector space V is a set B ⊆ V of linearly independent vectors such that B spans V. V is finite dimensional if it has a finite basis.

Remark. Infinite dimensional vector spaces → normed spaces, Hilbert spaces, Banach spaces. The proof of the existence of a basis for an infinite dimensional vector space involves Zorn's lemma. We will only study finite dimensional vector spaces; from now on, if we do not specify otherwise, all vector spaces are finite dimensional.

Theorem. If B = {x_1, …, x_n} is a basis for V, then every x ∈ V can be written as x = α_1 x_1 + ⋯ + α_n x_n, and α_1, …, α_n ∈ F are uniquely determined by x.

Proof. Surely x can be expressed as a linear combination of x_1, …, x_n since B is a basis. If x = α_1 x_1 + ⋯ + α_n x_n = β_1 x_1 + ⋯ + β_n x_n, then (α_1 − β_1) x_1 + ⋯ + (α_n − β_n) x_n = 0. By the linear independence of B, we have α_i = β_i for all i = 1, …, n.
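Finding the coefficients in such a representation is a concrete computation: for a basis of F^n arranged as the columns of a matrix, the coefficients solve a nonsingular linear system. A minimal numerical sketch of this (my own illustration, not part of the notes; it assumes NumPy and uses the basis {(1,0), (1,2)} and the vector (1,3) from the example above):

```python
import numpy as np

# Basis B = {x1, x2} of R^2, stored as the columns of a matrix, and a target vector x.
B = np.column_stack([(1.0, 0.0), (1.0, 2.0)])   # x1 = (1, 0), x2 = (1, 2)
x = np.array([1.0, 3.0])

# The coefficients alpha in x = alpha_1*x1 + alpha_2*x2 solve B @ alpha = x;
# since the columns of B are linearly independent, the solution is unique.
alpha = np.linalg.solve(B, x)
print(alpha)                                    # [-0.5  1.5], as in the example above
assert np.allclose(B @ alpha, x)
```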

The unique-representation theorem justifies the following definition.

Definition (coordinate vector). If x = α_1 x_1 + ⋯ + α_n x_n, the vector
x_B := (α_1, α_2, …, α_n)^T ∈ F^n
is called the coordinate vector of x with respect to the basis B = {x_1, …, x_n}.

Theorem (extension to a basis, Halmos p.11). If {x_1, …, x_k} is a linearly independent set in a finite dimensional vector space V, then it can be extended to a basis, i.e., we can find vectors x_{k+1}, …, x_n ∈ V such that x_1, …, x_k, x_{k+1}, …, x_n is a basis for V.

Proof. Since V is finite dimensional, there is a basis, say {y_1, …, y_n}. Consider the set S of vectors x_1, …, x_k, y_1, …, y_n, in this order. Since S is linearly dependent (?), applying the theorem above (Halmos p.9) to S, let y_i be the first vector such that linear dependence occurs among x_1, …, x_k, y_1, …, y_i. Then consider the set S′ of vectors x_1, …, x_k, y_1, …, y_{i−1}, y_{i+1}, …, y_n. The deletion of y_i does not reduce the span of the remaining vectors. If S′ is linearly independent, then we are done. Otherwise we apply the theorem again, until we reach a basis.

Theorem (Halmos p.13). Every basis for a finite dimensional vector space V contains the same number of vectors.

Proof. Let {x_1, …, x_n} and {y_1, …, y_m} be two bases for V. Consider the set S of vectors y_m, x_1, …, x_n, which is clearly linearly dependent. Applying the theorem above (Halmos p.9), we obtain the set S′ of vectors y_m, x_1, …, x_{i−1}, x_{i+1}, …, x_n such that span S′ = V. Then add y_{m−1} in front of these vectors and apply the same argument. Continuing in this way, the x's will not be exhausted before the y's, otherwise the remaining y's would have to be linear combinations of the ones already incorporated, but the y's are linearly independent. So n ≥ m. By symmetry m ≥ n. So n = m.
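The proof of the extension theorem is constructive, and the same idea is easy to imitate numerically for subspaces of R^n: append the vectors of a known basis one by one and keep only those that enlarge the span. A rough sketch (my own, assuming NumPy; matrix rank is used as a stand-in for linear independence, which is adequate only away from nearly dependent vectors):

```python
import numpy as np

def extend_to_basis(indep_vectors, n):
    """Extend a linearly independent list of vectors in R^n to a basis of R^n."""
    vecs = [np.asarray(v, dtype=float) for v in indep_vectors]
    for e in np.eye(n):                      # candidates: the standard basis vectors
        if np.linalg.matrix_rank(np.column_stack(vecs + [e])) > len(vecs):
            vecs.append(e)                   # e is not in the current span, so keep it
        if len(vecs) == n:
            break
    return vecs

basis = extend_to_basis([[1.0, 1.0, 0.0]], 3)    # extends {(1,1,0)} to a basis of R^3
print(np.column_stack(basis))
```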

The theorem that all bases have the same cardinality justifies the following definition.

Definition (dimension). The dimension of a finite dimensional vector space V is the number of vectors in any basis for V, denoted by dim V.

Corollary. If dim V = n, then any set of m vectors in V with m > n is linearly dependent.

Proof. By the preceding theorems: a linearly independent set of m vectors could be extended to a basis, which would then have more than n elements.

Exercise. Show that dim R^n = n, dim C^n = n, and dim P_n = n by providing a basis for each. Show that P and ℱ are infinite dimensional.

Definition. A nonempty subset W ⊆ V of a vector space is called a subspace of V if it is also a vector space under the same operations.

Example. 1. The subspaces {0} (which will simply be denoted by 0) and V are called the trivial subspaces of V.
2. Let V = R^3. Then any plane and any line passing through the origin is a nontrivial subspace.
3. If S ⊆ V, then span S is a subspace of V.
4. The solution set of the system of linear equations Ax = 0 is a subspace of R^n (C^n), where A is an m × n real (complex) matrix.

Exercise. Show that a nonempty set W ⊆ V is a subspace of V if and only if αu + βv ∈ W for all α, β ∈ F and u, v ∈ W, i.e., W is closed with respect to addition and scalar multiplication.

Theorem (Halmos p.17). The intersection of any (possibly infinite) collection of subspaces is a subspace.

Proof. By the preceding exercise.

Exercise. Find an example to show that the union of subspaces may not be a subspace.

Definition. If V_1, V_2 ⊆ V are subspaces of V, then the sum of V_1 and V_2, denoted by V_1 + V_2, is the set V_1 + V_2 = {v_1 + v_2 : v_1 ∈ V_1, v_2 ∈ V_2}. It is a subspace of V containing V_1 ∪ V_2.

Exercise. Show that the sum V_1 + V_2 is the smallest subspace containing V_1 and V_2.

Theorem. Let V_1 and V_2 be subspaces of V. Then
dim V_1 + dim V_2 = dim(V_1 + V_2) + dim(V_1 ∩ V_2).

Proof. Let B = {x_1, …, x_k} be a basis for V_1 ∩ V_2. Extend B to a basis for V_1, B_1 = {x_1, …, x_k, x_{k+1}, …, x_{k+i}} (so dim V_1 = k + i), and extend B to a basis for V_2, B_2 = {x_1, …, x_k, x_{k+i+1}, …, x_{k+i+j}} (so dim V_2 = k + j), via the extension theorem. We claim that B′ := B_1 ∪ B_2 = {x_1, …, x_{k+i+j}} is a basis for V_1 + V_2. Clearly span B′ = V_1 + V_2, because span B_1 = V_1, span B_2 = V_2, and span B_1 + span B_2 ⊆ span B′ (?). If α_1 x_1 + ⋯ + α_{k+i+j} x_{k+i+j} = 0, then
α_1 x_1 + ⋯ + α_{k+i} x_{k+i} = −α_{k+i+1} x_{k+i+1} − ⋯ − α_{k+i+j} x_{k+i+j} ∈ V_1 ∩ V_2 (?).
So
α_1 x_1 + ⋯ + α_{k+i} x_{k+i} = β_1 x_1 + ⋯ + β_k x_k,
−α_{k+i+1} x_{k+i+1} − ⋯ − α_{k+i+j} x_{k+i+j} = β_1 x_1 + ⋯ + β_k x_k,
for some scalars β. From the first equation, β_m = α_m for m = 1, …, k and α_m = 0 for m = k + 1, …, k + i; from the second equation, β_m = 0 for m = 1, …, k and α_{k+i+m} = 0 for m = 1, …, j (?). Thus all the α's are zero.

Definition. The sum V_1 + V_2 is called a direct sum if V_1 ∩ V_2 = 0, and it is then denoted by V_1 ⊕ V_2. It is called a direct sum of V if in addition V_1 ⊕ V_2 = V.

Exercise. Describe explicitly the sum of V_1 = span{(1, 0, 0)} and V_2 = span{(1, 1, 0), (0, 0, 1)}. Is it a direct sum of R^3?

Theorem. Let V_1 and V_2 be subspaces of V. Then V = V_1 ⊕ V_2 if and only if every x ∈ V can be written uniquely in the form x = x_1 + x_2 where x_1 ∈ V_1, x_2 ∈ V_2.

Proof. (⇒) Since V = V_1 ⊕ V_2, each x ∈ V can be expressed as x = x_1 + x_2 with x_1 ∈ V_1, x_2 ∈ V_2. Suppose also x = y_1 + y_2 where y_1 ∈ V_1, y_2 ∈ V_2. Then x_1 + x_2 = y_1 + y_2, which implies that x_1 − y_1 = y_2 − x_2 ∈ V_1 ∩ V_2 = 0. So x_1 = y_1 and x_2 = y_2.
(⇐) We clearly have V = V_1 + V_2, so it suffices to show that V_1 ∩ V_2 = 0. Let x ∈ V_1 ∩ V_2. Write x = 0 + x and x = x + 0. By the uniqueness of the expression, x = 0 follows immediately.

Theorem. Suppose V_1 ⊕ V_2 is a direct sum. If B_1 = {x_1, …, x_n} is a basis for V_1 and B_2 = {y_1, …, y_m} is a basis for V_2, then B = {x_1, …, x_n, y_1, …, y_m} is a basis for V_1 ⊕ V_2. Thus dim(V_1 ⊕ V_2) = dim V_1 + dim V_2.

Proof. It is clear that span B = V_1 ⊕ V_2. Suppose α_1 x_1 + ⋯ + α_n x_n + β_1 y_1 + ⋯ + β_m y_m = 0 for some scalars α's and β's. Rearranging yields
α_1 x_1 + ⋯ + α_n x_n = −β_1 y_1 − ⋯ − β_m y_m,
which is a vector in V_1 ∩ V_2. So the α's and β's are all zero (?). Thus B is linearly independent.
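For subspaces of F^n presented by spanning sets, both sides of the dimension formula can be computed with matrix ranks: dim(V_1 + V_2) is the rank of the combined spanning set, and dim(V_1 ∩ V_2) is then forced by the theorem. A small check on a concrete pair of subspaces of R^4 (my own illustration, assuming NumPy; the two spanning matrices are hypothetical data):

```python
import numpy as np

# V1 = span of the columns of A1, V2 = span of the columns of A2 (both subspaces of R^4).
A1 = np.array([[1, 0], [0, 1], [0, 0], [0, 0]], dtype=float)    # dim V1 = 2
A2 = np.array([[0, 0], [1, 0], [0, 1], [0, 0]], dtype=float)    # dim V2 = 2

dim_V1 = np.linalg.matrix_rank(A1)
dim_V2 = np.linalg.matrix_rank(A2)
dim_sum = np.linalg.matrix_rank(np.hstack([A1, A2]))            # dim(V1 + V2)
dim_int = dim_V1 + dim_V2 - dim_sum                             # forced by the theorem
print(dim_V1, dim_V2, dim_sum, dim_int)                         # 2 2 3 1
```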

Problems

1. Show that R^n is a vector space over R but not over C.

2. Show that P_n, the set of all polynomials (with complex coefficients) of degree less than or equal to n − 1, together with the zero polynomial (we include it because the degree of the zero polynomial is not defined), is a vector space. What happens if complex coefficients are replaced by real coefficients?

3. Let V be a vector space. Prove that if x, y ∈ V and α ∈ F, then
(a) 0 + x = x.
(b) −0 = 0.
(c) α·0 = 0.
(d) 0·x = 0.
(e) If αx = 0, then either x = 0 or α = 0.
(f) −x = (−1)x.
(g) y + (x − y) = x (here x − y = x + (−y)).

4. Under what conditions on the scalar α are the vectors (1 + α, 1 − α) and (1 − α, 1 + α) in C^2 linearly independent? What is the answer if C^2 is replaced by R^2?

5. Show that the vectors (1, 0), (0, 1), and (1, 1) are linearly dependent by expressing one as a linear combination of the others.

6. Show that the vector x = (6, 7, 3) is a linear combination of the vectors x_1 = (2, 3, 1), x_2 = (3, 5, 2), and x_3 = (1, 1, 0).

7. Is the set {(x, y) ∈ R^2 : x − 2y = 0} a subspace of R^2? How about {(x, y) ∈ R^2 : x − 2y = 1}?

8. Let V be a vector space. Show that the following are bases for V:
(a) any maximal linearly independent set B = {x_1, …, x_n}, i.e., B is linearly independent and any set properly containing B would be linearly dependent, and
(b) any minimal spanning set B = {x_1, …, x_n}, i.e., the span of B is V and no proper subset of B spans V.

9. (a) What is the dimension of C^n over C? What is the dimension of R^n over R? Is C^n a vector space over R? If so, what is its dimension?
(b) Every complex vector space V, i.e., a space over C, can easily be turned into a real vector space. How can this be achieved, and what is the dimension of V over R?

10. Show that the space of polynomials P in the variable t (e.g., p(t) := t^2 + 3t + 2 is an element of P) with complex coefficients is not finite dimensional, by giving an infinite linearly independent set.

11. Prove that if W ⊆ V is a subspace of the vector space V and dim W = dim V, then W = V.

12. Let V be a vector space and L, M, N ⊆ V subspaces of V.
(a) Show by an example that the equation L ∩ (M + N) = (L ∩ M) + (L ∩ N) is not necessarily true.
(b) Prove that L ∩ (M + (L ∩ N)) = (L ∩ M) + (L ∩ N).

13. Suppose that V is a vector space and V_1, V_2 ⊆ V are subspaces. Show that V_1 ⊕ V_2 = V if and only if dim V_1 + dim V_2 = dim V.

1.2 Linear maps and matrices

Definition. Let U and V be vector spaces over the same field F. A linear map T : U → V is a map that satisfies
T(α_1 x_1 + α_2 x_2) = α_1 T(x_1) + α_2 T(x_2) for all x_1, x_2 ∈ U and α_1, α_2 ∈ F.
When U = V, we call T : V → V a linear operator, or simply an operator.

Example. 1. The zero map is a linear map.
2. The identity map I : V → V is a linear operator.
3. A rotation of R^2 by an angle θ is a linear operator.
4. The differential operator d/dt : P → P is a linear map defined by (d/dt)f = df/dt, the derivative of f(t).

Exercise. 1. T(0) = 0 for any linear map T.
2. The set L(U, V) of all linear maps from U to V is a vector space. Show that if T : U → V is an isomorphism, then the inverse T^{−1} : V → U is linear.

Definition. The set Ker T = {x ∈ U : T(x) = 0} is called the kernel of T. The set Im T = {T(x) : x ∈ U} is called the range of T.

Exercise. Show that Ker T and Im T are subspaces of U and V respectively.

Theorem. Let T : U → V be a linear map. The following are equivalent.
1. T is injective.
2. The kernel of T is 0.
3. T maps linearly independent sets to linearly independent sets, i.e., if {x_1, …, x_n} is linearly independent, then so is {T(x_1), …, T(x_n)}.

Proof. (1) ⇒ (2): Clear.
(2) ⇒ (1): Suppose T(x_1) = T(x_2). Then T(x_1 − x_2) = 0, thus x_1 = x_2, and hence T is injective.
(2) ⇒ (3): Let {x_1, …, x_n} be a linearly independent set and let α_1 T(x_1) + ⋯ + α_n T(x_n) = 0. Then T(α_1 x_1 + ⋯ + α_n x_n) = 0 (?). By (2), α_1 x_1 + ⋯ + α_n x_n = 0, and thus the α's are all zero (?). So {T(x_1), …, T(x_n)} is linearly independent.
(3) ⇒ (2): Suppose T(x) = 0. Then {T(x)} is linearly dependent, and thus {x} is linearly dependent. So x = 0.

Theorem. Let T : U → V be a linear map. Then dim U = dim Ker T + dim Im T.

Proof. Let {x_1, …, x_k} be a basis for Ker T and extend it to a basis {x_1, …, x_k, x_{k+1}, …, x_n} for U. The set {T(x_{k+1}), …, T(x_n)} is a basis for Im T: each element of Im T is of the form T(x) for some x = α_1 x_1 + ⋯ + α_n x_n, and then T(x) = α_1 T(x_1) + ⋯ + α_n T(x_n) = α_{k+1} T(x_{k+1}) + ⋯ + α_n T(x_n). If α_{k+1} T(x_{k+1}) + ⋯ + α_n T(x_n) = 0, then T(α_{k+1} x_{k+1} + ⋯ + α_n x_n) = 0 and thus α_{k+1} x_{k+1} + ⋯ + α_n x_n ∈ Ker T. So the α's are all zero (?).

Definition. An isomorphism T : U → V is a bijective linear map, and the spaces U and V are then said to be isomorphic. In this case dim U = dim V by the above theorem.
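For the map T(x) = Ax induced by an m × n matrix, the theorem reads n = dim Ker A + rank A, which is easy to confirm numerically. A quick check (my own sketch, assuming NumPy and SciPy are available; the matrix is made rank-deficient on purpose):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])          # a rank-1 map T: R^3 -> R^2

rank = np.linalg.matrix_rank(A)          # dim Im T
nullity = null_space(A).shape[1]         # dim Ker T (the columns form a basis of the kernel)
print(rank, nullity, A.shape[1])         # 1 2 3, and indeed 1 + 2 = 3
```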

Indeed U and V are isomorphic if and only if they have the same dimension. For if they have the same dimension, we fix (ordered) bases B_1 = {x_1, …, x_n} and B_2 = {y_1, …, y_n} for U and V respectively, and define the isomorphism T : U → V in the obvious way: T(x_i) = y_i, i = 1, …, n, extended by linearity, i.e., T(Σ_{i=1}^n α_i x_i) = Σ_{i=1}^n α_i y_i. In particular, if V = F^n we may choose the basis B_2 = {e_1, …, e_n} of F^n; the natural isomorphism T then sends Σ_{i=1}^n α_i x_i to Σ_{i=1}^n α_i e_i.

An m × n matrix A over F naturally induces a linear map T : F^n → F^m by T(x) = Ax, x ∈ F^n.

Example. Let
A = [ 1  2 ]
    [ 0  3 ]
    [ 2 −1 ].
Then T : R^2 → R^3 is defined by
T(x_1, x_2)^T = (x_1 + 2x_2, 3x_2, 2x_1 − x_2)^T.
To save space we often write T(x_1, x_2)^T = A(x_1, x_2)^T = (x_1 + 2x_2, 3x_2, 2x_1 − x_2)^T.

Conversely, if T : U → V is a linear map and we fix bases B = {x_1, …, x_n} and C = {y_1, …, y_m} for U and V respectively, we have
T(x_i) = Σ_{j=1}^m a_{ji} y_j, i = 1, …, n.

Definition (matrix representation). The m × n matrix A = (a_{ij}), denoted by M_B^C(T), is called the matrix representation of T with respect to the bases B and C, or simply the matrix of T. If x = Σ_{k=1}^n α_k x_k, then
T(x) = Σ_{k=1}^n α_k T(x_k) = Σ_{k=1}^n Σ_{j=1}^m a_{jk} α_k y_j = Σ_{j=1}^m ( Σ_{k=1}^n a_{jk} α_k ) y_j.
Thus
(T(x))_C = M_B^C(T) x_B for all x ∈ U,
i.e., the coordinate vector of T(x) with respect to the basis C is the product of the matrix of T with respect to the bases B and C and the coordinate vector of x with respect to B. Moreover, M_B^C(T) is uniquely determined by the above formula (?).
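As a concrete instance of a matrix representation, one can take the differentiation map d/dt on the space of polynomials of degree less than 4, with the monomial basis {1, t, t^2, t^3} used for both B and C. The sketch below (my own illustration, assuming NumPy and this particular choice of basis) builds the matrix column by column from (d/dt) t^i = i t^{i−1} and checks it against the coefficient vector of a derivative:

```python
import numpy as np
from numpy.polynomial import polynomial as npoly

n = 4                                    # work in the space of polynomials of degree < 4
D = np.zeros((n, n))                     # column i: coordinates of d/dt applied to t**i
for i in range(1, n):
    D[i - 1, i] = i                      # d/dt (t^i) = i * t^(i-1)

p = np.array([2.0, 3.0, 0.0, 1.0])       # coordinate vector of 2 + 3t + t^3 in {1, t, t^2, t^3}
deriv = np.zeros(n)
deriv[:n - 1] = npoly.polyder(p)         # coefficients of the derivative 3 + 3t^2
assert np.allclose(D @ p, deriv)         # (d/dt f)_C = M_B^C(d/dt) f_B
print(D)
```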

Theorem. Let S, T : U → V be linear maps and let B, C be bases of U and V respectively. Then M_B^C(aS + bT) = a M_B^C(S) + b M_B^C(T) for all scalars a, b; indeed M_B^C is an isomorphism from L(U, V) onto F^{m×n}.

Theorem. Let U, V, W be vector spaces with bases B, C, D respectively, and let S : U → V and T : V → W be linear maps. Then
M_B^D(TS) = M_C^D(T) M_B^C(S).

Proof. Let A := M_C^D(T) and E := M_B^C(S). Then for all x ∈ U,
(TS(x))_D = (T(S(x)))_D = A (S(x))_C = A E x_B,
and the desired result follows.

Corollary. M_B^C(I) M_C^B(I) = M_C^B(I) M_B^C(I) = I, i.e., (M_B^C(I))^{−1} = M_C^B(I).

The matrix M_B^C(I) (the matrix representation of the identity operator I : V → V with respect to B and C) is called the matrix of change of bases from B to C; it gives the relation between the coordinate vectors x_B and x_C of x ∈ V:
x_C = M_B^C(I) x_B.

Exercise. Let B = {x_1, …, x_n} and C = {y_1, …, y_n} be bases of V. Show that if S : V → V is the linear map defined by S x_i = y_i, i = 1, …, n, then M_B^B(S) = M_C^B(I), the matrix of change of bases from C to B.

The following offers the relationship between M_{B_1}^{B_2}(T) and M_{C_1}^{C_2}(T) if B_1 and C_1 are bases for U and B_2 and C_2 are bases for V.

Theorem. M_{C_1}^{C_2}(T) = M_{B_2}^{C_2}(I) M_{B_1}^{B_2}(T) (M_{B_1}^{C_1}(I))^{−1}.

Corollary. If T : V → V is an operator and B and C are bases for V, then
M_C^C(T) = M_B^C(I) M_B^B(T) (M_B^C(I))^{−1}.

Definition. Two n × n matrices A and B are similar if there exists a nonsingular matrix P such that B = P A P^{−1}.

Exercise. Show that similarity is an equivalence relation on F^{n×n}.
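Both the change-of-basis relation x_C = M_B^C(I) x_B and the similarity formula in the corollary can be verified on random bases of R^3. A hedged sketch (mine, not the notes'; it assumes NumPy, takes the columns of B and C as the two bases, and relies on random matrices being generically nonsingular):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))          # columns: basis B = {x1, x2, x3}
C = rng.standard_normal((3, 3))          # columns: basis C = {y1, y2, y3}
A = rng.standard_normal((3, 3))          # an operator T, given in the standard basis

M_BC = np.linalg.solve(C, B)             # change of bases from B to C: column i is (x_i)_C

x = rng.standard_normal(3)
x_B = np.linalg.solve(B, x)              # coordinates of x with respect to B
x_C = np.linalg.solve(C, x)              # coordinates of x with respect to C
assert np.allclose(x_C, M_BC @ x_B)      # x_C = M_B^C(I) x_B

M_BB = np.linalg.solve(B, A @ B)         # M_B^B(T)
M_CC = np.linalg.solve(C, A @ C)         # M_C^C(T)
assert np.allclose(M_CC, M_BC @ M_BB @ np.linalg.inv(M_BC))   # the corollary above
print("change-of-basis and similarity formulas check out")
```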

Problems

1. Is the map T : R^2 → R defined by T(x_1, x_2)^T = x_1 x_2 linear?

2. A linear operator P : V → V is a projection if P^2 = P. Show that P is a projection if and only if Im P ⊕ Im(I − P) = V.

3. Show that if V = V_1 ⊕ V_2, then Px = x_1, where x = x_1 + x_2 with x_1 ∈ V_1 and x_2 ∈ V_2, defines a projection on V.

4. Given the map T : R^3 → R^4 defined by T(x_1, x_2, x_3)^T = (0, x_1 + x_2, x_3, x_1 − x_3)^T:
(a) Show that T is a linear map.
(b) Find its matrix representation with respect to the standard bases of R^3 and R^4.
(c) Find Ker T and Im T.

5. Show that the following are equivalent:
(a) T : V → V is bijective.
(b) Im T = V.
(c) Ker T = 0.

6. Show that AB = I if and only if BA = I, where A and B are n × n matrices, by showing that (i) if AB = I, then the columns of B are linearly independent (consider the system of equations Bx = 0); (ii) the systems Bc_i = e_i, i = 1, …, n, have solutions, where {e_1, …, e_n} is the standard basis of F^n; (iii) BA = I.

1.3 Inner product and adjoint

An inner product is a new structure given to a vector space that enables us to introduce geometric notions such as length and angle.

Definition. An inner product on a vector space V over F = R or C is a map (·,·) : V × V → F such that for all x, y, z ∈ V and every scalar α:
1. (x, x) ≥ 0, and (x, x) = 0 if and only if x = 0.
2. (x + y, z) = (x, z) + (y, z).
3. (αx, y) = α(x, y).
4. (x, y) = \overline{(y, x)}.
V is called an inner product space. The norm or length ‖x‖ of a vector x ∈ V is ‖x‖ = √(x, x). Two vectors x, y ∈ V are orthogonal if (x, y) = 0.

Example. 1. R^n and C^n are inner product spaces with the standard inner products (x, y) = y^T x and (x, y) = y^* x respectively.
2. P has an inner product (f, g) = ∫_0^1 f(t) \overline{g(t)} dt.

Exercise. Show that for all x, y ∈ V and every scalar α:
1. (x, αy) = \overline{α}(x, y).
2. (i) ‖x‖ ≥ 0; (ii) ‖αx‖ = |α| ‖x‖; (iii) (triangle inequality) ‖x + y‖ ≤ ‖x‖ + ‖y‖.
3. (cosine law) ‖x + y‖^2 = ‖x‖^2 + ‖y‖^2 + 2(x, y) in a real inner product space.

Theorem (Cauchy-Schwarz inequality). For all x, y in an inner product space V,
|(x, y)| ≤ ‖x‖ ‖y‖.
Equality holds if and only if x and y are linearly dependent.

Proof. To avoid triviality we assume x ≠ 0. Consider
z = y − ((y, x)/(x, x)) x,
the projection of y orthogonal to x. Now
0 ≤ (z, z) = (z, y) = (y, y) − ((y, x)/(x, x)) (x, y),
and the desired Cauchy-Schwarz inequality follows. If equality holds, then (z, z) = 0, and hence y is a scalar multiple of x, i.e., x and y are linearly dependent.

Exercise. Picture the geometry of the proof.

Definition. Let x, y ∈ V be nonzero vectors. Then the cosine of the angle θ (0 ≤ θ ≤ π) between x and y is
cos θ = (x, y) / (‖x‖ ‖y‖).
Thus the cosine law becomes ‖x + y‖^2 = ‖x‖^2 + ‖y‖^2 + 2 ‖x‖ ‖y‖ cos θ.

Theorem (Gram-Schmidt process). Let {x_1, …, x_k} be a linearly independent set in V. Then there is an orthogonal set {y_1, …, y_k} such that
span{x_1, …, x_i} = span{y_1, …, y_i}, i = 1, …, k.
Thus every inner product space V has an orthonormal basis.

Proof. Let y_1 = x_1, and set
y_i = x_i − α_1 y_1 − ⋯ − α_{i−1} y_{i−1}.
In order to have (y_i, y_j) = 0 for j = 1, …, i − 1, take
α_j = (x_i, y_j)/(y_j, y_j), j = 1, …, i − 1.
It is clear that the span of the x's and the span of the y's are identical. Since {x_1, …, x_k} is linearly independent, the y's are nonzero. Normalization yields orthonormal vectors.

Definition. Let S ⊆ V be a subset. The orthogonal complement of S is the set
S^⊥ := {x ∈ V : (x, y) = 0 for all y ∈ S}.

Exercise. Show that S^⊥ is a subspace of V.

Theorem. Let W ⊆ V be a subspace. Then dim W + dim W^⊥ = dim V.

Proof. If W = 0 or W = V, the statement is trivial. So suppose W is a proper nonzero subspace of V. Let {x_1, …, x_k} be an orthonormal basis for W. Extend it to a basis {x_1, …, x_k, x′_{k+1}, …, x′_n} of V, and apply the Gram-Schmidt process to get an orthonormal basis {x_1, …, x_k, x_{k+1}, …, x_n}. Then {x_{k+1}, …, x_n} is a basis of W^⊥ (check!).

Theorem. Let T : V → V be a linear operator on the inner product space V. There exists a unique linear operator T^* : V → V such that
(Tx, y) = (x, T^*y) for all x, y ∈ V,
called the adjoint of T.

Proof. It suffices to show that for any y ∈ V there exists a unique element z ∈ V (depending on y and T) such that (Tx, y) = (x, z) for all x ∈ V. Let {x_1, …, x_n} be an orthonormal basis of V and define
z := Σ_{i=1}^n \overline{(T x_i, y)} x_i.
For the uniqueness, suppose (Tx, y) = (x, z_1) = (x, z_2) for all x; then (x, z_1 − z_2) = 0 for all x ∈ V, and thus z_1 = z_2 by putting x = z_1 − z_2.

Exercise. 1. Prove that (T^*)^* = T.
2. Prove that (T^*)^{−1} = (T^{−1})^* if T is bijective.
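The Gram-Schmidt proof above is already an algorithm, and it translates directly into code. A sketch of my own (assuming NumPy and the standard inner product; the classical form below is not numerically robust for nearly dependent inputs), applied to the three vectors of the Gram-Schmidt problem at the end of this section:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent list of vectors (standard inner product on R^n)."""
    ortho = []
    for x in map(np.asarray, vectors):
        y = x.astype(float)
        for q in ortho:                          # subtract the projection onto each earlier direction
            y = y - np.dot(x, q) * q
        ortho.append(y / np.linalg.norm(y))      # y is nonzero by linear independence; normalize
    return ortho

Q = np.column_stack(gram_schmidt([(1, 1, 1), (-1, 2, 2), (1, 4, 0)]))
print(np.round(Q.T @ Q, 10))                     # identity matrix: the columns are orthonormal
```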

Definition. 1. T is normal if T^*T = TT^*.
2. T is self-adjoint if T^* = T.
3. T is positive semi-definite if (Tx, x) ≥ 0 for all x ∈ V, and positive definite if (Tx, x) > 0 for all nonzero x.
4. T is an isometry if T^*T = I, i.e., (Tx, Ty) = (x, y) for all x, y ∈ V.

Exercise. Show that positive semi-definite operators are self-adjoint. (Hint: consider (T(x + y), x + y) and (T(x + iy), x + iy) if F = C; complexify V if F = R.)

Definition. Let A be an n × n matrix.
1. A is normal if A^*A = AA^*.
2. A is Hermitian if A^* = A, and real symmetric if A is real and A^T = A.
3. A is positive semi-definite if x^*Ax ≥ 0 for all x ∈ F^n, and positive definite if x^*Ax > 0 for all nonzero x.
4. A is unitary if A^*A = I, and orthogonal if A is real and A^TA = I.

Example (matrix representation of the adjoint). Let B = {x_1, …, x_n} be an orthonormal basis for V and let T : V → V be a linear operator. Let A = (a_{ij}) = M_B^B(T), i.e., T(x_j) = Σ_{i=1}^n a_{ij} x_i, j = 1, …, n. Since B is orthonormal, we have a_{ij} = (T x_j, x_i). Therefore the (i, j) entry of M_B^B(T^*) is
(T^* x_j, x_i) = \overline{(x_i, T^* x_j)} = \overline{(T x_i, x_j)} = \overline{a_{ji}}.
Thus M_B^B(T^*) = (\overline{a_{ji}}) = A^*, the conjugate transpose of the matrix representation of T.

Exercise. 1. Show that if T is normal, positive semi-definite, positive definite, or an isometry, respectively, then so is the matrix representation of T with respect to an orthonormal basis (normal, positive semi-definite, positive definite, unitary); i.e., the matrix definitions above are just the matrix versions of the corresponding operator definitions, in the sense of the previous example.
2. Deduce that positive semi-definite matrices are Hermitian.

Exercise. Show that a matrix A is unitary if and only if the columns (rows) of A are orthonormal with respect to the standard inner product of F^n.

Exercise. What is the matrix version of the exercise on adjoints above?
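Two of the facts above — that the matrix of T^* with respect to an orthonormal basis is the conjugate transpose, and that a matrix is unitary exactly when its columns are orthonormal — are easy to test numerically. A small sketch (my own, assuming NumPy; the orthonormal columns are obtained from a QR factorization of a random complex matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))   # "T" on C^n

# (Tx, y) = (x, T*y) with the standard inner product (u, v) = v* u, where T* is given
# by the conjugate transpose of A.
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = rng.standard_normal(n) + 1j * rng.standard_normal(n)
assert np.allclose(np.vdot(y, A @ x), np.vdot(A.conj().T @ y, x))

# A unitary matrix (here Q from a QR factorization of A) has orthonormal columns: Q*Q = I.
Q, _ = np.linalg.qr(A)
assert np.allclose(Q.conj().T @ Q, np.eye(n))
print("adjoint and unitarity checks pass")
```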

Problems

1. Prove that if x and y are in a complex inner product space, then
4(x, y) = ‖x + y‖^2 − ‖x − y‖^2 + i ‖x + iy‖^2 − i ‖x − iy‖^2.

2. (Parallelogram identity) Show that ‖x + y‖^2 + ‖x − y‖^2 = 2‖x‖^2 + 2‖y‖^2.

3. *(Jordan-von Neumann) Let V be a complex vector space with a norm ‖·‖. Then there exists an inner product (·,·) on V such that ‖x‖ = √(x, x) if and only if the norm satisfies the parallelogram law.

4. (a) Let V be a real inner product space. Show that x, y ∈ V are orthogonal if and only if ‖x + y‖^2 = ‖x‖^2 + ‖y‖^2.
(b) Let V be a real inner product space. Show that if ‖x‖ = ‖y‖, then x − y and x + y are orthogonal.
(c) Show that (a) is false if V is a complex vector space.
(d) Let V be a complex inner product space. Show that x, y ∈ V are orthogonal if and only if ‖αx + βy‖^2 = ‖αx‖^2 + ‖βy‖^2 for all α, β ∈ C.

5. Let x_1, …, x_n be n unit vectors of a real inner product space V such that ‖x_i − x_j‖ = 1 for all i ≠ j. Find the angle between x_i and x_j, and the angle between x̄ − x_i and x̄ − x_j, where x̄ = (x_1 + ⋯ + x_n)/(n + 1). Find a geometric interpretation of the vectors for n = 2 and n = 3.

6. Let C^{n×n} be the vector space of complex n × n matrices. Show that (A, B) = tr(AB^*) is an inner product on C^{n×n}, where tr A = Σ_{i=1}^n a_{ii} is the trace of A. How about R^{n×n} and C^{m×n} (m ≠ n)?

7. Apply the Gram-Schmidt process to the three vectors (1, 1, 1), (−1, 2, 2) and (1, 4, 0).

8. Let V be an inner product space with inner product (·,·), and let ⟨·,·⟩ be another inner product on V. Show that there exists a unique positive definite operator T such that ⟨x, y⟩ = (Tx, y) for all x, y ∈ V. What is the matrix version for C^n with the standard inner product (·,·)?

9. Let V be a complex inner product space. Show that if T : V → V is a linear operator such that (Tx, x) = 0 for all x ∈ V, then T = 0. Is this true if V is over R?

10. Let T : V → V be a linear operator on V. Show that the following are equivalent:
(a) T is an isometry.
(b) T maps an orthonormal basis to an orthonormal basis.
(c) T preserves norms, i.e., ‖Tx‖ = ‖x‖ for all x ∈ V.

11. Show that the set of n × n Hermitian (real symmetric) matrices is a vector space. Is the same true for unitary matrices and for normal matrices?

12. Show that the set of n × n unitary (orthogonal) matrices is a group.

13. (Orthogonal projection) Let V = V_1 ⊕ V_2 be an orthogonal sum, i.e., V_1 ⊥ V_2. Show that the map P : V → V_1 defined by Px = x_1, where x = x_1 + x_2 with x_1 ∈ V_1, x_2 ∈ V_2, is an orthogonal projection, i.e., P^2 = P and P^* = P. Conversely, prove that if P is an orthogonal projection, then V = Im P ⊕ Im(I − P) is an orthogonal direct sum.

14. (Reflection) Let r ∈ V be a nonzero vector in the real inner product space V. Show that the map s_r : V → V defined by
s_r(x) = x − 2 ((x, r)/(r, r)) r
is a unitary operator. What is s_r(r)? What is the image of x ∈ V such that x ⊥ r? What is the geometric interpretation?

1.4 Some numerical characteristics of matrices

Let A be an m × n matrix. The image and the kernel of A are the subspaces Im A = {Ax : x ∈ F^n} of F^m and Ker A = {x ∈ F^n : Ax = 0} of F^n respectively. The rank of A is rank A = dim Im A.

Theorem. Let A and B be matrices of sizes m × n and n × p respectively. Then
rank AB = rank B − dim(Im B ∩ Ker A).
In particular,
rank A + rank B − n ≤ rank AB ≤ min{rank A, rank B}.

A scalar λ is an eigenvalue of an n × n matrix A if there exists a nonzero vector x such that Ax = λx, or equivalently, if (A − λI)x = 0 has a nontrivial solution. To compute the eigenvalues of A, we solve the characteristic equation det(λI − A) = 0. By the fundamental theorem of algebra, an n × n complex matrix has n eigenvalues in C (counted with multiplicity).

Let A be an m × n matrix. The square roots of the eigenvalues of A^*A are called the singular values of A.

Problems

Let A be an m × n matrix. Show that for any n × m matrix B,
dim Im A + dim Ker A = dim Im BA + dim Ker BA.
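The quantities of this section are all directly computable. The sketch below (mine, assuming NumPy; the matrices are random test data) checks the rank inequality of the theorem and confirms that the singular values of A are the square roots of the eigenvalues of A^*A:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 6))
B = rng.standard_normal((6, 3))

rA, rB, rAB = (np.linalg.matrix_rank(M) for M in (A, B, A @ B))
n = A.shape[1]                                   # inner dimension of the product AB
assert rA + rB - n <= rAB <= min(rA, rB)         # the rank inequality above

s = np.linalg.svd(A, compute_uv=False)                    # singular values, decreasing
eig = np.sort(np.linalg.eigvalsh(A.conj().T @ A))[::-1]   # eigenvalues of A*A, decreasing
assert np.allclose(s**2, eig[:s.size])                    # s_i = sqrt(lambda_i(A*A))
print(s)
```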

Chapter 2  Matrix decompositions

2.1 Schur's triangularization theorem

Two n × n matrices A and B are unitarily similar if B = UAU^{−1} for some unitary matrix U. Unitary similarity is an equivalence relation on F^{n×n}.

Theorem (Schur's triangularization theorem). Let A be an n × n complex matrix. Then there exists a unitary matrix U such that UAU^{−1} is upper triangular. If A is real and all its eigenvalues are real, then U may be chosen as real orthogonal.

Theorem (Spectral theorem for normal matrices). Let A be an n × n normal matrix. Then there exists a unitary matrix U such that UAU^{−1} = diag(λ_1, …, λ_n), where the λ's are the eigenvalues of A. If A is real normal with real eigenvalues, then U may be chosen as real orthogonal.

Corollary (Spectral theorem for Hermitian matrices). Let A be an n × n Hermitian matrix. Then there exists a unitary matrix U such that UAU^{−1} = diag(λ_1, …, λ_n), where the λ's, which lie in R, are the eigenvalues of A. If A is real symmetric, then U may be chosen as real orthogonal.

Corollary (Spectral theorem for unitary matrices). Let A be an n × n unitary matrix. Then there exists a unitary matrix U such that UAU^{−1} = diag(λ_1, …, λ_n), where the λ's, of modulus 1, are the eigenvalues of A. If A is real orthogonal with real eigenvalues, then U may be chosen as real orthogonal.
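Schur's theorem and the spectral theorem correspond to standard library routines, so they can be illustrated numerically. A quick sketch (my own, assuming NumPy and SciPy; the matrices are random test data) triangularizes a complex matrix unitarily and diagonalizes a Hermitian one:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# Schur: A = U T U* with U unitary and T upper triangular (output='complex' forces a
# genuinely triangular T even when eigenvalues are complex).
T, U = schur(A, output='complex')
assert np.allclose(U @ T @ U.conj().T, A) and np.allclose(T, np.triu(T))

# Spectral theorem for a Hermitian matrix: a unitary diagonalization with real eigenvalues.
H = (A + A.conj().T) / 2
w, V = np.linalg.eigh(H)                       # w is real, the columns of V are orthonormal
assert np.allclose(V @ np.diag(w) @ V.conj().T, H)
print(np.round(w, 3))
```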

Two m × n matrices A and B are unitarily equivalent if B = UAV for some unitary matrices U and V. Unitary equivalence is an equivalence relation on F^{m×n}.

Theorem (Singular value decomposition, SVD). Let A be an m × n complex matrix. Then there exist unitary matrices U (m × m) and V (n × n) such that A = USV, where
S = [ diag(s_1, …, s_r, 0, …, 0)  0 ]   if m ≤ n,
S = [ diag(s_1, …, s_r, 0, …, 0) ]
    [             0              ]   if m > n,
and s_1, …, s_r are the nonzero singular values of A. If A is real, then U and V may be chosen as real orthogonal.

Corollary (Polar decomposition). Let A be an n × n complex matrix. There exist positive semi-definite matrices P_1, P_2 and unitary matrices U_1 and U_2 such that A = U_1 P_1 = P_2 U_2.

Two m × n matrices A and B are equivalent if B = PAQ for some nonsingular matrices P and Q.

Exercise. Show that equivalence is an equivalence relation on F^{m×n}.

Theorem. Let A be an m × n matrix with rank A = r. Then there exist nonsingular matrices S and T such that
SAT = [ I_r  0 ]
      [  0   0 ]   (an m × n matrix).
This form is called the Hermite form of A; it is completely determined by the rank of A.
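The SVD and the polar decomposition of the corollary are likewise one call away. A small sketch (mine, assuming NumPy; the matrices are random test data) reconstructs a rectangular A from U, S, V and reads the polar factors of a square matrix off its SVD:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 5))                # the m <= n case of the theorem

U, s, Vh = np.linalg.svd(A)                    # A = U @ S @ Vh with S as described above
S = np.zeros(A.shape)
S[:s.size, :s.size] = np.diag(s)               # S = [diag(s) 0]
assert np.allclose(U @ S @ Vh, A)

# Polar decomposition of a square matrix, read off from its SVD: A = (U Vh)(Vh* diag(s) Vh).
B = rng.standard_normal((4, 4))
U2, s2, Vh2 = np.linalg.svd(B)
W = U2 @ Vh2                                   # unitary (here real orthogonal) factor U_1
P = Vh2.conj().T @ np.diag(s2) @ Vh2           # positive semi-definite factor P_1
assert np.allclose(W @ P, B)
print(np.round(s, 3))
```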


More information

Linear Algebra in Actuarial Science: Slides to the lecture

Linear Algebra in Actuarial Science: Slides to the lecture Linear Algebra in Actuarial Science: Slides to the lecture Fall Semester 2010/2011 Linear Algebra is a Tool-Box Linear Equation Systems Discretization of differential equations: solving linear equations

More information

MATH 583A REVIEW SESSION #1

MATH 583A REVIEW SESSION #1 MATH 583A REVIEW SESSION #1 BOJAN DURICKOVIC 1. Vector Spaces Very quick review of the basic linear algebra concepts (see any linear algebra textbook): (finite dimensional) vector space (or linear space),

More information

Functional Analysis. James Emery. Edit: 8/7/15

Functional Analysis. James Emery. Edit: 8/7/15 Functional Analysis James Emery Edit: 8/7/15 Contents 0.1 Green s functions in Ordinary Differential Equations...... 2 0.2 Integral Equations........................ 2 0.2.1 Fredholm Equations...................

More information

YORK UNIVERSITY. Faculty of Science Department of Mathematics and Statistics MATH M Test #2 Solutions

YORK UNIVERSITY. Faculty of Science Department of Mathematics and Statistics MATH M Test #2 Solutions YORK UNIVERSITY Faculty of Science Department of Mathematics and Statistics MATH 3. M Test # Solutions. (8 pts) For each statement indicate whether it is always TRUE or sometimes FALSE. Note: For this

More information

Lecture Notes 1: Vector spaces

Lecture Notes 1: Vector spaces Optimization-based data analysis Fall 2017 Lecture Notes 1: Vector spaces In this chapter we review certain basic concepts of linear algebra, highlighting their application to signal processing. 1 Vector

More information

2. Linear algebra. matrices and vectors. linear equations. range and nullspace of matrices. function of vectors, gradient and Hessian

2. Linear algebra. matrices and vectors. linear equations. range and nullspace of matrices. function of vectors, gradient and Hessian FE661 - Statistical Methods for Financial Engineering 2. Linear algebra Jitkomut Songsiri matrices and vectors linear equations range and nullspace of matrices function of vectors, gradient and Hessian

More information

1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det

1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det What is the determinant of the following matrix? 3 4 3 4 3 4 4 3 A 0 B 8 C 55 D 0 E 60 If det a a a 3 b b b 3 c c c 3 = 4, then det a a 4a 3 a b b 4b 3 b c c c 3 c = A 8 B 6 C 4 D E 3 Let A be an n n matrix

More information

Some notes on Coxeter groups

Some notes on Coxeter groups Some notes on Coxeter groups Brooks Roberts November 28, 2017 CONTENTS 1 Contents 1 Sources 2 2 Reflections 3 3 The orthogonal group 7 4 Finite subgroups in two dimensions 9 5 Finite subgroups in three

More information

Then x 1,..., x n is a basis as desired. Indeed, it suffices to verify that it spans V, since n = dim(v ). We may write any v V as r

Then x 1,..., x n is a basis as desired. Indeed, it suffices to verify that it spans V, since n = dim(v ). We may write any v V as r Practice final solutions. I did not include definitions which you can find in Axler or in the course notes. These solutions are on the terse side, but would be acceptable in the final. However, if you

More information

SUMMARY OF MATH 1600

SUMMARY OF MATH 1600 SUMMARY OF MATH 1600 Note: The following list is intended as a study guide for the final exam. It is a continuation of the study guide for the midterm. It does not claim to be a comprehensive list. You

More information