# LINEAR ALGEBRA REVIEW


JC. Stuff you should know for the exam.

## 1. Basics on vector spaces

(1) $F^n$ is the set of all $n$-tuples $(a_1,\dots,a_n)$ with $a_i \in F$. It forms a vector space (VS) with the operations of $+$ and scalar multiplication defined coordinatewise.

(2) A linear combination of vectors $v_1,\dots,v_n$ is an expression $\lambda_1 v_1 + \dots + \lambda_n v_n$ for some scalars $\lambda_i$. By convention the empty linear combination has value $0$. A linear combination of linear combinations is again a linear combination.

(3) If $W$ is a VS over $F$, a subspace is a nonempty $V \subseteq W$ which is closed under $+$ and under $\lambda v$ for all $\lambda \in F$, $v \in V$, so that $V$ forms a VS. $V$ is a subspace if and only if $0 \in V$ and $V$ is closed under linear combinations of two elements. A subspace is closed under all linear combinations.

(4) If $U$ and $V$ are subspaces of $W$ then $U + V$ and $U \cap V$ are subspaces, where $U + V = \{u + v : u \in U,\ v \in V\}$.

(5) If $W = U + V$ and $U \cap V = \{0\}$, then every element of $W$ has the form $u + v$ for a unique $u \in U$ and $v \in V$. This is because if $u_1 + v_1 = u_2 + v_2$, then $u_1 - u_2 = v_2 - v_1 \in U \cap V = \{0\}$. In this case $U$ and $V$ are called complementary.

## 2. Span and dependence

(1) Let $X \subseteq V$. The span of $X$ is the set of all linear combinations $\sum_{i=1}^n \lambda_i x_i$ with $\lambda_i \in F$, $x_i \in X$. Note that $X$ may be infinite, but each individual linear combination is finite; in particular, if a vector is in the span of $X$, it is in the span of a finite subset of $X$. The span of $X$ is the least subspace containing $X$.

(2) Let $v_1,\dots,v_n$ be a sequence of vectors (repetitions are allowed). A dependence in the sequence is an equation of the form $\sum_{i=1}^n \lambda_i v_i = 0$, and the dependence is non-trivial if and only if at least one $\lambda_i$ is nonzero. If the sequence contains the zero vector or has a repetition then there is a non-trivial dependence.

(3) A set $X$ of vectors is independent if and only if no finite sequence from $X$ without repetitions has a non-trivial dependence. The empty set is independent, and no independent set contains $0$.

(4) Given an equation $\sum_{i=1}^n \lambda_i v_i = 0$ with $\lambda_j \neq 0$, we may deduce that $v_j = -\sum_{i \neq j} \lambda_j^{-1} \lambda_i v_i$.

This line of thought has several important consequences, among them:

(a) A set $X$ is independent iff $x \notin \operatorname{span}(X \setminus \{x\})$ for all $x \in X$.

(b) If $y \in \operatorname{span}(X \cup \{x\}) \setminus \operatorname{span}(X)$, then $x \in \operatorname{span}(X \cup \{y\})$.

(c) If $X$ is independent and $X \cup \{x\}$ is dependent, then $x \in \operatorname{span}(X)$.
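As a concrete illustration of span and dependence, independence of vectors in $F^n$ can be tested by row reduction: the vectors are independent exactly when none of them is reduced to zero. This is a minimal sketch over the rationals; the helper names `row_reduce` and `is_independent` are mine, not from the notes.

```python
from fractions import Fraction

def row_reduce(rows):
    """Bring a list of vectors (as rows) to reduced echelon form over Q.
    The surviving nonzero rows are independent and have the same span."""
    rows = [[Fraction(x) for x in r] for r in rows]
    r = 0
    for c in range(len(rows[0]) if rows else 0):
        # find a row at or below position r with a nonzero entry in column c
        pivot = next((i for i in range(r, len(rows)) if rows[i][c] != 0), None)
        if pivot is None:
            continue
        rows[r], rows[pivot] = rows[pivot], rows[r]
        rows[r] = [x / rows[r][c] for x in rows[r]]   # scale pivot to 1
        for i in range(len(rows)):
            if i != r and rows[i][c] != 0:            # clear column c elsewhere
                rows[i] = [a - rows[i][c] * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return [row for row in rows if any(row)]

def is_independent(vectors):
    """Vectors are independent iff row reduction keeps all of them nonzero."""
    return len(row_reduce(vectors)) == len(vectors)

print(is_independent([[1, 0], [0, 1]]))          # True
print(is_independent([[1, 0], [0, 1], [1, 1]]))  # False: (1,1) = (1,0) + (0,1)
```

The second call exhibits a non-trivial dependence, matching the fact that any three vectors in a two-dimensional space are dependent.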

## 3. Basis and dimension

(1) A basis of $V$ is an independent subset of $V$ whose span is $V$.

(2) Let $B$ be a finite basis of size $n$; then all other bases are finite and have the same size. In this case we say $V$ is finite dimensional (FD) and call $n$ the dimension. $V$ has dimension $0$ if and only if $V = \{0\}$; in this case the empty set is the only basis.

(3) Related facts:

(i) Let $\dim(V) = n$ and let $I$ be an independent set. Then $|I| \leq n$, and there is a basis containing $I$.

(ii) Let $\dim(V) = n$ and let $X \subseteq V$ with $|X| = n + 1$. Then $X$ is dependent.

(iii) If $I$ is an independent set of size $k$, then the span of $I$ is a space of dimension $k$ with $I$ as a basis.

(4) Let $\dim(W) = n$ and let $V$ be a subspace of $W$. Then $\dim(V) \leq n$, and $\dim(V) = n$ iff $V = W$.

## 4. Linear maps, isomorphism

(1) Let $V, W$ be VS's over $F$. A function $T : V \to W$ is linear iff it preserves $+$ and scalar multiplication, equivalently $T(\lambda v + \mu w) = \lambda T(v) + \mu T(w)$. The image of $T$ is $\{Tv : v \in V\}$ and the nullspace is $\{v \in V : Tv = 0\}$. They are subspaces of $W$ and $V$ respectively.

(2) Let $T$ be linear. Then $Tv = Tv'$ iff $T(v - v') = 0$, that is, $v - v'$ is in the nullspace. So $T$ is injective if and only if the nullspace is $\{0\}$.

(3) Key example: if $v_1,\dots,v_n$ are vectors in $V$ then the map $(\lambda_1,\dots,\lambda_n) \mapsto \sum_i \lambda_i v_i$ is linear from $F^n$ to $V$.

(4) The composition of linear maps is linear. If $T$ is a linear bijection then the inverse function $T^{-1}$ is also linear. Linear bijections are called isomorphisms, and two spaces are isomorphic if and only if there is an isomorphism between them. Being isomorphic is an equivalence relation, because the identity is an isomorphism, composing isomorphisms gives an isomorphism, and the inverse of an isomorphism is an isomorphism. Isomorphic spaces are structurally the same; in particular they have the same dimension, and the image of a basis under an isomorphism is a basis.

(5) Let $v_1,\dots,v_n$ be a list of vectors without repetitions. The map $(\lambda_1,\dots,\lambda_n) \mapsto \sum_i \lambda_i v_i$ is injective if and only if there is no non-trivial dependence, and is surjective iff the $v_i$ span $V$. So this map is an isomorphism exactly when the $v_i$ form a basis. Consequently:

(a) $V$ (a space over $F$) has dimension $n$ if and only if $V$ is isomorphic to $F^n$.

(b) Any two spaces over $F$ of dimension $n$ are isomorphic.

(6) Let $U$ and $V$ be complementary subspaces of $W$. Then the map $(u, v) \mapsto u + v$ is an isomorphism between $U \oplus V$ and $W$, where $U \oplus V$ is the VS of ordered pairs with coordinatewise operations.

(7) Let $W$ be a FD VS and let $U, V$ be subspaces. Then $\dim(U + V) = \dim(U) + \dim(V) - \dim(U \cap V)$.
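The key example in section 4 — the coordinate map $(\lambda_1,\dots,\lambda_n) \mapsto \sum_i \lambda_i v_i$ — can be sketched directly; the function name `coords_to_vector` is mine, and the vectors here live in $\mathbb{Q}^2$ for concreteness.

```python
def coords_to_vector(lams, basis):
    """The linear map (λ_1,...,λ_n) ↦ Σ λ_i v_i from F^n to V.
    It is an isomorphism exactly when the v_i form a basis."""
    n = len(basis[0])
    out = [0] * n
    for lam, v in zip(lams, basis):
        out = [o + lam * x for o, x in zip(out, v)]
    return out

# coordinates (2, 3) relative to the basis (1,0), (1,1) of F^2
print(coords_to_vector([2, 3], [[1, 0], [1, 1]]))  # [5, 3]
```

Injectivity of this map is exactly the statement that the $v_i$ admit no non-trivial dependence.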

## 5. Quotient space, rank, nullity

(1) Let $V$ be a subspace of $W$. Introduce an equivalence relation $a E b \iff a - b \in V$. The equivalence classes are sets of the form $V + a = a + V$. We define $W/V$ to be the set of all equivalence classes. We make $W/V$ into a VS by defining $(V + a) + (V + b) = V + (a + b)$ and $\lambda(V + a) = V + \lambda a$. It can be checked that these operations are well-defined and satisfy the VS axioms.

(2) Let $T : V \to W$ be linear and let $N$ be the nullspace of $T$. Then
$$Tv = Tv' \iff T(v - v') = 0 \iff v - v' \in N \iff N + v = N + v',$$
and we get a bijection between the spaces $V/N$ and $\operatorname{im}(T)$ in which $N + v$ maps to $Tv$. It is easy to see this bijection is linear, so the spaces $V/N$ and $\operatorname{im}(T)$ are isomorphic; in particular they have the same dimension.

(3) To put it another way: if $T : V \to W$ is linear and $Tv = b$, then the set of solutions of $Tw = b$ is precisely $N + v$ where $N$ is the nullspace.

(4) Let $W$ be FD of dimension $n$ and let $V$ be a subspace of dimension $m$. Let $v_1,\dots,v_m$ be a basis of $V$ and extend it to get a basis $v_1,\dots,v_n$ for $W$. A routine check shows that the classes $V + v_j$ for $m < j \leq n$ form a basis for $W/V$, so $\dim(W/V) = n - m = \dim(W) - \dim(V)$.

(5) Let $T : V \to W$ be a linear map between FD spaces. The rank of $T$ is the dimension of the image and the nullity is the dimension of the nullspace. By the considerations above, $\dim(V) = r + n$ where $r$ is the rank and $n$ is the nullity.

(6) If $\dim(V) = \dim(W) = k$ and $T : V \to W$ is linear, then $T$ is injective iff the nullity is zero iff the rank is $k$ iff $T$ is surjective.

## 6. Matrix of a transformation, linear equations

(1) Let $T : V \to W$ be linear and let $B$ be a basis for $V$. Then $T$ is determined by $T \restriction B$ (the restriction of $T$ to $B$), and in fact any function from $B$ to $W$ is $T \restriction B$ for a unique linear $T$.

(2) Let $V, W$ be FD spaces over $F$ and fix bases $v_1,\dots,v_m$ and $w_1,\dots,w_n$. Let $T : V \to W$ be a linear map. The matrix of $T$ is the $n \times m$ array of elements of $F$ given by $Tv_j = \sum_i a_{ij} w_i$. It has $n$ rows and $m$ columns, and $a_{ij}$ is the entry in row $i$, column $j$. Rows correspond to elements of the basis of $W$, columns to elements of the basis of $V$; column $j$ gives the coefficients of the expansion of $Tv_j$ in the basis $w_1,\dots,w_n$. Subtle point: formally the matrix depends on the order in which the bases are enumerated. This is not important as long as you stick to a fixed enumeration.

(3) If $A = (a_{ij})$ is $n \times m$ and $B = (b_{jk})$ is $m \times l$, then the matrix product $AB$ is $n \times l$ and has entries $c_{ik} = \sum_{j=1}^m a_{ij} b_{jk}$. The product of matrices corresponds to composition of transformations.

(4) A column vector of height $k$ is a $k \times 1$ matrix. With the same notation as above, let $v \in V$ and expand $v = \sum_j b_j v_j$ and $Tv = \sum_i c_i w_i$. If $b, c$ are the column vectors formed from the $b$'s and $c$'s, then $c = Ab$.

(5) For a given linear $T : V \to W$ and $w \in W$, solving the equation $Tv = w$ amounts to solving a system of $n$ simultaneous equations in $m$ variables.
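The formulas $c = Ab$ and $c_{ik} = \sum_j a_{ij} b_{jk}$ can be sketched directly. The rotation example below is my own illustration: $T$ rotates $\mathbb{R}^2$ by $90^\circ$, so $T(e_1) = e_2$ and $T(e_2) = -e_1$, and column $j$ of $A$ holds the coordinates of $T(v_j)$, as in section 6.

```python
def mat_vec(A, b):
    """c = A b: c_i = Σ_j a_ij b_j, the coordinates of T v."""
    return [sum(a_ij * b_j for a_ij, b_j in zip(row, b)) for row in A]

def mat_mul(A, B):
    """(AB)_ik = Σ_j a_ij b_jk; matrix product matches composition of maps."""
    return [[sum(A[i][j] * B[j][k] for j in range(len(B)))
             for k in range(len(B[0]))] for i in range(len(A))]

A = [[0, -1],
     [1,  0]]                # rotation by 90 degrees

print(mat_vec(A, [1, 0]))    # [0, 1]: T sends e1 to e2
print(mat_mul(A, A))         # [[-1, 0], [0, -1]]: composing gives rotation by 180
```

Note `mat_mul(A, A)` computes the matrix of $T \circ T$ without ever mentioning the underlying vectors, which is exactly the point of item (3).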

## 7. Spaces of transformations

(1) The set of all linear maps from $V$ to $W$ is a vector space if we define $(\phi + \psi)(v) = \phi(v) + \psi(v)$ and $(\lambda\phi)(v) = \lambda(\phi(v))$.

(2) Let $V$ have a basis $v_1,\dots,v_m$ and $W$ a basis $w_1,\dots,w_n$. Then the space of linear maps from $V$ to $W$ has a basis $\{\rho_{ij}\}$, where $\rho_{ij}$ is the unique linear map which takes $v_j$ to $w_i$ and all other basis elements to zero: explicitly, $\rho_{ij}(\sum_k \lambda_k v_k) = \lambda_j w_i$. In particular the dimension of the space of maps is $mn$. Remark: $\rho_{ij}$ has a matrix with a one in row $i$, column $j$ and zeros elsewhere.

(3) $F$ is a vector space of dimension one over $F$. The space of all linear maps from $V$ to $F$ is called the dual space $V^*$. If $v_1,\dots,v_m$ is a basis for $V$, the dual basis is $v_1^*,\dots,v_m^*$ where $v_i^*(\sum_k \lambda_k v_k) = \lambda_i$.

(4) If $T : V \to W$ is linear, we induce a map $T^* : W^* \to V^*$ by $T^*\phi = \phi \circ T$. If the matrix of $T$ with some bases is the $n \times m$ matrix $A$, the matrix of $T^*$ with respect to the dual bases is the transpose; that is, the $m \times n$ matrix with $a_{ji}$ in row $i$, column $j$.

## 8. Multilinearity, tensor product

(1) Let $V_1,\dots,V_n$ and $W$ be VS's over a field $k$. A function $\phi : V_1 \times \dots \times V_n \to W$ is multilinear if and only if it is linear in each argument when the other arguments are held constant. The set of such multilinear maps forms a VS. If $\phi : V_1 \times \dots \times V_n \to W$ is multilinear and $\psi : W \to W'$ is linear, then $\psi \circ \phi$ is multilinear.

(2) Given $V_1,\dots,V_n$ we can construct the most general multilinear map with domain $V_1 \times \dots \times V_n$. Put precisely, we can construct a space $V_1 \otimes \dots \otimes V_n$ (the tensor product) and a multilinear map $\otimes : V_1 \times \dots \times V_n \to V_1 \otimes \dots \otimes V_n$ such that for every space $W$ and every multilinear $\phi : V_1 \times \dots \times V_n \to W$ there is a unique linear $\psi : V_1 \otimes \dots \otimes V_n \to W$ such that $\phi = \psi \circ \otimes$. We usually write $\otimes(v_1,\dots,v_n) = v_1 \otimes \dots \otimes v_n$, and in this notation the property $\phi = \psi \circ \otimes$ reads $\psi(v_1 \otimes \dots \otimes v_n) = \phi(v_1,\dots,v_n)$.

(3) From now on we assume that each $V_i$ is finite dimensional with dimension $d_i$ and a basis $B_i$. Then we can construct the tensor product easily: it has dimension $d_1 \cdots d_n$ with a basis of elements of the form $b_1 \otimes \dots \otimes b_n$ with $b_i \in B_i$. This reflects the fact that any multilinear map on $V_1 \times \dots \times V_n$ is determined by its values on $B_1 \times \dots \times B_n$.

## 9. Alternating maps, exterior power, determinants

(1) A multilinear map from $V^n$ to $W$ is alternating if and only if it is zero on any argument $(z_1,\dots,z_n)$ such that $z_i = z_j$ for some $i < j$. If $\phi$ is alternating and $\sigma \in S_n$, then $\phi(z_{\sigma(1)},\dots,z_{\sigma(n)}) = (-1)^{\operatorname{parity}(\sigma)}\,\phi(z_1,\dots,z_n)$.

(2) We can describe alternating multilinear maps using a construction similar to the tensor product, called the exterior power $\Lambda^n V$. This construction gives a space $\Lambda^n V$ and an alternating map $\wedge : V^n \to \Lambda^n V$ such that for any alternating $\phi : V^n \to W$ there is a unique linear $\psi : \Lambda^n V \to W$ with $\psi(v_1 \wedge \dots \wedge v_n) = \phi(v_1,\dots,v_n)$.

(3) We will assume that $V$ is FD with dimension $d$ and fix a basis $B$, enumerated as $b_1,\dots,b_d$. It is easy to see that if $t > d$ then any alternating map on $V^t$ is zero (use multilinearity and the fact that any tuple in $B^t$ has a repetition). By convention $\Lambda^t V = 0$ in this case.

(4) When $t \leq d$ we can describe $\Lambda^t V$ as follows: it is a space of dimension $\binom{d}{t}$ with a basis of elements of the form $b_{i_1} \wedge b_{i_2} \wedge \dots \wedge b_{i_t}$ where $i_1 < i_2 < \dots < i_t$. In particular, when $t = d$, $\Lambda^t V$ has dimension one.

(5) Let $T : V \to V$ be linear. The map $(z_1,\dots,z_d) \mapsto Tz_1 \wedge \dots \wedge Tz_d$ is alternating, and since $\Lambda^d V$ has dimension one, there is a unique $\lambda \in F$ such that $Tz_1 \wedge \dots \wedge Tz_d = \lambda(z_1 \wedge \dots \wedge z_d)$ for all $z_i \in V$. $\lambda$ is called the determinant of $T$. A routine calculation shows that if $T$ has matrix $A$ with respect to some basis, then the determinant of $T$ is the determinant of $A$.

(6) Some easy facts: if $I$ is the identity transformation, $\det(I) = 1$. $\det(ST) = \det(S)\det(T)$. $S$ is invertible iff $\det(S) \neq 0$. $\det(A)$ is unchanged if a multiple of one row of $A$ is subtracted from another, and is multiplied by $-1$ when two rows are swapped. The determinant of an upper triangular matrix is the product of its diagonal elements, so we can efficiently compute determinants by row reduction. The identity transformation $I$ has a matrix $I_n$ with ones down the diagonal and zeros elsewhere. If $A$ is a matrix with $\det(A) \neq 0$ there is a unique $B$ with $AB = BA = I_n$; $B$ is called the inverse matrix and written $A^{-1}$.

## 10. Real inner product spaces

(1) Let $V$ be a real VS. An inner product on $V$ is a bilinear function $v \cdot w$ from $V^2$ to $\mathbb{R}$ such that $v \cdot w = w \cdot v$, $v \cdot v \geq 0$, and $v \cdot v = 0$ if and only if $v = 0$. A real inner product space is a real space equipped with an inner product.

(2) The standard inner product on $\mathbb{R}^n$ (sometimes called the dot product) is given by $(x_1,\dots,x_n) \cdot (y_1,\dots,y_n) = \sum_i x_i y_i$.

(3) By definition $\|v\| = \sqrt{v \cdot v}$. This is called the length or norm of $v$. Since $0 \leq (v + \lambda w) \cdot (v + \lambda w) = \|v\|^2 + 2\lambda(v \cdot w) + \lambda^2\|w\|^2$ for all $\lambda$, it follows that $|v \cdot w| \leq \|v\|\,\|w\|$ (the Cauchy–Schwarz inequality). From this we see easily that $\|v + w\| \leq \|v\| + \|w\|$.

(4) $v$ and $w$ are orthogonal (a fancy word for perpendicular) if $v \cdot w = 0$. Only the zero vector is orthogonal to itself. If $v$ and $w$ are orthogonal then so are $\lambda v$ and $\mu w$ for any scalars $\lambda, \mu$.

(5) If $v \neq 0$ and $\lambda = \|v\|^{-1}$, then $\|\lambda v\| = 1$.

(6) A subset of a real IP space is orthonormal (ON) if and only if every element has length one and distinct elements are orthogonal. An ON set is independent, because if $\sum_i \lambda_i x_i = 0$ where the $x_i$ are distinct elements of an ON set, we can form the inner product with $x_i$ to conclude that $\lambda_i = 0$. More generally, we have (Pythagoras' theorem) that if the $x_i$ are distinct elements of an ON set then $\|\sum_i \lambda_i x_i\|^2 = \sum_i \lambda_i^2$.
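The row-reduction method for determinants described in section 9 — subtract multiples of rows (no change), swap rows (sign flip), then multiply the diagonal of the resulting triangular matrix — can be sketched as follows. This is a minimal implementation over the rationals; the function name `det` is mine.

```python
from fractions import Fraction

def det(A):
    """Determinant by row reduction to upper triangular form:
    row subtractions leave det unchanged, each swap flips the sign,
    and the det of a triangular matrix is the product of its diagonal."""
    n = len(A)
    M = [[Fraction(x) for x in row] for row in A]
    sign = 1
    for c in range(n):
        pivot = next((r for r in range(c, n) if M[r][c] != 0), None)
        if pivot is None:
            return Fraction(0)          # column of zeros: matrix is singular
        if pivot != c:
            M[c], M[pivot] = M[pivot], M[c]
            sign = -sign                # a row swap multiplies det by -1
        for r in range(c + 1, n):
            factor = M[r][c] / M[c][c]
            M[r] = [a - factor * b for a, b in zip(M[r], M[c])]
    d = Fraction(sign)
    for i in range(n):
        d *= M[i][i]                    # product of the diagonal
    return d

print(det([[1, 2], [3, 4]]))                    # -2
print(det([[2, 0, 0], [5, 3, 0], [1, 1, 4]]))   # 24, i.e. 2*3*4
```

This runs in $O(n^3)$ field operations, in contrast to the $n!$ terms of the permutation expansion.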

(7) Any FD real IP space has an ON basis. To see this we start with any basis $w_1,\dots,w_n$ and generate an ON set $v_1,\dots,v_n$ such that the span of $v_1,\dots,v_j$ equals the span of $w_1,\dots,w_j$. Given $v_1,\dots,v_j$, we first define $v'_{j+1} = w_{j+1} - \sum_{i=1}^{j} (w_{j+1} \cdot v_i)\,v_i$, and check that this is nonzero and orthogonal to $v_1,\dots,v_j$. Then we scale it to get a vector $v_{j+1}$ of length one. Remark: if $v_1,\dots,v_n$ is an ON basis, then $(\sum_i \lambda_i v_i) \cdot (\sum_j \mu_j v_j) = \sum_i \lambda_i \mu_i$. So by a change of basis we can turn any inner product into the familiar dot product on $\mathbb{R}^n$.

(8) If $v_1,\dots,v_n$ is an ON basis then the expansion of $v$ takes a very simple form: if $v = \sum_i \lambda_i v_i$, then taking the inner product with $v_j$ we see that $v \cdot v_j = \lambda_j$.

(9) If $V$ is a FD IP space then every element $\phi$ of the dual space $V^*$ has the form $v \mapsto v \cdot w$ for a unique $w$. This can be seen easily by letting $v_1,\dots,v_n$ be an ON basis and writing $\phi$ in terms of the dual basis as $\phi = \sum_i \lambda_i v_i^*$. It is routine to see that $w = \sum_i \lambda_i v_i$ works, and in fact is the only possibility.

(10) If $T : V \to V$ is linear, then an adjoint for $T$ is a linear map $S : V \to V$ such that $Tv \cdot w = v \cdot Sw$ for all $v, w$. If it exists it is unique. For $V$ FD the adjoint $T^*$ always exists, and if we let $A$ be the matrix of $T$ using an ON basis, then the matrix of $T^*$ is the transpose of $A$. $T$ is self-adjoint if and only if $T = T^*$; in an ON basis this happens if and only if the matrix is symmetric.

(11) Let $W$ be a real inner product space and let $X \subseteq W$. Then $X^\perp$ is the set of $v$ such that $v \cdot x = 0$ for all $x \in X$. This is a subspace of $W$.

(12) Let $W$ be a FD real inner product space and let $V$ be a subspace of $W$. We claim that $V$ and $V^\perp$ are complementary. To see this let $v_1,\dots,v_m$ be an ON basis of $V$, and extend it to get an ON basis $v_1,\dots,v_n$ for $W$. Then easily $w \in V^\perp$ if and only if $v_j \cdot w = 0$ for $1 \leq j \leq m$, if and only if $w$ is a linear combination of $\{v_{m+1},\dots,v_n\}$. So $\{v_{m+1},\dots,v_n\}$ is a basis for $V^\perp$.

## 11. Projection on a subspace

(1) Let $W$ be a FD subspace of a real IP space $V$. Then for any $v \in V$ there is a unique $w \in W$ which is closest to $v$. This can be computed as follows: choose an ON basis $w_1,\dots,w_m$ for $W$, let $\lambda_i = v \cdot w_i$, then $w = \sum_i \lambda_i w_i$. The main point is that $v - w \in W^\perp$. The map which takes $v$ to $w$ as above is a linear map called orthogonal projection onto $W$.

(2) Assume that $V$ is FD. If $P$ is orthogonal projection onto $W$, we have $P^2 = P = P^*$. Conversely, if $P^2 = P = P^*$ for some linear map $P$, then $P$ is the orthogonal projection onto $W$ where $W$ is the image of $P$.

## 12. Complex numbers

(1) The field of complex numbers consists of expressions $a + bi$ for real $a$ and $b$, added and multiplied in the usual way (follow the usual laws of arithmetic with $i^2 = -1$).

(2) If $z = a + bi$, then the conjugate $\bar{z}$ is given by $\bar{z} = a - bi$. $\overline{z + w} = \bar{z} + \bar{w}$ and $\overline{zw} = \bar{z}\,\bar{w}$. $z\bar{z} = a^2 + b^2$, which is zero only when $z = 0$.

(3) Any complex polynomial can be written as a product of linear complex polynomials. In particular, any complex polynomial of degree greater than zero has a root.
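The Gram–Schmidt process from section 10(7) and the projection formula from section 11 can be sketched together. This is a minimal float-based sketch for vectors in $\mathbb{R}^n$; the names `gram_schmidt` and `project` are mine.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(basis):
    """Turn a basis into an ON set with the same span (section 10(7)):
    subtract the projections onto the earlier vectors, then normalise."""
    on = []
    for w in basis:
        v = list(w)
        for u in on:
            c = dot(w, u)                      # coefficient w · v_i
            v = [a - c * b for a, b in zip(v, u)]
        norm = math.sqrt(dot(v, v))            # nonzero since basis is independent
        on.append([x / norm for x in v])
    return on

def project(v, on_basis):
    """Orthogonal projection onto the span of an ON set (section 11):
    w = Σ_i (v · w_i) w_i is the closest point, and v - w is orthogonal to the span."""
    w = [0.0] * len(v)
    for u in on_basis:
        c = dot(v, u)
        w = [a + c * b for a, b in zip(w, u)]
    return w

on = gram_schmidt([[3, 4], [1, 0]])
print(dot(on[0], on[1]))   # ~0: the two vectors are orthogonal
print(project([1, 2, 3], gram_schmidt([[1, 0, 0], [0, 1, 0]])))  # [1.0, 2.0, 0.0]
```

Projecting onto the $xy$-plane simply drops the third coordinate, as the last line shows.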

## 13. Eigenvalues, characteristic equation, diagonalisation

(1) Let $T : V \to V$ be linear. An eigenvector of $T$ is $v \in V$ such that $v \neq 0$ and $Tv = \lambda v$ for some $\lambda \in F$. $\lambda$ is called the eigenvalue associated with $v$. If $\lambda$ is an eigenvalue then the subspace $\{w : Tw = \lambda w\}$ is the eigenspace; it is not $\{0\}$, and its nonzero members are precisely the eigenvectors with eigenvalue $\lambda$.

(2) Let $V$ be FD. Then $\lambda$ is an eigenvalue of $T$ if and only if the nullspace of $\lambda I - T$ is nonzero, which by the theory of determinants happens exactly when $\det(\lambda I - T) = 0$. The polynomial $\det(\lambda I - T)$ is the characteristic polynomial of $T$, and $\det(\lambda I - T) = 0$ is the characteristic equation of $T$. The characteristic polynomial of $T$ has degree $\dim(V)$, so the equation has at most $\dim(V)$ roots.

(3) Let $V$ be FD. A basis of $V$ consists of eigenvectors for $T$ if and only if the matrix of $T$ is diagonal in that basis.

(4) Let $v_1,\dots,v_n$ be a family of eigenvectors of $T$, each one belonging to a different eigenvalue of $T$. Then the $v_i$ are independent. To see this, take a non-trivial dependence of minimal size, which (renumbering if necessary) may be assumed to have the form $\sum_{j=1}^m \mu_j v_j = 0$ with all $\mu_j \neq 0$, for some $m$ with $1 < m \leq n$. Multiplying by $\lambda_1$: $\sum_{j=1}^m \lambda_1 \mu_j v_j = 0$. Applying $T$: $\sum_{j=1}^m \lambda_j \mu_j v_j = 0$. Subtracting: $\sum_{j=2}^m \mu_j(\lambda_j - \lambda_1)v_j = 0$, a smaller non-trivial dependence. Contradiction, as we chose $m$ to be the minimal size for a dependence.

(5) Let $V$ be FD and let $T$ have $\dim(V)$ distinct eigenvalues. Then $V$ has a basis of eigenvectors. Proof: choose one eigenvector for each eigenvalue; they form an independent set of size $\dim(V)$.

(6) If $V$ is a real inner product space and $T$ is a self-adjoint map from $V$ to $V$, then eigenvectors which belong to distinct eigenvalues are orthogonal. In fact, if $Tv = \lambda v$ and $Tw = \mu w$ we see that $\lambda(v \cdot w) = Tv \cdot w = v \cdot Tw = \mu(v \cdot w)$.

## 14. Complex IP spaces, diagonalisation of a self-adjoint map

(1) Let $V$ be a complex VS. An inner product on $V$ is a function $v \cdot w$ from $V^2$ to $\mathbb{C}$ such that:

(a) (Hermitian property) $v \cdot w = \overline{w \cdot v}$.

(b) (Sesquilinear property) $(v_1 + v_2) \cdot w = v_1 \cdot w + v_2 \cdot w$, $v \cdot (w_1 + w_2) = v \cdot w_1 + v \cdot w_2$, $(\lambda v) \cdot w = \lambda(v \cdot w)$, $v \cdot (\lambda w) = \bar{\lambda}(v \cdot w)$.

(c) (Positivity property) $v \cdot v \in \mathbb{R}$, $v \cdot v \geq 0$, and $v \cdot v = 0$ if and only if $v = 0$.

A complex inner product space is a complex space equipped with an inner product.

(2) The standard inner product on $\mathbb{C}^n$ (sometimes called the dot product) is given by $(x_1,\dots,x_n) \cdot (y_1,\dots,y_n) = \sum_i x_i \bar{y}_i$.

(3) The following facts have similar proofs to the real case, except you have to be careful since the inner product is only linear in the first argument:

(a) Any FD complex IP space has an ON basis.

(b) If $v_1,\dots,v_n$ is an ON basis then $v = \sum_{i=1}^n (v \cdot v_i)v_i$.

(c) Every element of $V^*$ is $v \mapsto v \cdot w$ for a unique $w$.

(d) An ON set is independent.
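The standard inner product on $\mathbb{C}^n$ and its Hermitian property can be checked directly with Python's built-in complex numbers; the function name `cdot` is mine.

```python
def cdot(x, y):
    """Standard inner product on C^n: Σ_i x_i * conj(y_i) (section 14(2)).
    Hermitian: cdot(x, y) == conj(cdot(y, x)); linear only in the first slot."""
    return sum(a * b.conjugate() for a, b in zip(x, y))

x = [1 + 1j, 2]
y = [3, 1j]
print(cdot(x, y))                             # (3+1j)
print(cdot(y, x).conjugate() == cdot(x, y))   # True: the Hermitian property
print(cdot(x, x))                             # (6+0j): real and non-negative
```

The last line illustrates the positivity property: $x \cdot x = |1+i|^2 + |2|^2 = 2 + 4 = 6$.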

(4) It is also easy to see that if $V$ is a FD complex IP space and $T : V \to V$ is linear, there is a unique linear $T^*$ (the adjoint) such that $Tv \cdot w = v \cdot T^*w$. If $T$ has matrix $A = (a_{ij})$ with respect to some ON basis, the matrix of $T^*$ is the Hermitian conjugate of $A$, that is, the matrix with $\bar{a}_{ji}$ in row $i$, column $j$. $T$ is self-adjoint if and only if $T = T^*$; this corresponds to the matrix $A$ being Hermitian (equal to its Hermitian conjugate).

(5) When $V$ is FD and $W$ is a subspace of $V$, the orthogonal projection onto $W$ is a linear map $P$ with $P^2 = P = P^*$. As in the real case, the property $P^2 = P = P^*$ characterises the projection maps.

(6) If $T$ is self-adjoint then all eigenvalues of $T$ are real. To see this, observe that if $Tv = \lambda v$ with $v \neq 0$, then $\lambda(v \cdot v) = Tv \cdot v = v \cdot Tv = \bar{\lambda}(v \cdot v)$.

(7) If $T$ is self-adjoint then eigenvectors corresponding to different eigenvalues are orthogonal. The argument is similar to the real case, using the fact that all eigenvalues are real.

(8) If $V$ is a FD complex space with $\dim(V) > 0$ and $S : V \to V$ is linear, then $S$ has an eigenvalue, because the characteristic polynomial has positive degree and so has at least one root.

(9) (Complex form of the spectral theorem) If $W$ is a FD complex IP space and $T$ is self-adjoint, then $W$ has an ON basis consisting of eigenvectors. To see this let $\lambda_1,\dots,\lambda_m$ be the eigenvalues, and choose for each $i$ an ON basis $B_i$ for the eigenspace $W_i$ corresponding to $\lambda_i$. Let $B = \bigcup_i B_i$, and let $V$ be the subspace spanned by the ON set $B$. It is easy to see that $v \in V \implies Tv \in V$, and $v \in V^\perp \implies Tv \in V^\perp$. Let $S : V^\perp \to V^\perp$ be the restriction of $T$ to $V^\perp$: we must have $V^\perp = \{0\}$, because otherwise $S$ would have an eigenvector $v$, which would be an eigenvector of $T$ lying in $V^\perp$, hence a nonzero element of $V \cap V^\perp$. So $V = W$ and we are done. It is easy to see that the spectral theorem expresses $T$ in the form $\sum_i \lambda_i P_i$, where $P_i$ is the orthogonal projection onto $W_i$.

(10) (Real form of the spectral theorem) Let $V$ be a real inner product space and let $T : V \to V$ be self-adjoint. Then $V$ has an ON basis consisting of eigenvectors of $T$. Proof: use complex numbers to show that a self-adjoint map on a real inner product space of dimension greater than zero has a (real!) eigenvalue, then argue as above. As in the complex case, $T$ is a linear combination of projections onto the eigenspaces, with the eigenvalues as coefficients.

## 15. Orthogonal and unitary matrices

(1) A real square matrix $O$ is orthogonal if $O^T = O^{-1}$. Note that this is so if and only if the columns of $O$ form an ON set (look at $O^T O = I$), if and only if the rows of $O$ form an ON set (look at $OO^T = I$). In matrix language the real spectral theorem says that if $A$ is symmetric, there is an orthogonal $O$ with $O^T A O = O^{-1} A O$ diagonal.

(2) A complex square matrix $U$ is unitary if $U^H = U^{-1}$. Note that this is so if and only if the columns of $U$ form an ON set, if and only if the rows of $U$ form an ON set. In matrix language the complex spectral theorem says that if $A$ is Hermitian, there is a unitary $U$ with $U^H A U = U^{-1} A U$ diagonal.
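The criterion "$O$ is orthogonal iff $O^T O = I$ iff the columns form an ON set" from section 15 is easy to test numerically. A sketch, with my own helper name `is_orthogonal`; rotation matrices serve as the standard example of orthogonal matrices.

```python
import math

def is_orthogonal(O, tol=1e-9):
    """O is orthogonal iff O^T O = I, i.e. the columns of O form an ON set."""
    n = len(O)
    # P = O^T O: P_ij is the dot product of column i with column j
    P = [[sum(O[k][i] * O[k][j] for k in range(n)) for j in range(n)]
         for i in range(n)]
    return all(abs(P[i][j] - (1 if i == j else 0)) < tol
               for i in range(n) for j in range(n))

t = math.pi / 6
R = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]      # rotation by 30 degrees

print(is_orthogonal(R))                 # True
print(is_orthogonal([[1, 1], [0, 1]]))  # False: columns are not orthogonal
```

The tolerance is needed because floating-point arithmetic makes $O^T O$ only approximately equal to $I$.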

## 16. Quadratic forms, positive (semi)definite matrices

(1) A real quadratic form in $n$ variables can be written in the form $x^T A x$ for a unique symmetric $A$. The spectral theorem gives a change of variables $x = Oy$ such that in the new variables the form becomes $y^T(O^T A O)y = \sum_i \lambda_i y_i^2$, where the $\lambda_i$ are the eigenvalues of $A$.

(2) In particular, $x^T A x \geq 0$ for all $x$ if and only if all the $\lambda_i$ are non-negative, and in this case we say $A$ is positive semidefinite. $x^T A x \geq 0$ for all $x$ with equality only at $x = 0$ if and only if all the $\lambda_i$ are positive, and in this case we say $A$ is positive definite.
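For a $2 \times 2$ symmetric matrix the eigenvalue test for definiteness can be carried out by hand, since the characteristic equation is a quadratic. A sketch with my own helper names; the discriminant $(a-c)^2 + 4b^2 \geq 0$ confirms that the eigenvalues of a symmetric matrix are real, as the spectral theorem guarantees.

```python
import math

def eigenvalues_2x2_symmetric(a, b, c):
    """Eigenvalues of the symmetric matrix [[a, b], [b, c]]: the roots of
    λ² - (a+c)λ + (ac - b²) = 0, which are always real."""
    tr, det = a + c, a * c - b * b
    disc = math.sqrt(tr * tr - 4 * det)   # = sqrt((a-c)² + 4b²) ≥ 0
    return (tr - disc) / 2, (tr + disc) / 2

def classify(a, b, c):
    """Classify the quadratic form x^T A x by the signs of the eigenvalues."""
    lo, _hi = eigenvalues_2x2_symmetric(a, b, c)
    if lo > 0:
        return "positive definite"
    if lo >= 0:
        return "positive semidefinite"
    return "not positive semidefinite"

print(classify(2, 1, 2))   # positive definite: eigenvalues 1 and 3
print(classify(1, 1, 1))   # positive semidefinite: eigenvalues 0 and 2
print(classify(0, 1, 0))   # not positive semidefinite: eigenvalues -1 and 1
```

For instance, $2x_1^2 + 2x_1x_2 + 2x_2^2 = y_1^2 + 3y_2^2$ after rotating to the eigenbasis, which is visibly positive away from zero.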


### Math Linear Algebra Final Exam Review Sheet Math 15-1 Linear Algebra Final Exam Review Sheet Vector Operations Vector addition is a component-wise operation. Two vectors v and w may be added together as long as they contain the same number n of

### October 25, 2013 INNER PRODUCT SPACES October 25, 2013 INNER PRODUCT SPACES RODICA D. COSTIN Contents 1. Inner product 2 1.1. Inner product 2 1.2. Inner product spaces 4 2. Orthogonal bases 5 2.1. Existence of an orthogonal basis 7 2.2. Orthogonal

### Final Review Sheet. B = (1, 1 + 3x, 1 + x 2 ) then 2 + 3x + 6x 2 Final Review Sheet The final will cover Sections Chapters 1,2,3 and 4, as well as sections 5.1-5.4, 6.1-6.2 and 7.1-7.3 from chapters 5,6 and 7. This is essentially all material covered this term. Watch

### Linear Algebra Practice Problems Linear Algebra Practice Problems Page of 7 Linear Algebra Practice Problems These problems cover Chapters 4, 5, 6, and 7 of Elementary Linear Algebra, 6th ed, by Ron Larson and David Falvo (ISBN-3 = 978--68-78376-2,

### Chapter 1 Vector Spaces Chapter 1 Vector Spaces Per-Olof Persson persson@berkeley.edu Department of Mathematics University of California, Berkeley Math 110 Linear Algebra Vector Spaces Definition A vector space V over a field

### Linear Algebra Lecture Notes-II Linear Algebra Lecture Notes-II Vikas Bist Department of Mathematics Panjab University, Chandigarh-64 email: bistvikas@gmail.com Last revised on March 5, 8 This text is based on the lectures delivered

### CS 246 Review of Linear Algebra 01/17/19 1 Linear algebra In this section we will discuss vectors and matrices. We denote the (i, j)th entry of a matrix A as A ij, and the ith entry of a vector as v i. 1.1 Vectors and vector operations A vector

### Foundations of Matrix Analysis 1 Foundations of Matrix Analysis In this chapter we recall the basic elements of linear algebra which will be employed in the remainder of the text For most of the proofs as well as for the details, the

### MATH 1120 (LINEAR ALGEBRA 1), FINAL EXAM FALL 2011 SOLUTIONS TO PRACTICE VERSION MATH (LINEAR ALGEBRA ) FINAL EXAM FALL SOLUTIONS TO PRACTICE VERSION Problem (a) For each matrix below (i) find a basis for its column space (ii) find a basis for its row space (iii) determine whether

### Mathematics Department Stanford University Math 61CM/DM Inner products Mathematics Department Stanford University Math 61CM/DM Inner products Recall the definition of an inner product space; see Appendix A.8 of the textbook. Definition 1 An inner product space V is a vector

### Final Review Written by Victoria Kala SH 6432u Office Hours R 12:30 1:30pm Last Updated 11/30/2015 Final Review Written by Victoria Kala vtkala@mathucsbedu SH 6432u Office Hours R 12:30 1:30pm Last Updated 11/30/2015 Summary This review contains notes on sections 44 47, 51 53, 61, 62, 65 For your final,

### SUMMARY OF MATH 1600 SUMMARY OF MATH 1600 Note: The following list is intended as a study guide for the final exam. It is a continuation of the study guide for the midterm. It does not claim to be a comprehensive list. You

### ALGEBRA QUALIFYING EXAM PROBLEMS LINEAR ALGEBRA ALGEBRA QUALIFYING EXAM PROBLEMS LINEAR ALGEBRA Kent State University Department of Mathematical Sciences Compiled and Maintained by Donald L. White Version: August 29, 2017 CONTENTS LINEAR ALGEBRA AND

### Linear Algebra Review. Vectors Linear Algebra Review 9/4/7 Linear Algebra Review By Tim K. Marks UCSD Borrows heavily from: Jana Kosecka http://cs.gmu.edu/~kosecka/cs682.html Virginia de Sa (UCSD) Cogsci 8F Linear Algebra review Vectors

### Lecture notes - Math 110 Lec 002, Summer The reference [LADR] stands for Axler s Linear Algebra Done Right, 3rd edition. Lecture notes - Math 110 Lec 002, Summer 2016 BW The reference [LADR] stands for Axler s Linear Algebra Done Right, 3rd edition. 1 Contents 1 Sets and fields - 6/20 5 1.1 Set notation.................................

### Real symmetric matrices/1. 1 Eigenvalues and eigenvectors Real symmetric matrices 1 Eigenvalues and eigenvectors We use the convention that vectors are row vectors and matrices act on the right. Let A be a square matrix with entries in a field F; suppose that

### Linear Algebra. Session 12 Linear Algebra. Session 12 Dr. Marco A Roque Sol 08/01/2017 Example 12.1 Find the constant function that is the least squares fit to the following data x 0 1 2 3 f(x) 1 0 1 2 Solution c = 1 c = 0 f (x)

### Lecture Summaries for Linear Algebra M51A These lecture summaries may also be viewed online by clicking the L icon at the top right of any lecture screen. Lecture Summaries for Linear Algebra M51A refers to the section in the textbook. Lecture

### Numerical Linear Algebra University of Alabama at Birmingham Department of Mathematics Numerical Linear Algebra Lecture Notes for MA 660 (1997 2014) Dr Nikolai Chernov April 2014 Chapter 0 Review of Linear Algebra 0.1 Matrices

### 2. Linear algebra. matrices and vectors. linear equations. range and nullspace of matrices. function of vectors, gradient and Hessian FE661 - Statistical Methods for Financial Engineering 2. Linear algebra Jitkomut Songsiri matrices and vectors linear equations range and nullspace of matrices function of vectors, gradient and Hessian

### BASIC ALGORITHMS IN LINEAR ALGEBRA. Matrices and Applications of Gaussian Elimination. A 2 x. A T m x. A 1 x A T 1. A m x BASIC ALGORITHMS IN LINEAR ALGEBRA STEVEN DALE CUTKOSKY Matrices and Applications of Gaussian Elimination Systems of Equations Suppose that A is an n n matrix with coefficents in a field F, and x = (x,,

### LINEAR ALGEBRA SUMMARY SHEET. LINEAR ALGEBRA SUMMARY SHEET RADON ROSBOROUGH https://intuitiveexplanationscom/linear-algebra-summary-sheet/ This document is a concise collection of many of the important theorems of linear algebra, organized

### MATH 23a, FALL 2002 THEORETICAL LINEAR ALGEBRA AND MULTIVARIABLE CALCULUS Solutions to Final Exam (in-class portion) January 22, 2003 MATH 23a, FALL 2002 THEORETICAL LINEAR ALGEBRA AND MULTIVARIABLE CALCULUS Solutions to Final Exam (in-class portion) January 22, 2003 1. True or False (28 points, 2 each) T or F If V is a vector space

### Review of Linear Algebra Definitions, Change of Basis, Trace, Spectral Theorem Review of Linear Algebra Definitions, Change of Basis, Trace, Spectral Theorem Steven J. Miller June 19, 2004 Abstract Matrices can be thought of as rectangular (often square) arrays of numbers, or as

### Glossary of Linear Algebra Terms. Prepared by Vince Zaccone For Campus Learning Assistance Services at UCSB Glossary of Linear Algebra Terms Basis (for a subspace) A linearly independent set of vectors that spans the space Basic Variable A variable in a linear system that corresponds to a pivot column in the

### Math 593: Problem Set 10 Math 593: Problem Set Feng Zhu, edited by Prof Smith Hermitian inner-product spaces (a By conjugate-symmetry and linearity in the first argument, f(v, λw = f(λw, v = λf(w, v = λf(w, v = λf(v, w. (b We

### Last name: First name: Signature: Student number: MAT 2141 The final exam Instructor: K. Zaynullin Last name: First name: Signature: Student number: Do not detach the pages of this examination. You may use the back of the pages as scrap paper for calculations,

### W2 ) = dim(w 1 )+ dim(w 2 ) for any two finite dimensional subspaces W 1, W 2 of V. MA322 Sathaye Final Preparations Spring 2017 The final MA 322 exams will be given as described in the course web site (following the Registrar s listing. You should check and verify that you do not have

### Knowledge Discovery and Data Mining 1 (VO) ( ) Knowledge Discovery and Data Mining 1 (VO) (707.003) Review of Linear Algebra Denis Helic KTI, TU Graz Oct 9, 2014 Denis Helic (KTI, TU Graz) KDDM1 Oct 9, 2014 1 / 74 Big picture: KDDM Probability Theory

### MA 265 FINAL EXAM Fall 2012 MA 265 FINAL EXAM Fall 22 NAME: INSTRUCTOR S NAME:. There are a total of 25 problems. You should show work on the exam sheet, and pencil in the correct answer on the scantron. 2. No books, notes, or calculators

### Solutions to Final Exam Solutions to Final Exam. Let A be a 3 5 matrix. Let b be a nonzero 5-vector. Assume that the nullity of A is. (a) What is the rank of A? 3 (b) Are the rows of A linearly independent? (c) Are the columns

### 1 Last time: least-squares problems MATH Linear algebra (Fall 07) Lecture Last time: least-squares problems Definition. If A is an m n matrix and b R m, then a least-squares solution to the linear system Ax = b is a vector x R n such that

### Lecture 23: 6.1 Inner Products Lecture 23: 6.1 Inner Products Wei-Ta Chu 2008/12/17 Definition An inner product on a real vector space V is a function that associates a real number u, vwith each pair of vectors u and v in V in such

### Review of Linear Algebra Review of Linear Algebra Definitions An m n (read "m by n") matrix, is a rectangular array of entries, where m is the number of rows and n the number of columns. 2 Definitions (Con t) A is square if m=

### GRE Subject test preparation Spring 2016 Topic: Abstract Algebra, Linear Algebra, Number Theory. GRE Subject test preparation Spring 2016 Topic: Abstract Algebra, Linear Algebra, Number Theory. Linear Algebra Standard matrix manipulation to compute the kernel, intersection of subspaces, column spaces,

### Vector Spaces and Linear Transformations Vector Spaces and Linear Transformations Wei Shi, Jinan University 2017.11.1 1 / 18 Definition (Field) A field F = {F, +, } is an algebraic structure formed by a set F, and closed under binary operations

### Math Linear Algebra II. 1. Inner Products and Norms Math 342 - Linear Algebra II Notes 1. Inner Products and Norms One knows from a basic introduction to vectors in R n Math 254 at OSU) that the length of a vector x = x 1 x 2... x n ) T R n, denoted x,

### MATH 323 Linear Algebra Lecture 12: Basis of a vector space (continued). Rank and nullity of a matrix. MATH 323 Linear Algebra Lecture 12: Basis of a vector space (continued). Rank and nullity of a matrix. Basis Definition. Let V be a vector space. A linearly independent spanning set for V is called a basis.

### 1 9/5 Matrices, vectors, and their applications 1 9/5 Matrices, vectors, and their applications Algebra: study of objects and operations on them. Linear algebra: object: matrices and vectors. operations: addition, multiplication etc. Algorithms/Geometric

### OHSx XM511 Linear Algebra: Solutions to Online True/False Exercises This document gives the solutions to all of the online exercises for OHSx XM511. The section ( ) numbers refer to the textbook. TYPE I are True/False. Answers are in square brackets [. Lecture 02 ( 1.1)

### PRACTICE PROBLEMS FOR THE FINAL PRACTICE PROBLEMS FOR THE FINAL Here are a slew of practice problems for the final culled from old exams:. Let P be the vector space of polynomials of degree at most. Let B = {, (t ), t + t }. (a) Show

### Vector spaces. EE 387, Notes 8, Handout #12 Vector spaces EE 387, Notes 8, Handout #12 A vector space V of vectors over a field F of scalars is a set with a binary operator + on V and a scalar-vector product satisfying these axioms: 1. (V, +) is

### WOMP 2001: LINEAR ALGEBRA. 1. Vector spaces WOMP 2001: LINEAR ALGEBRA DAN GROSSMAN Reference Roman, S Advanced Linear Algebra, GTM #135 (Not very good) Let k be a field, eg, R, Q, C, F q, K(t), 1 Vector spaces Definition A vector space over k is

### Linear Algebra: Matrix Eigenvalue Problems CHAPTER8 Linear Algebra: Matrix Eigenvalue Problems Chapter 8 p1 A matrix eigenvalue problem considers the vector equation (1) Ax = λx. 8.0 Linear Algebra: Matrix Eigenvalue Problems Here A is a given

### MAT Linear Algebra Collection of sample exams MAT 342 - Linear Algebra Collection of sample exams A-x. (0 pts Give the precise definition of the row echelon form. 2. ( 0 pts After performing row reductions on the augmented matrix for a certain system

### Math 113 Practice Final Solutions Math 113 Practice Final Solutions 1 There are 9 problems; attempt all of them. Problem 9 (i) is a regular problem, but 9(ii)-(iii) are bonus problems, and they are not part of your regular score. So do

### MATH 31 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL MATH 3 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL MAIN TOPICS FOR THE FINAL EXAM:. Vectors. Dot product. Cross product. Geometric applications. 2. Row reduction. Null space, column space, row space, left

### Online Exercises for Linear Algebra XM511 This document lists the online exercises for XM511. The section ( ) numbers refer to the textbook. TYPE I are True/False. Lecture 02 ( 1.1) Online Exercises for Linear Algebra XM511 1) The matrix [3 2

### Linear Algebra II. Ulrike Tillmann. January 4, 2018 Linear Algebra II Ulrike Tillmann January 4, 208 This course is a continuation of Linear Algebra I and will foreshadow much of what will be discussed in more detail in the Linear Algebra course in Part

### Math 4153 Exam 3 Review. The syllabus for Exam 3 is Chapter 6 (pages ), Chapter 7 through page 137, and Chapter 8 through page 182 in Axler. Math 453 Exam 3 Review The syllabus for Exam 3 is Chapter 6 (pages -2), Chapter 7 through page 37, and Chapter 8 through page 82 in Axler.. You should be sure to know precise definition of the terms we

### A matrix is a rectangular array of. objects arranged in rows and columns. The objects are called the entries. is called the size of the matrix, and Section 5.5. Matrices and Vectors A matrix is a rectangular array of objects arranged in rows and columns. The objects are called the entries. A matrix with m rows and n columns is called an m n matrix.

### Mathematical foundations - linear algebra Mathematical foundations - linear algebra Andrea Passerini passerini@disi.unitn.it Machine Learning Vector space Definition (over reals) A set X is called a vector space over IR if addition and scalar MA322 F all203 Notes on Inner Products Notes on Chapter 6 Inner product. Given a real vector space V, an inner product is defined to be a bilinear map F : V V R such that the following holds: For all v, NOTES ON BILINEAR FORMS PARAMESWARAN SANKARAN These notes are intended as a supplement to the talk given by the author at the IMSc Outreach Programme Enriching Collegiate Education-2015. Symmetric bilinear