Linear algebra 2. Yoav Zemel. March 1, 2012

These notes were written by Yoav Zemel. The lecturer, Shmuel Berger, should not be held responsible for any mistakes. Any comments are welcome.

Linear transformations in an inner product space

Let V be an inner product space and T : V → V a linear transformation. We wish to define its conjugate transformation T* : V → V. The map (α, β) ↦ ⟨T(α), β⟩ is the bilinear form defined by T.

Lemma 3.1 Let α, β, γ, δ ∈ V and a, b ∈ F. Then
(1) ⟨T(α + γ), β⟩ = ⟨T(α), β⟩ + ⟨T(γ), β⟩.
(2) ⟨T(α), β + δ⟩ = ⟨T(α), β⟩ + ⟨T(α), δ⟩.
(3) ⟨T(aα), β⟩ = a⟨T(α), β⟩.
(4) ⟨T(α), bβ⟩ = b̄⟨T(α), β⟩.

Proof. Trivial, by linearity of T and the properties of the inner product.

Lemma 3.2 Let β, γ ∈ V.
(1) If ⟨α, β⟩ = ⟨α, γ⟩ for every α ∈ V, or ⟨β, α⟩ = ⟨γ, α⟩ for every α ∈ V, then γ = β.
(2) Let T and S be linear transformations V → V. If ⟨T(α), β⟩ = ⟨S(α), β⟩ for all α, β ∈ V, or ⟨α, T(β)⟩ = ⟨α, S(β)⟩ for all α, β ∈ V, then T = S.
(3) If ⟨T(α), β⟩ = 0 for all α, β ∈ V, or ⟨α, T(β)⟩ = 0 for all α, β ∈ V, then T = 0.

Proof. (1) If ⟨α, β − γ⟩ = 0 for every α, then β − γ is orthogonal to all vectors, and therefore equals 0. The other case is analogous.

(2) If ⟨T(α) − S(α), β⟩ = 0 for every β, then T(α) − S(α) = 0. This holds for every α, hence T = S. The other case is analogous.
(3) We have ⟨T(α), β⟩ = 0 = ⟨S(α), β⟩, where S is the zero transformation. By (2), T = 0; the other case is analogous.

So T determines the bilinear form ⟨T(α), β⟩, and by this lemma the form in turn determines T.

If ⟨T(α), α⟩ = 0 for every α, it does not necessarily follow that T = 0 when F = ℝ, e.g. when T is a rotation by 90 degrees. However, if F = ℂ then it is true.

Proof. For any α, β ∈ V we have
0 = ⟨T(α + β), α + β⟩ = ⟨T(α), α⟩ + ⟨T(α), β⟩ + ⟨T(β), α⟩ + ⟨T(β), β⟩ = ⟨T(α), β⟩ + ⟨T(β), α⟩.
Replacing β by iβ and using ī = −i, we get i⟨T(β), α⟩ − i⟨T(α), β⟩ = 0. On the other hand, multiplying the first identity by i gives i⟨T(β), α⟩ + i⟨T(α), β⟩ = 0. Adding the two, ⟨T(β), α⟩ = 0 for all α and β, and thus T = 0.

Let β ∈ V and consider the linear functional φ_β : V → F defined by φ_β(α) = ⟨α, β⟩. Note that the map α ↦ ⟨β, α⟩ is linear only over ℝ; over ℂ it is conjugate-linear.

Theorem 3.3 For any linear functional φ : V → F there exists a unique β for which φ = φ_β.

Note: this is true only when dim V < ∞. For example, take V = the continuous functions on [-1,1] with the integral inner product; the functional φ(f) = f(0) is not of the form φ_β for any β.

Proof. Suppose that φ_β = φ_γ. Then ⟨α, β⟩ = ⟨α, γ⟩ for every α, and by a previous lemma γ = β. This establishes uniqueness. For the existence, consider the collection of all functionals φ_β for β ∈ V. This is a subspace of the dual space V*. As dim V* = dim V, it suffices to show that n = dim V = dim{φ_β : β ∈ V}, and this will establish V* = {φ_β : β ∈ V}. Let S : V → V* be defined by S(β) = φ_β. S is linear over ℝ, and almost linear over ℂ:
S(β_1 + β_2)(α) = φ_{β_1+β_2}(α) = ⟨α, β_1 + β_2⟩ = ⟨α, β_1⟩ + ⟨α, β_2⟩ = φ_{β_1}(α) + φ_{β_2}(α).
Thus S(β_1 + β_2) = S(β_1) + S(β_2). But
S(cβ)(α) = ⟨α, cβ⟩ = c̄⟨α, β⟩ = c̄S(β)(α) ≠ cS(β)(α) in general,

so S is not necessarily linear. S is injective, because φ_β = φ_γ implies β = γ. We know that if dim V = n and T : V → W is an injective linear map, then its image T(V) is a subspace of W of dimension n. The same proof works for S, even though it is only almost linear (conjugate-linear). It follows that Im(S) = V*, and the existence proof is complete.

Theorem 3.4 (conjugate transformation) Let V be an inner product space of dimension n < ∞. For any linear transformation T : V → V there exists a unique linear transformation T* : V → V such that ⟨T(α), β⟩ = ⟨α, T*(β)⟩ for all α, β. T* is the conjugate transformation of T.

Proof. (Uniqueness) If ⟨α, T*(β)⟩ = ⟨α, S*(β)⟩ for all α, β, then T* = S* by a previous lemma.

(Existence) Let β ∈ V and consider φ_β(α) = ⟨T(α), β⟩. By Theorem 3.3 there exists γ ∈ V such that φ_β(α) = ⟨α, γ⟩ for every α. Define T*(β) = γ, which is well defined by the uniqueness of γ. Then ⟨T(α), β⟩ = ⟨α, T*(β)⟩ for every α. Carrying out this procedure for every β defines T* on the whole of V. All that remains is to show that T* is linear. Let α, β, γ ∈ V. Then
⟨α, T*(β + γ)⟩ = ⟨T(α), β + γ⟩ = ⟨T(α), β⟩ + ⟨T(α), γ⟩ = ⟨α, T*(β)⟩ + ⟨α, T*(γ)⟩ = ⟨α, T*(β) + T*(γ)⟩.
This holds for every α, so by Lemma 3.2, T*(β + γ) = T*(β) + T*(γ). For a scalar s ∈ F,
⟨α, T*(sβ)⟩ = ⟨T(α), sβ⟩ = s̄⟨T(α), β⟩ = s̄⟨α, T*(β)⟩ = ⟨α, sT*(β)⟩.
This, again, holds for every α, and therefore T*(sβ) = sT*(β). T* is linear, and the proof is complete.

Theorem 3.5 Let T : V → V and let T* be its conjugate. Let {e_1, ..., e_n} be an orthonormal basis of V, and let A be the matrix that represents T in this basis. Then the matrix that represents T* is the matrix A* defined by (A*)_ij = conj(A_ji) for all i, j. A* is the conjugate matrix of A.

Proof. The j-th column of A is the coordinate vector of T(e_j) in the given basis, and its i-th entry is A_ij. Since the basis is orthonormal, this entry also equals ⟨T(e_j), e_i⟩. Thus the (i, j) entry of the matrix representing T* is
⟨T*(e_j), e_i⟩ = conj⟨e_i, T*(e_j)⟩ = conj⟨T(e_i), e_j⟩ = conj(A_ji).
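Theorem 3.3 can be illustrated numerically. The sketch below (numpy, standard inner product on ℂ³; the coefficient vector c is an arbitrary choice for illustration) recovers the representing vector β of a functional φ as the componentwise conjugate of its coefficients:

```python
import numpy as np

# Inner product of the notes on C^3: linear in the first slot,
# conjugate-linear in the second.
def ip(u, v):
    return np.vdot(v, u)  # np.vdot conjugates its first argument

# A linear functional phi(a) = sum_i c_i a_i, given by an arbitrary
# coefficient vector c.
c = np.array([2.0, 1 - 1j, 3j])
def phi(a):
    return c @ a

# Theorem 3.3: phi = phi_beta for a unique beta. Here beta is the
# componentwise conjugate of c, since <a, beta> = sum_i a_i conj(beta_i).
beta = np.conj(c)

rng = np.random.default_rng(0)
a = rng.standard_normal(3) + 1j * rng.standard_normal(3)
assert np.isclose(phi(a), ip(a, beta))
```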

Consider ℂ² with the standard inner product, and T(z_1, z_2) = (z_1 + iz_2, 2z_2 + (2 − 3i)z_1). Then T(1, 0) = (1, 2 − 3i) = 1·(1, 0) + (2 − 3i)·(0, 1) and T(0, 1) = (i, 2) = i·(1, 0) + 2·(0, 1), so the matrix of T is

    A = ( 1       i )
        ( 2 − 3i  2 )

and

    A* = ( 1    2 + 3i )
         ( −i   2      ).

This gives T*(z_1, z_2) = (z_1 + (2 + 3i)z_2, −iz_1 + 2z_2).

3.1 Properties of the conjugate transformation

Consider the map * : hom(V, V) → hom(V, V).

Lemma 3.6
(1) (T + S)* = T* + S*.
(2) (aT)* = āT*.
(3) (TS)* = S*T*.
(4) (T*)* = T.
(5) The map * is bijective.

Proof. Rather immediate. For (1), observe that
⟨α, (T + S)*(β)⟩ = ⟨(T + S)(α), β⟩ = ⟨T(α) + S(α), β⟩ = ⟨T(α), β⟩ + ⟨S(α), β⟩ = ⟨α, T*(β)⟩ + ⟨α, S*(β)⟩ = ⟨α, T*(β) + S*(β)⟩.
As this holds for any α, (T + S)*(β) = T*(β) + S*(β). As this holds for any β, (T + S)* = T* + S*.
(2) Similarly,
⟨α, (aT)*(β)⟩ = ⟨(aT)(α), β⟩ = a⟨T(α), β⟩ = a⟨α, T*(β)⟩ = ⟨α, āT*(β)⟩.
As this holds for any α, (aT)*(β) = āT*(β). As this holds for any β, (aT)* = āT*.
(3) Here we have
⟨α, (TS)*(β)⟩ = ⟨(TS)(α), β⟩ = ⟨T(S(α)), β⟩ = ⟨S(α), T*(β)⟩ = ⟨α, S*(T*(β))⟩.
This holds for any α, so (TS)*(β) = S*(T*(β)). This holds for any β, so (TS)* = S*T*.
(4) Observe that
⟨α, (T*)*(β)⟩ = ⟨T*(α), β⟩ = conj⟨β, T*(α)⟩ = conj⟨T(β), α⟩ = ⟨α, T(β)⟩.
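The example can be verified with numpy, where the conjugate matrix A* of Theorem 3.5 is the conjugate transpose `.conj().T` (a sketch; the random test vectors are arbitrary):

```python
import numpy as np

def ip(u, v):
    # Inner product of the notes: linear in the first slot,
    # conjugate-linear in the second.
    return np.vdot(v, u)

# Matrix of T from the example (columns are T(1, 0) and T(0, 1)).
A = np.array([[1, 1j],
              [2 - 3j, 2]])
A_star = A.conj().T  # the conjugate matrix A* of Theorem 3.5

# A* as computed in the text.
expected = np.array([[1, 2 + 3j],
                     [-1j, 2]])
assert np.allclose(A_star, expected)

# Defining property of the conjugate transformation: <T(z), w> = <z, T*(w)>.
rng = np.random.default_rng(1)
z = rng.standard_normal(2) + 1j * rng.standard_normal(2)
w = rng.standard_normal(2) + 1j * rng.standard_normal(2)
assert np.isclose(ip(A @ z, w), ip(z, A_star @ w))
```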

This holds for any α, so (T*)*(β) = T(β). This holds for any β, so (T*)* = T.
(5) If T* = S* then (T*)* = (S*)*, and by (4) T = S. Thus * is injective. For a given T, the transformation T* exists and (T*)* = T, so * is also surjective.

Since the * operation is compatible with the correspondence between matrices and transformations, these properties hold for matrices as well. It is left as an exercise to prove them directly for matrices.

Let I be the identity transformation. Then I* = I, since ⟨I(α), β⟩ = ⟨α, I(β)⟩ = ⟨α, I*(β)⟩.

Definition 3.7 T is conjugate to itself if T* = T. When F = ℝ, it is said to be symmetric. When F = ℂ, it is said to be Hermitian.

An n × n matrix is conjugate to itself if A* = A; when F = ℝ this happens precisely when A is symmetric, and when F = ℂ such an A is said to be Hermitian. If A represents T under an orthonormal basis, then A is Hermitian if and only if T is Hermitian.

Lemma 3.8 Let T be conjugate to itself and suppose that ⟨T(α), α⟩ = 0 for every α. Then T = 0.

Proof. If F = ℂ we do not even need T to be conjugate to itself (as shown above), so it suffices to prove the lemma for F = ℝ. One has
0 = ⟨T(α + β), α + β⟩ = ⟨T(α), α⟩ + ⟨T(β), β⟩ + ⟨T(α), β⟩ + ⟨T(β), α⟩ = ⟨T(α), β⟩ + ⟨T(β), α⟩ = ⟨T(α), β⟩ + ⟨β, T(α)⟩ = 2⟨T(α), β⟩,
where the last equality holds because F = ℝ and the one before it because T is conjugate to itself. Since ⟨T(α), β⟩ = 0 for all α and β, it follows that T = 0.

Theorem 3.9 Let V be a unitary space. Then T is conjugate to itself if and only if ⟨T(α), α⟩ ∈ ℝ for every α ∈ V. (Over ℝ the condition holds trivially for every T, so the statement has content only when F = ℂ.)

Proof. Suppose that T is conjugate to itself. Then for any α we have
⟨T(α), α⟩ = ⟨α, T*(α)⟩ = ⟨α, T(α)⟩ = conj⟨T(α), α⟩,
from which it follows that ⟨T(α), α⟩ is indeed a real number. For the converse, reversing the argument shows that ⟨α, (T − T*)(α)⟩ = 0 for every α ∈ V, and since F = ℂ it follows that T = T*.
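A quick numerical sketch of Theorem 3.9: for a Hermitian matrix H, the number ⟨Hα, α⟩ is real for every α (the matrix B below is an arbitrary random choice; B + B* is Hermitian by construction):

```python
import numpy as np

def ip(u, v):
    return np.vdot(v, u)  # linear in u, conjugate-linear in v

rng = np.random.default_rng(2)

# A Hermitian matrix: H = B + B*, so that H* = H.
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = B + B.conj().T
assert np.allclose(H, H.conj().T)

# Theorem 3.9: <H a, a> is a real number for every a.
for _ in range(5):
    a = rng.standard_normal(3) + 1j * rng.standard_normal(3)
    assert abs(ip(H @ a, a).imag) < 1e-10
```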

Remark 1 Obviously, T is conjugate to itself if and only if T* is conjugate to itself.

Definition 3.10 T is unitary if it preserves the inner product, i.e. for any α, β ∈ V we have ⟨T(α), T(β)⟩ = ⟨α, β⟩. In a Euclidean space a unitary transformation is called orthogonal.

Theorem 3.11 The following are equivalent:
(1) T is unitary.
(2) T preserves the norm, i.e. ‖α‖ = ‖T(α)‖ for every α.
(3) T*T = I.
(4) T maps every orthonormal basis to an orthonormal basis.
(5) T maps some orthonormal basis to an orthonormal basis.

Remark 2 (2) is equivalent to T preserving distances, i.e. ‖T(α) − T(β)‖ = ‖T(α − β)‖ = ‖α − β‖. (3) means that T and T* are invertible, and each is the inverse of the other. Clearly this gives TT* = I as well. Therefore, if T is unitary then so is T*.

Proof. (1) ⇒ (2): If T is unitary, then ‖T(α)‖² = ⟨T(α), T(α)⟩ = ⟨α, α⟩ = ‖α‖², so T preserves the norm.

(2) ⇒ (3): If T preserves the norm, then ⟨α, T*T(α)⟩ = ⟨T(α), T(α)⟩ = ⟨α, α⟩. Thus
0 = ⟨α, T*T(α) − α⟩ = ⟨α, (T*T − I)(α)⟩ = ⟨(T*T − I)(α), α⟩,
where the last equality holds because T*T − I is conjugate to itself: (T*T − I)* = T*T − I. By Lemma 3.8, T*T − I = 0.

(3) ⇒ (4): Suppose T*T = I and let {e_1, ..., e_n} be an orthonormal basis for V, so ⟨e_i, e_j⟩ = δ_ij. As T is invertible, {T(e_1), ..., T(e_n)} has to be a basis. It remains to show that it is orthonormal:
⟨T(e_i), T(e_j)⟩ = ⟨e_i, T*T(e_j)⟩ = ⟨e_i, e_j⟩ = δ_ij,
so {T(e_1), ..., T(e_n)} is an orthonormal basis.

(4) ⇒ (5): Immediate, since orthonormal bases exist.

(5) ⇒ (1): Suppose that {e_1, ..., e_n} and {T(e_1), ..., T(e_n)} are orthonormal bases. Then for α = Σ_i a_i e_i and β = Σ_j b_j e_j we have
⟨T(α), T(β)⟩ = ⟨Σ_i a_i T(e_i), Σ_j b_j T(e_j)⟩ = Σ_{i,j} a_i b̄_j ⟨T(e_i), T(e_j)⟩ = Σ_{i,j} a_i b̄_j δ_ij = Σ_i a_i b̄_i = ⟨α, β⟩.

Definition 3.12 A matrix A is said to be unitary (orthogonal, if F = ℝ) if A*A = I (respectively AᵗA = I).
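A minimal numpy sketch of the equivalences in Theorem 3.11, for a plane rotation (an orthogonal, i.e. real unitary, transformation; the angle and test vector are arbitrary):

```python
import numpy as np

# An orthogonal (real unitary) matrix: rotation of the plane by angle a.
a = 0.7
T = np.array([[np.cos(a), -np.sin(a)],
              [np.sin(a),  np.cos(a)]])

# (3) T* T = I (over R, T* is the transpose).
assert np.allclose(T.T @ T, np.eye(2))

# (2) T preserves the norm.
rng = np.random.default_rng(3)
v = rng.standard_normal(2)
assert np.isclose(np.linalg.norm(T @ v), np.linalg.norm(v))

# (5) T maps the standard (orthonormal) basis to an orthonormal basis:
# the columns of T are orthonormal.
e1, e2 = T[:, 0], T[:, 1]
assert np.isclose(e1 @ e1, 1) and np.isclose(e2 @ e2, 1)
assert np.isclose(e1 @ e2, 0)
```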

Corollary 3.13 If A represents T with respect to an orthonormal basis, then T is unitary if and only if A is unitary.

Proof. If T is unitary then T*T = I, and as the basis is orthonormal, A* represents T*; thus A*A represents I, which gives A*A = I. If T is not unitary then T*T ≠ I, and A will not be unitary.

Example: the rotation matrix
    ( cos a  −sin a )
    ( sin a   cos a )
is an orthogonal matrix over ℝ, as is
    ( 1   0 )
    ( 0  −1 ).

A product of unitary matrices is also unitary: (TS)*(TS) = S*T*TS = S*IS = S*S = I.

Proposition 3.14 A is unitary if and only if there exists an inner product space V over F such that A is the transformation matrix from one orthonormal basis of V to another orthonormal basis of V.

Proof. Let A be unitary, set V = Fⁿ with the standard inner product, and define T : V → V by T(α) = Aα. Then T is a unitary linear transformation. Let k = {e_1, ..., e_n} be the standard basis of V. Then k_2 = {T(e_1), ..., T(e_n)} is an orthonormal basis as well, and A is the transformation matrix from k to k_2.

For the converse, we use the isomorphism that maps a vector to its coordinate vector under the orthonormal basis k = {e_1, ..., e_n}. Suppose that k_2 = {d_1, ..., d_n} and k are both orthonormal, and that A is the transformation matrix from k to k_2. Then T as defined above maps k to k_2, and is therefore unitary. A represents it under an orthonormal basis, so A is also unitary.

Theorem 3.15 Let A ∈ M_n(F) and consider its rows as vectors α_1, ..., α_n ∈ Fⁿ. Then A is unitary if and only if these rows form an orthonormal basis.

Proof. Write AA* = (b_ij); then A is unitary if and only if b_ij = δ_ij. On the other hand,
⟨α_i, α_j⟩ = Σ_k a_ik ā_jk = Σ_k a_ik (A*)_kj = b_ij.
We need ⟨α_i, α_j⟩ to equal δ_ij, and that happens if and only if A is unitary.

It follows immediately that A is unitary if and only if its columns form an orthonormal basis.

We wish to discover when V has an orthonormal basis consisting of eigenvectors of T.

Claim 1 If such a basis exists, then TT* = T*T.

Proof. If {e_1, ..., e_n} is an orthonormal basis of eigenvectors of T, then the matrix A which represents T under this basis is diagonal. A* is diagonal as well, and represents T*. AA* represents TT*, and A*A represents T*T, but AA* = A*A because A and A* are both diagonal, and diagonal matrices commute. Thus TT* = T*T.
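Claim 1 can be checked numerically: building T from a unitary matrix of basis vectors and a diagonal matrix of eigenvalues yields a normal matrix (a sketch; the random choices are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)

# Build T with an orthonormal basis of eigenvectors: T = U D U*,
# with U unitary (its columns form the basis) and D diagonal.
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, _ = np.linalg.qr(M)          # QR gives a matrix with orthonormal columns
D = np.diag(rng.standard_normal(3) + 1j * rng.standard_normal(3))
T = U @ D @ U.conj().T

# Claim 1: such a T satisfies T T* = T* T.
T_star = T.conj().T
assert np.allclose(T @ T_star, T_star @ T)
```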

Definition 3.16 A linear transformation T is said to be normal if TT* = T*T.

Examples: if T is conjugate to itself then it is normal; if T is unitary then it is normal.

Lemma 3.17 Let V be an inner product space over F and T : V → V normal. If c is an eigenvalue of T with eigenvector u, then c̄ is an eigenvalue of T* with the same eigenvector u.

Proof. As T is normal, for any w ∈ V we have
‖T(w)‖² = ⟨T(w), T(w)⟩ = ⟨w, T*T(w)⟩ = ⟨w, TT*(w)⟩ = ⟨T*(w), T*(w)⟩ = ‖T*(w)‖².
We know that T(u) = cu and wish to show that T*(u) = c̄u. Consider T − cI. As T is normal, T − cI is also normal:
(T − cI)(T − cI)* = (T − cI)(T* − c̄I) = TT* − c̄T − cT* + cc̄I
and
(T − cI)*(T − cI) = (T* − c̄I)(T − cI) = T*T − c̄T − cT* + c̄cI,
which are equal because T and T* commute. Thus T − cI is normal. It follows that
0 = ‖T(u) − cu‖ = ‖(T − cI)(u)‖ = ‖(T − cI)*(u)‖ = ‖(T* − c̄I)(u)‖ = ‖T*(u) − c̄u‖.
This gives T*(u) = c̄u.

Corollary 3.18 If T is normal, then any real eigenvalue of T is an eigenvalue of T*.

Lemma 3.19 Let T be normal. Then any two eigenvectors lying in different eigenspaces are orthogonal.

Proof. Suppose that u belongs to the eigenspace of a and w to that of b ≠ a. Then
a⟨u, w⟩ = ⟨au, w⟩ = ⟨T(u), w⟩ = ⟨u, T*(w)⟩ = ⟨u, b̄w⟩ = b⟨u, w⟩,
in virtue of the previous lemma. As a ≠ b, this means ⟨u, w⟩ = 0.

Lemma 3.20 Let V be an inner product space over ℂ with 0 < dim V = n < ∞. Then any linear transformation T : V → V has an eigenvector u ≠ 0.

Proof. The (not identically zero) characteristic polynomial splits over ℂ into linear factors, whose roots are all eigenvalues; thus there are nonzero eigenvectors.

Theorem 3.21 Let V be a unitary space over ℂ and let T : V → V be normal. Then V admits an orthonormal basis of eigenvectors of T.

Proof. Let U be the span of all the eigenvectors; this is a subspace of V. Let a_1, ..., a_k be the (distinct) eigenvalues of T and V_i ⊆ U the corresponding eigenspaces. For each V_i we choose an orthonormal basis, and define E to be the union of these bases. Since all the eigenvectors lie in the union of the V_i, we have U = Σ V_i = span(E). Write E = {u_1, ..., u_l} (l ≥ k); then E spans U. We have to show that E is independent and that l = n.

If i = j, then ⟨u_i, u_j⟩ = 1, because u_i is an element of an orthonormal basis of some eigenspace. If i ≠ j and u_i, u_j lie in the same eigenspace V_r, then ⟨u_i, u_j⟩ = 0 because they are part of an orthonormal basis of V_r. If i ≠ j and they lie in different eigenspaces, then ⟨u_i, u_j⟩ = 0 by a previous lemma. This proves that E is an orthonormal system, and in particular independent.

To see that l = n, write V = U ⊕ U^⊥, where U^⊥ is the orthogonal complement of U. We wish to show that U^⊥ = {0}, and we begin by showing that it is T-invariant, i.e. T(U^⊥) ⊆ U^⊥. Let β ∈ U^⊥, so ⟨u_j, β⟩ = 0 for every j. But
⟨u_j, T(β)⟩ = ⟨T*(u_j), β⟩ = ⟨x̄u_j, β⟩ = x̄⟨u_j, β⟩ = 0,
where x is the eigenvalue to whose eigenspace u_j belongs. Thus T(β) ∈ U^⊥.

Define S : U^⊥ → U^⊥ by S(α) = T(α). If there is some α ≠ 0 in U^⊥, then by a previous lemma S has an eigenvector. It is also an eigenvector of T, so it must lie in U, and then the intersection of U and U^⊥ is not {0}, a contradiction. This completes the proof.

Corollary 3.22 Let T : V → V be as before. Then V has an orthonormal basis under which the matrix of T is diagonal.

Corollary 3.23 For a normal A ∈ M_n(ℂ) there is a unitary U ∈ M_n(ℂ) with U*AU = U⁻¹AU diagonal.

In order to find U, one can proceed as follows:
(1) calculate the characteristic polynomial det(xI − A);
(2) find the eigenvalues of A, i.e. the roots of the polynomial;
(3) for each eigenvalue find a basis for the eigenspace;
(4) for each eigenspace find an orthonormal basis (use the Gram-Schmidt process);
(5) the union of those bases is an orthonormal basis of ℂⁿ consisting of eigenvectors of A.
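The procedure above can be sketched with numpy. For a Hermitian (hence normal) matrix, `numpy.linalg.eigh` performs steps (1)-(4) in one call, returning the eigenvalues together with an orthonormal basis of eigenvectors (a sketch, not the hand computation of the notes):

```python
import numpy as np

rng = np.random.default_rng(5)

# A normal matrix: here a Hermitian one, B + B*.
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T

# eigh returns the (real) eigenvalues of a Hermitian matrix and an
# orthonormal basis of eigenvectors, as the columns of U.
eigenvalues, U = np.linalg.eigh(A)

# U is unitary ...
assert np.allclose(U.conj().T @ U, np.eye(4))
# ... and U* A U = U^{-1} A U is the diagonal matrix of the eigenvalues.
assert np.allclose(U.conj().T @ A @ U, np.diag(eigenvalues))
```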
If the columns of U are (α_1, ..., α_n), then the similar matrix U⁻¹AU is the diagonal matrix

    diag(l_1, l_2, ..., l_n),

where l_i is the eigenvalue of α_i (l_1, ..., l_n do not have to be distinct).

Theorem 3.24 Let V be an inner product space over ℂ or ℝ and T : V → V normal. Then
(1) T is conjugate to itself if and only if all the eigenvalues of T (in ℂ) are real numbers.
(2) T is unitary if and only if all the eigenvalues have absolute value 1.

Example: the matrix
    ( 0  −1 )
    ( 1   0 )
is unitary (its rows are orthonormal) and has no real eigenvalues. But over ℂ it has the eigenvalues i and −i, both of modulus 1.

Proof. Let {e_1, ..., e_n} be an orthonormal basis of V under which the matrix D of T is diagonal. D is clearly normal.
(1) T is conjugate to itself if and only if D = D*, which is equivalent to D having only real diagonal entries, and these entries are exactly the eigenvalues of T.
(2) T is unitary if and only if D is unitary, if and only if DD* = I, if and only if l_i l̄_i = 1 for all i, if and only if |l_i| = 1 for all i.

Corollary 3.25 Let A be a symmetric matrix over ℝ. Then A is diagonalizable over ℝ.

Proof. A is symmetric, thus conjugate to itself, thus normal. Therefore all the eigenvalues are real, and the characteristic polynomial decomposes into linear factors over ℝ. As A is normal, it can be diagonalized by a unitary matrix: for each eigenvalue we look at the eigenspace and find an orthonormal basis. Since A and its eigenvalues are real, we can choose the elements of this basis to be real as well. The union of these bases is an orthonormal basis of ℝⁿ. The matrix U whose columns are the vectors of this basis is orthogonal, and U⁻¹AU is diagonal.

Corollary 3.26 Let V be a unitary space with dim V = n and let T : V → V. Then T is normal if and only if there exist a unitary transformation U and a Hermitian transformation H such that HU = UH = T.

Proof. If such H and U exist, then HU is normal, because (using H* = H and UH = HU twice)
HU(HU)* = HUU*H* = HH* = H² and (HU)*HU = U*H*HU = U*HHU = U*UHH = H²,
so HU(HU)* = (HU)*HU. Conversely, if T is normal, let {e_1, ..., e_n} be an orthonormal basis of V under which D, the matrix that represents T, is diagonal.
Write

    D = diag(l_1, l_2, ..., l_n),    l_j = r_j(cos t_j + i sin t_j),    r_j ∈ ℝ₊.

Set

    H = diag(r_1, r_2, ..., r_n),    U = diag(cos t_1 + i sin t_1, ..., cos t_n + i sin t_n).

Then H is Hermitian, because it is diagonal and all its eigenvalues are real. U is normal because it is diagonal, and it is unitary because all its eigenvalues have modulus 1. Clearly HU = UH = D. Now let H and U also denote the linear transformations whose matrices under {e_1, ..., e_n} are H and U respectively; then HU = UH = T.

4 Bilinear forms

Definition 4.1 Let W and V be vector spaces over a field F. A bilinear form is a function f : V × W → F with the following properties:
f(a + c, b) = f(a, b) + f(c, b),
f(a, b + c) = f(a, b) + f(a, c),
f(au, v) = f(u, av) = af(u, v).
Equivalently, for any v ∈ V and w ∈ W, the maps f(v, ·) : W → F and f(·, w) : V → F are linear functionals.

Examples: f(v, w) = 0 for all v, w is a bilinear form. If V is a Euclidean space over ℝ, then ⟨·, ·⟩ : V × V → F is bilinear. If V is unitary, ⟨·, ·⟩ is not bilinear, because it is not homogeneous in the second argument (unless V = {0}). If V* is the dual space of V, then f : V × V* → F defined by f(v, w) = w(v) is a bilinear form. Fix g ∈ W* and h ∈ V*; then f(v, w) = h(v)g(w) is also a bilinear form.

Let V = Fᵐ, W = Fⁿ and let A be an m × n matrix. Then A defines a bilinear form on V × W by f(v, w) = vᵗAw, where vᵗ is the transpose of v. This example is in fact the most general one, once we fix bases for V and W. Suppose that dim V = m and dim W = n, and let (v_1, ..., v_m) and (w_1, ..., w_n) be bases. For a bilinear form f, define the matrix A by A_ij = f(v_i, w_j).

Write v = Σ_{i=1}^m x_i v_i and w = Σ_{j=1}^n y_j w_j. Then
f(v, w) = f(Σ_i x_i v_i, Σ_j y_j w_j) = Σ_{i,j} x_i y_j f(v_i, w_j) = Σ_{i,j} x_i y_j A_ij = (x_1, ..., x_m) A (y_1, ..., y_n)ᵗ = vᵗAw,
where vᵗ = (x_1, ..., x_m) and wᵗ = (y_1, ..., y_n) are the coordinate row vectors. Given these two bases, f determines A and A determines f, so the map f ↦ A is bijective.

If W = V and A = I under the standard basis, we get f(u, v) = uᵗv.

Suppose that dim V = m and dim W = n. Define addition and scalar multiplication by
(f + g)(a, b) = f(a, b) + g(a, b) and (cf)(a, b) = cf(a, b).
Then the collection of bilinear forms is a vector space, of dimension mn, by the isomorphism f ↦ A. One can also show this without using matrices:

Lemma 4.2 Let V and W be vector spaces of finite dimension, and let {g_1, ..., g_m} and {h_1, ..., h_n} be bases of the dual spaces V* and W* respectively. Then (g_i h_j)(v, w) = g_i(v)h_j(w) is a bilinear form, and these mn products give a basis for the vector space of bilinear forms.

Theorem 4.3 Let f be a bilinear form on V × W, and consider its matrix A with respect to the bases {v_1, ..., v_m} and {w_1, ..., w_n}. Let P be the transformation matrix from {v_1, ..., v_m} to {v′_1, ..., v′_m}, and Q the transformation matrix from {w_1, ..., w_n} to {w′_1, ..., w′_n}. Then the matrix of f with respect to these new bases is PᵗAQ.

Proof. Let v ∈ V, w ∈ W, and write their coordinates under {v′_1, ..., v′_m} and {w′_1, ..., w′_n} as x = (x_1, ..., x_m) and y = (y_1, ..., y_n). Then the coordinates of v under {v_1, ..., v_m} are Px, and the coordinates of w under {w_1, ..., w_n} are Qy, and so
f(v, w) = (Px)ᵗ A (Qy) = xᵗ (PᵗAQ) y,
so the matrix with respect to the new bases is indeed PᵗAQ.

If V = W, we call f a bilinear form on V. In this case we use the same basis for both arguments. Thus, if the matrix of f under {v_1, ..., v_n} is A, and under {v′_1, ..., v′_n} it is A′, then A′ = PᵗAP, where P is the transformation matrix from {v_1, ..., v_n} to {v′_1, ..., v′_n}.
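The identity f(v, w) = vᵗAw and the change-of-basis formula PᵗAQ of Theorem 4.3 can be checked numerically (a sketch over ℝ; the random matrices P and Q are assumed invertible, which holds generically):

```python
import numpy as np

rng = np.random.default_rng(6)
m, n = 3, 2

# A bilinear form on R^m x R^n: f(v, w) = v^t A w for an m x n matrix A.
A = rng.standard_normal((m, n))
def f(v, w):
    return v @ A @ w

# Bilinearity in the first argument.
v, v2 = rng.standard_normal(m), rng.standard_normal(m)
w = rng.standard_normal(n)
assert np.isclose(f(v + v2, w), f(v, w) + f(v2, w))
assert np.isclose(f(3.0 * v, w), 3.0 * f(v, w))

# Change of basis (Theorem 4.3): if P and Q are the transition matrices,
# the matrix of f in the new bases is P^t A Q.
P = rng.standard_normal((m, m))  # assumed invertible
Q = rng.standard_normal((n, n))  # assumed invertible
A_new = P.T @ A @ Q
# Coordinates x, y in the new bases correspond to the vectors P x and Q y.
x, y = rng.standard_normal(m), rng.standard_normal(n)
assert np.isclose(f(P @ x, Q @ y), x @ A_new @ y)
```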

Definition 4.4 Two matrices A and B are congruent if there exists an invertible matrix P such that B = PᵗAP.

This defines an equivalence relation: A = IᵗAI; if B = PᵗAP then A = (Pᵗ)⁻¹BP⁻¹ = (P⁻¹)ᵗBP⁻¹; and if C = QᵗBQ and B = PᵗAP, then C = QᵗPᵗAPQ = (PQ)ᵗA(PQ), where PQ is invertible.

Lemma 4.5 Two n × n matrices are congruent if and only if they represent the same bilinear form on V (with respect to suitable bases).

Proof. Immediate.

Theorem 4.6 Let f be a bilinear form on V × W with dim V = m and dim W = n. Then one can choose bases {v_1, ..., v_m} and {w_1, ..., w_n} and 0 ≤ r ≤ min{m, n} such that

    f(v_i, w_j) = 1 if i = j ≤ r, and 0 otherwise.

In these bases the matrix of f is thus something of the sort of a diagonal matrix: an identity block of size r, with all remaining entries 0.
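As far as Theorem 4.6 is stated here, suitable bases put f into a rank-r normal form. One way to produce such matrices P and Q numerically (a sketch for F = ℝ, using the singular value decomposition rather than the notes' method) is:

```python
import numpy as np

rng = np.random.default_rng(7)
m, n, r = 4, 3, 2

# A real m x n matrix of rank r (a product of thin random factors).
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

# Find invertible P (m x m) and Q (n x n) with P^t A Q in the normal
# form of Theorem 4.6, via the SVD A = U S V^t.
U, s, Vt = np.linalg.svd(A)
P = U                        # orthogonal, so P^t = P^{-1}
D = np.ones(n)
D[:r] = 1.0 / s[:r]          # rescale the first r columns
Q = Vt.T @ np.diag(D)        # invertible: product of invertible matrices

N = P.T @ A @ Q
expected = np.zeros((m, n))
expected[:r, :r] = np.eye(r)  # identity block of size r, zeros elsewhere
assert np.allclose(N, expected)
```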


More information

Review of linear algebra

Review of linear algebra Review of linear algebra 1 Vectors and matrices We will just touch very briefly on certain aspects of linear algebra, most of which should be familiar. Recall that we deal with vectors, i.e. elements of

More information

2.2. Show that U 0 is a vector space. For each α 0 in F, show by example that U α does not satisfy closure.

2.2. Show that U 0 is a vector space. For each α 0 in F, show by example that U α does not satisfy closure. Hints for Exercises 1.3. This diagram says that f α = β g. I will prove f injective g injective. You should show g injective f injective. Assume f is injective. Now suppose g(x) = g(y) for some x, y A.

More information

A PRIMER ON SESQUILINEAR FORMS

A PRIMER ON SESQUILINEAR FORMS A PRIMER ON SESQUILINEAR FORMS BRIAN OSSERMAN This is an alternative presentation of most of the material from 8., 8.2, 8.3, 8.4, 8.5 and 8.8 of Artin s book. Any terminology (such as sesquilinear form

More information

is an isomorphism, and V = U W. Proof. Let u 1,..., u m be a basis of U, and add linearly independent

is an isomorphism, and V = U W. Proof. Let u 1,..., u m be a basis of U, and add linearly independent Lecture 4. G-Modules PCMI Summer 2015 Undergraduate Lectures on Flag Varieties Lecture 4. The categories of G-modules, mostly for finite groups, and a recipe for finding every irreducible G-module of a

More information

Chapter 1 Vector Spaces

Chapter 1 Vector Spaces Chapter 1 Vector Spaces Per-Olof Persson persson@berkeley.edu Department of Mathematics University of California, Berkeley Math 110 Linear Algebra Vector Spaces Definition A vector space V over a field

More information

Chapter 6: Orthogonality

Chapter 6: Orthogonality Chapter 6: Orthogonality (Last Updated: November 7, 7) These notes are derived primarily from Linear Algebra and its applications by David Lay (4ed). A few theorems have been moved around.. Inner products

More information

Last name: First name: Signature: Student number:

Last name: First name: Signature: Student number: MAT 2141 The final exam Instructor: K. Zaynullin Last name: First name: Signature: Student number: Do not detach the pages of this examination. You may use the back of the pages as scrap paper for calculations,

More information

Lecture notes: Applied linear algebra Part 1. Version 2

Lecture notes: Applied linear algebra Part 1. Version 2 Lecture notes: Applied linear algebra Part 1. Version 2 Michael Karow Berlin University of Technology karow@math.tu-berlin.de October 2, 2008 1 Notation, basic notions and facts 1.1 Subspaces, range and

More information

MATH SOLUTIONS TO PRACTICE MIDTERM LECTURE 1, SUMMER Given vector spaces V and W, V W is the vector space given by

MATH SOLUTIONS TO PRACTICE MIDTERM LECTURE 1, SUMMER Given vector spaces V and W, V W is the vector space given by MATH 110 - SOLUTIONS TO PRACTICE MIDTERM LECTURE 1, SUMMER 2009 GSI: SANTIAGO CAÑEZ 1. Given vector spaces V and W, V W is the vector space given by V W = {(v, w) v V and w W }, with addition and scalar

More information

Linear Algebra Highlights

Linear Algebra Highlights Linear Algebra Highlights Chapter 1 A linear equation in n variables is of the form a 1 x 1 + a 2 x 2 + + a n x n. We can have m equations in n variables, a system of linear equations, which we want to

More information

Schur s Triangularization Theorem. Math 422

Schur s Triangularization Theorem. Math 422 Schur s Triangularization Theorem Math 4 The characteristic polynomial p (t) of a square complex matrix A splits as a product of linear factors of the form (t λ) m Of course, finding these factors is a

More information

1. Foundations of Numerics from Advanced Mathematics. Linear Algebra

1. Foundations of Numerics from Advanced Mathematics. Linear Algebra Foundations of Numerics from Advanced Mathematics Linear Algebra Linear Algebra, October 23, 22 Linear Algebra Mathematical Structures a mathematical structure consists of one or several sets and one or

More information

Linear Algebra. Workbook

Linear Algebra. Workbook Linear Algebra Workbook Paul Yiu Department of Mathematics Florida Atlantic University Last Update: November 21 Student: Fall 2011 Checklist Name: A B C D E F F G H I J 1 2 3 4 5 6 7 8 9 10 xxx xxx xxx

More information

Symmetric and self-adjoint matrices

Symmetric and self-adjoint matrices Symmetric and self-adjoint matrices A matrix A in M n (F) is called symmetric if A T = A, ie A ij = A ji for each i, j; and self-adjoint if A = A, ie A ij = A ji or each i, j Note for A in M n (R) that

More information

Lecture 10: Eigenvectors and eigenvalues (Numerical Recipes, Chapter 11)

Lecture 10: Eigenvectors and eigenvalues (Numerical Recipes, Chapter 11) Lecture 1: Eigenvectors and eigenvalues (Numerical Recipes, Chapter 11) The eigenvalue problem, Ax= λ x, occurs in many, many contexts: classical mechanics, quantum mechanics, optics 22 Eigenvectors and

More information

1 Linear Algebra Problems

1 Linear Algebra Problems Linear Algebra Problems. Let A be the conjugate transpose of the complex matrix A; i.e., A = A t : A is said to be Hermitian if A = A; real symmetric if A is real and A t = A; skew-hermitian if A = A and

More information

LECTURE VI: SELF-ADJOINT AND UNITARY OPERATORS MAT FALL 2006 PRINCETON UNIVERSITY

LECTURE VI: SELF-ADJOINT AND UNITARY OPERATORS MAT FALL 2006 PRINCETON UNIVERSITY LECTURE VI: SELF-ADJOINT AND UNITARY OPERATORS MAT 204 - FALL 2006 PRINCETON UNIVERSITY ALFONSO SORRENTINO 1 Adjoint of a linear operator Note: In these notes, V will denote a n-dimensional euclidean vector

More information

LINEAR ALGEBRA BOOT CAMP WEEK 1: THE BASICS

LINEAR ALGEBRA BOOT CAMP WEEK 1: THE BASICS LINEAR ALGEBRA BOOT CAMP WEEK 1: THE BASICS Unless otherwise stated, all vector spaces in this worksheet are finite dimensional and the scalar field F has characteristic zero. The following are facts (in

More information

Eigenvalues and Eigenvectors

Eigenvalues and Eigenvectors /88 Chia-Ping Chen Department of Computer Science and Engineering National Sun Yat-sen University Linear Algebra Eigenvalue Problem /88 Eigenvalue Equation By definition, the eigenvalue equation for matrix

More information

Linear Algebra Review. Vectors

Linear Algebra Review. Vectors Linear Algebra Review 9/4/7 Linear Algebra Review By Tim K. Marks UCSD Borrows heavily from: Jana Kosecka http://cs.gmu.edu/~kosecka/cs682.html Virginia de Sa (UCSD) Cogsci 8F Linear Algebra review Vectors

More information

MATRICES ARE SIMILAR TO TRIANGULAR MATRICES

MATRICES ARE SIMILAR TO TRIANGULAR MATRICES MATRICES ARE SIMILAR TO TRIANGULAR MATRICES 1 Complex matrices Recall that the complex numbers are given by a + ib where a and b are real and i is the imaginary unity, ie, i 2 = 1 In what we describe below,

More information

NOTES ON BILINEAR FORMS

NOTES ON BILINEAR FORMS NOTES ON BILINEAR FORMS PARAMESWARAN SANKARAN These notes are intended as a supplement to the talk given by the author at the IMSc Outreach Programme Enriching Collegiate Education-2015. Symmetric bilinear

More information

W2 ) = dim(w 1 )+ dim(w 2 ) for any two finite dimensional subspaces W 1, W 2 of V.

W2 ) = dim(w 1 )+ dim(w 2 ) for any two finite dimensional subspaces W 1, W 2 of V. MA322 Sathaye Final Preparations Spring 2017 The final MA 322 exams will be given as described in the course web site (following the Registrar s listing. You should check and verify that you do not have

More information

Mathematics 1. Part II: Linear Algebra. Exercises and problems

Mathematics 1. Part II: Linear Algebra. Exercises and problems Bachelor Degree in Informatics Engineering Barcelona School of Informatics Mathematics Part II: Linear Algebra Eercises and problems February 5 Departament de Matemàtica Aplicada Universitat Politècnica

More information

MATH 304 Linear Algebra Lecture 34: Review for Test 2.

MATH 304 Linear Algebra Lecture 34: Review for Test 2. MATH 304 Linear Algebra Lecture 34: Review for Test 2. Topics for Test 2 Linear transformations (Leon 4.1 4.3) Matrix transformations Matrix of a linear mapping Similar matrices Orthogonality (Leon 5.1

More information

Algebra II. Paulius Drungilas and Jonas Jankauskas

Algebra II. Paulius Drungilas and Jonas Jankauskas Algebra II Paulius Drungilas and Jonas Jankauskas Contents 1. Quadratic forms 3 What is quadratic form? 3 Change of variables. 3 Equivalence of quadratic forms. 4 Canonical form. 4 Normal form. 7 Positive

More information

Lecture 2: Linear operators

Lecture 2: Linear operators Lecture 2: Linear operators Rajat Mittal IIT Kanpur The mathematical formulation of Quantum computing requires vector spaces and linear operators So, we need to be comfortable with linear algebra to study

More information

MATH 20F: LINEAR ALGEBRA LECTURE B00 (T. KEMP)

MATH 20F: LINEAR ALGEBRA LECTURE B00 (T. KEMP) MATH 20F: LINEAR ALGEBRA LECTURE B00 (T KEMP) Definition 01 If T (x) = Ax is a linear transformation from R n to R m then Nul (T ) = {x R n : T (x) = 0} = Nul (A) Ran (T ) = {Ax R m : x R n } = {b R m

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

Linear Algebra March 16, 2019

Linear Algebra March 16, 2019 Linear Algebra March 16, 2019 2 Contents 0.1 Notation................................ 4 1 Systems of linear equations, and matrices 5 1.1 Systems of linear equations..................... 5 1.2 Augmented

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

Econ 204 Supplement to Section 3.6 Diagonalization and Quadratic Forms. 1 Diagonalization and Change of Basis

Econ 204 Supplement to Section 3.6 Diagonalization and Quadratic Forms. 1 Diagonalization and Change of Basis Econ 204 Supplement to Section 3.6 Diagonalization and Quadratic Forms De La Fuente notes that, if an n n matrix has n distinct eigenvalues, it can be diagonalized. In this supplement, we will provide

More information

October 25, 2013 INNER PRODUCT SPACES

October 25, 2013 INNER PRODUCT SPACES October 25, 2013 INNER PRODUCT SPACES RODICA D. COSTIN Contents 1. Inner product 2 1.1. Inner product 2 1.2. Inner product spaces 4 2. Orthogonal bases 5 2.1. Existence of an orthogonal basis 7 2.2. Orthogonal

More information

Math113: Linear Algebra. Beifang Chen

Math113: Linear Algebra. Beifang Chen Math3: Linear Algebra Beifang Chen Spring 26 Contents Systems of Linear Equations 3 Systems of Linear Equations 3 Linear Systems 3 2 Geometric Interpretation 3 3 Matrices of Linear Systems 4 4 Elementary

More information

LINEAR ALGEBRA SUMMARY SHEET.

LINEAR ALGEBRA SUMMARY SHEET. LINEAR ALGEBRA SUMMARY SHEET RADON ROSBOROUGH https://intuitiveexplanationscom/linear-algebra-summary-sheet/ This document is a concise collection of many of the important theorems of linear algebra, organized

More information

Linear Algebra Review

Linear Algebra Review Chapter 1 Linear Algebra Review It is assumed that you have had a course in linear algebra, and are familiar with matrix multiplication, eigenvectors, etc. I will review some of these terms here, but quite

More information

Categories and Quantum Informatics: Hilbert spaces

Categories and Quantum Informatics: Hilbert spaces Categories and Quantum Informatics: Hilbert spaces Chris Heunen Spring 2018 We introduce our main example category Hilb by recalling in some detail the mathematical formalism that underlies quantum theory:

More information

Homework 11 Solutions. Math 110, Fall 2013.

Homework 11 Solutions. Math 110, Fall 2013. Homework 11 Solutions Math 110, Fall 2013 1 a) Suppose that T were self-adjoint Then, the Spectral Theorem tells us that there would exist an orthonormal basis of P 2 (R), (p 1, p 2, p 3 ), consisting

More information

Part 1a: Inner product, Orthogonality, Vector/Matrix norm

Part 1a: Inner product, Orthogonality, Vector/Matrix norm Part 1a: Inner product, Orthogonality, Vector/Matrix norm September 19, 2018 Numerical Linear Algebra Part 1a September 19, 2018 1 / 16 1. Inner product on a linear space V over the number field F A map,

More information

Chapter 2 Linear Transformations

Chapter 2 Linear Transformations Chapter 2 Linear Transformations Linear Transformations Loosely speaking, a linear transformation is a function from one vector space to another that preserves the vector space operations. Let us be more

More information

LinGloss. A glossary of linear algebra

LinGloss. A glossary of linear algebra LinGloss A glossary of linear algebra Contents: Decompositions Types of Matrices Theorems Other objects? Quasi-triangular A matrix A is quasi-triangular iff it is a triangular matrix except its diagonal

More information

Linear System Theory

Linear System Theory Linear System Theory Wonhee Kim Lecture 4 Apr. 4, 2018 1 / 40 Recap Vector space, linear space, linear vector space Subspace Linearly independence and dependence Dimension, Basis, Change of Basis 2 / 40

More information

Lecture 8 : Eigenvalues and Eigenvectors

Lecture 8 : Eigenvalues and Eigenvectors CPS290: Algorithmic Foundations of Data Science February 24, 2017 Lecture 8 : Eigenvalues and Eigenvectors Lecturer: Kamesh Munagala Scribe: Kamesh Munagala Hermitian Matrices It is simpler to begin with

More information

MAT 445/ INTRODUCTION TO REPRESENTATION THEORY

MAT 445/ INTRODUCTION TO REPRESENTATION THEORY MAT 445/1196 - INTRODUCTION TO REPRESENTATION THEORY CHAPTER 1 Representation Theory of Groups - Algebraic Foundations 1.1 Basic definitions, Schur s Lemma 1.2 Tensor products 1.3 Unitary representations

More information

homogeneous 71 hyperplane 10 hyperplane 34 hyperplane 69 identity map 171 identity map 186 identity map 206 identity matrix 110 identity matrix 45

homogeneous 71 hyperplane 10 hyperplane 34 hyperplane 69 identity map 171 identity map 186 identity map 206 identity matrix 110 identity matrix 45 address 12 adjoint matrix 118 alternating 112 alternating 203 angle 159 angle 33 angle 60 area 120 associative 180 augmented matrix 11 axes 5 Axiom of Choice 153 basis 178 basis 210 basis 74 basis test

More information

MATH 23a, FALL 2002 THEORETICAL LINEAR ALGEBRA AND MULTIVARIABLE CALCULUS Solutions to Final Exam (in-class portion) January 22, 2003

MATH 23a, FALL 2002 THEORETICAL LINEAR ALGEBRA AND MULTIVARIABLE CALCULUS Solutions to Final Exam (in-class portion) January 22, 2003 MATH 23a, FALL 2002 THEORETICAL LINEAR ALGEBRA AND MULTIVARIABLE CALCULUS Solutions to Final Exam (in-class portion) January 22, 2003 1. True or False (28 points, 2 each) T or F If V is a vector space

More information

Algebra I Fall 2007

Algebra I Fall 2007 MIT OpenCourseWare http://ocw.mit.edu 18.701 Algebra I Fall 007 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms. 18.701 007 Geometry of the Special Unitary

More information

Definitions for Quizzes

Definitions for Quizzes Definitions for Quizzes Italicized text (or something close to it) will be given to you. Plain text is (an example of) what you should write as a definition. [Bracketed text will not be given, nor does

More information

NATIONAL UNIVERSITY OF SINGAPORE DEPARTMENT OF MATHEMATICS SEMESTER 2 EXAMINATION, AY 2010/2011. Linear Algebra II. May 2011 Time allowed :

NATIONAL UNIVERSITY OF SINGAPORE DEPARTMENT OF MATHEMATICS SEMESTER 2 EXAMINATION, AY 2010/2011. Linear Algebra II. May 2011 Time allowed : NATIONAL UNIVERSITY OF SINGAPORE DEPARTMENT OF MATHEMATICS SEMESTER 2 EXAMINATION, AY 2010/2011 Linear Algebra II May 2011 Time allowed : 2 hours INSTRUCTIONS TO CANDIDATES 1. This examination paper contains

More information

Chapter 4 Euclid Space

Chapter 4 Euclid Space Chapter 4 Euclid Space Inner Product Spaces Definition.. Let V be a real vector space over IR. A real inner product on V is a real valued function on V V, denoted by (, ), which satisfies () (x, y) = (y,

More information

1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det

1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det What is the determinant of the following matrix? 3 4 3 4 3 4 4 3 A 0 B 8 C 55 D 0 E 60 If det a a a 3 b b b 3 c c c 3 = 4, then det a a 4a 3 a b b 4b 3 b c c c 3 c = A 8 B 6 C 4 D E 3 Let A be an n n matrix

More information

MATH 583A REVIEW SESSION #1

MATH 583A REVIEW SESSION #1 MATH 583A REVIEW SESSION #1 BOJAN DURICKOVIC 1. Vector Spaces Very quick review of the basic linear algebra concepts (see any linear algebra textbook): (finite dimensional) vector space (or linear space),

More information

Quantum Computing Lecture 2. Review of Linear Algebra

Quantum Computing Lecture 2. Review of Linear Algebra Quantum Computing Lecture 2 Review of Linear Algebra Maris Ozols Linear algebra States of a quantum system form a vector space and their transformations are described by linear operators Vector spaces

More information

4.2. ORTHOGONALITY 161

4.2. ORTHOGONALITY 161 4.2. ORTHOGONALITY 161 Definition 4.2.9 An affine space (E, E ) is a Euclidean affine space iff its underlying vector space E is a Euclidean vector space. Given any two points a, b E, we define the distance

More information

MATH 115A: SAMPLE FINAL SOLUTIONS

MATH 115A: SAMPLE FINAL SOLUTIONS MATH A: SAMPLE FINAL SOLUTIONS JOE HUGHES. Let V be the set of all functions f : R R such that f( x) = f(x) for all x R. Show that V is a vector space over R under the usual addition and scalar multiplication

More information

MATH 1120 (LINEAR ALGEBRA 1), FINAL EXAM FALL 2011 SOLUTIONS TO PRACTICE VERSION

MATH 1120 (LINEAR ALGEBRA 1), FINAL EXAM FALL 2011 SOLUTIONS TO PRACTICE VERSION MATH (LINEAR ALGEBRA ) FINAL EXAM FALL SOLUTIONS TO PRACTICE VERSION Problem (a) For each matrix below (i) find a basis for its column space (ii) find a basis for its row space (iii) determine whether

More information

Final A. Problem Points Score Total 100. Math115A Nadja Hempel 03/23/2017

Final A. Problem Points Score Total 100. Math115A Nadja Hempel 03/23/2017 Final A Math115A Nadja Hempel 03/23/2017 nadja@math.ucla.edu Name: UID: Problem Points Score 1 10 2 20 3 5 4 5 5 9 6 5 7 7 8 13 9 16 10 10 Total 100 1 2 Exercise 1. (10pt) Let T : V V be a linear transformation.

More information

FORMS ON INNER PRODUCT SPACES

FORMS ON INNER PRODUCT SPACES FORMS ON INNER PRODUCT SPACES MARIA INFUSINO PROSEMINAR ON LINEAR ALGEBRA WS2016/2017 UNIVERSITY OF KONSTANZ Abstract. This note aims to give an introduction on forms on inner product spaces and their

More information

Math Final December 2006 C. Robinson

Math Final December 2006 C. Robinson Math 285-1 Final December 2006 C. Robinson 2 5 8 5 1 2 0-1 0 1. (21 Points) The matrix A = 1 2 2 3 1 8 3 2 6 has the reduced echelon form U = 0 0 1 2 0 0 0 0 0 1. 2 6 1 0 0 0 0 0 a. Find a basis for the

More information

Representation Theory

Representation Theory Representation Theory Representations Let G be a group and V a vector space over a field k. A representation of G on V is a group homomorphism ρ : G Aut(V ). The degree (or dimension) of ρ is just dim

More information