(VII.B) Bilinear Forms


There are two standard generalizations of the dot product on $\mathbb{R}^n$ to arbitrary vector spaces. The idea is to have a product that takes two vectors to a scalar. Bilinear forms are the more general version, and inner products (a kind of bilinear form) the more specific: they are the ones that look like the dot product (= "Euclidean inner product") with respect to some basis. Here we tour the zoo of various classes of bilinear forms, before narrowing our focus to inner products in the next section.

Now let $V/\mathbb{R}$ be an $n$-dimensional vector space.

DEFINITION 1. A bilinear form is a function $B : V \times V \to \mathbb{R}$ such that
$$B(au + bv,\, w) = aB(u,w) + bB(v,w),$$
$$B(u,\, av + bw) = aB(u,v) + bB(u,w)$$
for all $u, v, w \in V$. It is called symmetric if also $B(u,w) = B(w,u)$.

Now let us lay down the following law: whenever we are considering a bilinear form $B$ on a vector space $V$ (even if $V = \mathbb{R}^n$!), all orthogonality (or orthonormality) is relative to $B$: $u \perp w$ means $B(u,w) = 0$. Sounds fine, huh?

EXAMPLE 2. Take a look at the following (non-symmetric) bilinear form $B$ on $\mathbb{R}^2$:
$$B(x,y) = \begin{pmatrix} x_1 & x_2 \end{pmatrix} \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} y_1 \\ y_2 \end{pmatrix} = x_2\, y_1 .$$
This is absolutely sick, because under this bilinear form we have $\hat e_1 \perp \hat e_2$, but it is not true that $\hat e_2 \perp \hat e_1$! So we usually consider symmetric bilinear forms.
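This pathology is easy to check numerically. The following sketch (using numpy, which is of course not part of the text) evaluates $B$ on the standard basis vectors; the names `M` and `B` are ours, purely for illustration.

```python
import numpy as np

# Matrix of the (non-symmetric) bilinear form B(x, y) = x_2 * y_1 on R^2.
M = np.array([[0.0, 0.0],
              [1.0, 0.0]])

def B(x, y):
    """Evaluate the bilinear form via its matrix: B(x, y) = x^T M y."""
    return x @ M @ y

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# Orthogonality relative to B fails to be a symmetric relation here:
print(B(e1, e2))  # 0.0  -> e1 is B-orthogonal to e2 ...
print(B(e2, e1))  # 1.0  -> ... but e2 is not B-orthogonal to e1
```

The point of the example survives the computation: $\perp$ defined by a non-symmetric form need not be a symmetric relation.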

Try instead
$$B(x,y) = \begin{pmatrix} x_1 & x_2 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} \begin{pmatrix} y_1 \\ y_2 \end{pmatrix} = x_1 y_1 - x_2 y_2;$$
$B$ is symmetric, but the notion of "norm" is not what we are used to from the dot product: $B(\hat e_2, \hat e_2) = -1$, while $B\big(\binom{1}{1}, \binom{1}{1}\big) = 0$; that is, under $B$ there exists a nonzero self-perpendicular vector $\binom{1}{1}$. Such pathologies with general bilinear forms are routine; the point of defining inner products in VII.C will be to get rid of them.

The Matrix of a Bilinear Form. Consider a basis $\mathcal{B} = \{v_1, \dots, v_n\}$ for $V$ and write $u = \sum u_i v_i$, $w = \sum w_i v_i$. Let $B$ be any bilinear form on $V$ and set $b_{ij} = B(v_i, v_j)$; we have
$$B(u,w) = B\Big(\sum_i u_i v_i,\ \sum_j w_j v_j\Big) = \sum_{i,j} u_i\, b_{ij}\, w_j = \begin{pmatrix} u_1 & \cdots & u_n \end{pmatrix} \begin{pmatrix} b_{11} & \cdots & b_{1n} \\ \vdots & & \vdots \\ b_{n1} & \cdots & b_{nn} \end{pmatrix} \begin{pmatrix} w_1 \\ \vdots \\ w_n \end{pmatrix} =: {}^t[u]_{\mathcal{B}}\,[B]^{\mathcal{B}}\,[w]_{\mathcal{B}},$$
defining a matrix for $B$. We write the basis $\mathcal{B}$ as a superscript, because basis change does not work the same way as for transformation matrices:
$${}^t([u]_{\mathcal{B}})\,[B]^{\mathcal{B}}\,([w]_{\mathcal{B}}) = {}^t\big(S^{\mathcal{B}}_{\mathcal{C}}[u]_{\mathcal{C}}\big)\,[B]^{\mathcal{B}}\,\big(S^{\mathcal{B}}_{\mathcal{C}}[w]_{\mathcal{C}}\big) = {}^t[u]_{\mathcal{C}}\,\big({}^tS^{\mathcal{B}}_{\mathcal{C}}\,[B]^{\mathcal{B}}\,S^{\mathcal{B}}_{\mathcal{C}}\big)\,[w]_{\mathcal{C}} \implies [B]^{\mathcal{C}} = {}^tS\,[B]^{\mathcal{B}}\,S.$$
Matrices related in this fashion (where $S$ is invertible) are said to be cogredient.

However, we warn the reader that the only bilinear form or inner product on $\mathbb{R}^n$ for which $B(\hat e_i, \hat e_j) = \delta_{ij}$ (i.e. under which $\hat e$ is an orthonormal basis) is the dot product! In order to avoid confusion coming from $\hat e$ we will often work in a more abstract setting than $\mathbb{R}^n$.
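The cogredience rule $[B]^{\mathcal{C}} = {}^tS\,[B]^{\mathcal{B}}S$ can be spot-checked numerically. In this hedged sketch the matrices `M_B` and `S` are arbitrary choices of ours, not examples from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Matrix of a bilinear form in the basis B, and an invertible change-of-basis
# matrix S taking C-coordinates to B-coordinates (random/handpicked examples).
M_B = rng.integers(-3, 4, size=(3, 3)).astype(float)
S = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])   # det = 3, so S is invertible

# Cogredience: the matrix of the same form in the basis C is  S^T M_B S.
M_C = S.T @ M_B @ S

# Check against the defining property B(u, w) = [u]^T [B] [w] in either basis:
u_C = rng.standard_normal(3)
w_C = rng.standard_normal(3)
u_B, w_B = S @ u_C, S @ w_C
assert np.isclose(u_B @ M_B @ w_B, u_C @ M_C @ w_C)
```

Note the contrast with transformation matrices, which change by conjugation $S^{-1} A S$ rather than by ${}^tS\,A\,S$.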

Symmetric Bilinear Forms. Next let $B$ be an arbitrary symmetric bilinear form, so that $\perp$ is a symmetric relation and the subspace
$$V_{\perp w} = \{v \in V \mid B(v,w) = B(w,v) = 0\}$$
makes sense. Moreover, the matrix of such a form (with respect to any basis) is symmetric, that is, ${}^t[B]^{\mathcal{B}} = [B]^{\mathcal{B}}$; and we now explain how to put it into an especially nice form.

LEMMA 3. There exists a basis $\mathcal{B}$ such that
$$[B]^{\mathcal{B}} = \mathrm{diag}\{b_1, \dots, b_r, 0, \dots, 0\},$$
where the $b_i \neq 0$.

PROOF. By induction on $n = \dim V$ (clear for $n = 1$): assume the result for $(n-1)$-dimensional spaces and prove it for $V$. If $B(u,w) = 0$ for all $u, w \in V$, then $[B]^{\mathcal{B}} = 0$ in any basis and we are done. Moreover, if $B(v,v) = 0$ for all $v \in V$, then (using symmetry and bilinearity)
$$2B(u,w) = B(u,w) + B(w,u) = B(u+w,\, u+w) - B(u,u) - B(w,w) = 0$$
for all $u, w$, and we're done once more. So assume otherwise: that there exists $v_1 \in V$ such that $B(v_1, v_1) = b_1 \neq 0$. Clearly $B(v_1, v_1) \neq 0 \implies$ (i) $\mathrm{span}\{v_1\} \cap V_{\perp v_1} = \{0\}$. Furthermore, (ii) $V = \mathrm{span}\{v_1\} + V_{\perp v_1}$: any $w \in V$ can be written
$$w = \big(w - B(w, v_1)\, b_1^{-1}\, v_1\big) + B(w, v_1)\, b_1^{-1}\, v_1,$$
where the second term is in $\mathrm{span}\{v_1\}$, and the first is in $V_{\perp v_1}$ since
$$B(\text{first term},\, v_1) = B(w, v_1) - B(w, v_1)\, b_1^{-1}\, B(v_1, v_1) = 0.$$
Combining (i) and (ii), $V = \mathrm{span}\{v_1\} \oplus V_{\perp v_1}$. Applying the inductive hypothesis to $V_{\perp v_1}$ establishes the existence of a basis $\{v_2, \dots, v_n\}$ with $B(v_i, v_j) = b_i \delta_{ij}$ ($i, j \geq 2$). Combining this with the direct-sum decomposition of $V$,
$$\mathcal{B} = \{v_1, v_2, \dots, v_n\}$$

is a basis for $V$. By definition of $V_{\perp v_1}$, $B(v_1, v_j) = B(v_j, v_1) = 0$ for $j \geq 2$. Therefore $B(v_i, v_j) = b_i \delta_{ij}$ for all $i, j$, and we're through. $\square$

Reorder the basis elements so that
$$[B]^{\mathcal{B}} = \mathrm{diag}\{\underbrace{b_1, \dots, b_p}_{>0},\ \underbrace{b_{p+1}, \dots, b_{p+q}}_{<0},\ 0, \dots, 0\}$$
and then normalize: take
$$\mathcal{B}' = \Big\{ \tfrac{v_1}{\sqrt{b_1}}, \dots, \tfrac{v_p}{\sqrt{b_p}};\ \tfrac{v_{p+1}}{\sqrt{-b_{p+1}}}, \dots, \tfrac{v_{p+q}}{\sqrt{-b_{p+q}}};\ v_{p+q+1}, \dots, v_n \Big\},$$
so that
$$[B]^{\mathcal{B}'} = \mathrm{diag}\{\underbrace{1, \dots, 1}_{p},\ \underbrace{-1, \dots, -1}_{q},\ 0, \dots, 0\}.$$
Suppose for some other basis $\mathcal{C} = \{w_1, \dots, w_n\}$,
$$[B]^{\mathcal{C}} = \mathrm{diag}\{\underbrace{c_1, \dots, c_{p'}}_{>0},\ \underbrace{c_{p'+1}, \dots, c_{p'+q'}}_{<0},\ 0, \dots, 0\};$$
are $p, q$ and $p', q'$ necessarily the same? Since $[B]^{\mathcal{C}}$ and $[B]^{\mathcal{B}'}$ are cogredient they must have the same rank, $p + q = p' + q'$. Now any nonzero $u \in \mathrm{span}\{v_1, \dots, v_p\}$ has $B(u,u) > 0$, while a nonzero $u \in \mathrm{span}\{w_{p'+1}, \dots, w_n\}$ has $B(u,u) \leq 0$. Therefore the only intersection of these two spans can be at $\{0\}$, which says the sum of their dimensions cannot exceed $\dim V$: $p + (n - p') \leq n$, which implies $p \leq p'$. An exactly symmetric argument (interchanging the $v$'s and $w$'s) shows that $p' \leq p$. So $p = p'$, and $q = q'$ follows immediately.

We have proved

THEOREM 4 (Sylvester's Theorem). Given a symmetric bilinear form $B$ on $V/\mathbb{R}$, there exists a basis $\mathcal{B} = \{v_1, \dots, v_n\}$ for $V$ such that $B(v_i, v_j) = 0$ for $i \neq j$, and each $B(v_i, v_i) = +1$, $-1$, or $0$, where the number of $+1$'s, $-1$'s and $0$'s is well-defined (another such choice of basis will not change these

numbers). The corresponding statement for the matrix of $B$ gives rise to the following

COROLLARY 5. Any symmetric real matrix is cogredient to exactly one matrix of the form $\mathrm{diag}\{+1, \dots, +1, -1, \dots, -1, 0, \dots, 0\}$.

For a given symmetric bilinear form (or symmetric matrix), we call the number of $+1$'s and $-1$'s in the form guaranteed by the Theorem (or Corollary) the index of positivity ($= p$) and index of negativity ($= q$) of $B$ (or $[B]$). Their sum $r = p + q$ is (for obvious reasons) referred to as the rank, and the pair $(p,q)$ (or triple $(p, q, n-r)$, or sometimes the difference $p - q$) is referred to as the signature of $B$ (or $[B]$).

Anti-symmetric bilinear forms. Now if $B : V \times V \to \mathbb{R}$ is anti-symmetric (or "alternating"), i.e.
$$B(u,v) = -B(v,u) \quad \text{for all } u, v \in V,$$
then $\perp$ is still a symmetric relation. In this case the matrix of $B$ with respect to any basis is skew-symmetric, ${}^t[B]^{\mathcal{B}} = -[B]^{\mathcal{B}}$, and the analogue of Sylvester's Theorem is actually simpler:

PROPOSITION 6. There exists a basis $\mathcal{B}$ such that (in terms of matrix blocks)
$$[B]^{\mathcal{B}} = \begin{pmatrix} 0 & I_m & 0 \\ -I_m & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix},$$
where $r = 2m \leq n$ is the rank of $B$.

REMARK 7. This matrix can be written more succinctly as
$$[B]^{\mathcal{B}} = \begin{pmatrix} J_{2m} & 0 \\ 0 & 0 \end{pmatrix},$$

where we define
$$(1) \qquad J_{2m} := \begin{pmatrix} 0 & I_m \\ -I_m & 0 \end{pmatrix}.$$

PROOF OF PROPOSITION 6. We induce on $n$: clearly the result holds ($B = 0$) if $n = 1$, since $B(v,v) = -B(v,v)$ must be $0$. Now assume the result is known for all dimensions less than $n$. If $V$ (of dimension $n$) contains any vector $v$ with $B(v,w) = 0$ $\forall\, w \in V$, take $v_n := v$. Let $U \subset V$ be any $(n-1)$-dimensional subspace with $V = \mathrm{span}\{v\} \oplus U$, and obtain the rest of the basis by applying induction to $U$.

If $V$ does not contain such a vector, then the columns of $[B]$ (with respect to any basis) are independent and $\det([B]) \neq 0$, with the immediate consequence that $n = 2m$ is even (by Exercise IV.B.4). So choose $v_{2m} \in V \setminus \{0\}$ arbitrarily, and $v_m \in V$ so that $B(v_{2m}, v_m) = 1$. Writing $W := \mathrm{span}\{v_m, v_{2m}\}$ and $W^{\perp} := \{v \in V \mid B(v,w) = 0\ \forall\, w \in W\}$, we claim that $V = W \oplus W^{\perp}$. Indeed, if $w \in W \cap W^{\perp}$ then $w = a v_m + b v_{2m}$ has $0 = B(w, v_m) = b$ and $0 = B(w, v_{2m}) = -a$, so $w = 0$. Moreover, given $v \in V$ we may consider
$$w := B(v, v_m)\, v_{2m} - B(v, v_{2m})\, v_m \ \in W$$
and $w' := v - w$, which satisfies
$$B(w', v_m) = B(v, v_m) - B(v, v_m)\underbrace{B(v_{2m}, v_m)}_{1} + B(v, v_{2m})\underbrace{B(v_m, v_m)}_{0} = 0$$
and
$$B(w', v_{2m}) = B(v, v_{2m}) - B(v, v_m)\underbrace{B(v_{2m}, v_{2m})}_{0} + B(v, v_{2m})\underbrace{B(v_m, v_{2m})}_{-1} = 0.$$
Thus $w' \in W^{\perp}$, $v = w + w'$, and the claim is proved. Now apply induction to $W^{\perp}$ to get the remaining $2m - 2$ basis elements $v_1, \dots, v_{m-1}, v_{m+1}, \dots, v_{2m-1}$. $\square$

Hermitian symmetric forms. Given an $n$-dimensional complex vector space $V/\mathbb{C}$, a Hermitian-linear (or sesquilinear²) form $H : V \times$

² from the Latin for "one and a half", since it's only conjugate-linear in the first entry

$V \to \mathbb{C}$ satisfies
$$H(\alpha u, w) = \bar\alpha\, H(u,w), \qquad H(u, \alpha w) = \alpha\, H(u,w),$$
$$H(u_1 + u_2, w) = H(u_1, w) + H(u_2, w), \qquad H(u, w_1 + w_2) = H(u, w_1) + H(u, w_2).$$
It is Hermitian symmetric if it also satisfies $H(u,w) = \overline{H(w,u)}$; we will often call such forms simply Hermitian. The passage to matrices (where as usual $\mathcal{B} = \{v_1, \dots, v_n\}$, $u = \sum u_i v_i$, $w = \sum w_j v_j$, $h_{ij} = H(v_i, v_j)$) looks like
$$H(u,w) = \sum_{i,j} \bar u_i\, h_{ij}\, w_j = \overline{{}^t[u]_{\mathcal{B}}}\,[H]^{\mathcal{B}}\,[w]_{\mathcal{B}};$$
the relevant version of cogredience for coordinate transformations is ${}^t\bar S\,[H]\,S$. If $H$ is Hermitian symmetric then $h_{ij} = \overline{h_{ji}}$; that is, $[H]^{\mathcal{B}} = \big([H]^{\mathcal{B}}\big)^*$ is a Hermitian symmetric matrix. For such forms/matrices Sylvester holds verbatim.

For the purpose of studying inner product spaces in the next two sections, you can think of Hermitian symmetric matrices ($A = {}^t\bar A$) as being the right complex generalization of the real symmetric matrices (which they include), and similarly for the forms. But there is an important caveat here: Hermitian forms aren't really bilinear over $\mathbb{C}$, due to their conjugate-linearity in the first argument. If we view $V$ as a $2n$-dimensional vector space over $\mathbb{R}$, then they are bilinear over $\mathbb{R}$. In fact, if we go that route, then we may view
$$H = B_{\mathrm{re}} + i\, B_{\mathrm{im}} : \mathbb{R}^{2n} \times \mathbb{R}^{2n} \to \mathbb{C} = \mathbb{R} \oplus i\mathbb{R}$$
as splitting up into two bilinear forms over $\mathbb{R}$. Moreover, since
$$B_{\mathrm{re}}(v,u) + i\, B_{\mathrm{im}}(v,u) = H(v,u) = \overline{H(u,v)} = B_{\mathrm{re}}(u,v) - i\, B_{\mathrm{im}}(u,v)$$
for all $u, v \in \mathbb{R}^{2n}$, $B_{\mathrm{re}}$ is symmetric and $B_{\mathrm{im}}$ is anti-symmetric.

Now given a basis $\mathcal{B} = \{v_1, \dots, v_n\}$ for $V$ as a $\mathbb{C}$-vector space,
$$\mathcal{B}' := \{v_1, \dots, v_n,\ i v_1, \dots, i v_n\} =: \{w_1, \dots, w_{2n}\}$$
is a basis for

$V$ as an $\mathbb{R}$-vector space. The matrix of³
$$\mathcal{J} := \{\text{multiplication by } i\} : V \to V$$
is clearly $[\mathcal{J}]_{\mathcal{B}'} = J_{2n}$ (as defined in (1)). The complex anti-linearity of $H$ in the first argument gives
$$B_{\mathrm{re}}(\mathcal{J}u, v) + i\, B_{\mathrm{im}}(\mathcal{J}u, v) = H(iu, v) = -i\, H(u,v) = B_{\mathrm{im}}(u,v) - i\, B_{\mathrm{re}}(u,v),$$
hence $B_{\mathrm{re}}(\mathcal{J}u, v) = B_{\mathrm{im}}(u,v)$ and $B_{\mathrm{im}}(\mathcal{J}u, v) = -B_{\mathrm{re}}(u,v)$. Writing $v = \mathcal{J}w$, this gives
$$B_{\mathrm{re}}(\mathcal{J}u, \mathcal{J}w) = B_{\mathrm{im}}(u, \mathcal{J}w) = -B_{\mathrm{im}}(\mathcal{J}w, u) = B_{\mathrm{re}}(w, u) = B_{\mathrm{re}}(u, w)$$
and similarly $B_{\mathrm{im}}(\mathcal{J}u, \mathcal{J}w) = B_{\mathrm{im}}(u,w)$. In this way one deduces that giving a Hermitian form on $V$ is equivalent to giving a (real) symmetric or antisymmetric form which is invariant under the complex structure map $\mathcal{J}$.

Finally, if $[H]^{\mathcal{B}} = \mathrm{diag}\{+1, \dots, +1, -1, \dots, -1, 0, \dots, 0\} =: D$, then one finds that
$$[B_{\mathrm{re}}]^{\mathcal{B}'} = \begin{pmatrix} D & 0 \\ 0 & D \end{pmatrix} \quad \text{and} \quad [B_{\mathrm{im}}]^{\mathcal{B}'} = \begin{pmatrix} 0 & D \\ -D & 0 \end{pmatrix};$$
that is, the signature of $B_{\mathrm{re}}$ is exactly double that of $H$.

Quadratic Forms. Now let's return to unambiguously real vector spaces.

DEFINITION 8. A quadratic form is a function $Q : V \to \mathbb{R}$ of the form $Q(u) = B(u,u)$, for some symmetric bilinear form $B$.

³ Conversely, given a $2n$-dimensional real vector space $V$ with a linear transformation $\mathcal{J}$ satisfying $\mathcal{J}^2 = -\mathrm{Id}$ (or equivalently, with matrix $J_{2n}$ in some basis), we call $\mathcal{J}$ a complex structure on $V$.
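Returning for a moment to the complex-structure discussion above: the block shapes of $[B_{\mathrm{re}}]$ and $[B_{\mathrm{im}}]$, and their invariance under $\mathcal{J}$, admit a quick numerical sanity check. This is a numpy sketch under the sign conventions worked out above, not part of the text.

```python
import numpy as np

n = 2
I, Z = np.eye(n), np.zeros((n, n))
J = np.block([[Z, I], [-I, Z]])          # J_{2n} from (1), here with 2n = 4

# J is a complex structure: J^2 = -Id.
assert np.allclose(J @ J, -np.eye(2 * n))

# Split a Hermitian form with matrix D = diag{+1, -1} into its real and
# imaginary parts in the basis {v_1, v_2, i v_1, i v_2}:
D = np.diag([1.0, -1.0])
B_re = np.block([[D, Z], [Z, D]])        # symmetric, signature doubled
B_im = np.block([[Z, D], [-D, Z]])       # anti-symmetric

# Both parts are invariant under the complex structure: B(Ju, Jw) = B(u, w).
assert np.allclose(J.T @ B_re @ J, B_re)
assert np.allclose(J.T @ B_im @ J, B_im)
```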

Let $\mathcal{B} = \{v_1, \dots, v_n\}$, use this basis to expand a given vector $u = \sum x_i v_i$, and set $b_{ij} = B(v_i, v_j)$ (= entries of $[B]^{\mathcal{B}}$); then
$$Q(u) = {}^t[u]_{\mathcal{B}}\,[B]^{\mathcal{B}}\,[u]_{\mathcal{B}} = \sum_{i,j} x_i\, b_{ij}\, x_j.$$
Basically this is a homogeneous quadratic function of several variables, with cross-terms (like $x_2 x_3$). Sylvester effectively says "so long" to the cross-terms. It gives us a new basis $\mathcal{C} = \{w_1, \dots, w_n\}$ such that
$$[B]^{\mathcal{C}} = \mathrm{diag}\{\underbrace{+1, \dots, +1}_{p},\ \underbrace{-1, \dots, -1}_{q},\ 0, \dots, 0\};$$
expanding $u = \sum y_i w_i$ in this new basis, we have
$$(2) \qquad Q(u) = {}^t[u]_{\mathcal{C}}\,[B]^{\mathcal{C}}\,[u]_{\mathcal{C}} = y_1^2 + \dots + y_p^2 - \big(y_{p+1}^2 + \dots + y_{p+q}^2\big).$$

Application to Calculus. Say $f : \mathbb{R}^n \to \mathbb{R}$ (nonlinear) has a stationary point at $\underline{0} = (0, \dots, 0)$:
$$\frac{\partial f}{\partial x_1}\Big|_{\underline 0} = \dots = \frac{\partial f}{\partial x_n}\Big|_{\underline 0} = 0.$$
Consider its Taylor expansion about $\underline 0$:
$$f(x_1, \dots, x_n) = \text{constant} + \sum_{i,j=1}^n x_i\, x_j\, b_{ij} + \text{higher-order terms},$$
where
$$b_{ij} = \frac{1}{2}\, \frac{\partial^2 f}{\partial x_i\, \partial x_j}\Big|_{\underline 0}.$$
Up to the harmless factor of $\frac12$, these are the entries of the so-called Hessian matrix of $f$ (evaluated at $\underline 0$); sometimes $\mathrm{Hess}(f)$ is just called the "matrix of second partials" in calculus texts. Now Sylvester's theorem, applied exactly as above, gives us a linear change of coordinates $(y_1, \dots, y_n)$ so that the second term in the Taylor series takes exactly the form (2). This is otherwise known as completing squares! If $p = n$ then we have $y_1^2 + \dots + y_n^2$ and a local minimum; if $q = n$ then $-y_1^2 - \dots - y_n^2$ and a local maximum. The other cases where $p + q = n$ are saddle points, and if $p + q < n$

($\mathrm{Hess}(f)|_{\underline 0}$ not of maximal rank) then one must look to the higher-order terms. Incidentally, in this latter case we say $\underline 0$ is a degenerate stationary point.

Is there any way you can tell what situation you're in without finding the basis guaranteed by Sylvester's theorem? Certainly if $\det(\mathrm{Hess}(f)|_{\underline 0}) \neq 0$ then $p + q = n$ and we have a nondegenerate critical point. Where do we go from there? If $n = 2$ (notice that this is the situation in your calculus text), then we can use the fact that determinants of cogredient matrices have the same sign:
$${}^tSAS = B \implies \det A\, (\det S)^2 = \det B.$$
If $\det(\mathrm{Hess}(f)|_{\underline 0}) > 0$ then either $(p,q) = (2,0)$ or $(0,2)$ (a $++$ or $--$ situation), which means a local extremum; if $\det(\mathrm{Hess}(f)|_{\underline 0}) < 0$ then we have $p = q = 1$ ($+-$) and a saddle point. More generally ($n > 2$), if $\mathrm{Hess}(f)|_{\underline 0}$ is diagonalizable over $\mathbb{R}$, that could completely substitute for Sylvester (how?). In fact, in the next section we will see that real symmetric matrices can always be diagonalized over $\mathbb{R}$!

Nondegenerate bilinear forms. A bilinear (or Hermitian) form $B$ on a vector space $V$ is said to be nondegenerate if and only if, for every nonzero vector $v \in V$, there exists $w \in V$ such that $B(v,w) \neq 0$. (Equivalently, the matrix of the form has nonzero determinant.)

A nondegenerate symmetric form $B$ is called orthogonal.⁴ If the signs in Sylvester's theorem are all positive (resp. negative), $B$ is positive (resp. negative) definite; otherwise, $B$ is indefinite of signature $(p,q)$, where $p + q = n$.

A nondegenerate anti-symmetric form $B$ is termed symplectic. In this case $\dim(V)$ is necessarily even, $n = 2m$, and (in some basis $\mathcal{B}$) $[B]^{\mathcal{B}} = J_{2m}$.

A nondegenerate Hermitian form $H$ is called unitary. The same terminology applies as in the orthogonal case.

⁴ Calling a form "orthogonal" or "unitary" is a bit nonstandard, but is more consistent with the standard use of "symplectic".
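The second-derivative test described above amounts to reading off the signature of the Hessian. In this hedged numpy sketch, eigenvalue signs stand in for Sylvester's basis (which the diagonalizability remark above permits); the function name is ours.

```python
import numpy as np

def classify_critical_point(hess, tol=1e-10):
    """Classify a stationary point from the signature (p, q) of its Hessian.
    Eigenvalue signs of a real symmetric matrix give Sylvester's p and q."""
    n = hess.shape[0]
    ev = np.linalg.eigvalsh(hess)
    p, q = int(np.sum(ev > tol)), int(np.sum(ev < -tol))
    if p + q < n:
        return "degenerate"       # must look at higher-order terms
    if p == n:
        return "local minimum"
    if q == n:
        return "local maximum"
    return "saddle point"

# Hessians at 0 of x^2 + y^2, -x^2 - y^2, x^2 - y^2, and x^2 respectively:
print(classify_critical_point(np.diag([2.0, 2.0])))    # local minimum
print(classify_critical_point(np.diag([-2.0, -2.0])))  # local maximum
print(classify_critical_point(np.diag([2.0, -2.0])))   # saddle point
print(classify_critical_point(np.diag([2.0, 0.0])))    # degenerate
```

For $n = 2$ this agrees with the determinant test in the text, since the determinant is the product of the eigenvalues.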

EXAMPLE 9. On $\mathbb{R}^4$ with "spacetime" coordinates $(x, y, z, t)$, we define an indefinite orthogonal form of signature $(3,1)$ via its quadratic form $x^2 + y^2 + z^2 - t^2$. The resulting "Minkowski space" is closely associated with the theory of special relativity. Nondegenerate symplectic and indefinite-orthogonal forms also play a central role in the topology of projective manifolds.

We conclude this section with a brief (and somewhat sketchy) account of the simplest of the so-called classical groups. This material will not be used in the remainder of the text.

Linear algebraic groups. These are those subgroups of $GL_n(\mathbb{C})$ or $GL_n(\mathbb{R})$, the general linear groups of invertible $n \times n$ matrices with entries in $\mathbb{C}$ or $\mathbb{R}$, that are defined by linear algebraic conditions. We'll only be interested in examples of reductive such groups, which are the ones with the property that whenever a vector subspace $W$ is closed under their action on a vector space $V$, there is another subspace $W'$ also closed under this action, such that $V = W \oplus W'$. In general, one can show (Chevalley's Theorem) that the reductive linear algebraic groups are the subgroups of $GL_n$ defined by the property of fixing a subalgebra of the tensor algebra⁵ $\bigoplus_{a,b} V^{\otimes a} \otimes (V^*)^{\otimes b}$, where $V = \mathbb{C}^n$ resp. $\mathbb{R}^n$. But that is a subject for a different course; here we'll just define a few of these groups, using only determinants and nondegenerate bilinear forms.

For $B$ orthogonal resp. symplectic, the corresponding orthogonal resp. symplectic group consists of all $g \in GL_n(F)$ ($F = \mathbb{R}$ or $\mathbb{C}$)

⁵ We have not discussed tensor products of vector spaces, but they're not hard to define: given vector spaces $V$ and $W$ with bases $\{v_i\}_{i=1}^n$ and $\{w_j\}_{j=1}^m$, you simply take $V \otimes W$ to be the $nm$-dimensional vector space with basis $\{v_i \otimes w_j\}_{i=1,\dots,n;\ j=1,\dots,m}$. (These symbols obey the distributive property.) These aren't unfamiliar objects either.
You could try to prove, for instance, that $V^* \otimes W$ is the vector space of linear transformations from $V$ to $W$, or that a bilinear form is an element of $(V^*)^{\otimes 2}$.

satisfying
$$B(gv, gw) = B(v, w) \quad \forall\, v, w \in F^n.$$
(We write $Sp_n(F, B)$ resp. $O_n(F, B)$ for these groups.) By Proposition 6, all the symplectic groups are isomorphic (via conjugation by a change-of-basis matrix) to
$$Sp_{2m}(F) := \{g \in GL_{2m}(F) \mid {}^t g\, J_{2m}\, g = J_{2m}\}$$
for $F = \mathbb{C}$ resp. $\mathbb{R}$. The real orthogonal groups are isomorphic to one of the
$$O(p,q) := \{g \in GL_{p+q}(\mathbb{R}) \mid {}^t g\, I_{p,q}\, g = I_{p,q}\},$$
where $I_{p,q} := \mathrm{diag}\{I_p, -I_q\}$. Note that $O(p,q) \cong O(q,p)$, and $O(n,0)$ is written $O_n(\mathbb{R})$; these are called indefinite resp. definite orthogonal groups. All the complex orthogonal groups are isomorphic to
$$O_n(\mathbb{C}) := \{g \in GL_n(\mathbb{C}) \mid {}^t g\, g = I_n\}.$$
The unitary groups $U_n(H)$ are defined by the property of preserving a Hermitian form: they consist of those $g \in GL_n(\mathbb{C})$ with
$$(3) \qquad H(gu, gv) = H(u, v) \quad \forall\, u, v \in \mathbb{C}^n,$$
and are each conjugate to one of the
$$U(p,q) := \{g \in GL_{p+q}(\mathbb{C}) \mid g^*\, I_{p,q}\, g = I_{p,q}\}.$$
(Write $U(n,0) =: U(n)$ for the definite unitary group.) But this is a little deceptive, as (3) is not a $\mathbb{C}$-linear condition: the $U(p,q)$ are actually real linear algebraic groups. If (as above) we think of $\mathbb{C}^n$ as $\mathbb{R}^{2n}$, and write $H = B_{\mathrm{re}} + i B_{\mathrm{im}}$, then I claim that
$$U_n(H) = O_{2n}(\mathbb{R}, B_{\mathrm{re}}) \cap Sp_{2n}(\mathbb{R}, B_{\mathrm{im}})$$
as a subgroup of $GL_{2n}(\mathbb{R})$. The inclusion $\subseteq$ is clear, since to preserve $H$, $g \in GL_{2n}(\mathbb{R})$ must preserve its real and imaginary parts. Conversely, we just need to show that preserving $B_{\mathrm{re}}$ and $B_{\mathrm{im}}$ also implies that $g$ comes from an element of $GL_n(\mathbb{C})$, or equivalently

commutes with $\mathcal{J}$ (cf. Exercise 6 below). This follows from
$$B_{\mathrm{re}}(g\mathcal{J}u, gv) = B_{\mathrm{re}}(\mathcal{J}u, v) = B_{\mathrm{im}}(u, v) = B_{\mathrm{im}}(gu, gv) = B_{\mathrm{re}}(\mathcal{J}gu, gv)$$
and nondegeneracy of $B_{\mathrm{re}}$.

The special linear groups $SL_n(F)$ consist of the elements of $GL_n(F)$ with determinant $1$. They contain the symplectic groups (see Exercise 5), from which it also follows that $SL_{2n}(\mathbb{R})$ contains $U_n(H)$. On the other hand, $SL_n(\mathbb{C})$ does not contain $U_n(H)$; intersecting them gives the special unitary groups $SU_n(H)$ (and $SU(p,q)$). The special orthogonal groups (denoted $SO_n(F,B)$, $SO(p,q)$, etc.) are given by intersecting the orthogonal and special linear groups.

Jordan decomposition. Let $G$ be one of the above linear algebraic groups (over $\mathbb{R}$ or $\mathbb{C}$).⁶ Since $G$ is a subgroup of $GL_n(F)$ for some $n$, we may think of elements of $G$ as invertible matrices (with additional constraints) acting on $\mathbb{R}^n$ or $\mathbb{C}^n$. We conclude this section with a fundamental result that is closely related to the Jordan normal form.

DEFINITION 10. An element $g \in G$ is semisimple if $g$ is diagonalizable over $\mathbb{C}$; unipotent if a power of $(g - I_n)$ is zero.

THEOREM 11. Every $g \in G$ may be written UNIQUELY as a product
$$(4) \qquad g = g_{ss}\, g_{un}$$
of COMMUTING semisimple and unipotent elements OF $G$.

PROOF. Begin with the case of $G = GL_n(\mathbb{C})$. A Jordan form matrix $J$ admits such a decomposition, since each (necessarily invertible) block decomposes:
$$\begin{pmatrix} \sigma & 1 & \\ & \sigma & 1 \\ & & \sigma \end{pmatrix} = \begin{pmatrix} \sigma & & \\ & \sigma & \\ & & \sigma \end{pmatrix} \begin{pmatrix} 1 & \sigma^{-1} & \\ & 1 & \sigma^{-1} \\ & & 1 \end{pmatrix}.$$

⁶ Even more generally, Theorem 11 will hold for all reductive linear algebraic groups over a perfect field, though that is obviously beyond our scope here.
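Before continuing, the single-block decomposition just displayed can be spot-checked numerically (a numpy sketch with $\sigma = 2$, not part of the text):

```python
import numpy as np

# g = g_ss * g_un for a single invertible Jordan block (sigma = 2, size 3):
sigma = 2.0
g = np.array([[sigma, 1.0,   0.0],
              [0.0,   sigma, 1.0],
              [0.0,   0.0,   sigma]])

g_ss = sigma * np.eye(3)          # semisimple part: diagonal
g_un = np.linalg.inv(g_ss) @ g    # unipotent part: I + N/sigma

assert np.allclose(g_ss @ g_un, g)                        # the product is g
assert np.allclose(g_ss @ g_un, g_un @ g_ss)              # the parts commute
assert np.allclose(np.linalg.matrix_power(g_un - np.eye(3), 3), 0)  # unipotent
```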

By Theorem VI.E.8, we have
$$g = \gamma\, J(g)\, \gamma^{-1} = \gamma\, J_{ss} J_{un}\, \gamma^{-1} = \underbrace{\gamma J_{ss} \gamma^{-1}}_{=:\, g_{ss}}\ \underbrace{\gamma J_{un} \gamma^{-1}}_{=:\, g_{un}},$$
and $g_{ss}$, $g_{un}$ commute because $J_{ss}$, $J_{un}$ do.

Conversely, given a decomposition (4) and $v \in E_\sigma(g_{ss})$, we have
$$(g - \sigma I_n)^n v = (g_{un} g_{ss} - \sigma I_n)^n v = \sigma^n (g_{un} - I_n)^n v = 0$$
since $g_{ss}$ and $g_{un}$ commute, and so $v \in E^s_\sigma(g)$. Writing $\mathbb{C}^n = \bigoplus_{j=1}^s E_{\sigma_j}(g_{ss})$, this shows that $E_{\sigma_j}(g_{ss}) \subseteq E^s_{\sigma_j}(g)$. Since the dimensions of the $E^s_{\sigma_j}(g)$ cannot sum to more than $n$, these inclusions are equalities. Therefore the eigenspaces of $g_{ss}$, and so $g_{ss}$ itself (and thus $g_{un} = g\, g_{ss}^{-1}$), are determined uniquely by $g$.

It remains to show that if $g$ belongs to one of the classical groups $G \subseteq GL_n(\mathbb{C})$ above, then $g_{ss}$ and $g_{un}$ do as well. First, if $g$ has real entries, then $\ker^s(\sigma I - g)$ ($= E_\sigma(g_{ss})$) and $\ker^s(\bar\sigma I - g)$ ($= E_{\bar\sigma}(g_{ss})$) are conjugate, from which one deduces that $g_{ss}$ (hence $g_{un}$) is real. Next, if $\det(g) = 1$, then since the determinant of a unipotent matrix is always $1$, $\det(g_{un}) = 1$ and hence $\det(g_{ss}) = 1$ as well. Finally, if $g$ preserves a nondegenerate symmetric or alternating bilinear form $B$, we claim that $g_{ss}$, $g_{un}$ do too. (This will finish the proof, as all the above groups are cut out of $GL_n(\mathbb{C})$ by some combination of these constraints.) Write $V_i := E^s_{\sigma_i}(g) = E_{\sigma_i}(g_{ss})$. Since any vector decomposes into a sum of vectors in these $V_i$, it will suffice to show that
$$(5) \qquad B(gv, gw) = B(v, w) \quad \forall\, i, j,\ v \in V_i,\ w \in V_j$$
implies
$$(6) \qquad B(g_{ss} v, g_{ss} w) = B(v, w) \quad \forall\, i, j,\ v \in V_i,\ w \in V_j.$$
The latter is equivalent to
$$(7) \qquad B(V_i, V_j) = 0 \quad \text{if } \sigma_i \sigma_j \neq 1,$$

since LHS(6) is just $\sigma_i \sigma_j\, B(v, w)$. Suppose $\sigma_i \sigma_j \neq 1$; then $(\sigma_i^{-1} I - g)$ is invertible on $V_j$, and $g$ is invertible on $V_i$. Given $v \in V_i$, $w \in V_j$, write $w_l := (\sigma_i^{-1} I - g)^{-l} w \in V_j$ and $v_l := g^{-l} v \in V_i$; and note that $(g - \sigma_i I)^k V_i = \{0\}$ for some $k$. Applying (5),
$$B(v, w) = B\big(v,\, (\sigma_i^{-1} I - g)\, w_1\big) = \sigma_i^{-1} B(v, w_1) - B(v_1, w_1) = \sigma_i^{-1} B\big((g - \sigma_i I)\, v_1,\, w_1\big) = \dots = \sigma_i^{-k} B\big((g - \sigma_i I)^k v_k,\, w_k\big) = 0$$
establishes (7), and we are done. $\square$

Exercises

(1) A quadratic form on $\mathbb{R}^3$ is given by
$$Q(x, y, z) = x^2 + 3y^2 + z^2 - 4xy + 2xz - 2yz.$$
(a) Write the matrix of the corresponding symmetric bilinear form $B$ with respect to $\hat e$. (Be careful: if $\pm 4$ is an entry in your matrix, it isn't quite correct.)
(b) Find $S$ such that ${}^tS\,[B]^{\hat e}\,S$ is diagonal, by following the steps in the proof of Sylvester's theorem.
(c) What is the signature of $B$?

(2) Consider the vector space $P_2(\mathbb{R})$ of polynomials of degree $\leq 2$ with (symmetric) bilinear form
$$B(f, g) := \int_0^1 f(t)\, g(t)\, dt.$$
(a) Find a basis $\mathcal{B}$ of $P_2(\mathbb{R})$ in which $[B]^{\mathcal{B}} = I_3$. [Hint: work as if $B$ were the dot product and use Gram-Schmidt, starting from the basis $\{1, t, t^2\}$.]
(b) Let $T : P_2(\mathbb{R}) \to P_2(\mathbb{R})$ be the shift operator $Tf(t) = f(t-1)$. Compute $[T]_{\mathcal{B}}$ (where $\mathcal{B}$ is the basis found in part (a)).

(3) Find a basis for the vector space(!) of all alternating bilinear forms on $\mathbb{R}^n$. What is its dimension? [Hint: you could use matrices.]

(4) Write out the proof of Sylvester's theorem for Hermitian forms.

(5) Show that $Sp_{2m}(F)$ is in fact a subgroup of $SL_{2m}(F)$. [Hint: use Jordan decomposition to reduce to $g_{ss}$.]

(6) Deduce that the image of $GL_n(\mathbb{C}) \hookrightarrow GL_{2n}(\mathbb{R})$ given by regarding $\mathbb{C}^n$ as $\mathbb{R}^{2n}$ (via $(z_1, \dots, z_n) \mapsto (\mathrm{Re}(z_1), \dots, \mathrm{Re}(z_n), \mathrm{Im}(z_1), \dots, \mathrm{Im}(z_n))$) consists of those elements commuting with $J_{2n}$.

(7) (i) Let $B$ be an orthogonal form on $F^n$, and $w \in F^n$ be such that $B(w, w) = 2$. Show that the (matrix of the) reflection $\mu(v) := v - B(w, v)\, w$ belongs to $O_n(F, B)$.
(ii) Let $B$ be a symplectic form on $F^{2m}$, $w \in F^{2m}$ nonzero. Show that the (matrix of the) transvection $\tau(v) := v + B(w, v)\, w$ belongs to $Sp_{2m}(F, B)$.
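The embedding in Exercise 6 can be explored numerically. In this hedged numpy sketch (not part of the text), we use the standard Hermitian form on $\mathbb{C}^n$, for which, under the coordinate ordering of Exercise 6, $[B_{\mathrm{re}}]$ is the identity and $[B_{\mathrm{im}}] = J_{2n}$; a random unitary $g$ then lands in both the orthogonal and the symplectic group, and its image commutes with $J_{2n}$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2

# A random unitary g in U(n), via QR factorization of a random complex matrix.
g, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
assert np.allclose(g.conj().T @ g, np.eye(n))

# Regard C^n as R^{2n} via z -> (Re z, Im z); g = a + ib acts by the block matrix
a, b = g.real, g.imag
G = np.block([[a, -b], [b, a]])

I, Z = np.eye(n), np.zeros((n, n))
J = np.block([[Z, I], [-I, Z]])   # J_{2n}

assert np.allclose(G.T @ G, np.eye(2 * n))   # G preserves B_re (dot product)
assert np.allclose(G.T @ J @ G, J)           # G preserves B_im (symplectic)
assert np.allclose(G @ J, J @ G)             # G commutes with J_{2n}
```

The last assertion holds for the image of any element of $GL_n(\mathbb{C})$; the first two use unitarity of $g$.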


Topics in linear algebra

Topics in linear algebra Chapter 6 Topics in linear algebra 6.1 Change of basis I want to remind you of one of the basic ideas in linear algebra: change of basis. Let F be a field, V and W be finite dimensional vector spaces over

More information

DS-GA 1002 Lecture notes 0 Fall Linear Algebra. These notes provide a review of basic concepts in linear algebra.

DS-GA 1002 Lecture notes 0 Fall Linear Algebra. These notes provide a review of basic concepts in linear algebra. DS-GA 1002 Lecture notes 0 Fall 2016 Linear Algebra These notes provide a review of basic concepts in linear algebra. 1 Vector spaces You are no doubt familiar with vectors in R 2 or R 3, i.e. [ ] 1.1

More information

Problems in Linear Algebra and Representation Theory

Problems in Linear Algebra and Representation Theory Problems in Linear Algebra and Representation Theory (Most of these were provided by Victor Ginzburg) The problems appearing below have varying level of difficulty. They are not listed in any specific

More information

Chapter 5 Eigenvalues and Eigenvectors

Chapter 5 Eigenvalues and Eigenvectors Chapter 5 Eigenvalues and Eigenvectors Outline 5.1 Eigenvalues and Eigenvectors 5.2 Diagonalization 5.3 Complex Vector Spaces 2 5.1 Eigenvalues and Eigenvectors Eigenvalue and Eigenvector If A is a n n

More information

CLASSICAL GROUPS DAVID VOGAN

CLASSICAL GROUPS DAVID VOGAN CLASSICAL GROUPS DAVID VOGAN 1. Orthogonal groups These notes are about classical groups. That term is used in various ways by various people; I ll try to say a little about that as I go along. Basically

More information

Supplementary Notes on Linear Algebra

Supplementary Notes on Linear Algebra Supplementary Notes on Linear Algebra Mariusz Wodzicki May 3, 2015 1 Vector spaces 1.1 Coordinatization of a vector space 1.1.1 Given a basis B = {b 1,..., b n } in a vector space V, any vector v V can

More information

Chapter 6: Orthogonality

Chapter 6: Orthogonality Chapter 6: Orthogonality (Last Updated: November 7, 7) These notes are derived primarily from Linear Algebra and its applications by David Lay (4ed). A few theorems have been moved around.. Inner products

More information

6 Inner Product Spaces

6 Inner Product Spaces Lectures 16,17,18 6 Inner Product Spaces 6.1 Basic Definition Parallelogram law, the ability to measure angle between two vectors and in particular, the concept of perpendicularity make the euclidean space

More information

UC Berkeley Summer Undergraduate Research Program 2015 July 9 Lecture

UC Berkeley Summer Undergraduate Research Program 2015 July 9 Lecture UC Berkeley Summer Undergraduate Research Program 205 July 9 Lecture We will introduce the basic structure and representation theory of the symplectic group Sp(V ). Basics Fix a nondegenerate, alternating

More information

ne varieties (continued)

ne varieties (continued) Chapter 2 A ne varieties (continued) 2.1 Products For some problems its not very natural to restrict to irreducible varieties. So we broaden the previous story. Given an a ne algebraic set X A n k, we

More information

ALGEBRA 8: Linear algebra: characteristic polynomial

ALGEBRA 8: Linear algebra: characteristic polynomial ALGEBRA 8: Linear algebra: characteristic polynomial Characteristic polynomial Definition 8.1. Consider a linear operator A End V over a vector space V. Consider a vector v V such that A(v) = λv. This

More information

Cartan s Criteria. Math 649, Dan Barbasch. February 26

Cartan s Criteria. Math 649, Dan Barbasch. February 26 Cartan s Criteria Math 649, 2013 Dan Barbasch February 26 Cartan s Criteria REFERENCES: Humphreys, I.2 and I.3. Definition The Cartan-Killing form of a Lie algebra is the bilinear form B(x, y) := Tr(ad

More information

MTH 2032 SemesterII

MTH 2032 SemesterII MTH 202 SemesterII 2010-11 Linear Algebra Worked Examples Dr. Tony Yee Department of Mathematics and Information Technology The Hong Kong Institute of Education December 28, 2011 ii Contents Table of Contents

More information

The Lorentz group provides another interesting example. Moreover, the Lorentz group SO(3, 1) shows up in an interesting way in computer vision.

The Lorentz group provides another interesting example. Moreover, the Lorentz group SO(3, 1) shows up in an interesting way in computer vision. 190 CHAPTER 3. REVIEW OF GROUPS AND GROUP ACTIONS 3.3 The Lorentz Groups O(n, 1), SO(n, 1) and SO 0 (n, 1) The Lorentz group provides another interesting example. Moreover, the Lorentz group SO(3, 1) shows

More information

The Cayley-Hamilton Theorem and the Jordan Decomposition

The Cayley-Hamilton Theorem and the Jordan Decomposition LECTURE 19 The Cayley-Hamilton Theorem and the Jordan Decomposition Let me begin by summarizing the main results of the last lecture Suppose T is a endomorphism of a vector space V Then T has a minimal

More information

1. General Vector Spaces

1. General Vector Spaces 1.1. Vector space axioms. 1. General Vector Spaces Definition 1.1. Let V be a nonempty set of objects on which the operations of addition and scalar multiplication are defined. By addition we mean a rule

More information

Algebra Workshops 10 and 11

Algebra Workshops 10 and 11 Algebra Workshops 1 and 11 Suggestion: For Workshop 1 please do questions 2,3 and 14. For the other questions, it s best to wait till the material is covered in lectures. Bilinear and Quadratic Forms on

More information

Review of Linear Algebra Definitions, Change of Basis, Trace, Spectral Theorem

Review of Linear Algebra Definitions, Change of Basis, Trace, Spectral Theorem Review of Linear Algebra Definitions, Change of Basis, Trace, Spectral Theorem Steven J. Miller June 19, 2004 Abstract Matrices can be thought of as rectangular (often square) arrays of numbers, or as

More information

Symmetric and self-adjoint matrices

Symmetric and self-adjoint matrices Symmetric and self-adjoint matrices A matrix A in M n (F) is called symmetric if A T = A, ie A ij = A ji for each i, j; and self-adjoint if A = A, ie A ij = A ji or each i, j Note for A in M n (R) that

More information

MATH 240 Spring, Chapter 1: Linear Equations and Matrices

MATH 240 Spring, Chapter 1: Linear Equations and Matrices MATH 240 Spring, 2006 Chapter Summaries for Kolman / Hill, Elementary Linear Algebra, 8th Ed. Sections 1.1 1.6, 2.1 2.2, 3.2 3.8, 4.3 4.5, 5.1 5.3, 5.5, 6.1 6.5, 7.1 7.2, 7.4 DEFINITIONS Chapter 1: Linear

More information

8. Diagonalization.

8. Diagonalization. 8. Diagonalization 8.1. Matrix Representations of Linear Transformations Matrix of A Linear Operator with Respect to A Basis We know that every linear transformation T: R n R m has an associated standard

More information

6 Orthogonal groups. O 2m 1 q. q 2i 1 q 2i. 1 i 1. 1 q 2i 2. O 2m q. q m m 1. 1 q 2i 1 i 1. 1 q 2i. i 1. 2 q 1 q i 1 q i 1. m 1.

6 Orthogonal groups. O 2m 1 q. q 2i 1 q 2i. 1 i 1. 1 q 2i 2. O 2m q. q m m 1. 1 q 2i 1 i 1. 1 q 2i. i 1. 2 q 1 q i 1 q i 1. m 1. 6 Orthogonal groups We now turn to the orthogonal groups. These are more difficult, for two related reasons. First, it is not always true that the group of isometries with determinant 1 is equal to its

More information

Since G is a compact Lie group, we can apply Schur orthogonality to see that G χ π (g) 2 dg =

Since G is a compact Lie group, we can apply Schur orthogonality to see that G χ π (g) 2 dg = Problem 1 Show that if π is an irreducible representation of a compact lie group G then π is also irreducible. Give an example of a G and π such that π = π, and another for which π π. Is this true for

More information

Math 443 Differential Geometry Spring Handout 3: Bilinear and Quadratic Forms This handout should be read just before Chapter 4 of the textbook.

Math 443 Differential Geometry Spring Handout 3: Bilinear and Quadratic Forms This handout should be read just before Chapter 4 of the textbook. Math 443 Differential Geometry Spring 2013 Handout 3: Bilinear and Quadratic Forms This handout should be read just before Chapter 4 of the textbook. Endomorphisms of a Vector Space This handout discusses

More information

Math 396. An application of Gram-Schmidt to prove connectedness

Math 396. An application of Gram-Schmidt to prove connectedness Math 396. An application of Gram-Schmidt to prove connectedness 1. Motivation and background Let V be an n-dimensional vector space over R, and define GL(V ) to be the set of invertible linear maps V V

More information

(VII.E) The Singular Value Decomposition (SVD)

(VII.E) The Singular Value Decomposition (SVD) (VII.E) The Singular Value Decomposition (SVD) In this section we describe a generalization of the Spectral Theorem to non-normal operators, and even to transformations between different vector spaces.

More information

Special Lecture - The Octionions

Special Lecture - The Octionions Special Lecture - The Octionions March 15, 2013 1 R 1.1 Definition Not much needs to be said here. From the God given natural numbers, we algebraically build Z and Q. Then create a topology from the distance

More information

08a. Operators on Hilbert spaces. 1. Boundedness, continuity, operator norms

08a. Operators on Hilbert spaces. 1. Boundedness, continuity, operator norms (February 24, 2017) 08a. Operators on Hilbert spaces Paul Garrett garrett@math.umn.edu http://www.math.umn.edu/ garrett/ [This document is http://www.math.umn.edu/ garrett/m/real/notes 2016-17/08a-ops

More information

Linear Algebra: Matrix Eigenvalue Problems

Linear Algebra: Matrix Eigenvalue Problems CHAPTER8 Linear Algebra: Matrix Eigenvalue Problems Chapter 8 p1 A matrix eigenvalue problem considers the vector equation (1) Ax = λx. 8.0 Linear Algebra: Matrix Eigenvalue Problems Here A is a given

More information

Chapter 2 Linear Transformations

Chapter 2 Linear Transformations Chapter 2 Linear Transformations Linear Transformations Loosely speaking, a linear transformation is a function from one vector space to another that preserves the vector space operations. Let us be more

More information

MATHEMATICS 217 NOTES

MATHEMATICS 217 NOTES MATHEMATICS 27 NOTES PART I THE JORDAN CANONICAL FORM The characteristic polynomial of an n n matrix A is the polynomial χ A (λ) = det(λi A), a monic polynomial of degree n; a monic polynomial in the variable

More information

Notation. For any Lie group G, we set G 0 to be the connected component of the identity.

Notation. For any Lie group G, we set G 0 to be the connected component of the identity. Notation. For any Lie group G, we set G 0 to be the connected component of the identity. Problem 1 Prove that GL(n, R) is homotopic to O(n, R). (Hint: Gram-Schmidt Orthogonalization.) Here is a sequence

More information

NORMS ON SPACE OF MATRICES

NORMS ON SPACE OF MATRICES NORMS ON SPACE OF MATRICES. Operator Norms on Space of linear maps Let A be an n n real matrix and x 0 be a vector in R n. We would like to use the Picard iteration method to solve for the following system

More information

MATRICES ARE SIMILAR TO TRIANGULAR MATRICES

MATRICES ARE SIMILAR TO TRIANGULAR MATRICES MATRICES ARE SIMILAR TO TRIANGULAR MATRICES 1 Complex matrices Recall that the complex numbers are given by a + ib where a and b are real and i is the imaginary unity, ie, i 2 = 1 In what we describe below,

More information

Review problems for MA 54, Fall 2004.

Review problems for MA 54, Fall 2004. Review problems for MA 54, Fall 2004. Below are the review problems for the final. They are mostly homework problems, or very similar. If you are comfortable doing these problems, you should be fine on

More information

Real symmetric matrices/1. 1 Eigenvalues and eigenvectors

Real symmetric matrices/1. 1 Eigenvalues and eigenvectors Real symmetric matrices 1 Eigenvalues and eigenvectors We use the convention that vectors are row vectors and matrices act on the right. Let A be a square matrix with entries in a field F; suppose that

More information

MATH Linear Algebra

MATH Linear Algebra MATH 304 - Linear Algebra In the previous note we learned an important algorithm to produce orthogonal sequences of vectors called the Gramm-Schmidt orthogonalization process. Gramm-Schmidt orthogonalization

More information

BASIC ALGORITHMS IN LINEAR ALGEBRA. Matrices and Applications of Gaussian Elimination. A 2 x. A T m x. A 1 x A T 1. A m x

BASIC ALGORITHMS IN LINEAR ALGEBRA. Matrices and Applications of Gaussian Elimination. A 2 x. A T m x. A 1 x A T 1. A m x BASIC ALGORITHMS IN LINEAR ALGEBRA STEVEN DALE CUTKOSKY Matrices and Applications of Gaussian Elimination Systems of Equations Suppose that A is an n n matrix with coefficents in a field F, and x = (x,,

More information

2.2. Show that U 0 is a vector space. For each α 0 in F, show by example that U α does not satisfy closure.

2.2. Show that U 0 is a vector space. For each α 0 in F, show by example that U α does not satisfy closure. Hints for Exercises 1.3. This diagram says that f α = β g. I will prove f injective g injective. You should show g injective f injective. Assume f is injective. Now suppose g(x) = g(y) for some x, y A.

More information

Mathematical Methods wk 2: Linear Operators

Mathematical Methods wk 2: Linear Operators John Magorrian, magog@thphysoxacuk These are work-in-progress notes for the second-year course on mathematical methods The most up-to-date version is available from http://www-thphysphysicsoxacuk/people/johnmagorrian/mm

More information

MATH 583A REVIEW SESSION #1

MATH 583A REVIEW SESSION #1 MATH 583A REVIEW SESSION #1 BOJAN DURICKOVIC 1. Vector Spaces Very quick review of the basic linear algebra concepts (see any linear algebra textbook): (finite dimensional) vector space (or linear space),

More information

5 Quiver Representations

5 Quiver Representations 5 Quiver Representations 5. Problems Problem 5.. Field embeddings. Recall that k(y,..., y m ) denotes the field of rational functions of y,..., y m over a field k. Let f : k[x,..., x n ] k(y,..., y m )

More information

1 Vectors. Notes for Bindel, Spring 2017 Numerical Analysis (CS 4220)

1 Vectors. Notes for Bindel, Spring 2017 Numerical Analysis (CS 4220) Notes for 2017-01-30 Most of mathematics is best learned by doing. Linear algebra is no exception. You have had a previous class in which you learned the basics of linear algebra, and you will have plenty

More information

QUATERNIONS AND ROTATIONS

QUATERNIONS AND ROTATIONS QUATERNIONS AND ROTATIONS SVANTE JANSON 1. Introduction The purpose of this note is to show some well-known relations between quaternions and the Lie groups SO(3) and SO(4) (rotations in R 3 and R 4 )

More information

4.2. ORTHOGONALITY 161

4.2. ORTHOGONALITY 161 4.2. ORTHOGONALITY 161 Definition 4.2.9 An affine space (E, E ) is a Euclidean affine space iff its underlying vector space E is a Euclidean vector space. Given any two points a, b E, we define the distance

More information

1.4 Solvable Lie algebras

1.4 Solvable Lie algebras 1.4. SOLVABLE LIE ALGEBRAS 17 1.4 Solvable Lie algebras 1.4.1 Derived series and solvable Lie algebras The derived series of a Lie algebra L is given by: L (0) = L, L (1) = [L, L],, L (2) = [L (1), L (1)

More information

Chapter 2: Linear Independence and Bases

Chapter 2: Linear Independence and Bases MATH20300: Linear Algebra 2 (2016 Chapter 2: Linear Independence and Bases 1 Linear Combinations and Spans Example 11 Consider the vector v (1, 1 R 2 What is the smallest subspace of (the real vector space

More information

is an isomorphism, and V = U W. Proof. Let u 1,..., u m be a basis of U, and add linearly independent

is an isomorphism, and V = U W. Proof. Let u 1,..., u m be a basis of U, and add linearly independent Lecture 4. G-Modules PCMI Summer 2015 Undergraduate Lectures on Flag Varieties Lecture 4. The categories of G-modules, mostly for finite groups, and a recipe for finding every irreducible G-module of a

More information

IRREDUCIBLE REPRESENTATIONS OF SEMISIMPLE LIE ALGEBRAS. Contents

IRREDUCIBLE REPRESENTATIONS OF SEMISIMPLE LIE ALGEBRAS. Contents IRREDUCIBLE REPRESENTATIONS OF SEMISIMPLE LIE ALGEBRAS NEEL PATEL Abstract. The goal of this paper is to study the irreducible representations of semisimple Lie algebras. We will begin by considering two

More information

Background on Chevalley Groups Constructed from a Root System

Background on Chevalley Groups Constructed from a Root System Background on Chevalley Groups Constructed from a Root System Paul Tokorcheck Department of Mathematics University of California, Santa Cruz 10 October 2011 Abstract In 1955, Claude Chevalley described

More information

Exercises on chapter 1

Exercises on chapter 1 Exercises on chapter 1 1. Let G be a group and H and K be subgroups. Let HK = {hk h H, k K}. (i) Prove that HK is a subgroup of G if and only if HK = KH. (ii) If either H or K is a normal subgroup of G

More information

Linear Algebra Lecture Notes-II

Linear Algebra Lecture Notes-II Linear Algebra Lecture Notes-II Vikas Bist Department of Mathematics Panjab University, Chandigarh-64 email: bistvikas@gmail.com Last revised on March 5, 8 This text is based on the lectures delivered

More information

18.06 Problem Set 8 - Solutions Due Wednesday, 14 November 2007 at 4 pm in

18.06 Problem Set 8 - Solutions Due Wednesday, 14 November 2007 at 4 pm in 806 Problem Set 8 - Solutions Due Wednesday, 4 November 2007 at 4 pm in 2-06 08 03 Problem : 205+5+5+5 Consider the matrix A 02 07 a Check that A is a positive Markov matrix, and find its steady state

More information

Problem set 2. Math 212b February 8, 2001 due Feb. 27

Problem set 2. Math 212b February 8, 2001 due Feb. 27 Problem set 2 Math 212b February 8, 2001 due Feb. 27 Contents 1 The L 2 Euler operator 1 2 Symplectic vector spaces. 2 2.1 Special kinds of subspaces....................... 3 2.2 Normal forms..............................

More information

Clifford Algebras and Spin Groups

Clifford Algebras and Spin Groups Clifford Algebras and Spin Groups Math G4344, Spring 2012 We ll now turn from the general theory to examine a specific class class of groups: the orthogonal groups. Recall that O(n, R) is the group of

More information

Final Review Sheet. B = (1, 1 + 3x, 1 + x 2 ) then 2 + 3x + 6x 2

Final Review Sheet. B = (1, 1 + 3x, 1 + x 2 ) then 2 + 3x + 6x 2 Final Review Sheet The final will cover Sections Chapters 1,2,3 and 4, as well as sections 5.1-5.4, 6.1-6.2 and 7.1-7.3 from chapters 5,6 and 7. This is essentially all material covered this term. Watch

More information

REPRESENTATION THEORY WEEK 5. B : V V k

REPRESENTATION THEORY WEEK 5. B : V V k REPRESENTATION THEORY WEEK 5 1. Invariant forms Recall that a bilinear form on a vector space V is a map satisfying B : V V k B (cv, dw) = cdb (v, w), B (v 1 + v, w) = B (v 1, w)+b (v, w), B (v, w 1 +

More information

0 A. ... A j GL nj (F q ), 1 j r

0 A. ... A j GL nj (F q ), 1 j r CHAPTER 4 Representations of finite groups of Lie type Let F q be a finite field of order q and characteristic p. Let G be a finite group of Lie type, that is, G is the F q -rational points of a connected

More information

Math 121 Homework 5: Notes on Selected Problems

Math 121 Homework 5: Notes on Selected Problems Math 121 Homework 5: Notes on Selected Problems 12.1.2. Let M be a module over the integral domain R. (a) Assume that M has rank n and that x 1,..., x n is any maximal set of linearly independent elements

More information

Numerical Linear Algebra

Numerical Linear Algebra University of Alabama at Birmingham Department of Mathematics Numerical Linear Algebra Lecture Notes for MA 660 (1997 2014) Dr Nikolai Chernov April 2014 Chapter 0 Review of Linear Algebra 0.1 Matrices

More information

GROUP THEORY PRIMER. New terms: so(2n), so(2n+1), symplectic algebra sp(2n)

GROUP THEORY PRIMER. New terms: so(2n), so(2n+1), symplectic algebra sp(2n) GROUP THEORY PRIMER New terms: so(2n), so(2n+1), symplectic algebra sp(2n) 1. Some examples of semi-simple Lie algebras In the previous chapter, we developed the idea of understanding semi-simple Lie algebras

More information

REPRESENTATION THEORY OF S n

REPRESENTATION THEORY OF S n REPRESENTATION THEORY OF S n EVAN JENKINS Abstract. These are notes from three lectures given in MATH 26700, Introduction to Representation Theory of Finite Groups, at the University of Chicago in November

More information

REPRESENTATION THEORY WEEK 7

REPRESENTATION THEORY WEEK 7 REPRESENTATION THEORY WEEK 7 1. Characters of L k and S n A character of an irreducible representation of L k is a polynomial function constant on every conjugacy class. Since the set of diagonalizable

More information

Hodge Structures. October 8, A few examples of symmetric spaces

Hodge Structures. October 8, A few examples of symmetric spaces Hodge Structures October 8, 2013 1 A few examples of symmetric spaces The upper half-plane H is the quotient of SL 2 (R) by its maximal compact subgroup SO(2). More generally, Siegel upper-half space H

More information

LIE ALGEBRAS: LECTURE 7 11 May 2010

LIE ALGEBRAS: LECTURE 7 11 May 2010 LIE ALGEBRAS: LECTURE 7 11 May 2010 CRYSTAL HOYT 1. sl n (F) is simple Let F be an algebraically closed field with characteristic zero. Let V be a finite dimensional vector space. Recall, that f End(V

More information

A 2 G 2 A 1 A 1. (3) A double edge pointing from α i to α j if α i, α j are not perpendicular and α i 2 = 2 α j 2

A 2 G 2 A 1 A 1. (3) A double edge pointing from α i to α j if α i, α j are not perpendicular and α i 2 = 2 α j 2 46 MATH 223A NOTES 2011 LIE ALGEBRAS 11. Classification of semisimple Lie algebras I will explain how the Cartan matrix and Dynkin diagrams describe root systems. Then I will go through the classification

More information

1. Foundations of Numerics from Advanced Mathematics. Linear Algebra

1. Foundations of Numerics from Advanced Mathematics. Linear Algebra Foundations of Numerics from Advanced Mathematics Linear Algebra Linear Algebra, October 23, 22 Linear Algebra Mathematical Structures a mathematical structure consists of one or several sets and one or

More information

Review of linear algebra

Review of linear algebra Review of linear algebra 1 Vectors and matrices We will just touch very briefly on certain aspects of linear algebra, most of which should be familiar. Recall that we deal with vectors, i.e. elements of

More information