Math 4242 Fall 2016 (Darij Grinberg): homework set 8
due: Wed, 14 Dec 2016

[Thanks to Hannah Brand for parts of the solutions.]

Exercise 1. Recall that we defined the multiplication of complex numbers by the rule
$$(a_1, b_1)(a_2, b_2) = (a_1 a_2 - b_1 b_2,\; a_1 b_2 + a_2 b_1).$$

(a) Prove that this multiplication is associative: i.e., that $(z_1 z_2) z_3 = z_1 (z_2 z_3)$ for every three complex numbers $z_1, z_2, z_3$. (Begin by writing $z_1$ in the form $(a_1, b_1)$, etc.) [5 points]

(b) For any complex number $z = (a, b) = a + bi$, define a real $2 \times 2$ matrix $W_z$ by
$$W_z = \begin{pmatrix} a & b \\ -b & a \end{pmatrix}.$$
Given two complex numbers $z_1$ and $z_2$, prove that $W_{z_1 z_2} = W_{z_1} W_{z_2}$. [5 points]

Solution to Exercise 1. We begin by proving part (b).

(b) Let $z_1$ and $z_2$ be two complex numbers. Write $z_1$ in the form $z_1 = (a_1, b_1)$. Write $z_2$ in the form $z_2 = (a_2, b_2)$. Thus,
$$z_1 z_2 = (a_1, b_1)(a_2, b_2) = (a_1 a_2 - b_1 b_2,\; a_1 b_2 + b_1 a_2)$$
(by the definition of the product of two complex numbers). Hence, the definition of $W_{z_1 z_2}$ yields
$$W_{z_1 z_2} = \begin{pmatrix} a_1 a_2 - b_1 b_2 & a_1 b_2 + b_1 a_2 \\ -(a_1 b_2 + b_1 a_2) & a_1 a_2 - b_1 b_2 \end{pmatrix}. \qquad (1)$$

On the other hand, we have $z_1 = (a_1, b_1)$. Thus, $W_{z_1} = \begin{pmatrix} a_1 & b_1 \\ -b_1 & a_1 \end{pmatrix}$. Similarly, $W_{z_2} = \begin{pmatrix} a_2 & b_2 \\ -b_2 & a_2 \end{pmatrix}$. Multiplying these two equalities, we obtain
$$W_{z_1} W_{z_2} = \begin{pmatrix} a_1 & b_1 \\ -b_1 & a_1 \end{pmatrix} \begin{pmatrix} a_2 & b_2 \\ -b_2 & a_2 \end{pmatrix} = \begin{pmatrix} a_1 a_2 - b_1 b_2 & a_1 b_2 + b_1 a_2 \\ -b_1 a_2 - a_1 b_2 & -b_1 b_2 + a_1 a_2 \end{pmatrix} = \begin{pmatrix} a_1 a_2 - b_1 b_2 & a_1 b_2 + b_1 a_2 \\ -(a_1 b_2 + b_1 a_2) & a_1 a_2 - b_1 b_2 \end{pmatrix}.$$

Comparing this with (1), we obtain $W_{z_1 z_2} = W_{z_1} W_{z_2}$. This solves Exercise 1 (b).
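The identity $W_{z_1 z_2} = W_{z_1} W_{z_2}$ is also easy to sanity-check numerically. The following sketch (using NumPy, with the sign convention for $W_z$ as written above) compares both sides on random complex numbers; it is an illustration only, not part of the required proof.

```python
import numpy as np

def W(z):
    """The 2x2 real matrix attached to z = a + bi, with the sign
    convention used in Exercise 1 (b) above."""
    a, b = z.real, z.imag
    return np.array([[a, b],
                     [-b, a]])

rng = np.random.default_rng(0)
for _ in range(100):
    z1, z2 = rng.normal(size=2) + 1j * rng.normal(size=2)
    # W_{z1 z2} should coincide with the matrix product W_{z1} W_{z2}
    assert np.allclose(W(z1 * z2), W(z1) @ W(z2))
print("W_{z1 z2} = W_{z1} W_{z2} on all random samples")
```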

[Remark: Of course, we also have $W_{z_1 + z_2} = W_{z_1} + W_{z_2}$ and $W_{z_1 - z_2} = W_{z_1} - W_{z_2}$ for any two complex numbers $z_1$ and $z_2$. These facts, combined, show that the addition, subtraction and multiplication of complex numbers are mirrored by the addition, subtraction and multiplication of their corresponding W-matrices (where the W-matrix of a complex number $z$ means the $2 \times 2$-matrix $W_z$). In more abstract terms, this says that the map $\mathbb{C} \to \mathbb{R}^{2 \times 2}$ sending each complex number $z$ to its W-matrix $W_z$ is a ring homomorphism. (Strictly speaking, this statement also includes the facts that $W_0 = 0$ and $W_1 = I_2$.) This fact is behind our second solution of part (a) given below.]

(a) First solution of part (a): Here is the straightforward approach. We defined the multiplication of complex numbers by the rule $(a_1, b_1)(a_2, b_2) = (a_1 a_2 - b_1 b_2,\; a_1 b_2 + a_2 b_1)$. Given three complex numbers $z_1 = (a_1, b_1)$, $z_2 = (a_2, b_2)$ and $z_3 = (a_3, b_3)$, we can see that $z_1 (z_2 z_3) = (z_1 z_2) z_3$ by explicitly computing both sides of this equation:
$$z_1 (z_2 z_3) = (a_1, b_1)\bigl((a_2, b_2)(a_3, b_3)\bigr) = (a_1, b_1)(a_2 a_3 - b_2 b_3,\; a_2 b_3 + a_3 b_2)$$
$$= \bigl(a_1 (a_2 a_3 - b_2 b_3) - b_1 (a_2 b_3 + a_3 b_2),\; a_1 (a_2 b_3 + a_3 b_2) + (a_2 a_3 - b_2 b_3) b_1\bigr)$$
$$= \bigl(a_1 a_2 a_3 - a_1 b_2 b_3 - a_2 b_1 b_3 - a_3 b_1 b_2,\; a_1 a_2 b_3 + a_1 a_3 b_2 + a_2 a_3 b_1 - b_1 b_2 b_3\bigr)$$
and
$$(z_1 z_2) z_3 = \bigl((a_1, b_1)(a_2, b_2)\bigr)(a_3, b_3) = (a_1 a_2 - b_1 b_2,\; a_1 b_2 + a_2 b_1)(a_3, b_3)$$
$$= \bigl((a_1 a_2 - b_1 b_2) a_3 - (a_1 b_2 + a_2 b_1) b_3,\; (a_1 a_2 - b_1 b_2) b_3 + a_3 (a_1 b_2 + a_2 b_1)\bigr)$$
$$= \bigl(a_1 a_2 a_3 - a_3 b_1 b_2 - a_1 b_2 b_3 - a_2 b_1 b_3,\; a_1 a_2 b_3 - b_1 b_2 b_3 + a_1 a_3 b_2 + a_2 a_3 b_1\bigr).$$
The right hand sides of these two equations are equal (even though the terms appear in slightly different orders in them). Thus, the left hand sides are also equal. In other words, $z_1 (z_2 z_3) = (z_1 z_2) z_3$. This solves part (a).
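The brute-force expansion in the first solution can also be checked symbolically. Here is a small SymPy sketch of that check (purely illustrative; the pair notation mirrors the multiplication rule above).

```python
import sympy as sp

a1, b1, a2, b2, a3, b3 = sp.symbols('a1 b1 a2 b2 a3 b3', real=True)

def mul(p, q):
    """Product of complex numbers written as pairs, per the rule above."""
    (a, b), (c, d) = p, q
    return (a * c - b * d, a * d + c * b)

lhs = mul(mul((a1, b1), (a2, b2)), (a3, b3))   # (z1 z2) z3
rhs = mul((a1, b1), mul((a2, b2), (a3, b3)))   # z1 (z2 z3)
print(all(sp.expand(l - r) == 0 for l, r in zip(lhs, rhs)))   # True
```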

Second solution of part (a): Here is a more elegant proof, using part (b). Part (b) says that $W_{z_1 z_2} = W_{z_1} W_{z_2}$ for any two complex numbers $z_1$ and $z_2$. Renaming $z_1$ and $z_2$ as $u$ and $v$, we can rewrite this as follows:
$$W_{uv} = W_u W_v \quad \text{for any two complex numbers } u \text{ and } v. \qquad (2)$$

Furthermore, any complex number $z$ can be reconstructed from the $2 \times 2$-matrix $W_z$. (Proof: Let $z$ be a complex number. Write $z$ in the form $(a, b)$. Then, $W_z = \begin{pmatrix} a & b \\ -b & a \end{pmatrix}$ by the definition of $W_z$. Hence, $a$ and $b$ are the two entries of the first row of the matrix $W_z$. Therefore, we can reconstruct $a$ and $b$ from $W_z$. Therefore, we can reconstruct $z$ from $W_z$ (since $z = (a, b)$). Qed.) Therefore, if $u$ and $v$ are two complex numbers satisfying $W_u = W_v$, then $u = v$.

Now, let $z_1$, $z_2$ and $z_3$ be three complex numbers. Applying (2) to $u = z_1$ and $v = z_2 z_3$, we obtain
$$W_{z_1 (z_2 z_3)} = W_{z_1} W_{z_2 z_3} = W_{z_1} \left(W_{z_2} W_{z_3}\right) \qquad (3)$$
(by (2), applied to $u = z_2$ and $v = z_3$). But applying (2) to $u = z_1 z_2$ and $v = z_3$, we obtain
$$W_{(z_1 z_2) z_3} = W_{z_1 z_2} W_{z_3} = \left(W_{z_1} W_{z_2}\right) W_{z_3} = W_{z_1} \left(W_{z_2} W_{z_3}\right)$$
(here we used (2), applied to $u = z_1$ and $v = z_2$, and then the fact that multiplication of matrices is associative). Comparing this with (3), we obtain $W_{z_1 (z_2 z_3)} = W_{(z_1 z_2) z_3}$. But recall that if $u$ and $v$ are two complex numbers satisfying $W_u = W_v$, then $u = v$. Applying this to $u = z_1 (z_2 z_3)$ and $v = (z_1 z_2) z_3$, we obtain $z_1 (z_2 z_3) = (z_1 z_2) z_3$ (since $W_{z_1 (z_2 z_3)} = W_{(z_1 z_2) z_3}$). This solves part (a) again.

[Remark: This second solution illustrates an idea frequently used in algebra: We want to prove that a structure (in our case, the ring $\mathbb{C}$ of complex numbers) satisfies a certain property (in this case, associativity of multiplication). Instead of doing this directly (as was done in the first solution of part (a)), we embed the structure in a bigger structure (in our case, the bigger structure is the ring $\mathbb{R}^{2 \times 2}$ of $2 \times 2$-matrices, and the embedding is the map sending each $z \in \mathbb{C}$ to the matrix $W_z$) which is already known to possess this property (after all, we know that matrix multiplication is associative); then, we get the property on the smaller structure for free.]

Here is the algorithm for diagonalizing a matrix we did in class:

Algorithm 0.1. Let $A \in \mathbb{C}^{n \times n}$ be an $n \times n$-matrix. We want to diagonalize $A$; that is, we want to find an invertible $n \times n$-matrix $S$ and a diagonal $n \times n$-matrix $\Lambda$ such that $A = S \Lambda S^{-1}$. We proceed as follows:

Step 1: We compute the polynomial $\det(A - x I_n)$ (where $x$ is the indeterminate). This polynomial, or the closely related polynomial $\det(x I_n - A)$, is often called the characteristic polynomial of $A$.

Step 2: We find the roots of this polynomial $\det(A - x I_n)$. Let $\lambda_1, \lambda_2, \ldots, \lambda_k$ be these roots without repetitions (e.g., multiple roots are not listed multiple times), in whatever order you like. For example, if $\det(A - x I_n) = (1 - x)^2 (2 - x)$, then you can set $k = 2$, $\lambda_1 = 1$ and $\lambda_2 = 2$, or you can set $k = 2$, $\lambda_1 = 2$ and $\lambda_2 = 1$, but you must not set $k = 3$.

Step 3: For each $j \in \{1, 2, \ldots, k\}$, we find a basis for $\operatorname{Ker}\left(A - \lambda_j I_n\right)$. (Notice that $\operatorname{Ker}\left(A - \lambda_j I_n\right) \neq \{0\}$, because $\lambda_j$ is a root of $\det(A - x I_n)$, and thus $\det\left(A - \lambda_j I_n\right) = 0$; hence, the basis should consist of at least one vector.)

Step 4: Concatenate these bases into one big list $(s_1, s_2, \ldots, s_m)$ of vectors. If $m < n$, then the matrix $A$ cannot be diagonalized, and the algorithm stops here. Otherwise, $m = n$, and we proceed further.

Step 5: Thus, for each $p \in \{1, 2, \ldots, m\}$, the vector $s_p$ belongs to a basis of $\operatorname{Ker}\left(A - \lambda_j I_n\right)$ for some $j \in \{1, 2, \ldots, k\}$. Denote the corresponding $\lambda_j$ by $\mu_p$ (so that $s_p \in \operatorname{Ker}\left(A - \mu_p I_n\right)$). For example, if $s_p$ belongs to a basis of $\operatorname{Ker}(A - 5 I_n)$, then $\mu_p = 5$. Thus, we have defined $m$ numbers $\mu_1, \mu_2, \ldots, \mu_m$.

Step 6: Let $S$ be the $n \times n$-matrix whose columns are $s_1, s_2, \ldots, s_n$. Let $\Lambda$ be the diagonal matrix whose diagonal entries (from top-left to bottom-right) are $\mu_1, \mu_2, \ldots, \mu_n$.

(I called these Steps differently in class: the first four steps were called Steps 1 to 4, while the last two steps were called Part 2. But the above is less confusing.)
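For illustration, here is a rough floating-point transcription of Algorithm 0.1 in Python/NumPy. It is only a sketch, under the assumption that approximate roots and an SVD-based kernel computation are acceptable; as Remark 0.3 (d) below points out, this is not how eigenvalues are found in serious numerical work.

```python
import numpy as np

def diagonalize(A, tol=1e-8):
    """Floating-point sketch of Algorithm 0.1; returns (S, Lam) or None."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]

    # Steps 1-2: roots of the characteristic polynomial, with near-equal
    # roots merged (np.poly(A) gives the coefficients of det(xI - A)).
    roots = np.roots(np.poly(A))
    distinct = []
    for r in roots:
        if not any(abs(r - s) < tol for s in distinct):
            distinct.append(r)

    # Step 3: a basis of Ker(A - lambda_j I) from the small singular values.
    columns, mus = [], []
    for lam in distinct:
        _, sing, Vh = np.linalg.svd(A - lam * np.eye(n))
        k = int(np.sum(sing < tol))          # dim Ker(A - lam I)
        for v in Vh[n - k:].conj():          # Steps 4-5: collect vectors, remember lam
            columns.append(v)
            mus.append(lam)

    if len(columns) < n:                     # Step 4: not enough eigenvectors
        return None
    return np.column_stack(columns), np.diag(mus)   # Step 6: S and Lambda
```

For a matrix this sketch can handle, `S @ Lam @ np.linalg.inv(S)` should reproduce `A` up to rounding error.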

Here is an example that is probably too messy for a midterm, but illustrates some things:

Example 0.2. Let $A$ be a certain $3 \times 3$-matrix (its entries did not survive transcription). Let us diagonalize $A$. We proceed using Algorithm 0.1:

Step 1: We have $n = 3$, and thus $\det(A - x I_n) = \det(A - x I_3)$; expanding this $3 \times 3$ determinant gives
$$\det(A - x I_3) = -x^3 + 8x^2 - 10x - 24.$$

Step 2: Now we must find the roots of this polynomial $\det(A - x I_n) = -x^3 + 8x^2 - 10x - 24$. This is a cubic polynomial, so if it has no rational roots, then finding its roots is quite hopeless (in theory, there is Cardano's formula, but it is so complicated that it is almost useless). Thus, we hope that there is a rational root. To find it, we use the rational root theorem, which says that any rational root of a polynomial with integer coefficients must have the form $p/q$, where $p$ is an integer dividing the constant term and $q$ is a positive integer dividing the leading coefficient. (This is more general than what I quoted in class, and more correct than what I quoted in Section 070.)

In our case, the polynomial $-x^3 + 8x^2 - 10x - 24$ has leading coefficient $-1$ and constant term $-24$. Thus, any rational root must have the form $p/q$, where $p$ is an integer dividing $24$ and $q$ is a positive integer dividing $1$. This leaves $16$ possibilities for $p$ (namely, $p \in \{1, 2, 3, 4, 6, 8, 12, 24, -1, -2, -3, -4, -6, -8, -12, -24\}$) and $1$ possibility for $q$ (namely, $q = 1$). Trying out all of these possibilities, we find that $p = 4$ and $q = 1$ works. Thus, $p/q = 4/1 = 4$ is a root. Hence, we have found one root of our polynomial: namely, $x = 4$. In order to find the others, we divide the polynomial by $x - 4$ (using polynomial long division). We get
$$-x^3 + 8x^2 - 10x - 24 = (x - 4)\left(-x^2 + 4x + 6\right).$$
It thus remains to find the roots of $-x^2 + 4x + 6$. This is a quadratic, so we know how to do this. The roots are $2 + \sqrt{10}$ and $2 - \sqrt{10}$. Thus, altogether, the three roots of $\det(A - x I_n)$ are $4$, $2 + \sqrt{10}$ and $2 - \sqrt{10}$. Let me number them $\lambda_1 = 4$, $\lambda_2 = 2 + \sqrt{10}$ and $\lambda_3 = 2 - \sqrt{10}$ (although you can use any numbering you wish).
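The rational-root search in Step 2 is mechanical and easy to script. A minimal sketch, assuming the coefficients reconstructed above:

```python
from fractions import Fraction

coeffs = [-1, 8, -10, -24]      # det(A - x I_3) = -x^3 + 8x^2 - 10x - 24

def value(x):                   # Horner evaluation with exact arithmetic
    out = Fraction(0)
    for c in coeffs:
        out = out * x + c
    return out

# p must divide the constant term (24 up to sign), q the leading coefficient (1).
candidates = [Fraction(p, 1) for p in range(-24, 25)
              if p != 0 and 24 % abs(p) == 0]
print([x for x in candidates if value(x) == 0])   # [Fraction(4, 1)]
```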

Step 3: Now, we must find a basis of $\operatorname{Ker}\left(A - \lambda_j I_n\right)$ for each $j \in \{1, 2, 3\}$. This is a straightforward exercise in Gaussian elimination, and the only complication is that you have to know how to rationalize a denominator (because $\lambda_2$ and $\lambda_3$ involve square roots). Let me only show the computation for $j = 2$:

Computing $\operatorname{Ker}\left(A - \lambda_2 I_n\right)$: We have $\operatorname{Ker}\left(A - \lambda_2 I_n\right) = \operatorname{Ker}\left(A - \left(2 + \sqrt{10}\right) I_n\right)$. This is the set of all solutions $(x, y, z)$ of a homogeneous system of three linear equations (its coefficients come from the entries of $A - \left(2 + \sqrt{10}\right) I_3$ and did not survive transcription); the pivot entry of its first equation is $3 - \sqrt{10}$.

So let us solve this system. We divide the first equation by $3 - \sqrt{10}$ in order to have a simpler pivot entry. This is tantamount to multiplying it by
$$\frac{1}{3 - \sqrt{10}} = -3 - \sqrt{10}$$
(this was obtained by rationalizing the denominator, and it is absolutely useful here: you don't want to carry nested fractions around!). The first equation then takes the form $x + (\cdots)\,y + (\cdots)\,z = 0$, while the other two equations are unchanged. Now, subtracting appropriate multiples of the first equation from the other two, we eliminate $x$ from them. Next, we divide the (new) second equation by its pivot entry (i.e., multiply it by the rationalized reciprocal of that pivot), so that it becomes $y + (\cdots)\,z = 0$. Then, subtracting an appropriate multiple of it from the third equation turns the third equation into $0 = 0$. Thus, our system takes the form
$$x + (\cdots)\,y + (\cdots)\,z = 0; \qquad y + (\cdots)\,z = 0; \qquad 0 = 0.$$
In this form, it can be solved by back-substitution (unsurprisingly, there is a free variable, because the kernel is nonzero). The solutions form a one-parameter family $r \cdot v$ with $r$ ranging over the reals, where $v$ is a fixed nonzero vector whose entries involve $\sqrt{10}$ and fractions. Thus,
$$\operatorname{Ker}\left(A - \lambda_2 I_n\right) = \operatorname{span}(v).$$
Hence, $(v)$ is a basis of $\operatorname{Ker}\left(A - \lambda_2 I_n\right)$. Of course, you can scale the vector (e.g., by its common denominator) in order to get rid of the denominators.
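Rationalizing the reciprocal of an irrational pivot is exactly the kind of step a computer algebra system can confirm. A tiny SymPy check, taking the pivot to be $3 - \sqrt{10}$ as read above:

```python
import sympy as sp

pivot = 3 - sp.sqrt(10)            # the pivot of the first equation
print(sp.radsimp(1 / pivot))       # -sqrt(10) - 3, the rationalized reciprocal
```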

Similarly, we can find a basis of $\operatorname{Ker}\left(A - \lambda_1 I_n\right)$ and a basis of $\operatorname{Ker}\left(A - \lambda_3 I_n\right)$; each of them again consists of a single vector.

Step 4: Now, we concatenate these three bases into one big list $(s_1, s_2, \ldots, s_m)$ of vectors: $s_1$ comes from the basis of $\operatorname{Ker}\left(A - \lambda_1 I_n\right)$, $s_2$ from the basis of $\operatorname{Ker}\left(A - \lambda_2 I_n\right)$, and $s_3$ from the basis of $\operatorname{Ker}\left(A - \lambda_3 I_n\right)$. Thus, $m = 3$, so that $m = n$, and thus $A$ can be diagonalized.

Step 5: Since $s_1$ belongs to a basis of $\operatorname{Ker}\left(A - \lambda_1 I_n\right)$, we have $\mu_1 = \lambda_1 = 4$. Similarly, $\mu_2 = \lambda_2 = 2 + \sqrt{10}$ and $\mu_3 = \lambda_3 = 2 - \sqrt{10}$.

Step 6: Now, $S$ is the $n \times n$-matrix whose columns are $s_1, s_2, \ldots, s_n$; in our case, $S$ is the $3 \times 3$-matrix with columns $s_1, s_2, s_3$. Furthermore, $\Lambda$ is the diagonal matrix whose diagonal entries (from top-left to bottom-right) are $\mu_1, \mu_2, \ldots, \mu_n$. In other words,
$$\Lambda = \begin{pmatrix} 4 & 0 & 0 \\ 0 & 2 + \sqrt{10} & 0 \\ 0 & 0 & 2 - \sqrt{10} \end{pmatrix}.$$
These are the $S$ and $\Lambda$ we were seeking. (With some patience, you could check that $A = S \Lambda S^{-1}$, although it's not necessary to check it.)
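As a quick consistency check on Steps 1, 2 and 5, the three numbers $\mu_1, \mu_2, \mu_3$ must be exactly the roots of the characteristic polynomial. NumPy's `poly` rebuilds the monic polynomial from its roots:

```python
import numpy as np

mus = [4, 2 + np.sqrt(10), 2 - np.sqrt(10)]
# The monic polynomial with these roots is x^3 - 8x^2 + 10x + 24,
# i.e. -det(A - x I_3) from Step 1.
print(np.poly(mus))   # approximately [ 1. -8. 10. 24.]
```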

Remark 0.3. (a) Algorithm 0.1 relies on some nontrivial theorems (for example, Lemma 8.3 in Olver/Shakiban). See Section 8.3 of Olver/Shakiban for a complete treatment. Chapter 7 of Lankham/Nachtergaele/Schilling comes close, whereas Chapter Five.IV of Hefferon is probably overkill.

(b) What can we do if $A$ is not diagonalizable? The next best thing is the Jordan normal form (or Jordan canonical form); see Section 8.6 of Olver/Shakiban.

(c) In Step 4 of Algorithm 0.1, we may sometimes notice that $A$ is not diagonalizable (since $m < n$). Is there a way to notice this earlier, thus saving ourselves some useless work? Yes. For each $j \in \{1, 2, \ldots, k\}$, let $\alpha_j$ be the multiplicity of the root $\lambda_j$ of the polynomial $\det(A - x I_n)$. For example, if $\det(A - x I_n) = (x - 6)^3 (x + 1)$ and $\lambda_1 = 6$, then $\alpha_1 = 3$, because the root $\lambda_1 = 6$ has multiplicity $3$. In Step 3, when computing $\operatorname{Ker}\left(A - \lambda_j I_n\right)$, the dimension $\dim \operatorname{Ker}\left(A - \lambda_j I_n\right)$ will be either $= \alpha_j$ or $< \alpha_j$. If it is $< \alpha_j$, then the algorithm is doomed to failure (i.e., you will get $m < n$ in Step 4), and $A$ is not diagonalizable. This can save you some work.

(d) Algorithm 0.1 is more of a theoretical result than an actual workable algorithm; the difficulty of finding exact roots of polynomials, and the instability of Gaussian elimination for non-exact matrices, makes it rather useless. However, for $2 \times 2$-matrices it works fine (you can solve quadratics), and it also works nicely for various kinds of matrices of nice forms (e.g., certain simply patterned $n \times n$-matrices can be diagonalized this way for every $n$; try it). Practical algorithms for numerical computation are a completely different story. Section 10.6 of Olver/Shakiban tells the beginnings of the story (namely, how to find eigenvalues, and get something close to diagonalization). As with Gaussian elimination, it is wrong to expect diagonalization to work with approximate matrices, because $S$ and $\Lambda$ can jump wildly when $A$ is changed only a little bit; however, certain things can be done that come close to diagonalization.

(e) There is a theorem (called the spectral theorem) saying that if $A$ is a symmetric matrix with real entries, then $A$ is always diagonalizable over the reals (i.e., we can find $S$ and $\Lambda$ with real entries), and moreover, you can find an $S$ that is orthogonal (i.e., the columns of $S$ are orthonormal). This is a hugely important fact in applications (it is related to the SVD, among many other things), but we will not have the time for it in class. Let me just mention that finding an orthogonal $S$ requires only a simple fix to Algorithm 0.1: In Step 3, you have to choose an orthonormal basis of $\operatorname{Ker}\left(A - \lambda_j I_n\right)$ (not just some basis). Then, in Step 4, the big list $(s_1, s_2, \ldots, s_m)$ will automatically be an orthonormal basis of $\mathbb{R}^n$. This is one of the miracles of symmetric matrices. See Section 8.4 in Olver/Shakiban for a proof and more details.
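In the numerical world, the spectral theorem of part (e) corresponds to `numpy.linalg.eigh`, which returns real eigenvalues and orthonormal eigenvectors for a real symmetric input. A short illustration with a randomly generated symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(4, 4))
A = (M + M.T) / 2                       # a random real symmetric matrix

w, S = np.linalg.eigh(A)                # eigenvalues and orthonormal eigenvectors
Lam = np.diag(w)
print(np.allclose(A, S @ Lam @ S.T))    # True: A = S Lam S^T with S orthogonal
print(np.allclose(S.T @ S, np.eye(4)))  # True: the columns of S are orthonormal
```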

Exercise 2. (a) Diagonalize the first given $2 \times 2$-matrix $A$. [5 points]
(b) Diagonalize the second given $2 \times 2$-matrix $A$ (a diagonal matrix). [5 points]
(c) Diagonalize the given $3 \times 3$-matrix $A$. [10 points]
(The concrete matrices did not survive transcription; the data needed for the solutions is recorded below.)

Solution to Exercise 2. We proceed by using Algorithm 0.1. (You have seen this often enough that only the results and the key intermediate steps are recorded here.)

(a) We have $\det(A - x I_2) = x^2 - 5x$. The roots of this polynomial (i.e., the eigenvalues of $A$) are clearly $0$ and $5$. We number them as $\lambda_1 = 0$ and $\lambda_2 = 5$. We now must find bases for $\operatorname{Ker}(A - \lambda_1 I_2)$ and $\operatorname{Ker}(A - \lambda_2 I_2)$. We can do this using the standard Gaussian elimination procedure (you can also see the result directly if you are sufficiently astute), obtaining a one-vector basis $(s_1)$ for $\operatorname{Ker}(A - \lambda_1 I_2)$ and a one-vector basis $(s_2)$ for $\operatorname{Ker}(A - \lambda_2 I_2)$. The big list is therefore $(s_1, s_2)$. This has size $2$, which is our $n$; hence, the matrix $A$ can be diagonalized. We have $\mu_1 = \lambda_1 = 0$ and $\mu_2 = \lambda_2 = 5$. Therefore, $S$ is the $2 \times 2$-matrix with columns $s_1, s_2$, and
$$\Lambda = \begin{pmatrix} 0 & 0 \\ 0 & 5 \end{pmatrix}.$$

(b) One way to solve this is by proceeding exactly as in part (a). Another is to observe that our matrix $A$ is already diagonal, so we can diagonalize it by simply taking $S = I_2$ and $\Lambda = A$.

(c) Again, the method is the same as for part (a), but this time we have to solve the cubic equation $x^3 - 3x^2 + 2x = 0$. This is done as follows: The root $x = 0$ is obvious. Leaving this root aside, we can find the other two roots by solving $x^2 - 3x + 2 = 0$; this can be done using the standard formula for the roots of a quadratic (the roots are $1$ and $2$). The eigenvalues of $A$ are thus $0$, $1$ and $2$; the matrix $S$ has the corresponding kernel basis vectors as its columns, and
$$\Lambda = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 2 \end{pmatrix}.$$
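The eigenvalue computations in this exercise reduce to one quadratic and one cubic; both are easy to confirm numerically, for instance:

```python
import numpy as np

print(np.roots([1, -5, 0]))      # roots of x^2 - 5x: 5 and 0 (part (a))
print(np.roots([1, -3, 2, 0]))   # roots of x^3 - 3x^2 + 2x: 2, 1 and 0 (part (c))
```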

Exercise 3. Define a sequence $g_0, g_1, g_2, \ldots$ of integers by
$$g_0 = 0, \qquad g_1 = 1, \qquad g_{n+1} = 3 g_n + g_{n-1} \text{ for all } n \geq 1.$$
(This is similar to the Fibonacci sequence.) Here is a partial table of values:

k   : 0   1   2   3    4    5     6     7      8
g_k : 0   1   3   10   33   109   360   1189   3927

(a) What are $g_9$ and $g_{10}$? [2 points]

(b) Define a $2 \times 2$-matrix $A$ by $A = \begin{pmatrix} 3 & 1 \\ 1 & 0 \end{pmatrix}$. Find $A^2$ and $A^3$. [2 points]

(c) Prove that
$$A^n = \begin{pmatrix} g_{n+1} & g_n \\ g_n & g_{n-1} \end{pmatrix} \quad \text{for all } n \geq 1. \qquad (5)$$
(The proof, or at least the easiest proof, is by induction over $n$: In the induction base, you should check that (5) holds for $n = 1$. In the induction step, you assume that (5) holds for $n = m$ for a given positive integer $m$, and then you have to check that (5) also holds for $n = m + 1$. Use the fact that $A^{m+1} = A A^m = \begin{pmatrix} 3 & 1 \\ 1 & 0 \end{pmatrix} A^m$.) [10 points]

(d) Diagonalize $A$. [10 points]

(e) Use this to obtain an explicit formula for $g_n$. (The formula will involve square roots and $n$-th powers of numbers, but no recursion and no matrices.) [10 points]
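Before turning to the solution, note that the sequence is easy to tabulate; a few lines of Python (using the recurrence as stated above) reproduce the table and already answer part (a):

```python
g = [0, 1]                         # g_0 = 0, g_1 = 1
while len(g) <= 10:
    g.append(3 * g[-1] + g[-2])    # g_{n+1} = 3 g_n + g_{n-1}
print(g)   # [0, 1, 3, 10, 33, 109, 360, 1189, 3927, 12970, 42837]
```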

Solution to Exercise 3. (a) We have $g_9 = 3 g_8 + g_7 = 3 \cdot 3927 + 1189 = 12970$ and $g_{10} = 3 g_9 + g_8 = 3 \cdot 12970 + 3927 = 42837$.

(b) We have $A^2 = \begin{pmatrix} 10 & 3 \\ 3 & 1 \end{pmatrix}$ and $A^3 = \begin{pmatrix} 33 & 10 \\ 10 & 3 \end{pmatrix}$. (This is, of course, a particular case of (5).)

(c) We mimic the proof of Proposition 4.8 in the lecture notes. We shall prove (5) by induction over $n$:

Induction base: We have $A^1 = A = \begin{pmatrix} 3 & 1 \\ 1 & 0 \end{pmatrix}$. Comparing this with
$$\begin{pmatrix} g_{1+1} & g_1 \\ g_1 & g_0 \end{pmatrix} = \begin{pmatrix} 3 & 1 \\ 1 & 0 \end{pmatrix} \quad \text{(since } g_{1+1} = g_2 = 3,\ g_1 = 1 \text{ and } g_0 = 0\text{)},$$
we obtain $A^1 = \begin{pmatrix} g_{1+1} & g_1 \\ g_1 & g_0 \end{pmatrix}$. In other words, (5) holds for $n = 1$. This completes the induction base.

Induction step: Let $N$ be a positive integer. (I am calling it $N$ rather than $m$ here, in order to stay closer to the proof of Proposition 4.8 in the lecture notes.) Assume that (5) holds for $n = N$. We must show that (5) also holds for $n = N + 1$.

The definition of the sequence $g_0, g_1, g_2, \ldots$ shows that $g_{N+2} = 3 g_{N+1} + g_N$ and $g_{N+1} = 3 g_N + g_{N-1}$.

We have assumed that (5) holds for $n = N$. In other words, $A^N = \begin{pmatrix} g_{N+1} & g_N \\ g_N & g_{N-1} \end{pmatrix}$.

Now,
$$A^{N+1} = A^N A = \begin{pmatrix} g_{N+1} & g_N \\ g_N & g_{N-1} \end{pmatrix} \begin{pmatrix} 3 & 1 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} g_{N+1} \cdot 3 + g_N \cdot 1 & g_{N+1} \cdot 1 + g_N \cdot 0 \\ g_N \cdot 3 + g_{N-1} \cdot 1 & g_N \cdot 1 + g_{N-1} \cdot 0 \end{pmatrix}$$
(by the definition of a product of two matrices)
$$= \begin{pmatrix} 3 g_{N+1} + g_N & g_{N+1} \\ 3 g_N + g_{N-1} & g_N \end{pmatrix} = \begin{pmatrix} g_{N+2} & g_{N+1} \\ g_{N+1} & g_N \end{pmatrix}$$
(since $3 g_{N+1} + g_N = g_{N+2}$ and $3 g_N + g_{N-1} = g_{N+1}$). In other words, (5) holds for $n = N + 1$. This completes the induction step; hence, (5) is proven.
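The identity (5) just proven can be spot-checked by machine for small exponents (an illustration only, not a replacement for the induction):

```python
import numpy as np

A = np.array([[3, 1],
              [1, 0]])
g = [0, 1]
for _ in range(12):
    g.append(3 * g[-1] + g[-2])

for n in range(1, 11):      # check (5) for n = 1, ..., 10
    expected = np.array([[g[n + 1], g[n]],
                         [g[n], g[n - 1]]])
    assert (np.linalg.matrix_power(A, n) == expected).all()
print("formula (5) verified for n = 1, ..., 10")
```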

(d) We have $A = S \Lambda S^{-1}$, where
$$S = \begin{pmatrix} \dfrac{3 + \sqrt{13}}{2} & \dfrac{3 - \sqrt{13}}{2} \\ 1 & 1 \end{pmatrix} \quad \text{and} \quad \Lambda = \begin{pmatrix} \dfrac{3 + \sqrt{13}}{2} & 0 \\ 0 & \dfrac{3 - \sqrt{13}}{2} \end{pmatrix}.$$
This can be found using Algorithm 0.1 again: the characteristic polynomial is $x^2 - 3x - 1$ up to sign, with roots $\dfrac{3 \pm \sqrt{13}}{2}$.

(e) Fix $n \in \mathbb{N}$. Let $S$ and $\Lambda$ be as in the solution to part (d). Then,
$$\Lambda^n = \begin{pmatrix} \left(\dfrac{3 + \sqrt{13}}{2}\right)^n & 0 \\ 0 & \left(\dfrac{3 - \sqrt{13}}{2}\right)^n \end{pmatrix}$$
(because in order to raise a diagonal matrix to the $n$-th power, it suffices to raise each diagonal entry to the $n$-th power). Now, from $A = S \Lambda S^{-1}$, we obtain $A^n = S \Lambda^n S^{-1}$ (as shown in class), where $S$ and $\Lambda^n$ are as above and
$$S^{-1} = \frac{1}{\sqrt{13}} \begin{pmatrix} 1 & -\dfrac{3 - \sqrt{13}}{2} \\ -1 & \dfrac{3 + \sqrt{13}}{2} \end{pmatrix}.$$
If we multiply out this product, we obtain explicit formulas for each of the four entries of $A^n$. In particular, we obtain the following formula for its $(1, 2)$-th entry:
$$\left(A^n\right)_{1,2} = \frac{1}{\sqrt{13}} \left( \left(\frac{3 + \sqrt{13}}{2}\right)^n - \left(\frac{3 - \sqrt{13}}{2}\right)^n \right).$$
But (5) shows that $\left(A^n\right)_{1,2} = g_n$. Hence,
$$g_n = \left(A^n\right)_{1,2} = \frac{1}{\sqrt{13}} \left( \left(\frac{3 + \sqrt{13}}{2}\right)^n - \left(\frac{3 - \sqrt{13}}{2}\right)^n \right).$$
This is the formula we are looking for. (It is, of course, similar to the Binet formula for the Fibonacci numbers.)
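The closed formula can likewise be compared against the recurrence in floating point (rounding to the nearest integer, since the square roots are only computed approximately):

```python
import numpy as np

def g_closed(n):
    """The closed-form expression from part (e)."""
    s = np.sqrt(13.0)
    return (((3 + s) / 2) ** n - ((3 - s) / 2) ** n) / s

g = [0, 1]
for _ in range(15):
    g.append(3 * g[-1] + g[-2])
print(all(round(g_closed(n)) == g[n] for n in range(15)))   # True
```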

Exercise 4. Let $A$ be an $n \times n$-matrix. Assume that $A$ can be diagonalized, with $A = S \Lambda S^{-1}$ for an invertible $n \times n$-matrix $S$ and a diagonal $n \times n$-matrix $\Lambda$.

(a) Diagonalize $A^2$. [5 points]

(b) Diagonalize $A^{-1}$, if $A$ is invertible. (You can use the fact that for an invertible $A$, the diagonal entries of $\Lambda$ are nonzero, and so $\Lambda^{-1}$ is a diagonal matrix again.) [5 points]

(c) Diagonalize $A^T$ (the transpose of $A$). [10 points]

(The answers should be in terms of $S$ and $\Lambda$. For example, $A + I_n$ can be diagonalized as follows: $A + I_n = S \left(\Lambda + I_n\right) S^{-1}$. Indeed, $S$ is an invertible matrix, $\Lambda + I_n$ is a diagonal matrix (being the sum of the two diagonal matrices $\Lambda$ and $I_n$), and we have
$$S \left(\Lambda + I_n\right) S^{-1} = \underbrace{S \Lambda S^{-1}}_{= A} + \underbrace{S I_n S^{-1}}_{= I_n} = A + I_n.)$$

Solution to Exercise 4. (a) We have $A = S \Lambda S^{-1}$, and thus
$$A^2 = \left(S \Lambda S^{-1}\right)\left(S \Lambda S^{-1}\right) = S \Lambda \underbrace{S^{-1} S}_{= I_n} \Lambda S^{-1} = S \underbrace{\Lambda \Lambda}_{= \Lambda^2} S^{-1} = S \Lambda^2 S^{-1}.$$
The matrix $\Lambda$ is diagonal. Thus, any power of $\Lambda$ is diagonal as well (because in order to raise a diagonal matrix $\Lambda$ to some power, we merely need to raise its diagonal entries to this power). In particular, $\Lambda^2$ is diagonal. Therefore, the equality $A^2 = S \Lambda^2 S^{-1}$ provides a diagonalization of $A^2$.

(b) Assume that $A$ is invertible. Then, it is not hard to see that all diagonal entries of $\Lambda$ are nonzero. (Proof: Assume the contrary. Then, at least one diagonal entry of $\Lambda$ is zero. But the matrix $\Lambda$ is diagonal, and thus upper-triangular. Hence, the determinant of $\Lambda$ equals the product of its diagonal entries, and therefore equals $0$ (since at least one diagonal entry is $0$, and therefore the whole product must be $0$). In other words, $\det \Lambda = 0$. Now, from $A = S \Lambda S^{-1}$, we obtain $\det A = \det\left(S \Lambda S^{-1}\right) = \det S \cdot \underbrace{\det \Lambda}_{= 0} \cdot \det\left(S^{-1}\right) = 0$. Hence, $A$ is not invertible (since a square matrix with determinant $0$ is not invertible). This contradicts the fact that $A$ is invertible. This contradiction shows that our assumption was false, qed. This was not the easiest or most elementary proof, but the shortest one.) Therefore, the diagonal matrix $\Lambda$ is invertible, and its inverse $\Lambda^{-1}$ is obtained by inverting all diagonal entries of $\Lambda$. In particular, $\Lambda^{-1}$ is a diagonal matrix as well. (You can use this fact without proof, but it is helpful to know how it is proven.)

Recall that $(UV)^{-1} = V^{-1} U^{-1}$ for any two invertible $n \times n$-matrices $U$ and $V$. Applying this to $U = S \Lambda$ and $V = S^{-1}$, we find
$$\left(S \Lambda S^{-1}\right)^{-1} = \left(S^{-1}\right)^{-1} (S \Lambda)^{-1} = S \left(\Lambda^{-1} S^{-1}\right) = S \Lambda^{-1} S^{-1}.$$
We have $A = S \Lambda S^{-1}$, and thus $A^{-1} = \left(S \Lambda S^{-1}\right)^{-1} = S \Lambda^{-1} S^{-1}$. This equality provides a diagonalization of $A^{-1}$ (since the matrix $\Lambda^{-1}$ is diagonal).

(c) Proposition 3.8 (f) in the lecture notes (applied to $S$ instead of $A$) shows that the matrix $S^T$ is invertible, and its inverse is $\left(S^T\right)^{-1} = \left(S^{-1}\right)^T$. Proposition 3.8 (e) in the lecture notes shows that any two matrices $U$ and $V$ satisfy $(UV)^T = V^T U^T$ (as long as the product $UV$ is well-defined, i.e., the number of columns of $U$ equals the number of rows of $V$). Applying this to $U = S \Lambda$ and $V = S^{-1}$, we obtain
$$\left(S \Lambda S^{-1}\right)^T = \left(S^{-1}\right)^T (S \Lambda)^T = \left(S^{-1}\right)^T \Lambda^T S^T.$$
The matrix $\Lambda$ is diagonal. Hence, $\Lambda^T = \Lambda$ (because transposing a diagonal matrix does not change it). Now, from $A = S \Lambda S^{-1}$, we obtain
$$A^T = \left(S \Lambda S^{-1}\right)^T = \left(S^{-1}\right)^T \Lambda^T S^T = \left(S^{-1}\right)^T \Lambda S^T = \left(S^{-1}\right)^T \Lambda \left(\left(S^{-1}\right)^T\right)^{-1}$$
(the last step because $S^T = \left(\left(S^{-1}\right)^T\right)^{-1}$). This equality provides a diagonalization of $A^T$ (since the matrix $\Lambda$ is diagonal and the matrix $\left(S^{-1}\right)^T$ is invertible).
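Finally, all three answers of Exercise 4 admit a quick numerical spot check on a randomly generated diagonalizable matrix (illustrative only; the matrices below are made up for the test):

```python
import numpy as np

rng = np.random.default_rng(2)
S = rng.normal(size=(3, 3))              # random S, almost surely invertible
Lam = np.diag([2.0, -1.0, 0.5])          # diagonal with nonzero entries
A = S @ Lam @ np.linalg.inv(S)
S_inv = np.linalg.inv(S)

print(np.allclose(A @ A, S @ Lam @ Lam @ S_inv))                       # part (a)
print(np.allclose(np.linalg.inv(A), S @ np.linalg.inv(Lam) @ S_inv))   # part (b)
print(np.allclose(A.T, S_inv.T @ Lam @ S.T))                           # part (c)
```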
