HW # 2 Solutions. The Ubiquitous Curling and Hockey NBC Television Schedule. March 4, 2010
Hi everyone. NBC here. I just got done airing another 47.2 hours of either curling or hockey. We understand that Corey likes both of those sports, but we both understand that he also likes basically every other olympic sport, and we simultaneously refuse to air anything other than curling and hockey. Even on the internet. Enjoy these solutions!

Again, Corey is posting those solutions which he grades or finds interesting and nontrivial enough to post. He wouldn't post stuff he specifically covered already in class. Have fun!

(a) By definition, $W = \operatorname{span}\{v, T(v), T^2(v), \ldots\}$. Let $w \in W$. Then for some scalars $a_i$, we have $w = \sum_i a_i T^i(v)$, where by convention we write $T^0(v) = v$. Then $T(w) = \sum_i a_i T^{i+1}(v) \in \operatorname{span}\{v, T(v), T^2(v), \ldots\}$. So it follows that $W$ is $T$-invariant.

(b) Let $Z$ be a $T$-invariant subspace, and suppose $v \in Z$. We must show that $W \subseteq Z$, where $W$ is defined above (in part (a)). This can be accomplished by showing that each element of the spanning set $\{v, T(v), T^2(v), \ldots\}$ of $W$ is an element of $Z$. But $Z$ is $T$-invariant, so $T(v) \in Z$, and by repeating the argument, one has (inductively) $T^i(v) \in Z$ for all $i$. Thus $\{v, T(v), T^2(v), \ldots\} \subseteq Z$, so $W \subseteq Z$.

The proof follows very closely that of Theorem 5.5, page 261. Proceeding inductively, we see that if $v_1 \in W$ (that's the sum of the first one element of the set $\{v_1, \ldots, v_k\}$), then $v_1 \in W$ (that's each of the summands from the sum above). Now suppose that the sum of the first $k-1$ elements from the set $\{v_1, \ldots, v_k\}$ being in $W$ is equivalent to the assertion $v_i \in W$ for $i = 1, \ldots, k-1$. Now consider $v_1 + \cdots + v_k \in W$. Clearly, if each $v_i \in W$, then this sum is in $W$. Conversely, applying $T - \lambda_k I$, where $v_i \in E_{\lambda_i}$ for $i = 1, \ldots, k$, we have
$$(T - \lambda_k I)(v_1 + \cdots + v_k) = (\lambda_1 - \lambda_k)v_1 + \cdots + (\lambda_{k-1} - \lambda_k)v_{k-1} \in W,$$
so by the induction hypothesis, each $(\lambda_i - \lambda_k)v_i \in W$, and since the $\lambda_i$ are distinct, $v_i \in W$ for $i = 1, \ldots, k-1$.
Now, $v = v_1 + \cdots + v_{k-1} + v_k \in W$ means that $v_k = v - v_1 - \cdots - v_{k-1} \in W$.

We first show that if $W$ is a $T$-invariant subspace, and $W \subseteq V = \bigoplus_\lambda E_\lambda$, then $W = \bigoplus_\lambda (E_\lambda \cap W)$. Let $W_\lambda = E_\lambda \cap W$. First we show that $\sum_\lambda W_\lambda = W$. Let $w \in \sum_\lambda W_\lambda$. Then $w = \sum_\lambda w_\lambda$, where the sum is taken over all distinct eigenvalues $\lambda$, and $w_\lambda \in W_\lambda = E_\lambda \cap W$. So each $w_\lambda \in W$, and so $w \in W$. Conversely, suppose $w \in W \subseteq V = \bigoplus_\lambda E_\lambda$. Then $w$ must be a sum of eigenvectors corresponding to distinct eigenvalues, $w = \sum_\lambda w_\lambda$ with $w_\lambda \in E_\lambda$. By the above problem, each $w_\lambda \in W$. So $w_\lambda \in W_\lambda$, and $w \in \sum_\lambda W_\lambda$. Now we show that $W_\lambda \cap \big(\sum_{\eta \neq \lambda} W_\eta\big) = \{0\}$. If $\sum_{\eta \neq \lambda} w_\eta \in W_\lambda$, then using the $T$-invariant subspace $W_\lambda$ and the problem above, each $w_\eta \in W_\lambda$, which could only be the case if $w_\eta = 0$, since $E_\eta \cap E_\lambda = \{0\}$ when $\eta \neq \lambda$. So $W = \bigoplus_\lambda W_\lambda$. Now we can find a basis $\beta_\lambda$ for each $W_\lambda$ and proceed. The space $W_\lambda$ is $T$-invariant, and $T|_{W_\lambda} = \lambda I$. So the basis $\beta_\lambda$ diagonalizes $T|_{W_\lambda}$, and the union $\beta = \bigcup_\lambda \beta_\lambda$ over all eigenvalues $\lambda$ of $T$ yields a basis for $W = \bigoplus_\lambda W_\lambda$ that diagonalizes $T|_W$.

(a) We suppose that $T$ and $U$ are diagonalizable operators, and that $UT = TU$. We show that there exists one basis $\beta$ for $V$ for which both $[U]_\beta$ and $[T]_\beta$ are diagonal. We note that $V = \bigoplus_\lambda E_\lambda$, where the $E_\lambda$ are the eigenspaces of $T$, and the direct sum is taken over all eigenvalues $\lambda$ of $T$. We also note that if, for each eigenvalue $\lambda$ of $T$, a basis $\beta_\lambda$ can be found which simultaneously diagonalizes both $T$ and $U$ on $E_\lambda$, then $\beta = \bigcup_\lambda \beta_\lambda$ is a desired basis for $V$ that diagonalizes both $T$ and $U$ on $V$, where the union is taken over all eigenvalues $\lambda$ of $T$. So we set out to find a basis $\beta_\lambda$ for $E_\lambda$ diagonalizing both $T$ and $U$. First we show that $E_\lambda$ is $U$-invariant. After that, we observe that the restriction $U_\lambda$ of $U$ to $E_\lambda$ is diagonalizable, since it is the restriction of a diagonalizable operator to an invariant subspace (see #24). Last, we'll notice that every basis of $E_\lambda$ diagonalizes $T$ on $E_\lambda$, since $T_\lambda$, the restriction of $T$ to $E_\lambda$, is a scalar multiple of the identity: $T_\lambda = \lambda I$. Thus a desired basis $\beta_\lambda$ for $E_\lambda$ is one which diagonalizes $U_\lambda$, since such a basis would necessarily also diagonalize $T_\lambda = \lambda I$. So showing that $E_\lambda$ is $U$-invariant is our last duty. Suppose $v \in E_\lambda$. We must show that $Uv \in E_\lambda$. To check this, we have to verify that $T(Uv) = \lambda(Uv)$.
But by our commutativity hypothesis, $T(Uv) = (TU)v = (UT)v = U(\lambda v) = \lambda(Uv)$.

The proof that this is an inner product is actually independent of $A$, provided that $A$ satisfies a condition known as being Hermitian, which for the most part means that $A = A^*$ (here, $A^*$ is the adjoint of the matrix $A$). The first part follows from direct observation about the algebra of appropriately sized matrices: $\langle x + y, z\rangle = (x + y)Az^* = xAz^* + yAz^* = \langle x, z\rangle + \langle y, z\rangle$. The next property also follows quite easily: $\langle cx, z\rangle = (cx)Az^* = c(xAz^*) = c\langle x, z\rangle$. The third property follows since $A = A^*$; taking the conjugate of the scalar $\langle x, z\rangle$,
$$\overline{\langle x, z\rangle} = (xAz^*)^* = zA^*x^* = zAx^* = \langle z, x\rangle.$$
Finally, we do the multiplication, with $i = \sqrt{-1}$, $v_1 = a + bi$ and $v_2 = c + di$, so that $\bar v_2 = c - di$:
$$\begin{pmatrix} v_1 & v_2 \end{pmatrix}\begin{pmatrix} 1 & i \\ -i & 2 \end{pmatrix}\begin{pmatrix} \bar v_1 \\ \bar v_2 \end{pmatrix} = |v_1|^2 + 2|v_2|^2 + i(v_1\bar v_2 - \bar v_1 v_2) = a^2 + b^2 + 2c^2 + 2d^2 + 2ad - 2bc = d^2 + c^2 + (a + d)^2 + (b - c)^2 > 0$$
when either of $v_1$ or $v_2$ is nonzero.

We compute by hand, using a different dummy variable in the second slot. Keep in mind that the hypotheses state that $\{v_1, \ldots, v_n\}$ is an orthogonal set, so that $\langle v_i, v_j\rangle = 0$ if $i \neq j$:
$$\Big\|\sum_i a_i v_i\Big\|^2 = \Big\langle \sum_i a_i v_i, \sum_j a_j v_j \Big\rangle = \sum_{i,j} a_i \bar a_j \langle v_i, v_j\rangle = \sum_i a_i \bar a_i \langle v_i, v_i\rangle = \sum_i |a_i|^2 \|v_i\|^2.$$

Here are some thoughts on this problem, as it's sort of a big deal. Here's a proof:

Proof. If every operator in $\mathcal{C}$ has only one eigenvalue, then since they're all diagonalizable, each one is represented by a scalar matrix for any choice of basis. (That means I couldn't screw up each of them being diagonal if I tried. This is a big point.) Namely, they're simultaneously diagonalizable. Notice also that if the dimension of the vector space is 1, then the same statement is true: any basis diagonalizes every element of $\mathcal{C}$. We induct on the dimension of $V$, and the base case has already been verified. Suppose that on any vector space of dimension less than $n$, whenever we have a family of commuting diagonalizable operators on such a vector space, they are simultaneously diagonalizable. Since the result is true for any vector space $V$ of dimension $n$ where every operator has only one eigenvalue, we assume that there is at least one operator $T$ with more than one eigenvalue. Suppose $\lambda$ is an eigenvalue of $T$, and that $\Lambda$ is the set of eigenvalues of $T$. Note that since $|\Lambda| > 1$ by hypothesis, we have $1 \leq \dim E_\lambda < n$. Since $T$ is diagonalizable, we have $V = \bigoplus_{\eta \in \Lambda} E_\eta$, and the vector space $\bigoplus_{\eta \in \Lambda, \eta \neq \lambda} E_\eta$ has dimension
$$\dim\Big(\bigoplus_{\eta \in \Lambda, \eta \neq \lambda} E_\eta\Big) = n - \dim(E_\lambda) \in \{1, \ldots, n-1\}.$$
Namely, if there were a commuting collection of diagonalizable linear operators for each of these spaces, the induction hypothesis would then apply to the vector spaces $E_\lambda$ and $W := \bigoplus_{\eta \in \Lambda, \eta \neq \lambda} E_\eta$. To do this, we must find such a collection $\mathcal{C}_\lambda$ for $E_\lambda$ and $\mathcal{C}_W$ for $W$. We establish this presently. Suppose $U \in \mathcal{C}$ is an arbitrary element of $\mathcal{C}$. We show that $E_\lambda$ and $W$ are both $U$-invariant. We begin by noting that if $\eta$ is any eigenvalue of $T$, then $E_\eta$ is $U$-invariant. This fact follows precisely as in the last part of the argument in number (a). So $E_\lambda$ is $U$-invariant ($\eta = \lambda$). For every eigenvalue $\eta \neq \lambda$, $U : E_\eta \to E_\eta$, and thus $U : W \to W$ since $W$ is the direct sum of eigenspaces. So $E_\lambda$ and $W$ are $U$-invariant. If $U \in \mathcal{C}$, then its restrictions $U_\lambda$ to $E_\lambda$ and $U_W$ to $W$ are both still diagonalizable. Furthermore, if we set $\mathcal{C}_\lambda = \{U_\lambda \mid U \in \mathcal{C}\}$ and $\mathcal{C}_W = \{U_W \mid U \in \mathcal{C}\}$, then $\mathcal{C}_\lambda$ and $\mathcal{C}_W$ are each sets of commuting diagonalizable operators. So the induction hypothesis applies to both, and we have bases $\beta_\lambda$ of $E_\lambda$ and $\beta_W$ of $W$ which simultaneously diagonalize the operators in $\mathcal{C}_\lambda$ and $\mathcal{C}_W$ respectively. Since $V = E_\lambda \oplus W$, the collection $\beta = \beta_\lambda \cup \beta_W$ is a basis for $V$. Since $U = U_\lambda \oplus U_W$, each of the operators remains simultaneously diagonalized.

So, here's the deal. With most induction proofs I feel as though if you gave me $n$, and enough time, the proof is instructive enough to actually produce what is being asked for. You know, the idea that you could do one step, put your work aside and focus on what's left, then repeat until you find yourself finished and yearning to see something other than curling or hockey on TV. In this case, it didn't seem as though I could do that until I gave it some thought. So, here's a description of what I might do if given a collection $\mathcal{C}$ on a vector space of dimension $n$. I would first choose any operator not already diagonal (i.e., choose any basis, and see if any operators are not scalar matrices). If every one of them is, then I'm done! Back to watching curling and/or hockey on NBC.
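The two-operator version of this plan (restrict $U$ to each eigenspace of $T$, diagonalize the restriction there, and glue the bases) is concrete enough to run on a computer. Here's a minimal NumPy sketch, assuming NumPy is on hand; the matrices $P$, $T$, $U$ are made-up examples (not from the problem set), chosen so that $T$ has a repeated eigenvalue:

```python
import numpy as np

# Two commuting diagonalizable operators: conjugate diagonal matrices by the
# same invertible P, so they share an eigenbasis without it being obvious.
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
Pinv = np.linalg.inv(P)
T = P @ np.diag([2.0, 2.0, 5.0]) @ Pinv   # eigenvalue 2 has a 2-dim eigenspace
U = P @ np.diag([1.0, 3.0, 3.0]) @ Pinv
assert np.allclose(T @ U, U @ T)          # the commutativity hypothesis

basis_cols = []
for lam in (2.0, 5.0):                    # the eigenvalues of T
    # Orthonormal basis of E_lam = null(T - lam*I), via the SVD.
    _, s, Vt = np.linalg.svd(T - lam * np.eye(3))
    E = Vt[s < 1e-10].T
    # E_lam is U-invariant, so U restricted to E_lam, in this basis, is:
    U_restr = E.T @ U @ E
    _, Q = np.linalg.eig(U_restr)         # diagonalize the restriction
    basis_cols.append(E @ Q)              # eigenvectors of U inside E_lam
B = np.hstack(basis_cols)

# In the basis B, both operators are diagonal.
for M in (T, U):
    D = np.linalg.inv(B) @ M @ B
    assert np.allclose(D, np.diag(np.diag(D)))
```

Every column of $B$ ends up an eigenvector of both $T$ and $U$, which is exactly what the proof of (a) promises.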
If not, then choose any one which is not. This means that it must have more than one eigenvalue. Choose any one of its eigenvalues $\lambda$ and get to work. Everything we accomplish below (i.e., finding a suitable basis for $E_\lambda$) can then be applied to each eigenvalue and eigenspace, and we'll be done. Suppose for just a moment that $\mathcal{C}$ contained 3 operators: $T$, $U$ and $\tilde U$. Number 25 shows that we may find a basis for $E_\lambda$ that diagonalizes $U$, since $E_\lambda$ is $U$-invariant. But what about $\tilde U$? Like before, $E_\lambda$ is $\tilde U$-invariant. Here's where thinking about it too much leads one to not many conclusions, but remembering something simple helps a lot. Rather than formulae and proofs (since I've already given a valid proof above), let me speak a little less formally about this situation, and you'll see that you could indeed diagonalize all of $T$, $U$, and $\tilde U$ on $E_\lambda$. Suppose $\{v_1, \ldots, v_k\}$ is a basis for $E_\lambda$ diagonalizing both $T$ and $U$, say with $Uv_i = \lambda_i v_i$. What if one of the $v_i$ is not an eigenvector for $\tilde U$? Say $i = 1$, which makes $v_1$ somewhat of a problem, and say $\tilde U v_1 = av_1 + bv_2$ for simplicity. Then
$$U\tilde U v_1 = a\lambda_1 v_1 + b\lambda_2 v_2 \qquad\text{and}\qquad \tilde U U v_1 = \tilde U(\lambda_1 v_1) = \lambda_1 a v_1 + \lambda_1 b v_2,$$
and these are equal since $U$ and $\tilde U$ commute. Thus $\lambda_1 b = \lambda_2 b$, and since $v_1$ was not an eigenvector, $b \neq 0$. So $\lambda_1 = \lambda_2$. Furthermore, if $v_2$ isn't an eigenvector for $\tilde U$, then $\tilde U v_2$ spreads out amongst possibly other parts of the space, but the same argument gives that $\lambda_2$ equals any of the other eigenvalues corresponding to the basis vectors across which $\tilde U v_2$ might spread, and the same goes for the rest of any other problem vectors that associate with one another in this manner. That is, for each problem vector, there is a set of some other problem vectors that all correspond to the same eigenvalue. Say that eigenvalue is $\eta$ (of $U$). Then on $W = E_\lambda \cap E^U_\eta$ (that's $\lambda$, the eigenvalue for $T$, and $\eta$, the eigenvalue for $U$), the restriction of $U$ to $W$ is $U_W = \eta I$, a scalar multiple of the identity. So all I have to do is collect all of the problem vectors associated to one eigenvalue into one pot (that's $W$) and proceed. On $W$, both $T$ and $U$ are multiples of the identity, so try as I might, there's no way that any basis for $W$ could possibly screw up the diagonal look of $T_W$ and $U_W$ (see (a), and further, it's easy to see that $W$ is both $T$- and $U$-invariant). It's also easy to check in the same manner as before that $W$ is $\tilde U$-invariant, since all of $T$, $U$ and $\tilde U$ commute. So $\tilde U_W$ may be diagonalized on $W$, and so, on $W$, all of $T$, $U$, and $\tilde U$ are simultaneously diagonalizable. It's possible that there were other problem vectors that correspond to another eigenvalue of $U$. Those are in a separate spot; i.e., none of the problem vectors corresponding to one eigenvalue could also be problem vectors corresponding to any other eigenvalue. So, for each eigenvalue of $U$, we adjust each of the problem spots as described above. Since $U|_{E_\lambda}$ is diagonalizable, we have $E_\lambda = \bigoplus_\eta E^U_\eta$, where the direct sum is taken over all eigenvalues $\eta$ of $U$ and the spaces $E^U_\eta$ are the eigenspaces of $U$ corresponding to the eigenvalue $\eta$; this produces a basis for $E_\lambda$ which simultaneously diagonalizes all of $T$, $U$ and $\tilde U$.
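This "collect the problem vectors and refine" idea is mechanical enough to automate for any number of operators: keep a list of subspace blocks, and split each block by the eigenspaces of each successive operator's restriction. A NumPy sketch, with three commuting matrices built from a shared, made-up eigenbasis $P$ (so the answer is known in advance):

```python
import numpy as np

# Three commuting diagonalizable operators sharing a hidden eigenbasis P.
P = np.array([[1.0, 0.0, 1.0, 0.0],
              [1.0, 1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]])
Pinv = np.linalg.inv(P)
ops = [P @ np.diag(d) @ Pinv for d in
       ([2.0, 2.0, 2.0, 7.0],    # T:  one repeated eigenvalue
        [1.0, 1.0, 4.0, 4.0],    # U:  splits T's big eigenspace
        [3.0, 5.0, 3.0, 3.0])]   # U~: separates the last two problem vectors

blocks = [np.eye(4)]             # start with one block: all of V
for M in ops:
    refined = []
    for E in blocks:             # E: orthonormal columns spanning an invariant block
        R = E.T @ M @ E          # restriction of M to col(E)
        w = np.linalg.eigvals(R).real
        for lam in np.unique(np.round(w, 6)):
            k = int(np.sum(np.abs(w - lam) < 1e-6))    # multiplicity of lam
            _, s, Vt = np.linalg.svd(R - lam * np.eye(len(R)))
            refined.append(E @ Vt[len(R) - k:].T)      # eigenspace inside the block
    blocks = refined
B = np.hstack(blocks)

# Every operator in the family is diagonal in the common basis B.
for M in ops:
    D = np.linalg.inv(B) @ M @ B
    assert np.allclose(D, np.diag(np.diag(D)))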
Now choose a different eigenvalue of $T$ and do the same thing. The process eventually stops since $\dim(V) < \infty$. So what if there is another operator $\hat U$? Then we go back to our first problem in $W$, diagonalize $T$, $U$, and $\tilde U$, and do the same thing to $\hat U$ that we did to $\tilde U$ before, but this time with reference to $\tilde U$ instead of $U$. There is no need to pay any attention to $T$ and $U$ as we search for a suitable basis for $W$, since any basis would diagonalize $T$ and $U$ on $W$. Again, since $\dim(V) < \infty$, as we consider more and more operators, the successive intersections of eigenspaces will eventually lead to a bunch of spaces of dimension 1 (where any basis works), or there would be some subspace (of possibly higher dimension) that is an eigenspace (or a subspace of an eigenspace) for all of the operators in $\mathcal{C}$, and all of the operators are multiples of the identity on such a space. (Actually, the situation where we end up with a bunch of 1-dimensional spaces is the same situation!) We patch all the bases together, and move eigenvalue by eigenvalue. Whew!

So, I now believe that for any collection of diagonalizable commuting operators, I could simultaneously diagonalize them. The bonus in this discussion is that it reveals the inductive step that was described above: if you could do it for $E_\lambda$, then you could do it for the whole space. Then, within $E_\lambda$, you ask yourself the same question and get down to $W$. Then ask yourself the same question and repeat until your fingers fall off from typing, or the olympics end and you long for both curling and hockey.
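One last sanity check before the TV goes back to curling: the Hermitian inner product $\langle x, y\rangle = xAy^*$ and the orthogonal-set norm computation from earlier can both be verified numerically. A sketch, assuming NumPy; the random samples and the specific orthogonal vectors are my own illustrations:

```python
import numpy as np

# The Hermitian matrix from the positivity computation: <x, y> = x A y*.
A = np.array([[1, 1j], [-1j, 2]])
assert np.allclose(A, A.conj().T)          # A = A*, i.e. A is Hermitian

def ip(x, y):
    return x @ A @ y.conj()

rng = np.random.default_rng(0)
for _ in range(100):
    a, b, c, d = rng.normal(size=4)
    x = np.array([a + 1j * b, c + 1j * d])
    z = rng.normal(size=2) + 1j * rng.normal(size=2)
    # conjugate symmetry uses A = A*; positivity matches the completed squares
    assert np.isclose(np.conj(ip(x, z)), ip(z, x))
    assert np.isclose(ip(x, x), d**2 + c**2 + (a + d)**2 + (b - c)**2)

# ||sum a_i v_i||^2 = sum |a_i|^2 ||v_i||^2 for an orthogonal (not unit) set.
v = [np.array([1.0, 1.0, 0.0]),
     np.array([1.0, -1.0, 0.0]),
     np.array([0.0, 0.0, 2.0])]
coef = [2.0, -1.0, 0.5]
w = sum(ci * vi for ci, vi in zip(coef, v))
assert np.isclose(w @ w, sum(abs(ci)**2 * (vi @ vi) for ci, vi in zip(coef, v)))
```

The completed-squares identity holding for 100 random samples is, of course, no proof, but it is a good way to catch a sign error in $2ad - 2bc$.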
Dot Products, Transposes, and Orthogonal Projections David Jekel November 13, 2015 Properties of Dot Products Recall that the dot product or standard inner product on R n is given by x y = x 1 y 1 + +
More informationMath 4153 Exam 3 Review. The syllabus for Exam 3 is Chapter 6 (pages ), Chapter 7 through page 137, and Chapter 8 through page 182 in Axler.
Math 453 Exam 3 Review The syllabus for Exam 3 is Chapter 6 (pages -2), Chapter 7 through page 37, and Chapter 8 through page 82 in Axler.. You should be sure to know precise definition of the terms we
More informationLecture Note 12: The Eigenvalue Problem
MATH 5330: Computational Methods of Linear Algebra Lecture Note 12: The Eigenvalue Problem 1 Theoretical Background Xianyi Zeng Department of Mathematical Sciences, UTEP The eigenvalue problem is a classical
More informationLINEAR ALGEBRA REVIEW
LINEAR ALGEBRA REVIEW SPENCER BECKER-KAHN Basic Definitions Domain and Codomain. Let f : X Y be any function. This notation means that X is the domain of f and Y is the codomain of f. This means that for
More informationRoberto s Notes on Linear Algebra Chapter 10: Eigenvalues and diagonalization Section 3. Diagonal matrices
Roberto s Notes on Linear Algebra Chapter 10: Eigenvalues and diagonalization Section 3 Diagonal matrices What you need to know already: Basic definition, properties and operations of matrix. What you
More informationChapter 4 & 5: Vector Spaces & Linear Transformations
Chapter 4 & 5: Vector Spaces & Linear Transformations Philip Gressman University of Pennsylvania Philip Gressman Math 240 002 2014C: Chapters 4 & 5 1 / 40 Objective The purpose of Chapter 4 is to think
More informationLinear Algebra Review
Chapter 1 Linear Algebra Review It is assumed that you have had a course in linear algebra, and are familiar with matrix multiplication, eigenvectors, etc. I will review some of these terms here, but quite
More informationMath 4A Notes. Written by Victoria Kala Last updated June 11, 2017
Math 4A Notes Written by Victoria Kala vtkala@math.ucsb.edu Last updated June 11, 2017 Systems of Linear Equations A linear equation is an equation that can be written in the form a 1 x 1 + a 2 x 2 +...
More informationLinear algebra 2. Yoav Zemel. March 1, 2012
Linear algebra 2 Yoav Zemel March 1, 2012 These notes were written by Yoav Zemel. The lecturer, Shmuel Berger, should not be held responsible for any mistake. Any comments are welcome at zamsh7@gmail.com.
More informationChapter 6: Orthogonality
Chapter 6: Orthogonality (Last Updated: November 7, 7) These notes are derived primarily from Linear Algebra and its applications by David Lay (4ed). A few theorems have been moved around.. Inner products
More information(v, w) = arccos( < v, w >
MA322 Sathaye Notes on Inner Products Notes on Chapter 6 Inner product. Given a real vector space V, an inner product is defined to be a bilinear map F : V V R such that the following holds: For all v
More informationThen x 1,..., x n is a basis as desired. Indeed, it suffices to verify that it spans V, since n = dim(v ). We may write any v V as r
Practice final solutions. I did not include definitions which you can find in Axler or in the course notes. These solutions are on the terse side, but would be acceptable in the final. However, if you
More informationLinear Algebra (part 1) : Vector Spaces (by Evan Dummit, 2017, v. 1.07) 1.1 The Formal Denition of a Vector Space
Linear Algebra (part 1) : Vector Spaces (by Evan Dummit, 2017, v. 1.07) Contents 1 Vector Spaces 1 1.1 The Formal Denition of a Vector Space.................................. 1 1.2 Subspaces...................................................
More informationDIAGONALIZATION. In order to see the implications of this definition, let us consider the following example Example 1. Consider the matrix
DIAGONALIZATION Definition We say that a matrix A of size n n is diagonalizable if there is a basis of R n consisting of eigenvectors of A ie if there are n linearly independent vectors v v n such that
More informationMath 308 Spring Midterm Answers May 6, 2013
Math 38 Spring Midterm Answers May 6, 23 Instructions. Part A consists of questions that require a short answer. There is no partial credit and no need to show your work. In Part A you get 2 points per
More informationDesigning Information Devices and Systems II
EECS 16B Fall 2016 Designing Information Devices and Systems II Linear Algebra Notes Introduction In this set of notes, we will derive the linear least squares equation, study the properties symmetric
More informationj=1 x j p, if 1 p <, x i ξ : x i < ξ} 0 as p.
LINEAR ALGEBRA Fall 203 The final exam Almost all of the problems solved Exercise Let (V, ) be a normed vector space. Prove x y x y for all x, y V. Everybody knows how to do this! Exercise 2 If V is a
More informationGetting Started with Communications Engineering. Rows first, columns second. Remember that. R then C. 1
1 Rows first, columns second. Remember that. R then C. 1 A matrix is a set of real or complex numbers arranged in a rectangular array. They can be any size and shape (provided they are rectangular). A
More informationMATH 310, REVIEW SHEET
MATH 310, REVIEW SHEET These notes are a summary of the key topics in the book (and follow the book pretty closely). You should be familiar with everything on here, but it s not comprehensive, so please
More informationTHE MINIMAL POLYNOMIAL AND SOME APPLICATIONS
THE MINIMAL POLYNOMIAL AND SOME APPLICATIONS KEITH CONRAD. Introduction The easiest matrices to compute with are the diagonal ones. The sum and product of diagonal matrices can be computed componentwise
More informationRepresentations. 1 Basic definitions
Representations 1 Basic definitions If V is a k-vector space, we denote by Aut V the group of k-linear isomorphisms F : V V and by End V the k-vector space of k-linear maps F : V V. Thus, if V = k n, then
More informationSTEP Support Programme. STEP 2 Matrices Topic Notes
STEP Support Programme STEP 2 Matrices Topic Notes Definitions............................................. 2 Manipulating Matrices...................................... 3 Transformations.........................................
More informationLINEAR ALGEBRA SUMMARY SHEET.
LINEAR ALGEBRA SUMMARY SHEET RADON ROSBOROUGH https://intuitiveexplanationscom/linear-algebra-summary-sheet/ This document is a concise collection of many of the important theorems of linear algebra, organized
More informationMath 308 Midterm Answers and Comments July 18, Part A. Short answer questions
Math 308 Midterm Answers and Comments July 18, 2011 Part A. Short answer questions (1) Compute the determinant of the matrix a 3 3 1 1 2. 1 a 3 The determinant is 2a 2 12. Comments: Everyone seemed to
More informationEigenvalues and Eigenvectors A =
Eigenvalues and Eigenvectors Definition 0 Let A R n n be an n n real matrix A number λ R is a real eigenvalue of A if there exists a nonzero vector v R n such that A v = λ v The vector v is called an eigenvector
More informationLecture 2: Linear operators
Lecture 2: Linear operators Rajat Mittal IIT Kanpur The mathematical formulation of Quantum computing requires vector spaces and linear operators So, we need to be comfortable with linear algebra to study
More informationWOMP 2001: LINEAR ALGEBRA. 1. Vector spaces
WOMP 2001: LINEAR ALGEBRA DAN GROSSMAN Reference Roman, S Advanced Linear Algebra, GTM #135 (Not very good) Let k be a field, eg, R, Q, C, F q, K(t), 1 Vector spaces Definition A vector space over k is
More information