Economics 204 Lecture 9 Thursday, August 6, 2009. Revised 8/6/09, revisions indicated by ** and Sticky Notes.

Section 3.3 Supplement, Quotient Vector Spaces (not in de la Fuente):

Definition 1 Given a vector space $X$ and a vector subspace $W$ of $X$, define an equivalence relation $\sim$ by
$$x \sim y \iff x - y \in W$$
Form a new vector space $X/W$: the set of vectors is $\{[x] : x \in X\}$, where $[x]$ denotes the equivalence class of $x$ with respect to $\sim$. Note that the vectors are sets; this is a little weird at first, but.... Define
$$[x] + [y] = [x + y] \qquad \alpha[x] = [\alpha x]$$
You should check on your own that $\sim$ is an equivalence relation and that vector addition and scalar multiplication are well-defined, i.e.
$$[x] = [x'],\; [y] = [y'] \implies [x + y] = [x' + y']$$
$$[x] = [x'],\; \alpha \in F \implies [\alpha x] = [\alpha x']$$

Theorem 2 If $\dim X < \infty$, then $\dim(X/W) = \dim X - \dim W$.

Theorem 3 Let $T \in L(X, Y)$. Then $\operatorname{Im} T$ is isomorphic to $X/\ker T$.

Proof: If $\dim X < \infty$, then $\dim X/\ker T = \dim X - \dim \ker T$ (from the previous theorem) $= \operatorname{Rank} T$ (from Theorem 11 in yesterday's lecture) $= \dim \operatorname{Im} T$, so $X/\ker T$ is isomorphic to $\operatorname{Im} T$. We shall prove that it is true in general, and that the isomorphism is natural. Define
$$\bar{T}([x]) = T(x)$$
We need to check that this is well-defined:
$$[x] = [x'] \implies x \sim x' \implies x - x' \in \ker T \implies T(x - x') = 0 \implies T(x) = T(x')$$
so $\bar{T}$ is well-defined. Clearly, $\bar{T} : X/\ker T \to \operatorname{Im} T$. It is easy to check that $\bar{T}$ is linear, so $\bar{T} \in L(X/\ker T, \operatorname{Im} T)$.
$$\bar{T}([x]) = \bar{T}([y]) \implies T(x) = T(y) \implies T(x - y) = 0 \implies x - y \in \ker T \implies x \sim y \implies [x] = [y]$$
so $\bar{T}$ is one-to-one.
$$y \in \operatorname{Im} T \implies \exists x \in X \; T(x) = y \implies \bar{T}([x]) = y$$
so $\bar{T}$ is onto, hence $\bar{T}$ is an isomorphism.

Example: Consider $T \in L(\mathbf{R}^3, \mathbf{R}^2)$ defined by $T(x, y, z) = (x, y)$.
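For this example, well-definedness of the induced map $\bar{T}$ can be checked numerically: two representatives of the same equivalence class differ by an element of $\ker T$ and must map to the same image. A minimal sketch assuming NumPy (the particular vectors are illustrative choices):

```python
import numpy as np

# T : R^3 -> R^2, T(x, y, z) = (x, y); ker T is the z-axis.
def T(v):
    return v[:2]

x = np.array([2.0, 3.0, 5.0])
w = np.array([0.0, 0.0, -7.0])   # an element of ker T (the z-axis)

# x and x + w lie in the same equivalence class [x] in R^3 / ker T,
# so T-bar([x]) = T(x) must not depend on the representative chosen.
assert np.allclose(T(x), T(x + w))
print(T(x))
```

The assertion passes because $x$ and $x + w$ differ only in the coordinate that $T$ discards.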

Then $\ker T = \{(x, y, z) \in \mathbf{R}^3 : x = y = 0\}$ is the $z$-axis. Given $(x, y, z)$, the equivalence class $[(x, y, z)]$ is just the line through $(x, y, 0)$ parallel to the $z$-axis. $\bar{T}([(x, y, z)]) = T(x, y, z) = (x, y)$.

Back to de la Fuente: Every real vector space $X$ with dimension $n$ is isomorphic to $\mathbf{R}^n$. What's the isomorphism?

Definition 4 Fix any Hamel basis $V = \{v_1, \ldots, v_n\}$ of $X$. Any $x \in X$ has a unique representation $x = \sum_{j=1}^n \beta_j v_j$ (here, we allow $\beta_j = 0$). Generally, vectors are represented as column vectors, not row vectors.
$$\operatorname{crd}_V(x) = \begin{pmatrix} \beta_1 \\ \vdots \\ \beta_n \end{pmatrix} \in \mathbf{R}^n$$
$\operatorname{crd}_V(x)$ is the vector of coordinates of $x$ with respect to the basis $V$.
$$\operatorname{crd}_V(v_1) = \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix} \quad \operatorname{crd}_V(v_2) = \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{pmatrix} \quad \cdots \quad \operatorname{crd}_V(v_n) = \begin{pmatrix} 0 \\ \vdots \\ 0 \\ 1 \end{pmatrix}$$
$\operatorname{crd}_V$ is an isomorphism from $X$ to $\mathbf{R}^n$.

Matrix Representation of a Linear Transformation

Definition 5 Suppose $T \in L(X, Y)$, $\dim X = n$, $\dim Y = m$. Fix bases
$$V = \{v_1, \ldots, v_n\} \text{ of } X \qquad W = \{w_1, \ldots, w_m\} \text{ of } Y$$

$T(v_j) \in Y$, so define
$$T(v_j) = \sum_{i=1}^m \alpha_{ij} w_i \qquad \operatorname{Mtx}_{W,V}(T) = \begin{pmatrix} \alpha_{11} & \cdots & \alpha_{1n} \\ \vdots & & \vdots \\ \alpha_{m1} & \cdots & \alpha_{mn} \end{pmatrix}$$
Notice that the columns are the coordinates (expressed with respect to $W$) of $T(v_1), \ldots, T(v_n)$. Observe
$$\begin{pmatrix} \alpha_{11} & \cdots & \alpha_{1n} \\ \vdots & & \vdots \\ \alpha_{m1} & \cdots & \alpha_{mn} \end{pmatrix} \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix} = \begin{pmatrix} \alpha_{11} \\ \vdots \\ \alpha_{m1} \end{pmatrix}$$
so
$$\operatorname{Mtx}_{W,V}(T)\operatorname{crd}_V(v_j) = \operatorname{crd}_W(T(v_j))$$
$$\forall x \in X \quad \operatorname{Mtx}_{W,V}(T)\operatorname{crd}_V(x) = \operatorname{crd}_W(T(x))$$
Multiplying a vector by a matrix does two things: it computes the action of $T$, and it accounts for the change in basis.

Example: $X = Y = \mathbf{R}^2$, $V = \{(1, 0), (0, 1)\}$, $W = \{(1, 1), (-1, 1)\}$, $T = \operatorname{id}$, $T(x) = x$. $\operatorname{Mtx}_{W,V}(T)$ is the matrix which changes basis from $V$ to $W$. How do we compute it?
$$v_1 = (1, 0) = \alpha_{11}(1, 1) + \alpha_{21}(-1, 1)$$

$$\alpha_{11} - \alpha_{21} = 1 \qquad \alpha_{11} + \alpha_{21} = 0$$
$$2\alpha_{11} = 1, \quad \alpha_{11} = \tfrac{1}{2}, \quad \alpha_{21} = -\tfrac{1}{2}$$
$$v_2 = (0, 1) = \alpha_{12}(1, 1) + \alpha_{22}(-1, 1)$$
$$\alpha_{12} - \alpha_{22} = 0 \qquad \alpha_{12} + \alpha_{22} = 1$$
$$2\alpha_{12} = 1, \quad \alpha_{12} = \tfrac{1}{2}, \quad \alpha_{22} = \tfrac{1}{2}$$
$$\operatorname{Mtx}_{W,V}(\operatorname{id}) = \begin{pmatrix} 1/2 & 1/2 \\ -1/2 & 1/2 \end{pmatrix}$$

Theorem 6 (3.5′) Let $X$ and $Y$ be vector spaces over the same field $F$, $\dim X = n$, $\dim Y = m$, with bases $V = \{v_1, \ldots, v_n\}$ for $X$ and $W = \{w_1, \ldots, w_m\}$ for $Y$. Then
$$\operatorname{Mtx}_{W,V} \in L(L(X, Y), F^{m \times n})$$
$\operatorname{Mtx}_{W,V}$ is an isomorphism from $L(X, Y)$ to $F^{m \times n}$, the vector space of $m \times n$ matrices over $F$.

Theorem 7 (From Handout) Let $X, Y, Z$ be finite-dimensional vector spaces with bases $U, V, W$ respectively, $S \in L(X, Y)$, $T \in L(Y, Z)$. Then
$$\operatorname{Mtx}_{W,V}(T) \cdot \operatorname{Mtx}_{V,U}(S) = \operatorname{Mtx}_{W,U}(T \circ S)$$
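The change-of-basis computation above can be reproduced numerically: column $j$ of $\operatorname{Mtx}_{W,V}(\operatorname{id})$ is the coordinate vector of $v_j$ with respect to $W$, found by solving a linear system. A minimal sketch assuming NumPy:

```python
import numpy as np

# Bases as matrix columns: V is the standard basis of R^2, W = {(1,1), (-1,1)}.
V = np.eye(2)
W = np.array([[1.0, -1.0],
              [1.0,  1.0]])

# Column j of Mtx_{W,V}(id) solves W @ alpha = v_j (the coordinates of v_j
# with respect to W); solving for all columns at once gives W^{-1} V.
M = np.linalg.solve(W, V)
print(M)

# Matches the hand computation above.
assert np.allclose(M, [[0.5, 0.5],
                       [-0.5, 0.5]])
```

Solving the system column by column is exactly the $\alpha_{ij}$ arithmetic carried out by hand above.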

i.e. matrix multiplication corresponds via the isomorphism to composition of linear transformations. Note that $\operatorname{Mtx}_{W,V}$ is a function from $L(X, Y)$ to the space of $m \times n$ matrices, while $\operatorname{Mtx}_{W,V}(T)$ is an $m \times n$ matrix.

Proof: See handout.

The theorem can be summarized by the following Commutative Diagram:
$$\begin{array}{ccccc}
X & \xrightarrow{\;S\;} & Y & \xrightarrow{\;T\;} & Z \\
\updownarrow \operatorname{crd}_U & & \updownarrow \operatorname{crd}_V & & \updownarrow \operatorname{crd}_W \\
\mathbf{R}^n & \xrightarrow{\operatorname{Mtx}_{V,U}(S)} & \mathbf{R}^m & \xrightarrow{\operatorname{Mtx}_{W,V}(T)} & \mathbf{R}^r
\end{array}$$
We say the diagram commutes because you get the same answer any way you go around the diagram (in directions allowed by the arrows). The $\operatorname{crd}$ arrows go in both directions because $\operatorname{crd}$ is an isomorphism.

Section 3.5, Change of Basis, Similarity

Let $X$ be a finite-dimensional vector space with basis $V$. If $T \in L(X, X)$ it is customary to use the same basis in the domain and range:

Definition 8 $\operatorname{Mtx}_V(T)$ denotes $\operatorname{Mtx}_{V,V}(T)$.

Question: If $W$ is another basis for $X$, how are $\operatorname{Mtx}_V(T)$ and $\operatorname{Mtx}_W(T)$ related? Compute
$$\operatorname{Mtx}_{V,W}(\operatorname{id}) \cdot \operatorname{Mtx}_W(T) \cdot \operatorname{Mtx}_{W,V}(\operatorname{id})$$

$$= \operatorname{Mtx}_{V,W}(\operatorname{id}) \cdot \operatorname{Mtx}_{W,V}(T \circ \operatorname{id}) = \operatorname{Mtx}_{V,V}(\operatorname{id} \circ T \circ \operatorname{id}) = \operatorname{Mtx}_V(T)$$
Also,
$$\operatorname{Mtx}_{V,W}(\operatorname{id}) \cdot \operatorname{Mtx}_{W,V}(\operatorname{id}) = \operatorname{Mtx}_{V,V}(\operatorname{id}) = \begin{pmatrix} 1 & & \\ & \ddots & \\ & & 1 \end{pmatrix} = I$$
so
$$\operatorname{Mtx}_V(T) = P^{-1}\operatorname{Mtx}_W(T)P$$
where $P = \operatorname{Mtx}_{W,V}(\operatorname{id})$ is a change of basis matrix. On the other hand, if $P$ is invertible, then $P$ is a change of basis matrix (see handout).

Definition 9 Square matrices $A, B$ are similar if
$$A = P^{-1}BP$$
for some invertible matrix $P$.

Theorem 10 Suppose that $X$ is finite-dimensional. If $T \in L(X, X)$ and $U, W$ are any two bases of $X$, then $\operatorname{Mtx}_W(T)$ and $\operatorname{Mtx}_U(T)$ are similar. Conversely, given similar matrices $A, B$ with $A = P^{-1}BP$ and any basis $U$, there is a basis $W$ and $T \in L(X, X)$ such that
$$B = \operatorname{Mtx}_U(T) \qquad A = \operatorname{Mtx}_W(T)$$

$$P = \operatorname{Mtx}_{U,W}(\operatorname{id}) \qquad P^{-1} = \operatorname{Mtx}_{W,U}(\operatorname{id})$$
Proof: See Handout on Diagonalization and Quadratic Forms.

Section 3.6: Eigenvalues and Eigenvectors

De la Fuente defines eigenvalues and eigenvectors of a matrix. Here, we define eigenvalues and eigenvectors of a linear transformation and show that $\lambda$ is an eigenvalue of $T$ if and only if $\lambda$ is an eigenvalue for some matrix representation of $T$, if and only if $\lambda$ is an eigenvalue for every matrix representation of $T$.

Definition 11 Let $X$ be a vector space and $T \in L(X, X)$. We say that $\lambda$ is an eigenvalue of $T$ and $v \neq 0$ is an eigenvector corresponding to $\lambda$ if $T(v) = \lambda v$.

Theorem 12 (Theorem 4 in Handout) Let $X$ be a finite-dimensional vector space, and $U$ any basis. Then $\lambda$ is an eigenvalue of $T$ if and only if $\lambda$ is an eigenvalue of $\operatorname{Mtx}_U(T)$; $v$ is an eigenvector of $T$ corresponding to $\lambda$ if and only if $\operatorname{crd}_U(v)$ is an eigenvector of $\operatorname{Mtx}_U(T)$ corresponding to $\lambda$.

Proof: By the Commutative Diagram Theorem,
$$T(v) = \lambda v \iff \operatorname{crd}_U(T(v)) = \operatorname{crd}_U(\lambda v) \iff \operatorname{Mtx}_U(T)(\operatorname{crd}_U(v)) = \lambda(\operatorname{crd}_U(v))$$

Computing eigenvalues and eigenvectors: Suppose $\dim X = n$; let $I$ be the $n \times n$ identity matrix. Given $T \in L(X, X)$, fix a basis $U$ and let
$$A = \operatorname{Mtx}_U(T)$$
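By Theorem 12 (together with Theorem 10), the eigenvalues obtained this way do not depend on the basis $U$: similar matrices have the same eigenvalues. A small numerical sketch of this invariance, assuming NumPy (the matrices $A$ and $P$ are illustrative choices):

```python
import numpy as np

# An illustrative matrix A = Mtx_U(T) and an invertible change of basis P.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# B = P^{-1} A P represents the same transformation T in another basis.
B = np.linalg.inv(P) @ A @ P

# Similar matrices have the same characteristic polynomial,
# hence the same eigenvalues.
eig_A = np.sort(np.linalg.eigvals(A))
eig_B = np.sort(np.linalg.eigvals(B))
assert np.allclose(eig_A, eig_B)
```

The eigenvectors, by contrast, do change with the basis: they transform by $\operatorname{crd}$, exactly as Theorem 12 states.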

Find the eigenvalues of $T$ by computing the eigenvalues of $A$:
$$Av = \lambda v \text{ for some } v \neq 0 \iff (A - \lambda I)v = 0 \text{ for some } v \neq 0 \iff (A - \lambda I) \text{ is not invertible} \iff \det(A - \lambda I) = 0$$
We have the following facts:
- If $A \in \mathbf{R}^{n \times n}$, $f(\lambda) = \det(A - \lambda I)$ is an $n$th degree polynomial in $\lambda$ with real coefficients; it is called the characteristic polynomial of $A$.
- $f$ has $n$ roots in $\mathbf{C}$, counting multiplicity:
$$f(\lambda) = (-1)^n(\lambda - c_1)(\lambda - c_2) \cdots (\lambda - c_n)$$
where $c_1, \ldots, c_n \in \mathbf{C}$ are the eigenvalues; the $c_j$'s are not necessarily distinct. **Notice that $f(\lambda) = 0$ if and only if $\lambda \in \{c_1, \ldots, c_n\}$, so the roots are the solutions of the equation $f(\lambda) = 0$.
- The roots which are not real come in conjugate pairs:
$$f(a + bi) = 0 \iff f(a - bi) = 0$$
- If $\lambda = c_j \in \mathbf{R}$, there is a corresponding eigenvector in $\mathbf{R}^n$.
- If $\lambda = c_j \notin \mathbf{R}$, the corresponding eigenvectors are in $\mathbf{C}^n \setminus \mathbf{R}^n$.

Diagonalization

Definition 13 Suppose $X$ is finite-dimensional with basis $U$. Given a linear transformation $T \in L(X, X)$, let
$$A = \operatorname{Mtx}_U(T)$$

We say that $A$ can be diagonalized if there is a basis $W$ for $X$ such that $\operatorname{Mtx}_W(T)$ is diagonal, i.e.
$$\operatorname{Mtx}_W(T) = \begin{pmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_n \end{pmatrix}$$
Notice that the eigenvectors of $\operatorname{Mtx}_W(T)$ are exactly the standard basis vectors of $\mathbf{R}^n$. But $w_j$ is an eigenvector of $T$ for $\lambda_j$ if and only if $\operatorname{crd}_W(w_j)$ is an eigenvector of $\operatorname{Mtx}_W(T)$, and $\operatorname{crd}_W(w_j)$ is the $j$th standard basis vector of $\mathbf{R}^n$, so $W = \{w_1, \ldots, w_n\}$ where $w_j$ is an eigenvector corresponding to $\lambda_j$. Then the action of $T$ is clear: it stretches each basis element $w_i$ by the factor $\lambda_i$.

Theorem 14 (6.7′) Let $X$ be an $n$-dimensional vector space, $T \in L(X, X)$, $U$ any basis of $X$, and $A = \operatorname{Mtx}_U(T)$. Then the following are equivalent:
- $A$ can be diagonalized
- there is a basis $W$ for $X$ consisting of eigenvectors of $T$
- there is a basis $V$ for $\mathbf{R}^n$ consisting of eigenvectors of $A$

Proof: Use de la Fuente's Theorem 6.7 and Theorem 4 from the Handout.

Theorem 15 (6.8′) Let $X$ be a vector space, $T \in L(X, X)$. If $\lambda_1, \ldots, \lambda_m$ are distinct eigenvalues of $T$ with corresponding eigenvectors $v_1, \ldots, v_m$, then $\{v_1, \ldots, v_m\}$ is linearly independent.

If $\dim X = n$ and $T$ has $n$ distinct eigenvalues, then $X$ has a basis consisting of eigenvectors of $T$; consequently, if $U$ is any basis of $X$, then $\operatorname{Mtx}_U(T)$ is diagonalizable.

Proof: This is an adaptation of the proof of Theorem 6.8 in de la Fuente.
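This consequence of Theorem 15 can be illustrated numerically: a matrix with $n$ distinct eigenvalues has $n$ independent eigenvectors, which form the columns of an invertible $P$ with $A = PDP^{-1}$ and $D$ diagonal. A minimal sketch assuming NumPy (the matrix $A$ is an illustrative choice):

```python
import numpy as np

# An illustrative 3x3 matrix with three distinct eigenvalues (2, 3, 5).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

# Columns of P are eigenvectors of A; with distinct eigenvalues they are
# linearly independent (Theorem 15), so P is invertible and A = P D P^{-1}.
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

assert np.allclose(sorted(eigvals.real), [2.0, 3.0, 5.0])
assert np.allclose(A, P @ D @ np.linalg.inv(P))
```

In the notation of the lecture, $P$ plays the role of a change of basis matrix: in the eigenvector basis $W$, the transformation represented by $A$ is the diagonal matrix $D = \operatorname{Mtx}_W(T)$.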