Econ Slides from Lecture 7


Econ 205 - Slides from Lecture 7. Joel Sobel. August 31, 2010.

Linear Algebra: Main Theory. A linear combination of a collection of vectors {x_1, ..., x_k} is a vector of the form ∑_{i=1}^k λ_i x_i for some scalars λ_1, ..., λ_k. R^n has the property that sums and scalar multiples of elements of R^n remain in the set. Hence if we are given some elements of the set, all linear combinations will also be in the set. Some subsets are special because they contain no redundancies: Definition. A collection of vectors {x_1, ..., x_k} is linearly independent if ∑_{i=1}^k λ_i x_i = 0 if and only if λ_i = 0 for all i.
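As a numerical illustration (not part of the original slides, and assuming numpy is available): if the vectors x_1, ..., x_k are stacked as the columns of a matrix, linear independence is equivalent to the matrix having rank k, since a rank deficiency means some non-trivial combination ∑ λ_i x_i equals zero.

```python
import numpy as np

# Stack the vectors x_1, x_2, x_3 as columns of a matrix.
# They are linearly independent exactly when the matrix has full column rank,
# i.e. the only solution of sum(lambda_i * x_i) = 0 is lambda_i = 0 for all i.
X = np.column_stack([[1.0, 0.0, 1.0],
                     [0.0, 1.0, 1.0],
                     [1.0, 1.0, 2.0]])  # third column = first + second

rank_all = np.linalg.matrix_rank(X)          # 2: the three columns are dependent
rank_first_two = np.linalg.matrix_rank(X[:, :2])  # 2: the first two are independent
print(rank_all, rank_first_two)
```

Here the third vector is deliberately the sum of the first two, so the rank of the full matrix stays at 2.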

Definition. The span of a collection of vectors {x_1, ..., x_k}, S(X), is the set of all linear combinations of {x_1, ..., x_k}. S(X) is the smallest vector space containing all of the vectors in X. Theorem. If X = {x_1, ..., x_k} is a linearly independent collection of vectors and z ∈ S(X), then there are unique λ_1, ..., λ_k such that z = ∑_{i=1}^k λ_i x_i.

Proof. Existence follows from the definition of span. For uniqueness, suppose that there are two linear combinations of the elements of X that yield z, so that z = ∑_{i=1}^k λ_i x_i and z = ∑_{i=1}^k λ'_i x_i. Subtract the equations to obtain 0 = ∑_{i=1}^k (λ_i − λ'_i) x_i. By linear independence, λ_i = λ'_i for all i, the desired result.

Definition. The dimension of a vector space is N, where N is the smallest number of vectors needed to span the space. R^n has dimension n. Definition. A basis for a vector space V is any collection of linearly independent vectors that spans V.

Theorem. If X = {x_1, ..., x_k} is a set of linearly independent vectors that does not span V, then there exists v ∈ V such that X ∪ {v} is linearly independent. Proof. Take v ∈ V such that v ≠ 0 and v ∉ S(X). X ∪ {v} is a linearly independent set. To see this, suppose that there exist λ_i, i = 0, ..., k, such that at least one λ_i ≠ 0 and

λ_0 v + ∑_{i=1}^k λ_i x_i = 0. (1)

If λ_0 = 0, then X is not linearly independent. If λ_0 ≠ 0, then equation (1) can be rewritten v = −∑_{i=1}^k (λ_i/λ_0) x_i, so v ∈ S(X). In either case, we have a contradiction.

Definition. The standard basis for R^n consists of the n vectors e_i, i = 1, ..., n, where e_i is the vector with component 1 in the ith position and zero in all other positions. 1. The standard basis is a linearly independent set that spans R^n. 2. Elements of the standard basis are mutually orthogonal. When this happens, we say that the basis is orthogonal. 3. Each basis element has unit length. When this also happens, we say that the basis is orthonormal.

Orthonormal Bases. 1. We know an orthonormal basis for R^n. 2. It is always possible to find an orthonormal basis for a vector space. 3. If {v_1, ..., v_k} is an orthonormal basis for V, then for all x ∈ V, x = ∑_{i=1}^k (x · v_i) v_i. (To prove this, take the inner product of both sides with v_i.)
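The expansion in item 3 can be checked numerically (an illustration added here, not in the slides; the particular basis is an arbitrary rotation of the standard basis of R^2):

```python
import numpy as np

# An orthonormal basis of R^2: rotate the standard basis by 0.3 radians.
v1 = np.array([np.cos(0.3), np.sin(0.3)])
v2 = np.array([-np.sin(0.3), np.cos(0.3)])

x = np.array([2.0, -1.0])
# Expansion x = (x . v1) v1 + (x . v2) v2
reconstruction = (x @ v1) * v1 + (x @ v2) * v2
print(np.allclose(reconstruction, x))  # True
```

The coefficients (x · v_i) are exactly the coordinates of x in the rotated basis, so summing the projections recovers x.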

Basis Properties. It is not hard (but a bit tedious) to prove that all bases have the same number of elements. (This follows from the observation that any system of n homogeneous equations in m > n unknowns has a non-trivial solution, which in turn follows from row-reduction arguments.)

Eigenvectors and Eigenvalues. Definition. An eigenvalue of the square matrix A is a number λ with the property that A − λI is singular. If λ is an eigenvalue of A, then any x ≠ 0 such that (A − λI)x = 0 is called an eigenvector of A associated with the eigenvalue λ. Eigenvalues are those values λ for which the equation Ax = λx has a non-zero solution. Eigenvalues solve the equation det(A − λI) = 0.
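Both characterizations of the definition can be verified numerically (an added illustration using numpy, with an arbitrary 2 × 2 symmetric matrix): for each eigenvalue λ, det(A − λI) is zero and the corresponding eigenvector x satisfies Ax = λx.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lams, vecs = np.linalg.eig(A)   # eigenvalues of this matrix are 3 and 1
for lam, x in zip(lams, vecs.T):
    # A - lam*I is singular: its determinant is (numerically) zero ...
    print(np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0))
    # ... and x is a non-zero solution of (A - lam*I) x = 0, i.e. A x = lam x.
    print(np.allclose(A @ x, lam * x))
```

`np.linalg.eig` returns the eigenvalues and a matrix whose columns are the associated eigenvectors, which is why the loop iterates over `vecs.T`.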

Characteristic Polynomial. If A is an n × n matrix, then this characteristic equation is a polynomial equation of degree n. By the Fundamental Theorem of Algebra, it has n (not necessarily distinct and not necessarily real) roots. That is, the characteristic polynomial can be written P(λ) = (λ − r_1)^{m_1} · · · (λ − r_k)^{m_k}, where r_1, r_2, ..., r_k are the distinct roots (r_i ≠ r_j when i ≠ j) and the m_i are positive integers summing to n. We call m_i the multiplicity of root r_i. Eigenvalues and their corresponding eigenvectors are important because they enable one to relate complicated matrices to simple ones.
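A quick numerical illustration of roots and multiplicities (added here, not in the slides; the matrix is a chosen example with a repeated root): `np.poly` computes the coefficients of the characteristic polynomial det(λI − A) from a matrix, and `np.roots` recovers its roots.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])  # repeated root: r = 2 with multiplicity m = 2

# np.poly returns the coefficients of det(lambda*I - A), highest degree first.
coeffs = np.poly(A)         # lambda^2 - 4*lambda + 4 = (lambda - 2)^2
print(coeffs)               # [ 1. -4.  4.]
print(np.roots(coeffs))     # both roots equal 2
```

The multiplicities sum to n = 2, matching the degree of the polynomial, as the slide states.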

Diagonalization. Theorem. If A is an n × n matrix that has n distinct eigenvalues or is symmetric, then there exists an invertible n × n matrix P and a diagonal matrix D such that A = PDP^{-1}. Moreover, the diagonal entries of D are the eigenvalues of A and the columns of P are the corresponding eigenvectors. 1. Certain square matrices are similar to a diagonal matrix. 2. The diagonal matrix has the eigenvalues down the diagonal. 3. Similarity relationship: A = PDP^{-1}.
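The theorem can be verified numerically for a symmetric example (an added illustration; the matrix is arbitrary): build P from the eigenvectors and D from the eigenvalues, and check A = PDP^{-1}.

```python
import numpy as np

# Symmetric matrix, so the theorem guarantees diagonalizability.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lams, P = np.linalg.eig(A)   # columns of P are the eigenvectors
D = np.diag(lams)            # eigenvalues down the diagonal

print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True
```

Equivalently, P^{-1} A P = D, which is the sense in which A is "similar" to a diagonal matrix.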

Sketch of Proof. Suppose that λ is an eigenvalue of A and x is an eigenvector. This means that Ax = λx. If P is a matrix with column j equal to an eigenvector associated with λ_j, it follows that AP = PD. The theorem would follow if we could guarantee that P is invertible. When A is symmetric, one can prove that A has only real eigenvalues and that one can find n linearly independent eigenvectors even if the eigenvalues are not distinct. This result is elementary (but uses some basic facts about complex numbers).

Eigenvectors associated with distinct eigenvalues are linearly independent. To see this, suppose that λ_1, ..., λ_k are distinct eigenvalues and x_1, ..., x_k are associated eigenvectors. In order to reach a contradiction, suppose that the vectors are linearly dependent.

Without loss of generality, we may assume that {x_1, ..., x_{k−1}} are linearly independent, but that x_k can be written as a linear combination of the first k − 1 vectors. This means that there exist α_i, i = 1, ..., k − 1, not all zero, such that:

∑_{i=1}^{k−1} α_i x_i = x_k. (2)

Multiply both sides of equation (2) by A and use the eigenvalue property to obtain:

∑_{i=1}^{k−1} α_i λ_i x_i = λ_k x_k. (3)

Multiply equation (2) by λ_k and subtract it from equation (3) to obtain:

∑_{i=1}^{k−1} α_i (λ_i − λ_k) x_i = 0. (4)

Since the eigenvalues are distinct, equation (4) gives a non-trivial linear combination of the first k − 1 x_i that is equal to 0, which contradicts linear independence.