Math Camp Notes: Linear Algebra II


Eigenvalues

Let A be a square matrix. An eigenvalue is a number λ which, when subtracted from the diagonal elements of the matrix A, creates a singular matrix. In other words, an eigenvalue is a number λ such that

    (A − λI)x = 0 for some x ≠ 0.

Notice also that this implies that an eigenvalue is a number λ such that Ax = λx.

Example. Find the eigenvalues of a 3 × 3 matrix A whose characteristic equation is det(A − λI) = −λ³ + 8λ² − 17λ + 4 = 0.

Solution: Assume that λ is an eigenvalue of A. Remember that if the matrix resulting when we subtract λI from A is singular, then its determinant must be zero. Then λ solves the equation

    −λ³ + 8λ² − 17λ + 4 = −(λ − 4)(λ² − 4λ + 1) = 0.

So one eigenvalue is λ = 4. To solve the quadratic factor, we use the quadratic formula:

    λ = (4 ± √((−4)² − 4)) / 2 = 2 ± √3.

Therefore, the eigenvalues are λ ∈ {2 − √3, 2 + √3, 4}.
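Since the eigenvalues are exactly the roots of the characteristic polynomial, the factorization above can be checked numerically. The following short Python/NumPy sketch is an added illustration (it is not part of the original notes):

    import numpy as np

    # Coefficients of the characteristic polynomial -λ^3 + 8λ^2 - 17λ + 4,
    # listed from the highest power of λ down to the constant term.
    coeffs = [-1, 8, -17, 4]

    print(np.sort(np.roots(coeffs)))        # approx. [0.2679, 3.7321, 4.0]
    print(2 - np.sqrt(3), 2 + np.sqrt(3))   # 0.2679... and 3.7321..., i.e. 2 - sqrt(3) and 2 + sqrt(3)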

Eigenvectors

Given an eigenvalue λ, an eigenvector is a nonzero vector x such that (A − λI)x = 0.

Example. Find the eigenvectors of the matrix

    A = [  0   0   2 ]
        [ −1   2   1 ]
        [ −1   0   3 ]

We must first find the eigenvalues of the matrix. The determinant of A − λI is

    −λ(2 − λ)(3 − λ) + 2(2 − λ) = −λ(6 − 5λ + λ²) + 4 − 2λ = −λ³ + 5λ² − 8λ + 4.

Therefore, we must find the roots of

    λ³ − 5λ² + 8λ − 4 = (λ − 1)(λ − 2)(λ − 2) = 0.

We can see that λ = 1 and λ = 2, where λ = 2 is a repeated root. Since each eigenvector corresponds to an eigenvalue, let us first consider the eigenvalue λ = 1. The matrix A − λI is then

    A − I = [ −1   0   2 ]
            [ −1   1   1 ]
            [ −1   0   2 ]

This implies that the system of equations (A − I)x = 0 can be written

    x₁ = 2x₃
    x₁ = x₂ + x₃
    x₁ = 2x₃.

Combining the second and third equations, we have

    2x₃ = x₂ + x₃  ⟹  x₂ = x₃.

Therefore, we have x₁ = 2x₂, which implies

    (x₁, x₂, x₃)' = (2x₂, x₂, x₂)' = x₂ (2, 1, 1)'.

Therefore, we can let x₂ be anything we want (say x₂ = s ∈ R). As we increase and decrease s, we trace out a line in 3-space; every nonzero vector on that line is an eigenvector corresponding to λ = 1, since every such vector satisfies (A − λI)x = 0. As a check, we can verify that (A − I)x = 0 for x = (2, 1, 1)':

    [ −1   0   2 ] [ 2 ]   [ −2 + 0 + 2 ]   [ 0 ]
    [ −1   1   1 ] [ 1 ] = [ −2 + 1 + 1 ] = [ 0 ]
    [ −1   0   2 ] [ 1 ]   [ −2 + 0 + 2 ]   [ 0 ]

We now know one eigenvector is (2, 1, 1)'. We can find the others in a similar manner. Let λ = 2. Then

    A − 2I = [ −2   0   2 ]
             [ −1   0   1 ]
             [ −1   0   1 ]

which implies that the system (A − 2I)x = 0 can be written

    x₁ = x₃,   x₁ = x₃,   x₁ = x₃.

Notice that we cannot write x₂ as a function of either x₁ or x₃. Therefore, there is no single eigenvector which corresponds to λ = 2. However, since the system is independent of x₂ when λ = 2, we can let x₂ be anything we want. Say x₁ = s and x₂ = t. Then we can write the solution to the system as

    (x₁, x₂, x₃)' = s (1, 0, 1)' + t (0, 1, 0)'.

Therefore, the corresponding eigenvectors are (1, 0, 1)' and (0, 1, 0)'. To check that this is the case, notice that

    (A − 2I)(1, 0, 1)' = (−2 + 0 + 2, −1 + 0 + 1, −1 + 0 + 1)' = (0, 0, 0)'

and

    (A − 2I)(0, 1, 0)' = (0, 0, 0)'.

Generally, if the matrix is an n × n matrix, we have n eigenvalues (counted with multiplicity), because an nth-order polynomial has n roots. If an eigenvalue is repeated k times, then there are at most k linearly independent eigenvectors corresponding to the repeated root (exactly k when, as in this example, the matrix is diagonalizable). Also, a matrix A is invertible if and only if λ = 0 is not an eigenvalue of A.
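As a quick numerical check of this example, the eigenvalues and the eigenvectors found by hand can be verified in Python/NumPy (an added illustration, not part of the original notes):

    import numpy as np

    A = np.array([[ 0, 0, 2],
                  [-1, 2, 1],
                  [-1, 0, 3]], dtype=float)

    # The eigenvalues should form the multiset {1, 2, 2} (possibly in another order).
    eigenvalues, _ = np.linalg.eig(A)
    print(np.sort(eigenvalues))              # approx. [1., 2., 2.]

    # Each eigenvector v found above should satisfy A v = λ v.
    checks = [(1, np.array([2., 1., 1.])),
              (2, np.array([1., 0., 1.])),
              (2, np.array([0., 1., 0.]))]
    for lam, v in checks:
        print(np.allclose(A @ v, lam * v))   # True, True, True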

Diagonalization

A square matrix A is called diagonalizable if there exists an invertible matrix P such that P⁻¹AP is a diagonal matrix. We state without proof the following theorem:

Let A be n × n, and assume A has n linearly independent eigenvectors. Define the matrices

    Λ = [ λ₁   0  ...   0 ]
        [  0  λ₂  ...   0 ]        P = ( p₁  p₂  ...  pₙ ),
        [  :   :        : ]
        [  0   0  ...  λₙ ]

where pᵢ is the ith linearly independent eigenvector (the eigenvector corresponding to λᵢ). Then Λ = P⁻¹AP.

Consider again the matrix

    A = [  0   0   2 ]
        [ −1   2   1 ]
        [ −1   0   3 ]

and remember that the corresponding eigenvalues were λ = {1, 2, 2}, with eigenvectors (2, 1, 1)', (1, 0, 1)', and (0, 1, 0)'. Therefore, the matrix P can be expressed as

    P = [ 2   1   0 ]
        [ 1   0   1 ]
        [ 1   1   0 ]

Notice that

    P⁻¹ = [  1   0  −1 ]
          [ −1   0   2 ]
          [ −1   1   1 ]

Then we can see that

    P⁻¹AP = [  1   0  −1 ] [  0   0   2 ] [ 2   1   0 ]   [ 1   0   0 ]
            [ −1   0   2 ] [ −1   2   1 ] [ 1   0   1 ] = [ 0   2   0 ] = Λ.
            [ −1   1   1 ] [ −1   0   3 ] [ 1   1   0 ]   [ 0   0   2 ]
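The factorization can also be verified numerically. Here is a minimal Python/NumPy sketch (an added illustration, not part of the original notes) checking that P⁻¹AP is diagonal with the eigenvalues on the main diagonal:

    import numpy as np

    A = np.array([[ 0, 0, 2],
                  [-1, 2, 1],
                  [-1, 0, 3]], dtype=float)

    # Columns of P are the eigenvectors found above, ordered to match the eigenvalues {1, 2, 2}.
    P = np.array([[2, 1, 0],
                  [1, 0, 1],
                  [1, 1, 0]], dtype=float)

    Lam = np.linalg.inv(P) @ A @ P
    print(np.round(Lam, 10))                                            # approx. diag(1, 2, 2)
    print(np.allclose(A, P @ np.diag([1, 2, 2]) @ np.linalg.inv(P)))    # True

Since A = PΛP⁻¹, powers of A reduce to powers of the diagonal matrix Λ, which is one reason diagonalization is useful (for example, in the Markov chain calculations below).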

Markov Chains

Assume k states of the world, which we label 1, 2, ..., k. Let p_ij be the probability that we move to state j from state i, and call it the transition probability. Then the transition probability matrix of the Markov chain is the matrix

    P = [ p₁₁  ...  p₁ₖ ]
        [  :         :  ]
        [ pₖ₁  ...  pₖₖ ]

Notice that the rows are indexed by the current state and the columns by the states into which we can move. Also notice that each row of the transition probability matrix must add up to one (otherwise the system would have a positive probability of moving into an undefined state).

Say that there are three states of the world: rainy, overcast, and sunny. Let state 1 be rainy, state 2 be overcast, and state 3 be sunny. Consider the transition probability matrix

    P = [ 0.8  0.1  0.1 ]
        [ 0.3  0.2  0.5 ]
        [ 0.2  0.6  0.2 ]

For example, if it is rainy, the probability it remains rainy is 0.8. Also, if it is sunny, the probability that it becomes overcast is 0.6.

The transition probability matrix tells us the likelihood of each state of the system next period. However, what if we want to know the likelihood of each state in two periods? Call this probability transition matrix P₂. It can be shown that P·P = P₂. Similarly, it can be shown that Pₙ = Pⁿ.

If it is rainy today, what is the probability that it will be cloudy the day after tomorrow?

    P² = [ 0.8  0.1  0.1 ] [ 0.8  0.1  0.1 ]   [ 0.69  0.16  0.15 ]
         [ 0.3  0.2  0.5 ] [ 0.3  0.2  0.5 ] = [ 0.40  0.37  0.23 ]
         [ 0.2  0.6  0.2 ] [ 0.2  0.6  0.2 ]   [ 0.38  0.26  0.36 ]

Therefore, there is a 0.16 chance that if it is rainy today, then it will be cloudy the day after tomorrow.

Implicitly we have been assuming that we know the state today; for example, we say "assume it is cloudy." But what if we only know a probability distribution over today's states? We can define an initial state of probabilities as x₀ = (p₁, p₂, ..., pₖ). Then the vector of probabilities for tomorrow is x₁ = x₀P.

If there is a 30% chance of rain today and a 10% chance of sun (so a 60% chance of clouds), what is the probability that it will be cloudy the day after tomorrow?

    x₂ = x₀P² = ( 0.3  0.6  0.1 ) [ 0.69  0.16  0.15 ]
                                  [ 0.40  0.37  0.23 ] = ( 0.485  0.296  0.219 )
                                  [ 0.38  0.26  0.36 ]

Therefore, there is a 0.296 chance that it will be cloudy the day after tomorrow.
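A short Python/NumPy sketch (again an added illustration, not part of the original notes) reproduces the two-period calculations:

    import numpy as np

    # Transition matrix for the rainy / overcast / sunny example.
    P = np.array([[0.8, 0.1, 0.1],
                  [0.3, 0.2, 0.5],
                  [0.2, 0.6, 0.2]])

    P2 = P @ P
    print(P2[0, 1])        # 0.16: rainy today -> overcast the day after tomorrow

    x0 = np.array([0.3, 0.6, 0.1])            # 30% rain, 60% overcast, 10% sun today
    x2 = x0 @ np.linalg.matrix_power(P, 2)
    print(x2)              # approx. [0.485, 0.296, 0.219]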

that xp n q for all x. Therefore, if we wait long enough in the system, the probabilities of certain events occurring will converge to q. What is the probability it will be cloudy as n? In order to nd the long-run probabilities, we must realize that Notice that if λ =, then we have qp = q P q = q. qp = λq P q = λq. Therefore, q is the eigenvector of P which corresponds to the eigenvalue of λ =. Therefore, it suces to nd this eigenvalue..2.3.2 (P λi)q =..8.6 q =..5.8.2.3.2..8.6..5.8 34 3 4 3 3, 4 q = 34 3 q 3 q 2 = 4 3 q 3 q 3 = q 3. Therefore, q = ( 34 3, ) q 3, where q 3 can be whatever we want. Since we must have that the sum of the elements of q be equal to, we choose q 3 accordingly. 34 3 + 4 3 + 3 3 = 6 3 q 3 = 3 ( 34 6 q = 6, 4 6, 3 ). 6 Homework. Find the eigenvalues and the corresponding eigenvectors for the following matrices: ( ) 3 (a) 8 ( ) 9 (b) 4 2 ( ) 3 (c) 4 ( ) 2 7 (d) 2 ( ) (e) ( ) (f) 5

Homework

1. Find the eigenvalues and the corresponding eigenvectors for the following matrices:

   (a) ( 3  8 )
   (b) ( 9  4  2 )
   (c) ( 3  4 )
   (d) ( 2  7  2 )
   (e) ( )
   (f) ( )

2. Determine whether the following matrices are diagonalizable:

   (a) ( 2  2 )
   (b) ( 2  3 )
   (c) ( 3  2  2 )
   (d) ( 3  4  3  2 )
   (e) ( 2  3  2  3 )

3. Diagonalize the following matrices, if possible:

   (a) ( 4  2  2  7 )
   (b) ( 6 )
   (c) ( 2  2 )
   (d) ( 3  3 )