Here are proofs for some of the results about diagonalization that were presented without proof in class.


Suppose A is an n × n matrix. In what follows, terms like eigenvectors, eigenvalues, and eigenspaces all refer to the matrix A. Here are proofs for some of the results about diagonalization that were presented without proof in class. The first theorem relates the dimension of an eigenspace to the multiplicity of its eigenvalue.

Theorem 1  If λ_i is an eigenvalue for the n × n matrix A, and E_i is the corresponding eigenspace, then

    dim(E_i) ≤ (the multiplicity of the eigenvalue λ_i).

Proof  The proof is a bit complicated to write down in general, but all the ideas are exactly the same as those in the proof given here for a special case. Suppose A is a 3 × 3 matrix with an eigenvalue λ = 7. The eigenspace E for λ = 7 is a subspace of ℝ^3, so its dimension might be 1, 2, or 3 (why is dimension 0 not possible?). Suppose it happens that dim E = 2. So, in this special case, we need to see why it is true that

    dim(E) = 2 ≤ (the multiplicity of the eigenvalue λ = 7).

To prove this statement about the multiplicity, we need to show that the characteristic polynomial C(λ) has a factor (λ − 7)^k, where k ≥ 2. (Similar arguments apply if the eigenspace has dimension 1 or 3; and λ = 7 has no special role at all to play in the proof; substitute any value λ = λ_0 if you like. I used λ = 7 just to make the argument appear less intimidating.)

Strategy: We will do this by showing that A = PCP⁻¹ for some upper triangular matrix C. Then A and C have the same characteristic polynomial and, because C is so simple, we will see right away that the characteristic polynomial has a factor (λ − 7)^k, where k ≥ 2.

Let {v_1, v_2} be a basis for the 2-dimensional eigenspace E. Since {v_1, v_2} is a linearly independent set, we can extend it to get a basis B = {v_1, v_2, w} for ℝ^3. Let P = [v_1 v_2 w]. This is the matrix that changes B-coordinates into standard coordinates: P[x]_B = x and [x]_B = P⁻¹x.

What does A do to each basis vector?

    Av_1 = 7v_1 and Av_2 = 7v_2   (v_1 and v_2 come from the eigenspace for λ = 7)
    Aw = z                        (we don't know what z is)

But whatever z is, we can write it as a linear combination of the basis vectors v_1, v_2, w:

    z = d_1 v_1 + d_2 v_2 + d_3 w.

In terms of B-coordinates these equations say:

    [Av_1]_B = (7, 0, 0),   [Av_2]_B = (0, 7, 0),   [Aw]_B = (d_1, d_2, d_3).

Let

    C = [ [Av_1]_B  [Av_2]_B  [Aw]_B ] =  [ 7   0   d_1 ]
                                          [ 0   7   d_2 ]
                                          [ 0   0   d_3 ]

We claim that A = PCP⁻¹, thereby proving A and C are similar. (How do we know P is invertible?) To check this, it's equivalent to check that AP = PC:

    AP = A[v_1 v_2 w] = [Av_1  Av_2  Aw] = [7v_1  7v_2  z]

    PC = P[ [Av_1]_B  [Av_2]_B  [Aw]_B ] = [ P[Av_1]_B  P[Av_2]_B  P[Aw]_B ]        (*)
       = [Av_1  Av_2  Aw] = [7v_1  7v_2  z]
       (since multiplication by P converts B-coordinates to standard coordinates)

Therefore AP = PC. Since A and C are similar, they have the same characteristic polynomial, namely

    C(λ) = det [ 7−λ    0     d_1  ]
               [  0    7−λ    d_2  ]  =  (7 − λ)^2 (d_3 − λ).
               [  0     0    d_3−λ ]

Therefore the eigenvalue λ = 7 has multiplicity ≥ 2. (Note: we can't say the multiplicity = 2, because perhaps d_3 = 7; that we don't know.)

If you prefer, the calculation (*) could be written without referring to coordinates:

    PC = [v_1 v_2 w] [ 7   0   d_1 ]
                     [ 0   7   d_2 ]
                     [ 0   0   d_3 ]
       = [ 7v_1   7v_2   d_1 v_1 + d_2 v_2 + d_3 w ] = [ 7v_1  7v_2  z ].  ∎
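Here is a minimal numerical sketch of the construction in this proof, using NumPy. The particular vectors v_1, v_2, w and the entries d_1, d_2, d_3 below are made-up illustrative values, not anything from the notes; the point is only to see dim(E_7) = 2 and the factor (λ − 7)^2 appear for a concrete A = PCP⁻¹.

    import numpy as np

    v1 = np.array([1.0, 0.0, 1.0])
    v2 = np.array([0.0, 1.0, 1.0])
    w  = np.array([0.0, 0.0, 1.0])           # extends {v1, v2} to a basis of R^3
    P  = np.column_stack([v1, v2, w])         # change of coordinates: B-coords -> standard

    d1, d2, d3 = 2.0, -1.0, 5.0               # invented values for the unknown entries
    C = np.array([[7.0, 0.0, d1],
                  [0.0, 7.0, d2],
                  [0.0, 0.0, d3]])            # upper triangular, as in the proof

    A = P @ C @ np.linalg.inv(P)              # A is similar to C by construction

    # dimension of the eigenspace of A for lambda = 7
    dim_E7 = 3 - np.linalg.matrix_rank(A - 7 * np.eye(3))
    print(dim_E7)                             # 2

    # roots of the characteristic polynomial det(lambda*I - A):
    # 7, 7, and d3 (up to rounding and ordering), so (lambda - 7)^2 is a factor
    print(np.roots(np.poly(A)))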

Corollary 1.1  The sum of the dimensions of the eigenspaces of A is ≤ n.

Proof  By the theorem,

    (sum of the dimensions of the eigenspaces) ≤ (sum of the multiplicities of the eigenvalues).

But the multiplicity of an eigenvalue λ_i is the largest exponent k such that (λ − λ_i)^k is a factor of the characteristic polynomial. Therefore the sum of the exponents (multiplicities) for all the eigenvalues must be ≤ n, because the characteristic polynomial has degree n.  ∎

The theorem gives us some information about linear independence of eigenvectors. (But it's the second corollary, below, that we really want.)

Theorem 2  Suppose X = {v_1, ..., v_p} is a set of linearly independent eigenvectors whose eigenvalues are λ_1, λ_2, ..., λ_p. (Some of these λ_i's may be equal; that is, some of the eigenvectors might come from the same eigenspace.) Let Y = {u_1, ..., u_k} be another set of linearly independent eigenvectors, all with the same eigenvalue λ, where λ ≠ λ_1, ..., λ ≠ λ_p. (In other words, the u_i's are all from the same eigenspace E_λ, and none of the v_i's is from E_λ.) Then X ∪ Y = {v_1, ..., v_p, u_1, ..., u_k} is linearly independent.

Stated more informally: if X is a linearly independent set of eigenvectors chosen from one or more different eigenspaces, and Y is a linearly independent set of eigenvectors chosen from a fresh eigenspace, then uniting the two sets gives a larger set that's still linearly independent.

Proof  Suppose there is a dependency relation

    c_1 v_1 + ... + c_p v_p + d_1 u_1 + ... + d_k u_k = 0.        (*)

(We want to show that all the weights c_i and d_i must be 0.)

Let d_1 u_1 + ... + d_k u_k = w, so that

    c_1 v_1 + ... + c_p v_p + w = 0.        (**)

If we multiply both sides of (**) by λ, we get

    λ c_1 v_1 + ... + λ c_p v_p + λ w = 0.        (***)

If we multiply both sides of (**) on the left by A, we get

    c_1 A v_1 + ... + c_p A v_p + A w = 0.

Now v_1, ..., v_p are eigenvectors with eigenvalues λ_1, ..., λ_p. Also, w is an eigenvector with eigenvalue λ (why? look at how w is defined: what are u_1, ..., u_k?). Therefore the preceding equation becomes

    c_1 λ_1 v_1 + ... + c_p λ_p v_p + λ w = 0.        (****)

If we subtract the two boxed equations (***) and (****), we get

    c_1 (λ − λ_1) v_1 + ... + c_p (λ − λ_p) v_p = 0.

Since {v_1, ..., v_p} is linearly independent, all the weights on the left must be 0:

    c_1 (λ − λ_1) = 0, ..., c_p (λ − λ_p) = 0.

But none of the factors (λ − λ_i) is 0 (because we are assuming that λ is different from all the other eigenvalues λ_1, ..., λ_p). So it must be that c_1 = ... = c_p = 0.

Therefore our first equation (*) now says:

    d_1 u_1 + ... + d_k u_k = 0.

But {u_1, ..., u_k} is also linearly independent, so d_1 = ... = d_k = 0. Therefore all the coefficients in the original equation (*) must be 0; in other words, the set {v_1, ..., v_p, u_1, ..., u_k} is linearly independent.  ∎

Corollary 2.1  Suppose B_1, B_2, ..., B_p are bases for several different eigenspaces E_{λ_1}, E_{λ_2}, ..., E_{λ_p}. Then B_1 ∪ B_2 ∪ ... ∪ B_p is still linearly independent.

Stated more informally: if we choose bases for several eigenspaces and unite the bases into a single collection of vectors, the new collection is a linearly independent set (in ℝ^n).

Proof  In the theorem above, let X = B_1 and Y = B_2. These are linearly independent sets of eigenvectors, and all the eigenvectors in Y have the same eigenvalue (λ_2), which is different from the eigenvalue (λ_1) of any vector in X. Applying the theorem to X and Y, we get that X ∪ Y = B_1 ∪ B_2 is linearly independent.

Now repeat the argument. Let X = B_1 ∪ B_2 and Y = B_3. X and Y are linearly independent sets of eigenvectors, and all the eigenvectors in Y have the same eigenvalue (λ_3), which is different from the eigenvalues (λ_1 or λ_2) of any vector in X. So, according to the theorem, X ∪ Y = B_1 ∪ B_2 ∪ B_3 is linearly independent.

Repeat the argument again. Let X = B_1 ∪ B_2 ∪ B_3 and Y = B_4, and apply the theorem to conclude that X ∪ Y = B_1 ∪ B_2 ∪ B_3 ∪ B_4 is linearly independent.

Continue using the theorem in this way, over and over, until you reach the conclusion that B_1 ∪ B_2 ∪ ... ∪ B_p is linearly independent.  ∎

Notice: Corollary 2.1 shows us again (in another way) that the sum of the dimensions of the eigenspaces must be ≤ n. Choose bases B_1, ..., B_p for all of the eigenspaces. By Corollary 2.1, B_1 ∪ ... ∪ B_p is a linearly independent set in ℝ^n, so it contains ≤ n vectors. Therefore

    (sum of dimensions of eigenspaces) = (number of vectors in B_1) + ... + (number of vectors in B_p)
                                       = (number of vectors in B_1 ∪ ... ∪ B_p) ≤ n.
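For readers who like to try such statements on a concrete matrix, here is a rough NumPy illustration of Corollary 2.1; the diagonal test matrix and the tolerance are invented for the example. It computes a basis for each eigenspace, unites them, and confirms that the united set has full rank, i.e. is linearly independent.

    import numpy as np

    A = np.diag([3.0, 3.0, 5.0, 5.0, 5.0])   # eigenspaces: E_3 (dim 2) and E_5 (dim 3)

    def eigenspace_basis(A, lam, tol=1e-10):
        # Orthonormal basis for Null(A - lam*I), computed via the SVD.
        M = A - lam * np.eye(A.shape[0])
        _, s, Vt = np.linalg.svd(M)
        return Vt[s < tol].T                  # columns form a basis of the eigenspace

    B3 = eigenspace_basis(A, 3.0)             # two basis vectors for E_3
    B5 = eigenspace_basis(A, 5.0)             # three basis vectors for E_5
    union = np.column_stack([B3, B5])         # unite the two bases: 5 vectors in R^5
    print(np.linalg.matrix_rank(union))       # 5, so the united set is linearly independent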

Corollary 2.2  If the sum of the dimensions of all the eigenspaces = n, then A is diagonalizable.

Proof  Pick bases B_1, B_2, ..., B_p for all of the different eigenspaces E_{λ_1}, E_{λ_2}, ..., E_{λ_p}. If the sum of the dimensions of the eigenspaces = n, then B = B_1 ∪ ... ∪ B_p is a linearly independent set (by Corollary 2.1) of n eigenvectors in ℝ^n. So ℝ^n has an eigenvector basis, B, and therefore A is diagonalizable.  ∎

Notice that the converse of Corollary 2.2 is also true:

Theorem 3  If A is diagonalizable, then the sum of the dimensions of the eigenspaces is n.

Proof  It is always true that the sum of the eigenspace dimensions is ≤ n. If A is diagonalizable, there is an eigenvector basis B = {v_1, ..., v_n} for ℝ^n. These v_1, ..., v_n come from the different eigenspaces, and the dimension of an eigenspace is ≥ the number of vectors from B that it contains (since v_1, ..., v_n are linearly independent). Therefore (sum of dimensions of eigenspaces) ≥ n. So, if A is diagonalizable, the sum of the dimensions of the eigenspaces = n.  ∎
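The criterion in Corollary 2.2 and Theorem 3 can be tried out numerically. The sketch below is only a heuristic, not a robust algorithm: it assumes the eigenvalues are real and groups them by rounding, which can misbehave for badly conditioned matrices. The test matrices are invented.

    import numpy as np

    def sum_of_eigenspace_dims(A, tol=1e-8):
        # Sum of dim(E_lambda) over the distinct (real) eigenvalues of A.
        n = A.shape[0]
        lams = np.unique(np.round(np.linalg.eigvals(A).real, 8))
        return sum(n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol) for lam in lams)

    A1 = np.diag([5.0, 5.0])                  # E_5 is 2-dimensional
    A2 = np.array([[5.0, 1.0],
                   [0.0, 5.0]])               # E_5 is only 1-dimensional
    print(sum_of_eigenspace_dims(A1))         # 2 = n, so A1 is diagonalizable
    print(sum_of_eigenspace_dims(A2))         # 1 < n, so A2 is not diagonalizable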

Some Conditions Equivalent to Diagonalizability of an n × n Matrix A

    1. A is diagonalizable
    ⇕
    2. There is a basis of eigenvectors of A for ℝ^n
    ⇕
    3. A has n linearly independent eigenvectors
    ⇕
    4. The sum of the dimensions of the eigenspaces is n

We indicated in the earlier notes Introduction to Diagonalization how 1) and 2) are equivalent. 2) and 3) are obviously equivalent, because a basis of eigenvectors for ℝ^n is a set of n linearly independent eigenvectors, and a set of n linearly independent eigenvectors (in ℝ^n) is automatically a basis. The proofs given above show why 3) and 4) are equivalent.
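As a final illustration, here is one way to look at conditions 3) and 4) side by side on a concrete (invented) 3 × 3 matrix; this is a sketch, assuming real eigenvalues. np.linalg.eig always returns n candidate eigenvectors, and the matrix is diagonalizable exactly when n of them are linearly independent.

    import numpy as np

    A = np.array([[2.0, 1.0, 0.0],
                  [0.0, 2.0, 0.0],
                  [0.0, 0.0, 3.0]])

    vals, vecs = np.linalg.eig(A)

    # Condition 3: how many linearly independent eigenvectors did we get?
    print(np.linalg.matrix_rank(vecs, tol=1e-8))      # 2 < 3, so A is not diagonalizable

    # Condition 4: sum of the eigenspace dimensions.
    dims = sum(3 - np.linalg.matrix_rank(A - lam * np.eye(3))
               for lam in np.unique(np.round(vals.real, 8)))
    print(dims)                                       # 2 as well, matching condition 3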
