EECS 16A Designing Information Devices and Systems I Spring 2019 Lecture Notes Note 10


10.1 Change of Basis for Vectors

Previously, we have seen that matrices can be interpreted as linear transformations between vector spaces. In particular, an $m \times n$ matrix $A$ can be viewed as a function $A : U \to V$ mapping a vector $\vec{u}$ from a vector space $U \subseteq \mathbb{R}^n$ to the vector $A\vec{u}$ in a vector space $V \subseteq \mathbb{R}^m$. In this note, we explore a different interpretation of square, invertible matrices: as a change of basis.

Let's first start with an example. Consider the vector $\vec{u} = \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} = \begin{bmatrix} 4 \\ 3 \end{bmatrix}$. When we write a vector in this form, implicitly we are representing it in the standard basis for $\mathbb{R}^2$, $\vec{e}_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$ and $\vec{e}_2 = \begin{bmatrix} 0 \\ 1 \end{bmatrix}$. This means that we can write $\vec{u} = 4\vec{e}_1 + 3\vec{e}_2$. Geometrically, $\vec{e}_1$ and $\vec{e}_2$ define a grid in $\mathbb{R}^2$, and $\vec{u}$ is represented by its coordinates in that grid.

What if we want to represent $\vec{u}$ as a linear combination of another set of basis vectors, say $\vec{a}_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}$ and $\vec{a}_2 = \begin{bmatrix} 0 \\ 1 \end{bmatrix}$? This means that we need to find scalars $u_{a_1}$ and $u_{a_2}$ such that $\vec{u} = u_{a_1}\vec{a}_1 + u_{a_2}\vec{a}_2$. We can write this equation in matrix form:

$$\begin{bmatrix} \vec{a}_1 & \vec{a}_2 \end{bmatrix} \begin{bmatrix} u_{a_1} \\ u_{a_2} \end{bmatrix} = \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} \quad\Longrightarrow\quad \begin{bmatrix} 1 & 0 \\ 1 & 1 \end{bmatrix} \begin{bmatrix} u_{a_1} \\ u_{a_2} \end{bmatrix} = \begin{bmatrix} 4 \\ 3 \end{bmatrix}$$

Thus we can find $u_{a_1}$ and $u_{a_2}$ by solving a system of linear equations. Since the inverse of $\begin{bmatrix} 1 & 0 \\ 1 & 1 \end{bmatrix}$ is $\begin{bmatrix} 1 & 0 \\ -1 & 1 \end{bmatrix}$, we get $u_{a_1} = 4$ and $u_{a_2} = -1$, so we can write $\vec{u} = 4\vec{a}_1 - \vec{a}_2$. Geometrically, $\vec{a}_1$ and $\vec{a}_2$ define a skewed grid from which the new coordinates are computed.

From this example, we can see that the same vector $\vec{u}$ can be represented in multiple ways! In the standard basis $\vec{e}_1, \vec{e}_2$, the coordinates of $\vec{u}$ are $4, 3$. In the skewed basis $\vec{a}_1, \vec{a}_2$, the coordinates of $\vec{u}$ are $4, -1$. It's the same vector geometrically, but with different coordinates.

In general, suppose we are given a vector $\vec{u} \in \mathbb{R}^n$ in the standard basis and want to change to a different basis with linearly independent basis vectors $\vec{a}_1, \ldots, \vec{a}_n$. If we denote the vector in the new basis as $\vec{u}_a = \begin{bmatrix} u_{a_1} & \cdots & u_{a_n} \end{bmatrix}^T$, we solve the equation $A\vec{u}_a = \vec{u}$, where $A$ is the matrix $\begin{bmatrix} \vec{a}_1 & \cdots & \vec{a}_n \end{bmatrix}$. Therefore the change of basis is given by:

$$\vec{u}_a = A^{-1}\vec{u}$$
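The computation above is easy to spot-check numerically. The following short sketch (not part of the original note) uses numpy to recover the coordinates of $\vec{u} = (4, 3)$ in the basis $\vec{a}_1 = (1, 1)$, $\vec{a}_2 = (0, 1)$ by solving $A\vec{u}_a = \vec{u}$:

```python
import numpy as np

# Basis vectors a1 and a2 form the columns of A.
A = np.array([[1.0, 0.0],
              [1.0, 1.0]])

u = np.array([4.0, 3.0])  # coordinates of u in the standard basis

# Solve A @ u_a = u for the coordinates in the new basis.
u_a = np.linalg.solve(A, u)
print(u_a)  # [ 4. -1.]  ->  u = 4*a1 - 1*a2

# Changing back to the standard basis is just u = A @ u_a.
assert np.allclose(A @ u_a, u)
```

Solving the linear system with `np.linalg.solve` is numerically preferable to forming `np.linalg.inv(A) @ u` explicitly, though both express the same relation $\vec{u}_a = A^{-1}\vec{u}$.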

If we already have a vector $\vec{u}_a$ in the basis $\vec{a}_1, \ldots, \vec{a}_n$, how do we change it back to a vector $\vec{u}$ in the standard basis? We can reverse the change of basis transformation, so $\vec{u} = A\vec{u}_a$. Pictorially, the relationship between any two bases and the standard basis is given by:

$$\text{Basis } \vec{a}_1, \ldots, \vec{a}_n \;\overset{A}{\underset{A^{-1}}{\rightleftarrows}}\; \text{Standard Basis} \;\overset{B^{-1}}{\underset{B}{\rightleftarrows}}\; \text{Basis } \vec{b}_1, \ldots, \vec{b}_n$$

For example, consider a vector written in the basis $\vec{a}_1, \ldots, \vec{a}_n$:

$$\vec{u} = u_{a_1}\vec{a}_1 + \cdots + u_{a_n}\vec{a}_n = A\vec{u}_a$$

Suppose we would like to represent it as a linear combination of different basis vectors $\vec{b}_1, \ldots, \vec{b}_n$. In other words, we'd like to write it as:

$$\vec{u} = u_{b_1}\vec{b}_1 + \cdots + u_{b_n}\vec{b}_n = B\vec{u}_b$$

Setting these equal gives the following relationship:

$$B\vec{u}_b = A\vec{u}_a \quad\Longrightarrow\quad \vec{u}_b = B^{-1}A\vec{u}_a$$

Thus the change of basis transformation from basis $\vec{a}_1, \ldots, \vec{a}_n$ to basis $\vec{b}_1, \ldots, \vec{b}_n$ is given by $B^{-1}A$.

10.2 Change of Basis for Linear Transformations

Now that we know how to change the basis of vectors, let's shift our attention to linear transformations. In this section we answer two questions: how do we change the basis of a linear transformation, and what does this mean?

First, let's review linear transformations. Suppose we have a linear transformation $T$, represented by an $n \times n$ matrix, that transforms $\vec{u} \in \mathbb{R}^n$ to $\vec{v} \in \mathbb{R}^n$:

$$\vec{v} = T\vec{u}$$

Although $T$ is represented by a matrix, we can think of it as a geometric transformation on vectors. Implicit in this representation is the choice of a coordinate system. Unless stated otherwise, we always assume that vectors lie in the coordinate system defined by the standard basis vectors $\vec{e}_1, \vec{e}_2, \ldots, \vec{e}_n$. But what if our vectors are represented in a different basis? Suppose we have basis vectors $\vec{a}_1, \ldots, \vec{a}_n \in \mathbb{R}^n$, and the vectors $\vec{u}, \vec{v}$ above are represented in this basis:

$$\vec{u} = u_{a_1}\vec{a}_1 + \cdots + u_{a_n}\vec{a}_n \qquad \vec{v} = v_{a_1}\vec{a}_1 + \cdots + v_{a_n}\vec{a}_n$$

Can we also represent the transformation $T$ in this basis, such that $\vec{v}_a = T_a\vec{u}_a$? Let's start by defining the matrix $A = \begin{bmatrix} \vec{a}_1 & \cdots & \vec{a}_n \end{bmatrix}$.

Since our vectors $\vec{u}$ and $\vec{v}$ are linear combinations of the basis vectors, we can write them as $\vec{u} = A\vec{u}_a$ and $\vec{v} = A\vec{v}_a$. This is exactly the formula defined in the previous section to change a vector back to the standard basis! Now, starting with our transformation $T$ in the standard basis, we can plug in these relationships:

$$T\vec{u} = \vec{v} \quad\Longrightarrow\quad TA\vec{u}_a = A\vec{v}_a \quad\Longrightarrow\quad A^{-1}TA\vec{u}_a = \vec{v}_a$$

By pattern matching, we see that if we set $T_a = A^{-1}TA$, we get our desired relationship, $T_a\vec{u}_a = \vec{v}_a$. The correspondences stated above are all represented in the following diagram:

$$\begin{array}{ccc}
\vec{u} & \xrightarrow{\;T\;} & \vec{v} \\
A^{-1}\big\downarrow\big\uparrow A & & A^{-1}\big\downarrow\big\uparrow A \\
\vec{u}_a & \xrightarrow{\;T_a\;} & \vec{v}_a
\end{array}$$

For example, there are two ways to get from $\vec{u}_a$ to $\vec{v}_a$. The first is the transformation $T_a$. The second is to trace out the longer path, applying the transformations $A$, $T$, and $A^{-1}$ in order. This is represented in matrix form as $T_a = A^{-1}TA$ (recall that matrices act from right to left: the matrix closest to the vector is applied first). By the same logic, we can go in the reverse direction: $T = AT_aA^{-1}$. Let's look at what's happening at each step. First, $A^{-1}$ transforms a vector in the standard basis into the $A$ basis. Then $T_a$ acts on it, returning a new vector which is also in the $A$ basis. Finally, multiplying by $A$ returns a weighted sum of the columns of $A$, transforming the vector back into the original standard basis.

10.3 A Diagonalizing Basis

We just saw that transformations look different in different bases. Is there a special basis in which a transformation takes a particularly nice form? Making our transformation into a diagonal matrix would be very useful, since diagonal matrices are easily invertible, they can be raised to high powers without a lot of computation, and multiplication between diagonal matrices is commutative.

Suppose that we choose our basis vectors $\vec{a}_1, \ldots, \vec{a}_n$ to be the eigenvectors of the transformation matrix $T$, with associated eigenvalues $\lambda_1, \ldots, \lambda_n$. What does $T\vec{u}$ look like now? Recall that $\vec{u}$ can be written in the new basis: $u_{a_1}\vec{a}_1 + \cdots + u_{a_n}\vec{a}_n$.

$$T\vec{u} = T(u_{a_1}\vec{a}_1 + \cdots + u_{a_n}\vec{a}_n) = u_{a_1}T\vec{a}_1 + \cdots + u_{a_n}T\vec{a}_n$$

By the definition of eigenvectors and eigenvalues:

$$T\vec{u} = u_{a_1}\lambda_1\vec{a}_1 + \cdots + u_{a_n}\lambda_n\vec{a}_n = \begin{bmatrix} \vec{a}_1 & \cdots & \vec{a}_n \end{bmatrix} \begin{bmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{bmatrix} \begin{bmatrix} u_{a_1} \\ \vdots \\ u_{a_n} \end{bmatrix} = AD\vec{u}_a = ADA^{-1}\vec{u},$$

where $D$ is the diagonal matrix of eigenvalues and $A$ is the matrix with the corresponding eigenvectors as its columns. Thus we have shown that in an eigenvector basis, $T = ADA^{-1}$. In particular $T_a$, the counterpart of $T$ in the eigenvector basis, is a diagonal matrix.

How is transforming $T$ into a diagonal matrix useful? Say we want to apply our transformation $T$ many times, so we are interested in calculating $T^k$ where $k$ could be quite large. Although we could do this calculation with repeated matrix multiplication, it would be quite tedious, and even with a computer it is slow for large matrices. However, we just showed that we can write $T = ADA^{-1}$ where $D$ is diagonal. Consider raising this to the power of $k$; we can expand it out $k$ times and then regroup the terms:

$$\begin{aligned}
T^k &= (ADA^{-1})^k \\
&= (ADA^{-1})(ADA^{-1})\cdots(ADA^{-1})(ADA^{-1}) \\
&= AD(A^{-1}A)D(A^{-1} \cdots A)D(A^{-1}A)DA^{-1} \\
&= AD(I)D(I \cdots)D(I)DA^{-1} \\
&= ADD \cdots DDA^{-1} \\
&= AD^kA^{-1}
\end{aligned}$$

To raise a diagonal matrix to a power, one can simply raise each diagonal element to that power (try proving this to yourself by writing out the multiplication element by element). This is a much simpler calculation than repeatedly doing the full matrix multiplication! In addition, this formulation lets us write equations describing how the transformation changes with $k$; having such equations helps us better understand and design systems.

10.4 Diagonalization

Under what circumstances is a matrix diagonalizable? An $n \times n$ matrix $T$ is diagonalizable if it has $n$ linearly independent eigenvectors. If it has $n$ linearly independent eigenvectors $\vec{a}_1, \ldots, \vec{a}_n$ with eigenvalues $\lambda_1, \ldots, \lambda_n$, then we can write:

$$T = ADA^{-1},$$

where $A = \begin{bmatrix} \vec{a}_1 & \cdots & \vec{a}_n \end{bmatrix}$ and $D$ is the diagonal matrix of eigenvalues. Note that the eigenvalues must be arranged to match the arrangement of eigenvectors in $A$. For example, the eigenvalue associated with $\vec{a}_1$ should be in the first column of $D$, the eigenvalue associated with $\vec{a}_2$ should be in the second column of $D$, and so on.

Example 10.1 (A matrix that is not diagonalizable): Let $T = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$. To find the eigenvalues, we solve the equation:

$$\det(T - \lambda I) = 0 \quad\Longrightarrow\quad (1 - \lambda)^2 = 0 \quad\Longrightarrow\quad \lambda = 1$$

The eigenvector corresponding to $\lambda = 1$ is $\vec{a} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$. Since this matrix has only one linearly independent eigenvector but is $2 \times 2$, it is not diagonalizable.

Example 10.2 (Diagonalization of a $3 \times 3$ matrix): Let

$$T = \begin{bmatrix} 3 & 2 & 4 \\ 2 & 0 & 2 \\ 4 & 2 & 3 \end{bmatrix}$$

To find the eigenvalues, we solve the equation:

$$\det(T - \lambda I) = 0 \quad\Longrightarrow\quad \lambda^3 - 6\lambda^2 - 15\lambda - 8 = 0 \quad\Longrightarrow\quad (\lambda - 8)(\lambda + 1)^2 = 0 \quad\Longrightarrow\quad \lambda = -1, 8$$

If $\lambda = -1$, we need to find $\vec{a}$ such that $(T - (-1)I)\vec{a} = \vec{0}$:

$$\begin{bmatrix} 4 & 2 & 4 \\ 2 & 1 & 2 \\ 4 & 2 & 4 \end{bmatrix} \vec{a} = \vec{0} \quad\Longrightarrow\quad \begin{bmatrix} 4 & 2 & 4 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \vec{a} = \vec{0}$$

Thus the dimension of the nullspace of $T - (-1)I$ is 2, and we can find two linearly independent eigenvectors in this nullspace:

$$\vec{a}_1 = \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix} \qquad \vec{a}_2 = \begin{bmatrix} -1 \\ 2 \\ 0 \end{bmatrix}$$

If $\lambda = 8$, we need to find $\vec{a}$ such that $(T - 8I)\vec{a} = \vec{0}$:

$$\begin{bmatrix} -5 & 2 & 4 \\ 2 & -8 & 2 \\ 4 & 2 & -5 \end{bmatrix} \vec{a} = \vec{0} \quad\Longrightarrow\quad \begin{bmatrix} 4 & 2 & -5 \\ 0 & -2 & 1 \\ 0 & 0 & 0 \end{bmatrix} \vec{a} = \vec{0}$$

Thus the dimension of the nullspace of $T - 8I$ is 1, and we find the eigenvector in the nullspace:

$$\vec{a}_3 = \begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix}$$

Now we define:

$$A = \begin{bmatrix} -1 & -1 & 2 \\ 0 & 2 & 1 \\ 1 & 0 & 2 \end{bmatrix} \qquad D = \begin{bmatrix} -1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 8 \end{bmatrix} \qquad A^{-1} = \frac{1}{9}\begin{bmatrix} -4 & -2 & 5 \\ -1 & 4 & -1 \\ 2 & 1 & 2 \end{bmatrix}$$

Then $T$ can be diagonalized as $T = ADA^{-1}$.

Additional Resources: For more on diagonalization, read Strang pages 304-309. In Schaum's, read pages 299-301 and try Problems 9.9 to 9.12 and 9.45 to 9.51.

10.5 A Proof with Diagonalization

Not only can diagonalization help us raise matrices to powers, it can also help us understand some underlying properties of matrices. In this example, we use diagonalization to understand when matrix multiplication is commutative.

Show that if two $n \times n$ diagonalizable matrices $A$ and $B$ have the same eigenvectors, then their matrix multiplication is commutative (i.e., $AB = BA$).

Let $\vec{v}_1, \ldots, \vec{v}_n$ be the shared eigenvectors of $A$ and $B$, and let $P = \begin{bmatrix} \vec{v}_1 & \cdots & \vec{v}_n \end{bmatrix}$ be the matrix constructed from these eigenvectors. Also let $\lambda_{a_1}, \ldots, \lambda_{a_n}$ be the eigenvalues of $A$, and $\lambda_{b_1}, \ldots, \lambda_{b_n}$ be the eigenvalues of $B$. Using diagonalization, we can decompose $A$ and $B$ such that $A = PD_AP^{-1}$ and $B = PD_BP^{-1}$, where $D_A$ is the diagonal matrix of eigenvalues of $A$ and $D_B$ is the diagonal matrix of eigenvalues of $B$:

$$D_A = \begin{bmatrix} \lambda_{a_1} & & \\ & \ddots & \\ & & \lambda_{a_n} \end{bmatrix} \qquad D_B = \begin{bmatrix} \lambda_{b_1} & & \\ & \ddots & \\ & & \lambda_{b_n} \end{bmatrix}$$

Now we can write:

$$AB = (PD_AP^{-1})(PD_BP^{-1}) = PD_A(P^{-1}P)D_BP^{-1} = PD_A(I)D_BP^{-1} = PD_AD_BP^{-1}$$

Because $D_A$ and $D_B$ are both diagonal matrices, $D_AD_B = D_BD_A$. To see why this is the case, let's look at it element by element:

$$D_AD_B = \begin{bmatrix} \lambda_{a_1} & & \\ & \ddots & \\ & & \lambda_{a_n} \end{bmatrix} \begin{bmatrix} \lambda_{b_1} & & \\ & \ddots & \\ & & \lambda_{b_n} \end{bmatrix} = \begin{bmatrix} \lambda_{a_1}\lambda_{b_1} & & \\ & \ddots & \\ & & \lambda_{a_n}\lambda_{b_n} \end{bmatrix} = \begin{bmatrix} \lambda_{b_1}\lambda_{a_1} & & \\ & \ddots & \\ & & \lambda_{b_n}\lambda_{a_n} \end{bmatrix} = D_BD_A$$

Thus:

$$PD_AD_BP^{-1} = PD_BD_AP^{-1} = PD_B(P^{-1}P)D_AP^{-1} = (PD_BP^{-1})(PD_AP^{-1}) = BA$$

We've now shown that $A$ and $B$ are commutative if they have the same eigenvectors. We can use this to design matrices that commute!

10.6 Practice Problems

These practice problems are also available in an interactive form on the course website (http://inst.eecs.berkeley.edu/~ee16a/sp19/hw-practice/).

1. If $\vec{v} = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$ is in the standard basis, what is $\vec{v}$ in the basis of $\vec{a}_1 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}$, $\vec{a}_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}$, $\vec{a}_3 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}$?

2. True or False: A $3 \times 3$ matrix with the eigenvectors $\begin{bmatrix} 1 \\ 1 \\ 2 \end{bmatrix}$, $\begin{bmatrix} 1 \\ 2 \\ 4 \end{bmatrix}$, $\begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}$ is diagonalizable.

3. Which of the following matrices is equivalent to $\begin{bmatrix} 0.5 & 0.5 \\ 0 & 1 \end{bmatrix}^k$?

(a) $\begin{bmatrix} 1 & 0 \\ 0 & 0.5^k \end{bmatrix}$

(b) $\begin{bmatrix} 1 & 0 \\ 0.5^k & 0.5^k \end{bmatrix}$

(c) $\begin{bmatrix} 0.5^k & 1 - 0.5^k \\ 0 & 1 \end{bmatrix}$

(d) $\begin{bmatrix} 1 & 0 \\ 1 + 0.5^k & 0.5^k \end{bmatrix}$
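To close the loop on Sections 10.3 and 10.4, here is a short numpy sketch (not part of the original note) that applies $T^k = AD^kA^{-1}$ to a concrete matrix, taking $M = \begin{bmatrix} 0.5 & 0.5 \\ 0 & 1 \end{bmatrix}$ as in practice problem 3 above; the eigenvector basis and closed form below follow from that assumed matrix.

```python
import numpy as np

# Example transformation: eigenvalue 0.5 with eigenvector (1, 0),
# eigenvalue 1 with eigenvector (1, 1).
M = np.array([[0.5, 0.5],
              [0.0, 1.0]])

A = np.array([[1.0, 1.0],   # eigenvectors as columns
              [0.0, 1.0]])
D = np.diag([0.5, 1.0])     # matching eigenvalue order

# Sanity check the diagonalization M = A D A^{-1}.
assert np.allclose(M, A @ D @ np.linalg.inv(A))

k = 10
# Powering M only requires powering the diagonal entries of D.
M_k = A @ np.diag([0.5**k, 1.0**k]) @ np.linalg.inv(A)

# Closed form for M^k: [[0.5^k, 1 - 0.5^k], [0, 1]].
closed_form = np.array([[0.5**k, 1.0 - 0.5**k],
                        [0.0, 1.0]])
assert np.allclose(M_k, closed_form)

# Agrees with brute-force repeated multiplication.
assert np.allclose(M_k, np.linalg.matrix_power(M, k))
```

Note how the closed form makes the long-run behavior obvious: as $k$ grows, $0.5^k \to 0$, so $M^k$ approaches $\begin{bmatrix} 0 & 1 \\ 0 & 1 \end{bmatrix}$, something that is hard to read off from repeated multiplication alone.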