Discrete Math Class 19

Discrete Math 37110 - Class 19 (2016-12-01)
Instructor: László Babai
Notes taken by Jacob Burroughs
Partially revised by instructor

Review: Tuesday, December 6, 3:30-5:20 pm; Ry-276
Final: Thursday, December 8, 10:30-12:30; Ry-251

DO 19.1. IMPORTANT. Study the relevant chapters of LN (about Markov chains) and the Linear Algebra online text.

19.1 Similar matrices

Definition 19.2. Let A, B ∈ M_n(F) where F is a field (e.g., R or C). A and B are similar if there exists an invertible C ∈ M_n(F) such that B = C^{-1}AC. Notation: A ~ B.

DO 19.3. Similarity is an equivalence relation on M_n(F).

DO 19.4. If A ~ B then trace(A) = trace(B). (Hint: trace(CD) = trace(DC).)

DO 19.5. If A ~ B then det(A) = det(B). (Hint: det(CD) = det(C) det(D).)

DO 19.6. If A ~ B then f_A = f_B (their characteristic polynomials are equal).

19.2 Matrix of a linear map

Definition 19.7. A linear transformation is a function f : V → V such that f(a + b) = f(a) + f(b) and f(λa) = λf(a). Equivalently, f(∑ α_i a_i) = ∑ α_i f(a_i).

Definition 19.8. A linear map is a function f : V → W with the same properties.

DO 19.9. If v_1, ..., v_n is a basis of V and w_1, ..., w_n are arbitrary vectors in W, then there exists a unique linear map f such that (∀i)(f(v_i) = w_i).

Definition 19.10. Coordinates: Let e = (e_1, ..., e_n) be a basis of V. Then every v ∈ V can be uniquely written as v = ∑ α_i e_i. The α_i are the coordinates of v wrt (with respect to) the basis e. Arranged in a column vector, we write [v]_e = (α_1, ..., α_n)^T (transpose, to make it a column vector).

19.3 Change of basis

Definition 19.11 (Change of basis matrix). Take two bases: e = (e_1, ..., e_n) (the old basis) and e' = (e'_1, ..., e'_n) (the new basis). The change of basis matrix is S = [[e'_1]_e, ..., [e'_n]_e]. (The i-th column lists the coordinates of e'_i wrt e.)
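The similarity invariants of DO 19.4-19.6 are easy to verify numerically. The following sketch is not part of the original notes; it assumes NumPy is available and uses an arbitrary random matrix pair of my own choosing:

```python
# Illustration of DO 19.4-19.6: if B = C^{-1} A C, then A and B share
# trace, determinant, and characteristic polynomial.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
C = rng.standard_normal((4, 4))      # a random C is invertible almost surely
B = np.linalg.inv(C) @ A @ C         # B = C^{-1} A C, so A ~ B

assert np.isclose(np.trace(A), np.trace(B))          # DO 19.4
assert np.isclose(np.linalg.det(A), np.linalg.det(B))  # DO 19.5
# np.poly returns the coefficients of the characteristic polynomial
assert np.allclose(np.poly(A), np.poly(B))           # DO 19.6
```

Of course a floating-point check is only evidence, not a proof; the hints trace(CD) = trace(DC) and det(CD) = det(C)det(D) give the actual arguments.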

DO 19.12 (Change of coordinates under change of basis). [v]_new = S^{-1}[v]_old.

DO 19.13. [e]_{e'} = ([e']_e)^{-1}.

Definition 19.14 (Matrix of a linear map). Let φ : V → W be a linear map. Let e = (e_1, ..., e_n) be a basis of V and f = (f_1, ..., f_k) be a basis of W. The matrix of φ wrt this pair of bases is [φ]_{e,f} = [[φ(e_1)]_f, ..., [φ(e_n)]_f]. So this is a k × n matrix.

DO 19.15 (Change of matrix under change of bases). Let φ : V → W be a linear map. Let e, f be old bases of V and W respectively, and e', f' be new bases. Let S, T be the corresponding change of basis matrices. Let A = [φ]_{e,f} and A' = [φ]_{e',f'}. Then A' = T^{-1}AS.

Corollary 19.16. If φ : V → V, then [φ]_new = S^{-1}[φ]_old S.

Corollary 19.17. A ~ B if and only if there exist a linear transformation φ : V → V and bases e, e' such that A = [φ]_e and B = [φ]_{e'}.

Corollary 19.18. A linear transformation has a well-defined characteristic polynomial (because similar matrices have the same characteristic polynomial).

Definition 19.19. A is diagonalizable if A ~ D for some diagonal matrix D = diag(λ_1, ..., λ_n). Then f_A(t) = f_D(t) = ∏ (t − λ_i).

Example 19.20. A = [[1, 1], [0, 1]] is not diagonalizable: its only eigenvalue is 1, so the only candidate diagonal form is I; but the only matrix similar to I is I itself.

DO 19.21. A matrix is diagonalizable if and only if it has an eigenbasis. Hint: Write A = [φ]_e and let e' be an eigenbasis. Set S = [e']_e. Then S^{-1}AS is a diagonal matrix D. You need to show that AS = SD.

Corollary 19.22. If f_A has n distinct roots, then A is diagonalizable. Caveat: the converse is false; I is diagonalizable, yet it has multiple eigenvalues.

DO 19.23. A ~ B ⟹ rk A = rk B.

DO 19.24. (∀A)(∀S)(if S is nonsingular then rk(AS) = rk A).

DO 19.25. rk(AB) ≤ rk(A) and rk(AB) ≤ rk(B).
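DO 19.21 and Example 19.20 can be illustrated numerically. This sketch (not part of the original notes; NumPy assumed, example matrices mine) diagonalizes a symmetric matrix via its eigenbasis and shows why [[1, 1], [0, 1]] has no eigenbasis:

```python
# DO 19.21: with S = [eigenvectors as columns], S^{-1} A S is diagonal.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # symmetric; eigenvalues 3 and 1
eigvals, S = np.linalg.eig(A)         # columns of S form an eigenbasis
D = np.linalg.inv(S) @ A @ S
assert np.allclose(D, np.diag(eigvals))   # S^{-1} A S = D, so A ~ D

# Example 19.20: N = [[1,1],[0,1]] has the single eigenvalue 1, and
# dim U_1 = n - rk(I - N) = 1 < 2, so there is no eigenbasis.
N = np.array([[1.0, 1.0],
              [0.0, 1.0]])
assert 2 - np.linalg.matrix_rank(np.eye(2) - N) == 1
```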

19.4 Eigensubspaces. Geometric and algebraic multiplicity of eigenvalues

Definition 19.26. The algebraic multiplicity of an eigenvalue λ is its multiplicity as a root of the characteristic polynomial, i.e., the largest m such that (t − λ)^m | f_A.

Definition 19.27. Geometric multiplicity: the maximum number of linearly independent eigenvectors to λ. Let U_λ = {x : Ax = λx} ≤ F^n. This is the eigensubspace to λ.

DO 19.28. λ is an eigenvalue if and only if dim U_λ ≥ 1. The geometric multiplicity of λ is dim U_λ.

DO 19.29. dim U_λ = n − rk(λI − A). Hint: rank-nullity theorem.

DO 19.30. The geometric multiplicity of λ is less than or equal to the algebraic multiplicity of λ.

DO 19.31. Over C: ∑_{λ ∈ C} (algebraic multiplicity of λ) = n.

DO 19.32. Over C: A is diagonalizable if and only if ∑_{λ ∈ C} (geometric multiplicity of λ) = n, i.e., if and only if for every λ the algebraic and geometric multiplicities are equal.

DO 19.33.* Over C, every matrix is similar to a triangular matrix. (This is a hint for the above exercise.)

19.5 Norm, orthogonality in R^n

Over R: the standard dot product in R^n is x · y = x^T y = ∑ x_i y_i. The vectors x and y are orthogonal if x · y = 0. We define the norm ||x|| = √(x · x) = √(∑ x_i^2).

DO 19.34. Cauchy-Schwarz inequality: |a · b| ≤ ||a|| · ||b||.

DO 19.35. Triangle inequality: ||a + b|| ≤ ||a|| + ||b||.

DO 19.36. Show that the Cauchy-Schwarz inequality is equivalent to the triangle inequality.

DO 19.37. If v_1, ..., v_k ∈ R^n are orthogonal and non-zero, then they are linearly independent.

Definition 19.38. The operator norm of A ∈ R^{k×n}: ||A|| = sup_{x ≠ 0} ||Ax|| / ||x||.
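A quick numerical look at Section 19.5 (an illustration, not part of the original notes; NumPy assumed, the vectors are arbitrary random examples). For the operator norm of Definition 19.38 we use the fact that it equals the largest singular value, which is what NumPy's matrix 2-norm computes:

```python
# Cauchy-Schwarz (DO 19.34), triangle inequality (DO 19.35), operator norm.
import numpy as np

rng = np.random.default_rng(1)
a, b = rng.standard_normal(5), rng.standard_normal(5)

assert abs(a @ b) <= np.linalg.norm(a) * np.linalg.norm(b)             # DO 19.34
assert np.linalg.norm(a + b) <= np.linalg.norm(a) + np.linalg.norm(b)  # DO 19.35

# Definition 19.38: ||A|| = sup_{x != 0} ||Ax|| / ||x||.
A = rng.standard_normal((3, 5))
op_norm = np.linalg.norm(A, 2)        # largest singular value of A
x = rng.standard_normal(5)
# any particular x gives a lower bound on the supremum:
assert np.linalg.norm(A @ x) <= op_norm * np.linalg.norm(x) + 1e-12
```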

DO 19.39. Show that this supremum is a maximum.

DO 19.40. If A = (α_ij) then ||A|| ≤ ∑_{i,j} |α_ij|.

Theorem 19.41 (Spectral theorem). If A ∈ M_n(R) and A = A^T (A is a symmetric real matrix), then A has an orthonormal eigenbasis, i.e., there exist b_1, ..., b_n ∈ R^n such that b_i · b_j = δ_ij (δ_ij = 1 if i = j and 0 if i ≠ j; in particular ||b_i|| = 1) and Ab_i = λ_i b_i. Moreover, any orthonormal system of eigenvectors can be extended to an orthonormal eigenbasis.

DO 19.42. If A = A^T (A is symmetric) then ||A|| = max_i |λ_i|. Hint: spectral theorem.

19.6 Elements of spectral graph theory

DO 19.43. A connected undirected graph is aperiodic if and only if it is not bipartite.

Definition 19.44. The adjacency matrix of a graph G: A_G = (a_ij) where a_ij = 1 if i ~ j (vertices i and j are adjacent) and a_ij = 0 otherwise.

DO 19.45. The graph G is regular of degree r if and only if the all-ones vector 1 is an eigenvector to eigenvalue r.

DO 19.46. For an r-regular graph G, let λ_1, ..., λ_n be the eigenvalues of the adjacency matrix. (Note: these are real because the adjacency matrix is symmetric.) Then (∀i)(|λ_i| ≤ r).

DO 19.47. Let G be a regular graph of degree r with eigenvalues r = λ_1 ≥ λ_2 ≥ ... ≥ λ_n. Prove: λ_2 = r if and only if G is disconnected. (a) Show that this follows from the Perron-Frobenius theorem. (b) Prove this without using Perron-Frobenius.

DO 19.48. Let G be a connected regular graph of degree r. Then −r is an eigenvalue if and only if G is bipartite.

19.7 Rate of convergence of random walk on a graph: spectral estimate

Definition 19.49 (SLEM). Let A be a symmetric real matrix with eigenvalues λ_1 ≥ λ_2 ≥ ... ≥ λ_n. Then SLEM(A) = max_{i ≥ 2} |λ_i| = max(|λ_2|, |λ_n|) (the second largest eigenvalue modulus).

Notation. J is the n × n all-ones matrix.

Theorem 19.50 (Convergence rate of naive random walk on a regular graph). Let G be a connected, regular, non-bipartite graph of degree r. Let A be the adjacency matrix of G, so T = (1/r)A is the transition matrix of the naive random walk on G. Then

lim_{t→∞} T^t = (1/n)J   and   ||T^t − (1/n)J|| ≤ λ^t,

where λ = SLEM(T) = (1/r)·SLEM(A) (so 0 < λ < 1).
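Sections 19.6-19.7 can be exercised on a small concrete graph. The sketch below is not part of the original notes; it assumes NumPy and uses the 5-cycle C_5, which is 2-regular, connected, and non-bipartite, to check DO 19.45 and compute SLEM(T):

```python
# DO 19.45 and Definition 19.49 on the 5-cycle C_5.
import numpy as np

n, r = 5, 2
A = np.zeros((n, n))
for i in range(n):                    # adjacency matrix of the cycle C_5
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

ones = np.ones(n)
assert np.allclose(A @ ones, r * ones)    # 1 is an eigenvector to eigenvalue r

T = A / r                                 # transition matrix of the naive walk
eigs = np.sort(np.linalg.eigvalsh(T))[::-1]   # 1 = λ_1 ≥ λ_2 ≥ ... ≥ λ_n
slem = max(abs(eigs[1]), abs(eigs[-1]))       # SLEM(T), Definition 19.49
assert np.isclose(eigs[0], 1.0)           # λ_1 = 1 since T = (1/r)A
assert 0 < slem < 1                       # connected and non-bipartite
```

For C_5 the eigenvalues of T are cos(2πk/5), k = 0, ..., 4, so the SLEM is |cos(4π/5)| ≈ 0.809.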

Proof. We shall show that the maximum absolute value of the eigenvalues of T^t − (1/n)J is λ^t; since T^t − (1/n)J is symmetric, this equals its operator norm by DO 19.42.

The all-ones vector 1 is an eigenvector of T to eigenvalue 1. Let e_1 be its normalization: e_1 = (1/√n)·1. Extend it to an orthonormal eigenbasis e_1, e_2, ..., e_n of T (possible by the Spectral theorem, since T is symmetric). We claim that e_1, e_2, ..., e_n is also an orthonormal eigenbasis of T^t − (1/n)J. For i ≥ 2 we have e_i ⊥ e_1 and therefore e_i ⊥ 1. Therefore, for i ≥ 2 we have Je_i = 0 and hence (T^t − (1/n)J)e_i = T^t e_i = λ_i^t e_i. Also, T^t e_1 = e_1 (because T is stochastic and therefore T^t is stochastic), and the same holds for (1/n)J: (1/n)Je_1 = e_1. Therefore (T^t − (1/n)J)e_1 = 0. So e_1, e_2, ..., e_n form an orthonormal eigenbasis of T^t − (1/n)J with eigenvalues (in this order) 0, λ_2^t, ..., λ_n^t. The maximum absolute value among these is λ^t, as claimed. Finally, since G is connected and non-bipartite we have λ < 1, so λ^t → 0 and therefore T^t → (1/n)J.
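The proof above can be checked numerically. This sketch (not part of the original notes; NumPy assumed, and the 5-cycle C_5 is my choice of example) verifies that ||T^t − (1/n)J|| = SLEM(T)^t, which is exactly the eigenvalue computation in the proof:

```python
# Numerical check of Theorem 19.50 on the 5-cycle C_5.
import numpy as np

n, r = 5, 2
A = np.zeros((n, n))
for i in range(n):                    # adjacency matrix of C_5
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
T = A / r                             # transition matrix of the naive walk
J = np.ones((n, n))                   # all-ones matrix

eigs = np.sort(np.linalg.eigvalsh(T))[::-1]
slem = max(abs(eigs[1]), abs(eigs[-1]))

for t in (1, 5, 20):
    # matrix 2-norm = operator norm; T^t - (1/n)J is symmetric, so the norm
    # equals its maximum |eigenvalue|, which the proof shows is slem**t
    deviation = np.linalg.norm(np.linalg.matrix_power(T, t) - J / n, 2)
    assert np.isclose(deviation, slem ** t)
```

Note that for symmetric T the bound in the theorem is attained with equality; the "≤" matters for the general (non-symmetric, weighted) case treated in LN.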