Recitation 8: Graphs and Adjacency Matrices


Math 1b                                    TA: Padraic Bartlett
Recitation 8: Graphs and Adjacency Matrices
Week 8                                     Caltech 2011

1 Random Question

Suppose you take a large triangle XYZ and divide it up with straight line segments into a bunch of smaller triangles. Now, suppose you color the vertices of this triangle as follows:

- Vertex X is colored red,
- Vertex Y is colored blue,
- Vertex Z is colored green,
- all of the vertices on the line XY are either red or blue,
- all of the vertices on the line YZ are either blue or green, and finally
- all of the vertices on the line ZX are either green or red.

Show that you must have colored one of the small triangles with all three colors: i.e. that there is a small triangle with one red vertex, one green vertex, and one blue vertex.

2 Homework comments

Section average: 90%. This was about 5% above the class average; nice work! People did pretty well, all in all! There were two recurring errors that were somewhat concerning:

- Several people wrote that (A^T A)^T = AA^T. This is not true! I've said it like fifteen times by now, but: for square matrices, (AB)^T = B^T A^T. So, specifically, (A^T A)^T = (A)^T (A^T)^T = A^T A; i.e. A^T A is its own transpose! It's an easy mistake to make, but do be careful.

- As we said at the end of last recitation: not all matrices are diagonalizable! For example, the matrix

      A = [ 1  1 ]
          [ 0  1 ]

  is not diagonalizable; in other words, there is no invertible matrix E and diagonal matrix D such that A = EDE^{-1}. (This is because the matrix A has only one eigenvector, up to scalar multiples.)

3 Graphs and Matrices: Definitions and Examples

Definition. A directed graph G = (V, E) consists of the following:

- A set V, which we call the set of vertices for G, and
- a set E ⊆ V^2, made of ordered pairs of vertices, which we call the set of edges for G.

Example. In the map below, take the set of the six labeled countries {A, B, C, D, E, F} as our vertices, and connect two of these vertices with an edge if and only if their corresponding countries share a border of positive length. This then defines a graph, which we draw to the right of our map:

[Figure: a map of six countries A-F, with the corresponding six-vertex graph drawn to its right.]

Definition. Given a graph G = (V, E) on the vertex set V = {1, 2, ..., n}, we can define the adjacency matrix for G as the n x n matrix A_G = (a_ij), where

    a_ij = 1 if the edge (i, j) is in E, and a_ij = 0 otherwise.

Example. If we let G be the map-graph we drew above in our earlier example, then A_G is the 6 x 6 matrix of 0s and 1s read off from the picture, with a 1 in entry (i, j) exactly when countries i and j share a border.

Remark. Conversely, given an n x n matrix A, we can create a graph G_A associated to said matrix by letting

V = {1, ..., n}, and E = {(i, j) : a_ij ≠ 0}.

It bears mentioning that this process effectively undoes our adjacency-matrix operation above: i.e. if we start with a graph G, take its adjacency matrix A_G, and then turn this matrix back into a graph G_{A_G}, we get the same graph back. Similarly, if we start with a matrix made entirely out of zeroes and ones, turn it into a graph, and then back into a matrix, we will have changed nothing.

As the definitions above illustrate, we have ways of converting graphs into matrices, and vice-versa. A good question to ask now is the following: why do we care? In other words, what do we gain by being able to turn graphs into matrices, or vice-versa? We answer this question with our next section, which shows how the adjacency matrix for a graph can shed a lot of light on the graph itself.

4 Graphs and Matrices: A Theorem and its Use

Theorem 1. Suppose that G = (V, E) is a graph with V = {1, ..., n}, that A_G is its corresponding n x n adjacency matrix, and that i, j are a pair of vertices in G. Then there is a path of length m from i to j if and only if the (i, j)-th entry of (A_G)^m is nonzero.

(For a graph G, a path from i to j of length m is a sequence of vertices (i, c_1, c_2, ..., c_{m-1}, j) and a corresponding sequence of edges ((i, c_1), (c_1, c_2), ..., (c_{m-1}, j)). You can think of this as describing a way to walk from i to j while using precisely m edges. Note that your path is allowed to cross over itself; we are not asking that any of these edges or vertices be distinct, merely that they exist.)

Proof. To prove this claim, we first make the following inductive observation:

Claim 2. The (i, j)-th entry of (A_G)^m is the following sum, taken over all choices of the m - 1 intermediate vertices c_1, ..., c_{m-1}:

    ((A_G)^m)_{i,j} = Σ_{c_1, ..., c_{m-1} = 1}^{n}  a_{i,c_1} a_{c_1,c_2} ⋯ a_{c_{m-1},j}.

Proof. We prove this claim by induction on m. For m = 1, there are no intermediate vertices and the claim is trivially clear: a_{i,j} = a_{i,j}. So: we assume that this holds for m, and seek to prove our claim for m + 1; i.e. we seek to show that the (i, j)-th entry of (A_G)^{m+1} is

    Σ_{c_1, ..., c_m = 1}^{n}  a_{i,c_1} a_{c_1,c_2} ⋯ a_{c_m,j}.

So: to do this, write (A_G)^{m+1} = A_G · (A_G)^m. Then we can write the (i, j)-th entry of (A_G)^{m+1} as the i-th row of A_G times the j-th column of (A_G)^m. We know that the i-th row

of A_G is just (a_{i,1}, ..., a_{i,n}), and by our inductive hypothesis we know that the j-th column of (A_G)^m is the column vector whose k-th entry is

    Σ_{c_1, ..., c_{m-1} = 1}^{n}  a_{k,c_1} ⋯ a_{c_{m-1},j}.

So, if we take the product of this row and column, we have that the (i, j)-th entry of (A_G)^{m+1} is

    Σ_{k=1}^{n}  a_{i,k} ( Σ_{c_1, ..., c_{m-1} = 1}^{n}  a_{k,c_1} ⋯ a_{c_{m-1},j} )
        = Σ_{k, c_1, ..., c_{m-1} = 1}^{n}  a_{i,k} a_{k,c_1} ⋯ a_{c_{m-1},j}.

If we relabel the indices k, c_1, ..., c_{m-1} as c_1, c_2, ..., c_m, this is precisely the sum we wanted; therefore, we've proven our inductive claim.

So: we've just proven that the (i, j)-th entry of (A_G)^m is Σ a_{i,c_1} a_{c_1,c_2} ⋯ a_{c_{m-1},j}. What does this mean? Well, suppose that there was a path of length m from i to j. Then there must be vertices c_1, ..., c_{m-1} such that the edges (i, c_1), ..., (c_{m-1}, j) all exist; in other words, all of the values a_{i,c_1}, ..., a_{c_{m-1},j} must be 1. So their product is also 1; therefore, at least one of the terms in the sum above is nonzero! Because all of the a_{k,l}'s are nonnegative, this means that the entire sum above is positive! Thus, we've just shown that if there *is* a path from i to j of length m, then the (i, j)-th entry of (A_G)^m is nonzero.

Conversely, suppose that the (i, j)-th entry is nonzero. The only way this can happen is if one of the terms in the sum above is nonzero: i.e. if there are vertices c_1, ..., c_{m-1} such that a_{i,c_1} a_{c_1,c_2} ⋯ a_{c_{m-1},j} ≠ 0. But the only way for this to happen is if all of their individual values are 1; i.e. if all of the edges (i, c_1), ..., (c_{m-1}, j) exist! As this constitutes a path of length m from i to j, we've just proven the other direction of our claim: if the (i, j)-th entry of (A_G)^m is nonzero, then there is a path of length m. So we're done!
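The definitions and theorem above are easy to experiment with in code. Below is a minimal plain-Python sketch (the helper functions and the 4-vertex path graph are made up for illustration, not taken from the recitation): it builds an adjacency matrix from an edge set, checks the round-trip remark from Section 3, and then tests Theorem 1 on a few small cases. As a bonus, the sum in Claim 2 shows slightly more than the theorem asserts: the (i, j)-th entry of (A_G)^m doesn't just detect paths of length m from i to j, it counts them.

```python
# Illustrative check of Section 3's definitions and Theorem 1, using a
# made-up path graph 1 - 2 - 3 - 4 (edges in both directions).

def adjacency_matrix(n, edges):
    """Build A_G for vertices 1..n: entry a_ij = 1 iff (i, j) is in E."""
    A = [[0] * n for _ in range(n)]
    for i, j in edges:
        A[i - 1][j - 1] = 1
    return A

def graph_of(A):
    """Read the edge set of G_A back off a matrix: (i, j) with a_ij != 0."""
    n = len(A)
    return {(i, j) for i in range(1, n + 1) for j in range(1, n + 1)
            if A[i - 1][j - 1] != 0}

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def matpow(A, m):
    P = A
    for _ in range(m - 1):
        P = matmul(P, A)
    return P

edges = {(1, 2), (2, 1), (2, 3), (3, 2), (3, 4), (4, 3)}
A = adjacency_matrix(4, edges)

# The round-trip remark: G -> A_G -> G_{A_G} recovers the same edge set.
assert graph_of(A) == edges

# Theorem 1: (A^m)[i][j] != 0 iff some m-edge path runs from i to j.
assert matpow(A, 3)[0][3] == 1   # one 3-edge path: 1 -> 2 -> 3 -> 4
assert matpow(A, 2)[0][3] == 0   # but no 2-edge path from 1 to 4
print(matpow(A, 2)[0][2])        # → 1: the single 2-edge path 1 -> 2 -> 3
```
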

If you don't care about the proof, the following question should be more interesting:

Question 3. Is there a path of length 243 from 1 to 2 in the following graph?

[Figure: the graph on the four vertices {1, 2, 3, 4} whose edges connect 1 to 3, 1 to 4, 2 to 3, and 2 to 4.]

Solution. Call the graph above G, and notice that A_G has the following form:

          [ 0  0  1  1 ]
    A_G = [ 0  0  1  1 ]
          [ 1  1  0  0 ]
          [ 1  1  0  0 ]

By our theorem above, we know that a path of length 243 from 1 to 2 can exist iff the (1,2)-entry of (A_G)^{243} is nonzero. Thus, we've reduced our path-finding question to one of raising matrices to large powers, something we showed last week is trivial for diagonalizable matrices!

So, let's see if this matrix is diagonalizable. To do this, we first find its characteristic polynomial:

    p_{A_G}(λ) = det(λI - A_G)

               = det [  λ   0  -1  -1 ]
                     [  0   λ  -1  -1 ]
                     [ -1  -1   λ   0 ]
                     [ -1  -1   0   λ ]

               = λ(λ^3 - 2λ) - (λ^2) - (λ^2)      (expanding along the first row)

               = λ^4 - 4λ^2 = λ^2 (λ - 2)(λ + 2).

Therefore, our matrix has eigenvalues 0, 2, and -2. To find their multiplicities, we investigate the appropriate nullspaces, which we find by just being clever and noticing certain combinations

of the columns which go to zero:

    E_0    = nullspace [  0   0   1   1 ]  = span{ (1, -1, 0, 0), (0, 0, 1, -1) }
                       [  0   0   1   1 ]
                       [  1   1   0   0 ]
                       [  1   1   0   0 ]

    E_2    = nullspace [ -2   0   1   1 ]  = span{ (1, 1, 1, 1) }
                       [  0  -2   1   1 ]
                       [  1   1  -2   0 ]
                       [  1   1   0  -2 ]

    E_{-2} = nullspace [  2   0   1   1 ]  = span{ (-1, -1, 1, 1) }
                       [  0   2   1   1 ]
                       [  1   1   2   0 ]
                       [  1   1   0   2 ]

Thus, we have four linearly independent eigenvectors! This allows us to write A_G = EDE^{-1}, where E has these eigenvectors as its columns and D is the diagonal matrix of the corresponding eigenvalues:

    E = [  1   0   1  -1 ]        D = [ 0  0  0   0 ]
        [ -1   0   1  -1 ]            [ 0  0  0   0 ]
        [  0   1   1   1 ]            [ 0  0  2   0 ]
        [  0  -1   1   1 ]            [ 0  0  0  -2 ]

Consequently,

    (A_G)^{243} = E D^{243} E^{-1} = E [ 0  0    0         0     ] E^{-1}.
                                       [ 0  0    0         0     ]
                                       [ 0  0  2^{243}     0     ]
                                       [ 0  0    0    (-2)^{243} ]

So, you *could* just multiply this all out and see what you get; however, 2^{243} is really really *really* big, and can kind of crash a lot of programs! Thus, what you might want to try instead is noticing the following trick: the diagonal matrix in the middle? That's just the matrix D times the constant 2^{242}: therefore, because scalar multiplication

commutes with matrices, we can write

    (A_G)^{243} = E (2^{242} D) E^{-1} = 2^{242} E D E^{-1} = 2^{242} A_G.

Therefore, we know that the (1,2)-entry of (A_G)^{243} is nonzero iff the (1,2)-entry of A_G itself is nonzero! By inspection, that entry is 0; therefore, we know that there can be no path of length 243 from 1 to 2.

It bears noting that there was nothing special about 243 in the above proof, other than the fact that it was odd, which is what allowed us to factor the diagonal matrix diag(0, 0, 2^{243}, (-2)^{243}) into 2^{242} · diag(0, 0, 2, -2). Had our exponent been even, of course, this would not have worked: raising the -2 in our matrix to an even power would have made it positive, and we would have had no way to do the factoring trick we just did. (This is because there are, in fact, paths of every positive even length from 1 to 2! Looking at the graph for a while should convince you of this fact, if you don't want to do the matrix multiplication...)
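Every claim in the solution above can be checked exactly in plain Python; since Python integers have arbitrary precision, computing (A_G)^{243} entry by entry poses no overflow problem (the entries just get 70-odd digits long). The helpers below are a quick illustrative sketch, not code from the recitation:

```python
# Exact check of the solution above: the claimed eigenpairs of A_G, and the
# identity (A_G)^243 = 2^242 * A_G. Python ints never overflow, so we can
# compute the 243rd power directly.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[0, 0, 1, 1],
     [0, 0, 1, 1],
     [1, 1, 0, 0],
     [1, 1, 0, 0]]

# The four eigenpairs found above: A v = lambda * v for each.
eigenpairs = [(0, [1, -1, 0, 0]), (0, [0, 0, 1, -1]),
              (2, [1, 1, 1, 1]), (-2, [-1, -1, 1, 1])]
for lam, v in eigenpairs:
    assert matvec(A, v) == [lam * x for x in v]

# Compute A^243 by repeated multiplication and compare with 2^242 * A.
P = A
for _ in range(242):
    P = matmul(P, A)

assert P == [[2**242 * a for a in row] for row in A]
assert P[0][1] == 0  # the (1,2)-entry: no path of length 243 from 1 to 2
```

Replacing 243 with an even exponent breaks the scalar-multiple identity, just as the remark above predicts: (A_G)^2, for instance, has (1,2)-entry 2, reflecting the two 2-edge paths from 1 to 2 (one through vertex 3, one through vertex 4).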