Properties and Recent Applications in Spectral Graph Theory


Virginia Commonwealth University
VCU Scholars Compass
Theses and Dissertations, Graduate School, 2008

Properties and Recent Applications in Spectral Graph Theory
Michelle L. Rittenhouse, Virginia Commonwealth University

Follow this and additional works at: Part of the Physical Sciences and Mathematics Commons
The Author

Downloaded from

This Thesis is brought to you for free and open access by the Graduate School at VCU Scholars Compass. It has been accepted for inclusion in Theses and Dissertations by an authorized administrator of VCU Scholars Compass. For more information, please contact libcompass@vcu.edu.

Properties and Recent Applications in Spectral Graph Theory

By Michelle L. Rittenhouse
Bachelor of Science, University of Pittsburgh, Johnstown, PA, 1989

Director: Dr. Ghidewon Abay Asmerom, Associate Professor, Mathematics and Applied Mathematics

Virginia Commonwealth University
Richmond, Virginia
April 2008

To: David, Mom, Dad and Shannon. Thank you for all of your support and encouragement. I couldn't have done it without you.

To: Nikki and Will. Through determination and perseverance, you can achieve anything.

Table of Contents

Introduction
Chapter 1: Graph Theory
  Section 1.1: Introduction to Graph Theory
  Section 1.2: The Adjacency Matrix
  Section 1.3: The Characteristic Polynomial and Eigenvalues
Chapter 2: Spectral Graph Theory
  Section 2.1: Introduction
  Section 2.2: The Spectrum of a Graph
  Section 2.3: Bipartite Graphs
  Section 2.4: Walks and the Spectrum
Chapter 3: The Laplacian
  Section 3.1: Introduction
  Section 3.2: Spanning Trees
  Section 3.3: An Alternative Approach
Chapter 4: Applications
  Section 4.1: Chemical Applications
  Section 4.2: Protein Structures
  Section 4.3: Graph Coloring
  Section 4.4: Miscellaneous Applications
References

Abstract

There are numerous applications of mathematics, specifically spectral graph theory, within the sciences and many other fields. This paper is an exploration of recent applications of spectral graph theory, including the fields of chemistry, biology, and graph coloring. Topics such as the isomers of alkanes, the importance of eigenvalues in protein structures, and the aid that the spectrum of a graph provides when coloring a graph are covered, as well as others. The key definitions and properties of graph theory are introduced. Important aspects of graphs, such as walks and the adjacency matrix, are explored. In addition, bipartite graphs are discussed along with properties that apply strictly to bipartite graphs. The main focus is on the characteristic polynomial and the eigenvalues that it produces, because most of the applications involve specific eigenvalues. For example, if isomers are organized according to their eigenvalues, a pattern comes to light: there is a parallel between the size of the largest eigenvalue (in comparison to the other eigenvalues) and the maximum degree of the graph. The maximum degree of the graph tells us the most carbon atoms attached to any given carbon atom within the structure. The Laplacian matrix and many of its properties are discussed at length, including the classical Matrix Tree Theorem and Cayley's Tree Theorem. Also, an alternative approach to defining the Laplacian is explored and compared to the traditional Laplacian.

CHAPTER 1: GRAPH THEORY

SECTION 1.1: INTRODUCTION TO GRAPH THEORY

A graph G is a finite nonempty set of points called vertices, together with a set of unordered pairs of distinct vertices called edges. The set of edges may be empty. The degree of a vertex v, deg(v), is the number of edges incident on v. A graph is regular if all vertices have equal degrees. A graph is considered a complete graph if each pair of vertices is joined by an edge. In the example graph below, the set of vertices is V(G) = {a, b, c, d} while the set of edges is E(G) = {ab, ac, bc, bd, cd}. The graph is not complete because vertices a and d are not joined by an edge. deg(a) = 2 = deg(d), deg(b) = 3 = deg(c). Two vertices are adjacent if there is an edge that connects them. In Figure 1-1, vertices a and c are adjacent, while vertices a and d are not.

[Figure 1-1: Graph G on the vertices a, b, c, d]

A wide variety of real world applications can be modeled using vertices and edges of a graph. Examples include electrical nodes and the wires that connect them, the stops and rails of a subway system, and communication systems between cities. The cardinality of the vertex set V(G) is called the order of G and is denoted by p(G), or simply p when the context makes it clear. The cardinality of the edge set E(G) is called the size of G, and is usually denoted by q(G) or q. A graph G can be denoted as a G(p, q) graph.

A v_i–v_j walk in G is a finite sequence of adjacent vertices that begins at vertex v_i and ends at vertex v_j. In Figure 1-1, a walk from a to d would be abcd. Another walk from a to d is acd. If each pair of vertices in a graph is joined by a walk, the graph is said to be connected. The distance between any two vertices in a graph is the number of edges traced on the shortest walk between the two vertices. The distance from a to d is 2. The maximum distance between all of the pairs of vertices of a graph is the diameter of the graph. In Figure 1-1, the distance between any two vertices is either 1 or 2, making the diameter of G 2, or diam(G) = 2.

The vertices and edges may have certain attributes, such as color or weight. Also, the edges may have direction. When the edges are given direction we have a digraph, or directed graph, as shown below in Figure 1-2. Digraphs can be used to model road maps. The vertices represent landmarks, while the edges represent one-way or two-way streets.

[Figure 1-2: A digraph on the vertices home, bank, work, vcu, gym]

An incidence matrix associated with a digraph G is a q x p matrix whose rows represent the edges and whose columns represent the vertices. If an edge k starts at vertex i and ends at vertex j, then row k of the incidence matrix will have +1 in its (k, i) element and -1 in its (k, j) element. All other elements are 0.

Example: A directed graph D and its incidence matrix I are shown in Figure 1-3.

[Figure 1-3: A digraph D with labeled vertices and edges, and its incidence matrix I; the matrix entries were lost in transcription.]

When two vertices are joined by more than one edge, like the example in Figure 1-4, it becomes a multi-graph.

[Figure 1-4: A multigraph on the vertices a, b, c, d]

When a pair of vertices is not distinct, then there is a self-loop. A graph that admits multiple edges and loops is called a pseudograph. In the pseudograph below, edge cc is joining a pair of non-distinct vertices; therefore, there is a self-loop at vertex c.

[Figure 1-5: A pseudograph with a self-loop at vertex c]

Graphs G and H are isomorphic, written G ≅ H, if there is a vertex bijection

f: V(G) → V(H) such that for all u, v ∈ V(G), u and v are adjacent in G if and only if f(u) and f(v) are adjacent in H. Likewise, u and v are not adjacent in G if and only if f(u) and f(v) are not adjacent in H.

[Figure 1-6: Graph G on the vertices a, b, c, d and graph H on the vertices m, n, p, q]

Graphs G and H are isomorphic, because a vertex bijection between them is: f(a) = m, f(b) = n, f(c) = q, and f(d) = p.

SECTION 1.2: ADJACENCY MATRICES

There is a great deal of importance and application to representing a graph in matrix form. One of the key ways to do this is through the adjacency matrix, A. The rows and columns of an adjacency matrix represent the vertices, and the elements tell whether or not there is an edge between any two vertices. Given any element a_ij,

    a_ij = 1 if v_i and v_j are adjacent, and 0 otherwise.

Example: A graph G with its adjacency matrix A.

[Figure 1-7: Graph G on the vertices a, b, c, d with edges ab, ac, bc, cd]

        a b c d
    a   0 1 1 0
    b   1 0 1 0      = A
    c   1 1 0 1
    d   0 0 1 0

Notice that the diagonal of an adjacency matrix of a graph contains only zeros because there are no self-loops. Remember that our graphs have no multiple edges or loops. This causes the trace of the adjacency matrix, written tr(A), the sum of its main diagonal, to be zero. Also, when A represents a graph, it is square, symmetric, and all of the elements are non-negative. In other words, a_ij = a_ji.

Property 1-1: The number of walks of length l from v_i to v_j in G is the element in position (i, j) of the matrix A^l [Bi, p9].

Proof: If l = 0, then the number of walks from v_i to v_j, i ≠ j, is 0, making element

(i, j) = 0. The number of walks from any vertex to itself of length zero is 1, making element (i, i) = 1, giving us the identity matrix. A^0 = I, therefore it checks if l = 0. If l = 1, then A^1 = A, the adjacency matrix. Let Property 1-1 be true for l = L. The set of walks of length L+1 from vertex v_i to vertex v_j is in bijective correspondence with the set of walks of length L from v_i to a vertex v_h adjacent to v_j. In other words, if there is a walk of length L from v_i to v_h, and v_h is adjacent to v_j, then there is a walk of length L+1 from v_i to v_j. The number of such walks is

    sum over {v_h, v_j} in E(G) of (A^L)_ih  =  sum from h = 1 to n of (A^L)_ih a_hj  =  (A^(L+1))_ij.

Therefore, the number of walks of length L+1 joining v_i to v_j is (A^(L+1))_ij. By induction, the property holds.

Example: Look again at graph G and its adjacency matrix A (Figure 1-7). The number of walks from vertex b to vertex d of length 2 can be found by squaring matrix A.

          a b c d
    a     2 1 1 1
    b     1 2 1 1      = A^2
    c     1 1 3 0
    d     1 1 0 1

Therefore, if we look at element (b, d) = element (d, b) = 1, there is one walk from b to d of length 2. That walk is bcd.
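The walk-counting property is easy to check numerically. A minimal sketch in Python/numpy follows; the adjacency matrix is the one reconstructed above for Figure 1-7 (vertices a, b, c, d; edges ab, ac, bc, cd), so that labeling is an assumption carried over from the text.

```python
import numpy as np

# Adjacency matrix of the graph of Figure 1-7 (edges ab, ac, bc, cd).
labels = ["a", "b", "c", "d"]
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]])

A2 = np.linalg.matrix_power(A, 2)   # (i, j) entry = number of walks of length 2
A3 = np.linalg.matrix_power(A, 3)   # (i, j) entry = number of walks of length 3

print(A2[labels.index("b"), labels.index("d")])   # 1: the single walk b-c-d
print(A3[labels.index("a"), labels.index("c")])   # 4: abac, acac, acbc, acdc
```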

Likewise,

          a b c d
    a     2 3 4 1
    b     3 2 4 1      = A^3
    c     4 4 2 3
    d     1 1 3 0

Element (a, c) = element (c, a) = 4. Therefore, there are 4 walks of length 3 from vertex a to vertex c. These walks are: acac, acdc, acbc, abac.

Property 1-1 has two immediate corollaries. We list them as Property 1-2 and Property 1-3.

Property 1-2: The trace of A^2 is twice the number of edges in the graph.

Proof: Let v_i be an arbitrary vertex of G. For every edge v_i v_j of G, by Property 1-1, we get a count of 1 towards the (i, i) position of A^2. That is, the (i, i) position of A^2 is equal to the degree of v_i, because every walk v_i ... v_i that involves only one v_i v_j edge will have a length of 2. This means:

    tr(A^2) = sum over v_i in V(G) of deg(v_i) = 2q.

Recall, q equals the cardinality of the edge set of G. This can be seen in our example in Figure 1-7: the trace of A^2 is 8 and the number of edges is 4.

Property 1-3: The trace of A^3 is six times the number of triangles in the graph.

Proof: Let v_i be an arbitrary vertex of G. For every 3-cycle through v_i in G, again by Property 1-1, we get a count of 1 towards the (i, i) position of A^3. That is, the (i, i) position of A^3 is equal to the number of 3-cycles that start and end at v_i. However, every 3-cycle v_i v_j v_k v_i is counted six times, because the vertices of the 3-cycle can be ordered in 6 ways. This means that tr(A^3) = 6(# of triangles) in the graph.

Again, let's test it with the graph from Figure 1-7. The trace of A^3 is 6 and the number of triangles in the figure is 1, illustrating Properties 1-2 and 1-3.

SECTION 1.3: THE CHARACTERISTIC POLYNOMIAL AND EIGENVALUES

The characteristic polynomial, Γ, of a graph of order n is the determinant det(λI − A), where I is the n x n identity matrix.

Example 1: Given the adjacency matrix A from graph G in Figure 1-8,

        a b c
    a   0 1 1
    b   1 0 1      = A
    c   1 1 0

[Figure 1-8: Graph G on the vertices a, b, c]

    Γ = det(λI − A) = det [  λ  −1  −1 ]
                          [ −1   λ  −1 ]  =  λ^3 − 3λ − 2.
                          [ −1  −1   λ ]

The general form of any characteristic polynomial is

    λ^n + c_1 λ^(n−1) + c_2 λ^(n−2) + ... + c_n.     (Equation 1)

Property 1-4: The coefficients of the characteristic polynomial that coincide with matrix A of a graph G have the following characteristics:
1) c_1 = 0
2) −c_2 is the number of edges of G
3) −c_3 is twice the number of triangles in G

Proof: For each i ∈ {1, 2, ..., n}, the number (−1)^i c_i is the sum of those principal minors of A which have i rows and columns. So we can argue as follows:
1) Since the diagonal elements of A are all zero, c_1 = 0.

2) A principal minor with two rows and columns, and which has a non-zero entry, must be of the form

    [ 0 1 ]
    [ 1 0 ].

There is one such minor for each pair of adjacent vertices, and each has a value of −1. Hence, (−1)^2 c_2 = −|E(G)|.
3) There are essentially three possibilities for non-trivial principal minors with three rows and columns:

    [0 1 0]   [0 1 1]   [0 1 1]
    [1 0 0] , [1 0 0] , [1 0 1] .
    [0 0 0]   [1 0 0]   [1 1 0]

Of these, the only non-zero one is the last one (whose value is two). This principal minor corresponds to three mutually adjacent vertices in G, and so we have the required description of c_3. [Bi, p8-9]

Example: This can be seen from the characteristic polynomial λ^3 − 3λ − 2 of A above, where n = 3:
1) there is no λ^2 term, making c_1 = 0.
2) c_2 = −3. The number of edges of G is 3.
3) c_3 = −2. The number of triangles in G is 1.

The characteristic polynomial is enormously important to spectral graph theory, because it is an algebraic construction that contains graphical information. It will be explored more in Chapter 2. The roots of a characteristic polynomial are called the eigenvalues. Setting the characteristic polynomial λ^3 − 3λ − 2 from Example 1 equal to zero and solving gives us the eigenvalues {−1, −1, 2}. Eigenvalues are at the heart of understanding the properties and structure of a graph. We will also study them more in Chapter 2.

Property 1-5: The sum of the eigenvalues of a matrix equals its trace.

Example: Let us consider a 2 x 2 adjacency matrix
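Property 1-4 can be verified numerically in a few lines. The sketch below uses numpy's poly routine on the triangle of Figure 1-8; treating that graph as K_3 is the reading adopted above.

```python
import numpy as np

A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])          # adjacency matrix of the triangle in Figure 1-8

coeffs = np.poly(A)                # coefficients of det(lambda*I - A), leading term first
print(np.round(coeffs))            # [ 1.  0. -3. -2.]  ->  lambda^3 - 3*lambda - 2

edges = A.sum() // 2
triangles = round(np.trace(np.linalg.matrix_power(A, 3)) / 6)
print(np.isclose(coeffs[1], 0))               # c1 = 0
print(np.isclose(-coeffs[2], edges))          # -c2 = number of edges
print(np.isclose(-coeffs[3], 2 * triangles))  # -c3 = twice the number of triangles
```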

    A = [ a  b ]
        [ c  d ].

det(A − λI) = λ^2 − (a + d)λ + (ad − bc). Applying the quadratic formula and setting b = c, we get

    λ_1 = [ (a + d) + sqrt((a − d)^2 + 4b^2) ] / 2   and   λ_2 = [ (a + d) − sqrt((a − d)^2 + 4b^2) ] / 2.

Adding the eigenvalues gives us λ_1 + λ_2 = (a + d + a + d)/2 = a + d = the sum of the diagonal, or tr(A). Similar computations can be applied to any adjacency matrix, regardless of the dimensions.

The algebraic multiplicity of an eigenvalue is the number of times that the value occurs as a root of the characteristic polynomial. In Example 1, the eigenvalue −1 has a multiplicity of 2 and the eigenvalue 2 has a multiplicity of 1. The eigenvectors that correspond to an eigenvalue λ are the nonzero vectors v satisfying Av = λv. The geometric multiplicity is the dimension of the eigenspace, or the subspace spanned by all of the eigenvectors.

Property 1-6: If a matrix is real symmetric, then each eigenvalue of the graph relating to that matrix is real.

Proof: The proof follows from the Spectral Theorem from Linear Algebra: Let A be a real, symmetric matrix. Then there exists an orthogonal matrix Q such that A = QΛQ^(−1) = QΛQ^T, where Λ is a real diagonal matrix. The eigenvalues of A appear on the diagonal of Λ, while the columns of Q are the corresponding orthonormal eigenvectors. The entries of an adjacency matrix are real and the adjacency matrix is symmetric. Therefore, all of the eigenvalues of an adjacency matrix are real. [Ol, p49]

Property 1-7: The geometric and algebraic multiplicities of each eigenvalue of a real symmetric matrix are equal.

Property 1-8: The eigenvectors that correspond to distinct eigenvalues are orthogonal (mutually perpendicular).

Note: When u and v are two orthogonal eigenvectors of A associated with two distinct eigenvalues λ and μ in that order, then the unit vectors u/||u|| and v/||v|| are orthonormal eigenvectors associated with λ and μ respectively.

Property 1-9: If a graph is connected, the largest eigenvalue has multiplicity 1.

Let's check these properties with Example 1 from above (graph G of Figure 1-8, with adjacency matrix A as before). The eigenvalues are {−1, −1, 2}. All are real, demonstrating Property 1-6. The algebraic multiplicity of −1 is two, and the algebraic multiplicity of 2 is one. Every eigenvector that corresponds to the eigenvalue −1 is a linear combination of the vectors (−1, 1, 0) and (−1, 0, 1). We can easily show that the two vectors are independent. This means the dimension of the subspace they span is 2. Thus the geometric multiplicity is also 2.

The eigenvector that corresponds to the eigenvalue 2 is (1, 1, 1). It spans a subspace of dimension one, giving geometric multiplicity of one. Thus, Property 1-7 holds.

To see if Property 1-8 holds, we need to calculate whether or not the vectors corresponding to distinct eigenvalues are orthogonal. Let v_1 = (−1, 1, 0), v_2 = (−1, 0, 1), and v_3 = (1, 1, 1). Then ⟨v_1, v_3⟩ = −1(1) + 1(1) + 0(1) = 0 and ⟨v_2, v_3⟩ = −1(1) + 0(1) + 1(1) = 0. Therefore, the vectors corresponding to the distinct eigenvalues are orthogonal. Using the Gram-Schmidt process we can get an orthonormal basis u_1, u_2 spanning the same subspace spanned by v_1 and v_2. If we let u_3 = v_3/||v_3||, then we see that we have an orthonormal set.

Property 1-9 applies because the largest eigenvalue is 2, and it has a multiplicity of one.

CHAPTER 2: SPECTRAL GRAPH THEORY

SECTION 2.1: INTRODUCTION

Spectral graph theory is a study of the relationship between the topological properties of a graph and the spectral (algebraic) properties of the matrices associated with the graph. The most common matrix that is studied within spectral graph theory is the adjacency matrix. L. Collatz and U. Sinogowitz first began the exploration of this topic in 1957 with their paper, Spektren Endlicher Grafen [Sp]. Originally, spectral graph theory analyzed the adjacency matrix of a graph, especially its eigenvalues. For the last 25 years, many developments have been leaning more toward the geometric aspects, such as random walks and rapidly mixing Markov chains [Chu].

A central goal of graph theory is to deduce the main properties and the structure of a graph from its invariants. The eigenvalues are strongly connected to almost all key invariants of a graph. They hold a wealth of information about graphs. This is what spectral graph theory concentrates on. Remember, our definition of a graph from Chapter 1 does not admit self-loops or multi-edges.

SECTION 2.2: THE SPECTRUM OF A GRAPH

The spectrum of a graph G is the set of eigenvalues of G, together with their algebraic multiplicities, or the number of times that they occur.

Property 2-1: A graph with n vertices has n eigenvalues.

Proof: This is a direct result of the Fundamental Theorem of Algebra. The characteristic polynomial of G with n vertices is a polynomial of degree n, and the Fundamental Theorem of Algebra states that every polynomial of degree n has exactly n roots, counting multiplicity, in the field of complex numbers.

If a graph has k distinct eigenvalues λ_1 > λ_2 > ... > λ_k with multiplicities m(λ_1), m(λ_2), ..., m(λ_k), then the spectrum of G is written

    Spec(G) = (  λ_1      λ_2     ...   λ_k    )
              ( m(λ_1)   m(λ_2)   ...  m(λ_k)  ),   where the sum of the m(λ_i) is n.  [Bi, p8]

Example:

[Figure 2-1: Graph G, the 4-cycle on the vertices a, b, c, d, and its adjacency matrix M]

        a b c d
    a   0 1 1 0
    b   1 0 0 1      = M
    c   1 0 0 1
    d   0 1 1 0

The characteristic polynomial is λ^4 − 4λ^2 with eigenvalues {−2, 0, 0, 2}. Our graph has three distinct eigenvalues: −2, 0, and 2; hence the spectrum of the graph G is given by

    Spec(G) = ( −2  0  2 )
              (  1  2  1 ).
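The spectrum, with multiplicities, can also be read off numerically. A short sketch, assuming the 4-cycle reconstruction of Figure 2-1 used above; the eigenvalues are rounded before counting so that repeated roots group together.

```python
import numpy as np
from collections import Counter

M = np.array([[0, 1, 1, 0],        # 4-cycle of Figure 2-1: edges ab, ac, db, dc
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]])

eigenvalues = np.linalg.eigvalsh(M)                      # symmetric matrix -> real eigenvalues
spectrum = Counter(float(x) for x in np.round(eigenvalues, 8))
print(dict(spectrum))                                    # {-2.0: 1, 0.0: 2, 2.0: 1}
```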

One consistent question in graph theory is: when is a graph characterized by its spectrum? [Do, p568] Properties that cannot be determined spectrally can be identified by comparing two nonisomorphic graphs with the same spectrum. Below are two graphs with the same spectrum.

[Figure 2-2: Graph G, the star on five vertices, and graph H, the 4-cycle together with an isolated vertex]

It is clear that the two graphs are structurally different, with different adjacency matrices, and yet they have the same spectrum, which is

    ( −2  0  2 )
    (  1  3  1 ).

We know that graphs with the same spectrum have the same number of triangles. The example above shows that both graphs have no triangles, but it also shows that graphs with the same spectrum do not necessarily have the same number of rectangles. While the first has no rectangle, the second one has one rectangle. Our example also shows that graphs with the same spectrum do not necessarily have the same degree sequence (the sequence formed by arranging the vertex degrees in non-increasing order). The degree sequence of graph G is <4, 1, 1, 1, 1> while that of H is <2, 2, 2, 2, 0>. Finally, our example also shows that connectivity cannot be determined by the spectrum. A graph is connected if for every pair of vertices u and v, there is a walk from u to v. Here the graph G is connected while graph H is not.

There is a correlation between the degree of the vertices of a graph and the largest eigenvalue, λ_1. When we have a k-regular graph (all vertices have degree k), then λ_1 = k.

When we have a complete graph (all vertices are adjacent to one another) with n vertices, λ_1 = n − 1. If G is a connected graph, then λ_1 is less than or equal to the largest degree of the graph. Also, λ_1 increases with graphs that contain vertices of higher degree. In addition, the degrees of the vertices adjacent to the vertex with the largest degree affect the value of λ_1.

Example: Look at graphs G and H below.

[Figure 2-3: Graph G, the complete graph on 5 vertices, and graph H, the path on 5 vertices]

Both have 5 vertices. The degree of each vertex in G is 4; in other words, G is a complete graph, so λ_1(G) = 4. The second graph, H, is a path; the degrees of the vertices of H are 1 and 2, much smaller than that of G: λ_1(H) ≈ 1.73.

Example: Compare graphs G and H below, and their respective λ_1's. In each graph, the vertex with the highest degree has degree 4, but the adjacent vertices in G have degree 1, while in H they have degree 2. This illustrates how the degrees of the vertices adjacent to the vertex with the largest degree affect the value of λ_1.

[Figure: Graphs G and H, each with maximum degree 4; their λ_1 values were lost in transcription.]

SECTION 2.3: BIPARTITE GRAPHS

A bipartite graph G is one whose vertex set V can be partitioned into two subsets U and W such that each edge of G has one endpoint in U and one in W. The pair U, W is called the bipartition of G, and U and W are called the bipartition subsets. Bipartite graphs are also characterized as follows:

Property 2-2: A graph is bipartite if and only if it contains no odd cycles.

Proof: Let v_1 v_2 ... v_k v_1 be a cycle in a bipartite graph G; let's also assume, without loss of generality, that v_1 ∈ U. Then, by definition of a bipartite graph, v_2 ∈ W, v_3 ∈ U, v_4 ∈ W, ..., v_k ∈ W, and k must be even. Therefore, the walk from v_1 to v_k is odd, and the cycle from v_1 to v_1 is even.

Conversely, let's assume G is connected and without odd cycles. The case of G disconnected would follow from this special case. Consider a fixed vertex v_1 ∈ V(G). Let V_i be the set of all vertices that are at distance i from v_1, that is V_i = { u ∈ V(G) : d(u, v_1) = i }, i = 0, 1, ..., t, where t is the length of the longest path in G. Clearly t is finite since G is connected, and V_0, V_1, ..., V_t provides a partition of the vertices of G. Let's observe that no two vertices in V_1 are adjacent, otherwise there would be a 3-cycle in G. Similarly, no two vertices in V_2 are adjacent, because we do not have a 3-cycle or a 5-cycle. In fact every edge of G is of the form uv, where u ∈ V_i and v ∈ V_(i+1) for some i = 0, 1, ..., t. Let U = V_0 ∪ V_2 ∪ V_4 ∪ ..., the union of all the even-subscripted sets, and W = V_1 ∪ V_3 ∪ V_5 ∪ ..., the union of all the odd-subscripted sets. This shows that G is bipartite.

Example: Graph G from Figure 2-1 is a bipartite graph. The vertex subsets are U = {a, d}, W = {b, c}. Its characteristic polynomial is λ^4 − 4λ^2 and its eigenvalues are {−2, 0, 0, 2}.

Example: Take also a look at the bipartite graph P_7, the path on 7 vertices. It has the spectrum:

    −sqrt(2 + sqrt 2), −sqrt 2, −sqrt(2 − sqrt 2), 0, +sqrt(2 − sqrt 2), +sqrt 2, +sqrt(2 + sqrt 2).

[Figure 2-4: The path P_7]

The eigenvalues of bipartite graphs have special properties.

Property 2-3: If G is a bipartite graph and λ is an eigenvalue, then −λ is also an eigenvalue.

Proof: Let G be a bipartite graph. Let U = {u_1, u_2, ..., u_n} and W = {w_1, w_2, ..., w_m}, where U and W are the partite sets of V(G). Then all edges are of the form u_i w_j where u_i ∈ U and w_j ∈ W. Also, there are no edges that go from u_i to u_j or from w_i to w_j for any i, j. This makes the adjacency matrix of G

    A = [ 0    B ]
        [ B^T  0 ],   where B is an n x m matrix.

Because λ is an eigenvalue, we know Av = λv, so

    [ 0    B ] [x]  =  λ [x]
    [ B^T  0 ] [y]       [y].

Using simple

matrix multiplication, we get By = λx. Multiplying both sides by negative one gives us B(−y) = −λx. The second equation that results from the matrix multiplication is B^T x = λy, and λy = (−λ)(−y). Therefore, B^T x = (−λ)(−y). This gives us the matrix equation

    [ 0    B ] [ x ]  =  −λ [ x ]
    [ B^T  0 ] [−y ]        [−y ],

giving us the eigenvalue −λ.

Corollary 2-3-1: The spectrum of a bipartite graph is symmetric around 0. [Do, p558] This logically follows from the theorem. The eigenvalues of a bipartite graph occur in pairs of additive inverses.

Recall from Chapter 1 the general form of a characteristic polynomial: λ^n + c_1 λ^(n−1) + c_2 λ^(n−2) + ... + c_n.

Property 2-4: If G is a bipartite graph then c_(2k−1) = 0 for all k ≥ 1.

Proof: Since G is bipartite, by Property 2-2, there are no odd cycles in G. Thus c_(2k−1) = 0 for all k ≥ 1.

Let's check this with another bipartite graph.

[Figure 2-5: Graph G, a bipartite graph on the vertices a, b, c, d, e]

Its adjacency matrix has characteristic polynomial λ^5 − 4λ^3 + 3λ and eigenvalues {−sqrt 3, −1, 0, 1, sqrt 3}. For every eigenvalue λ, −λ is also an eigenvalue, as
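Property 2-3 is easy to confirm numerically: for a bipartite graph the sorted spectrum is, up to sign, its own reversal. A minimal sketch follows, using the path on 5 vertices that the characteristic polynomial λ^5 − 4λ^3 + 3λ quoted above suggests; that identification is an assumption.

```python
import numpy as np

def path_adjacency(n):
    """Adjacency matrix of the path on n vertices (paths are bipartite)."""
    A = np.zeros((n, n), dtype=int)
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = 1
    return A

eig = np.linalg.eigvalsh(path_adjacency(5))
print(np.round(eig, 4))                          # [-1.7321 -1.  0.  1.  1.7321]
print(np.allclose(np.sort(eig), np.sort(-eig)))  # True: spectrum symmetric about 0
```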

stated in Property 2-3, and the eigenvalues are symmetric around zero. Looking at the characteristic polynomial, c_1 and c_3 are both zero, as stated in Property 2-4.

SECTION 2.4: WALKS AND THE SPECTRUM

Recall that a v_i–v_j walk in G is a finite sequence of adjacent vertices that begins at vertex v_i and ends at vertex v_j. Also recall that (A^k)_ij represents the number of walks of length k from vertex v_i to vertex v_j. This means, given vertices v_i and v_j with dist(v_i, v_j) = t in G, a graph with adjacency matrix A, we have (A^k)_ij = 0 for k < t, and (A^t)_ij ≠ 0.

A minimal polynomial of a graph G is the monic polynomial q(x) of smallest degree such that q(A) = 0. For example, if f(x) = x^2 (x − 5)^3 (x + 4)^4, the minimal polynomial for f(x) is x(x − 5)(x + 4).

Property 2-5: The degree of the minimal polynomial is larger than the diameter.

Proof: Since A is symmetric, the minimal polynomial is given by q(G; λ) = (λ − λ_1)(λ − λ_2)...(λ − λ_m), where the λ_k's are all distinct. If we have m distinct eigenvalues, then the degree of the minimal polynomial is m. Suppose m ≤ d, where d is the diameter of G, and suppose i and j are two vertices of G whose distance is equal to m. Then the (i, j) entry of A^m is positive while the (i, j) entry of A^k is 0 for all k < m. This means q(G; A) ≠ 0, which is a contradiction; thus m > d.

Example: Using the graph from Figure 2-1, we can see that the diameter of graph G is 2.

[Figure 2-1 (repeated): the 4-cycle G on the vertices a, b, c, d]

We found the characteristic polynomial is λ^4 − 4λ^2 = λ^2(λ^2 − 4). The minimal polynomial λ(λ^2 − 4) has degree 3, which is larger than the diameter of 2.

Property 2-6: If a graph G has diameter d and has m distinct eigenvalues, then m ≥ d + 1.

Proof: This follows from Property 2-5.

Example: Let's look at this with Example 1 from Chapter 1, below. We have already established that the diameter of this graph is 2, because the distance between any two vertices is either one or two.

[Figure 2-6: Graph G from Figure 1-1, on the vertices a, b, c, d]

The adjacency matrix for G is

        a b c d
    a   0 1 1 0
    b   1 0 1 1      = A,
    c   1 1 0 1
    d   0 1 1 0

giving us the characteristic polynomial λ^4 − 5λ^2 − 4λ with the 4 eigenvalues {−1.56, −1, 0, 2.56}. There are 4 distinct eigenvalues, giving us m = 4. The diameter is 2. Therefore, m ≥ d + 1.

A complete graph is one in which every pair of vertices is joined by an edge. A complete graph with n vertices is denoted by K_n.

Example: Two isomorphic K_4 graphs are shown below.

[Figure 2-7: Two drawings of K_4]
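The relation m ≥ d + 1 can also be tested by machine. A sketch follows, using the adjacency matrix reconstructed above for Figure 2-6; distinct eigenvalues are counted after rounding, and the diameter is found from powers of A exactly as in Section 2.4.

```python
import numpy as np

A = np.array([[0, 1, 1, 0],     # graph of Figure 2-6 (edges ab, ac, bc, bd, cd)
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]])

m = len(set(np.round(np.linalg.eigvalsh(A), 6)))   # number of distinct eigenvalues

# Diameter: smallest k such that every pair is joined by a walk of length at most k.
n = len(A)
reach = np.eye(n, dtype=int) + A
d = 1
while not (reach > 0).all():
    reach = reach @ (np.eye(n, dtype=int) + A)
    d += 1

print(m, d, m >= d + 1)    # 4 2 True
```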

Property 2-7: The complete graph is the only connected graph with exactly two distinct eigenvalues.

Proof: If a graph has exactly 2 distinct eigenvalues, then its diameter has to be 0 or 1. Remember that two distinct eigenvalues means we have a minimal polynomial of degree 2. Also, the degree must be greater than the diameter. If the diameter is 0, then the graph is K_1. Otherwise, the graph is K_p for p > 1. Furthermore, the degree of the characteristic polynomial of a complete graph is p (giving us p eigenvalues). Using matrix algebra one can show that the characteristic polynomial for K_p is (λ + 1)^(p−1) (λ − p + 1); then −1 is a root of the characteristic polynomial with multiplicity p − 1, and the other root, which occurs once, is p − 1 [Sc, Wednesday AM]. In the example below, the matrix of the complete graph yields a characteristic polynomial of degree four. Note that −1 occurs as a root 3 times, and p − 1 = 4 − 1 = 3 occurs once.

Example: The adjacency matrix from graph G in Figure 2-7 is

    A = [ 0 1 1 1 ]
        [ 1 0 1 1 ]
        [ 1 1 0 1 ]
        [ 1 1 1 0 ].

G is a complete graph, and its characteristic polynomial is λ^4 − 6λ^2 − 8λ − 3, with eigenvalues {−1, 3} where −1 is a triple root and 3 is a single root. The graph has 4 vertices, 4 roots, and exactly two distinct roots.

Property 2-8: The complete graph K_p is determined by its spectrum.

Proof: This comes from Properties 2-1 and 2-7. If there are exactly two eigenvalues in the spectrum, then we know that the graph is a complete graph, and we know how many vertices the graph has based on the multiplicities of the eigenvalues.

Example: Given

    Spec(G) = ( −1  2 )
              (  2  1 ),

we know that the graph is complete with three vertices, giving us the graph G in Figure 2-8.

[Figure 2-8: The complete graph on three vertices]

CHAPTER 3: THE LAPLACIAN

SECTION 3.1: INTRODUCTION

The Laplacian is an alternative to the adjacency matrix for describing the adjacent vertices of a graph. The Laplacian, L, of a graph is the square matrix whose rows and columns correspond to the vertices of the graph [Do, p57]. The main diagonal of the matrix records the degree of each vertex, while the other entries are as follows:

    L_ij = −1 if v_i and v_j are adjacent, and 0 otherwise.

The Laplacian can also be derived from D − A, where D is the diagonal matrix whose entries represent the degrees of the vertices, and A is the adjacency matrix.

Example:

[Figure 3-1: Graph G on the vertices a, b, c, d and its Laplacian L]

The Laplacian of a connected graph has eigenvalues 0 = λ_1 ≤ λ_2 ≤ ... ≤ λ_n. The algebraic connectivity is defined to be λ_2, the second smallest eigenvalue. The name is a result of its connection to the vertex connectivity and the edge connectivity of a graph. It is the most important information contained within the spectrum of a graph [Mo, p4] and will be discussed more in Chapter 4.

The oldest result about the Laplacian concerns the number of spanning trees of a graph [Do, p57]. The Matrix Tree Theorem is one of the most significant applications of the Laplacian, and is usually attributed to Kirchhoff. It is discussed in the next section.

A positive semidefinite matrix is one that is Hermitian and whose eigenvalues are all non-negative. A Hermitian matrix is one which equals its conjugate transpose. This is usually written A^H = (conjugate of A)^T = A.

Example: Let B be a 2 x 2 matrix with real diagonal entries whose off-diagonal entries are the conjugate pair 4i and −4i; then B^H = B, so B is Hermitian, but B is not symmetric. On the other hand, a matrix C with both off-diagonal entries equal to 4i is symmetric but not Hermitian. A real matrix, however, is symmetric if and only if it is Hermitian, because every real number is its own conjugate. A real symmetric matrix A satisfies a_ij = a_ji, and all of its eigenvalues are real.

The characteristic function of a subset N of X is the function that has a value of 1 at points of N, and 0 at points of X − N. In other words, it takes the value 1 for elements in the set, and 0 for elements not in the set.

Property 3-1: The smallest eigenvalue of L is 0.

Proof: This is a direct result of the Laplacian matrix being a positive semidefinite matrix. It will have n real Laplace eigenvalues: 0 = λ_1 ≤ λ_2 ≤ ... ≤ λ_n.

Property 3-2: The multiplicity of 0 as an eigenvalue of L is the number of connected components in the graph.

Proof: Let H be a connected component of graph G. Denote by f_H ∈ R^V the characteristic function of V(H). Then L(G) f_H = 0. Since the characteristic functions of different connected components are linearly independent, the multiplicity of the eigenvalue 0 is at least the number of connected components of G. Conversely, let f ∈ R^V be a function from the kernel of L(G). Then ⟨f, Lf⟩ = 0, implying that f is constant on each connected component of G. Therefore f is a linear combination of the characteristic functions of connected components of G. [Mo, p6]

Property 3-3: The algebraic connectivity is positive if and only if the graph is connected.

Proof: If λ_2 > 0 and λ_1 = 0, then G must be connected because, as mentioned above, the eigenvalues of the Laplacian are 0 = λ_1 ≤ λ_2 ≤ ... ≤ λ_n; this is a direct result of Property 3-2. Conversely, if G is connected, then zero is the smallest eigenvalue and has multiplicity one, so λ_2 must be greater than zero, and therefore positive.

When we have a k-regular graph, a graph whose vertices all have degree k, there is a linear relationship between the eigenvalues of the Laplacian and the eigenvalues of the adjacency matrix A. If θ_1 ≤ θ_2 ≤ ... ≤ θ_n are the eigenvalues of L (the Laplacian eigenvalues of G) and λ_1, λ_2, ..., λ_n are the eigenvalues of A (the adjacency matrix of G), then θ_i = k − λ_i. This is a result of the graph being a regular graph, giving us the relationship L = kI − A between the Laplacian and the adjacency matrix. No such relationship exists between the eigenvalues of the Laplacian and adjacency matrix of a non-regular graph.
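Properties 3-1 through 3-3 translate directly into a numerical check. The sketch below builds L = D − A for a small example graph of the writer's own choosing (two disjoint triangles, not one of the thesis figures) and counts near-zero eigenvalues to recover the number of components.

```python
import numpy as np

def laplacian(A):
    """L = D - A for an adjacency matrix A."""
    return np.diag(A.sum(axis=1)) - A

# Two triangles with no edges between them: 2 connected components.
A = np.zeros((6, 6), dtype=int)
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    A[u, v] = A[v, u] = 1

eig = np.linalg.eigvalsh(laplacian(A))
print(np.round(eig, 6))                       # smallest eigenvalue is 0
print(int(np.sum(np.isclose(eig, 0.0))))      # 2 = number of connected components
print(eig[1] > 1e-9)                          # False: disconnected, so lambda_2 = 0
```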

SECTION 3.2: SPANNING TREES

In order to discuss spanning trees, we must first cover a few definitions. A tree is a connected graph that has no cycles.

Example:

[Figure 3-2: A tree, and a graph that is not a tree]

A subgraph H of a graph G is a graph whose vertex and edge sets are subsets of V(G) and E(G), in that order. A few subgraphs of the tree above are:

[Figure 3-3: Several subgraphs of the tree in Figure 3-2]

A subgraph H is said to span a graph G if V(H) = V(G). A spanning tree of a graph is a spanning subgraph that is a tree. Given graph G below, graph H is a spanning tree of G.

[Figure 3-4: Graph G and a spanning tree H of G]

Before we go into the next few properties, we need to understand a cofactor of a matrix, which begins with the minor of a matrix. A minor M_ij of a matrix B is determined by removing row i and column j from B, and then calculating the determinant of the resulting matrix. The cofactor of a matrix is (−1)^(i+j) M_ij.

Example: For a 3 x 3 matrix B, the minor M_21 comes from deleting row 2 and column 1, and then calculating the determinant of the resulting 2 x 2 matrix. The corresponding cofactor is C_21 = (−1)^(2+1) M_21 = −M_21.

Also, given an n x n matrix B,

    det(B) = a_1j C_1j + a_2j C_2j + ... + a_nj C_nj = a_i1 C_i1 + a_i2 C_i2 + ... + a_in C_in,

the sum of the cofactors multiplied by the entries that generated them.

Property 3-4, The Matrix Tree Theorem: Given a graph G, its adjacency matrix A, and its degree matrix C, the number of nonidentical spanning trees of G is equal to the value of any cofactor of the matrix C − A.

Proof: Let D = C − A. The sum of the elements of any row i or any column i of A is the degree of vertex v_i. Therefore, the sum of any row i or column i of matrix D is zero. It is a result of matrix theory that all cofactors of D have the same value.

Let G be a disconnected graph of order p. Let G_1 be a component of G whose vertex set is {v_1, v_2, ..., v_n}. Let D' be the (p − 1) x (p − 1) matrix that results from deleting row p and column p from matrix D. Since the sum of the first n rows of D' is the zero vector with p − 1 entries, the rows of D' are linearly dependent, implying that det(D') = 0. Hence, one cofactor of D has value zero. Zero is also the number of spanning trees of G, since G is disconnected.

Now let G be a connected (p, q) graph where q ≥ p − 1. Let B denote the incidence matrix of G, and in each column of B replace one of the two nonzero entries by −1. Denote the resulting matrix by M = [m_ij]. We now show that the product of M and its transpose M^T is D. The (i, j) entry of MM^T is the sum over k = 1, ..., q of m_ik m_jk, which has the value deg v_i if i = j, the value −1 if v_i v_j ∈ E(G), and 0 otherwise. Therefore, MM^T = D.

Consider a spanning subgraph H of G containing p − 1 edges. Let M' be the (p − 1) x (p − 1) submatrix of M determined by the columns associated with the edges of H and by all rows of M with one exception, say row k. We now need to determine the determinant of M'. If H is not connected, then H has a component, H*, not containing v_k. The sum of the row vectors of M' corresponding to the vertices of H* is the zero vector with p − 1 entries. Hence, det M' = 0.

Now assume H is connected. H is then a spanning tree of G. Let u_1 ≠ v_k be an end-vertex of H, and e_1 the edge incident with it. Also, let u_2 ≠ v_k be an end-vertex of the tree

H − u_1, and e_2 the edge of H − u_1 incident with u_2. We continue this process until finally only v_k remains. A matrix M'' = [m''_ij] can now be obtained by a permutation of the rows and columns of M' such that m''_ii ≠ 0 if and only if u_i and e_i are incident. From the manner in which M'' was defined, any vertex u_i is incident only with edges e_j, where j ≥ i. This, however, implies that M'' is triangular, and since m''_ii = ±1 for all i, we conclude that det(M'') = ±1. However, the permutation of rows and columns of a matrix affects only the sign of its determinant, implying that det(M') = ±det(M'') = ±1.

Since every cofactor of D has the same value, we evaluate only the i-th principal cofactor, that is, the determinant of the matrix obtained by deleting from D row i and column i. The aforementioned cofactor equals det(M_i M_i^T), which implies that this number is the sum of the products of the corresponding major determinants of M_i and M_i^T. However, corresponding major determinants have the same value, and their product is 1 if the defining columns correspond to a spanning tree of G and is 0 otherwise. [Cha, p66]

A key result of the Matrix Tree Theorem is Cayley's Tree Formula.

Corollary 3-4-1, Cayley's Tree Formula: The number of different trees on n labeled vertices is n^(n−2).

Proof: Start with the matrix D − A of a complete graph; we get an n x n matrix M with

    m_ij = n − 1 if i = j, and −1 otherwise.

Calculating the determinant of a cofactor of M gives us n^(n−2). From the Matrix Tree Theorem this number is the number of spanning trees.
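Both the Matrix Tree Theorem and Cayley's formula can be verified in a few lines by deleting one row and column of D = C − A and taking a determinant. The sketch below uses example graphs of the writer's own (a 4-cycle and the complete graphs K_2 through K_7), not the thesis figures.

```python
import numpy as np

def spanning_tree_count(A):
    """Any cofactor of D = C - A; here: delete the last row and column."""
    D = np.diag(A.sum(axis=1)) - A
    return int(round(np.linalg.det(D[:-1, :-1])))

def complete_graph(n):
    return np.ones((n, n), dtype=int) - np.eye(n, dtype=int)

# 4-cycle: exactly 4 spanning trees (drop any one of its 4 edges).
C4 = np.array([[0, 1, 0, 1],
               [1, 0, 1, 0],
               [0, 1, 0, 1],
               [1, 0, 1, 0]])
print(spanning_tree_count(C4))                      # 4

# Cayley's formula: K_n has n^(n-2) spanning trees.
for n in range(2, 8):
    assert spanning_tree_count(complete_graph(n)) == n ** (n - 2)
print("Cayley's formula holds for n = 2, ..., 7")
```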

SECTION 3.3: AN ALTERNATIVE APPROACH

There is an alternative way of defining the Laplacian eigenvalues. We can define them in their normalized form. One advantage to this definition is that it is consistent with eigenvalues in spectral geometry and in stochastic processes [Chu]. It also allows results which were only known for regular graphs to be generalized to all graphs. We will use NL to represent the Laplacian calculated using this definition. In a graph where d_v represents the degree of vertex v, the Laplacian is defined to be the matrix

    NL(u, v) = 1                     if u = v and d_v ≠ 0,
               −1 / sqrt(d_u d_v)    if u and v are adjacent,
               0                     otherwise.

Example:

[Figure 3-5: Graph G on the vertices a, b, c, d and its normalized Laplacian NL]

When graph G is a k-regular graph, it is easy to see that NL = I − (1/k)A. The degree of each vertex is k, so sqrt(d_u d_v) = k. Every entry of A that has a 1 will become 1/k in (1/k)A, and when we subtract that from I, we get −1/k for all vertices that are adjacent. The non-adjacent

elements will be 0 for both I and (1/k)A. The diagonal entries of I are 1, and the diagonal entries of A (and therefore of (1/k)A) are 0, giving us 1 when we subtract I − (1/k)A.

Example:

[Figure 3-6: A regular graph G on the vertices a, b, c, d, with NL = I − (1/k)A]

Let S be a matrix whose rows represent the vertices of a graph G and whose columns represent the edges of G. Each column corresponding to an edge e = uv has an entry of 1/sqrt(d_u) in the row corresponding to vertex u and an entry of −1/sqrt(d_v) in the row corresponding to vertex v. All other entries in the column are zero.

Property 3-5: NL = S S^T.

Proof: This is a direct result of their definitions. When we multiply row u of S by column v of S^T, and there exists an edge uv, then we are multiplying 1/sqrt(d_u) by −1/sqrt(d_v), giving −1/sqrt(d_u d_v), the corresponding element in NL. If there is no edge uv, then the entries in S, S^T, and NL are zero.

Example:

[Figure 3-7: Graph G on the vertices a, b, c, d with edges ab, ac, bc, cd]
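The identity NL = S S^T, along with the equivalent description NL = I − D^(−1/2) A D^(−1/2), can be confirmed numerically. The sketch below uses the 4-vertex graph with edges ab, ac, bc, cd that the surrounding example appears to use; that reading of the figure is an assumption.

```python
import numpy as np

edges = [(0, 1), (0, 2), (1, 2), (2, 3)]          # ab, ac, bc, cd
n = 4
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1
deg = A.sum(axis=1)

Dinv_sqrt = np.diag(1.0 / np.sqrt(deg))
NL = np.eye(n) - Dinv_sqrt @ A @ Dinv_sqrt        # normalized Laplacian

# S: one column per edge, entries 1/sqrt(d_u) and -1/sqrt(d_v).
S = np.zeros((n, len(edges))) 
for k, (u, v) in enumerate(edges):
    S[u, k] = 1.0 / np.sqrt(deg[u])
    S[v, k] = -1.0 / np.sqrt(deg[v])

print(np.allclose(S @ S.T, NL))                   # True: Property 3-5
print(np.round(np.linalg.eigvalsh(NL), 4))        # smallest eigenvalue is 0 (Property 3-6)
```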

For this graph the degrees are d_a = 2, d_b = 2, d_c = 3, d_d = 1, so

          a           b           c           d
    a     1          −1/2        −1/sqrt 6    0
    b    −1/2         1          −1/sqrt 6    0        = NL,
    c    −1/sqrt 6   −1/sqrt 6    1          −1/sqrt 3
    d     0           0          −1/sqrt 3    1

and S is the 4 x 4 matrix whose columns correspond to the edges ab, ac, bc, cd, with S S^T = NL.

Property 3-6: 0 is an eigenvalue of NL.

Proof: Let g denote an arbitrary function which assigns to each vertex v of G a real value g(v). We can view g as a column vector. Let T be the diagonal matrix whose entries are the degrees of the vertices, let L be the regular Laplacian, and write g = T^(1/2) f. Letting ⟨x, y⟩ denote the standard inner product in R^n and u ~ v the sum over all unordered pairs {u, v} for which u and v are adjacent, we have

    ⟨g, NL g⟩ / ⟨g, g⟩ = ⟨g, T^(−1/2) L T^(−1/2) g⟩ / ⟨g, g⟩
                       = ⟨f, L f⟩ / ⟨T^(1/2) f, T^(1/2) f⟩
                       = [ sum over u ~ v of (f(u) − f(v))^2 ] / [ sum over v of f(v)^2 d_v ],   [Chu, p4]

which equals 0 when f(u) = f(v) for every edge uv; hence 0 is an eigenvalue of NL. This also shows us that all eigenvalues are non-negative. Therefore the spectrum of NL is the set of eigenvalues 0 = λ_0 ≤ λ_1 ≤ ... ≤ λ_(n−1).

When we use this normalized definition of the Laplacian for a graph with n vertices and no isolated vertices, a few properties unfold.

Property 3-7: The sum of the eigenvalues λ_i is at most n.

Proof: The sum of the λ_i equals tr(NL), the trace of NL. If the graph has no isolated vertices, then tr(NL) = n, because each element of the diagonal is 1 and we have an n x n matrix. If the graph has isolated vertices, then one or more elements of the diagonal will be zero, causing the sum of the diagonal to be less than n, and giving us tr(NL) < n.

Property 3-8: For n ≥ 2, λ_1 ≤ n/(n − 1) and λ_(n−1) ≥ n/(n − 1).

Proof: Assume that λ_1 > n/(n − 1); since all the remaining n − 2 nonzero eigenvalues in the spectrum are at least as large as λ_1, their sum exceeds n. This contradicts Property 3-7; thus λ_1 ≤ n/(n − 1). Similarly, assume that λ_(n−1) < n/(n − 1); since this eigenvalue is the largest in the spectrum, and λ_0 is established to be 0, the sum of the n eigenvalues is less than n. This again contradicts Property 3-7 (with no isolated vertices the sum must equal n), and therefore λ_(n−1) ≥ n/(n − 1).

Let's look at Figure 3-8, a very simple example, and apply these properties.

[Figure 3-8: Graph G, the path a–b–c on three vertices]

         a           b          c
    a    1          −1/sqrt 2   0
    b   −1/sqrt 2    1         −1/sqrt 2     = NL
    c    0          −1/sqrt 2   1

The characteristic polynomial is λ^3 − 3λ^2 + 2λ, yielding eigenvalues {0, 1, 2}. Property 3-6 holds, since 0 is an eigenvalue. The sum of the eigenvalues is 3, so Property 3-7 holds, and λ_1 = 1 ≤ 3/2 while λ_2 = 2 ≥ 3/2, which demonstrates Property 3-8.

CHAPTER 4: APPLICATIONS

SECTION 4.1: CHEMICAL APPLICATIONS

Recall that the eigenvalues of L(G), the Laplacian matrix of graph G, are λ_1 ≥ λ_2 ≥ ... ≥ λ_n, where λ_n = 0 and λ_(n−1) > 0 if and only if G is connected. Also recall that a tree is a connected acyclic graph. A chemical tree is a tree where no vertex has a degree higher than 4. Chemical trees are molecular graphs representing constitutional isomers of alkanes. If there are n vertices, each chemical tree represents a particular isomer of C_n H_(2n+2). The first four are methane, ethane, propane, and butane. After that, the alkanes are named based on Greek numbers. For example, C_5 H_12 is pentane. Compounds whose carbons are all linked in a row, like the two below, are called straight-chain alkanes.

For example, if n = 1, we have the graph in Figure 4-1, which represents methane.

[Figure 4-1: Methane; the filled vertex is carbon, the open vertices are hydrogen]

Figure 4-2 shows us butane, which is C_4 H_10.

[Figure 4-2: Butane]

Compounds that have the same formula, but different structures, are called isomers. When C_4 H_10 is restructured as in Figure 4-3, we have isobutane, or 2-Methylpropane. Butane and 2-Methylpropane are isomers.

[Figure 4-3: Isobutane, or 2-Methylpropane]

Compounds with four carbons have 2 isomers, while those with five carbons have 3 isomers. The growth, however, is not linear. The chart below compares the number of carbons with the number of isomers.

    Formula      Number of Isomers
    C6H14        5
    C7H16        9
    C8H18        18
    C9H20        35
    C10H22       75
    C15H32       4,347
    C20H42       366,319
    [Mc, p76]

When a carbon has four carbons bonded to it, we have a quaternary carbon. An example is below in Figure 4-4, which is called 2,2-dimethylpropane. It is isomeric to pentane.

[Figure 4-4: 2,2-Dimethylpropane]

For simplicity's sake, we will just draw the carbon atoms from this point on, with the understanding that there are enough hydrogen atoms attached to each carbon to give that carbon atom a degree of 4.

A study was done on the eigenvalues of molecular graphs, and in particular on λ_1, the largest eigenvalue of a graph. When the isomeric alkanes are ordered according to their λ_1 values, a regularity is observed [Gu, p48]. Let Δ denote the maximum degree of a graph. The chemical trees that pertain to the 18 isomeric octanes C_8 H_18 follow a pattern with respect to their largest eigenvalue, λ_1. The isomer with the smallest λ_1 value (3.8478) is the straight-chain octane in Figure 4-5, which has Δ = 2.

[Figure 4-5: The straight-chain octane]

The next 10 isomers have various extents of branching, but none possesses a quaternary carbon atom. All of them have Δ = 3, and their λ_1's are greater than that of the straight-chain graph in Figure 4-5, where Δ = 2, and less than those of the following seven, which have Δ = 4. They are shown below in Figure 4-6.

[Figure 4-6: The octane isomers with Δ = 3]

The 12th through the 18th octanes contain a quaternary carbon atom, they all have Δ = 4, and they have the largest λ_1 values. The one with the largest λ_1 is the last tree shown below in Figure 4-7.

[Figure 4-7: The seven octane isomers with Δ = 4]

This same regularity occurs with isomeric alkanes with n carbon atoms, discussed above. The normal alkane with Δ = 2 has the smallest λ_1. All alkanes with Δ = 3 have λ_1 greater than the alkanes with Δ = 2, and smaller than any isomer with Δ = 4. We can therefore draw the conclusion that Δ, which tells us whether or not there is a quaternary

carbon atom, is the main molecular structure descriptor affecting the value of λ_1, the largest Laplacian eigenvalue of an alkane. It has been discovered that λ_1 can be bounded by

    Δ + 1 < λ_1 < Δ + 1 + 2 sqrt(Δ − 1).

Also, by using a linear combination of the lower and upper bounds, λ_1 can be estimated by

    λ_1 ≈ Δ + 1 + γ sqrt(Δ − 1),

where γ depends on both n and Δ. For alkanes, the appropriate value of γ has been determined through numerical testing. [Gu, p4]

It is possible to establish the alkane isomers with Δ = 3 or Δ = 4 that have the minimal λ_1. Given P_n below, T_n^min is the tree that establishes the minimal λ_1 for Δ = 3, and Q_n^min is the tree that establishes the minimal λ_1 for Δ = 4.

[Figure 4-8: The path P_n, and the trees T_n^min and Q_n^min]

The structures of the trees that represent the maximal λ_1 are more complex. The trees T_n^max and Q_n^max coincide with the chemical trees that have the same Δ and n, having maximal λ_1 and minimal W, where W represents the Wiener topological index of alkanes and, for a tree on n vertices, conforms to the formula

    W = n * (sum of 1/λ_i over the nonzero Laplacian eigenvalues).

The exact characterizations of these trees are complex, and will not be covered here.
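The Δ-ordering described above can be reproduced for small alkanes. The sketch below builds three eight-carbon skeletons (the straight chain, a Δ = 3 isomer, and a Δ = 4 isomer; the particular isomer choices are the writer's, not the trees drawn in the thesis figures) and compares their largest Laplacian eigenvalues.

```python
import numpy as np

def laplacian_from_edges(n, edges):
    A = np.zeros((n, n))
    for u, v in edges:
        A[u, v] = A[v, u] = 1
    return np.diag(A.sum(axis=1)) - A

def largest_laplacian_eigenvalue(n, edges):
    return float(np.linalg.eigvalsh(laplacian_from_edges(n, edges))[-1])

octane     = [(i, i + 1) for i in range(7)]                              # straight chain, Delta = 2
methyl     = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (2, 7)]    # 3-methylheptane, Delta = 3
quaternary = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (2, 6), (2, 7)]    # 3,3-dimethylhexane, Delta = 4

for name, tree in [("Delta = 2", octane), ("Delta = 3", methyl), ("Delta = 4", quaternary)]:
    print(name, round(largest_laplacian_eigenvalue(8, tree), 4))
# The straight chain prints 3.8478, the value quoted for Figure 4-5, and the
# eigenvalues increase with Delta, as the text describes.
```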

SECTION 4.2: PROTEIN STRUCTURES

The three-dimensional structure of proteins is the key to understanding their function and evolution [Vi]. There are an enormous number of proteins in nature, but a limited number of three-dimensional structures that represent them. A protein is formed by the sequential joining of amino acids, end-to-end, to form a long chain-like molecule, or polymer. These polymers are referred to as polypeptides. There are four major protein classes, shown below in Figure 4-9 [Vi]. The cylinders represent helices (plural for helix) and the arrows represent strands. A helix is a spiral molecule formed from benzene rings. Two of the current challenges are identifying the fold adopted by the polypeptide chain, and identifying similarities in protein structures.

[Figure 4-9: The four major protein classes]

Analysis of stable folded three-dimensional structures provides insights into protein structures for amino acid sequences and drug design studies. The geometry of a protein structure is the composition of the protein backbone and side-chains. Protein structures can have the same gross shape, but have different geometries. Graphs have helped represent the topology of protein structures, no matter how complex. The main problem is to define the vertices and edges. Below are a few simple examples of proteins with their graphs below them.

[Figure 4-10: The β-hairpin, Greek key, and βαβ motifs, with their graphs]

Properties of graphs and their spectra give information about protein structure, depending on how the vertices and edges are defined. The basic unit of a protein is its amino acid residue. To study cluster identification, the amino acids represent the vertices and the three-dimensional connectivity between them is represented by the edges. To study fold and pattern identification and the folding rules of proteins, α-helices and β-strands are used for vertices and spatially close structures are used for edges. To identify proteins with similar folds, the backbones are the vertices and the spatial neighbors within a certain radius are the edges.

Mathematical graphs are used to represent β structures, which is much more advantageous than drawing three-dimensional figures. The vertices represent the single β-strands and the two edge sets represent the sequential and hydrogen-bond connections. Connected graphs are used to represent α-helical structures. The vertices represent secondary structures and the edges represent contacts between helices. The main reason for setting the structures up this way was to gain information about the folding process of protein structures and to better understand the organization and patterns within these structures. It is also used to compare protein structures.

The protein connectivity is determined by identifying the main-chain atoms, which are found due to their closeness within a prescribed distance. This comes from identifying clusters. Two protein graphs can be compared to see if they have common features, and thus provide insight into structural overlaps of proteins. One way is a tree-searching algorithm, which is a series of matrix permutations. Through it, subgraph isomorphisms are detected to determine the largest subgraph that is common to a pair of graphs. This can highlight areas of structural overlap and therefore show structural and functional commonalities not found through other methods. Unfortunately, this method requires a very high number of computations, but a few heuristic methods have been discovered to reduce the time and cost of the computations.

Structural biologists are continuously finding promising applications of graph theory, with optimism that this field of mathematics will continue to contribute substantially to the understanding of protein structure, folding stability, function and dynamics. Certainly, there is more to come.

SECTION 4.3: GRAPH COLORING

One of the classic problems in graph theory is vertex-coloring, which is the assignment of colors to the vertices of a graph in such a way that no two adjacent vertices have the same color. The object is to use as few colors as possible. A proper coloring of a graph also forms a natural partition of the vertex set of the graph. The chromatic number, χ(G), is the least number of colors required for such a partition. A graph G is l-critical if χ(G) = l and for all proper induced subgraphs Λ of G we have χ(Λ) < l. The spectrum of a graph gives us insight into the chromatic number. Before we can understand the property that provides an upper bound for χ(G), we must first establish Property 4-1.

Property 4-1: Given a graph G with χ(G) = l, there exists an induced subgraph Λ of G such that χ(Λ) = l and every vertex of Λ has degree at least l − 1 in Λ.

Proof: The set of all induced subgraphs of G is non-empty and contains some graph whose chromatic number is l (including G itself). The set also contains some subgraphs whose chromatic number is not l; an example would be a graph with one vertex. Let Λ be a subgraph such that χ(Λ) = l and which is minimal with respect to the number of vertices. Then Λ is l-critical. If v is any vertex of Λ, then Λ \ v is an induced subgraph of Λ and has a vertex-coloring with l − 1 colors. If the degree of v in Λ were less than l − 1, then we could extend this vertex-coloring to Λ, contradicting the fact that χ(Λ) = l. Thus the degree of v is at least l − 1.

Property 4-2: For any graph G, χ(G) ≤ 1 + λ_1, where λ_1 is the largest eigenvalue of the adjacency matrix of G.

Proof: From Property 4-1, there is an induced subgraph Λ of G such that χ(G) = χ(Λ) and d_min(Λ) ≥ χ(G) − 1, where d_min(Λ) is the least degree of the vertices of Λ. Thus we have

    χ(G) ≤ 1 + d_min(Λ) ≤ 1 + λ_1(Λ) ≤ 1 + λ_1(G).  [Bi, p55]

Of course, the absolute largest value of χ(G) is n, the number of vertices. If λ_1 < n − 1, then we will have a smaller maximum than n.

Property 4-3: The lower bound of the chromatic number is χ(G) ≥ 1 + λ_1 / (−λ_min).

Proof: The vertex set V(G) can be partitioned into χ(G) coloring classes. Thus, the adjacency matrix A of G can be partitioned into χ^2 submatrices, as was seen in Chapter 1. In this case, the diagonal submatrices A_ii (1 ≤ i ≤ χ) consist entirely of zeros, and so λ_1(A_ii) = 0. It is known that

    λ_1(A) + (t − 1) λ_min(A) ≤ sum from i = 1 to t of λ_max(A_ii).

Therefore, we have λ_1(A) + (χ(G) − 1) λ_min(A) ≤ 0. But, if G has at least one edge, then λ_min(A) < 0, and the stated bound follows. [Bi, p57]

A classical application of vertex coloring is the coloring of a map. To make the process as inexpensive as possible, we want to use as few colors as possible. In the graph, the countries are represented by vertices, with edges drawn between those countries that are adjacent. Determining the chromatic number of the graph gives us the fewest colors necessary to color the map.

Another application of graph coloring is a sorting problem, such as sorting fish in a pet store. Some fish can be in the same tank together, while other fish cannot. Say we have fish types A, B, C, D, E and F. They can be put into tanks according to the chart below.
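Both spectral bounds on χ(G) can be evaluated numerically and compared with the true chromatic number. A minimal sketch follows, using a brute-force coloring routine and the 5-cycle as the writer's own test graph.

```python
import numpy as np
from itertools import product

def chromatic_number(A):
    """Smallest k admitting a proper k-coloring (brute force; small graphs only)."""
    n = len(A)
    for k in range(1, n + 1):
        for coloring in product(range(k), repeat=n):
            if all(coloring[i] != coloring[j]
                   for i in range(n) for j in range(i + 1, n) if A[i, j]):
                return k
    return n

# 5-cycle: chromatic number 3.
C5 = np.zeros((5, 5), dtype=int)
for i in range(5):
    C5[i, (i + 1) % 5] = C5[(i + 1) % 5, i] = 1

eig = np.linalg.eigvalsh(C5)
lam_max, lam_min = eig[-1], eig[0]

print(chromatic_number(C5))          # 3
print(1 + lam_max)                   # upper bound of Property 4-2: 3.0
print(1 + lam_max / (-lam_min))      # lower bound of Property 4-3: about 2.24
```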


More information

Lecture 6 January 15, 2014

Lecture 6 January 15, 2014 Advanced Graph Algorithms Jan-Apr 2014 Lecture 6 January 15, 2014 Lecturer: Saket Sourah Scrie: Prafullkumar P Tale 1 Overview In the last lecture we defined simple tree decomposition and stated that for

More information

DS-GA 1002 Lecture notes 0 Fall Linear Algebra. These notes provide a review of basic concepts in linear algebra.

DS-GA 1002 Lecture notes 0 Fall Linear Algebra. These notes provide a review of basic concepts in linear algebra. DS-GA 1002 Lecture notes 0 Fall 2016 Linear Algebra These notes provide a review of basic concepts in linear algebra. 1 Vector spaces You are no doubt familiar with vectors in R 2 or R 3, i.e. [ ] 1.1

More information

A New Spectral Technique Using Normalized Adjacency Matrices for Graph Matching 1

A New Spectral Technique Using Normalized Adjacency Matrices for Graph Matching 1 CHAPTER-3 A New Spectral Technique Using Normalized Adjacency Matrices for Graph Matching Graph matching problem has found many applications in areas as diverse as chemical structure analysis, pattern

More information

(a) If A is a 3 by 4 matrix, what does this tell us about its nullspace? Solution: dim N(A) 1, since rank(a) 3. Ax =

(a) If A is a 3 by 4 matrix, what does this tell us about its nullspace? Solution: dim N(A) 1, since rank(a) 3. Ax = . (5 points) (a) If A is a 3 by 4 matrix, what does this tell us about its nullspace? dim N(A), since rank(a) 3. (b) If we also know that Ax = has no solution, what do we know about the rank of A? C(A)

More information

Spectral radius, symmetric and positive matrices

Spectral radius, symmetric and positive matrices Spectral radius, symmetric and positive matrices Zdeněk Dvořák April 28, 2016 1 Spectral radius Definition 1. The spectral radius of a square matrix A is ρ(a) = max{ λ : λ is an eigenvalue of A}. For an

More information

MATH 23a, FALL 2002 THEORETICAL LINEAR ALGEBRA AND MULTIVARIABLE CALCULUS Solutions to Final Exam (in-class portion) January 22, 2003

MATH 23a, FALL 2002 THEORETICAL LINEAR ALGEBRA AND MULTIVARIABLE CALCULUS Solutions to Final Exam (in-class portion) January 22, 2003 MATH 23a, FALL 2002 THEORETICAL LINEAR ALGEBRA AND MULTIVARIABLE CALCULUS Solutions to Final Exam (in-class portion) January 22, 2003 1. True or False (28 points, 2 each) T or F If V is a vector space

More information

MATH 240 Spring, Chapter 1: Linear Equations and Matrices

MATH 240 Spring, Chapter 1: Linear Equations and Matrices MATH 240 Spring, 2006 Chapter Summaries for Kolman / Hill, Elementary Linear Algebra, 8th Ed. Sections 1.1 1.6, 2.1 2.2, 3.2 3.8, 4.3 4.5, 5.1 5.3, 5.5, 6.1 6.5, 7.1 7.2, 7.4 DEFINITIONS Chapter 1: Linear

More information

Basic graph theory 18.S995 - L26.

Basic graph theory 18.S995 - L26. Basic graph theory 18.S995 - L26 dunkel@math.mit.edu http://java.dzone.com/articles/algorithm-week-graphs-and no cycles Isomorphic graphs f(a) = 1 f(b) = 6 f(c) = 8 f(d) = 3 f(g) = 5 f(h) = 2 f(i) = 4

More information

1 Counting spanning trees: A determinantal formula

1 Counting spanning trees: A determinantal formula Math 374 Matrix Tree Theorem Counting spanning trees: A determinantal formula Recall that a spanning tree of a graph G is a subgraph T so that T is a tree and V (G) = V (T ) Question How many distinct

More information

Eigenvectors Via Graph Theory

Eigenvectors Via Graph Theory Eigenvectors Via Graph Theory Jennifer Harris Advisor: Dr. David Garth October 3, 2009 Introduction There is no problem in all mathematics that cannot be solved by direct counting. -Ernst Mach The goal

More information

Conceptual Questions for Review

Conceptual Questions for Review Conceptual Questions for Review Chapter 1 1.1 Which vectors are linear combinations of v = (3, 1) and w = (4, 3)? 1.2 Compare the dot product of v = (3, 1) and w = (4, 3) to the product of their lengths.

More information

MATHEMATICS 217 NOTES

MATHEMATICS 217 NOTES MATHEMATICS 27 NOTES PART I THE JORDAN CANONICAL FORM The characteristic polynomial of an n n matrix A is the polynomial χ A (λ) = det(λi A), a monic polynomial of degree n; a monic polynomial in the variable

More information

Linear Algebra Review

Linear Algebra Review Chapter 1 Linear Algebra Review It is assumed that you have had a course in linear algebra, and are familiar with matrix multiplication, eigenvectors, etc. I will review some of these terms here, but quite

More information

1Number ONLINE PAGE PROOFS. systems: real and complex. 1.1 Kick off with CAS

1Number ONLINE PAGE PROOFS. systems: real and complex. 1.1 Kick off with CAS 1Numer systems: real and complex 1.1 Kick off with CAS 1. Review of set notation 1.3 Properties of surds 1. The set of complex numers 1.5 Multiplication and division of complex numers 1.6 Representing

More information

Recall the convention that, for us, all vectors are column vectors.

Recall the convention that, for us, all vectors are column vectors. Some linear algebra Recall the convention that, for us, all vectors are column vectors. 1. Symmetric matrices Let A be a real matrix. Recall that a complex number λ is an eigenvalue of A if there exists

More information

Math113: Linear Algebra. Beifang Chen

Math113: Linear Algebra. Beifang Chen Math3: Linear Algebra Beifang Chen Spring 26 Contents Systems of Linear Equations 3 Systems of Linear Equations 3 Linear Systems 3 2 Geometric Interpretation 3 3 Matrices of Linear Systems 4 4 Elementary

More information

GRE Subject test preparation Spring 2016 Topic: Abstract Algebra, Linear Algebra, Number Theory.

GRE Subject test preparation Spring 2016 Topic: Abstract Algebra, Linear Algebra, Number Theory. GRE Subject test preparation Spring 2016 Topic: Abstract Algebra, Linear Algebra, Number Theory. Linear Algebra Standard matrix manipulation to compute the kernel, intersection of subspaces, column spaces,

More information

Elementary linear algebra

Elementary linear algebra Chapter 1 Elementary linear algebra 1.1 Vector spaces Vector spaces owe their importance to the fact that so many models arising in the solutions of specific problems turn out to be vector spaces. The

More information

Math Camp Lecture 4: Linear Algebra. Xiao Yu Wang. Aug 2010 MIT. Xiao Yu Wang (MIT) Math Camp /10 1 / 88

Math Camp Lecture 4: Linear Algebra. Xiao Yu Wang. Aug 2010 MIT. Xiao Yu Wang (MIT) Math Camp /10 1 / 88 Math Camp 2010 Lecture 4: Linear Algebra Xiao Yu Wang MIT Aug 2010 Xiao Yu Wang (MIT) Math Camp 2010 08/10 1 / 88 Linear Algebra Game Plan Vector Spaces Linear Transformations and Matrices Determinant

More information

Topics in linear algebra

Topics in linear algebra Chapter 6 Topics in linear algebra 6.1 Change of basis I want to remind you of one of the basic ideas in linear algebra: change of basis. Let F be a field, V and W be finite dimensional vector spaces over

More information

Graph G = (V, E). V ={vertices}, E={edges}. V={a,b,c,d,e,f,g,h,k} E={(a,b),(a,g),( a,h),(a,k),(b,c),(b,k),...,(h,k)}

Graph G = (V, E). V ={vertices}, E={edges}. V={a,b,c,d,e,f,g,h,k} E={(a,b),(a,g),( a,h),(a,k),(b,c),(b,k),...,(h,k)} Graph Theory Graph G = (V, E). V ={vertices}, E={edges}. a b c h k d g f e V={a,b,c,d,e,f,g,h,k} E={(a,b),(a,g),( a,h),(a,k),(b,c),(b,k),...,(h,k)} E =16. Digraph D = (V, A). V ={vertices}, E={edges}.

More information

8. Diagonalization.

8. Diagonalization. 8. Diagonalization 8.1. Matrix Representations of Linear Transformations Matrix of A Linear Operator with Respect to A Basis We know that every linear transformation T: R n R m has an associated standard

More information

Graph fundamentals. Matrices associated with a graph

Graph fundamentals. Matrices associated with a graph Graph fundamentals Matrices associated with a graph Drawing a picture of a graph is one way to represent it. Another type of representation is via a matrix. Let G be a graph with V (G) ={v 1,v,...,v n

More information

Algebraic Methods in Combinatorics

Algebraic Methods in Combinatorics Algebraic Methods in Combinatorics Po-Shen Loh 27 June 2008 1 Warm-up 1. (A result of Bourbaki on finite geometries, from Răzvan) Let X be a finite set, and let F be a family of distinct proper subsets

More information

On Hadamard Diagonalizable Graphs

On Hadamard Diagonalizable Graphs On Hadamard Diagonalizable Graphs S. Barik, S. Fallat and S. Kirkland Department of Mathematics and Statistics University of Regina Regina, Saskatchewan, Canada S4S 0A2 Abstract Of interest here is a characterization

More information

A lower bound for the Laplacian eigenvalues of a graph proof of a conjecture by Guo

A lower bound for the Laplacian eigenvalues of a graph proof of a conjecture by Guo A lower bound for the Laplacian eigenvalues of a graph proof of a conjecture by Guo A. E. Brouwer & W. H. Haemers 2008-02-28 Abstract We show that if µ j is the j-th largest Laplacian eigenvalue, and d

More information

GEOMETRY FINAL CLAY SHONKWILER

GEOMETRY FINAL CLAY SHONKWILER GEOMETRY FINAL CLAY SHONKWILER 1 a: If M is non-orientale and p M, is M {p} orientale? Answer: No. Suppose M {p} is orientale, and let U α, x α e an atlas that gives an orientation on M {p}. Now, let V,

More information

MA 1B ANALYTIC - HOMEWORK SET 7 SOLUTIONS

MA 1B ANALYTIC - HOMEWORK SET 7 SOLUTIONS MA 1B ANALYTIC - HOMEWORK SET 7 SOLUTIONS 1. (7 pts)[apostol IV.8., 13, 14] (.) Let A be an n n matrix with characteristic polynomial f(λ). Prove (by induction) that the coefficient of λ n 1 in f(λ) is

More information

Strongly Regular Graphs, part 1

Strongly Regular Graphs, part 1 Spectral Graph Theory Lecture 23 Strongly Regular Graphs, part 1 Daniel A. Spielman November 18, 2009 23.1 Introduction In this and the next lecture, I will discuss strongly regular graphs. Strongly regular

More information

Boolean Inner-Product Spaces and Boolean Matrices

Boolean Inner-Product Spaces and Boolean Matrices Boolean Inner-Product Spaces and Boolean Matrices Stan Gudder Department of Mathematics, University of Denver, Denver CO 80208 Frédéric Latrémolière Department of Mathematics, University of Denver, Denver

More information

7. Symmetric Matrices and Quadratic Forms

7. Symmetric Matrices and Quadratic Forms Linear Algebra 7. Symmetric Matrices and Quadratic Forms CSIE NCU 1 7. Symmetric Matrices and Quadratic Forms 7.1 Diagonalization of symmetric matrices 2 7.2 Quadratic forms.. 9 7.4 The singular value

More information

Using Laplacian Eigenvalues and Eigenvectors in the Analysis of Frequency Assignment Problems

Using Laplacian Eigenvalues and Eigenvectors in the Analysis of Frequency Assignment Problems Using Laplacian Eigenvalues and Eigenvectors in the Analysis of Frequency Assignment Problems Jan van den Heuvel and Snežana Pejić Department of Mathematics London School of Economics Houghton Street,

More information

Recitation 8: Graphs and Adjacency Matrices

Recitation 8: Graphs and Adjacency Matrices Math 1b TA: Padraic Bartlett Recitation 8: Graphs and Adjacency Matrices Week 8 Caltech 2011 1 Random Question Suppose you take a large triangle XY Z, and divide it up with straight line segments into

More information

Diagonalisierung. Eigenwerte, Eigenvektoren, Mathematische Methoden der Physik I. Vorlesungsnotizen zu

Diagonalisierung. Eigenwerte, Eigenvektoren, Mathematische Methoden der Physik I. Vorlesungsnotizen zu Eigenwerte, Eigenvektoren, Diagonalisierung Vorlesungsnotizen zu Mathematische Methoden der Physik I J. Mark Heinzle Gravitational Physics, Faculty of Physics University of Vienna Version /6/29 2 version

More information

1.3.3 Basis sets and Gram-Schmidt Orthogonalization

1.3.3 Basis sets and Gram-Schmidt Orthogonalization .. Basis sets and Gram-Schmidt Orthogonalization Before we address the question of existence and uniqueness, we must estalish one more tool for working with ectors asis sets. Let R, with (..-) : We can

More information

Chapter 6: Orthogonality

Chapter 6: Orthogonality Chapter 6: Orthogonality (Last Updated: November 7, 7) These notes are derived primarily from Linear Algebra and its applications by David Lay (4ed). A few theorems have been moved around.. Inner products

More information

Linear Algebra II. Ulrike Tillmann. January 4, 2018

Linear Algebra II. Ulrike Tillmann. January 4, 2018 Linear Algebra II Ulrike Tillmann January 4, 208 This course is a continuation of Linear Algebra I and will foreshadow much of what will be discussed in more detail in the Linear Algebra course in Part

More information

THE NORMALIZED LAPLACIAN MATRIX AND

THE NORMALIZED LAPLACIAN MATRIX AND THE NORMALIZED LAPLACIAN MATRIX AND GENERAL RANDIĆ INDEX OF GRAPHS A Thesis Submitted to the Faculty of Graduate Studies and Research In Partial Fulfillment of the Requirements for the Degree of Doctor

More information

Differential Geometry of Surfaces

Differential Geometry of Surfaces Differential Geometry of urfaces Jordan mith and Carlo équin C Diision, UC Berkeley Introduction These are notes on differential geometry of surfaces ased on reading Greiner et al. n. d.. Differential

More information

33AH, WINTER 2018: STUDY GUIDE FOR FINAL EXAM

33AH, WINTER 2018: STUDY GUIDE FOR FINAL EXAM 33AH, WINTER 2018: STUDY GUIDE FOR FINAL EXAM (UPDATED MARCH 17, 2018) The final exam will be cumulative, with a bit more weight on more recent material. This outline covers the what we ve done since the

More information

Sample Solutions from the Student Solution Manual

Sample Solutions from the Student Solution Manual 1 Sample Solutions from the Student Solution Manual 1213 If all the entries are, then the matrix is certainly not invertile; if you multiply the matrix y anything, you get the matrix, not the identity

More information

Linear Algebra M1 - FIB. Contents: 5. Matrices, systems of linear equations and determinants 6. Vector space 7. Linear maps 8.

Linear Algebra M1 - FIB. Contents: 5. Matrices, systems of linear equations and determinants 6. Vector space 7. Linear maps 8. Linear Algebra M1 - FIB Contents: 5 Matrices, systems of linear equations and determinants 6 Vector space 7 Linear maps 8 Diagonalization Anna de Mier Montserrat Maureso Dept Matemàtica Aplicada II Translation:

More information

The spectra of super line multigraphs

The spectra of super line multigraphs The spectra of super line multigraphs Jay Bagga Department of Computer Science Ball State University Muncie, IN jbagga@bsuedu Robert B Ellis Department of Applied Mathematics Illinois Institute of Technology

More information

Math 443/543 Graph Theory Notes 5: Graphs as matrices, spectral graph theory, and PageRank

Math 443/543 Graph Theory Notes 5: Graphs as matrices, spectral graph theory, and PageRank Math 443/543 Graph Theory Notes 5: Graphs as matrices, spectral graph theory, and PageRank David Glickenstein November 3, 4 Representing graphs as matrices It will sometimes be useful to represent graphs

More information

Energy of Graphs. Sivaram K. Narayan Central Michigan University. Presented at CMU on October 10, 2015

Energy of Graphs. Sivaram K. Narayan Central Michigan University. Presented at CMU on October 10, 2015 Energy of Graphs Sivaram K. Narayan Central Michigan University Presented at CMU on October 10, 2015 1 / 32 Graphs We will consider simple graphs (no loops, no multiple edges). Let V = {v 1, v 2,..., v

More information

ORIE 6334 Spectral Graph Theory September 8, Lecture 6. In order to do the first proof, we need to use the following fact.

ORIE 6334 Spectral Graph Theory September 8, Lecture 6. In order to do the first proof, we need to use the following fact. ORIE 6334 Spectral Graph Theory September 8, 2016 Lecture 6 Lecturer: David P. Williamson Scribe: Faisal Alkaabneh 1 The Matrix-Tree Theorem In this lecture, we continue to see the usefulness of the graph

More information

4. Linear transformations as a vector space 17

4. Linear transformations as a vector space 17 4 Linear transformations as a vector space 17 d) 1 2 0 0 1 2 0 0 1 0 0 0 1 2 3 4 32 Let a linear transformation in R 2 be the reflection in the line = x 2 Find its matrix 33 For each linear transformation

More information

Linear Algebra. Workbook

Linear Algebra. Workbook Linear Algebra Workbook Paul Yiu Department of Mathematics Florida Atlantic University Last Update: November 21 Student: Fall 2011 Checklist Name: A B C D E F F G H I J 1 2 3 4 5 6 7 8 9 10 xxx xxx xxx

More information

IN this paper we study a discrete optimization problem. Constrained Shortest Link-Disjoint Paths Selection: A Network Programming Based Approach

IN this paper we study a discrete optimization problem. Constrained Shortest Link-Disjoint Paths Selection: A Network Programming Based Approach Constrained Shortest Link-Disjoint Paths Selection: A Network Programming Based Approach Ying Xiao, Student Memer, IEEE, Krishnaiyan Thulasiraman, Fellow, IEEE, and Guoliang Xue, Senior Memer, IEEE Astract

More information

Chapter 7. Canonical Forms. 7.1 Eigenvalues and Eigenvectors

Chapter 7. Canonical Forms. 7.1 Eigenvalues and Eigenvectors Chapter 7 Canonical Forms 7.1 Eigenvalues and Eigenvectors Definition 7.1.1. Let V be a vector space over the field F and let T be a linear operator on V. An eigenvalue of T is a scalar λ F such that there

More information

22m:033 Notes: 7.1 Diagonalization of Symmetric Matrices

22m:033 Notes: 7.1 Diagonalization of Symmetric Matrices m:33 Notes: 7. Diagonalization of Symmetric Matrices Dennis Roseman University of Iowa Iowa City, IA http://www.math.uiowa.edu/ roseman May 3, Symmetric matrices Definition. A symmetric matrix is a matrix

More information

Algebraic Methods in Combinatorics

Algebraic Methods in Combinatorics Algebraic Methods in Combinatorics Po-Shen Loh June 2009 1 Linear independence These problems both appeared in a course of Benny Sudakov at Princeton, but the links to Olympiad problems are due to Yufei

More information

k is a product of elementary matrices.

k is a product of elementary matrices. Mathematics, Spring Lecture (Wilson) Final Eam May, ANSWERS Problem (5 points) (a) There are three kinds of elementary row operations and associated elementary matrices. Describe what each kind of operation

More information

Topic 2 Quiz 2. choice C implies B and B implies C. correct-choice C implies B, but B does not imply C

Topic 2 Quiz 2. choice C implies B and B implies C. correct-choice C implies B, but B does not imply C Topic 1 Quiz 1 text A reduced row-echelon form of a 3 by 4 matrix can have how many leading one s? choice must have 3 choice may have 1, 2, or 3 correct-choice may have 0, 1, 2, or 3 choice may have 0,

More information

Markov Chains, Random Walks on Graphs, and the Laplacian

Markov Chains, Random Walks on Graphs, and the Laplacian Markov Chains, Random Walks on Graphs, and the Laplacian CMPSCI 791BB: Advanced ML Sridhar Mahadevan Random Walks! There is significant interest in the problem of random walks! Markov chain analysis! Computer

More information

MTH 2032 SemesterII

MTH 2032 SemesterII MTH 202 SemesterII 2010-11 Linear Algebra Worked Examples Dr. Tony Yee Department of Mathematics and Information Technology The Hong Kong Institute of Education December 28, 2011 ii Contents Table of Contents

More information

6 Inner Product Spaces

6 Inner Product Spaces Lectures 16,17,18 6 Inner Product Spaces 6.1 Basic Definition Parallelogram law, the ability to measure angle between two vectors and in particular, the concept of perpendicularity make the euclidean space

More information

THE BALANCED DECOMPOSITION NUMBER AND VERTEX CONNECTIVITY

THE BALANCED DECOMPOSITION NUMBER AND VERTEX CONNECTIVITY THE BALANCED DECOMPOSITION NUMBER AND VERTEX CONNECTIVITY SHINYA FUJITA AND HENRY LIU Astract The alanced decomposition numer f(g) of a graph G was introduced y Fujita and Nakamigawa [Discr Appl Math,

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

LINEAR ALGEBRA REVIEW

LINEAR ALGEBRA REVIEW LINEAR ALGEBRA REVIEW JC Stuff you should know for the exam. 1. Basics on vector spaces (1) F n is the set of all n-tuples (a 1,... a n ) with a i F. It forms a VS with the operations of + and scalar multiplication

More information

Linear algebra II Homework #1 solutions A = This means that every eigenvector with eigenvalue λ = 1 must have the form

Linear algebra II Homework #1 solutions A = This means that every eigenvector with eigenvalue λ = 1 must have the form Linear algebra II Homework # solutions. Find the eigenvalues and the eigenvectors of the matrix 4 6 A =. 5 Since tra = 9 and deta = = 8, the characteristic polynomial is f(λ) = λ (tra)λ+deta = λ 9λ+8 =

More information

The least eigenvalue of the signless Laplacian of non-bipartite unicyclic graphs with k pendant vertices

The least eigenvalue of the signless Laplacian of non-bipartite unicyclic graphs with k pendant vertices Electronic Journal of Linear Algebra Volume 26 Volume 26 (2013) Article 22 2013 The least eigenvalue of the signless Laplacian of non-bipartite unicyclic graphs with k pendant vertices Ruifang Liu rfliu@zzu.edu.cn

More information

Math 443 Differential Geometry Spring Handout 3: Bilinear and Quadratic Forms This handout should be read just before Chapter 4 of the textbook.

Math 443 Differential Geometry Spring Handout 3: Bilinear and Quadratic Forms This handout should be read just before Chapter 4 of the textbook. Math 443 Differential Geometry Spring 2013 Handout 3: Bilinear and Quadratic Forms This handout should be read just before Chapter 4 of the textbook. Endomorphisms of a Vector Space This handout discusses

More information

Eigenvalues and Eigenvectors A =

Eigenvalues and Eigenvectors A = Eigenvalues and Eigenvectors Definition 0 Let A R n n be an n n real matrix A number λ R is a real eigenvalue of A if there exists a nonzero vector v R n such that A v = λ v The vector v is called an eigenvector

More information

Math 315: Linear Algebra Solutions to Assignment 7

Math 315: Linear Algebra Solutions to Assignment 7 Math 5: Linear Algebra s to Assignment 7 # Find the eigenvalues of the following matrices. (a.) 4 0 0 0 (b.) 0 0 9 5 4. (a.) The characteristic polynomial det(λi A) = (λ )(λ )(λ ), so the eigenvalues are

More information

MATH 5720: Unconstrained Optimization Hung Phan, UMass Lowell September 13, 2018

MATH 5720: Unconstrained Optimization Hung Phan, UMass Lowell September 13, 2018 MATH 57: Unconstrained Optimization Hung Phan, UMass Lowell September 13, 18 1 Global and Local Optima Let a function f : S R be defined on a set S R n Definition 1 (minimizers and maximizers) (i) x S

More information

Lecture 7: Positive Semidefinite Matrices

Lecture 7: Positive Semidefinite Matrices Lecture 7: Positive Semidefinite Matrices Rajat Mittal IIT Kanpur The main aim of this lecture note is to prepare your background for semidefinite programming. We have already seen some linear algebra.

More information

Laplacians of Graphs, Spectra and Laplacian polynomials

Laplacians of Graphs, Spectra and Laplacian polynomials Laplacians of Graphs, Spectra and Laplacian polynomials Lector: Alexander Mednykh Sobolev Institute of Mathematics Novosibirsk State University Winter School in Harmonic Functions on Graphs and Combinatorial

More information

Linear Algebra in Actuarial Science: Slides to the lecture

Linear Algebra in Actuarial Science: Slides to the lecture Linear Algebra in Actuarial Science: Slides to the lecture Fall Semester 2010/2011 Linear Algebra is a Tool-Box Linear Equation Systems Discretization of differential equations: solving linear equations

More information

HOMEWORK PROBLEMS FROM STRANG S LINEAR ALGEBRA AND ITS APPLICATIONS (4TH EDITION)

HOMEWORK PROBLEMS FROM STRANG S LINEAR ALGEBRA AND ITS APPLICATIONS (4TH EDITION) HOMEWORK PROBLEMS FROM STRANG S LINEAR ALGEBRA AND ITS APPLICATIONS (4TH EDITION) PROFESSOR STEVEN MILLER: BROWN UNIVERSITY: SPRING 2007 1. CHAPTER 1: MATRICES AND GAUSSIAN ELIMINATION Page 9, # 3: Describe

More information