Contents, Lecture 4: Greedy graph algorithms

- Dijkstra's algorithm
- Prim's algorithm
- Kruskal's algorithm
- Union-find data structure with path compression

Jonas Skeppstedt (jonasskeppstedt.net), Lecture 4, 2018, 1 / 40

Shortest paths

[Figure: an example graph with nodes a through n and positive edge weights.]

What is the shortest path from a to n? To every other node? How can we find these paths efficiently? For navigation, the edge weights are positive distances (obviously). For some other graphs the weights can be a positive or negative cost. The problem is easier with positive weights.

Dijkstra's algorithm

Given a directed graph G(V, E), a weight function w : E → R, and a node s ∈ V, Dijkstra's algorithm computes the shortest paths from s to every other node. The sum of all edge weights on a path should be minimized. A weight can e.g. mean metric distance, cost, or travelling time. For this algorithm, we assume the weights are nonnegative numbers.

Dijkstra's algorithm overview

input: w(e), the weight of edge e = (u, v); we also write w(u, v)
output: d(v), the shortest path distance from s to v, for v ∈ V
output: pred(v), the predecessor of v on the shortest path from s to v
A set Q of nodes for which we have not yet found the shortest path
A set S of nodes for which we have already found the shortest path

procedure dijkstra(G, s)
    d(s) ← 0
    Q ← V − {s}
    S ← {s}
    while Q ≠ ∅
        select v which minimizes d(u) + w(e), where u ∈ S, v ∉ S, e = (u, v)
        d(v) ← d(u) + w(e)
        pred(v) ← u
        remove v from Q
        add v to S
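The loop above can be sketched directly in Python. The adjacency-list representation (a dict from node to (neighbor, weight) pairs) and the function name are illustrative assumptions, and the scan over all edges crossing (S, Q) is the plain form of the selection step, before the heap refinement discussed later in the lecture:

```python
def dijkstra(graph, s):
    """Shortest paths from s; graph: node -> list of (neighbor, weight)."""
    d = {s: 0}          # shortest distance found so far (nodes in S)
    pred = {}           # predecessor on the shortest path
    S = {s}             # nodes with finalized distances
    Q = set(graph) - S  # nodes not yet finalized
    while Q:
        # Select v minimizing d(u) + w(u, v) over edges crossing (S, Q).
        best = None
        for u in S:
            for v, w in graph[u]:
                if v in Q and (best is None or d[u] + w < best[0]):
                    best = (d[u] + w, u, v)
        if best is None:    # remaining nodes are unreachable from s
            break
        dist, u, v = best
        d[v], pred[v] = dist, u
        Q.remove(v)
        S.add(v)
    return d, pred

# Small directed example graph (an assumption for illustration):
g = {'a': [('b', 4), ('c', 2)], 'b': [('d', 5)],
     'c': [('b', 1), ('d', 8)], 'd': []}
d, pred = dijkstra(g, 'a')   # d == {'a': 0, 'c': 2, 'b': 3, 'd': 8}
```

Note that b is reached more cheaply via c (2 + 1 = 3) than directly (4), which is exactly the kind of choice the selection step makes.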

Shortest paths

[Figure: the example graph; a is the start node.]

Only b has a predecessor in S.
d(b) ← 4
pred(b) ← a
S ← {a, b}

Shortest paths

d(b) + w(b, d) = 4 + 2 = 6
d(b) + w(b, h) = 4 + 21 = 25
d minimizes d(u) + w(u, v)
d(d) ← 6
pred(d) ← b
S ← {a, b, d}

Shortest paths

d(b) + w(b, h) = 4 + 21 = 25
d(d) + w(d, c) = 6 + 8 = 14
d(d) + w(d, g) = 6 + 13 = 19
c minimizes d(u) + w(u, v)
d(c) ← 14
pred(c) ← d
S ← {a, b, c, d}

Shortest paths

d(b) + w(b, h) = 4 + 21 = 25
d(d) + w(d, g) = 6 + 13 = 19
d(c) + w(c, e) = 14 + 3 = 17
e minimizes d(u) + w(u, v)
d(e) ← 17
pred(e) ← c
S ← {a, b, c, d, e}

Shortest paths

d(b) + w(b, h) = 4 + 21 = 25
d(d) + w(d, g) = 6 + 13 = 19
d(e) + w(e, f) = 17 + 9 = 26
g minimizes d(u) + w(u, v)
d(g) ← 19
pred(g) ← d
S ← {a, b, c, d, e, g}

Shortest paths

d(b) + w(b, h) = 4 + 21 = 25
d(e) + w(e, f) = 17 + 9 = 26
d(g) + w(g, h) = 19 + 7 = 26
d(g) + w(g, j) = 19 + 3 = 22
j minimizes d(u) + w(u, v)
d(j) ← 22
pred(j) ← g
S ← {a, b, c, d, e, g, j}

Shortest paths

d(b) + w(b, h) = 4 + 21 = 25
d(e) + w(e, f) = 17 + 9 = 26
d(g) + w(g, h) = 19 + 7 = 26
d(j) + w(j, m) = 22 + 3 = 25
h and m minimize d(u) + w(u, v); we can take either of them.
d(h) ← 25
pred(h) ← b
S ← {a, b, c, d, e, g, h, j}

Shortest paths

d(e) + w(e, f) = 17 + 9 = 26
d(j) + w(j, m) = 22 + 3 = 25
d(h) + w(h, k) = 25 + 2 = 27
m minimizes d(u) + w(u, v)
d(m) ← 25
pred(m) ← j
S ← {a, b, c, d, e, g, h, j, m}

Shortest paths

d(e) + w(e, f) = 17 + 9 = 26
d(h) + w(h, k) = 25 + 2 = 27
d(m) + w(m, n) = 25 + 5 = 30
f minimizes d(u) + w(u, v)
d(f) ← 26
pred(f) ← e
S ← {a, b, c, d, e, f, g, h, j, m}

Shortest paths

d(h) + w(h, k) = 25 + 2 = 27
d(m) + w(m, n) = 25 + 5 = 30
d(f) + w(f, i) = 26 + 6 = 32
k minimizes d(u) + w(u, v)
d(k) ← 27
pred(k) ← h
S ← {a, …, h, j, k, m}

Shortest paths

d(m) + w(m, n) = 25 + 5 = 30
d(f) + w(f, i) = 26 + 6 = 32
n minimizes d(u) + w(u, v)
d(n) ← 30
pred(n) ← m
S ← {a, …, h, j, k, m, n}

Shortest paths

d(f) + w(f, i) = 26 + 6 = 32
Only i is possible.
d(i) ← 32
pred(i) ← f
S ← {a, …, k, m, n}

Shortest paths

d(i) + w(i, l) = 32 + 1 = 33
Only l is possible.
d(l) ← 33
pred(l) ← i
S ← {a, …, n}

Observations about Dijkstra's algorithm

We only add an edge when it leads to the node that is currently closest to the start node among the nodes not yet in S. To print the shortest path from s to any node v, simply start at v and follow the pred attributes back to s; this gives the path in reverse order.
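Following the pred attributes can be sketched with a small helper (a hypothetical function, assuming pred is stored as a dict):

```python
def shortest_path(pred, s, v):
    """Walk backwards from v via pred, then reverse to get the s -> v order."""
    path = [v]
    while path[-1] != s:
        path.append(pred[path[-1]])
    return path[::-1]
```

For example, with pred = {'b': 'a', 'd': 'b'}, shortest_path(pred, 'a', 'd') returns ['a', 'b', 'd'].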

Dijkstra's algorithm

Theorem. For each node v ∈ S, d(v) is the length of the shortest path from s to v.

Proof. We use induction with base case |S| = 1, which is true since S = {s} and d(s) = 0. Inductive hypothesis: assume the theorem is true for |S| − 1. Let v be the next node added to S, with pred(v) = u, so d(v) = d(u) + w(e) where e = (u, v). Assume for contradiction that there exists a shorter path from s to v; it must contain an edge (x, y) with x ∈ S and y ∉ S, followed by a subpath from y to v. Since the weights are nonnegative, the prefix of that path ending at y is no longer than the whole path, so d(y) < d(v); but this is impossible, since v was chosen and not y. This contradiction means no shorter path to v exists.

Recall

procedure dijkstra(G, s)
    d(s) ← 0
    Q ← V − {s}
    S ← {s}
    while Q ≠ ∅
        select v which minimizes d(u) + w(e), where u ∈ S, v ∉ S, e = (u, v)
        d(v) ← d(u) + w(e)
        pred(v) ← u
        remove v from Q
        add v to S

We use a heap priority queue for Q with d(v) as keys. For v ≠ s we initially set d(v) ← ∞ and then decrease it.

Running time of Dijkstra's algorithm

Assume n nodes and m edges.
Constructing Q: O(n) using heapify (but O(n log n) using n inserts). Heapify is called init_heap in the C code and pseudocode in the book.
O(n) iterations of the while loop.
Each selected node must check each neighbor not in S and possibly reduce its key: O(m log n) operations in total for reducing keys.
With all nodes reachable from s, we have m ≥ n − 1.
Therefore the running time is O(m log n).
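With a heap the loop looks like the sketch below. Python's heapq module has no decrease-key operation, so this sketch pushes a new entry and lazily skips stale ones instead of decreasing keys — a substitution for the decrease-key step described above, with the same O(m log n) bound for this sketch. The graph representation and names are illustrative assumptions:

```python
import heapq

def dijkstra_heap(graph, s):
    """Heap-based Dijkstra; graph: node -> list of (neighbor, weight)."""
    d = {s: 0}
    pred = {}
    done = set()
    pq = [(0, s)]               # (tentative distance, node)
    while pq:
        dist, u = heapq.heappop(pq)
        if u in done:            # stale entry: a shorter path was found
            continue
        done.add(u)
        for v, w in graph.get(u, []):
            nd = dist + w
            if v not in d or nd < d[v]:
                d[v], pred[v] = nd, u
                heapq.heappush(pq, (nd, v))
    return d, pred
```

On the same small example as before, g = {'a': [('b', 4), ('c', 2)], 'b': [('d', 5)], 'c': [('b', 1), ('d', 8)], 'd': []}, this returns d == {'a': 0, 'c': 2, 'b': 3, 'd': 8}.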

Constructing a heap from an array in linear time

The down-operation can be thought of as merging one element at position k with two heaps at positions 2k and 2k + 1, creating a new valid heap at position k. Initially we have n unordered elements, and of these each leaf is a valid one-element heap. Each down-operation is O(log n) and we perform n/2 of them, which naively gives a running time of O(n log n). But since most of these down-operations merge very small heaps, we will next see that the total is in fact linear. To show this, we start with a geometric series.

A geometric series

For |x| < 1 we have:

    ∑_{k=0}^∞ x^k = 1 / (1 − x)

and differentiating both sides then gives:

    ∑_{k=0}^∞ k x^{k−1} = 1 / (1 − x)^2.

Heapify

Let h be the height of a (sub)heap, so the down-operation is O(h). Consider n unordered elements in an array. There are at most ⌈n/2⌉ leaf elements, and at most ⌈n/2^{h+1}⌉ elements at height h. Therefore the running time of constructing a heap from n unordered elements is:

    ∑_{h=0}^{⌊log n⌋} ⌈n/2^{h+1}⌉ O(h) = O(n ∑_{h=0}^∞ h/2^h)
                                       = O(n · (1/2) / (1 − 1/2)^2)
                                       = O(2n)
                                       = O(n).

Heapify

procedure init_heap(h, b, n)
begin
    copy each element in b to the array of h
    size(h) ← n
    for (k ← ⌊n/2⌋; k ≥ 1; k ← k − 1) do
        down(h, k)
    return h
end
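A Python sketch of init_heap and the down-operation it relies on; the procedure names come from the slides, while the 1-based array layout and the details are an assumed minimal implementation:

```python
def down(a, k, n):
    """Sift a[k] down in the 1-based min-heap a[1..n] (a[0] unused)."""
    while 2 * k <= n:
        c = 2 * k                            # left child
        if c + 1 <= n and a[c + 1] < a[c]:   # right child is smaller
            c += 1
        if a[k] <= a[c]:                     # heap property holds: stop
            break
        a[k], a[c] = a[c], a[k]
        k = c

def init_heap(b):
    """Build a min-heap from the unordered elements in b in O(n) time."""
    a = [None] + list(b)     # shift to 1-based indexing
    n = len(b)
    for k in range(n // 2, 0, -1):
        down(a, k, n)
    return a
```

For example, init_heap([5, 3, 8, 1, 9, 2]) places the minimum, 1, at index 1 and satisfies a[k] ≤ a[2k] and a[k] ≤ a[2k+1] throughout.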

Change priority

procedure change_priority(h, x)
begin
    k ← position(x)
    up(h, k)
    down(h, k)
end

See heap.c.

The Minimum Spanning Tree Problem

[Figure: the same example graph, now undirected, with edge weights as costs.]

Assume the nodes are cities and a country wants to build an electrical network. The edge weights are the costs of connecting two cities. We want to find a subset of the edges so that all cities are connected, and which minimizes the cost. This problem was suggested to the Czech mathematician Otakar Borůvka during World War I, for Moravia (Mähren).

The minimum spanning tree problem

In 1926 Borůvka published the first paper on finding the minimum spanning tree; the name is an abbreviation of minimum-weight spanning tree. The problem has been regarded as the cradle of combinatorial optimization. Borůvka's algorithm has been rediscovered several times: by Choquet in 1938, by Florek, Łukasiewicz, Steinhaus, and Zubrzycki in 1951, and by Sollin in 1965. We will study two classic algorithms for this problem: Prim's algorithm and Kruskal's algorithm. One of the currently fastest MST algorithms, by Chazelle (2000), is based on Borůvka's algorithm.

MST

Consider a connected undirected graph G(V, E). If T ⊆ E and (V, T) is a tree, it is called a spanning tree of G(V, E). Given edge costs c(e), a spanning tree (V, T) is a minimum spanning tree, or MST, of G if the sum of its edge costs is minimized. Prim's algorithm is similar to Dijkstra's and grows one MST. Kruskal's algorithm instead creates a forest which eventually becomes one MST.

Minimum spanning tree: Prim's algorithm

[Figure: the example graph again.]

A root node s must first be selected; any node will do. How can we know which edge to add next? Is it possible to do it with a greedy algorithm? Compare with the Traveling Salesman Problem (JS, Section 6.6)! TSP searches for a tour which starts at one node, visits all nodes, and returns; the decision version asks whether there is such a tour of cost at most x.

Safe edges

We will next learn a rule on which Prim's and Kruskal's algorithms rely. It determines when it is safe to add a certain edge (u, v). A partition (S, V − S) of the nodes V is called a cut. An edge (u, v) crosses the cut if u ∈ S and v ∈ V − S. Let A ⊆ E be a subset of the edges of some minimum spanning tree of G; A does not necessarily form a connected graph. A is applicable to both Prim's and Kruskal's algorithms and represents the edges selected so far. An edge (u, v) is safe if A ∪ {(u, v)} is also a subset of the edges of some MST. So how can we determine whether an edge is safe?

Safe edges

Lemma. Assume A is a subset of the edges of some minimum spanning tree of G, (S, V − S) is any cut of V, and no edge in A crosses (S, V − S). Then every edge (u, v) of minimum weight with u ∈ S and v ∈ V − S is safe.

Proof. Assume T ⊆ E is a minimum spanning tree of G containing A. We have either (u, v) ∈ T, in which case we are done, or (u, v) ∉ T. Without loss of generality we can assume u ∈ S and v ∈ V − S. There is a path p in T which connects u and v, so T ∪ {(u, v)} creates a cycle with p. There is an edge (x, y) ∈ T on p which also crosses (S, V − S), and by assumption (x, y) ∉ A.

Safe edges

Proof (continued). Since T is a spanning tree, it has only one path from u to v. Removing (x, y) from T partitions V, and adding (u, v) reconnects the parts, creating a new spanning tree

    U = (T − {(x, y)}) ∪ {(u, v)}.

Since (u, v) has minimum weight among the edges crossing the cut, w(U) ≤ w(T), and since T is a minimum spanning tree, w(U) = w(T). Since A ∪ {(u, v)} ⊆ U, the edge (u, v) is safe for A.

Prim's algorithm overview

input: w(e), the weight of edge e = (u, v); we also write w(u, v)
input: a root node r ∈ V
output: minimum spanning tree T

procedure prim(G, r)
    T ← ∅
    Q ← V − {r}
    while Q ≠ ∅
        select a v which minimizes w(e), where u ∉ Q, v ∈ Q, e = (u, v)
        remove v from Q
        add (u, v) to T
    return T

We use a heap priority queue for Q, with d(v), the distance to any node in V − Q, as keys.
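The selection loop above can be sketched in Python as follows; the adjacency-list representation and names are illustrative assumptions, and this is the plain form that scans all crossing edges rather than the heap-based refinement:

```python
def prim(graph, r):
    """Grow one tree from root r, always adding the lightest crossing edge.
    graph: undirected adjacency list, node -> list of (neighbor, weight)."""
    in_tree = {r}
    T = []
    while len(in_tree) < len(graph):
        # Find the minimum-weight edge crossing (in_tree, rest).
        best = None
        for u in in_tree:
            for v, w in graph[u]:
                if v not in in_tree and (best is None or w < best[0]):
                    best = (w, u, v)
        if best is None:    # graph is not connected
            break
        w, u, v = best
        in_tree.add(v)
        T.append((u, v, w))
    return T
```

On a small square graph with weights a-b = 1, b-c = 2, c-d = 3, a-c = 4, b-d = 5, the result is the tree {a-b, b-c, c-d} of total weight 6.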

Running time of Prim's algorithm

Prim's algorithm has the same running time as Dijkstra's algorithm. Assume n nodes and m edges. Constructing Q: O(n) using heapify (but O(n log n) using n inserts). O(n) iterations of the while loop. Each selected node must check each neighbor still in Q and possibly reduce its key: O(m log n) operations for reducing keys. With all nodes reachable from r, we have m ≥ n − 1. Therefore the running time is O(m log n).

Kruskal's algorithm overview

input: w(e), the weight of edge e = (u, v); we also write w(u, v)
output: minimum spanning tree T

procedure kruskal(G)
    T ← ∅
    B ← E
    while B ≠ ∅
        select an edge e ∈ B with minimal weight
        if T ∪ {e} does not create a cycle then
            add e to T
        remove e from B
    return T

How can we detect cycles?
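A Python sketch of the loop above. Sorting once replaces the repeated minimum selection, and the cycle test uses a minimal version of the union-find structure covered on the following slides (here without path compression or union-by-size). The edge representation and names are assumptions for illustration:

```python
def kruskal(n, edges):
    """MST of a graph with nodes 0..n-1; edges: list of (w, u, v)."""
    parent = list(range(n))   # each node starts as its own component

    def find(x):              # representative of x's component
        while parent[x] != x:
            x = parent[x]
        return x

    T = []
    for w, u, v in sorted(edges):        # edges in order of weight
        ru, rv = find(u), find(v)
        if ru != rv:                     # different components: no cycle
            parent[ru] = rv              # union the two components
            T.append((u, v, w))
    return T
```

On edges (0,1) = 1, (1,2) = 2, (2,3) = 3, (0,2) = 4, (1,3) = 5, the two heaviest edges are rejected as cycle-creating, leaving a tree of total weight 6.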

The union-find data structure

Consider a set, such as the n nodes of a graph. A union-find data structure lets us: create an initial partitioning {p_0, p_1, …, p_{n−1}} with n sets consisting of one element each; merge two sets p_i and p_j; and check which set an element belongs to. The merge operation is called union; the check operation is called find. We can use this as follows. A set represents a connected subgraph and initially consists of one node. When we consider adding an edge (u, v) to the minimum spanning tree, we find the set p_u containing u and the set p_v containing v. We ignore (u, v) if find(u) = find(v); otherwise the two subgraphs are connected using union.

Union-find data structure

How should the sets p_i be named? It is only essential that two different sets have different names, so it is suitable to let the node v be the initial name of p_v. Thus no extra data type is needed; we simply add an attribute to the node. After a union operation on u and v, we set one of the nodes as the name of the merged set. Assume we use u as the name; then v needs a way to find u. For this we use the node attribute parent(v) = u. Code for find: if parent(v) == null then v else find(parent(v)).
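The scheme above, extended with path compression and union-by-size (the refinements whose analysis the next slide cites), can be sketched in Python; the class and attribute names are illustrative assumptions:

```python
class UnionFind:
    """Each set is named by a representative node; parent == None marks it."""

    def __init__(self, elements):
        self.parent = {v: None for v in elements}
        self.size = {v: 1 for v in elements}

    def find(self, v):
        root = v
        while self.parent[root] is not None:
            root = self.parent[root]
        while self.parent[v] is not None:   # path compression:
            self.parent[v], v = root, self.parent[v]   # point at the root
        return root

    def union(self, u, v):
        ru, rv = self.find(u), self.find(v)
        if ru == rv:
            return
        if self.size[ru] < self.size[rv]:   # union by size:
            ru, rv = rv, ru                 # attach smaller under larger
        self.parent[rv] = ru
        self.size[ru] += self.size[rv]
```

For example, after union('a', 'b') and union('c', 'd'), find('a') == find('b') but find('a') != find('c'); a further union('b', 'd') merges everything into one set.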

Efficiency of Union-Find

Refer to Section 3.5 of JS. Using both path compression and union-by-size (or union-by-rank), the time complexity of m find and n union operations is:

    Θ(m α(m, n))        if m ≥ n
    Θ(n + m α(m, n))    if m < n

Here α(m, n) ≤ 4 for all practical values of m and n.

Running time of Kruskal's algorithm

Assume n nodes and m edges, with m > n. Sorting the edges: O(m log m). Adding an edge (v, w) would create a cycle if find(v) = find(w). There are m edges, so we do at most 2m find operations. A tree has n − 1 edges, so we do n − 1 union operations. From the previous slide, the complexity of these union-find operations is Θ(m α(m, n)). We can conclude that sorting the edges is more costly than the union-find operations, so the running time of Kruskal's algorithm is O(m log m). We have m ≤ n², and therefore O(m log m) = O(m log n²) = O(2m log n) = O(m log n), i.e. the same as for Prim's algorithm.