Efficient Algorithms and Data Structures, Part I: Organizational Matters. Harald Räcke, Winter Term 2011/12 (WS 2011/12)


Part I: Organizational Matters

Efficient Algorithms and Data Structures
Harald Räcke, Fakultät für Informatik, TU München, Winter Term 2011/12

Module: IN2003
Name: Efficient Algorithms and Data Structures (Effiziente Algorithmen und Datenstrukturen)
ECTS: 8 credit points
Lectures: 4 SWS, Mon and Thu

Required knowledge:
- IN0001, IN0003 Introduction to Informatics 1/2 (Einführung in die Informatik 1/2)
- IN0007 Fundamentals of Algorithms and Data Structures (Grundlagen: Algorithmen und Datenstrukturen, GAD)
- IN0011 Basic Theoretic Informatics (Einführung in die Theoretische Informatik, THEO)
- IN0015 Discrete Structures (Diskrete Strukturen, DS)
- IN0018 Discrete Probability Theory (Diskrete Wahrscheinlichkeitstheorie, DWT)

The Lecturer
Harald Räcke. Office hours: by appointment.

Tutorials
Tutor: Chintan Shah. Office hours: Wed. Tutorial: Tue.

Assessment
In order to pass the module you need to
1. pass an exam, and
2. obtain at least 40% of the points on the assignment sheets.

Exam: The date will be announced shortly. No resources are allowed, apart from one hand-written sheet of paper (A4). Answers should be given in English, but German is also accepted.

Assignment Sheets: An assignment sheet is usually made available on Wednesday on the module webpage. Solutions have to be handed in during the following week, before the lecture on Thursday; you can hand them in by putting them into the designated folder. Solutions have to be given in English. Solutions will be discussed in the subsequent tutorial on Tuesday. The first sheet will be out on a Wednesday in October.

Contents
- Foundations: machine models, efficiency measures, asymptotic notation, recursion
- Higher Data Structures: search trees, hashing, priority queues, union/find data structures
- Cuts/Flows
- Matchings

Literature I
- Alfred V. Aho, John E. Hopcroft, Jeffrey D. Ullman: The Design and Analysis of Computer Algorithms. Addison-Wesley, Reading (MA), 1974.
- Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, Clifford Stein: Introduction to Algorithms. McGraw-Hill, 1990.
- Michael T. Goodrich, Roberto Tamassia: Algorithm Design: Foundations, Analysis, and Internet Examples. John Wiley & Sons, 2002.

Literature II
- Volker Heun: Grundlegende Algorithmen: Einführung in den Entwurf und die Analyse effizienter Algorithmen. 2. Auflage, Vieweg, 2003.
- Jon Kleinberg, Éva Tardos: Algorithm Design. Addison-Wesley, 2005.
- Donald E. Knuth: The Art of Computer Programming, Vol. 1: Fundamental Algorithms. 3. Auflage, Addison-Wesley, Reading (MA), 1997.

Literature III
- Donald E. Knuth: The Art of Computer Programming, Vol. 3: Sorting and Searching. 2. Auflage, Addison-Wesley, Reading (MA), 1998.
- Christos H. Papadimitriou, Kenneth Steiglitz: Combinatorial Optimization: Algorithms and Complexity. Prentice Hall, 1982.
- Uwe Schöning: Algorithmik. Spektrum Akademischer Verlag, 2001.

Literature IV
- Steven S. Skiena: The Algorithm Design Manual. Springer, 1998.

Part II: Foundations

3 Goals
- Gain knowledge about efficient algorithms for important problems, i.e., learn how to solve certain types of problems efficiently.
- Learn how to analyze and judge the efficiency of algorithms.
- Learn how to design efficient algorithms.

4 Modelling Issues
What do you measure?
- Memory requirement
- Running time
- Number of comparisons
- Number of multiplications
- Number of hard-disc accesses
- Program size
- Power consumption
- ...

4 Modelling Issues

How do you measure?
- Implementing and testing on representative inputs: How do you choose your inputs? This may be very time-consuming. Very reliable results if done correctly. Results only hold for a specific machine and for a specific set of inputs.
- Theoretical analysis in a specific model of computation: Gives asymptotic bounds like "this algorithm always runs in time O(n²)". Typically focuses on the worst case. Can give lower bounds like "any comparison-based sorting algorithm needs at least Ω(n log n) comparisons in the worst case".

Input length
The theoretical bounds are usually given by a function f: N → N that maps the input length to the running time (or storage space, comparisons, multiplications, program size, etc.). The input length may e.g. be
- the size of the input (number of bits),
- the number of arguments.

Example. Suppose n numbers from the interval {1, ..., N} have to be sorted. In this case we usually say that the input length is n instead of e.g. n log N, which would be the number of bits required to encode the input.

Model of Computation
How to measure performance?
1. Calculate running time and storage space etc. in a simplified, idealized model of computation, e.g. Random Access Machine (RAM), Turing Machine (TM), ...
2. Calculate the number of certain basic operations: comparisons, multiplications, hard-disc accesses, ...
Version 2 is often easier, but focusing on one type of operation makes it more difficult to obtain meaningful results.

Turing Machine
- Very simple model of computation.
- Only the current memory location can be altered.
- Very good model for discussing computability, or polynomial vs. exponential time.
- Some simple problems, like recognizing whether the input is of the form xx for a string x, have a quadratic lower bound.
- Not a good model for developing efficient algorithms.
(Figure: the control unit holds the program; its state can act as constant-size memory.)
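The first option above, implementing and testing on representative inputs, is easy to try out in practice. The following is a minimal sketch and not part of the lecture material; the algorithm under test (Python's built-in sort) and the input sizes are arbitrary illustrative assumptions, and, as noted above, the resulting numbers only hold for one machine and one set of inputs.

```python
import random
import time

def measure(algorithm, sizes, trials=3):
    """Report the best wall-clock time of `algorithm` on random inputs of each size."""
    for n in sizes:
        data = [random.random() for _ in range(n)]
        best = float("inf")
        for _ in range(trials):
            copy = list(data)                      # fresh copy for each trial
            start = time.perf_counter()
            algorithm(copy)
            best = min(best, time.perf_counter() - start)
        print(f"n = {n:8d}: {best:.6f} s")

if __name__ == "__main__":
    measure(sorted, [10_000, 100_000, 1_000_000])  # built-in sort as the algorithm under test
```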

Random Access Machine (RAM)
- Input tape and output tape (sequences of zeros and ones; unbounded length).
- Memory unit: an infinite but countable number of registers R[0], R[1], R[2], ...
- Registers hold integers.
- Indirect addressing.
Note that the tapes are one-directional, and a READ- or WRITE-operation always advances its tape.

Operations
- input operations (input tape → R[i]): READ i
- output operations (R[i] → output tape): WRITE i
- register-register transfers: R[j] := R[i], R[j] := 4
- indirect addressing: R[j] := R[R[i]] loads the content of the register whose number is stored in R[i] into register j
- branching (including loops) based on comparisons:
  - jump x: jumps to position x in the program, i.e., sets the instruction counter to x; the next operation to be performed is the one at program position x
  - jumpz x R[i]: jump to x if R[i] = 0; if not, the instruction counter is increased by 1
  - jumpi i: jump to R[i] (indirect jump)
- arithmetic instructions +, −, ·, /: e.g. R[i] := R[j] + R[k]; R[i] := −R[k]
The jump directives are very close to the jump instructions contained in the assembler language of real machines.

Model of Computation
- Uniform cost model: every operation takes time 1.
- Logarithmic cost model: the cost depends on the contents of the memory cells: the time for a step is proportional to the length (in bits) of the largest operand involved; the storage space of a register is equal to the length (in bits) of the largest value ever stored in it.
- Bounded-word RAM model: the cost is uniform, but the largest value stored in a register may not exceed 2^w, where usually w = log₂ n.
The latter model is quite realistic, as the word size of a standard computer that handles a problem of size n must be at least log n, since otherwise the computer could either not store the problem instance or not address all of its memory.

4 Modelling Issues

Example. Algorithm RepeatedSquaring(n)
1: r ← 2
2: for i = 1, ..., n do
3:   r ← r²
4: return r

Running time:
- uniform model: n steps
- logarithmic model: 1 + 2 + 4 + ... + 2^n = 2^{n+1} − 1 = Θ(2^n)
Space requirement:
- uniform model: O(1)
- logarithmic model: O(2^n)

There are different types of complexity bounds:
- best-case complexity: C_bc(n) := min{ C(x) : |x| = n }. Usually easy to analyze, but not very meaningful.
- worst-case complexity: C_wc(n) := max{ C(x) : |x| = n }. Usually moderately easy to analyze; sometimes too pessimistic.
- average-case complexity: C_avg(n) := (1/|I_n|) Σ_{|x|=n} C(x). More generally, for a probability measure µ: C_avg(n) := Σ_{x ∈ I_n} µ(x) · C(x).
  (Here C(x) is the cost of instance x, |x| the input length of instance x, and I_n the set of instances of length n.)
- amortized complexity: the average cost of data structure operations over a worst-case sequence of operations.
- randomized complexity: the algorithm may use random bits; consider the expected running time (over all possible choices of random bits) for a fixed input x, and then take the worst case over all x with |x| = n.

5 Asymptotic Notation

We are usually not interested in exact running times, but only in an asymptotic classification of the running time that ignores constant factors and constant additive offsets.
- We are usually interested in the running times for large values of n. Then constant additive terms do not play an important role.
- An exact analysis (e.g. exactly counting the number of operations in a RAM) may be hard, but wouldn't lead to more precise results, as the computational model is already quite a distance from reality.
- A linear speed-up (i.e., by a constant factor) is always possible, e.g. by implementing the algorithm on a faster machine.
- Running time should be expressed by simple functions.
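To see why the uniform and the logarithmic cost model diverge so drastically on RepeatedSquaring, one can run it and account for both cost measures. This is a small illustrative sketch, not from the slides; the bit length of the current operand is used as a stand-in for the logarithmic cost of one squaring.

```python
def repeated_squaring(n):
    """r <- 2, then square r a total of n times; return r and both cost measures."""
    r = 2
    uniform_cost = 0       # one unit per squaring (uniform cost model)
    logarithmic_cost = 0   # bit length of the operand per squaring (logarithmic model)
    for _ in range(n):
        logarithmic_cost += r.bit_length()
        r = r * r
        uniform_cost += 1
    return r, uniform_cost, logarithmic_cost

if __name__ == "__main__":
    for n in range(1, 8):
        _, uni, log_cost = repeated_squaring(n)
        # uniform cost grows like n, logarithmic cost roughly like 2^n
        print(f"n = {n}: uniform = {uni}, logarithmic = {log_cost}")
```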

Asymptotic Notation: Formal Definition

Let f denote a function from N to R⁺.
- O(f) = { g | ∃ c > 0 ∃ n₀ ∈ N ∀ n ≥ n₀ : g(n) ≤ c·f(n) }
  (set of functions that asymptotically grow not faster than f)
- Ω(f) = { g | ∃ c > 0 ∃ n₀ ∈ N ∀ n ≥ n₀ : g(n) ≥ c·f(n) }
  (set of functions that asymptotically grow not slower than f)
- Θ(f) = Ω(f) ∩ O(f)
  (functions that asymptotically have the same growth as f)
- o(f) = { g | ∀ c > 0 ∃ n₀ ∈ N ∀ n ≥ n₀ : g(n) ≤ c·f(n) }
  (set of functions that asymptotically grow slower than f)
- ω(f) = { g | ∀ c > 0 ∃ n₀ ∈ N ∀ n ≥ n₀ : g(n) ≥ c·f(n) }
  (set of functions that asymptotically grow faster than f)

There is an equivalent definition using limit notation (assuming that the respective limit exists). Let f and g be functions from N to R⁺.
- g ∈ O(f): 0 ≤ lim_{n→∞} g(n)/f(n) < ∞
- g ∈ Ω(f): 0 < lim_{n→∞} g(n)/f(n) ≤ ∞
- g ∈ Θ(f): 0 < lim_{n→∞} g(n)/f(n) < ∞
- g ∈ o(f): lim_{n→∞} g(n)/f(n) = 0
- g ∈ ω(f): lim_{n→∞} g(n)/f(n) = ∞
Note that for the version of the Landau notation defined here, we assume that f and g are positive functions. There also exist versions for arbitrary functions, and for the case that the limit is not taken at infinity.

Abuse of notation
1. People write f = O(g) when they mean f ∈ O(g). This is not an equality (how could a function be equal to a set of functions?).
2. People write f(n) = O(g(n)) when they mean f ∈ O(g), with f: N → R⁺, n ↦ f(n), and g: N → R⁺, n ↦ g(n). In this context f(n) does not mean the function f evaluated at n; it is a shorthand for the function itself (leaving out domain and codomain and only giving the rule of correspondence of the function).
3. People write e.g. h(n) = f(n) + o(g(n)) when they mean that there exists a function z: N → R⁺, n ↦ z(n), z ∈ o(g), such that h(n) ≤ f(n) + z(n). This is particularly useful if you do not want to ignore constant factors. For example, the median of n elements can be determined using (3/2)n + o(n) comparisons.
4. People write O(f(n)) = O(g(n)) when they mean O(f(n)) ⊆ O(g(n)). Again this is not an equality.

Asymptotic Notation

Lemma 3. Let f, g be functions with the property that there exists an n₀ > 0 such that f(n) > 0 for all n ≥ n₀ (and the same for g). Then
- c · f(n) ∈ Θ(f(n)) for any constant c > 0
- O(f(n)) + O(g(n)) = O(f(n) + g(n))
- O(f(n)) · O(g(n)) = O(f(n) · g(n))
- O(f(n)) + O(g(n)) = O(max{f(n), g(n)})
The analogous expressions also hold for Ω. Note that this means that f(n) + g(n) ∈ Θ(max{f(n), g(n)}).

Comments
- Do not use asymptotic notation within induction proofs.
- For any constants a, b we have log_a n = Θ(log_b n). Therefore, we will usually ignore the base of a logarithm within asymptotic notation.
- In general, log n = log₂ n, i.e., we use 2 as the default base for the logarithm.

6 Recurrences

Algorithm mergesort(list L)
1: s ← size(L)
2: if s ≤ 1 return L
3: L₁ ← L[1 ... ⌊s/2⌋]
4: L₂ ← L[⌊s/2⌋ + 1 ... s]
5: mergesort(L₁)
6: mergesort(L₂)
7: L ← merge(L₁, L₂)
8: return L

This algorithm requires
  T(n) ≤ 2 T(⌈n/2⌉) + O(n)
comparisons when n > 1, and 0 comparisons when n ≤ 1.

How do we bring the expression for the number of comparisons (≈ running time) into a closed form? For this we need to solve the recurrence.
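A direct way to connect the mergesort pseudocode with the recurrence T(n) ≤ 2 T(⌈n/2⌉) + O(n) is to count comparisons explicitly. The sketch below is illustrative and not from the slides; it prints the measured comparison count next to n·log₂ n for a few random inputs.

```python
# Mergesort with an explicit comparison counter, to relate the code to the
# recurrence T(n) <= 2*T(ceil(n/2)) + O(n).

def mergesort(L, counter):
    """Return L sorted, adding the number of element comparisons to counter[0]."""
    if len(L) <= 1:
        return L
    mid = (len(L) + 1) // 2                  # ceil(len(L)/2)
    left = mergesort(L[:mid], counter)
    right = mergesort(L[mid:], counter)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1                      # one comparison per loop iteration
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:]); merged.extend(right[j:])
    return merged

if __name__ == "__main__":
    import math, random
    for n in (2**10, 2**14, 2**18):
        counter = [0]
        mergesort([random.random() for _ in range(n)], counter)
        print(n, counter[0], round(n * math.log2(n)))   # comparisons vs. n*log2(n)
```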

Methods for Solving Recurrences

1. Guessing + Induction: Guess the right solution and prove that it is correct via induction. It needs experience to make the right guess.
2. Master Theorem: For a lot of recurrences that appear in the analysis of algorithms this theorem can be used to obtain tight asymptotic bounds. It does not provide exact solutions.
3. Characteristic Polynomial: Linear homogeneous recurrences can be solved via this method.

6.1 Guessing + Induction

First we need to get rid of the O-notation in our recurrence:
  T(n) ≤ 2 T(⌈n/2⌉) + cn for n ≥ 2, and T(n) ≤ 0 otherwise.
Assume that instead we had
  T(n) ≤ 2 T(n/2) + cn for n ≥ 2, and T(n) ≤ 0 otherwise.
One way of solving such a recurrence is to guess a solution and check that it is correct by plugging it in.

Suppose we guess T(n) ≤ d·n·log n for a constant d. Then
  T(n) ≤ 2 T(n/2) + cn
       ≤ 2 ( d (n/2) log(n/2) ) + cn
       = dn (log n − 1) + cn
       = dn log n + (c − d) n
       ≤ dn log n
if we choose d ≥ c. Formally one would make an induction proof, where the above is the induction step. The base case is usually trivial.

Guess: T(n) ≤ d·n·log n for the recurrence
  T(n) ≤ 2 T(n/2) + cn for n ≥ 16, and T(n) ≤ b otherwise.

Proof (by induction).
- Base case (2 ≤ n < 16): true if we choose d ≥ b.
- Induction step: suppose the statement is true for all n' ∈ {2, ..., n − 1}, and let n ≥ 16. We prove it for n:
    T(n) ≤ 2 T(n/2) + cn ≤ 2 ( d (n/2) log(n/2) ) + cn = dn (log n − 1) + cn = dn log n + (c − d) n ≤ dn log n.
  Hence, the statement is true if we choose d ≥ c.

Note that this proves the statement only for n ≥ 2, as the statement is wrong for n = 1. The base case is usually omitted, as it is the same for different recurrences.

6.1 Guessing + Induction

Why did we change the recurrence by getting rid of the ceiling? If we do not do this, we instead have to consider the following recurrence:
  T(n) ≤ 2 T(⌈n/2⌉) + cn for n ≥ 16, and T(n) ≤ b otherwise.
Note that we can do this, as for constant-sized inputs the running time is always some constant (b in the above case).

We again make the guess T(n) ≤ d·n·log n and get
  T(n) ≤ 2 T(⌈n/2⌉) + cn
       ≤ 2 ( d ⌈n/2⌉ log⌈n/2⌉ ) + cn
       ≤ 2 ( d (n/2 + 1) log(n/2 + 1) ) + cn          (using ⌈n/2⌉ ≤ n/2 + 1)
       ≤ 2 d (n/2 + 1) log( (9/16) n ) + cn           (using n/2 + 1 ≤ (9/16) n for n ≥ 16)
       = dn log( (9/16) n ) + 2d log( (9/16) n ) + cn
       ≤ dn log n + (log 9 − 4) dn + 2d log n + cn    (using log( (9/16) n ) = log n + log 9 − 4)
       ≤ dn log n + (log 9 − 3.5) dn + cn             (using 2 log n ≤ n/2 for n ≥ 16)
       ≤ dn log n − 0.33 dn + cn
       ≤ dn log n
for a suitable choice of d.

6.2 Master Theorem

Lemma 4. Let a ≥ 1, b > 1 and ε > 0 denote constants, and consider the recurrence
  T(n) = a · T(n/b) + f(n).
- Case 1: If f(n) = O(n^{log_b(a) − ε}) then T(n) = Θ(n^{log_b a}).
- Case 2: If f(n) = Θ(n^{log_b(a)} · log^k n) then T(n) = Θ(n^{log_b a} · log^{k+1} n).
- Case 3: If f(n) = Ω(n^{log_b(a) + ε}) and for sufficiently large n, a·f(n/b) ≤ c·f(n) for some constant c < 1, then T(n) = Θ(f(n)).
Note that the cases do not cover all possibilities.

We prove the Master Theorem for the case that n is of the form b^ℓ, and we assume that the non-recursive case occurs for problem size 1 and incurs cost 1.
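Before following the proof, the three cases of the Master Theorem can be sanity-checked numerically by unrolling T(n) = a·T(n/b) + f(n) exactly for n a power of b. The script below is an illustrative sketch; the particular choices of a, b and f(n) are assumptions made for the demonstration, not examples taken from the slides.

```python
# Numerically unroll T(n) = a*T(n/b) + f(n) with T(1) = 1 for n a power of b,
# and compare with the growth order predicted by the Master theorem.
import math

def T(n, a, b, f):
    return 1 if n <= 1 else a * T(n // b, a, b, f) + f(n)

cases = [
    # (a, b, f(n), predicted growth g(n), label)
    (4, 2, lambda n: n,    lambda n: n**2,             "Case 1: T(n) = Theta(n^2)"),
    (2, 2, lambda n: n,    lambda n: n * math.log2(n), "Case 2: T(n) = Theta(n log n)"),
    (2, 2, lambda n: n**2, lambda n: n**2,             "Case 3: T(n) = Theta(n^2)"),
]

for a, b, f, g, label in cases:
    print(label)
    for l in (8, 12, 16):
        n = b**l
        # the ratio T(n)/g(n) should approach a constant as n grows
        print(f"  n = 2^{l}: T(n)/g(n) = {T(n, a, b, f) / g(n):.3f}")
```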

12 The Recursion Tree 6. Master Theorem The running time of a recursive algorithm can be visualized by a recursion tree: n b a nx n b a a n b a f (n) af ( n b ) This gives log b n ( ) T (n) = n log b n a + a i f b i i=. n n n n n n n n n b b b b b b b b b a a a a a a a a a a f ( n b ) a log bn = n log ba 6. Master Theorem c Ernst Mayr, Harald Räcke Master Theorem c Ernst Mayr, Harald Räcke 44 Case. Now suppose that f (n) cn log b a ɛ. T (n) n log b a = b i(log b a ɛ) = b ɛi (b log b a ) i = b ɛi a i Hence, k i= qi = qk+ q ( ) c T (n) b ɛ + log b n i= ( ) n a i f b i log b n ( ) n logb c a i a ɛ b i i= = cn log b a ɛ log b n i= ( b ɛ ) i = cn log b a ɛ (b ɛ log b n )/(b ɛ ) = cn log b a ɛ (n ɛ )/(b ɛ ) c = b ɛ nlog b a (n ɛ )/(n ɛ ) n log b(a) T (n) = O(n log b a ). Case. Now suppose that f (n) cn log b a. Hence, T (n) n log b a = log b n i= ( ) n a i f b i log b n ( ) n logb c a i a b i i= log b n = cn log b a i= = cn log b a log b n T (n) = O(n log b a log b n) T (n) = O(n log b a log n). 6. Master Theorem c Ernst Mayr, Harald Räcke Master Theorem c Ernst Mayr, Harald Räcke 46

13 Case. Now suppose that f (n) cn log b a. Hence, T (n) n log b a = log b n i= ( ) n a i f b i log b n ( ) n logb c a i a b i i= log b n = cn log b a i= = cn log b a log b n T (n) = Ω(n log b a log b n) T (n) = Ω(n log b a log n). Case. Now suppose that f (n) cn log b a (log b (n)) k. T (n) n log b a = n = b l l = log b n log b n i= ( ) n a i f b i log b n ( ) n logb c a i a b i i= = cn log b a l i= ( ( b l )) k log b l = cn log b a (l i) k = cn log b a i= l i= i k c k nlog b a l k+ b i k lk+ ( ( )) n k log b b i T (n) = O(n log b a log k+ n). 6. Master Theorem c Ernst Mayr, Harald Räcke Master Theorem c Ernst Mayr, Harald Räcke 48 Case 3. Now suppose that f (n) dn log b a+ɛ, and that for sufficiently large n: af (n/b) cf (n), for c <. From this we get a i f (n/b i ) c i f (n), where we assume that n/b i n is still sufficiently large. Hence, T (n) n log b a = q < : n i= qi = qn+ q q log b n i= log b n = i= ( ) n a i f b i c i f (n) + O(n log b a ) c f (n) + O(nlog b a ) T (n) O(f (n)) T (n) = Θ(f (n)). Example: Multiplying Two Integers Suppose we want to multiply two n-bit Integers, but our registers can only perform operations on integers of constant size. For this we first need to be able to add two integers A and B: A B This gives that two n-bit integers can be added in time O(n). 6. Master Theorem c Ernst Mayr, Harald Räcke Master Theorem c Ernst Mayr, Harald Räcke 5

14 Example: Multiplying Two Integers Suppose that we want to multiply an n-bit integer A and an m-bit integer B (m n). Time requirement: Computing intermediate results: O(nm). This is also nown as the school method for multiplying integers. Note that the intermediate numbers that are generated can have at most m + n n bits. Adding m numbers of length n: O((m + n)m) = O(nm). Example: Multiplying Two Integers A recursive approach: Suppose that integers A and B are of length n = k, for some k. b n. B.. b n. B. b. n... b.... A. a. n... B A A a Then it holds that Hence, a n a n A = A n + A and B = B n + B A B = A B n + (A B + A B ) n + A B 6. Master Theorem c Ernst Mayr, Harald Räcke 5 6. Master Theorem c Ernst Mayr, Harald Räcke 5 Example: Multiplying Two Integers Example: Multiplying Two Integers Algorithm 3 mult(a, B) : if A = B = then : return a b 3: split A into A and A 4: split B into B and B 5: Z mult(a, B ) 6: Z mult(a, B ) + mult(a, B ) 7: Z mult(a, B ) 8: return Z n + Z n + Z We get the following recurrence: ( n ) T (n) = 4T + O(n). O() O() O(n) O(n) T ( n ) T ( n ) + O(n) T ( n ) O(n) Master Theorem: Recurrence: T [n] = at ( n b ) + f (n). Case : f (n) = O(n log b a ɛ ) T (n) = Θ(n log b a ) Case : f (n) = Θ(n log b a log k n) T (n) = Θ(n log b a log k+ n) Case 3: f (n) = Ω(n log b a+ɛ ) T (n) = Θ(f (n)) In our case a = 4, b =, and f (n) = Θ(n). Hence, we are in Case, since n = O(n ɛ ) = O(n log b a ɛ ). We get a running time of O(n ) for our algorithm. Not better then the school method. 6. Master Theorem c Ernst Mayr, Harald Räcke Master Theorem c Ernst Mayr, Harald Räcke 54

15 Example: Multiplying Two Integers We can use the following identity to compute Z : Hence, Z = A B + A B = Z = Z {}}{{}} { = (A + A ) (B + B ) A B A B Algorithm 4 mult(a, B) : if A = B = then : return a b 3: split A into A and A 4: split B into B and B 5: Z mult(a, B ) 6: Z mult(a, B ) 7: Z mult(a + A, B + B ) Z Z 8: return Z n + Z n + Z O() O() O(n) O(n) T ( n ) T ( n ) + O(n) T ( n ) O(n) Example: Multiplying Two Integers We get the following recurrence: ( n ) T (n) = 3T + O(n). Master Theorem: Recurrence: T [n] = at ( n b ) + f (n). Case : f (n) = O(n log b a ɛ ) T (n) = Θ(n log b a ) Case : f (n) = Θ(n log b a log k n) T (n) = Θ(n log b a log k+ n) Case 3: f (n) = Ω(n log b a+ɛ ) T (n) = Θ(f (n)) Again we are in Case. We get a running time of Θ(n log 3 ) Θ(n.59 ). A huge improvement over the school method. 6. Master Theorem c Ernst Mayr, Harald Räcke Master Theorem c Ernst Mayr, Harald Räcke The Characteristic Polynomial Consider the recurrence relation: c T (n) + c T (n ) + c T (n ) + + c k T (n k) = f (n) This is the general form of a linear recurrence relation of order k with constant coefficients (c, c k ). T (n) only depends on the k preceding values. This means the recurrence relation is of order k. The recurrence is linear as there are no products of T [n] s. If f (n) = then the recurrence relation becomes a linear, homogenous recurrence relation of order k. 6.3 The Characteristic Polynomial Observations: The solution T [], T [], T [],... is completely determined by a set of boundary conditions that specify values for T [],..., T [k ]. In fact, any k consecutive values completely determine the solution. k non-concecutive values might not be an appropriate set of boundary conditions (depends on the problem). Approach: First determine all solutions that satisfy recurrence relation. Then pick the right one by analyzing boundary conditions. First consider the homogenous case. 6.3 The Characteristic Polynomial c Ernst Mayr, Harald Räcke The Characteristic Polynomial c Ernst Mayr, Harald Räcke 58
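Karatsuba's improvement of the recursive integer multiplication, computing the middle coefficient as (A₁ + A₀)(B₁ + B₀) − A₁B₁ − A₀B₀ so that only three recursive multiplications are needed, can be sketched in code as follows. This is an illustrative sketch and not the lecture's reference implementation; it works on Python integers and splits on bit counts rather than assuming that n is a power of two.

```python
def karatsuba(A, B):
    """Multiply two non-negative integers with three recursive multiplications."""
    if A < 16 or B < 16:                          # small operands: school method
        return A * B
    n = max(A.bit_length(), B.bit_length())
    half = n // 2
    A1, A0 = A >> half, A & ((1 << half) - 1)     # A = A1 * 2^half + A0
    B1, B0 = B >> half, B & ((1 << half) - 1)     # B = B1 * 2^half + B0
    Z2 = karatsuba(A1, B1)
    Z0 = karatsuba(A0, B0)
    Z1 = karatsuba(A1 + A0, B1 + B0) - Z2 - Z0    # = A1*B0 + A0*B1
    return (Z2 << (2 * half)) + (Z1 << half) + Z0

if __name__ == "__main__":
    import random
    for _ in range(5):
        a, b = random.getrandbits(256), random.getrandbits(256)
        assert karatsuba(a, b) == a * b
    print("karatsuba agrees with built-in multiplication")
```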

16 The Homogenous Case The Homogenous Case The solution space S = {T = T [], T [], T [],... } T fulfills recurrence relation is a vector space. This means that if T, T S, then also αt + βt S, for arbitrary constants α, β. How do we find a non-trivial solution? We guess that the solution is of the form λ n, λ, and see what happens. In order for this guess to fulfill the recurrence we need for all n k. c λ n + c λ n + c λ n + + c k λ n k = Dividing by λ n k gives that all these constraints are identical to c λ k + c λ k + c λ k + + c k = }{{} characteristic polynomial P[λ] This means that if λ i is a root (Nullstelle) of P[λ] then T [n] = λ n i is a solution to the recurrence relation. Let λ,..., λ k be the k (complex) roots of P[λ]. Then, because of the vector space property α λ n + α λ n + + α kλ n k is a solution for arbitrary values α i. 6.3 The Characteristic Polynomial c Ernst Mayr, Harald Räcke The Characteristic Polynomial c Ernst Mayr, Harald Räcke 6 The Homogenous Case Lemma 5 Assume that the characteristic polynomial has k distinct roots λ,..., λ k. Then all solutions to the recurrence relation are of the form α λ n + α λ n + + α kλ n k. Proof. There is one solution for every possible choice of boundary conditions for T [],..., T [k]. The Homogenous Case Proof (cont.). Suppose I am given boundary conditions T [i] and I want to see whether I can choose the α is such that these conditions are met: α λ + α λ + + α k λ k = T [] α λ + α λ + + α k λ k = T []. α λ k + α λ k + + α k λ k k = T [k] We show that the above set of solutions contains one solution for every choice of boundary conditions. 6.3 The Characteristic Polynomial c Ernst Mayr, Harald Räcke The Characteristic Polynomial c Ernst Mayr, Harald Räcke 6

17 The Homogenous Case Proof (cont.). Suppose I am given boundary conditions T [i] and I want to see whether I can choose the α is such that these conditions are met: λ λ λ k λ λ λ k. λ k λ k λ k k α α.. α k T [] T [] =. T [k] We show that the column vectors are linearly independent. Then the above equation has a solution. The Homogenous Case Proof (cont.). This we show by induction: base case (k = ): A vector (λ i ), λ i is linearly independent. induction step (k k + ): assume for contradiction that there exist α i s with α λ. λ k λ k + + α k λ k. λ k k λ k k = and not all α i =. Then all α i! 6.3 The Characteristic Polynomial c Ernst Mayr, Harald Räcke The Characteristic Polynomial c Ernst Mayr, Harald Räcke 64 The Homogeneous Case v := λ v k := λ λ λ k α + + α k.. = λ k λ k k λ v = λ k v k = λ k λ k k The Homogeneous Case This gives that k i= ( λ i λ k )α i v i =. This is a contradiction as the v i s are linearly independent because of induction hypothesis. This means that Hence, k α i v i = and i= k λ i α i v i = i= k i= α i v i + α k v k = and k λ k i= λ i α i v i = α k v k 6.3 The Characteristic Polynomial c Ernst Mayr, Harald Räcke The Characteristic Polynomial c Ernst Mayr, Harald Räcke 66

18 The Homogeneous Case What happens if the roots are not all distinct? Suppose we have a root λ i with multiplicity (Vielfachheit) at least. Then not only is λ n i a solution to the recurrence but also nλn i. To see this consider the polynomial P(λ)λ n k = c λ n + c λ n + c λ n + + c k λ n k Since λ i is a root we can write this as Q(λ)(λ λ i ). Calculating the derivative gives a polynomial that still has root λ i. This means Hence, c nλ n i + c (n )λ n i + + c k (n k)λ n k i = The Homogeneous Case Suppose λ i has multiplicity j. We know that c nλ n i + c (n )λ n i + + c k (n k)λ n k i = (after taking the derivative; multiplying with λ; plugging in λ i ) Doing this again gives c n λ n i + c (n ) λ n i We can continue j times. Hence, n l λ n i is a solution for l,..., j. + + c k (n k) λ n k i = c nλ n i + c (n )λ n i + + c k (n k)λ n k i = }{{}}{{}}{{} T [n] T [n ] T [n k] 6.3 The Characteristic Polynomial c Ernst Mayr, Harald Räcke The Characteristic Polynomial c Ernst Mayr, Harald Räcke 68 The Homogeneous Case Lemma 6 Let P[λ] denote the characteristic polynomial to the recurrence c T [n] + c T [n ] + + c k T [n k] = Let λ i, i =,..., m be the (complex) roots of P[λ] with multiplicities l i. Then the general solution to the recurrence is given by T [n] = m l i i= j= α ij (n j λ n i ). The full proof is omitted. We have only shown that any choice of α ij s is a solution to the recurrence. Example: Fibonacci Sequence T [] = T [] = T [n] = T [n ] + T [n ] for n The characteristic polynomial is Finding the roots, gives λ λ λ / = ± 4 + = ( ± ) The Characteristic Polynomial c Ernst Mayr, Harald Räcke The Characteristic Polynomial c Ernst Mayr, Harald Räcke 7

Example: Fibonacci Sequence

Hence, the solution is of the form
  α ((1 + √5)/2)^n + β ((1 − √5)/2)^n.
T[0] = 0 gives α + β = 0.
T[1] = 1 gives
  α (1 + √5)/2 + β (1 − √5)/2 = 1,   i.e.   α − β = 2/√5.

Hence, the solution is
  T[n] = (1/√5) [ ((1 + √5)/2)^n − ((1 − √5)/2)^n ].

The Inhomogeneous Case

Consider the recurrence relation
  c_0 T(n) + c_1 T(n−1) + c_2 T(n−2) + ... + c_k T(n−k) = f(n)
with f(n) ≠ 0. While we have a fairly general technique for solving homogeneous linear recurrence relations, the inhomogeneous case is different.

The general solution of the recurrence relation is
  T(n) = T_h(n) + T_p(n),
where T_h is any solution to the homogeneous equation, and T_p is one particular solution to the inhomogeneous equation. There is no general method to find a particular solution.
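The closed form can be checked mechanically against the recurrence. The short script below is an illustrative sketch, not part of the slides; it compares the rounded closed-form expression with the iteratively computed Fibonacci numbers.

```python
import math

def fib_closed(n):
    """Closed form T[n] = (phi^n - psi^n) / sqrt(5) with phi, psi = (1 +- sqrt(5))/2."""
    sqrt5 = math.sqrt(5)
    phi, psi = (1 + sqrt5) / 2, (1 - sqrt5) / 2
    return round((phi**n - psi**n) / sqrt5)

def fib_iter(n):
    a, b = 0, 1                      # T[0] = 0, T[1] = 1
    for _ in range(n):
        a, b = b, a + b
    return a

assert all(fib_closed(n) == fib_iter(n) for n in range(40))
print("closed form matches the recurrence for n = 0..39")
```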

20 The Inhomogeneous Case Example: Then, T [n] = T [n ] + T [] = T [n ] = T [n ] + (n ) Subtracting the first from the second equation gives, or T [n] T [n ] = T [n ] T [n ] (n ) T [n] = T [n ] T [n ] (n ) I get a completely determined recurrence if I add T [] = and T [] =. The Inhomogeneous Case Example: Characteristic polynomial: Then the solution is of the form T [] = gives α =. λ } {{ λ + } = (λ ) T [n] = α n + βn n = α + βn T [] = gives + β = β =. 6.3 The Characteristic Polynomial c Ernst Mayr, Harald Räcke The Characteristic Polynomial c Ernst Mayr, Harald Räcke 76 The Inhomogeneous Case If f (n) is a polynomial of degree r this method can be applied r + times to obtain a homogeneous equation: Shift: T [n] = T [n ] T [n ] + n T [n] = T [n ] + n Shift: T [n ] = T [n ] + (n ) = T [n ] + n n + Difference: T [n] T [n ] = T [n ] T [n ] + n T [n] = T [n ] T [n ] + n T [n ] = T [n ] T [n 3] + (n ) = T [n ] T [n 3] + n 3 Difference: T [n] T [n ] =T [n ] T [n ] + n T [n ] + T [n 3] n + 3 T [n] = 3T [n ] 3T [n ] + T [n 3] + and so on The Characteristic Polynomial c Ernst Mayr, Harald Räcke The Characteristic Polynomial c Ernst Mayr, Harald Räcke 78

21 6.4 Generating Functions 6.4 Generating Functions Definition 7 (Generating Function) Let (a n ) n be a sequence. The corresponding generating function (Erzeugendenfunktion) is F(z) := a n z n ; n= exponential generating function (exponentielle Erzeugendenfunktion) is F(z) = n a n n! zn. Example 8. The generating function of the sequence (,,,...) is F(z) =.. The generating function of the sequence (,,,...) is F(z) = z. 6.4 Generating Functions c Ernst Mayr, Harald Räcke Generating Functions c Ernst Mayr, Harald Räcke Generating Functions 6.4 Generating Functions There are two different views: A generating function is a formal power series (formale Potenzreihe). Then the generating function is an algebraic object. Let f = n= a n z n and g = n= b n z n. Equality: f and g are equal if a n = b n for all n. Addition: f + g := n= (a n + b n )z n. The arithmetic view: We view a power series as a function f : C C. Then, it is important to think about convergence/convergence radius etc. Multiplication: f g := n= c n z n with c = n p= a p b n p. There are no convergence issues here. 6.4 Generating Functions c Ernst Mayr, Harald Räcke Generating Functions c Ernst Mayr, Harald Räcke 8

22 6.4 Generating Functions 6.4 Generating Functions Suppose we are given the generating function What does n= z n = z mean in the algebraic view? It means that the power series z and the power series n= z n are invers, i.e., This is well-defined. ( ) ( z n= z n) =. n= We can compute the derivative: z n = z. nz n = n nz n }{{} n= (n+)z n ( z) Hence, the generating function of the sequence a n = n + is /( z). 6.4 Generating Functions c Ernst Mayr, Harald Räcke Generating Functions c Ernst Mayr, Harald Räcke Generating Functions We can repeat this Derivative: (n + )z n = ( z). n= n(n + )z n = n }{{} n= (n+)(n+)z n ( z) 3 Hence, the generating function of the sequence a n = (n + )(n + ) is ( z). 6.4 Generating Functions Computing the k-th derivative of z n. n(n )... (n k + )z n k = n k n k! = ( z) k+. Hence: ( ) n + k z n = k ( z) k+. n The generating function of the sequence a n = (n + k)... (n + )z n ( ) n+k k is ( z) k Generating Functions c Ernst Mayr, Harald Räcke Generating Functions c Ernst Mayr, Harald Räcke 86

23 6.4 Generating Functions 6.4 Generating Functions nz n = + )z n n (n n n = ( z) z z = ( z) z n We know Hence, n n y n = y a n z n = az The generating function of the sequence a n = n is z ( z). The generating function of the sequence f n = a n is az. 6.4 Generating Functions c Ernst Mayr, Harald Räcke Generating Functions c Ernst Mayr, Harald Räcke Generating Functions Suppose we have again the recurrence a n = a n + for n and a =. A(z) = a n z n n = a + (a n + )z n n = + z a n z n + n n z n 6.4 Generating Functions Solving for A(z) gives a n z n = A(z) = ( z) n = (n + )z n n Hence, a n = n +. = z a n z n + z n n n = za(z) + n = za(z) + z z n 6.4 Generating Functions c Ernst Mayr, Harald Räcke Generating Functions c Ernst Mayr, Harald Räcke 9

24 Some Generating Functions n-th sequence element n + generating function z ( z) ( ) n+k n ( z) k+ n z ( z) a n az n z( + z) ( z) 3 n! z( + z) ( z) 3 Some Generating Functions n-th sequence element cf n f n + g n n i= f ig n i generating function cf F + G F G f n k (n k); otw. z k F n i= f i nf n c n f n F(z) z z df(z) dz F(cz) 6.4 Generating Functions c Ernst Mayr, Harald Räcke Generating Functions c Ernst Mayr, Harald Räcke 9 Solving Recursions with Generating Functions. Set A(z) = n a n z n.. Transform the right hand side so that boundary condition and recurrence relation can be plugged in. 3. Do further transformations so that the infinite sums on the right hand side can be replaced by A(z). 4. Solving for A(z) gives an equation of the form A(z) = f (z), where hopefully f (z) is a simple function. 5. Write f (z) as a formal power series. Techniques: partial fraction decomposition (Partialbruchzerlegung) lookup in tables 6. The coefficients of the resulting power series are the a n. Example: a n = a n, a =. Set up generating function: A(z) = a n z n n. Transform right hand side so that recurrence can be plugged in: A(z) = a + a n z n n. Plug in: A(z) = + (a n )z n n 6.4 Generating Functions c Ernst Mayr, Harald Räcke Generating Functions c Ernst Mayr, Harald Räcke 94

25 Example: a n = a n, a = 3. Transform right hand side so that infinite sums can be replaced by A(z) or by simple function. A(z) = + (a n )z n n Example: a n = a n, a = 5. Rewrite f (n) as a power series: a n z n = A(z) = z = n z n n n = + z a n z n n = + z a n z n n = + z A(z) 4. Solve for A(z). A(z) = z 6.4 Generating Functions c Ernst Mayr, Harald Räcke Generating Functions c Ernst Mayr, Harald Räcke 96 Example: a n = 3a n + n, a =. Set up generating function: A(z) = a n z n n Example: a n = 3a n + n, a =./3. Transform right hand side: A(z) = a n z n n = a + a n z n n = + (3a n + n)z n n = + 3z a n z n + nz n n n = + 3z a n z n + nz n n n z = + 3zA(z) + ( z) 6.4 Generating Functions c Ernst Mayr, Harald Räcke Generating Functions c Ernst Mayr, Harald Räcke 98

Example: a_n = 3a_{n−1} + n, a_0 = 1

4. Solve for A(z):
  A(z) = 1 + 3z·A(z) + z/(1 − z)²
gives
  A(z) = ( (1 − z)² + z ) / ( (1 − 3z)(1 − z)² ) = (z² − z + 1) / ( (1 − 3z)(1 − z)² ).

5. Write f(z) as a formal power series. We use partial fraction decomposition:
  (z² − z + 1) / ( (1 − 3z)(1 − z)² ) = A/(1 − 3z) + B/(1 − z) + C/(1 − z)².
This leads to the conditions
  A + B + C = 1,   2A + 4B + 3C = 1,   A + 3B = 1,
which give A = 7/4, B = −1/4, C = −1/2. Hence,
  A(z) = (7/4) · 1/(1 − 3z) − (1/4) · 1/(1 − z) − (1/2) · 1/(1 − z)²
       = Σ_n ( (7/4)·3^n − 1/4 − (1/2)(n + 1) ) z^n.

6. This means a_n = (7/4)·3^n − n/2 − 3/4 for n ≥ 0.

6.5 Transformation of the Recurrence

Example 9
  f_0 = 1, f_1 = 2, f_n = f_{n−1} · f_{n−2} for n ≥ 2.
Define g_n := log f_n. Then
  g_n = g_{n−1} + g_{n−2} for n ≥ 2,
  g_1 = log 2 = 1 (taking log = log₂), g_0 = 0,
so g_n = F_n (the n-th Fibonacci number) and f_n = 2^{F_n}.
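The closed form obtained via generating functions can also be verified against the recurrence. The sketch below is illustrative and not part of the slides; exact rational arithmetic avoids rounding issues.

```python
# Check the closed form for a_0 = 1, a_n = 3*a_{n-1} + n obtained above.
from fractions import Fraction

def a_recurrence(n):
    a = 1
    for i in range(1, n + 1):
        a = 3 * a + i
    return a

def a_closed(n):
    # a_n = (7/4)*3^n - n/2 - 3/4
    return Fraction(7, 4) * 3**n - Fraction(n, 2) - Fraction(3, 4)

assert all(a_closed(n) == a_recurrence(n) for n in range(30))
print("a_n = (7/4)*3^n - n/2 - 3/4 matches the recurrence for n = 0..29")
```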

27 6.5 Transformation of the Recurrence 6.5 Transformation of the Recurrence Example Define f = f n = 3f n + n; for n = k ; Example Then: g = g k = 3g k + k, k g k := f k. We get, g k = 3 k+ k+, hence f n = 3 3 k k = 3( log 3 ) k k = 3( k ) log 3 k = 3n log 3 n. 6.5 Transformation of the Recurrence c Ernst Mayr, Harald Räcke Transformation of the Recurrence c Ernst Mayr, Harald Räcke 3 Part III Data Structures Abstract Data Type An abstract data type (ADT) is defined by an interface of operations or methods that can be performed and that have a defined behavior The data types in this lecture all operate on objects that are represented by a [key, value] pair. The key comes from a totally ordered set, and we assume that there is an efficient comparison function. The value can be anything; it usually carries satellite information important for the application that uses the ADT. c Ernst Mayr, Harald Räcke 4 c Ernst Mayr, Harald Räcke 5

28 Dynamic Set Operations S. search(k): Returns pointer to object x from S with key[x] = k or null. S. insert(x): Inserts object x into set S. key[x] must not currently exist in the data-structure. S. delete(x): Given pointer to object x from S, delete x from the set. S. minimum(): Return pointer to object with smallest key-value in S. S. maximum(): Return pointer to object with largest key-value in S. S. successor(x): Return pointer to the next larger element in S or null if S is maximum. S. predecessor(x): Return pointer to the next smaller element in S or null if S is minimum. Dynamic Set Operations S. union(s ): Sets S := S S. The set S is destroyed. S. merge(s ): Sets S := S S. Requires S S =. S. split(k, S ): S := {x S key[x] k}, S := {x S key[x] > k}. S. concatenate(s ): S := S S. Requires S. maximum() S. minimum(). S. decrease-key(x, k): Replace key[x] by k key[x]. c Ernst Mayr, Harald Räcke 6 c Ernst Mayr, Harald Räcke 7 Examples of ADTs 7 Dictionary Stack: S.push(x): Insert an element. S.pop(): Return the element from S that was inserted most recently; delete it from S. S.empty(): Tell if S contains any object. Queue: S.enqueue(x): Insert an element. S.dequeue(): Return the element that is longest in the structure; delete it from S. Dictionary: S.insert(x): Insert an element x. S.delete(x): Delete the element pointed to by x. S.search(k): Return a pointer to an element e with key[e] = k in S if it exists; otherwise return null. S.empty(): Tell if S contains any object. Priority-Queue: S.insert(x): Insert an element. S.delete-min(): Return the element with lowest key-value; delete it from S. 7 Dictionary c Ernst Mayr, Harald Räcke 9
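The point of an abstract data type is that the interface is fixed while the realization can vary. As a minimal illustration (an assumption of this write-up, not lecture code), the dictionary interface below is backed by a plain unsorted Python list, with delete keyed by value rather than by pointer for simplicity; the data structures in the following sections replace such a naive realization by search trees and hash tables with much better operation costs.

```python
class ListDictionary:
    """Dictionary ADT (insert, delete, search, empty) backed by an unsorted list.

    Every operation except empty() takes linear time in the worst case; the
    point of the data structures in this part is to do much better.
    """
    def __init__(self):
        self._items = []                     # list of (key, value) pairs

    def insert(self, key, value):
        self._items.append((key, value))

    def delete(self, key):
        self._items = [(k, v) for (k, v) in self._items if k != key]

    def search(self, key):
        for k, v in self._items:
            if k == key:
                return v
        return None

    def empty(self):
        return not self._items

d = ListDictionary()
d.insert(25, "a"); d.insert(13, "b")
print(d.search(13), d.search(99), d.empty())   # -> b None False
```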

29 7. Binary Search Trees An (internal) binary search tree stores the elements in a binary tree. Each tree-node corresponds to an element. All elements in the left sub-tree of a node v have a smaller key-value than key[v] and elements in the right sub-tree have a larger-key value. We assume that all key-values are different. (External Search Trees store objects only at leaf-vertices) Examples: 7. Binary Search Trees We consider the following operations on binary search trees. Note that this is a super-set of the dictionary-operations. T. insert(x) T. delete(x) T. search(k) T. successor(x) 6 T. predecessor(x) 7 T. minimum() T. maximum() Binary Search Trees c Ernst Mayr, Harald Räcke 7. Binary Search Trees c Ernst Mayr, Harald Räcke Binary Search Trees: Searching TreeSearch(root, 7) 5 Binary Search Trees: Searching TreeSearch(root, 8) Algorithm 5 TreeSearch(x, k) : if x = null or k = key[x] return x : if k < key[x] return TreeSearch(left[x], k) 3: else return TreeSearch(right[x], k) Algorithm 5 TreeSearch(x, k) : if x = null or k = key[x] return x : if k < key[x] return TreeSearch(left[x], k) 3: else return TreeSearch(right[x], k) 7. Binary Search Trees c Ernst Mayr, Harald Räcke 7. Binary Search Trees c Ernst Mayr, Harald Räcke 3

30 Binary Search Trees: Minimum Binary Search Trees: Successor succ is min in right sub-tree Algorithm 6 TreeMin(x) : if x = null or left[x] = null return x : if k < key[x] return TreeSearch(left[x], k) 3: else return TreeMin(left[x]) 4 7 Algorithm 7 TreeSucc(x) : if right[x] null return TreeMin(right[x]) : y parent[x] 3: while y null and x = right[y] do 4: x y; y parent[x] 5: return y; 7. Binary Search Trees c Ernst Mayr, Harald Räcke 4 7. Binary Search Trees c Ernst Mayr, Harald Räcke 5 Binary Search Trees: Successor 5 y Binary Search Trees: Insert Insert element not in the tree. TreeInsert(root, ) 5 x succ is lowest ancestor going left to reach me Algorithm 7 TreeSucc(x) : if right[x] null return TreeMin(right[x]) : y parent[x] 3: while y null and x = right[y] do 4: x y; y parent[x] 5: return y; 7. Binary Search Trees c Ernst Mayr, Harald Räcke Search for z. At some point the search stops at a null-pointer. This is the place to insert z. Algorithm 8 TreeInsert(x, 8 z) 47 : if x = null then root[t ] x; return; : if key[x] > key[z] then 3: if left[x] = null then left[x] z; 4: else TreeInsert(left[x], z); 5: else 6: if right[x] = null then right[x] z; 7: else TreeInsert(right[x], z); 8: return 55
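The searching, minimum, successor and insert routines on this page fit naturally into a small class. The following is an illustrative sketch rather than the lecture's pseudocode verbatim: nodes store key, left, right and parent pointers, mirroring the algorithms above (satellite values are omitted for brevity).

```python
class Node:
    __slots__ = ("key", "left", "right", "parent")
    def __init__(self, key, parent=None):
        self.key, self.left, self.right, self.parent = key, None, None, parent

class BST:
    def __init__(self):
        self.root = None

    def search(self, k):
        x = self.root
        while x is not None and x.key != k:
            x = x.left if k < x.key else x.right
        return x

    def minimum(self, x):
        while x.left is not None:
            x = x.left
        return x

    def successor(self, x):
        if x.right is not None:              # successor is the minimum of the right subtree
            return self.minimum(x.right)
        y = x.parent                         # otherwise: lowest ancestor reached from the left
        while y is not None and x is y.right:
            x, y = y, y.parent
        return y

    def insert(self, k):
        y, x = None, self.root
        while x is not None:                 # walk down until a null pointer is reached
            y, x = x, (x.left if k < x.key else x.right)
        z = Node(k, parent=y)
        if y is None:
            self.root = z
        elif k < y.key:
            y.left = z
        else:
            y.right = z
        return z

t = BST()
for k in (5, 2, 8, 1, 4, 7):
    t.insert(k)
print(t.minimum(t.root).key, t.successor(t.search(5)).key)   # -> 1 7
```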

31 Binary Search Trees: Delete 5 Binary Search Trees: Delete Case : Element does not have any children Simply go to the parent and set the corresponding pointer to null. Case : Element has exactly one child Splice the element out of the tree by connecting its parent to its successor. Binary Search Trees: Delete Binary Search Trees: Delete 5 Algorithm 9 TreeDelete(z) Case 3: Element has two children 3 4 Find the successor of the element : if left[z] = null or right[z] = null : then y z else y TreeSucc(z); 3: if left[y] null 4: then x left[y] else x right[y]; 5: if x null then parent[x] parent[y]; 6: if parent[y] = null then 7: root[t ] x 8: else 9: if y = left[parent[x]] then : left[parent[y]] x : else : right[parent[y]] x 3: if y z then copy y-data to z select y to splice out x is child of y (or null) parent[x] is correct fix pointer to x fix pointer to x Splice successor out of the tree Replace content of element by content of successor 7. Binary Search Trees c Ernst Mayr, Harald Räcke 8

32 Balanced Binary Search Trees All operations on a binary search tree can be performed in time O(h), where h denotes the height of the tree. However the height of the tree may become as large as Θ(n). Balanced Binary Search Trees With each insert- and delete-operation perform local adjustments to guarantee a height of O(log n). AVL-trees, Red-black trees, Scapegoat trees, -3 trees, B-trees, AA trees, Treaps similar: SPLAY trees. 7. Red Black Trees Definition A red black tree is a balanced binary search tree in which each internal node has two children. Each internal node has a colour, such that. The root is black.. All leaf nodes are black. 3. For each node, all paths to descendant leaves contain the same number of black nodes. 4. If a node is red then both its children are black. The null-pointers in a binary search tree are replaced by pointers to special null-vertices, that do not carry any object-data 7. Binary Search Trees c Ernst Mayr, Harald Räcke 9 7. Red Black Trees c Ernst Mayr, Harald Räcke Red Black Trees: Example Red Black Trees Lemma A red-black tree with n internal nodes has height at most O(log n). Definition 3 The black height bh(v) of a node v in a red black tree is the number of black nodes on a path from v to a leaf vertex (not counting v). 7 We first show: Lemma 4 A sub-tree of black height bh(v) in a red black tree contains at least bh(v) internal vertices. 7. Red Black Trees c Ernst Mayr, Harald Räcke 7. Red Black Trees c Ernst Mayr, Harald Räcke

33 7. Red Black Trees Proof of Lemma 4. Induction on the height of v. base case (height(v) = ) If height(v) (maximum distance btw. v and a node in the sub-tree rooted at v) is then v is a leaf. The black height of v is. The sub-tree rooted at v contains = bh(v) inner vertices. 7. Red Black Trees Proof (cont.) induction step Supose v is a node with height(v) >. v has two children with strictly smaller height. These children (c, c ) either have bh(c i ) = bh(v) or bh(c i ) = bh(v). By induction hypothesis both sub-trees contain at least bh(v) internal vertices. Then T v contains at least ( bh(v) ) + bh(v) vertices. 7. Red Black Trees c Ernst Mayr, Harald Räcke 3 7. Red Black Trees c Ernst Mayr, Harald Räcke 4 7. Red Black Trees 7. Red Black Trees Proof of Lemma. Let h denote the height of the red-black tree, and let p denote a path from the root to the furthest leaf. At least half of the node on p must be black, since a red node must be followed by a black node. We need to adapt the insert and delete operations so that the red black properties are maintained. Hence, the black height of the root is at least h/. The tree contains at least h/ internal vertices. Hence, h/ n. Hence, h log n + = O(log n). 7. Red Black Trees c Ernst Mayr, Harald Räcke 5 7. Red Black Trees c Ernst Mayr, Harald Räcke 6

34 Rotations Red Black Trees: Insert RB-Insert(root, 8) 5 The properties will be maintained through rotations: 3 3 x z z LeftRotate(x) x A RightRotate(z) C B C A B Insert: 7 8 z first make a normal insert into a binary search tree then fix red-black properties 7. Red Black Trees c Ernst Mayr, Harald Räcke 7 7. Red Black Trees c Ernst Mayr, Harald Räcke 8 Red Black Trees: Insert Invariant of the fix-up algorithm: z is a red node the black-height property is fulfilled at every node the only violation of red-black properties occurs at z and parent[z] either both of them are red (most important case) or the parent does not exist (violation since root must be black) If z has a parent but no grand-parent we could simply color the parent/root black; however this case never happens. Red Black Trees: Insert Algorithm InsertFix(z) : while parent[z] null and col[parent[z]] = red do : if parent[z] = left[gp[z]] then 3: uncle right[grandparent[z]] 4: if col[uncle] = red then 5: col[p[z]] black; col[u] black; 6: col[gp[z]] red; z grandparent[z]; 7: else 8: if z = right[parent[z]] then 9: z p[z]; LeftRotate(z); z in left subtree of grandparent : col[p[z]] black; col[gp[z]] red; Case : uncle red Case : uncle black a: z right child b: z left child : RightRotate(gp[z]); : else same as then-clause but right and left exchanged 3: col(root[t ]) black; 7. Red Black Trees c Ernst Mayr, Harald Räcke 9 7. Red Black Trees c Ernst Mayr, Harald Räcke 3
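A rotation only rewires a constant number of pointers and preserves the search-tree order of the subtrees A, B, C shown above. Below is a sketch of LeftRotate on nodes with parent pointers; it is a hypothetical helper on top of the binary-search-tree sketch from the previous section, not the lecture's code, and RightRotate is symmetric.

```python
def left_rotate(tree, x):
    """Rotate the edge between x and its right child z to the left.

    The subtrees keep their in-order positions: A stays under x, B moves from
    z's left side to x's right side, and C stays under z.
    """
    z = x.right
    x.right = z.left                 # B becomes x's right subtree
    if z.left is not None:
        z.left.parent = x
    z.parent = x.parent              # z takes x's place under x's parent
    if x.parent is None:
        tree.root = z
    elif x is x.parent.left:
        x.parent.left = z
    else:
        x.parent.right = z
    z.left = x                       # x becomes z's left child
    x.parent = z
```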

35 Case : Red Uncle Case b: Black uncle and z is left child 3 z A B 3 6 C D E uncle. rotate around grandparent. re-colour to ensure that black height property holds 3. you have a red black tree A 3 z B 6 C 3 3 z 3 D E. recolour. move z to grand-parent 3. invariant is fulfilled for new z z 6 uncle 4. you made progress A B C D E A B C D E 7. Red Black Trees c Ernst Mayr, Harald Räcke 3 7. Red Black Trees c Ernst Mayr, Harald Räcke 3 Case a: Black uncle and z is right child Red Black Trees: Insert. rotate around parent 3. move z downwards 3. you have case b. 3 z 6 Running time: Only Case may repeat; but only h/ many steps, where h is A B C D E the height of the tree. Case a Case b red-black tree Case b red-black tree z uncle Performing step one O(log n) times and every other step at most once, we get a red-black tree. Hence O(log n) re-colourings and at most rotations. A B C D E 7. Red Black Trees c Ernst Mayr, Harald Räcke Red Black Trees c Ernst Mayr, Harald Räcke 34

36 Red Black Trees: Delete Red Black Trees: Delete 5 First do a standard delete If the spliced out node x was red everyhting is fine If it was black there may be the following problems. Parent and child of x were red; two adjacent red vertices If you delete the root, the root may now be red Every path from an ancestor of x to a descendant leaf of x changes the number of black nodes. Black height property might be violated. 7 Case 3: Element has two children do normal delete 4 when replacing content by content of successor, don t 7. Red Black Trees c Ernst Mayr, Harald Räcke 35 change color of node Red Black Trees: Delete 5 Red Black Trees: Delete Invariant of the fix-up algorihtm the node z is black if we assign a fake black unit to the edge from z to its parent then the black-height property is fulfilled 7 z 4 Goal: make rotations in such a way that you at some point can remove the fake black unit from the edge. Delete: deleting black node messes up black-height property if z is red, we can simply color it black and everything is fine the problem is if z is black (e.g. a dummy-leaf); we call a fix-up procedure to fix the problem. 7. Red Black Trees c Ernst Mayr, Harald Räcke 38

37 Case : Sibling of z is red Case : Sibling is black with two black children z a b c sibling z a b c sibling Here b is either black or red. If it is red we are in a special case that directly leads to a red-black tree. d e d e A B C D E F A B C D E F. left-rotate around parent of z. recolor nodes b and c 3. the new sibling is black (and parent of z is red) 4. Case (special), or Case 3, or Case 4 z a b A B C D d c E e F. re-color node c. move fake black unit upwards 3. move z upwards 4. we made progress 5. if b is red we color it black and are done A a B z b c d e C D E F Case 3: Sibling black with one black child to the right Case 4: Sibling is black with red right child. do a right-rotation at sibling. recolor c and d 3. new sibling is black with red right child (Case 4) z a A B C b d D c E e F A z a B b c d sibling C D E F e Here b and d are either red or black but have possibly different colors. We recolor c by giving it the color of b. z a b c sibling Again the blue color of b indicates that it can either be black or red.. left-rotate around b. recolor nodes b, c, and e 3. remove the fake black unit b c e A B d e C D E F 4. you have a valid red black tree z a A B C D d E F

Part II. Foundations. Ernst Mayr, Harald Räcke 16

Part II. Foundations. Ernst Mayr, Harald Räcke 16 Part II Foundations Ernst Mayr, Harald Räcke 16 Vocabularies a b a times b a multiplied by b a into b a b a divided by b a by b a over b a b (a: numerator (Zähler), b: denominator (Nenner)) a raised to

More information

Part I. Efficient Algorithms and Data Structures. Organizational Matters. Part I. Organizational Matters WS 2013/14. Harald Räcke. Winter Term 2013/14

Part I. Efficient Algorithms and Data Structures. Organizational Matters. Part I. Organizational Matters WS 2013/14. Harald Räcke. Winter Term 2013/14 WS 3/4 Efficient Algorithms and Data Structures Harald Räcke Part I Organizational Matters Fakultät für Informatik TU München http://www4.in.tum.de/lehre/3ws/ea/ Winter Term 3/4 Ernst Mayr, Harald Räcke

More information

The null-pointers in a binary search tree are replaced by pointers to special null-vertices, that do not carry any object-data

The null-pointers in a binary search tree are replaced by pointers to special null-vertices, that do not carry any object-data Definition 1 A red black tree is a balanced binary search tree in which each internal node has two children. Each internal node has a color, such that 1. The root is black. 2. All leaf nodes are black.

More information

7 Dictionary. EADS c Ernst Mayr, Harald Räcke 109

7 Dictionary. EADS c Ernst Mayr, Harald Räcke 109 7 Dictionary Dictionary: S.insert(x): Insert an element x. S.delete(x): Delete the element pointed to by x. S.search(k): Return a pointer to an element e with key[e] = k in S if it exists; otherwise return

More information

7 Dictionary. Harald Räcke 125

7 Dictionary. Harald Räcke 125 7 Dictionary Dictionary: S. insert(x): Insert an element x. S. delete(x): Delete the element pointed to by x. S. search(k): Return a pointer to an element e with key[e] = k in S if it exists; otherwise

More information

7 Dictionary. Ernst Mayr, Harald Räcke 126

7 Dictionary. Ernst Mayr, Harald Räcke 126 7 Dictionary Dictionary: S. insert(x): Insert an element x. S. delete(x): Delete the element pointed to by x. S. search(k): Return a pointer to an element e with key[e] = k in S if it exists; otherwise

More information

Part III. Data Structures. Abstract Data Type. Dynamic Set Operations. Dynamic Set Operations

Part III. Data Structures. Abstract Data Type. Dynamic Set Operations. Dynamic Set Operations Abstract Data Type Part III Data Structures An abstract data type (ADT) is defined by an interface of operations or methods that can be performed and that have a defined behavior. The data types in this

More information

7 Dictionary. 7.1 Binary Search Trees. Binary Search Trees: Searching. 7.1 Binary Search Trees

7 Dictionary. 7.1 Binary Search Trees. Binary Search Trees: Searching. 7.1 Binary Search Trees 7 Dictionary Dictionary: S.insert(): Insert an element. S.delete(): Delete the element pointed to by. 7.1 Binary Search Trees An (internal) binary search tree stores the elements in a binary tree. Each

More information

Each internal node v with d(v) children stores d 1 keys. k i 1 < key in i-th sub-tree k i, where we use k 0 = and k d =.

Each internal node v with d(v) children stores d 1 keys. k i 1 < key in i-th sub-tree k i, where we use k 0 = and k d =. 7.5 (a, b)-trees 7.5 (a, b)-trees Definition For b a an (a, b)-tree is a search tree with the following properties. all leaves have the same distance to the root. every internal non-root vertex v has at

More information

8 Priority Queues. 8 Priority Queues. Prim s Minimum Spanning Tree Algorithm. Dijkstra s Shortest Path Algorithm

8 Priority Queues. 8 Priority Queues. Prim s Minimum Spanning Tree Algorithm. Dijkstra s Shortest Path Algorithm 8 Priority Queues 8 Priority Queues A Priority Queue S is a dynamic set data structure that supports the following operations: S. build(x 1,..., x n ): Creates a data-structure that contains just the elements

More information

Growth of Functions (CLRS 2.3,3)

Growth of Functions (CLRS 2.3,3) Growth of Functions (CLRS 2.3,3) 1 Review Last time we discussed running time of algorithms and introduced the RAM model of computation. Best-case running time: the shortest running time for any input

More information

Data Structures and and Algorithm Xiaoqing Zheng

Data Structures and and Algorithm Xiaoqing Zheng Data Structures and Algorithm Xiaoqing Zheng zhengxq@fudan.edu.cn Trees (max heap) 1 16 2 3 14 10 4 5 6 7 8 7 9 3 8 9 10 2 4 1 PARENT(i) return i /2 LEFT(i) return 2i RIGHT(i) return 2i +1 16 14 10 8 7

More information

Algorithm efficiency analysis

Algorithm efficiency analysis Algorithm efficiency analysis Mădălina Răschip, Cristian Gaţu Faculty of Computer Science Alexandru Ioan Cuza University of Iaşi, Romania DS 2017/2018 Content Algorithm efficiency analysis Recursive function

More information

9 Union Find. 9 Union Find. List Implementation. 9 Union Find. Union Find Data Structure P: Maintains a partition of disjoint sets over elements.

9 Union Find. 9 Union Find. List Implementation. 9 Union Find. Union Find Data Structure P: Maintains a partition of disjoint sets over elements. Union Find Data Structure P: Maintains a partition of disjoint sets over elements. P. makeset(x): Given an element x, adds x to the data-structure and creates a singleton set that contains only this element.

More information

So far we have implemented the search for a key by carefully choosing split-elements.

So far we have implemented the search for a key by carefully choosing split-elements. 7.7 Hashing Dictionary: S. insert(x): Insert an element x. S. delete(x): Delete the element pointed to by x. S. search(k): Return a pointer to an element e with key[e] = k in S if it exists; otherwise

More information

Asymptotic Notation. Asymptotic notation deals with the behaviour of a function in the limit, that is, for sufficiently large values of its parameter; it is often used when analysing the run time of an algorithm. For example, t(n) = O(f(n)) means that t(n) ≤ c·f(n) for all n ≥ n_0, for some positive real constant c and integer threshold n_0.
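For reference, the standard definitions behind this notation can be summarized as follows (a summary in the usual form, not quoted from the preview above):

```latex
% Standard asymptotic definitions (summary; not quoted from the preview above).
\[
  t(n) \in O(f(n)) \iff \exists\, c > 0,\ n_0 \in \mathbb{N}:\ t(n) \le c\,f(n)\ \text{for all } n \ge n_0
\]
\[
  t(n) \in \Omega(f(n)) \iff \exists\, c > 0,\ n_0 \in \mathbb{N}:\ t(n) \ge c\,f(n)\ \text{for all } n \ge n_0
\]
\[
  t(n) \in \Theta(f(n)) \iff t(n) \in O(f(n)) \text{ and } t(n) \in \Omega(f(n))
\]
```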

10 van Emde Boas Trees. Dynamic set data structure S supporting: S.insert(x), S.delete(x), S.search(x), S.min(), S.max(), S.succ(x), S.pred(x). For this chapter we ignore the problem of storing satellite data.
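To make the interface concrete before the van Emde Boas construction, here is a naive sorted-list baseline that supports all seven operations; the class name and the use of Python's bisect module are assumptions. The point of van Emde Boas trees is to beat this baseline, achieving O(log log u) time per operation for integer keys from a universe {0, ..., u-1}, whereas the sketch below needs O(n) time for insert and delete.

```python
import bisect

# Naive sorted-list baseline for the dynamic-set interface above (illustrative only).

class SortedListSet:
    def __init__(self):
        self.a = []

    def insert(self, x):
        i = bisect.bisect_left(self.a, x)
        if i == len(self.a) or self.a[i] != x:
            self.a.insert(i, x)

    def delete(self, x):
        i = bisect.bisect_left(self.a, x)
        if i < len(self.a) and self.a[i] == x:
            del self.a[i]

    def search(self, x):
        i = bisect.bisect_left(self.a, x)
        return i < len(self.a) and self.a[i] == x

    def min(self):
        return self.a[0] if self.a else None

    def max(self):
        return self.a[-1] if self.a else None

    def succ(self, x):                        # smallest element strictly greater than x
        i = bisect.bisect_right(self.a, x)
        return self.a[i] if i < len(self.a) else None

    def pred(self, x):                        # largest element strictly smaller than x
        i = bisect.bisect_left(self.a, x)
        return self.a[i - 1] if i > 0 else None
```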

7.3 AVL-Trees. Definition 15. AVL-trees are binary search trees that fulfill the following balance condition: for every node v, |height(left sub-tree(v)) - height(right sub-tree(v))| ≤ 1. Lemma 16 bounds the number of internal nodes an AVL-tree of height h contains.
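To make Definition 15 concrete, here is a small checker that verifies the balance condition on a binary tree; the Node class, the height convention (the empty tree has height -1), and the function names are assumptions for illustration, and the recomputation of heights makes it quadratic in the worst case, which is fine for a sketch.

```python
# Checks the AVL balance condition: for every node v,
# |height(left subtree(v)) - height(right subtree(v))| <= 1.
# Node layout and names are illustrative assumptions.

class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def height(v):
    """Height of the subtree rooted at v; the empty tree has height -1."""
    if v is None:
        return -1
    return 1 + max(height(v.left), height(v.right))

def is_avl_balanced(v):
    if v is None:
        return True
    if abs(height(v.left) - height(v.right)) > 1:
        return False
    return is_avl_balanced(v.left) and is_avl_balanced(v.right)

# Example: a path of three nodes violates the condition.
# is_avl_balanced(Node(3, Node(2, Node(1)))) == False
```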

The master theorem: for recurrences of the form T(n) = a T(n/b) + O(n^d) with a ≥ 1 and b > 1, the solution is O(n^d log n) if a = b^d, O(n^d) if a < b^d, and O(n^(log_b a)) if a > b^d. (CS161, Lecture 4: Median, Selection, and the Substitution Method; scribes: Albert Chen and Juliana Cook (2015), Sam Kim (2016), Gregory Valiant (2017); January 23, 2017.)
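As worked examples (not taken from the preview), applying the theorem to two standard recurrences:

```latex
% Worked examples (illustrative, not from the preview) of the master theorem above.
\[
  T(n) = 2\,T(n/2) + O(n):\quad a = 2,\; b = 2,\; d = 1,\; a = b^d
  \;\Rightarrow\; T(n) = O(n^d \log n) = O(n \log n).
\]
\[
  T(n) = 3\,T(n/2) + O(n):\quad a = 3,\; b = 2,\; d = 1,\; a > b^d
  \;\Rightarrow\; T(n) = O(n^{\log_b a}) = O(n^{\log_2 3}) \approx O(n^{1.585}).
\]
```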
