
Computational Complexity
Lecture 02 - Basic Complexity Analysis

Tom Kelsey & Susmit Sarkar
School of Computer Science, University of St Andrews
twk@st-andrews.ac.uk
Tom Kelsey and Susmit Sarkar CS3052-CC

This lecture

- Time complexity analysis of actual algorithms
- Emphasise use of O(n), Ω(n), and Θ(n)
- Time complexity is a combination of two factors:
  1. the time taken to do simple tasks on units of input
  2. the frequency of those tasks
- The first of these is system dependent; the second relates to the algorithm
- So we abstract away from the first and concentrate on the frequencies
- For this lecture, a computer is what you think it is

Algorithm 1 generic for-loop code
Require: Input X with |X| = n
 1: Do c_a things to initialise
 2: for i = 1 to n do
 3:   c_i things to X_i
 4:   for j = 1 to n do
 5:     c_j things to X_j
 6:     c_ij things to X_i and X_j
 7:     for k = 1 to n do
 8:       c_k things to X_k
 9:       c_ik things to X_i and X_k
10:       c_jk things to X_j and X_k
11:       c_ijk things to X_i, X_j and X_k
12:     end for
13:   end for
14: end for
15: Do c_b things to return result
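The statement frequencies in Algorithm 1 can be checked directly. This is my own sketch (the function and counter names are illustrative, not from the slides): it counts how often a statement at each loop depth executes.

```python
def loop_frequencies(n):
    """Count executions of a statement at each loop depth of Algorithm 1."""
    depth1 = depth2 = depth3 = 0
    for i in range(n):
        depth1 += 1              # the c_i line runs n times
        for j in range(n):
            depth2 += 1          # the c_j / c_ij lines run n^2 times
            for k in range(n):
                depth3 += 1      # the c_k / c_ik / c_jk / c_ijk lines run n^3 times
    return depth1, depth2, depth3

assert loop_frequencies(7) == (7, 49, 343)
```

Weighting each frequency by its per-statement cost gives exactly the constant, linear, quadratic and cubic terms of T(n) on the next slide.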

Exact calculation of T(n)

- Constant time: (c_a + c_b)n^0
- Linear time: (c_i + c_j + c_k)n
- Quadratic time: (c_ij + c_jk + c_ik)n^2
- Cubic time: (c_ijk)n^3
- Large c means either "do many cheap things" or "do a few expensive things", or "do many expensive things"
- If c_ijk is small, and (say) (c_ij + c_jk + c_ik) is large, then the quadratic term will dominate for small n
- After some n_0 the cubic term will dominate, for any positive c_ijk

Exact calculation of T(n)

- Suppose (c_a + c_b) = 1000
- Suppose (c_i + c_j + c_k) = 100
- Suppose (c_ij + c_jk + c_ik) = 3
- Suppose (c_ijk) = 2
- n has to be more than 10 before the linear term dominates the constant effort, and more than 7 before the n^3 term dominates everything (since 8^3 = 512)
- T(n) ≤ 47n^4 for all large enough n, so T(n) ∈ O(n^4)
- T(n) ≥ 0.01n^2 for all n, so T(n) ∈ Ω(n^2)
- If n_0 = 8 and c = c_ijk, then T(n) ∈ Ω(n^3) and O(n^3), hence T(n) ∈ Θ(n^3)

[Asymptotic Order of Growth: slides by K. Wayne, Princeton]
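The crossover claims above are easy to verify numerically. A small sketch using the slide's constants (the variable names are mine):

```python
# T(n) with the slide's constants: (c_a+c_b)=1000, (c_i+c_j+c_k)=100,
# (c_ij+c_jk+c_ik)=3, c_ijk=2
def T(n):
    return 1000 + 100 * n + 3 * n ** 2 + 2 * n ** 3

# The linear term beats the constant term only once n > 10.
assert 100 * 10 <= 1000 < 100 * 11
# The cubic term passes the constant term at n = 8 (2 * 8^3 = 1024 > 1000).
assert 2 * 7 ** 3 < 1000 < 2 * 8 ** 3
# First n where the cubic term exceeds all the other terms combined:
cross = next(n for n in range(1, 100) if 2 * n ** 3 > T(n) - 2 * n ** 3)
assert cross == 11
```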

[Asymptotic Order of Growth: slides by K. Wayne, Princeton]

Simplifying rules

- If some function g is an upper bound for your cost, then any upper bound for g is also an upper bound for your cost
- If some function g is a lower bound... (likewise)
- If some function g is a tight bound... (likewise)
- If some function h is an upper bound for two costs, then h is also an upper bound for the sum of those costs
- If some function h is a lower bound... (likewise)
- If some function h is a tight bound... (likewise)
- If some functions h and g are upper/lower/tight bounds for two costs, then the maximum of these is also an upper/lower/tight bound for the sum of those costs
- If some functions h and g are upper/lower/tight bounds for two costs, then the product of these is also an upper/lower/tight bound for the product of those costs

[Asymptotic Order of Growth: slides by K. Wayne, Princeton]
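The "take the maximum" rule for sums rests on a pointwise fact: for non-negative functions, max(g, h) ≤ g + h ≤ 2·max(g, h), so the sum and the maximum differ by at most a constant factor. A quick check with two hypothetical cost bounds of my own choosing:

```python
def g(n): return n * n        # hypothetical bound for one cost
def h(n): return 10 * n       # hypothetical bound for another cost

# max(g,h) <= g + h <= 2 * max(g,h) pointwise,
# so O(g + h) = O(max(g, h))
for n in range(1, 1000):
    s, m = g(n) + h(n), max(g(n), h(n))
    assert m <= s <= 2 * m
```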

[Asymptotic Order of Growth: slides by K. Wayne, Princeton]

Algorithm 2 example 1
Require: Input X with |X| = n
 1: sum = 0
 2: for i = 1 to n do
 3:   for j = 1 to n do
 4:     sum ← sum + 1
 5:   end for
 6: end for
 7: for k = 1 to n do
 8:   X_k ← k
 9: end for
10: return X

- Assignment is constant c_1
- Final for-loop takes c_2·n time, i.e. Θ(n) time
- The double loop involves each combination of (i, j) values, i.e. Θ(n^2) time
- Total time is Θ(c_1 + c_2·n + c_3·n^2)
- Using a simplifying rule, we only need the maximum of these functions
- So the time is Θ(n^2)

Algorithm 3 example 2
 1: sum1 = 0
 2: for i = 1 to n do
 3:   for j = 1 to n do
 4:     sum1 ← sum1 + 1
 5:   end for
 6: end for
 7: sum2 = 0
 8: for i = 1 to n do
 9:   for j = 1 to i do
10:     sum2 ← sum2 + 1
11:   end for
12: end for

- Assignments are constant c_1
- First double loop takes c_2·n^2 time, i.e. Θ(n^2) time
- Second double loop takes c_3·(1 + 2 + ... + n) = c_3·n(n + 1)/2 time, i.e. Θ(n^2) time
- Total time is Θ(c_1 + c_2·n^2 + c_3·n^2)
- Using a simplifying rule, we only need the maximum of these functions
- So the time is Θ(n^2), even though the 2nd loop does half as much work
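Both counts can be confirmed by running the loops. A sketch of my own (function names are illustrative): the square loop does exactly n^2 increments, the triangular loop exactly n(n+1)/2, and both are Θ(n^2).

```python
def example1(n):
    """Algorithm 2: full double loop, then a single Theta(n) loop."""
    total = 0
    for i in range(n):
        for j in range(n):
            total += 1
    X = [k + 1 for k in range(n)]   # the final for-loop
    return total, X

def example2_inner_count(n):
    """Algorithm 3's second double loop: j runs 1..i (triangular)."""
    total = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            total += 1
    return total

assert example1(50)[0] == 50 * 50
assert example2_inner_count(50) == 50 * 51 // 2   # half the work, same order
```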

Algorithm 4 example 3
Require: n is a power of 2
 1: sum1 = 0
 2: for i = 1; i ≤ n; i = 2·i do
 3:   for j = 1 to n do
 4:     sum1 ← sum1 + 1
 5:   end for
 6: end for
 7: sum2 = 0
 8: for i = 1; i ≤ n; i = 2·i do
 9:   for j = 1 to i do
10:     sum2 ← sum2 + 1
11:   end for
12: end for

- Assignments are constant c_1
- First double loop takes c_2·n·log_2(n) time, i.e. Θ(n log n) time
- Second double loop takes c_3·n time, i.e. Θ(n) time
- Total time is Θ(c_1 + c_2·n log n + c_3·n)
- Using a simplifying rule, we only need the maximum of these functions
- So the time is Θ(n log n)

Problems with exact calculations

- Not always possible; in fact, usually not possible
- In the examples, I used the two closed form solutions for summations given below
- These are not available, in general
- "Exact" only refers to frequencies, which are part of the algorithm
- It says nothing about the assumed constant costs, which are part of the system being used
- As we'll see, use of e.g. fast memory can drastically alter performance

∑_{i=1}^{log(n)} n = n log(n),   ∑_{i=0}^{log(n)−1} 2^i = 2^{log(n)} − 1 = n − 1

More problems with exact calculations

- Neither exponents nor exponands have to be integral
- Strassen's algorithm is O(n^{log_2 7}) ≈ O(n^2.807)
- 3-SAT using Speckenmeyer & Monien's 1985 method is O(c^n) for a constant c ≈ 1.62
- It is usually straightforward to derive an upper bound
- So we often see big-O used as "the tightest bound we have derived so far"
- Often good theoretical lower bounds can be found, so we know that the exact worst-case time complexity is between the two
- Max Independent Set using a method due to Fomin et al. is O(c^n) for a c a little over 1.2
- A lower bound of Ω(c'^n), for a smaller constant c' > 1, is known for the (unknown) worst-case running time
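The two doubling-loop counts match the closed forms above, and that is easy to check by running them. This is my own sketch: i takes the log_2(n)+1 values 1, 2, 4, ..., n, so the first loop does n(log_2(n)+1) increments and the second does 1 + 2 + ... + n = 2n − 1.

```python
import math

def doubling_counts(n):
    """Increment counts for Algorithm 4; n must be a power of 2."""
    c1 = 0
    i = 1
    while i <= n:               # i = 1, 2, 4, ..., n
        for j in range(n):      # inner loop always runs n times
            c1 += 1
        i *= 2
    c2 = 0
    i = 1
    while i <= n:
        for j in range(i):      # inner loop runs i times
            c2 += 1
        i *= 2
    return c1, c2

n = 1024
c1, c2 = doubling_counts(n)
assert c1 == n * (int(math.log2(n)) + 1)   # Theta(n log n)
assert c2 == 2 * n - 1                     # Theta(n): 1+2+4+...+n = 2n-1
```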

What about average- and best-case analysis?

- All our contrived examples take an exact number of steps for fixed n
- Most algorithms have if-statements: if some conditions hold, stop, skip, or do less work
- Roughly speaking, this is how we optimise our code
- Worst-case analysis assumes that none of these shortcuts happen; we usually have to devise pathological inputs
- Best-case typically assumes that all shortcuts happen; we devise optimal inputs
- Average-case involves empirical evaluation
- In principle this is simple: run the code loads of times on loads of inputs, and find a function that best describes the data
- But now we can't avoid system issues such as compiler flags, L2 cache use, pointer tricks, etc.

Empirical problems

- In Computer Science, the best framework for empirical evaluation (a computer system) is far from ideal
- I can run the same code with the same inputs on the same system and get wildly different results: the garbage collector kicks in, a cron job starts, etc.
- We find it hard to control for all the possibly-confounding variables
- Many published CS methods cannot be replicated
- Serious CS research communities do their best to overcome these problems: libraries of test problems, competitions at conferences, evaluating old vs new on the same system, etc.
- CS is not as empirically rigorous as Physics, Chemistry, Medicine or Biology: sometimes due to unscientific practice by people who should know better, often for very good reasons

Divide & conquer

- The last few slides motivate why courses on this topic usually start with abstract models of computation
- I want to make sure that you are comfortable with the terminology and results I use in those lectures
- So we're continuing with analysis of actual algorithms
- Recursive methods are used extensively; some languages only allow recursive functions
- The analysis for for- and while-loops doesn't work
- Many recursive methods involve chopping the input into manageable chunks, then doing something to the chunks, instead of attacking the input as a monolithic entity
- This algorithmic design pattern is known as divide & conquer
- We can often deduce complexity results from the associated recurrence relations

Algorithm 5 Mergesort
1: procedure MSORT(X, L, R)
2:   mid ← (L + R)/2
3:   if L < R then
4:     MSORT(X, L, mid)
5:     MSORT(X, mid + 1, R)
6:     COMBINE(X, L, mid, R)
7:   end if
8: end procedure

- X consists of n comparables
- Strictly speaking, getting R is Θ(n) itself
- Let T(n) be the time needed by MSORT
- Clearly T(1) = O(1)
- T(n) = 2T(n/2) + O(n): two recursive calls, each applied to half the input; the O(n) is for COMBINE
- Write the O(n) as n to simplify things

Mergesort: solve the RR by hand

T(n) = 2T(n/2) + n
     = 2[2T(n/4) + n/2] + n = 4T(n/4) + 2n
     = 4[2T(n/8) + n/4] + 2n = 8T(n/8) + 3n
     = ... = 16T(n/16) + 4n = ...
     = 2^k T(n/2^k) + kn
     = 2^{log_2(n)} T(1) + (log_2(n))n   (taking k = log_2(n))
     = n + (log_2(n))n
     = O(n log_2(n))

Problems with solving by analysis

- That was a lot of effort for a simple result, and it might not generalise to other procedures
- We needed to make a simplifying assumption, and we still lack an induction proof that the solution of the RR is of the form n log(n)
- Deriving the starting RR was simple, though: count the recursive calls, spot the rate of division of the input, and calculate the work needed at each conquer stage
- We'd like to be able to solve these RRs by inspection of a case split most of the time
- It turns out that this can be done
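The MSORT procedure above can be fleshed out into runnable code. The slides do not give COMBINE, so a standard O(n) merge is filled in here as a plausible sketch (0-indexed, Python):

```python
def msort(X, L, R):
    """In-place mergesort following the slide's MSORT (0-indexed)."""
    if L < R:
        mid = (L + R) // 2
        msort(X, L, mid)
        msort(X, mid + 1, R)
        # COMBINE: merge the two sorted halves, O(n) work at this level
        merged, i, j = [], L, mid + 1
        while i <= mid and j <= R:
            if X[i] <= X[j]:
                merged.append(X[i]); i += 1
            else:
                merged.append(X[j]); j += 1
        merged.extend(X[i:mid + 1])
        merged.extend(X[j:R + 1])
        X[L:R + 1] = merged

data = [5, 2, 9, 1, 5, 6, 0]
msort(data, 0, len(data) - 1)
assert data == [0, 1, 2, 5, 5, 6, 9]
```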

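The Master Theorem slides referenced on the following pages were images (K. Wayne's Princeton slides), so the statement itself is missing from this transcription. For reference, the standard form matching the case splits used below is (CLRS statement, with the log^k refinement of case 2):

```latex
T(n) = a\,T(n/b) + f(n), \qquad a \ge 1,\ b > 1.
```
```latex
\text{Case 1: } f(n) = O\!\left(n^{\log_b a - \varepsilon}\right)
  \text{ for some } \varepsilon > 0
  \;\Rightarrow\; T(n) = \Theta\!\left(n^{\log_b a}\right).
```
```latex
\text{Case 2: } f(n) = \Theta\!\left(n^{\log_b a} \log^k n\right)
  \text{ for some } k \ge 0
  \;\Rightarrow\; T(n) = \Theta\!\left(n^{\log_b a} \log^{k+1} n\right).
```
```latex
\text{Case 3: } f(n) = \Omega\!\left(n^{\log_b a + \varepsilon}\right)
  \text{ for some } \varepsilon > 0,
  \text{ and } a\,f(n/b) \le c\,f(n)
  \text{ for some } c < 1 \text{ and all large } n
  \;\Rightarrow\; T(n) = \Theta(f(n)).
```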
Mergesort using the MT

- T(n) = 2T(n/2) + n, so a = b = 2 and f(n) = n
- Case 1: is n = O(n^{log_2(2) − ε}) = O(n^{1−ε}) for some ε > 0? No
- Case 2: is n = Θ(n^{log_2(2)} log^k(n)) = Θ(n log^k(n)) for some k ≥ 0? Yes
- k = 0 works, so T(n) = Θ(n log(n))

Algorithm 6 Example: binary search
 1: procedure BINSCH(X, lo, hi, x)
 2:   mid ← lo + (hi − lo)/2
 3:   if x < X_mid then
 4:     return BINSCH(X, lo, mid − 1, x)
 5:   else if x > X_mid then
 6:     return BINSCH(X, mid + 1, hi, x)
 7:   else
 8:     return mid
 9:   end if
10: end procedure

- X consists of n ordered comparables
- As before, we ignore the cost of getting the input
- Let T(n) be the time needed by BINSCH
- T(1) isn't needed
- T(n) = T(n/2) + O(1): at most one recursive call at any level, each applied to half the input; the O(1) is for comparisons and returns
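BINSCH translates directly into Python. The slide's pseudocode assumes x is present in X; a lo > hi guard (my addition, not on the slide) makes the sketch total on absent keys as well:

```python
def binsch(X, lo, hi, x):
    """Recursive binary search following the slide's BINSCH."""
    if lo > hi:
        return -1                          # not on the slide: x is absent
    mid = lo + (hi - lo) // 2
    if x < X[mid]:
        return binsch(X, lo, mid - 1, x)
    elif x > X[mid]:
        return binsch(X, mid + 1, hi, x)
    else:
        return mid

X = [2, 3, 5, 7, 11, 13, 17]
assert binsch(X, 0, len(X) - 1, 11) == 4
assert binsch(X, 0, len(X) - 1, 4) == -1
```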

Binary search using the MT

- T(n) = T(n/2) + 1, so a = 1, b = 2 and f(n) = K
- Case 1: is K = O(n^{log_2(1) − ε}) = O(n^{0−ε}) for some ε > 0? No
- Case 2: is K = Θ(n^{log_2(1)} log^k(n)) = Θ(n^0 log^k(n)) = Θ(1·log^0(n)) = Θ(1) for some k ≥ 0? Yes
- k = 0 works, so T(n) = Θ(n^0 log^{0+1}(n)) = Θ(log n)

Tree Traversal & Print Paths using the MT

- Some combination of recurse left, print, and recurse right; check your notes from CS2001 last year
- T(n) = 2T(n/2) + O(1), so a = 2, b = 2 and f(n) = K
- Case 1: is K = O(n^{log_2(2) − ε}) = O(n^{1−ε}) for some ε > 0? Yes
- Any ε in (0, 1) works, so T(n) = Θ(n^{log_2(2)}) = Θ(n)
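Both recurrences can be unrolled exactly for powers of 2, and the exact solutions match the Master Theorem's orders. A sketch of my own, taking T(1) = 1 in each case:

```python
def T(n):
    """T(n) = T(n/2) + 1, T(1) = 1  (binary search), n a power of 2."""
    return 1 if n == 1 else T(n // 2) + 1

def S(n):
    """S(n) = 2S(n/2) + 1, S(1) = 1  (tree traversal), n a power of 2."""
    return 1 if n == 1 else 2 * S(n // 2) + 1

for p in range(11):
    n = 2 ** p
    assert T(n) == p + 1          # exactly log2(n) + 1: Theta(log n)
    assert S(n) == 2 * n - 1      # exactly 2n - 1: Theta(n)
```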

Master Theorem examples

- T(n) = 4T(n/2) + log(n), so a = 4, b = 2 and f(n) = log(n)
- Case 1: is log(n) = O(n^{log_2(4) − ε}) = O(n^{2−ε}) for some ε > 0? Yes
- Any ε in (0, 1) works, so T(n) = Θ(n^{log_2(4)}) = Θ(n^2)

Master Theorem examples

- T(n) = 2T(n/2) + n log(n), so a = 2, b = 2 and f(n) = n log(n)
- Case 2: is n log(n) = Θ(n^{log_2(2)} log^k(n)) = Θ(n^1 log^k(n)) = Θ(n log^1(n)) = Θ(n log(n)) for some k ≥ 0? Yes
- k = 1 works, so T(n) = Θ(n log^2(n))

Master Theorem examples

- T(n) = 6T(n/3) + n^2 log(n), so a = 6, b = 3 and f(n) = n^2 log(n)
- Case 3: is n^2 log(n) = Ω(n^{log_3(6) + ε}) for some ε > 0? Yes
- log_3(6) ≈ 1.63, so any ε < 2 − log_3(6) will do
- Still have to check the regularity condition: is 6·(n/3)^2 log(n/3) ≤ c·n^2 log(n) for some c < 1 and large enough n?
- The left side is (2/3)·n^2 log(n/3), so any c in [2/3, 1) will do, and the result holds: T(n) = Θ(n^2 log(n))
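The second example's Θ(n log^2 n) answer can be sanity-checked numerically: if the Master Theorem is right, the ratio T(n) / (n log_2(n)^2) should settle near a constant. A sketch of my own, taking T(1) = 1 and n a power of 2 (the exact ratio here tends to 1/2):

```python
import math

def T(n):
    """T(n) = 2T(n/2) + n*log2(n), with T(1) = 1, n a power of 2."""
    return 1 if n == 1 else 2 * T(n // 2) + n * math.log2(n)

def ratio(p):
    n = 2 ** p
    return T(n) / (n * math.log2(n) ** 2)

# The ratio settles near 1/2, consistent with T(n) = Theta(n log^2 n).
for p in (10, 14, 18):
    assert 0.4 < ratio(p) < 0.7
```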

Master Theorem Summary

- We have a divide-and-conquer algorithm with known rate of division, number of recursive calls, and amount of work done at each level of recursion
- If the amount of work is larger than the total recursive work, then this will dominate; for f(n) = 2^n or f(n) = n!, case 3 almost always holds
- If the amount of work is negligible at each recursion, then the recursive calls dominate: case 1
- It often happens that the amount of work adds log terms to the recursive complexity: case 2
- So we can use the Master Theorem as a shortcut, avoiding the "correct" way to solve such RRs

Master Theorem Limitations

- Not applicable to many recursive methods:
  - Sequential search: T(n) = T(n − 1) + O(1)
  - Selection sort: T(n) = T(n − 1) + O(n)
  - Naïve Fibonacci: F(n) = F(n − 1) + F(n − 2)
- An obvious source of tricky exam questions for undergraduates
- For some problems, the shortcut may be no easier than the correct method:
  - Guess an upper bound for the complexity; produce a proof by induction that validates your guess
  - Guess a lower bound...; prove by induction...
  - If both guesses were the same, great; otherwise start tightening from above and/or below

Master Theorem Avoidance

- Most divide & conquer methods have RRs of the form T(n) = aT(n/b) + f(n)
- Imagine a recursion tree with branching rate a and node values f(n/b^i) at depth i
- Assume that T(1) = 1 (since we only care about asymptotic bounds) and make a couple of other assumptions that do hold in general
- Now T(n) is the sum of all the nodes in the tree, and we can use a computer algebra system (Maple or Mathematica) to search for a closed form solution, or some simplified form:

T(n) = ∑_{i=1}^{L} a^i f(n/b^i)
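The two "decrease-by-one" recurrences the Master Theorem cannot handle are nonetheless easy to solve exactly by unrolling; a sketch of my own, taking T(0) = 0:

```python
def seq_search(n):
    """T(n) = T(n-1) + 1, T(0) = 0: one comparison per element."""
    return 0 if n == 0 else seq_search(n - 1) + 1

def sel_sort(n):
    """T(n) = T(n-1) + n, T(0) = 0: a scan of length n per pass."""
    return 0 if n == 0 else sel_sort(n - 1) + n

assert seq_search(500) == 500                # exactly n: Theta(n)
assert sel_sort(500) == 500 * 501 // 2       # exactly n(n+1)/2: Theta(n^2)
```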

Master Theorem Avoidance

- Even if there is no closed form solution, the series are often geometric: the final term is larger than all its predecessors
- In which case, the final term is the dominating function we use for the complexity
- For mergesort we have 2^i nodes at each of log(n) levels, each with value n/2^i:

T(n) = ∑_{i=1}^{log(n)} 2^i · (n/2^i) = ∑_{i=1}^{log(n)} n

- Using the closed form from an earlier slide, T(n) = Θ(n log(n))

Naïve Fibonacci

- Possibly mentioned in CS2001 last year
- Without memoisation, we recompute each sub-value on every recursive branch in which it appears
- Which is either all of them or nearly all of them
- So we conjecture that F(n) ∈ O(r^n) for some r > 1
- F(n) = F(n − 1) + F(n − 2) ≤ αr^{n−1} + αr^{n−2} (our guess) ≤ αr^n?
- So we want r^2 − r − 1 ≥ 0, which is fine since the largest root is the golden ratio φ
- Now use the base cases: F(0) = 0 ≤ αφ^0 and F(1) = 1 ≤ αφ^1, which hold with α = 1/φ
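The upper bound F(n) ≤ αφ^n with α = 1/φ can be spot-checked numerically. A sketch of my own (an iterative fib to keep the check itself fast), with a tiny tolerance for floating-point arithmetic:

```python
phi = (1 + 5 ** 0.5) / 2      # the golden ratio

def fib(n):
    """Iterative Fibonacci: F(0) = 0, F(1) = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# F(n) <= (1/phi) * phi^n, i.e. F(n) <= phi^(n-1), for all n checked
for n in range(60):
    assert fib(n) <= (1 / phi) * phi ** n + 1e-6
```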

Naïve Fibonacci

- We now have the two numbers needed to show that F(n) is O(r^n):
  - r = φ is the exponand greater than 1
  - α = 1/φ is the positive constant, with F(n) ≤ αr^n ∀n
- Trying the same with lower bounds F(n) ≥ βr^n is not easy
- But we don't need a lower bound for all n, just above a certain n_0
- Setting n_0 = 2 makes life much easier
- Just one base case: F(2) = 1 ≥ βφ^2 = 1, with β = 1/φ^2

Naïve Fibonacci

- We now have the three numbers needed to show that F(n) is Ω(r^n):
  - r = φ is the exponand greater than 1
  - β = 1/φ^2 is the positive constant, which is less than α
  - n_0 = 2, with F(n) ≥ βr^n ∀n ≥ n_0
- We have upper and lower bounds with the same exponand
- So we have a tight bound: F(n) = Θ(φ^n) ∀n ≥ 2
- This is more Maths than CS, but a lot of complexity analysis is inherently mathematical
- We'll do a couple of more straightforward examples

Towers of Hanoi

1. Move the top n − 1 disks from Tower 1 to Tower 2
2. Move the nth disk from Tower 1 to Tower 3
3. Move the n − 1 disks from Tower 2 to Tower 3

- Calculate for small n: T(n) = 2T(n − 1) + 1, T(0) = 0
- T(0) = 0, T(1) = 1, T(2) = 3, T(3) = 7, T(4) = 15, T(5) = 31, T(6) = 63, ...
- For this course only, I'll accept the above as justification for the assertions that
  1. T(n) = 2^n − 1
  2. the algorithm is Θ(2^n)
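The Hanoi recurrence can be evaluated directly, reproducing the table of small values and confirming the closed form T(n) = 2^n − 1 for a larger range than the slide tabulates:

```python
def hanoi(n):
    """Number of moves: T(n) = 2T(n-1) + 1, T(0) = 0."""
    return 0 if n == 0 else 2 * hanoi(n - 1) + 1

# the slide's table of small values...
assert [hanoi(n) for n in range(7)] == [0, 1, 3, 7, 15, 31, 63]
# ...and the closed form T(n) = 2^n - 1
assert all(hanoi(n) == 2 ** n - 1 for n in range(20))
```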

Naïve Fibonacci again

- The reason I used r = φ earlier is that it's a well-known exact example
- In general, we wouldn't expect things to work out so nicely
- So we conjecture that F(n) ∈ O(2^n)
- F(n) = F(n − 1) + F(n − 2) ≤ α2^{n−1} + α2^{n−2} (our guess) ≤ α2^n?
- So we want 2^{n−1} + 2^{n−2} ≤ 2^n, which is true for all n ≥ 2
- We can forget about α and any base cases, and assert that F(n) = O(2^n)
- And let someone else worry about lower and tight bounds

Summary

- We can do exact time complexity analysis in some cases:
  - simple for- and while-loop methods, using closed form solutions of finite summations
  - divide & conquer recursive methods, using patterns, the Master Theorem, or recursion trees, or full induction proofs for each type of bound
- Calculating the exact complexity of complicated algorithms is at best not easy
- So we often use the best lower bound available

Distler & Kelsey, 2008
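The exponential cost of the naïve recursion can be observed by counting calls. A sketch of my own: the counter shows the call count is below the easy 2^n bound but still grows exponentially (at rate φ, per the tight bound above).

```python
def fib_calls(n):
    """Naive recursive Fibonacci, also counting the calls made."""
    if n < 2:
        return n, 1
    a, ca = fib_calls(n - 1)
    b, cb = fib_calls(n - 2)
    return a + b, ca + cb + 1

val, calls = fib_calls(20)
assert val == 6765
assert calls < 2 ** 20        # the easy O(2^n) bound from this slide
assert calls > 1.5 ** 20      # ...but the count is still exponential
```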

Next lecture

- We've assumed properties of computational frameworks and algorithms, and classes of problems
- We can justify the use of O(n), Ω(n), and Θ(n)
- Essentially, they deal with our abstraction away from real systems and mitigate our inability to compute most exact results
- With this out of the way, we go back to a much higher level of abstraction
- No assumptions about computational frameworks, algorithms, or classes of problems
- What progress can we make at a fully abstracted level?


More information

CDM. Recurrences and Fibonacci. 20-fibonacci 2017/12/15 23:16. Terminology 4. Recurrence Equations 3. Solution and Asymptotics 6.

CDM. Recurrences and Fibonacci. 20-fibonacci 2017/12/15 23:16. Terminology 4. Recurrence Equations 3. Solution and Asymptotics 6. CDM Recurrences and Fibonacci 1 Recurrence Equations Klaus Sutner Carnegie Mellon University Second Order 20-fibonacci 2017/12/15 23:16 The Fibonacci Monoid Recurrence Equations 3 Terminology 4 We can

More information

Recursion: Introduction and Correctness

Recursion: Introduction and Correctness Recursion: Introduction and Correctness CSE21 Winter 2017, Day 7 (B00), Day 4-5 (A00) January 25, 2017 http://vlsicad.ucsd.edu/courses/cse21-w17 Today s Plan From last time: intersecting sorted lists and

More information

Solving Recurrences. 1. Express the running time (or use of some other resource) as a recurrence.

Solving Recurrences. 1. Express the running time (or use of some other resource) as a recurrence. Solving Recurrences Recurrences and Recursive Code Many (perhaps most) recursive algorithms fall into one of two categories: tail recursion and divide-andconquer recursion. We would like to develop some

More information

Computational Complexity - Pseudocode and Recursions

Computational Complexity - Pseudocode and Recursions Computational Complexity - Pseudocode and Recursions Nicholas Mainardi 1 Dipartimento di Elettronica e Informazione Politecnico di Milano nicholas.mainardi@polimi.it June 6, 2018 1 Partly Based on Alessandro

More information

Analysis of Algorithms

Analysis of Algorithms Analysis of Algorithms Section 4.3 Prof. Nathan Wodarz Math 209 - Fall 2008 Contents 1 Analysis of Algorithms 2 1.1 Analysis of Algorithms....................... 2 2 Complexity Analysis 4 2.1 Notation

More information

Chapter 2. Recurrence Relations. Divide and Conquer. Divide and Conquer Strategy. Another Example: Merge Sort. Merge Sort Example. Merge Sort Example

Chapter 2. Recurrence Relations. Divide and Conquer. Divide and Conquer Strategy. Another Example: Merge Sort. Merge Sort Example. Merge Sort Example Recurrence Relations Chapter 2 Divide and Conquer Equation or an inequality that describes a function by its values on smaller inputs. Recurrence relations arise when we analyze the running time of iterative

More information

Chapter 4 Divide-and-Conquer

Chapter 4 Divide-and-Conquer Chapter 4 Divide-and-Conquer 1 About this lecture (1) Recall the divide-and-conquer paradigm, which we used for merge sort: Divide the problem into a number of subproblems that are smaller instances of

More information

CS483 Design and Analysis of Algorithms

CS483 Design and Analysis of Algorithms CS483 Design and Analysis of Algorithms Lecture 6-8 Divide and Conquer Algorithms Instructor: Fei Li lifei@cs.gmu.edu with subject: CS483 Office hours: STII, Room 443, Friday 4:00pm - 6:00pm or by appointments

More information

Reductions, Recursion and Divide and Conquer

Reductions, Recursion and Divide and Conquer Chapter 5 Reductions, Recursion and Divide and Conquer CS 473: Fundamental Algorithms, Fall 2011 September 13, 2011 5.1 Reductions and Recursion 5.1.0.1 Reduction Reducing problem A to problem B: (A) Algorithm

More information

/633 Introduction to Algorithms Lecturer: Michael Dinitz Topic: Dynamic Programming II Date: 10/12/17

/633 Introduction to Algorithms Lecturer: Michael Dinitz Topic: Dynamic Programming II Date: 10/12/17 601.433/633 Introduction to Algorithms Lecturer: Michael Dinitz Topic: Dynamic Programming II Date: 10/12/17 12.1 Introduction Today we re going to do a couple more examples of dynamic programming. While

More information

CSE332: Data Structures & Parallelism Lecture 2: Algorithm Analysis. Ruth Anderson Winter 2018

CSE332: Data Structures & Parallelism Lecture 2: Algorithm Analysis. Ruth Anderson Winter 2018 CSE332: Data Structures & Parallelism Lecture 2: Algorithm Analysis Ruth Anderson Winter 2018 Today Algorithm Analysis What do we care about? How to compare two algorithms Analyzing Code Asymptotic Analysis

More information

CMPS 2200 Fall Divide-and-Conquer. Carola Wenk. Slides courtesy of Charles Leiserson with changes and additions by Carola Wenk

CMPS 2200 Fall Divide-and-Conquer. Carola Wenk. Slides courtesy of Charles Leiserson with changes and additions by Carola Wenk CMPS 2200 Fall 2017 Divide-and-Conquer Carola Wenk Slides courtesy of Charles Leiserson with changes and additions by Carola Wenk 1 The divide-and-conquer design paradigm 1. Divide the problem (instance)

More information

b + O(n d ) where a 1, b > 1, then O(n d log n) if a = b d d ) if a < b d O(n log b a ) if a > b d

b + O(n d ) where a 1, b > 1, then O(n d log n) if a = b d d ) if a < b d O(n log b a ) if a > b d CS161, Lecture 4 Median, Selection, and the Substitution Method Scribe: Albert Chen and Juliana Cook (2015), Sam Kim (2016), Gregory Valiant (2017) Date: January 23, 2017 1 Introduction Last lecture, we

More information

CS Data Structures and Algorithm Analysis

CS Data Structures and Algorithm Analysis CS 483 - Data Structures and Algorithm Analysis Lecture II: Chapter 2 R. Paul Wiegand George Mason University, Department of Computer Science February 1, 2006 Outline 1 Analysis Framework 2 Asymptotic

More information

Divide and Conquer Algorithms

Divide and Conquer Algorithms Divide and Conquer Algorithms T. M. Murali February 19, 2013 Divide and Conquer Break up a problem into several parts. Solve each part recursively. Solve base cases by brute force. Efficiently combine

More information

Algorithms, Design and Analysis. Order of growth. Table 2.1. Big-oh. Asymptotic growth rate. Types of formulas for basic operation count

Algorithms, Design and Analysis. Order of growth. Table 2.1. Big-oh. Asymptotic growth rate. Types of formulas for basic operation count Types of formulas for basic operation count Exact formula e.g., C(n) = n(n-1)/2 Algorithms, Design and Analysis Big-Oh analysis, Brute Force, Divide and conquer intro Formula indicating order of growth

More information

Analysis of Algorithms I: Asymptotic Notation, Induction, and MergeSort

Analysis of Algorithms I: Asymptotic Notation, Induction, and MergeSort Analysis of Algorithms I: Asymptotic Notation, Induction, and MergeSort Xi Chen Columbia University We continue with two more asymptotic notation: o( ) and ω( ). Let f (n) and g(n) are functions that map

More information

Lecture 2. More Algorithm Analysis, Math and MCSS By: Sarah Buchanan

Lecture 2. More Algorithm Analysis, Math and MCSS By: Sarah Buchanan Lecture 2 More Algorithm Analysis, Math and MCSS By: Sarah Buchanan Announcements Assignment #1 is posted online It is directly related to MCSS which we will be talking about today or Monday. There are

More information

Divide and Conquer. Andreas Klappenecker

Divide and Conquer. Andreas Klappenecker Divide and Conquer Andreas Klappenecker The Divide and Conquer Paradigm The divide and conquer paradigm is important general technique for designing algorithms. In general, it follows the steps: - divide

More information

Solving Recurrences. Lecture 23 CS2110 Fall 2011

Solving Recurrences. Lecture 23 CS2110 Fall 2011 Solving Recurrences Lecture 23 CS2110 Fall 2011 1 Announcements Makeup Prelim 2 Monday 11/21 7:30-9pm Upson 5130 Please do not discuss the prelim with your classmates! Quiz 4 next Tuesday in class Topics:

More information

Asymptotic Notation. such that t(n) cf(n) for all n n 0. for some positive real constant c and integer threshold n 0

Asymptotic Notation. such that t(n) cf(n) for all n n 0. for some positive real constant c and integer threshold n 0 Asymptotic Notation Asymptotic notation deals with the behaviour of a function in the limit, that is, for sufficiently large values of its parameter. Often, when analysing the run time of an algorithm,

More information

Lecture 2. Fundamentals of the Analysis of Algorithm Efficiency

Lecture 2. Fundamentals of the Analysis of Algorithm Efficiency Lecture 2 Fundamentals of the Analysis of Algorithm Efficiency 1 Lecture Contents 1. Analysis Framework 2. Asymptotic Notations and Basic Efficiency Classes 3. Mathematical Analysis of Nonrecursive Algorithms

More information

CS 2110: INDUCTION DISCUSSION TOPICS

CS 2110: INDUCTION DISCUSSION TOPICS CS 110: INDUCTION DISCUSSION TOPICS The following ideas are suggestions for how to handle your discussion classes. You can do as much or as little of this as you want. You can either present at the board,

More information

CSE332: Data Structures & Parallelism Lecture 2: Algorithm Analysis. Ruth Anderson Winter 2018

CSE332: Data Structures & Parallelism Lecture 2: Algorithm Analysis. Ruth Anderson Winter 2018 CSE332: Data Structures & Parallelism Lecture 2: Algorithm Analysis Ruth Anderson Winter 2018 Today Algorithm Analysis What do we care about? How to compare two algorithms Analyzing Code Asymptotic Analysis

More information

COMP 355 Advanced Algorithms

COMP 355 Advanced Algorithms COMP 355 Advanced Algorithms Algorithm Design Review: Mathematical Background 1 Polynomial Running Time Brute force. For many non-trivial problems, there is a natural brute force search algorithm that

More information

Taking Stock. IE170: Algorithms in Systems Engineering: Lecture 3. Θ Notation. Comparing Algorithms

Taking Stock. IE170: Algorithms in Systems Engineering: Lecture 3. Θ Notation. Comparing Algorithms Taking Stock IE170: Algorithms in Systems Engineering: Lecture 3 Jeff Linderoth Department of Industrial and Systems Engineering Lehigh University January 19, 2007 Last Time Lots of funky math Playing

More information

1 Caveats of Parallel Algorithms

1 Caveats of Parallel Algorithms CME 323: Distriuted Algorithms and Optimization, Spring 2015 http://stanford.edu/ reza/dao. Instructor: Reza Zadeh, Matroid and Stanford. Lecture 1, 9/26/2015. Scried y Suhas Suresha, Pin Pin, Andreas

More information

Computational Complexity

Computational Complexity Computational Complexity S. V. N. Vishwanathan, Pinar Yanardag January 8, 016 1 Computational Complexity: What, Why, and How? Intuitively an algorithm is a well defined computational procedure that takes

More information

Methods for solving recurrences

Methods for solving recurrences Methods for solving recurrences Analyzing the complexity of mergesort The merge function Consider the following implementation: 1 int merge ( int v1, int n1, int v, int n ) { 3 int r = malloc ( ( n1+n

More information

Divide and Conquer. Arash Rafiey. 27 October, 2016

Divide and Conquer. Arash Rafiey. 27 October, 2016 27 October, 2016 Divide the problem into a number of subproblems Divide the problem into a number of subproblems Conquer the subproblems by solving them recursively or if they are small, there must be

More information

CS F-01 Algorithm Analysis 1

CS F-01 Algorithm Analysis 1 CS673-016F-01 Algorithm Analysis 1 01-0: Syllabus Office Hours Course Text Prerequisites Test Dates & Testing Policies Try to combine tests Grading Policies 01-1: How to Succeed Come to class. Pay attention.

More information

Introduction to Algorithms

Introduction to Algorithms Lecture 1 Introduction to Algorithms 1.1 Overview The purpose of this lecture is to give a brief overview of the topic of Algorithms and the kind of thinking it involves: why we focus on the subjects that

More information

Divide and Conquer. CSE21 Winter 2017, Day 9 (B00), Day 6 (A00) January 30,

Divide and Conquer. CSE21 Winter 2017, Day 9 (B00), Day 6 (A00) January 30, Divide and Conquer CSE21 Winter 2017, Day 9 (B00), Day 6 (A00) January 30, 2017 http://vlsicad.ucsd.edu/courses/cse21-w17 Merging sorted lists: WHAT Given two sorted lists a 1 a 2 a 3 a k b 1 b 2 b 3 b

More information

1 Substitution method

1 Substitution method Recurrence Relations we have discussed asymptotic analysis of algorithms and various properties associated with asymptotic notation. As many algorithms are recursive in nature, it is natural to analyze

More information

Divide and Conquer Algorithms

Divide and Conquer Algorithms Divide and Conquer Algorithms T. M. Murali March 17, 2014 Divide and Conquer Break up a problem into several parts. Solve each part recursively. Solve base cases by brute force. Efficiently combine solutions

More information

CIS 121. Analysis of Algorithms & Computational Complexity. Slides based on materials provided by Mary Wootters (Stanford University)

CIS 121. Analysis of Algorithms & Computational Complexity. Slides based on materials provided by Mary Wootters (Stanford University) CIS 121 Analysis of Algorithms & Computational Complexity Slides based on materials provided by Mary Wootters (Stanford University) Today Sorting: InsertionSort vs MergeSort Analyzing the correctness of

More information

COMP 355 Advanced Algorithms Algorithm Design Review: Mathematical Background

COMP 355 Advanced Algorithms Algorithm Design Review: Mathematical Background COMP 355 Advanced Algorithms Algorithm Design Review: Mathematical Background 1 Polynomial Time Brute force. For many non-trivial problems, there is a natural brute force search algorithm that checks every

More information

Lecture 10: Powers of Matrices, Difference Equations

Lecture 10: Powers of Matrices, Difference Equations Lecture 10: Powers of Matrices, Difference Equations Difference Equations A difference equation, also sometimes called a recurrence equation is an equation that defines a sequence recursively, i.e. each

More information

Inf 2B: Sorting, MergeSort and Divide-and-Conquer

Inf 2B: Sorting, MergeSort and Divide-and-Conquer Inf 2B: Sorting, MergeSort and Divide-and-Conquer Lecture 7 of ADS thread Kyriakos Kalorkoti School of Informatics University of Edinburgh The Sorting Problem Input: Task: Array A of items with comparable

More information

Descriptive Statistics (And a little bit on rounding and significant digits)

Descriptive Statistics (And a little bit on rounding and significant digits) Descriptive Statistics (And a little bit on rounding and significant digits) Now that we know what our data look like, we d like to be able to describe it numerically. In other words, how can we represent

More information

UCSD CSE 21, Spring 2014 [Section B00] Mathematics for Algorithm and System Analysis

UCSD CSE 21, Spring 2014 [Section B00] Mathematics for Algorithm and System Analysis UCSD CSE 21, Spring 2014 [Section B00] Mathematics for Algorithm and System Analysis Lecture 15 Class URL: http://vlsicad.ucsd.edu/courses/cse21-s14/ Lecture 15 Notes Goals for this week Big-O complexity

More information

Lecture 17: Trees and Merge Sort 10:00 AM, Oct 15, 2018

Lecture 17: Trees and Merge Sort 10:00 AM, Oct 15, 2018 CS17 Integrated Introduction to Computer Science Klein Contents Lecture 17: Trees and Merge Sort 10:00 AM, Oct 15, 2018 1 Tree definitions 1 2 Analysis of mergesort using a binary tree 1 3 Analysis of

More information

Great Theoretical Ideas in Computer Science. Lecture 7: Introduction to Computational Complexity

Great Theoretical Ideas in Computer Science. Lecture 7: Introduction to Computational Complexity 15-251 Great Theoretical Ideas in Computer Science Lecture 7: Introduction to Computational Complexity September 20th, 2016 What have we done so far? What will we do next? What have we done so far? > Introduction

More information

Lecture 27: Theory of Computation. Marvin Zhang 08/08/2016

Lecture 27: Theory of Computation. Marvin Zhang 08/08/2016 Lecture 27: Theory of Computation Marvin Zhang 08/08/2016 Announcements Roadmap Introduction Functions Data Mutability Objects This week (Applications), the goals are: To go beyond CS 61A and see examples

More information

Divide and Conquer. Maximum/minimum. Median finding. CS125 Lecture 4 Fall 2016

Divide and Conquer. Maximum/minimum. Median finding. CS125 Lecture 4 Fall 2016 CS125 Lecture 4 Fall 2016 Divide and Conquer We have seen one general paradigm for finding algorithms: the greedy approach. We now consider another general paradigm, known as divide and conquer. We have

More information

Data Structures and Algorithms Chapter 3

Data Structures and Algorithms Chapter 3 Data Structures and Algorithms Chapter 3 1. Divide and conquer 2. Merge sort, repeated substitutions 3. Tiling 4. Recurrences Recurrences Running times of algorithms with recursive calls can be described

More information

CS 4407 Algorithms Lecture 3: Iterative and Divide and Conquer Algorithms

CS 4407 Algorithms Lecture 3: Iterative and Divide and Conquer Algorithms CS 4407 Algorithms Lecture 3: Iterative and Divide and Conquer Algorithms Prof. Gregory Provan Department of Computer Science University College Cork 1 Lecture Outline CS 4407, Algorithms Growth Functions

More information

CS 4349 Lecture August 30th, 2017

CS 4349 Lecture August 30th, 2017 CS 4349 Lecture August 30th, 2017 Main topics for #lecture include #recurrances. Prelude I will hold an extra office hour Friday, September 1st from 3:00pm to 4:00pm. Please fill out prerequisite forms

More information

COMP Analysis of Algorithms & Data Structures

COMP Analysis of Algorithms & Data Structures COMP 3170 - Analysis of Algorithms & Data Structures Shahin Kamali Lecture 4 - Jan. 10, 2018 CLRS 1.1, 1.2, 2.2, 3.1, 4.3, 4.5 University of Manitoba Picture is from the cover of the textbook CLRS. 1 /

More information

Pre-lecture R: Solving Recurrences

Pre-lecture R: Solving Recurrences R Solving Recurrences R.1 Fun with Fibonacci numbers Consider the reproductive cycle of bees. Each male bee has a mother but no father; each female bee has both a mother and a father. If we examine the

More information

COMP Analysis of Algorithms & Data Structures

COMP Analysis of Algorithms & Data Structures COMP 3170 - Analysis of Algorithms & Data Structures Shahin Kamali Lecture 4 - Jan. 14, 2019 CLRS 1.1, 1.2, 2.2, 3.1, 4.3, 4.5 University of Manitoba Picture is from the cover of the textbook CLRS. COMP

More information

Data structures Exercise 1 solution. Question 1. Let s start by writing all the functions in big O notation:

Data structures Exercise 1 solution. Question 1. Let s start by writing all the functions in big O notation: Data structures Exercise 1 solution Question 1 Let s start by writing all the functions in big O notation: f 1 (n) = 2017 = O(1), f 2 (n) = 2 log 2 n = O(n 2 ), f 3 (n) = 2 n = O(2 n ), f 4 (n) = 1 = O

More information

Asymptotic Analysis. Thomas A. Anastasio. January 7, 2004

Asymptotic Analysis. Thomas A. Anastasio. January 7, 2004 Asymptotic Analysis Thomas A. Anastasio January 7, 004 1 Introduction As a programmer, you often have a choice of data structures and algorithms. Choosing the best one for a particular job involves, among

More information

MATRIX MULTIPLICATION AND INVERSION

MATRIX MULTIPLICATION AND INVERSION MATRIX MULTIPLICATION AND INVERSION MATH 196, SECTION 57 (VIPUL NAIK) Corresponding material in the book: Sections 2.3 and 2.4. Executive summary Note: The summary does not include some material from the

More information

Analysis of Algorithms [Reading: CLRS 2.2, 3] Laura Toma, csci2200, Bowdoin College

Analysis of Algorithms [Reading: CLRS 2.2, 3] Laura Toma, csci2200, Bowdoin College Analysis of Algorithms [Reading: CLRS 2.2, 3] Laura Toma, csci2200, Bowdoin College Why analysis? We want to predict how the algorithm will behave (e.g. running time) on arbitrary inputs, and how it will

More information