Grade 11/12 Math Circles Fall 2014 - Nov. 5: Recurrences, Part 2


Faculty of Mathematics, Waterloo, Ontario
Centre for Education in Mathematics and Computing

Running time of algorithms

In computer science, we care about the running time of algorithms in terms of the size of the input. For example, consider the following code:

    define f1(n):
        for i from 1 to n:
            for j from 1 to n:
                print i*j

How many times does the print statement happen? Exactly n^2 times.

Discovering asymptotics

Now consider the following code:

    define f2(n):
        for i from 1 to n:
            for j from 1 to i:
                print i*j

How many times does the print statement happen? Exactly 1 + 2 + ... + n = n(n+1)/2 = (1/2)n^2 + (1/2)n.
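To sanity-check these counts, here is a small Python sketch (the function names f1_count and f2_count are mine, not from the notes) that counts loop iterations instead of printing:

```python
def f1_count(n):
    # Count how many times the inner statement of f1 would run.
    count = 0
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            count += 1
    return count

def f2_count(n):
    # Same for f2, where the inner loop only runs up to i.
    count = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            count += 1
    return count
```

For example, f1_count(10) is 100 = 10^2, while f2_count(10) is 55 = 10·11/2, matching the closed forms above.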

But we usually don't care so much about the difference between n^2 and (1/2)n^2 + (1/2)n. That is, we would view both of these algorithms as taking roughly the same amount of time to run. By "roughly", we mean the running time of both of these functions is Θ(n^2). But what does this mean?

Formally defining Θ

If we have a function g(n), we can define a set of functions

    Θ(g(n)) = { f(n) : there exist c1 > 0, c2 > 0, n0 > 0 such that
                0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

What does this actually mean? Let's draw a picture: for all n ≥ n0, the graph of f(n) is sandwiched between the two curves c1·g(n) and c2·g(n).

Using the definition on the two code examples

Recall that we said that both n^2 and (1/2)n^2 + (1/2)n were Θ(n^2).

Why is n^2 ∈ Θ(n^2)? Pick, for example, c1 = c2 = 1 and n0 = 1. Then clearly n^2 ≤ n^2 ≤ n^2 for all n ≥ 1.

What about showing (1/2)n^2 + (1/2)n ∈ Θ(n^2)? This needs more care. We need to find constants c1, c2, n0 > 0 such that

    c1·n^2 ≤ (1/2)n^2 + (1/2)n ≤ c2·n^2

for all values of n that are larger than n0. Let's pick n0 = 1, since we can make it work. We can pick c1 = 1/2, which satisfies the left half (since (1/2)n ≥ 0). Pick c2 = 1, and the right half is satisfied, with a bit of algebra:

    n^2 = (1/2)n^2 + (1/2)n^2 = (1/2)n^2 + (1/2)n·n ≥ (1/2)n^2 + (1/2)n·1 = (1/2)n^2 + (1/2)n, since n ≥ 1.

Pulling these apart: O and Ω

We do not need to be as tight as Θ all the time. Sometimes we care only about upper bounds or lower bounds. Notice that Θ actually has both of these. Let's pull them apart!

    O(g(n)) = { f(n) : there exist c > 0, n0 > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
    Ω(g(n)) = { f(n) : there exist c > 0, n0 > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }

Notice that f(n) ∈ Θ(g(n)) if and only if f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n)).
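As a quick numerical check of the constants chosen above (c1 = 1/2, c2 = 1, n0 = 1), a few lines of Python can confirm the sandwich inequality over a range of n. This is evidence, not a proof, since it only tests finitely many values:

```python
def sandwiched(n, c1=0.5, c2=1.0):
    # Check c1*g(n) <= f(n) <= c2*g(n) for f(n) = n^2/2 + n/2 and g(n) = n^2.
    f = 0.5 * n * n + 0.5 * n
    g = n * n
    return c1 * g <= f <= c2 * g

# The inequality should hold for every n >= n0 = 1.
```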

Recursion in programming languages

How does this connect to the lecture from last week? We may have recursive functions! A simple example:

    define r1(n):
        if n == 0:
            print *
        else:
            print *
            r1(n-1)

How many stars get printed?

Computing the running time of recursive algorithms

Let's suppose there are T(n) stars that get printed. Then we know:

    T(0) = 1
    T(n) = T(n-1) + 1

You should know how to solve this recurrence, especially if we change it into the form

    a_0 = 1
    a_n = a_{n-1} + 1

But, as we shall see, we don't need to find an exact solution if we are looking only for the Θ bounds.
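A direct Python translation of r1 (returning the star count rather than printing, a convenience of mine) confirms that the recurrence T(n) = T(n-1) + 1 with T(0) = 1 solves to n + 1:

```python
def r1_count(n):
    # Translation of r1: one star when n == 0,
    # one star plus a recursive call otherwise.
    if n == 0:
        return 1
    return 1 + r1_count(n - 1)
```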

Another example:

    define mergesort(A, s, f):
        if s < f:
            mid = (s+f)/2
            t1 = mergesort(A, s, mid)
            t2 = mergesort(A, mid+1, f)
            A = merge(t1, t2)

Here, the algorithm merge merges two sorted lists into one sorted list. Let's trace this algorithm on A = [7,1,8,3,6,4,5,2].
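The pseudocode above leaves merge unspecified and does not return its result. A minimal runnable Python version (with a merge helper of my own, and integer division for mid) might look like this:

```python
def merge(left, right):
    # Merge two sorted lists into one sorted list.
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One of the two slices below is empty; append the leftovers.
    return result + left[i:] + right[j:]

def mergesort(A, s, f):
    # Sort the slice A[s..f] (inclusive) and return it as a new list.
    if s >= f:
        return A[s:f + 1]
    mid = (s + f) // 2
    t1 = mergesort(A, s, mid)
    t2 = mergesort(A, mid + 1, f)
    return merge(t1, t2)
```

Tracing mergesort([7,1,8,3,6,4,5,2], 0, 7) splits into [7,1,8,3] and [6,4,5,2], sorts each half recursively, and merges to [1,2,3,4,5,6,7,8].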

The Master Theorem

Let's try to find bounds more generally.

The Master Theorem. Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be a recurrence defined on the non-negative integers as

    T(n) = a·T(n/b) + f(n).

Then T(n) can be bounded asymptotically as follows:

- If f(n) ∈ O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
- If f(n) ∈ Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · log n).
- If f(n) ∈ Ω(n^(log_b a + ε)) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

What these cases mean

Observations: we compare f(n) to n^(log_b a).

- If f(n) is smaller, then the bound is Θ(n^(log_b a)).
- If f(n) is larger, then the bound is Θ(f(n)).
- If they are asymptotically equal, then we add a logarithmic factor.
- The ε term is effectively saying f(n) must be polynomially slower/faster; in other words, n^(t−ε) = n^t / n^ε.
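For the common special case f(n) = Θ(n^d), the three cases reduce to comparing d with the critical exponent log_b(a) (the regularity condition of case 3 holds automatically here). A small Python helper, my own sketch covering only this polynomial case, makes the case analysis concrete:

```python
import math

def master_theorem(a, b, d):
    # Solve T(n) = a*T(n/b) + Theta(n^d), for a >= 1, b > 1, d >= 0,
    # by comparing d with the critical exponent log_b(a).
    crit = math.log(a, b)
    if math.isclose(d, crit):
        return f"Theta(n^{d} log n)"     # case 2: f matches n^crit
    if d < crit:
        return f"Theta(n^{crit:g})"      # case 1: f polynomially smaller
    return f"Theta(n^{d})"               # case 3: f polynomially larger
```

For instance, master_theorem(2, 2, 1) recovers the mergesort bound and master_theorem(9, 3, 1) recovers the Θ(n^2) example below.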

Using the Master Theorem

Let's use it to solve our mergesort recurrence, which was

    T(n) = 2T(n/2) + n.

Here a = 2, b = 2 and f(n) = n. Notice that n^(log_b a) = n^(log_2 2) = n^1 = n. Which case applies: is f(n) = n in O(n^(1−ε)), Θ(n), or Ω(n^(1+ε))? Clearly n ∈ Θ(n), so the second case applies, and we have T(n) ∈ Θ(n log n).

Binary Search

Suppose that we have a sorted array. We can find an element k using:

    define binsearch(A, k, s, f):
        if s > f:
            return "Not found"
        else:
            mid = (s+f)/2
            if A[mid] == k:
                return "Found"
            if A[mid] > k:
                return binsearch(A, k, s, mid-1)
            if A[mid] < k:
                return binsearch(A, k, mid+1, f)
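A direct Python version of binsearch, using integer division for mid (which the pseudocode leaves implicit):

```python
def binsearch(A, k, s, f):
    # Search for k in the sorted slice A[s..f] (inclusive).
    if s > f:
        return "Not found"
    mid = (s + f) // 2
    if A[mid] == k:
        return "Found"
    if A[mid] > k:
        return binsearch(A, k, s, mid - 1)   # k can only be to the left
    return binsearch(A, k, mid + 1, f)       # k can only be to the right
```

Each call discards half of the remaining range, which is exactly why the recurrence below has the form T(n) = T(n/2) + 1.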

Recurrence for binsearch

    T(1) = 1
    T(n) = T(n/2) + 1

Here a = 1, b = 2 and f(n) = 1. Notice that 1 ∈ Θ(n^(log_2 1)) = Θ(n^0) = Θ(1). Thus, the second case applies and T(n) ∈ Θ(n^0 · log n) = Θ(log n).

One last example

Suppose we have T(n) = 9T(n/3) + n. Then a = 9, b = 3 and f(n) = n, with n^(log_b a) = n^(log_3 9) = n^2. Since f(n) = n ∈ O(n^(log_3 9 − ε)) with ε = 1, we apply case 1 and conclude that T(n) = Θ(n^2).
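As an informal check of this last answer (not a proof), we can evaluate the recurrence at powers of b = 3 and watch T(n)/n^2 settle toward a constant, as the Θ(n^2) bound predicts. The base case T(1) = 1 and the evaluation points are my own choices:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 9*T(n/3) + n at powers of 3, with base case T(1) = 1.
    if n <= 1:
        return 1
    return 9 * T(n // 3) + n

# Ratios T(n)/n^2 for n = 3, 9, 27, ..., 3^9; these approach 3/2.
ratios = [T(3 ** k) / (3 ** k) ** 2 for k in range(1, 10)]
```

(Unrolling the recurrence gives T(3^k) = (3/2)·9^k − (1/2)·3^k exactly, so the ratio tends to 3/2 from below.)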