Data Structures and Algorithms Chapter 3


Data Structures and Algorithms, Chapter 3

1. Divide and conquer
2. Merge sort, repeated substitutions
3. Tiling
4. Recurrences

Recurrences

Running times of algorithms with recursive calls can be described using recurrences. A recurrence is an equation or inequality that describes a function in terms of its value on smaller inputs.

For divide and conquer algorithms:

  T(n) = Θ(1)                                                               if n = 1   (solving a trivial problem)
  T(n) = (number of pieces)·T(n / subproblem factor) + (divide + combine)   if n > 1

Example: merge sort

  T(n) = 1               if n = 1
  T(n) = 2T(n/2) + n     if n > 1
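
To make the correspondence concrete, here is a minimal merge sort sketch (Python is assumed here for illustration; the slides themselves use language-neutral pseudocode). The two recursive calls contribute the 2T(n/2) term, and the linear-time merge contributes the n term:

  def merge(left, right):
      # combine step: merge two sorted lists in linear time
      out, i, j = [], 0, 0
      while i < len(left) and j < len(right):
          if left[i] <= right[j]:
              out.append(left[i]); i += 1
          else:
              out.append(right[j]); j += 1
      return out + left[i:] + right[j:]

  def merge_sort(a):
      if len(a) <= 1:                      # trivial problem: constant time
          return a
      mid = len(a) // 2                    # divide
      return merge(merge_sort(a[:mid]),    # conquer: T(n/2)
                   merge_sort(a[mid:]))    # conquer: T(n/2), then combine: n

  print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))   # [1, 2, 2, 3, 4, 5, 6, 7]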

Solving Recurrences

- Repeated (backward) substitution method: expand the recurrence by substitution and notice a pattern (not a strictly formal proof).
- Substitution method: guess the solution, then verify it by mathematical induction.
- Recursion trees.
- Master method: templates for different classes of recurrences.

Repeated Substitution

Let's find the running time of merge sort (assume n = 2^b).

  T(n) = 1               if n = 1
  T(n) = 2T(n/2) + n     if n > 1

  T(n) = 2T(n/2) + n                  substitute
       = 2(2T(n/4) + n/2) + n         expand
       = 2^2 T(n/4) + 2n              substitute
       = 2^2 (2T(n/8) + n/4) + 2n     expand
       = 2^3 T(n/8) + 3n              observe pattern

Repeated Substitution/2

From T(n) = 2^3 T(n/8) + 3n we get, after the i-th substitution,

  T(n) = 2^i T(n/2^i) + i·n

An upper bound for i is log n:

  T(n) = 2^(log n) T(n/n) + n log n
       = n + n log n

Repeated Substitution Method

The procedure is straightforward:
1. Substitute, expand, substitute, expand, ...
2. Observe a pattern and determine the expression after the i-th substitution.
3. Find the highest value of i (the number of iterations, e.g., log n) needed to reach the base case of the recurrence (e.g., T(1)).
4. Insert the value of T(1) and the expression for i into your expression.
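
As a quick sanity check, here is a small sketch (assuming T(1) = 1 and n a power of two, as above; Python is used for illustration) that iterates the merge sort recurrence and compares it with the closed form n + n log n obtained by repeated substitution:

  from math import log2

  def T(n):
      # merge sort recurrence: T(1) = 1, T(n) = 2T(n/2) + n
      if n == 1:
          return 1
      return 2 * T(n // 2) + n

  for b in range(1, 11):
      n = 2 ** b
      assert T(n) == n + n * log2(n)   # matches the closed form exactly for n = 2^b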

Analysis of Merge Sort

Let's find a more exact running time of merge sort (assume n = 2^b).

  T(n) = 2                   if n = 1
  T(n) = 2T(n/2) + 2n + 3    if n > 1

  T(n) = 2T(n/2) + 2n + 3                               substitute
       = 2(2T(n/4) + n + 3) + 2n + 3                    expand
       = 2^2 T(n/4) + 4n + 2·3 + 3                      substitute
       = 2^2 (2T(n/8) + n/2 + 3) + 4n + 2·3 + 3         expand
       = 2^3 T(n/2^3) + 2·3·n + (2^2 + 2^1 + 2^0)·3     observe pattern

Analysis of Merge Sort/2

  T(n) = 2^i T(n/2^i) + 2in + 3·Σ_{j=0}^{i−1} 2^j

An upper bound for i is log n:

  T(n) = 2^(log n) T(n/2^(log n)) + 2n log n + 3·(2^(log n) − 1)
       = 5n + 2n log n − 3
       = Θ(n log n)

Substitution Method

The substitution method for solving recurrences entails two steps:
1. Guess the solution.
2. Use induction to prove the solution.

Example: T(n) = 4T(n/2) + n

Substitution Method/2

1) Guess T(n) = O(n^3), i.e., T(n) is at most of the form cn^3.
2) Prove T(n) ≤ cn^3 by induction:

  T(n) = 4T(n/2) + n                recurrence
       ≤ 4c(n/2)^3 + n              induction hypothesis
       = 0.5cn^3 + n                simplify
       = cn^3 − (0.5cn^3 − n)       rearrange
       ≤ cn^3                       if c ≥ 2 and n ≥ 1

Thus T(n) = O(n^3).

Substitution Method/3

Tighter bound for T(n) = 4T(n/2) + n: try to show T(n) = O(n^2).

Prove T(n) ≤ cn^2 by induction:

  T(n) = 4T(n/2) + n
       ≤ 4c(n/2)^2 + n
       = cn^2 + n

But cn^2 + n is NOT ≤ cn^2, so the inductive step fails.

Substitution Method/4

What is the problem? The expansion yields cn^2 + (something positive), and this cannot be bounded by cn^2 as the inductive proof requires.

Solution: strengthen the hypothesis for the inductive proof:

  T(n) ≤ (answer you want) − (something > 0)

Substitution Method/5

Fixed proof: strengthen the inductive hypothesis by subtracting a lower-order term.

Prove T(n) ≤ cn^2 − dn by induction:

  T(n) = 4T(n/2) + n
       ≤ 4(c(n/2)^2 − d(n/2)) + n
       = cn^2 − 2dn + n
       = cn^2 − dn − (dn − n)
       ≤ cn^2 − dn                  if d ≥ 1
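
As a quick numerical check (a sketch; the base case T(1) = 1 is an assumption, since the slides leave it unspecified), iterating T(n) = 4T(n/2) + n for powers of two gives exactly 2n^2 − n, consistent with the strengthened hypothesis T(n) ≤ cn^2 − dn for c = 2, d = 1:

  def T(n):
      # T(n) = 4T(n/2) + n with assumed base case T(1) = 1
      if n == 1:
          return 1
      return 4 * T(n // 2) + n

  for b in range(1, 8):
      n = 2 ** b
      assert T(n) == 2 * n * n - n   # Theta(n^2), matching the O(n^2) guess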

Recursion Tree

A recursion tree is a convenient way to visualize what happens when a recurrence is iterated. It is good for guessing asymptotic solutions to recurrences.

[Figure: recursion tree for T(n) = T(n/4) + T(n/2) + n^2. The root costs n^2; its children cost (n/4)^2 and (n/2)^2 and have subtrees T(n/16), T(n/8), T(n/8), T(n/4), and so on.]

Recursion Tree/2

[Figure: the same tree fully expanded, with per-level sums. The levels cost n^2, (5/16)n^2, (25/256)n^2, ..., a geometric series with ratio 5/16, so the total is Θ(n^2).]
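
A tiny sketch of the level-sum argument (assuming the example recurrence T(n) = T(n/4) + T(n/2) + n^2 read off the tree above): each level costs 5/16 of the one above it, so the per-level costs form a geometric series whose sum is at most n^2 / (1 − 5/16) = (16/11)·n^2, i.e. Θ(n^2):

  def level_costs(levels):
      # per-level costs of the recursion tree, in units of n^2
      cost, out = 1.0, []
      for _ in range(levels):
          out.append(cost)
          cost *= 5 / 16     # a node of cost x spawns children costing x/16 + x/4
      return out

  print(level_costs(4))         # [1.0, 0.3125, 0.09765625, 0.030517578125]
  print(sum(level_costs(60)))   # ~1.4545 = 16/11, so T(n) = Theta(n^2)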

Master Method

The idea is to solve a class of recurrences that have the form

  T(n) = aT(n/b) + f(n)

Assumptions: a ≥ 1, b > 1, and f(n) is asymptotically positive.

Abstractly speaking, T(n) is the running time of an algorithm that solves a subproblems of size n/b recursively, each in time T(n/b); f(n) is the cost of dividing the problem and combining the results. In merge sort, T(n) = 2T(n/2) + Θ(n).

Master Method/2

Iterating the recurrence (expanding the tree) yields

  T(n) = f(n) + aT(n/b)
       = f(n) + a·f(n/b) + a^2·T(n/b^2)
       = f(n) + a·f(n/b) + a^2·f(n/b^2) + ...
       = f(n) + a·f(n/b) + ... + a^(log_b n − 1)·f(n/b^(log_b n − 1)) + a^(log_b n)·T(1)
       = Σ_{j=0}^{log_b n − 1} a^j·f(n/b^j) + Θ(n^(log_b a))

The first term is the division/recombination cost (totaled across all levels of the tree). The second term is the cost of solving all subproblems of size 1 (all the work pushed to the leaves); note that a^(log_b n) = n^(log_b a).

Master Method/3

[Figure: recursion tree for T(n) = aT(n/b) + f(n). The root costs f(n), each of its a children costs f(n/b), the a^2 grandchildren cost f(n/b^2), and so on for log_b n levels, ending in n^(log_b a) leaves of cost Θ(1).]

  Total: Θ(n^(log_b a)) + Σ_{j=0}^{log_b n − 1} a^j·f(n/b^j)

Note: the problem splits into a parts at each level, there are log_b n levels, and a^(log_b n) = n^(log_b a) leaves.

Master Method, Intuition

Three common cases:
1. Running time dominated by the cost at the leaves.
2. Running time evenly distributed throughout the tree.
3. Running time dominated by the cost at the root.

To solve the recurrence, we need to identify the dominant term. In each case, compare f(n) with n^(log_b a).

Master Method, Case 1

  f(n) = O(n^(log_b a − ε)) for some constant ε > 0

f(n) grows polynomially slower than n^(log_b a) (by an n^ε factor). The work at the leaf level dominates:

  T(n) = Θ(n^(log_b a))        (cost of all the leaves)

Master Method, Case 2

  f(n) = Θ(n^(log_b a))

f(n) and n^(log_b a) are asymptotically the same. The work is distributed evenly throughout the tree:

  T(n) = Θ(n^(log_b a) · log n)        (level cost) × (number of levels)

Master Method, Case 3

  f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0

The inverse of Case 1: f(n) grows polynomially faster than n^(log_b a). We also need a regularity condition: there exist c < 1 and n_0 such that a·f(n/b) ≤ c·f(n) for all n ≥ n_0.

The work at the root dominates:

  T(n) = Θ(f(n))        (division/recombination cost)

Master Theorem Summarized

Given a recurrence of the form T(n) = aT(n/b) + f(n):

1. f(n) = O(n^(log_b a − ε))   ⇒  T(n) = Θ(n^(log_b a))
2. f(n) = Θ(n^(log_b a))       ⇒  T(n) = Θ(n^(log_b a) · log n)
3. f(n) = Ω(n^(log_b a + ε)) and a·f(n/b) ≤ α·f(n) for some α < 1 and all n > n_0   ⇒  T(n) = Θ(f(n))

Strategy

1. Extract a, b, and f(n) from a given recurrence.
2. Determine n^(log_b a).
3. Compare f(n) and n^(log_b a) asymptotically.
4. Determine the appropriate master theorem case and apply it.

Merge sort: T(n) = 2T(n/2) + Θ(n)
  a = 2, b = 2, f(n) = Θ(n)
  n^(log_2 2) = n
  f(n) = Θ(n) = Θ(n^(log_2 2))  ⇒  Case 2: T(n) = Θ(n^(log_b a) · log n) = Θ(n log n)
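
This strategy can be mechanized for the common special case f(n) = Θ(n^d). The helper below is a hedged sketch (the function name and interface are mine, not from the slides): it compares d with log_b a and reports which case applies to T(n) = aT(n/b) + Θ(n^d):

  from math import log, isclose

  def master_case(a, b, d):
      crit = log(a, b)                  # n^(log_b a) is the watershed exponent
      if isclose(d, crit):
          return f"Case 2: T(n) = Theta(n^{crit:.3g} * log n)"
      if d < crit:
          return f"Case 1: T(n) = Theta(n^{crit:.3g})"
      return f"Case 3: T(n) = Theta(n^{d})"   # regularity holds since a/b^d < 1

  print(master_case(2, 2, 1))   # merge sort: Case 2 -> Theta(n log n)
  print(master_case(9, 3, 1))   # Case 1 -> Theta(n^2)
  print(master_case(1, 2, 0))   # one-sided binary search: Case 2 -> Theta(log n)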

Examples of Master Method

  BinarySearch(A, l, r, q):
    m := ⌊(l+r)/2⌋
    if A[m] = q then return m
    else if A[m] > q then return BinarySearch(A, l, m-1, q)
    else return BinarySearch(A, m+1, r, q)

  T(n) = T(n/2) + 1
  a = 1, b = 2, f(n) = 1
  n^(log_2 1) = n^0 = 1
  f(n) = Θ(1)  ⇒  Case 2: T(n) = Θ(log n)

Examples of Master Method/2

  T(n) = 9T(n/3) + n
  a = 9, b = 3, f(n) = n
  n^(log_3 9) = n^2
  f(n) = n = O(n^(log_3 9 − ε)) with ε = 1  ⇒  Case 1: T(n) = Θ(n^2)

Examples of Master Method/3

  T(n) = 3T(n/4) + n log n
  a = 3, b = 4, f(n) = n log n
  n^(log_4 3) ≈ n^0.792
  f(n) = n log n = Ω(n^(log_4 3 + ε)) with ε = 0.208  ⇒  Case 3

  Regularity condition: a·f(n/b) ≤ c·f(n)
  a·f(n/b) = 3(n/4) log(n/4) ≤ (3/4) n log n = c·f(n) with c = 3/4

  T(n) = Θ(n log n)

BinarySearchRec1

Find a number in a sorted array:
- Trivial if the array contains one element.
- Else divide into two equal halves and solve each half.
- Combine the results.

  INPUT: A[1..n] sorted array of integers, q an integer
  OUTPUT: an index j s.t. A[j] = q; NIL if ∀j (1 ≤ j ≤ n): A[j] ≠ q

  BinarySearchRec1(A, l, r, q):
    if l = r then
      if A[l] = q then return l else return NIL
    m := ⌊(l+r)/2⌋
    ret := BinarySearchRec1(A, l, m, q)
    if ret = NIL then return BinarySearchRec1(A, m+1, r, q)
    else return ret

T(n) of BinarySearchRec1

  T(n) = 1               if n = 1
  T(n) = 2T(n/2) + 1     if n > 1

Solving the recurrence yields T(n) = Θ(n).

BinarySearchRec2

T(n) = Θ(n) is not better than brute force! A better way to conquer: solve only one half!

  INPUT: A[1..n] sorted array of integers, q an integer
  OUTPUT: an index j s.t. A[j] = q; NIL if ∀j (1 ≤ j ≤ n): A[j] ≠ q

  BinarySearchRec2(A, l, r, q):
    if l = r then
      if A[l] = q then return l else return NIL
    m := ⌊(l+r)/2⌋
    if A[m] ≥ q then return BinarySearchRec2(A, l, m, q)
    else return BinarySearchRec2(A, m+1, r, q)

T(n) of BinarySearchRec2

  T(n) = 1              if n = 1
  T(n) = T(n/2) + 1     if n > 1

Solving the recurrence yields T(n) = Θ(log n).
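
To see the gap concretely, here is a small sketch (Python assumed; the call-counting helpers are hypothetical, not from the slides) that counts the recursive calls each variant makes in the worst case on an array of length n; the first grows linearly, the second logarithmically:

  def calls_rec1(n):
      # worst case for BinarySearchRec1 (both halves searched): C(1) = 1, C(n) = 1 + 2C(n/2)
      if n == 1:
          return 1
      return 1 + calls_rec1(n // 2) + calls_rec1(n - n // 2)

  def calls_rec2(n):
      # BinarySearchRec2 recurses into one half only: C(1) = 1, C(n) = 1 + C(n/2)
      if n == 1:
          return 1
      return 1 + calls_rec2(n - n // 2)

  for n in (16, 1024):
      print(n, calls_rec1(n), calls_rec2(n))   # 16: 31 vs 5 calls; 1024: 2047 vs 11 calls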

Example: Finding Min and Max

Given an unsorted array, find a minimum and a maximum element in the array.

  INPUT: A[l..r] an unsorted array of integers, l ≤ r
  OUTPUT: (min, max) s.t. ∀j (l ≤ j ≤ r): A[j] ≥ min and A[j] ≤ max

  MinMax(A, l, r):
    if l = r then return (A[l], A[r])                     // Trivial case
    m := ⌊(l+r)/2⌋                                        // Divide
    (minl, maxl) := MinMax(A, l, m)                       // Conquer
    (minr, maxr) := MinMax(A, m+1, r)                     // Conquer
    if minl < minr then min := minl else min := minr      // Combine
    if maxl > maxr then max := maxl else max := maxr      // Combine
    return (min, max)
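
A minimal Python version of MinMax (an illustration of the pseudocode above, not the slides' own code). Its recurrence is T(n) = 2T(n/2) + Θ(1); with a = b = 2 and f(n) = Θ(1) = O(n^(1−ε)), Case 1 of the master theorem gives T(n) = Θ(n):

  def min_max(A, l, r):
      if l == r:                               # trivial case
          return A[l], A[r]
      m = (l + r) // 2                         # divide
      min_l, max_l = min_max(A, l, m)          # conquer left half
      min_r, max_r = min_max(A, m + 1, r)      # conquer right half
      return min(min_l, min_r), max(max_l, max_r)   # combine

  print(min_max([7, 2, 9, 4, 4, 1, 8], 0, 6))   # (1, 9)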

Summary

- Divide and conquer
- Merge sort
- Tiling
- Recurrences: repeated substitution, substitution, master method
- Example recurrences: binary search

Next Chapter

Sorting: HeapSort, QuickSort