COMP 3170 Analysis of Algorithms & Data Structures


COMP 3170 - Analysis of Algorithms & Data Structures
Shahin Kamali
Lecture 5 - Jan. 12, 2018
University of Manitoba
Readings: CLRS 1.1, 1.2, 2.2, 3.1, 4.3, 4.5

Analysis of Recursions

The following is the sloppy recurrence for the time complexity of merge sort:

T(n) = 2T(n/2) + cn   if n > 1
T(n) = d              if n = 1

We can find the solution using the iteration (unrolling) method:

T(n) = 2T(n/2) + cn
     = 2(2T(n/4) + cn/2) + cn = 4T(n/4) + 2cn
     = 4(2T(n/8) + cn/4) + 2cn = 8T(n/8) + 3cn
     = ...
     = 2^k T(n/2^k) + kcn
     = 2^(log n) T(1) + (log n) · cn     (taking k = log n)
     = dn + cn log n ∈ Θ(n log n)
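The unrolling above can be checked numerically. A minimal sketch, assuming illustrative constants c = 3 and d = 5 (the constants are not from the slides), comparing the recurrence against the closed form dn + cn log n for powers of two:

```python
import math

# Sloppy merge-sort recurrence T(n) = 2T(n/2) + c*n, T(1) = d,
# with illustrative constants (an assumption of this sketch).
C, D = 3, 5

def T(n):
    if n == 1:
        return D
    return 2 * T(n // 2) + C * n

# Unrolling gives T(n) = D*n + C*n*log2(n) exactly for powers of two.
for n in [2, 16, 1024]:
    assert T(n) == D * n + C * n * math.log2(n)
```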

Substitution method

Guess the growth function and prove an upper bound for it by induction. For merge sort, prove T(n) ≤ Mn log n for some constant M (that we choose).

Base of induction: for n = 2 we have T(2) = 2d + 2c, which is at most M · 2 · log 2 = 2M as long as M ≥ c + d.

Inductive step: fix a value of n and assume the inequality holds for smaller values. We have

T(n) = 2T(n/2) + cn
     ≤ 2M(n/2) log(n/2) + cn     (by the induction hypothesis)
     = Mn log n − Mn + cn
     ≤ Mn log n

as long as M is selected to be no less than c. This shows T(n) ∈ O(n log n).
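The induction can also be sanity-checked numerically. A sketch, again assuming the illustrative constants c = 3 and d = 5, verifying T(n) ≤ Mn log n with the choice M = c + d from the base case:

```python
import math

# Illustrative constants (an assumption of this sketch).
C, D = 3, 5
M = C + D  # the base case forces M >= c + d

def T(n):
    return D if n == 1 else 2 * T(n // 2) + C * n

# The bound T(n) <= M * n * log2(n) should hold for all powers of two n >= 2.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) <= M * n * math.log2(n)
```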

Master theorem

For a ≥ 1, b > 1, and f(n) > 0, consider

T(n) = a T(n/b) + f(n)   if n > 1
T(n) = d                 if n = 1

Compare f(n) with n^(log_b a):

Case 1: if f(n) ∈ O(n^(log_b a − ε)) for some ε > 0, then T(n) ∈ Θ(n^(log_b a)).
Case 2: if f(n) ∈ Θ(n^(log_b a)), then T(n) ∈ Θ(n^(log_b a) log n).
Case 3: if f(n) ∈ Ω(n^(log_b a + ε)) for some ε > 0, and af(n/b) ≤ cf(n) for some constant c < 1 (regularity condition), then T(n) ∈ Θ(f(n)).
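For recurrences whose driving function is a plain polynomial f(n) = n^p, the three cases can be mechanized. A sketch (the function name and the restriction to polynomial f are assumptions of this example; for polynomial f, the case-3 regularity condition holds automatically):

```python
import math

def master_case(a, b, p):
    """Classify T(n) = a*T(n/b) + Theta(n^p) by the Master theorem.
    Restricted to polynomial driving functions f(n) = n^p."""
    crit = math.log(a, b)  # critical exponent log_b a
    if p < crit:
        return (1, f"Theta(n^{crit:.3g})")
    if p == crit:
        return (2, f"Theta(n^{crit:.3g} log n)")
    return (3, f"Theta(n^{p:.3g})")

# T(n) = 4T(n/4) + 100n  -> case 2, Theta(n log n)
# T(n) = 3T(n/2) + n^2   -> case 3, Theta(n^2)
```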

Master theorem examples

T(n) = 2T(n/2) + log n? Case 1 (log n ∈ O(n^(1−ε))): T(n) ∈ Θ(n).

T(n) = 4T(n/4) + 100n? Case 2 (n^(log_4 4) = n): T(n) ∈ Θ(n log n).

T(n) = 3T(n/2) + n²? Case 3; check whether the regularity condition holds, i.e., whether af(n/b) ≤ cf(n). Since 3(n/2)² = (3/4)n², the regularity condition holds with c = 3/4, so T(n) ∈ Θ(n²).

T(n) = T(n/2) + n(2 − cos n)? Candidate for Case 3; check whether the regularity condition holds. For n = 2kπ with k odd, we have cos(n/2) = −1 and cos(n) = 1, so af(n/b) = (n/2)(2 − cos(n/2)) = 3n/2, which is not within a factor c < 1 of f(n) = n(2 − 1) = n [i.e., we cannot have 3n/2 ≤ cn for any c < 1]. So we cannot get any conclusion from the Master theorem.
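The failure of regularity in the last example can be seen numerically. A small sketch evaluating the ratio af(n/b)/f(n) for f(n) = n(2 − cos n), with a = 1 and b = 2, along n = 2kπ for odd k, where the ratio is exactly 3/2:

```python
import math

def f(x):
    # driving function of the last example
    return x * (2 - math.cos(x))

# Along n = 2*k*pi with k odd: cos(n/2) = -1 and cos(n) = 1, so
# a*f(n/b) / f(n) = (3n/2) / n = 3/2 -- never <= c for any c < 1.
for k in [1, 101, 1001]:
    n = 2 * k * math.pi
    ratio = 1 * f(n / 2) / f(n)  # a = 1
    assert abs(ratio - 1.5) < 1e-9
```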

QuickSort

QuickSort is a sorting method developed by Hoare in 1960 (QuickSelect is based on the same partitioning idea):

quick-sort1(A)
A: array of size n
1. if n ≤ 1 then return
2. p ← choose-pivot1(A)
3. i ← partition(A, p)
4. quick-sort1(A[0, 1, ..., i − 1])
5. quick-sort1(A[i + 1, ..., size(A) − 1])

Here the pivot is chosen arbitrarily (e.g., it is the first item in the array).
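A runnable sketch of quick-sort1 in Python, with the first element as pivot. The Lomuto-style in-place partition below is one common way to realize the partition step; the helper names are my own, not from the slides:

```python
def partition(A, lo, hi):
    # Move the pivot A[lo] to its sorted position within A[lo..hi];
    # return that position.
    pivot = A[lo]
    i = lo
    for j in range(lo + 1, hi + 1):
        if A[j] < pivot:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[lo], A[i] = A[i], A[lo]
    return i

def quick_sort1(A, lo=0, hi=None):
    if hi is None:
        hi = len(A) - 1
    if lo >= hi:          # arrays of size <= 1 are already sorted
        return
    i = partition(A, lo, hi)
    quick_sort1(A, lo, i - 1)   # left part
    quick_sort1(A, i + 1, hi)   # right part

A = [5, 2, 9, 1, 5, 6]
quick_sort1(A)
# A is now [1, 2, 5, 5, 6, 9]
```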

Analysis of QuickSort

Worst case: T^worst(n) = T^worst(n − 1) + Θ(n). The algorithm has a running time of Θ(n²) in the worst case.

Best case: T^best(n) = T^best(⌊(n−1)/2⌋) + T^best(⌈(n−1)/2⌉) + Θ(n). As with merge sort, T^best(n) ∈ Θ(n log n).

Any other good case? Even T(n) = T(n/100) + T(99n/100) + cn belongs to Θ(n log n).

Average-case analysis of QuickSort

In a comparison-based sort, the running time is proportional to the total number of comparisons performed during partitioning.

Let X_n be a random variable denoting the number of comparisons made by QuickSort on an array of size n. Then

E[X_n] = expected number of comparisons
       = Σ_{0 ≤ i < j ≤ n−1} Pr[the i-th and j-th smallest elements are compared]

The i-th and j-th smallest elements are compared iff one of them is selected as a pivot at some point before any element in {i + 1, i + 2, ..., j − 1}. This occurs with probability 2/(j − i + 1).

Example: in the sequence 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, what is the chance that 3 and 7 are compared? It is the chance that none of 4, 5, 6 is selected as a pivot before 3 or 7, which is 2/5.

The expected number of comparisons is therefore Σ_{0 ≤ i < j ≤ n−1} 2/(j − i + 1).
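The 2/5 in the example can be verified by brute force: among the block {3, 4, 5, 6, 7}, elements 3 and 7 are compared exactly when the first of these five to become a pivot is 3 or 7. A sketch enumerating all orderings of the block:

```python
from itertools import permutations

# 3 and 7 are compared iff the first element of the block {3,...,7}
# to be chosen as a pivot is 3 or 7.
block = (3, 4, 5, 6, 7)
perms = list(permutations(block))
favorable = sum(1 for p in perms if p[0] in (3, 7))
probability = favorable / len(perms)  # 48/120 = 2/5
```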

For the expected number of comparisons of QuickSort, we have:

E[X_n] = Σ_{0 ≤ i < j ≤ n−1} 2/(j − i + 1)
       = Σ_{i=0}^{n−2} Σ_{j=i+1}^{n−1} 2/(j − i + 1)
       = Σ_{i=0}^{n−2} Σ_{k=2}^{n−i} 2/k        (substituting k = j − i + 1)
       < Σ_{i=0}^{n−2} Σ_{k=2}^{n} 2/k
       = Σ_{i=0}^{n−2} Θ(log n)
       = Θ(n log n)

Note that we used the fact that the partial sums of the harmonic series belong to Θ(log n).
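A quick numerical check of this bound; the sketch evaluates the exact pairwise sum and compares it against 2n ln n (the constant in the comparison is illustrative, not from the slides):

```python
import math

def expected_comparisons(n):
    # E[X_n] = sum over pairs i < j of 2/(j - i + 1)
    return sum(2 / (j - i + 1)
               for i in range(n - 1)
               for j in range(i + 1, n))

# The sum stays on the Theta(n log n) scale; for these n it sits
# below 2 * n * ln(n).
for n in [10, 100, 1000]:
    assert expected_comparisons(n) < 2 * n * math.log(n)
```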