Divide and Conquer. Andreas Klappenecker


The Divide and Conquer Paradigm The divide and conquer paradigm is an important general technique for designing algorithms. In general, it follows these steps: - divide the problem into subproblems - recursively solve the subproblems - combine the solutions to the subproblems to get a solution to the original problem

Mergesort

Example: Mergesort DIVIDE an input sequence of length n into two parts: the initial ⌈n/2⌉ elements and the final ⌊n/2⌋ elements. RECURSIVELY sort the two halves, using sequences with 1 key as the base case of the recursion. COMBINE the two sorted subsequences by merging them.
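The three steps above can be sketched in Python (a minimal illustration, not code from the slides):

```python
def mergesort(seq):
    # Base case of the recursion: sequences with at most 1 key are sorted.
    if len(seq) <= 1:
        return list(seq)
    # DIVIDE the input into two halves.
    mid = len(seq) // 2
    # RECURSIVELY sort the two halves.
    left = mergesort(seq[:mid])
    right = mergesort(seq[mid:])
    # COMBINE the two sorted subsequences by merging them.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

The merge loop does Θ(n) work per level of recursion, which is exactly the Θ(n) term in the recurrence for mergesort.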

Mergesort Example (example courtesy of Wikipedia)

Recurrence Relation for Mergesort Let T(n) be the worst-case running time of mergesort on a sequence of n keys. If n = 1, then T(n) = Θ(1) (constant). If n > 1, then T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + Θ(n).

Simplified Recurrence Relation Let T(n) be the worst-case running time of mergesort on a sequence of n keys. If n = 1, then T(n) = Θ(1) (constant). If n > 1 and n is even, then T(n) = 2 T(n/2) + Θ(n). [Indeed, we recursively sort the two subproblems of size n/2, and we need Θ(n) time to do the merge.]

Recurrence Relations

Master Theorem Consider a recurrence of the form T(n) = a T(n/b) + f(n) with a >= 1, b > 1, and f(n) eventually positive. a) If f(n) = O(n^(log_b(a) - ε)) for some ε > 0, then T(n) = Θ(n^log_b(a)). b) If f(n) = Θ(n^log_b(a)), then T(n) = Θ(n^log_b(a) log(n)). c) If f(n) = Ω(n^(log_b(a) + ε)) for some ε > 0 and f(n) is regular, then T(n) = Θ(f(n)). [f(n) is regular iff a f(n/b) <= c f(n) for some constant c < 1 and all but finitely many n.]

Roughly speaking... Essentially, the Master theorem compares the function f(n) with the function g(n) = n^log_b(a). Roughly, the theorem says: a) If f(n) << g(n), then T(n) = Θ(g(n)). b) If f(n) ≈ g(n), then T(n) = Θ(g(n) log(n)). c) If f(n) >> g(n), then T(n) = Θ(f(n)). Now go back and memorize the theorem!
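For recurrences where f(n) is a simple power n^d, this case analysis can be mechanized as a quick sanity check (an illustrative sketch; the function name and string output are my own, not from the slides):

```python
import math

def master_theorem_poly(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d), with a >= 1, b > 1, d >= 0.

    Only handles f(n) = n^d; for such f, the regularity condition
    of case c) holds automatically.
    """
    crit = math.log(a, b)                 # the critical exponent log_b(a)
    if d < crit:
        return f"Theta(n^{crit:.3f})"     # case a): f grows polynomially slower
    if d == crit:
        return f"Theta(n^{d} log n)"      # case b): f matches n^log_b(a)
    return f"Theta(n^{d})"                # case c): f grows polynomially faster
```

For instance, master_theorem_poly(2, 2, 1) reproduces the familiar Θ(n log n) bound for mergesort.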

Nothing is perfect The Master theorem does not cover all possible cases. For example, if f(n) = Θ(n^log_b(a) log n), then we lie between cases b) and c), but the theorem does not apply. There exist better versions of the Master theorem that cover more cases, but these are harder to memorize.

Idea of the Proof Let us iteratively substitute the recurrence: T(n) = a T(n/b) + f(n) = a^2 T(n/b^2) + a f(n/b) + f(n) = ... = a^log_b(n) T(1) + Σ_{i=0}^{log_b(n)-1} a^i f(n/b^i)

Idea of the Proof Thus, since a^log_b(n) = n^log_b(a), we obtained T(n) = n^log_b(a) T(1) + Σ_{i=0}^{log_b(n)-1} a^i f(n/b^i) The proof proceeds by distinguishing three cases: a) The first term is dominant: f(n) = O(n^(log_b(a) - ε)). b) Each part of the summation is equally dominant: f(n) = Θ(n^log_b(a)). c) The summation can be bounded by a geometric series: f(n) = Ω(n^(log_b(a) + ε)), and the regularity of f is key to make the argument work.

Fast Integer Multiplication

Integer Multiplication Elementary school algorithm (in binary): scan the second number from right to left; whenever you see a 1, add the first number to the result, shifted by the appropriate number of bits.

          101001 =   41
        x 101010 =   42
      -------------------
         1010010
        1010010
    + 1010010
      -------------------
      11010111010 = 1722
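The scan-and-shift procedure just described can be written directly in Python, with the language's built-in integers standing in for bit strings (an illustrative sketch):

```python
def school_multiply(x, y):
    """Multiply nonnegative integers x and y by scanning y's bits right to left."""
    result = 0
    shift = 0
    while y > 0:
        if y & 1:                  # current bit of y is 1
            result += x << shift   # add x, shifted into position
        y >>= 1                    # move to the next bit
        shift += 1
    return result
```

For n-bit inputs, the loop performs up to n additions of shifted n-bit numbers, which is where the quadratic cost of the school method comes from.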

Integer Multiplication The multiplication of two n-bit numbers takes Ω(n^2) time using the elementary school algorithm. Can we do better? Kolmogorov conjectured in one of his seminars that one cannot, but he was proved wrong by Karatsuba.

Divide and Conquer Let's split the two integers X and Y into two parts: their most significant part and their least significant part. X = 2^(n/2) A + B (where A and B are n/2-bit integers) Y = 2^(n/2) C + D (where C and D are n/2-bit integers) Then XY = 2^n AC + 2^(n/2) BC + 2^(n/2) AD + BD.
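The split and the four-product identity can be checked numerically (a small sketch; the helper name split is my own, not from the slides):

```python
def split(x, n):
    """Split an n-bit nonnegative integer x as x = 2^(n/2)*A + B (n even)."""
    half = n // 2
    return x >> half, x & ((1 << half) - 1)  # (A, B)

# Check XY = 2^n AC + 2^(n/2) BC + 2^(n/2) AD + BD for X = 41, Y = 42, n = 6.
X, Y, n = 41, 42, 6
A, B = split(X, n)
C, D = split(Y, n)
assert X * Y == (A * C << n) + (B * C << n // 2) + (A * D << n // 2) + B * D
```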

How Did We Do? Multiplication by 2^x can be done in hardware at very low cost (it is just a shift). We can apply this algorithm recursively: we replaced one multiplication of n-bit numbers by four multiplications of n/2-bit numbers, plus 3 shifts and 3 additions. Hence T(n) = 4T(n/2) + cn.

Solve the Recurrence T(n) = 4T(n/2) + cn By the Master theorem, g(n) = n^log_2(4) = n^2. Since cn = O(n^(2-ε)), we can conclude that T(n) = Θ(n^2).

How can we do better? Suppose that we are able to reduce the number of multiplications from 4 to 3, allowing for more additions and shifts (but still a constant number). Then T(n) = 3T(n/2) + dn Master theorem: dn = O(n^(log_2(3) - ε)), so we get T(n) = Θ(n^log_2(3)) = O(n^1.585).

Karatsuba's Idea Let's rewrite XY = 2^n AC + 2^(n/2) BC + 2^(n/2) AD + BD in the form XY = (2^n - 2^(n/2)) AC + 2^(n/2) (A+B)(C+D) + (1 - 2^(n/2)) BD. Done! Wow!

Summary Split the input X into two parts A and B such that X = 2^(n/2) A + B. Split the input Y into two parts C and D such that Y = 2^(n/2) C + D. Then calculate AC, (A+B)(C+D), and BD. Copy and shift the results, and add/subtract: XY = (2^n - 2^(n/2)) AC + 2^(n/2) (A+B)(C+D) + (1 - 2^(n/2)) BD.
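Putting the steps above together, a recursive Karatsuba routine might look like the following sketch (the base-case cutoff and bit-length handling are my own choices; for odd n it uses the equivalent form XY = 2^(2h) AC + 2^h ((A+B)(C+D) - AC - BD) + BD with h = ⌊n/2⌋):

```python
def karatsuba(x, y):
    """Multiply nonnegative integers using three recursive multiplications."""
    if x < 2 or y < 2:                       # base case: 0- or 1-bit operand
        return x * y
    n = max(x.bit_length(), y.bit_length())
    half = n // 2
    a, b = x >> half, x & ((1 << half) - 1)  # X = 2^half * A + B
    c, d = y >> half, y & ((1 << half) - 1)  # Y = 2^half * C + D
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    mid = karatsuba(a + b, c + d)            # (A+B)(C+D)
    # XY = 2^(2*half) AC + 2^half ((A+B)(C+D) - AC - BD) + BD
    return (ac << (2 * half)) + ((mid - ac - bd) << half) + bd
```

For example, karatsuba(41, 42) returns 1722. The three recursive calls give the recurrence T(n) = 3T(n/2 + O(1)) + O(n), and hence the O(n^1.585) bound.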