CSC 421: Algorithm Design & Analysis. Spring 2015

Complexity & lower bounds: brute force, decision trees, adversary arguments, problem reduction

Lower bounds on problems

When studying a problem, we may wish to establish a lower bound on its efficiency:
- for searching a sorted list, can we do better than binary search? O(log N)
- for sorting a list, can we do better than merge/quick/heap sorts? O(N log N)

Establishing a lower bound for a problem can tell us:
- when a particular algorithm is as good as possible
- when the problem is intractable (by showing that the best possible algorithm is bad)

Methods for establishing lower bounds:
- brute force
- information-theoretic arguments (decision trees)
- adversary arguments
- problem reduction

Brute force arguments

Sometimes a problem-specific approach works.

Example: polynomial evaluation. Evaluating p(x) = a_N x^N + a_(N-1) x^(N-1) + ... + a_0 requires Ω(N) steps, since each coefficient must be processed.

Example: Towers of Hanoi puzzle. One can prove, by induction, that moving a tower of size N requires Ω(2^N) steps.

Information-theoretic arguments

Can sometimes establish a lower bound based on the amount of information the solution must produce.

Example: guess a randomly selected number between 1 and N, with possible responses of "correct", "too low", or "too high". The amount of uncertainty is log2 N, the number of bits needed to specify the selected number (e.g., N = 127 → 7 bits). Each answer to a question yields at most 1 bit of information:
- if a guess of 64 yields "too high", then the 1st bit must be 0 → 0xxxxxx
- if the next guess of 32 yields "too low", then the 2nd bit must be 1 → 01xxxxx
- if the next guess of 48 yields "too low", then the 3rd bit must be 1 → 011xxxx
- ...

Thus, log2 N is a lower bound on the number of questions.
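To make the counting argument concrete, here is a small Python sketch (an illustration added to these notes, not from the slides): the binary-search guessing strategy asks at most ⌈log2 N⌉ questions in the worst case, matching the lower bound, so the bound is tight for this game.

```python
import math

def guess_number(secret, n):
    """Binary-search guessing for a secret number in 1..n; returns how
    many questions were asked. Each "too low"/"too high"/"correct"
    answer yields at most one bit, so ceil(log2 n) questions are
    necessary; this strategy shows that many also suffice."""
    lo, hi = 1, n
    questions = 0
    while True:
        guess = (lo + hi) // 2
        questions += 1
        if guess == secret:
            return questions
        elif guess < secret:   # response: "too low"
            lo = guess + 1
        else:                  # response: "too high"
            hi = guess - 1

n = 127
worst = max(guess_number(s, n) for s in range(1, n + 1))
print(worst)  # 7, matching ceil(log2 127) = 7
```

For N = 127 the tree of guesses is perfectly balanced (127 = 2^7 - 1), so every secret is found in at most 7 questions.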

Decision trees

A useful structure for information-theoretic arguments is a decision tree.

Example: guessing a number between 1 and 15.
[decision tree figure: the root guesses 8; its children guess 4 and 12; the next level guesses 2, 6, 10, and 14; the 15 outcomes are the binary codes 0001 (1) through 1111 (15), reached within four levels of guesses]

In general, a decision tree is a model of algorithms involving comparisons:
- internal nodes represent comparisons
- leaves represent outcomes

e.g., decision tree for 3-element insertion sort:
[decision tree figure: the root compares a < b; further comparisons of b, c and a, c lead to the 3! = 6 leaf orderings a < b < c, a < c < b, c < a < b, b < a < c, b < c < a, c < b < a]
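The height of the insertion-sort decision tree can be checked by brute force. The following sketch (an illustration added here; the function name is mine) runs insertion sort on all 3! = 6 inputs and confirms the worst-case number of comparisons equals ⌈log2 3!⌉ = 3.

```python
import math
from itertools import permutations

def insertion_sort_comparisons(a):
    """Insertion sort that counts element comparisons (the decision
    tree's internal nodes); returns the count."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comparisons += 1
            if a[j - 1] <= a[j]:
                break                          # element is in place
            a[j - 1], a[j] = a[j], a[j - 1]    # swap backwards
            j -= 1
    return comparisons

worst = max(insertion_sort_comparisons(p) for p in permutations([1, 2, 3]))
print(worst)  # 3, matching ceil(log2 3!) = 3
```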

Decision trees & sorting

Note that any comparison-based sorting algorithm can be represented by a decision tree:
- number of leaves (outcomes) ≥ N!
- height of a binary tree with N! leaves ≥ log2 N!
- therefore, the minimum number of comparisons required by any comparison-based sorting algorithm is ≥ log2 N!
- since log2 N! ≈ N log2 N (proof not shown), the lower bound Ω(N log N) is tight

Thus, merge/quick/heap sorts are as good as it gets.

Decision trees & searching

Similarly, we can use a decision tree to show that binary search is as good as it gets (assuming the list is sorted). In the decision tree for binary search of a 4-element list:
- internal nodes are found elements
- leaves are the ranges if not found
- number of leaves (ranges where not found) = N + 1
- height of a binary tree with N + 1 leaves ≥ log2 (N + 1)
- therefore, the minimum number of comparisons required by any comparison-based searching algorithm is ≥ log2 (N + 1), so the lower bound Ω(log N) is tight
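A quick way to see the size of the sorting bound (an added illustration, not from the slides) is to compute ⌈log2 N!⌉ directly and set it beside N log2 N:

```python
import math

# Any comparison sort needs at least ceil(log2(N!)) comparisons;
# by Stirling's approximation, log2(N!) grows like N log2 N.
bounds = {n: math.ceil(math.log2(math.factorial(n))) for n in (4, 8, 16)}
for n, bound in bounds.items():
    print(f"N={n}: at least {bound} comparisons (N log2 N = {n * math.log2(n):.0f})")
```

The exact bound trails N log2 N by a constant factor for small N but has the same order of growth.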

Adversary arguments

Using an adversary argument, you repeatedly adjust the input to make an algorithm work the hardest.

Example: dishonest hangman. The adversary always puts the word in the larger of the subsets generated by the last guess; for a given dictionary, this determines a lower bound on the number of guesses.

Example: merging two sorted lists of size N (as in merge sort). The adversary makes it so that neither list "runs out" of values (e.g., a_i < b_j iff i < j), forcing 2N - 1 comparisons to produce b_1 < a_1 < b_2 < a_2 < ... < b_N < a_N.

Problem reduction

Problem reduction uses a transform & conquer approach: if we can show that problem P is at least as hard as problem Q, then a lower bound for Q is also a lower bound for P. i.e., hard(P) ≥ hard(Q) → if Q is Ω(X), so is P.

In general, to prove a lower bound for P:
1. find a problem Q with a known lower bound
2. reduce that problem to problem P, i.e., show that we can solve Q by solving an instance of P
3. then P is at least as hard as Q, so the same lower bound applies

Example: prove that multiplication (of N-bit numbers) is Ω(N).
1. squaring an N-bit number is known to be Ω(N)
2. can reduce squaring to multiplication: x^2 = x * x
3. then multiplication is at least as hard as squaring, so it is also Ω(N)

REASONING: if multiplication could be solved in O(X) where X < N, then we could compute x^2 by doing x * x in O(X) < O(N), contradicting squaring's Ω(N) bound.
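The adversarial merge input can be constructed explicitly. In this sketch (added for illustration; `merge_count` is my name, not from the slides), taking a_i = 2i and b_j = 2j - 1 gives a_i < b_j iff i < j, and a standard merge is forced into exactly 2N - 1 comparisons:

```python
def merge_count(a, b):
    """Standard two-way merge of sorted lists that also counts the
    element comparisons it performs."""
    i = j = comparisons = 0
    out = []
    while i < len(a) and j < len(b):
        comparisons += 1
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out += a[i:] + b[j:]   # one list is exhausted; copy the rest
    return out, comparisons

N = 10
a = list(range(2, 2 * N + 1, 2))   # a_i = 2i:     2, 4, ..., 20
b = list(range(1, 2 * N, 2))       # b_j = 2j - 1: 1, 3, ..., 19
merged, c = merge_count(a, b)
print(c)  # 19 = 2N - 1: every output before the last costs a comparison
```

Because the values strictly interleave (b_1 < a_1 < b_2 < ... < b_N < a_N), neither list runs out until 2N - 1 elements have been emitted, and each of those emissions costs one comparison.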

Problem reduction example

CLOSEST NUMBERS (CN) PROBLEM: given N numbers, find the two closest numbers.

1. consider the ELEMENT UNIQUENESS (EU) problem: given a list of N numbers, determine if all are unique (no dupes); this problem has been shown to have a lower bound of Ω(N log N)
2. can reduce EU to CN: given an instance of EU with numbers e_1, ..., e_N, find the two closest numbers (this is an instance of CN); if the distance between them is > 0, then e_1, ..., e_N are unique
3. this shows that CN is at least as hard as EU: we can solve an instance of EU by performing a transformation & solving CN; since the transformation is O(N), CN must also have a lower bound of Ω(N log N)

REASONING: if CN could be solved in O(X) where X < N log N, then we could solve EU by transforming & solving CN in O(N) + O(X) < O(N log N), contradicting EU's Ω(N log N) bound.

Another example

CLOSEST POINTS (CP) PROBLEM: given N points in the plane, find the two closest points.

1. consider the CLOSEST NUMBERS (CN) problem: we just showed that CN has a lower bound of Ω(N log N)
2. can reduce CN to CP: given an instance of CN with numbers e_1, ..., e_N, construct the N points (e_1, 0), ..., (e_N, 0) and find the two closest points (this is an instance of CP); if (e_i, 0) and (e_j, 0) are the closest points, then e_i and e_j are the closest numbers
3. this shows that CP is at least as hard as CN: we can solve an instance of CN by performing a transformation & solving CP; since the transformation is O(N), CP must also have a lower bound of Ω(N log N)

REASONING: if CP could be solved in O(X) where X < N log N, then we could solve CN by transforming & solving CP in O(N) + O(X) < O(N log N), contradicting CN's Ω(N log N) bound.
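The EU-to-CN reduction is short enough to run. In this sketch (function names are mine; the brute-force CN solver is just a stand-in, since the reduction only needs some correct CN solver), EU is answered by solving the CN instance and testing whether the closest pair's distance is positive:

```python
from itertools import combinations

def closest_numbers(nums):
    """Stand-in CN solver: return the pair with the smallest absolute
    difference (brute force; any correct CN solver works here)."""
    return min(combinations(nums, 2), key=lambda p: abs(p[1] - p[0]))

def element_uniqueness(nums):
    """EU reduced to CN: the elements are all unique iff the two
    closest ones are a positive distance apart."""
    x, y = closest_numbers(nums)
    return abs(y - x) > 0

print(element_uniqueness([3, 1, 4, 1, 5]))  # False: 1 appears twice
print(element_uniqueness([3, 1, 4, 2, 5]))  # True: all distinct
```

The reduction itself is O(N) (the EU input is passed through unchanged), which is what lets EU's Ω(N log N) bound transfer to CN.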

Tightness

Note: if an algorithm is Ω(N log N), then it is also Ω(N). Are the Ω(N log N) lower bounds tight for the CLOSEST NUMBERS and CLOSEST POINTS problems?
- can you devise an O(N log N) algorithm for CLOSEST NUMBERS?
- can you devise an O(N log N) algorithm for CLOSEST POINTS?
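For CLOSEST NUMBERS the answer is yes. Here is a sketch of one such O(N log N) algorithm (added as a possible answer; it is not given in the slides): sort, then scan adjacent pairs, since after sorting the two closest values must be adjacent.

```python
def closest_numbers(nums):
    """O(N log N) CLOSEST NUMBERS: sort (O(N log N)), then one linear
    scan of the N - 1 adjacent gaps. This matches the Omega(N log N)
    lower bound, so the bound is tight."""
    s = sorted(nums)
    return min(zip(s, s[1:]), key=lambda p: p[1] - p[0])

print(closest_numbers([31, 7, 12, 50, 9]))  # (7, 9): smallest gap is 2
```

(CLOSEST POINTS also admits an O(N log N) algorithm, via divide and conquer, so that bound is tight as well.)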