COMP 382: Reasoning about algorithms

Fall 2014 Unit 4: Basics of complexity analysis

Correctness and efficiency

So far, we have talked about correctness and termination of algorithms. What about efficiency?

Running time of an algorithm

- For a Turing machine: the number of transitions taken until an accept or reject state is reached.
- For a structured program: the number of evaluations of basic operations (skip, assignments, and basic arithmetic operations).

Polynomial time

Brute force: for many non-trivial problems, there is a natural brute-force search algorithm that checks every possible solution. This typically takes 2^N time or worse for inputs of size N.

Polynomial time

An algorithm runs in polynomial time if there exist constants c > 0 and d > 0 such that on every input of size N, its running time is bounded by c · N^d steps. When the input size doubles, the algorithm only slows down by a constant factor: c(2N)^d = 2^d · cN^d.

Worst-case analysis

- Worst-case running time: obtain a bound on the largest possible running time of the algorithm on inputs of a given size N. Downside: a draconian view.
- Average-case running time: obtain a bound on the running time of the algorithm on a random input, as a function of the input size N. Downside: hard to model real instances by random distributions.

Worst-case polynomial time

An algorithm is efficient if its running time is polynomial.

- In practice, poly-time algorithms almost always have low constants and low exponents.
- Breaking through the exponential barrier of brute force typically exposes some crucial structure of the problem.
- Exceptions: some poly-time algorithms are useless in practice, and some exponential-time algorithms are widely used because worst-case instances are rare.

Asymptotic order of growth

Assumptions: all functions f(n) that we consider have the properties f(n) > 1 and lim_{n → ∞} f(n) = ∞.

Asymptotic order of growth

- Upper bounds: T(n) is O(f(n)) if there exist constants c > 0 and n₀ ≥ 0 such that for all n ≥ n₀ we have T(n) ≤ c · f(n).
- Lower bounds: T(n) is Ω(f(n)) if there exist constants c > 0 and n₀ ≥ 0 such that for all n ≥ n₀ we have T(n) ≥ c · f(n).
- Tight bounds: T(n) is Θ(f(n)) if T(n) is both O(f(n)) and Ω(f(n)).

Examples

T(n) = 32n² + 17n + 32.

- T(n) is O(n²), O(n³), Ω(n²), Ω(n), and Θ(n²).
- T(n) is not O(n), Ω(n³), Θ(n), or Θ(n³).
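These bounds can be sanity-checked numerically. A minimal sketch, assuming nothing beyond the definitions above; the witnesses c = 81, n₀ = 1 (upper bound) and c = 32 (lower bound) are one valid choice among many:

```python
# Numeric sanity check (not a proof) for T(n) = 32n^2 + 17n + 32.
def T(n):
    return 32 * n**2 + 17 * n + 32

# Upper bound witness: T(n) <= 81 n^2 for all n >= 1,
# since 17n <= 17n^2 and 32 <= 32n^2 whenever n >= 1.
assert all(T(n) <= 81 * n**2 for n in range(1, 10_000))

# Lower bound witness: T(n) >= 32 n^2 for all n >= 0.
assert all(T(n) >= 32 * n**2 for n in range(0, 10_000))

# T(n) is not O(n): the ratio T(n)/n grows without bound.
print([T(n) // n for n in (10, 100, 1000)])  # [340, 3217, 32017]
```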

Notation

Slight abuse of notation: T(n) = O(f(n)).

Asymmetric: with f(n) = 5n³ and g(n) = 3n², we can write f(n) = O(n³) = g(n), even though f(n) ≠ g(n). What I actually mean: T(n) ∈ O(f(n)).

Properties: transitivity

- If f = O(g) and g = O(h), then f = O(h).
- If f = Ω(g) and g = Ω(h), then f = Ω(h).
- If f = Θ(g) and g = Θ(h), then f = Θ(h).

Properties: additivity

- If f = O(h) and g = O(h), then f + g = O(h).
- If f = Ω(h) and g = Ω(h), then f + g = Ω(h).
- If f = Θ(h) and g = Θ(h), then f + g = Θ(h).

Bounds through limits

Let f(n) and g(n) be two positive-valued functions such that lim_{n → ∞} f(n)/g(n) exists and equals some number c > 0. Then f(n) = Θ(g(n)).
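This limit rule lends itself to mechanical checking. A short sketch using the sympy library (an assumption on my part; sympy is not part of the course material):

```python
import sympy

n = sympy.symbols('n', positive=True)

# lim T(n)/n^2 = 32 > 0, confirming T(n) = Theta(n^2).
T = 32 * n**2 + 17 * n + 32
print(sympy.limit(T / n**2, n, sympy.oo))  # 32

# lim (n log n)/n^2 = 0: n log n grows strictly slower than n^2,
# so n log n is O(n^2) but not Theta(n^2).
print(sympy.limit(n * sympy.log(n) / n**2, n, sympy.oo))  # 0
```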

Complexity analysis: rules of thumb

- For all b > 1 and d > 0, we have log_b n = O(n^d).
- For all r > 1 and d > 0, we have n^d = O(r^n).

Common runtimes

Linear time, O(n log n) time, quadratic time, cubic time, O(n^k) time, exponential time, sublinear time. Check out http://bigocheatsheet.com/!

Exercise: upper and lower bounds

Input: an array A with integer entries A[1], ..., A[n].
Output: a 2-D array B such that B[i, j] contains the sum A[i] + A[i+1] + ... + A[j].

    for i = 1 to n {
        for j = i+1 to n {
            Add up entries A[i] through A[j]
            Store the result in B[i, j]
        }
    }

Obtain an upper bound and a lower bound for the running time of this algorithm. A Python transcription follows below.
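Here is a direct Python transcription of the pseudocode (indices shifted to Python's 0-based convention; `all_interval_sums` is a name chosen here, not from the slides):

```python
def all_interval_sums(A):
    """Fill B[i][j] with A[i] + A[i+1] + ... + A[j] for i < j.

    Entries with j <= i are left as None. The three nested loops
    make the O(n^3) upper bound immediate; the lower-bound proof
    on the next slide shows this is tight.
    """
    n = len(A)
    B = [[None] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            total = 0
            for k in range(i, j + 1):  # add up A[i] through A[j]
                total += A[k]
            B[i][j] = total            # store the result in B[i, j]
    return B

print(all_interval_sums([1, 2, 3, 4]))  # B[0][3] == 1+2+3+4 == 10
```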

Lower bound proof

Consider the times during the execution of the algorithm when i ≤ n/4 and j ≥ 3n/4. In these cases, j − i + 1 ≥ 3n/4 − n/4 + 1 > n/2. Therefore, adding up the array entries A[i] through A[j] requires at least n/2 operations, since there are more than n/2 terms to add up. There are (n/4)² pairs (i, j) with i ≤ n/4 and j ≥ 3n/4, and the algorithm enumerates over all of them. Therefore, the algorithm must perform at least (n/2) · (n/4)² = n³/32 operations. This is Ω(n³).

Complexity analysis of recursive algorithms

Key tool: recurrence equations.

Divide-and-conquer algorithms

Applicable even when a polynomial-time brute-force algorithm exists:

- Break up a problem of size n into multiple parts of size n/k.
- Solve the k parts recursively.
- Combine the solutions into an overall solution.

Mergesort

- Divide the array of size n into two halves.
- Recursively sort each half.
- Merge the two halves (in n steps) to make a sorted whole.
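A minimal Python sketch of this scheme, with a comparison counter added (the counter is an illustrative extra, not part of the algorithm) so the recurrence on the next slide can be checked empirically:

```python
def mergesort(a):
    """Return (sorted copy of a, number of comparisons made)."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, c_left = mergesort(a[:mid])        # recursively sort each half
    right, c_right = mergesort(a[mid:])
    merged, i, j, c = [], 0, 0, 0
    while i < len(left) and j < len(right):  # merge the two halves
        c += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, c_left + c_right + c

sorted_a, count = mergesort(list(range(1024, 0, -1)))
print(count <= 1024 * 10)  # True: at most n * log2(n) comparisons
```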

Complexity of mergesort

Let T(n) = number of comparisons to mergesort an input of size n:

    T(n) ≤ 0                              if n = 1
    T(n) ≤ T(⌈n/2⌉) + T(⌊n/2⌋) + n        otherwise

Solution: T(n) = O(n log₂ n).

Read the chapter on recurrences in Erickson's text:
http://www.cs.uiuc.edu/~jeffe/teaching/algorithms/notes/99-recurrences.pdf

Proof by recursion tree

Assuming n is a power of 2:

    T(n) = 0                 if n = 1
    T(n) = 2T(n/2) + n       otherwise (sorting both halves + merging)

Recursion tree: the root costs n (merging). Level k contains 2^k subproblems, each of size n/2^k, so the level costs 2^k · (n/2^k) = n in total. There are log₂ n levels, giving T(n) = n log₂ n.

Exercises

Solve the following recurrences:

1. T(n) = 4T(n/2) + n
2. T(n) = 8T(n/2) + n²
3. T(n) = 3T(n/2) + n
4. T(n) = √n · T(√n) + n
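Before solving these exactly, it can help to evaluate each recurrence numerically and watch the ratio against a conjectured growth rate. A rough sketch, assuming subproblem sizes are floored and base cases return 1:

```python
from functools import lru_cache
from math import isqrt

@lru_cache(maxsize=None)
def T1(n):  # exercise 1: T(n) = 4T(n/2) + n
    return 1 if n <= 1 else 4 * T1(n // 2) + n

@lru_cache(maxsize=None)
def T4(n):  # exercise 4: T(n) = sqrt(n) T(sqrt(n)) + n
    return 1 if n <= 2 else isqrt(n) * T4(isqrt(n)) + n

# The ratio T1(n)/n^2 settles near 2, suggesting Theta(n^2).
for n in (2**8, 2**12, 2**16):
    print(n, T1(n) / n**2)
# Compare T4(n) against n * log(log(n)) similarly.
```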

Proof by induction

Claim: Suppose n is a power of 2. If T(n) satisfies the recurrence

    T(n) = 0                        if n = 1
    T(n) = T(n/2) + T(n/2) + n      otherwise

then T(n) = n log₂ n.

Proof:
- Base case: n = 1.
- Inductive case: assuming T(n) = n log₂ n, show that T(2n) = 2n log₂ 2n.

The general case

Claim: If T(n) satisfies the recurrence

    T(n) = 0                              if n = 1
    T(n) ≤ T(⌈n/2⌉) + T(⌊n/2⌋) + n        otherwise

then T(n) ≤ n ⌈lg n⌉.

Proof by complete induction. Inductive case, with n₁ = ⌊n/2⌋ and n₂ = ⌈n/2⌉:

    T(n) ≤ T(n₁) + T(n₂) + n
         ≤ n₁ ⌈lg n₁⌉ + n₂ ⌈lg n₂⌉ + n
         ≤ n₁ ⌈lg n₂⌉ + n₂ ⌈lg n₂⌉ + n
         = n ⌈lg n₂⌉ + n
         ≤ n (⌈lg n⌉ − 1) + n
         = n ⌈lg n⌉

using the fact that n₂ = ⌈n/2⌉ ≤ ⌈2^⌈lg n⌉ / 2⌉ = 2^⌈lg n⌉ / 2, hence ⌈lg n₂⌉ ≤ ⌈lg n⌉ − 1.
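A quick numeric check of this claim for general (not just power-of-two) n, taking the recurrence with equality as its worst case (a sketch, not part of the proof):

```python
from functools import lru_cache
from math import ceil, log2

@lru_cache(maxsize=None)
def T(n):
    # Worst case of the recurrence: T(n) = T(ceil(n/2)) + T(floor(n/2)) + n
    return 0 if n <= 1 else T((n + 1) // 2) + T(n // 2) + n

# The claim T(n) <= n * ceil(lg n) holds for every tested n.
assert all(T(n) <= n * ceil(log2(n)) for n in range(2, 5000))
print(T(1000), 1000 * ceil(log2(1000)))  # 9976 10000
```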

Master's theorem

For recurrences of the form T(n) = a T(n/b) + f(n), where a ≥ 1 and b > 1:

- If f(n) = O(n^(log_b a − c)) for some constant c > 0, then T(n) = Θ(n^(log_b a)).
- If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · log n).
- If f(n) = Ω(n^(log_b a + c)) for some constant c > 0, and if a f(n/b) ≤ k f(n) for some k < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

Master's theorem: examples

1. T(n) = 9T(n/3) + n
2. T(n) = T(2n/3) + 1
3. T(n) = 3T(n/4) + n lg n
4. T(n) = 2T(n/2) + n lg n. Here Case 3 does not apply: f(n) = n lg n is not Ω(n^(1+c)) for any c > 0.
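For the common special case f(n) = n^d with no log factors, the three cases reduce to comparing d against log_b a. A hedged sketch (`master_poly` is a helper invented here; it deliberately cannot handle example 4 above, whose f(n) = n lg n falls between the cases):

```python
from math import isclose, log

def master_poly(a, b, d):
    """Solve T(n) = a T(n/b) + n^d via the master theorem.

    Only handles driving functions of the pure form f(n) = n^d.
    """
    e = log(a) / log(b)  # critical exponent log_b(a)
    if isclose(d, e):
        # case 2: all levels of the recursion tree cost about the same
        return f"Theta(n^{d:g} log n)"
    if d < e:
        # case 1: the leaves dominate
        return f"Theta(n^{e:g})"
    # case 3: the root dominates; regularity a (n/b)^d <= k n^d
    # holds automatically since a / b^d < 1 when d > log_b(a)
    return f"Theta(n^{d:g})"

print(master_poly(9, 3, 1))    # example 1: Theta(n^2)
print(master_poly(1, 1.5, 0))  # example 2: Theta(n^0 log n) = Theta(log n)
print(master_poly(2, 2, 2))    # T(n) = 2T(n/2) + n^2: Theta(n^2)
```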

Complexity: bits vs. words

What are we counting when giving complexity bounds?

Issues:
- Basic operations in executions of a structured program, or number of moves in executions of a Turing machine?
- Mergesort takes O(n lg n) comparisons, but a comparison isn't constant-time on a Turing machine: comparing two lg n-bit numbers takes about lg n bitwise operations.
- But neither can you assume arbitrary arithmetic operations take unit time.

Conventions:
- Usual assumption: unit-time operations on log n-bit words.
- Exceptions: in certain problems, like integer multiplication, use the bit-level model.