CS 341: Algorithms. Douglas R. Stinson. David R. Cheriton School of Computer Science University of Waterloo. January 16, 2019



Contents
1 Course Information
2 Introduction
3 Divide-and-Conquer Algorithms
4 Greedy Algorithms
5 Dynamic Programming Algorithms
6 Graph Algorithms
7 Intractability and Undecidability

Course Information
Table of Contents
1 Course Information

Course Information
Course mechanics
My Sections:
LEC 003, T Th 4:00-5:20, PHY 235
LEC 004, T Th 1:00-2:20, MC 2035
My Scheduled Office Hours: Tuesday, 11:00-12:00, DC 3522

Course Information
Course mechanics
Come to class! Not all the material will be on the slides or in the text.
You will need an account in the student.cs environment.
The course website can be found at https://www.student.cs.uwaterloo.ca/~cs341/
Syllabus, calendar, policies, etc. can be found there.

Course Information
Learn and Piazza
Slides and assignments will be available on the course website.
Grades and assignment solutions will be available on Learn.
Discussion related to the course will take place on Piazza (piazza.com):
general course questions, announcements, and assignment-related questions.
You will be getting an invitation via email to join Piazza in the first week of classes.
Keep up with the information posted on the course website, Learn and Piazza.

Course Information
Courtesy
Please silence cell phones and other mobile devices before coming to class.
Questions are encouraged, but please refrain from talking in class; it is distracting to your classmates who are trying to listen to the lectures and to your professor who is trying to think, talk and write at the same time.
Carefully consider whether using your laptop, iPad, smartphone, etc., in class will help you learn the material and follow the lectures.
Do not play games, tweet, watch YouTube videos, update your Facebook page or use a mobile device in any other way that will distract your classmates.

Course Information
Course syllabus
You are expected to be familiar with the contents of the course syllabus, available on the course home page.
If you haven't read it, read it as soon as possible.

Course Information
Plagiarism and academic offences
We take academic offences very seriously.
There is a good discussion of plagiarism online:
https://uwaterloo.ca/academic-integrity/sites/ca.academic-integrity/files/uploads/files/revised_undergraduate_tip_sheet_2013_final.pdf
Read this and understand it. Ignorance is no excuse!
Questions should be brought to the instructor.
Plagiarism applies to both text and code.
You are free (even encouraged) to exchange ideas, but no sharing code or text.

Course Information
Plagiarism (2)
Common mistakes:
Excess collaboration with other students: share ideas, but no design or code!
Using solutions from other sources (for example, from previous offerings of this course, even if written by yourself).
More information is linked from the course syllabus.

Course Information
Grading scheme for CS 341
Midterm (25%): Tuesday, Feb. 26, 2019, 7:00-8:50 PM
Assignments (30%): there will be five assignments. Work alone.
See the syllabus for reappraisal policies, the academic integrity policy, and other details.
Final (45%)
For medical conditions, you need to submit a Verification of Illness form.

Course Information
Assignments
All sections will have the same assignments, midterm and final exam.
Assignments will be due at 6:00 PM on the due date. No late submissions will be accepted.
You need to notify your instructor well before the due date of a severe, long-lasting or ongoing problem that prevents you from doing an assignment.

Course Information
Assignment due dates
Assignment 1: due Friday Jan. 25
Assignment 2: due Friday Feb. 8
Assignment 3: due Friday March 1
Assignment 4: due Friday March 22
Assignment 5: due Friday April 5

Course Information
Textbook
Introduction to Algorithms, Third Edition, by Cormen, Leiserson, Rivest and Stein, MIT Press, 2009.
You are expected to know all the material presented in class and the relevant textbook sections, as listed on the course website.

Introduction
Table of Contents
2 Introduction
Algorithm Design and Analysis
The 3SUM Problem
Reductions
Definitions and Terminology
Order Notation
Formulae
Loop Analysis Techniques

Algorithm Design and Analysis
Analysis of Algorithms
In this course, we study the design and analysis of algorithms.
Analysis refers to mathematical techniques for establishing both the correctness and efficiency of algorithms.
Correctness: the level of detail required in a proof of correctness depends greatly on the type and/or difficulty of the algorithm.

Algorithm Design and Analysis
Analysis of Algorithms (cont.)
Efficiency: given an algorithm A, we want to know how efficient it is. This includes several possible criteria:
What is the asymptotic complexity of algorithm A?
What is the exact number of specified computations done by A?
How does the average-case complexity of A compare to the worst-case complexity?
Is A the most efficient algorithm to solve the given problem? (For example, can we find a lower bound on the complexity of any algorithm to solve the given problem?)
Are there problems that cannot be solved efficiently? This topic is addressed in the theory of NP-completeness.
Are there problems that cannot be solved by any algorithm? Such problems are termed undecidable.

Algorithm Design and Analysis
Design of Algorithms
Design refers to general strategies for creating new algorithms.
If we have good design strategies, then it will be easier to end up with correct and efficient algorithms. Also, we want to avoid using ad hoc algorithms that are hard to analyze and understand.
Here are some examples of useful design strategies, many of which we will study:
reductions
divide-and-conquer
greedy
dynamic programming
depth-first and breadth-first search
local search
exhaustive search (backtracking, branch-and-bound)

The 3SUM Problem
The 3SUM Problem
Problem 2.1: 3SUM
Instance: an array A of n distinct integers, A = [A[1], ..., A[n]].
Question: do there exist three distinct elements in A whose sum equals 0?

The 3SUM problem has an obvious algorithm to solve it.

Algorithm: Trivial3SUM(A = [A[1], ..., A[n]])
  for i ← 1 to n - 2
    for j ← i + 1 to n - 1
      for k ← j + 1 to n
        if A[i] + A[j] + A[k] = 0 then output (i, j, k)

The complexity of Trivial3SUM is O(n^3).
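The pseudocode above translates directly into Python; this is a minimal illustrative sketch (0-based indices), not part of the original slides:

```python
def trivial_3sum(A):
    """Check every triple i < j < k; Theta(n^3) comparisons."""
    n = len(A)
    solutions = []
    for i in range(n - 2):
        for j in range(i + 1, n - 1):
            for k in range(j + 1, n):
                if A[i] + A[j] + A[k] == 0:
                    solutions.append((i, j, k))
    return solutions
```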

The 3SUM Problem
An Improvement
Instead of having three nested loops, suppose we have two nested loops (with indices i and j, say) and then we search for an A[k] for which A[i] + A[j] + A[k] = 0.
If we sequentially try all possible k-values, then we basically have the previous algorithm.
What can we do to make the search more efficient? What effect does this have on the complexity of the resulting algorithm?

The 3SUM Problem
An Improved Algorithm for the 3SUM Problem

Algorithm: Improved3SUM(A = [A[1], ..., A[n]])
  (B, π) ← sort(A)    comment: π is the permutation such that A[π(i)] = B[i] for all i
  for i ← 1 to n - 2
    for j ← i + 1 to n - 1
      perform a binary search for B[k] = -B[i] - B[j] in the subarray [B[j + 1], ..., B[n]]
      if the search is successful then output (π(i), π(j), π(k))

Note that π(i), π(j), π(k) are indices in the original array A.
The complexity of Improved3SUM is O(n log n + n^2 log n) = O(n^2 log n).
The n log n term is the sort. The n^2 log n term accounts for the n^2 binary searches.
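A Python sketch of this idea, using the standard-library `bisect` module for the binary search (an illustration under the slide's conventions, with 0-based indices and `order` playing the role of the permutation π):

```python
import bisect

def improved_3sum(A):
    """Sort, then binary-search for the third element; O(n^2 log n)."""
    # order[i] is the original index of the i-th smallest element: B[i] == A[order[i]]
    order = sorted(range(len(A)), key=lambda i: A[i])
    B = [A[i] for i in order]
    n = len(B)
    solutions = []
    for i in range(n - 2):
        for j in range(i + 1, n - 1):
            target = -B[i] - B[j]
            # binary search restricted to the subarray B[j+1 .. n-1]
            k = bisect.bisect_left(B, target, j + 1, n)
            if k < n and B[k] == target:
                solutions.append((order[i], order[j], order[k]))
    return solutions
```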

The 3SUM Problem
A Further Improvement
The sort is an example of pre-processing. It modifies the input to permit a more efficient algorithm to be used (binary search as opposed to linear search). Note that a pre-processing step is only done once.
However, there is a better way to make use of the sorted array B. Namely, for a given B[i], we simultaneously scan from both ends of the rest of the array looking for B[j] + B[k] = -B[i].
We start with j = i + 1 and k = n. At any stage of the algorithm, we either increment j or decrement k (or both, if B[i] + B[j] + B[k] = 0).
The resulting algorithm will have complexity O(n log n + n^2) = O(n^2).

The 3SUM Problem
A Quadratic Time Algorithm for the 3SUM Problem

Algorithm: Quadratic3SUM(A = [A[1], ..., A[n]])
  (B, π) ← sort(A)    comment: π is the permutation such that A[π(i)] = B[i] for all i
  for i ← 1 to n - 2
    j ← i + 1
    k ← n
    while j < k
      S ← B[i] + B[j] + B[k]
      if S < 0 then j ← j + 1
      else if S > 0 then k ← k - 1
      else
        output (π(i), π(j), π(k))
        j ← j + 1
        k ← k - 1
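The two-pointer scan in Python (again a sketch with 0-based indices; `order` stands in for π):

```python
def quadratic_3sum(A):
    """Two-pointer scan over the sorted array; O(n log n + n^2) = O(n^2)."""
    order = sorted(range(len(A)), key=lambda i: A[i])
    B = [A[i] for i in order]
    n = len(B)
    solutions = []
    for i in range(n - 2):
        j, k = i + 1, n - 1
        while j < k:
            S = B[i] + B[j] + B[k]
            if S < 0:
                j += 1          # sum too small: move the left pointer right
            elif S > 0:
                k -= 1          # sum too big: move the right pointer left
            else:
                solutions.append((order[i], order[j], order[k]))
                j += 1
                k -= 1
    return solutions
```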

The 3SUM Problem
Example
Consider the sorted array B = [-11, -10, -7, -3, 2, 4, 8, 10].
The algorithm will not find any solutions when i = 1:
  j = 2, k = 8: S = -11 - 10 + 10 = -11
  j = 3, k = 8: S = -11 - 7 + 10 = -8
  j = 4, k = 8: S = -11 - 3 + 10 = -4
  j = 5, k = 8: S = -11 + 2 + 10 = 1
  j = 5, k = 7: S = -11 + 2 + 8 = -1
  j = 6, k = 7: S = -11 + 4 + 8 = 1
When i = 2, we have
  j = 3, k = 8: S = -10 - 7 + 10 = -7
  j = 4, k = 8: S = -10 - 3 + 10 = -3
  j = 5, k = 8: S = -10 + 2 + 10 = 2
  j = 5, k = 7: S = -10 + 2 + 8 = 0, and a solution is output.

Reductions
Reductions
Suppose Π1 and Π2 are problems and we can use a hypothetical algorithm solving Π2 as a subroutine to solve Π1. Then we have a Turing reduction (or more simply, a reduction) from Π1 to Π2; this is denoted Π1 ≤ Π2.
The hypothetical algorithm solving Π2 is called an oracle. The reduction must treat the oracle as a black box. There may be more than one call made to the oracle in the reduction.
Suppose Π1 ≤ Π2 and we also have an algorithm A2 that solves Π2. If we plug A2 into the reduction, then we obtain an algorithm that solves Π1.
Reductions potentially allow re-using code, which may be advantageous.

Reductions
A Simple Reduction
Consider the algebraic identity
  xy = ((x + y)^2 - (x - y)^2) / 4.
This identity allows us to show that Multiplication ≤ Squaring.

Algorithm: MultiplicationtoSquaring(x, y)
  external ComputeSquare
  s ← ComputeSquare(x + y)
  t ← ComputeSquare(x - y)
  return ((s - t)/4)

Note that the division by 4 just consists of deleting the two low-order bits, which are guaranteed to be 00.
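The reduction can be sketched in Python; `compute_square` below is a stand-in for the oracle (any squaring routine could be plugged in), not something from the slides:

```python
def compute_square(x):
    # Stand-in for the squaring oracle; the reduction treats it as a black box.
    return x * x

def multiply_via_squaring(x, y):
    """xy = ((x + y)^2 - (x - y)^2) / 4; the division by 4 is always exact."""
    s = compute_square(x + y)
    t = compute_square(x - y)
    return (s - t) // 4
```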

Reductions
The Target3SUM Problem
Problem 2.2: Target3SUM
Instance: an array A of n distinct integers, A = [A[1], ..., A[n]], and a target T.
Question: do there exist three distinct elements in A whose sum equals T?

It is straightforward to modify any algorithm solving the 3SUM problem so it solves the Target3SUM problem.
Another approach is to find a reduction Target3SUM ≤ 3SUM. This would allow us to re-use code as opposed to modifying code.

Reductions
Target3SUM ≤ 3SUM
Observe that A[i] + A[j] + A[k] = T if and only if
  (A[i] - T/3) + (A[j] - T/3) + (A[k] - T/3) = 0.
This suggests the following approach, which works if T is divisible by three.

Algorithm: Target3SUMto3SUM(A = [A[1], ..., A[n]], T)
  comment: assume T is divisible by 3
  external 3SUM-solver
  for i ← 1 to n
    do B[i] ← A[i] - T/3
  return (3SUM-solver(B))

Modification: the transformation B[i] ← 3A[i] - T works for any integer T.
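A Python sketch of the general (any T) version of this reduction; `three_sum_solver` is a stand-in for the oracle, here filled in with a simple cubic-time search:

```python
def three_sum_solver(A):
    # Stand-in for the 3SUM oracle; any correct 3SUM algorithm could be used.
    n = len(A)
    for i in range(n - 2):
        for j in range(i + 1, n - 1):
            for k in range(j + 1, n):
                if A[i] + A[j] + A[k] == 0:
                    return (i, j, k)
    return None

def target_3sum(A, T):
    """Reduce Target3SUM to 3SUM via B[i] = 3*A[i] - T, which works for any integer T:
    A[i] + A[j] + A[k] = T  iff  B[i] + B[j] + B[k] = 0."""
    B = [3 * a - T for a in A]
    return three_sum_solver(B)
```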

Reductions
Complexity analysis
Suppose we replace the oracle 3SUM-solver by a real algorithm. What is the complexity of the resulting algorithm?
If we plug in Trivial3SUM, the complexity is O(n + n^3) = O(n^3).
If we plug in Improved3SUM, the complexity is O(n + n^2 log n) = O(n^2 log n).
If we plug in Quadratic3SUM, the complexity is O(n + n^2) = O(n^2).
In each case, it turns out that the n term is subsumed by the second term.

Reductions
Many-one Reductions
The reduction on the previous slide had a very special structure:
We transformed an instance of the first problem to an instance of the second problem.
We called the oracle once, on the transformed instance.
Reductions of this form, in the context of decision problems, are called many-one reductions (also known as polynomial transformations or Karp reductions).
We will see many examples of these in the section on intractability.

Reductions
The 3array3SUM Problem
Problem 2.3: 3array3SUM
Instance: three arrays of n distinct integers, A, B and C.
Question: do there exist array elements, one from each of A, B and C, whose sum equals 0?

Algorithm: 3array3SUMto3SUM(A, B, C)
  external 3SUM-solver
  for i ← 1 to n
    do D[i] ← 10A[i] + 1
       E[i] ← 10B[i] + 2
       F[i] ← 10C[i] - 3
  let A′ denote the concatenation of D, E and F
  if 3SUM-solver(A′) = (i, j, k)
    then return (i, j - n, k - 2n)
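In Python, the tagging step looks like this (a sketch with 0-based indices; `three_sum_solver` again stands in for the oracle):

```python
def three_sum_solver(A):
    # Stand-in for the 3SUM oracle; returns indices i < j < k with zero sum, or None.
    n = len(A)
    for i in range(n - 2):
        for j in range(i + 1, n - 1):
            for k in range(j + 1, n):
                if A[i] + A[j] + A[k] == 0:
                    return (i, j, k)
    return None

def three_array_3sum(A, B, C):
    """Tag the arrays so that, modulo 10, only one element from each of D, E, F
    can contribute to a zero sum (1 + 2 - 3 = 0)."""
    n = len(A)
    D = [10 * a + 1 for a in A]
    E = [10 * b + 2 for b in B]
    F = [10 * c - 3 for c in C]
    result = three_sum_solver(D + E + F)
    if result is None:
        return None
    i, j, k = result          # i < j < k forces i in D, j in E, k in F
    return (i, j - n, k - 2 * n)
```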

Reductions
3array3SUM ≤ 3SUM (cont.)
To show that 3array3SUMto3SUM is a reduction, we show that (i, j, k) is a solution to the instance A′ of 3SUM if and only if (i, j - n, k - 2n) is a solution to the instance A, B, C of 3array3SUM.
Assume first that (i′, j′, k′) is a solution to 3array3SUM. Then A[i′] + B[j′] + C[k′] = 0. Hence,
  D[i′] + E[j′] + F[k′] = 10A[i′] + 1 + 10B[j′] + 2 + 10C[k′] - 3 = 0,
and thus A′[i′] + A′[j′ + n] + A′[k′ + 2n] = 0.
Conversely, suppose that A′[i] + A′[j] + A′[k] = 0. We claim that this sum consists of one element from each of D, E and F. This can be proven by considering the sum modulo 10 and observing that the only way to get a sum that is divisible by 10 is 1 + 2 - 3 ≡ 0 (mod 10).
The result follows by translating back to the original arrays A, B and C.

Definitions and Terminology
Problems
Problem: given a problem instance I for a problem P, carry out a particular computational task.
Problem Instance: input for the specified problem.
Problem Solution: output (correct answer) for the specified problem.
Size of a problem instance: Size(I) is a positive integer which is a measure of the size of the instance I.

Definitions and Terminology
Algorithms and Programs
Algorithm: an algorithm is a step-by-step process (e.g., described in pseudocode) for carrying out a series of computations, given some appropriate input.
Algorithm solving a problem: an algorithm A solves a problem P if, for every instance I of P, A finds a valid solution for the instance I in finite time.
Program: a program is an implementation of an algorithm using a specified computer language.

Definitions and Terminology
Running Time
Running Time of a Program: T_M(I) denotes the running time (in seconds) of a program M on a problem instance I.
Worst-case Running Time as a Function of Input Size: T_M(n) denotes the maximum running time of program M on instances of size n:
  T_M(n) = max{T_M(I) : Size(I) = n}.
Average-case Running Time as a Function of Input Size: T_M^avg(n) denotes the average running time of program M over all instances of size n:
  T_M^avg(n) = (1 / |{I : Size(I) = n}|) · Σ_{I : Size(I) = n} T_M(I).

Definitions and Terminology
Complexity
Worst-case complexity of an algorithm: let f : Z+ → R. An algorithm A has worst-case complexity f(n) if there exists a program M implementing the algorithm A such that T_M(n) ∈ Θ(f(n)).
Average-case complexity of an algorithm: let f : Z+ → R. An algorithm A has average-case complexity f(n) if there exists a program M implementing the algorithm A such that T_M^avg(n) ∈ Θ(f(n)).

Definitions and Terminology
Running Time vs Complexity
Running time can only be determined by implementing a program and running it on a specific computer.
Running time is influenced by many factors, including the programming language, processor, operating system, etc.
Complexity (AKA growth rate) can be analyzed by high-level mathematical analysis. It is independent of the above-mentioned factors affecting running time.
Complexity is a less precise measure than running time since it is asymptotic and it incorporates unspecified constant factors and unspecified lower order terms.
However, if algorithm A has lower complexity than algorithm B, then a program implementing algorithm A will be faster than a program implementing algorithm B for sufficiently large inputs.

Order Notation
Order Notation
O-notation: f(n) ∈ O(g(n)) if there exist constants c > 0 and n0 > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0. Here the complexity of f is not higher than the complexity of g.
Ω-notation: f(n) ∈ Ω(g(n)) if there exist constants c > 0 and n0 > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0. Here the complexity of f is not lower than the complexity of g.
Θ-notation: f(n) ∈ Θ(g(n)) if there exist constants c1, c2 > 0 and n0 > 0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0. Here f and g have the same complexity.

Order Notation
Order Notation (cont.)
o-notation: f(n) ∈ o(g(n)) if for all constants c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0. Here f has lower complexity than g.
ω-notation: f(n) ∈ ω(g(n)) if for all constants c > 0, there exists a constant n0 > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0. Here f has higher complexity than g.

Order Notation
Exercises
1 Let f(n) = n^2 - 7n - 30. Prove from first principles that f(n) ∈ O(n^2).
2 Let f(n) = n^2 - 7n - 30. Prove from first principles that f(n) ∈ Ω(n^2).
3 Suppose f(n) = n^2 + n. Prove from first principles that f(n) ∉ O(n).

Order Notation
Techniques for Order Notation
Suppose that f(n) > 0 and g(n) > 0 for all n ≥ n0. Suppose that
  L = lim_{n→∞} f(n)/g(n).
Then
  f(n) ∈ o(g(n)) if L = 0,
  f(n) ∈ Θ(g(n)) if 0 < L < ∞,
  f(n) ∈ ω(g(n)) if L = ∞.

Order Notation
Exercises Using the Limit Method
1 Compare the growth rates of the functions (ln n)^2 and n^{1/2}.
2 Use the limit method to compare the growth rates of the functions n^2 and n^2 - 7n - 30.

Order Notation
Additional Exercises
1 Compare the growth rates of the functions (3 + (-1)^n)·n and n.
2 Compare the growth rates of the functions f(n) = n^{sin(πn/2) + 1} and g(n) = n.

Order Notation
Relationships between Order Notations
f(n) ∈ Θ(g(n)) if and only if g(n) ∈ Θ(f(n))
f(n) ∈ O(g(n)) if and only if g(n) ∈ Ω(f(n))
f(n) ∈ o(g(n)) if and only if g(n) ∈ ω(f(n))
f(n) ∈ Θ(g(n)) if and only if f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n))
f(n) ∈ o(g(n)) implies f(n) ∈ O(g(n))
f(n) ∈ ω(g(n)) implies f(n) ∈ Ω(g(n))

Order Notation
Algebra of Order Notations
Maximum rules: suppose that f(n) > 0 and g(n) > 0 for all n ≥ n0. Then:
  O(f(n) + g(n)) = O(max{f(n), g(n)})
  Θ(f(n) + g(n)) = Θ(max{f(n), g(n)})
  Ω(f(n) + g(n)) = Ω(max{f(n), g(n)})
Summation rules: suppose I is a finite set. Then:
  O(Σ_{i ∈ I} f(i)) = Σ_{i ∈ I} O(f(i))
  Θ(Σ_{i ∈ I} f(i)) = Σ_{i ∈ I} Θ(f(i))
  Ω(Σ_{i ∈ I} f(i)) = Σ_{i ∈ I} Ω(f(i))

Order Notation
Some Common Growth Rates (in increasing order)
Polynomial: Θ(1), Θ(log n), Θ(√n), Θ(n), Θ(n^2), Θ(n^c)
Θ(e^{c (log n)^{1/3} (log log n)^{2/3}}) (number field sieve)
Θ(n^{√(n / log_2 n)}) (graph isomorphism)
Exponential: Θ(1.1^n), Θ(2^n), Θ(e^n), Θ(n!), Θ(n^n)

Formulae
Sequences
Arithmetic sequence with d > 0:
  Σ_{i=0}^{n-1} (a + d·i) = n·a + d·n(n-1)/2 ∈ Θ(n^2).
Geometric sequence:
  Σ_{i=0}^{n-1} a·r^i = a·(r^n - 1)/(r - 1) ∈ Θ(r^n) if r > 1,
  Σ_{i=0}^{n-1} a·r^i = n·a ∈ Θ(n) if r = 1,
  Σ_{i=0}^{n-1} a·r^i = a·(1 - r^n)/(1 - r) ∈ Θ(1) if 0 < r < 1.

Formulae
Sequences (cont.)
Arithmetic-geometric sequence (provided that r ≠ 1):
  Σ_{i=0}^{n-1} (a + d·i)·r^i = (a - (a + (n-1)d)·r^n)/(1 - r) + d·r·(1 - r^{n-1})/(1 - r)^2.
Harmonic sequence:
  H_n = Σ_{i=1}^{n} 1/i ∈ Θ(log n).
More precisely, it is possible to prove that
  lim_{n→∞} (H_n - ln n) = γ,
where γ ≈ 0.57721 is Euler's constant.
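Both formulas are easy to sanity-check numerically; this quick Python sketch (not from the slides) compares the closed form of the arithmetic-geometric sum against a direct summation, and watches H_n - ln n approach γ:

```python
import math

def arith_geo_sum(a, d, r, n):
    """Closed form for sum_{i=0}^{n-1} (a + d*i) * r^i, valid for r != 1."""
    return (a - (a + (n - 1) * d) * r ** n) / (1 - r) \
        + d * r * (1 - r ** (n - 1)) / (1 - r) ** 2

def harmonic(n):
    """H_n = 1 + 1/2 + ... + 1/n; grows like ln(n) + gamma."""
    return sum(1.0 / i for i in range(1, n + 1))
```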

Formulae
Logarithm Formulae
1 log_b(xy) = log_b x + log_b y
2 log_b(x/y) = log_b x - log_b y
3 log_b(1/x) = -log_b x
4 log_b(x^y) = y·log_b x
5 log_b a = 1/(log_a b)
6 log_b a = (log_c a)/(log_c b)
7 a^{log_b c} = c^{log_b a}

Formulae
Miscellaneous Formulae
  n! ∈ Θ(n^{n + 1/2} · e^{-n})
  log n! ∈ Θ(n log n)
Another useful formula is
  Σ_{i=1}^{∞} 1/i^2 = π^2/6,
which implies that
  Σ_{i=1}^{n} 1/i^2 ∈ Θ(1).
A sum of powers of integers, when c ≥ 1:
  Σ_{i=1}^{n} i^c ∈ Θ(n^{c+1}).

Loop Analysis Techniques
Two General Strategies for Loop Analysis
Sometimes an O-bound is sufficient. However, we often want a precise Θ-bound. Two general strategies are as follows:
Use Θ-bounds throughout the analysis and thereby obtain a Θ-bound for the complexity of the algorithm.
Prove an O-bound and a matching Ω-bound separately to get a Θ-bound. Sometimes this technique is easier because arguments for O-bounds may use simpler upper bounds (and arguments for Ω-bounds may use simpler lower bounds) than arguments for Θ-bounds do.

Loop Analysis Techniques
Techniques for Loop Analysis
Identify elementary operations that require constant time (denoted Θ(1) time).
The complexity of a loop is expressed as the sum of the complexities of each iteration of the loop.
Analyze independent loops separately, and then add the results: use maximum rules and simplify whenever possible.
If loops are nested, start with the innermost loop and proceed outwards.
In general, this kind of analysis requires evaluation of nested summations.

Loop Analysis Techniques
Elementary Operations in the Unit Cost Model
For now, we will work in the unit cost model, where we assume that arithmetic operations such as +, -, × and integer division take time Θ(1).
This is a reasonable assumption for integers of bounded size (e.g., integers that fit into one word of memory).
If we want to consider the complexity of arithmetic operations on integers of arbitrary size, we need to consider bit complexity, where we express the complexity as a function of the length of the integers (as measured in bits). We will see some examples later, such as multiprecision multiplication.

Loop Analysis Techniques
Example of Loop Analysis

Algorithm: LoopAnalysis1(n : integer)
(1) sum ← 0
(2) for i ← 1 to n
      for j ← 1 to i
        do sum ← sum + (i - j)^2
      sum ← sum/i
(3) return (sum)

Θ-bound analysis:
(1) Θ(1)
(2) Complexity of the inner for loop: Θ(i). Complexity of the outer for loop: Σ_{i=1}^{n} Θ(i) = Θ(n^2).
(3) Θ(1)
Total: Θ(1) + Θ(n^2) + Θ(1) = Θ(n^2)
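A direct Python transcription (a sketch, not from the slides), together with a small helper that counts the inner-loop iterations so the Θ(n^2) bound can be checked empirically:

```python
def loop_analysis_1(n):
    """Direct transcription of LoopAnalysis1; Theta(n^2) elementary operations."""
    total = 0.0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            total += (i - j) ** 2
        total /= i          # the division happens once per outer iteration
    return total

def inner_iterations(n):
    """The inner loop body runs sum_{i=1}^{n} i = n(n+1)/2 times in all."""
    count = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            count += 1
    return count
```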

Loop Analysis Techniques
Example of Loop Analysis (cont.)
Proving separate O- and Ω-bounds
We focus on the two nested for loops (i.e., (2)). The total number of iterations is Σ_{i=1}^{n} i, with Θ(1) time per iteration.
Upper bound:
  Σ_{i=1}^{n} O(i) ≤ Σ_{i=1}^{n} O(n) = O(n^2).
Lower bound:
  Σ_{i=1}^{n} Ω(i) ≥ Σ_{i=n/2}^{n} Ω(i) ≥ Σ_{i=n/2}^{n} Ω(n/2) = Ω(n^2/4) = Ω(n^2).
Since the upper and lower bounds match, the complexity is Θ(n^2).

Loop Analysis Techniques
Another Example of Loop Analysis

Algorithm: LoopAnalysis2(A : array; n : integer)
  max ← 0
  for i ← 1 to n
    for j ← i to n
      sum ← 0
      for k ← i to j
        do sum ← sum + A[k]
      if sum > max then max ← sum
  return (max)
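In Python this pseudocode becomes the brute-force maximum-subarray-sum computation (floored at 0, since max starts there); this is an illustrative sketch, not part of the slides:

```python
def loop_analysis_2(A):
    """Sum every contiguous subarray A[i..j] from scratch; Theta(n^3) time.
    Returns the largest subarray sum seen, or 0 if all sums are negative."""
    n = len(A)
    best = 0
    for i in range(n):
        for j in range(i, n):
            s = 0
            for k in range(i, j + 1):   # recompute the sum for each (i, j) pair
                s += A[k]
            if s > best:
                best = s
    return best
```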

Loop Analysis Techniques
Another Example of Loop Analysis (cont.)
Θ-bound analysis
The innermost loop (for k) has complexity Θ(j - i + 1).
The next loop (for j) has complexity
  Σ_{j=i}^{n} Θ(j - i + 1) = Θ(Σ_{j=i}^{n} (j - i + 1)) = Θ(1 + 2 + ··· + (n - i + 1)) = Θ((n - i + 1)(n - i + 2)).
The outer loop (for i) has complexity
  Σ_{i=1}^{n} Θ((n - i + 1)(n - i + 2)) = Θ(Σ_{i=1}^{n} (n - i + 1)(n - i + 2))
    = Θ(1·2 + 2·3 + ··· + n(n + 1))
    = Θ(n^3/3 + n^2 + 2n/3)   (from Maple)
    = Θ(n^3).

Loop Analysis Techniques
Another Example of Loop Analysis (cont.)
Proving an Ω-bound
Consider two loop structures:
  L1: i = 1, ..., n/3;  j = 1 + 2n/3, ..., n;  k = 1 + n/3, ..., 1 + 2n/3
  L2: i = 1, ..., n;    j = i, ..., n;         k = i, ..., j
It is easy to see that L1 ⊆ L2, where L2 is the loop structure of the algorithm. This is because the algorithm examines all triples (i, k, j) with 1 ≤ i ≤ k ≤ j ≤ n.
There are (n/3)^3 = n^3/27 iterations in L1. Therefore the number of iterations in L2 is Ω(n^3).

Loop Analysis Techniques
Yet Another Example of Loop Analysis

Algorithm: LoopAnalysis3(n : integer)
  sum ← 0
  for i ← 1 to n
    j ← i
    while j ≥ 1
      do sum ← sum + i/j
         j ← ⌊j/2⌋
  return (sum)
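A Python sketch of this algorithm (an illustration, not from the slides; i/j is taken as integer division, consistent with the unit cost model). Since j is halved each time, iteration i of the outer loop costs ⌊log_2 i⌋ + 1 inner iterations, so the total work is Θ(n log n); the returned counter makes that easy to check:

```python
def loop_analysis_3(n):
    """Returns (sum, count): the algorithm's result and the number of
    inner-loop iterations, which is sum_{i=1}^{n} (floor(log2 i) + 1)."""
    total = 0
    count = 0
    for i in range(1, n + 1):
        j = i
        while j >= 1:
            total += i // j   # assumed integer division
            j //= 2           # halving gives the logarithmic inner loop
            count += 1
    return total, count
```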