COMP 9024, Class notes, 11s2, Class 1


John Plaice
Sun Jul 31 2011 (EST)

In this course, you will need to know a bit of mathematics. We cover in today's lecture the basics. Some of this material is covered in your textbook.

1 Functions

You should be familiar with the following functions:

    constant:     f(n) = c
    logarithmic:  f(n) = log n
    linear:       f(n) = n
    linearithmic: f(n) = n log n
    quadratic:    f(n) = n^2
    cubic:        f(n) = n^3
    exponential:  f(n) = b^n

You should also know these two functions:

    floor:   f(x) = ⌊x⌋
    ceiling: f(x) = ⌈x⌉

2 Properties

You should know some basic properties of the exponential and logarithmic functions. First the exponential:

Proposition 1. For all a, b, c ∈ R, b > 0,

    b^a b^c = b^(a+c)
    b^a / b^c = b^(a-c)
    (b^a)^c = b^(ac)

For a, c ∈ N, this can be proven by induction. Extending these results to Z, then Q, then R requires deeper mathematical results.

Now for the logarithm:

Proposition 2. For all a, b, c ∈ R, b > 1, a > 0, c > 0,

    log_b (ac) = log_b a + log_b c

Proof.

    b^(log_b (ac)) = ac = b^(log_b a) b^(log_b c) = b^(log_b a + log_b c)

so log_b (ac) = log_b a + log_b c.

Proposition 3. For all a, b, c ∈ R, b > 1, a > 0, c > 0,

    log_b (a/c) = log_b a - log_b c

Proof.

    b^(log_b (a/c)) = a/c = b^(log_b a) / b^(log_b c) = b^(log_b a - log_b c)

so log_b (a/c) = log_b a - log_b c.

Proposition 4. For all a, b, c ∈ R, b > 1, a > 0, c > 0,

    log_b (a^c) = c log_b a

Proof.

    b^(log_b (a^c)) = a^c = (b^(log_b a))^c = b^(c log_b a)

so log_b (a^c) = c log_b a.

Proposition 5. For all a, b, d ∈ R, b > 1, d > 1, a > 0,

    log_d a = log_d b · log_b a

Proof.

    a = b^(log_b a) = (d^(log_d b))^(log_b a) = d^(log_d b · log_b a)

so log_d a = log_d b · log_b a.
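These identities are easy to sanity-check numerically. Here is a small Python sketch (not part of the original notes; the constants are arbitrary):

```python
import math

a, b, c, d = 20.0, 2.0, 5.0, 10.0

# Proposition 2: log_b(ac) = log_b a + log_b c
assert math.isclose(math.log(a * c, b), math.log(a, b) + math.log(c, b))

# Proposition 3: log_b(a/c) = log_b a - log_b c
assert math.isclose(math.log(a / c, b), math.log(a, b) - math.log(c, b))

# Proposition 4: log_b(a^c) = c log_b a
assert math.isclose(math.log(a ** c, b), c * math.log(a, b))

# Proposition 5: log_d a = log_d b * log_b a  (change of base)
assert math.isclose(math.log(a, d), math.log(b, d) * math.log(a, b))

print("all log identities hold")
```

Such checks do not prove anything, but they are a quick way to catch a misremembered identity.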

Proposition. For all a, b, d R, b > 1, d > 1, a > 0, b log d a = a log d b b log d a = b (log d b log b a) = b (log b a log d b) = ( b log a) log b d b = a log d b 3 Arithmetic series To prove the complexity of an algorithm, one needs to be able to count. Below are some important identities for arithmetic series. Proposition 7. For all n N, Proof by induction on n. i = n(n + 1) Case n = 0. 0 i = 0 = 0(0 + 1) Case n = N + 1. Then By the induction hypothesis, Suppose for induction that N i = N+1 i = N(N + 1) N i + (N + 1) N(N + 1) (N + 1) = + (N + 1)(N + ) = = (N + 1)( (N + 1) + 1 ) i = n(n + 1) Proposition 8. For all n N, Proof by induction on n. i = n(n + 1)(n + 1) 3

Case n = 0.

    Σ_{i=1}^0 i^2 = 0 = 0(0+1)(2·0+1)/6

Case n = N+1. Suppose for induction that

    Σ_{i=1}^N i^2 = N(N+1)(2N+1)/6

Then

    Σ_{i=1}^{N+1} i^2 = Σ_{i=1}^N i^2 + (N+1)^2

By the induction hypothesis,

    Σ_{i=1}^{N+1} i^2 = N(N+1)(2N+1)/6 + (N+1)^2
                      = (N+1)(2N^2 + 7N + 6)/6
                      = (N+1)(N+2)(2N+3)/6
                      = (N+1)((N+1)+1)(2(N+1)+1)/6

Hence for all n ∈ N, Σ_{i=1}^n i^2 = n(n+1)(2n+1)/6.

Proposition 9. For all n ∈ N,

    Σ_{i=1}^n i^3 = (n(n+1)/2)^2 = n^2 (n+1)^2 / 4

Proof by induction on n.

Case n = 0.

    Σ_{i=1}^0 i^3 = 0 = 0^2 (0+1)^2 / 4

Case n = N+1. Suppose for induction that

    Σ_{i=1}^N i^3 = N^2 (N+1)^2 / 4

Then, by the induction hypothesis,

    Σ_{i=1}^{N+1} i^3 = Σ_{i=1}^N i^3 + (N+1)^3
                      = N^2 (N+1)^2 / 4 + (N+1)^3
                      = (N+1)^2 (N^2 + 4N + 4) / 4
                      = (N+1)^2 (N+2)^2 / 4
                      = (N+1)^2 ((N+1)+1)^2 / 4

Hence for all n ∈ N, Σ_{i=1}^n i^3 = n^2 (n+1)^2 / 4.

4 Geometric series

Below are some important identities for geometric series.

Proposition 10. For all a ∈ R, a ≠ 1,

    Σ_{i=0}^n a^i = (a^(n+1) - 1) / (a - 1)

Proof.

    a^(n+1) - 1 = a^(n+1) + (a^n - a^n) + ... + (a - a) - 1
                = a · Σ_{i=0}^n a^i - Σ_{i=0}^n a^i
                = (a - 1) · Σ_{i=0}^n a^i

so Σ_{i=0}^n a^i = (a^(n+1) - 1) / (a - 1).

Proposition 11. For all a ∈ R, 0 < a < 1,

    Σ_{i=0}^∞ a^i = 1 / (1 - a)

Proof.

    Σ_{i=0}^∞ a^i = lim_{n→∞} Σ_{i=0}^n a^i
                  = lim_{n→∞} (a^(n+1) - 1) / (a - 1)
                  = (0 - 1) / (a - 1)
                  = 1 / (1 - a)

5 Asymptotic complexity definitions

Definition 1. Let f, g : N → R. We say f(n) is O(g(n)) iff there exist c ∈ R, c > 0, and n_0 ∈ N, n_0 ≥ 1, such that for all n ≥ n_0, we have f(n) ≤ c·g(n).

Definition 2. Let f, g : N → R. We say f(n) is Ω(g(n)) iff there exist c ∈ R, c > 0, and n_0 ∈ N, n_0 ≥ 1, such that for all n ≥ n_0, we have f(n) ≥ c·g(n).

Definition 3. Let f, g : N → R. We say f(n) is Θ(g(n)) iff there exist c_1, c_2 ∈ R, c_1 > 0, c_2 > 0, and n_0 ∈ N, n_0 ≥ 1, such that for all n ≥ n_0, we have c_1·g(n) ≤ f(n) ≤ c_2·g(n).

We write O(1) for a constant function and n^O(1) for a polynomial function.

6 Asymptotic geometric series

Proposition 12. For all c ∈ R, c > 0,

    g(n) = Σ_{i=0}^n c^i  is  Θ(1)    if c < 1
                              Θ(n)    if c = 1
                              Θ(c^n)  if c > 1

Proof.

Case c < 1.

    Σ_{i=0}^n c^i < Σ_{i=0}^∞ c^i = 1/(1-c)

and lim_{n→∞} g(n) = 1/(1-c), so g(n) is Θ(1).

Case c = 1.

    g(n) = Σ_{i=0}^n 1 = n + 1

so g(n) is Θ(n).

Case c > 1. By Proposition 10, g(n) = (c^(n+1) - 1)/(c - 1). The term of highest degree dominates the other terms with respect to asymptotic complexity, so g(n) is Θ(c^n).
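The closed forms of Propositions 7 through 10 can be brute-force checked for small n. An illustrative Python sketch (not part of the notes; the geometric ratio a = 3 is an arbitrary choice):

```python
def check(n: int) -> None:
    ints = range(1, n + 1)
    # Proposition 7: sum of the first n integers
    assert sum(ints) == n * (n + 1) // 2
    # Proposition 8: sum of the first n squares
    assert sum(i * i for i in ints) == n * (n + 1) * (2 * n + 1) // 6
    # Proposition 9: sum of the first n cubes
    assert sum(i ** 3 for i in ints) == n ** 2 * (n + 1) ** 2 // 4
    # Proposition 10: geometric series, illustrated with a = 3
    a = 3
    assert sum(a ** i for i in range(n + 1)) == (a ** (n + 1) - 1) // (a - 1)

for n in range(50):
    check(n)
print("Propositions 7-10 verified for n = 0..49")
```

Again, this verifies only finitely many cases; the induction proofs above are what establish the identities for all n.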

7 Counting divide-and-conquer

Let a, b, d, n ∈ N, a > 0, b > 1. We consider, as a last example, a problem of size n which will be solved using the divide-and-conquer technique. At each stage, the problem is divided into a subproblems, each of size n/b, and the work done at that stage is O(n^d).

At level k of the subdividing, there will be a^k subproblems, each of size n/b^k. The cost for level k is therefore

    O( (n/b^k)^d · a^k ) = O( n^d · a^k / b^(dk) ) = O( n^d · (a/b^d)^k )

The total number of levels before getting down to subproblems of size 1 is log_b n. The total cost is therefore

    O( n^d · Σ_{k=0}^{log_b n} (a/b^d)^k )

Case a < b^d. The sum is Θ(1), so the total cost is O(n^d).

Case a = b^d. The sum is Θ(log_b n), so the total cost is O(n^d log n).

Case a > b^d. The sum is Θ( (a/b^d)^(log_b n) ), so the total cost is

    O( n^d · (a/b^d)^(log_b n) ) = O( n^d · a^(log_b n) / (b^d)^(log_b n) )
                                 = O( n^d · a^(log_b n) / n^d )
                                 = O( a^(log_b n) )
                                 = O( n^(log_b a) )

where the last step uses Proposition 6.
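The three cases can be observed numerically by expanding the recurrence T(n) = a·T(n/b) + n^d directly. A Python sketch (the parameter choices below are illustrative, not from the notes):

```python
import math

def cost(n, a, b, d):
    # Total work for T(n) = a * T(n/b) + n**d, with T(1) = 1.
    if n <= 1:
        return 1.0
    return a * cost(n // b, a, b, d) + float(n) ** d

n = 2 ** 16
# a < b**d: total cost Theta(n**d), e.g. a=1, b=2, d=1
print(cost(n, 1, 2, 1) / n)
# a == b**d: total cost Theta(n**d log n), e.g. a=2, b=2, d=1
print(cost(n, 2, 2, 1) / (n * math.log2(n)))
# a > b**d: total cost Theta(n**log_b(a)), e.g. a=4, b=2, d=1
print(cost(n, 4, 2, 1) / n ** 2)
```

Each printed ratio stays bounded as n grows, which is exactly what the Θ bounds for the three cases assert.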

8 Basic sorts

We had a brief look at the following sorts. They will be presented in more detail, with code, in future classes.

    bogosort:       Ω(n), unbounded worst case. Best case: sorted input.
    bubble sort:    Ω(n), O(n^2). Best case: sorted input. Worst case: reverse-sorted input.
    selection sort: Θ(n^2).
    insertion sort: Ω(n), O(n^2). Best case: sorted input. Worst case: reverse-sorted input.
    merge sort:     Θ(n log n).
    quicksort:      Ω(n log n), O(n^2). Worst case: sorted input.

Quicksort can be improved by taking as pivot the median of three randomly chosen inputs. In practice, quicksort is used for large data sets, and insertion sort for small data sets.
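As a preview of the code to come, here is one way insertion sort might be written, instrumented to count comparisons so the best and worst cases above become concrete. (A sketch with a hypothetical counting harness, not the version from future classes.)

```python
def insertion_sort(a):
    """Sort the list a in place; return the number of comparisons made."""
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements right until key's position is found.
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 100
best = insertion_sort(list(range(n)))          # sorted input: n - 1 comparisons
worst = insertion_sort(list(range(n, 0, -1)))  # reverse-sorted: n(n-1)/2 comparisons
print(best, worst)  # 99 4950
```

The sorted input costs n - 1 comparisons (linear), while the reverse-sorted input costs n(n-1)/2 (quadratic), matching the Ω(n) and O(n^2) bounds listed above.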