Lecture 2: Asymptotic Notation CSCI Algorithms I

Lecture 2: Asymptotic Notation CSCI 700 - Algorithms I Andrew Rosenberg September 2, 2010

Last Time Review of Insertion Sort, analysis of runtime, proof of correctness.

Today Asymptotic Notation: its use in analyzing runtimes. Review of Proofs by Induction: an important tool for correctness proofs.

Two Approaches 1 Empirical: generate input and time the executable. 2 Theoretical: prove the runtime as a function of the size of the input. This is what we'll be focusing on. The RAM model: all information can be held in RAM, and all operations take the same amount of resources each time they're called.
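As a side illustration of the empirical approach (an added sketch, not from the slides), the following Python snippet times a toy summation routine on inputs of growing size; the function name `work` and the input sizes are arbitrary choices, and the timings will vary by machine, which is exactly why the course uses the theoretical, RAM-model analysis instead.

```python
# Empirical approach: run the code and time it on inputs of increasing size.
import time

def work(n):
    """A stand-in algorithm: sum the first n integers with a loop."""
    total = 0
    for i in range(n):
        total += i
    return total

for n in (10_000, 100_000, 1_000_000):
    start = time.perf_counter()
    work(n)
    elapsed = time.perf_counter() - start
    print(f"n = {n:>9}: {elapsed:.4f} s")
```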

Asymptotic Notation Runtime on large inputs is more important than runtime on small inputs. Sometimes inefficient but frequently called subroutines that operate on small input can have a dramatic impact on overall performance, but this isn't the kind of efficiency we'll be analyzing here. It is usually safe to assume that live inputs are larger and less well-formed than the test inputs used during development.

Classes of Functions We want to develop classes of functions to compare the runtime of an algorithm across machines. Criteria 1 We only care about the behavior of the algorithm on input with size greater than some value n₀. 2 To transfer our analyses across machines, we consider functions that differ by at most a constant multiplier to be equivalent.

Definition Asymptotic Notation is a formal notation for discussing and analyzing classes of functions. Criteria 1 Capture behavior when n ≥ n₀. 2 Functions that differ by at most a constant multiplier are considered equivalent.

Example On one machine an algorithm may take T(n) = 15n³ + n² + 4. On another, T(n) = 5n³ + 4n + 5. Both will belong to the same class of functions, namely cubic functions of n. Function classes can be thought of as how many times we do something to each input.

O(n) definition Definition f(n) ∈ O(g(n)) if there exist constants c > 0 and n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀. Big-O notation O(g(n)) is an upper bound on the growth of a function f(n).

O(n) proofs Proof. Prove that if T(n) = 15n³ + n² + 4, then T(n) = O(n³). Let c = 20 and n₀ = 1. Must show that 0 ≤ f(n) and f(n) ≤ c·g(n). 0 ≤ 15n³ + n² + 4 for all n ≥ n₀ = 1. f(n) = 15n³ + n² + 4 ≤ 15n³ + n³ + 4n³ = 20n³ = 20·g(n) = c·g(n).
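As a quick numeric sanity check (an added illustration, not a substitute for the algebraic proof above), the chosen constants c = 20 and n₀ = 1 can be spot-checked in Python:

```python
# Check 0 <= f(n) <= c * g(n) for n = n0 .. 1000, with c = 20, n0 = 1,
# f(n) = 15n^3 + n^2 + 4 and g(n) = n^3 from the proof above.
def f(n):
    return 15 * n**3 + n**2 + 4

def g(n):
    return n**3

c, n0 = 20, 1
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 1001))
print("0 <= f(n) <= 20*n^3 holds for n = 1..1000")
```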

O(n) proofs Proof. Prove that if T(n) = 15n³ + n² + 4, then T(n) = O(n⁴). Let c = 20 and n₀ = 1. Must show that 0 ≤ f(n) and f(n) ≤ c·g(n). 0 ≤ 15n³ + n² + 4 for all n ≥ n₀ = 1. f(n) = 15n³ + n² + 4 ≤ 15n⁴ + n⁴ + 4n⁴ = 20n⁴ = 20·g(n) = c·g(n).

O(n) recap T(n) = 15n³ + n² + 4. T(n) = O(n³). T(n) = O(n⁴). O(n) is an upper bound. The = is not really equality; it's used as set inclusion here. Don't write O(n) = T(n).

Ω(n) definition Definition f(n) ∈ Ω(g(n)) if there exist constants c > 0 and n₀ > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀. Big-Omega notation Ω(g(n)) is a lower bound on the growth of a function f(n).

Ω(n) proofs Proof. Prove that if T(n) = 15n³ + n² + 4, then T(n) = Ω(n³). Let c = 15 and n₀ = 1. Must show that 0 ≤ c·g(n) and c·g(n) ≤ f(n). 0 ≤ 15n³ for all n ≥ n₀ = 1. c·g(n) = 15n³ ≤ 15n³ + n² + 4 = f(n).

Ω(n) proofs Proof. Prove that if T(n) = 15n³ + n² + 4, then T(n) = Ω(n²). Let c = 15 and n₀ = 1. Must show that 0 ≤ c·g(n) and c·g(n) ≤ f(n). 0 ≤ 15n² for all n ≥ n₀ = 1. c·g(n) = 15n² ≤ 15n³ ≤ 15n³ + n² + 4 = f(n).

Ω(n) recap T(n) = 15n³ + n² + 4. T(n) = Ω(n³). T(n) = Ω(n²). Ω(n) is a lower bound.

Θ(n) definition Definition f(n) ∈ Θ(g(n)) if there exist constants c₁ > 0, c₂ > 0 and n₀ > 0 such that c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀. Theta notation Θ(g(n)) is a tight bound on the growth of a function f(n). Can also say: f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)). T(n) = 15n³ + n² + 4, so T(n) = Θ(n³).

Θ(n) proof Proof. Prove that if T(n) = 15n³ + n² + 4, then T(n) = Θ(n³). Let c₁ = 15, c₂ = 20 and n₀ = 1. Must show that c₁·g(n) ≤ f(n) and f(n) ≤ c₂·g(n). c₁·g(n) = 15n³ ≤ 15n³ + n² + 4 = f(n). f(n) = 15n³ + n² + 4 ≤ 15n³ + n³ + 4n³ = 20n³ = c₂·g(n).

Asymptotic Notation Examples T(n) = 15n³ + n² + 4 = Θ(n³). T(n) = 2n³ + 4n + 4 = Θ(n³). T(n) = 2n² + 4n + 4 = Θ(n²). T(n) = 15n³ + n² + 4 = Θ(2n³ + n²). While true, this is never done. Note: f = Θ(g) is a common shorthand for f(n) = Θ(g(n)).

Asymptotic Notation Examples T(n) = n³ + n·log n = Θ(n³)? If n·log n < n³ for all n > n₀, then we can use the proof structure from before. Let c₁ = 1: c₁·g(n) = n³ ≤ n³ + n·log n. Let c₂ = 2: n³ + n·log n ≤ n³ + n³ = 2n³ = c₂·g(n).
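The sandwich c₁·n³ ≤ n³ + n·log n ≤ c₂·n³ can likewise be spot-checked numerically. A small added sketch, assuming base-2 logarithms (any other base only changes the constant in front of the log term):

```python
# Spot-check c1*g(n) <= f(n) <= c2*g(n) with c1 = 1, c2 = 2, g(n) = n^3,
# and f(n) = n^3 + n*log2(n).
import math

def f(n):
    return n**3 + n * math.log2(n)

for n in range(1, 1001):
    assert n**3 <= f(n) <= 2 * n**3
print("n^3 <= n^3 + n*log2(n) <= 2*n^3 holds for n = 1..1000")
```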

Runtime at small n T₁(n) = n³ = Θ(n³). T₂(n) = 2n + 100 = Θ(n).

n   T₁(n)   T₂(n)
1       1     102
2       8     104
3      27     106
4      64     108
5     125     110
6     216     112
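To see where the asymptotically faster function actually starts to win, a tiny added sketch finds the crossover point; for larger n the gap only widens.

```python
# Find the smallest n where T1(n) = n^3 exceeds T2(n) = 2n + 100.
def T1(n): return n**3
def T2(n): return 2 * n + 100

n = 1
while T1(n) <= T2(n):
    n += 1
print(f"smallest n with T1(n) > T2(n): {n}")   # prints 5
```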

Asymptotic Notation Properties If f = Θ(h) and g = Θ(h), then f + g = Θ(h). Informally: f = Θ(h) means f is c₁·h plus lower-order slack, and g = Θ(h) means g is c₂·h plus slack, so f + g is (c₁ + c₂)·h plus slack. Also, Θ(f + g) = Θ(max(f, g)). Example: f = n³, g = n·log n, h = f + g = n³ + n·log n; Θ(h) = Θ(f + g) = Θ(max(f, g)) = Θ(n³).

Orders of growth Behavior of a function as n → ∞. If lim_{n→∞} f(n)/g(n) = ∞, then f(n) = Ω(g(n)): f is bigger than g, or f dominates g. If lim_{n→∞} f(n)/g(n) = 0, then f(n) = O(g(n)): f is smaller than g, or g dominates f. If lim_{n→∞} f(n)/g(n) = c for some constant c > 0, then f(n) = Θ(g(n)): f is the same as g.

Orders of growth examples Show that n² = Ω(n). Prove it to yourself first.

n   n²   n² − n
1    1     0
2    4     2
3    9     6
4   16    12
5   25    20
6   36    30

Orders of growth examples Show that n² = Ω(n). Evaluate lim_{n→∞} f(n)/g(n): lim_{n→∞} n²/n = lim_{n→∞} n = ∞, so n² = Ω(n).

Orders of growth examples Show that n = Ω(log n). Prove it to yourself first.

n    log n   n − log n
1      0        1
4      2        2
8      3        5
16     4       12
32     5       27
64     6       58

Orders of growth examples Show that n = Ω(log n). Evaluate lim_{n→∞} f(n)/g(n) = lim_{n→∞} n/log n. L'Hôpital's rule (from Calculus): if lim_{n→c} f(n) = ∞ and lim_{n→c} g(n) = ∞, then lim_{n→c} f(n)/g(n) = lim_{n→c} f′(n)/g′(n). lim_{n→∞} n/log n = lim_{n→∞} 1/(1/n) = lim_{n→∞} n = ∞. So n = Ω(log n).
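The limit argument can also be illustrated numerically. An added sketch, using base-2 logarithms (the base only changes the ratio by a constant factor):

```python
# The ratio f(n)/g(n) = n / log2(n) keeps growing as n increases,
# matching the L'Hopital argument that the limit is infinite: n = Omega(log n).
import math

for n in (2, 16, 256, 4096, 1 << 20):
    print(f"n = {n:>8}: n / log2(n) = {n / math.log2(n):.1f}")
```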

Order of growth of a summation Proof. S(n) = Σ_{i=1}^{n} i = n(n+1)/2. Even if we don't know this, we can show that S(n) = Θ(n²). Show S(n) = O(n²): S(n) = Σ_{i=1}^{n} i ≤ Σ_{i=1}^{n} n = n·n = n², so S(n) ≤ n².

Order of growth of a summation Proof. Show S(n) = Ω(n²): S(n) = Σ_{i=1}^{n} i ≥ Σ_{i=n/2+1}^{n} i ≥ Σ_{i=n/2+1}^{n} (n/2) = (n/2)·(n/2) = n²/4. Thus S(n) = Ω(n²), and S(n) = Θ(n²).
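Both bounds, and the closed form n(n+1)/2, can be spot-checked with a few lines of Python (an added sketch, not part of the proof):

```python
# Check n^2/4 <= S(n) <= n^2, where S(n) = 1 + 2 + ... + n.
for n in range(1, 1001):
    S = sum(range(1, n + 1))
    assert n * n / 4 <= S <= n * n
    assert S == n * (n + 1) // 2          # the closed form from the slides
print("n^2/4 <= S(n) <= n^2 and S(n) = n(n+1)/2 hold for n = 1..1000")
```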

Arithmetic series S(n) = Σ_{i=1}^{n} i. S(6) = Σ_{i=1}^{6} i = 1 + 2 + 3 + 4 + 5 + 6 = (1+6) + (2+5) + (3+4) = 3·7 = 21.

Arithmetic series S(n) = Σ_{i=1}^{n} i = 1 + 2 + ... + n = (1+n) + (2+(n−1)) + ... + (n/2 + (n/2 + 1)) = (n/2)(n+1).

Inductive Proof S(n) = Σ_{i=1}^{n} i = n(n+1)/2. Base Case: S(1) = Σ_{i=1}^{1} i = 1, and the formula gives 1(1+1)/2 = 2/2 = 1.

Inductive Step: Assume true for n ≥ n₀ = 1. Prove for n+1. S(n+1) = Σ_{i=1}^{n+1} i = (n+1) + Σ_{i=1}^{n} i = (n+1) + n(n+1)/2 = 2(n+1)/2 + n(n+1)/2 = (2(n+1) + n(n+1))/2 = (n+2)(n+1)/2 = (n+1)((n+1)+1)/2.

Induction over structures The structure of an inductive proof can be applied to program structures such as loops, sometimes using the construct of loop invariants. Initialization = Base Case. Maintenance = Inductive Step. Termination.

Insertion Sort Pseudocode
InsertionSort(A)
  for j ← 2 to size(A) do
    key ← A[j]
    i ← j − 1
    while i > 0 and A[i] > key do
      {Slide the elements to the right}
      A[i+1] ← A[i]
      i ← i − 1
    end while
    A[i+1] ← key
  end for
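For reference, here is a runnable Python version of the pseudocode above (an added sketch; it uses 0-based indexing, while the slides and the trace below use 1-based indexing):

```python
def insertion_sort(A):
    """Sort list A in place, mirroring the pseudocode (0-based indices)."""
    for j in range(1, len(A)):          # pseudocode: j = 2 .. size(A)
        key = A[j]
        i = j - 1
        while i >= 0 and A[i] > key:    # slide larger elements right
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key
    return A

print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]
```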

Insertion Sort trace on A = [5 2 4 6 1 3] (indices 1..6). Each pass takes key = A[j] and slides larger elements one position to the right before inserting the key.

key = 2 (j = 2): shift 5 right, 5 2 4 6 1 3 → 5 5 4 6 1 3; insert key → 2 5 4 6 1 3
key = 4 (j = 3): shift 5 right, 2 5 4 6 1 3 → 2 5 5 6 1 3; insert key → 2 4 5 6 1 3
key = 6 (j = 4): no shifts needed → 2 4 5 6 1 3
key = 1 (j = 5): shift 6, 5, 4, 2 right, 2 4 5 6 1 3 → 2 2 4 5 6 3; insert key → 1 2 4 5 6 3
key = 3 (j = 6): shift 6, 5, 4 right, 1 2 4 5 6 3 → 1 2 4 4 5 6; insert key → 1 2 3 4 5 6
Termination: A = [1 2 3 4 5 6]

InsertionSort Loop Invariant At the start of the for loop, the subarray A[1..j-1] is sorted. Sorted here means it contains the initial items of A[1..j-1] in ascending order.

InsertionSort Induction Proof Induction Proof: At the start of the for loop, the subarray A[1..j-1] is sorted. Base Case: At initialization j = 2. The subarray A[1..j-1] contains a single element, A[1]. A single element is in sorted order.

InsertionSort Induction Proof Induction Proof: At the start of the for loop, the subarray A[1..j-1] is sorted. Induction Step: Assume true for j > n₀. Prove for j + 1. 1 Show that A[j+1] is put in the correct position. 2 Show that A[1..j+1] remains sorted.

InsertionSort Induction Proof Induction Proof: At the start of the for loop, the subarray A[1..j-1] is sorted. Induction Step: Assume true for j > n₀. Prove for j + 1. 1 Show that A[j+1] is put in the correct position. The index i is used to find the correct position to insert A[j+1], specifically the position i such that A[i] ≤ A[j+1] < A[i+2]. At the end of the internal while loop, i points to the element with the greatest index that is at most A[j+1], or 0 if no such element exists. A[j+1] is moved to position i+1. Because A[1..j] is sorted, every element of A[1..i] is at most A[j+1] and every element of A[(i+1)..j] is larger than A[j+1]. Therefore A[j+1] is put in the correct position.

InsertionSort Induction Proof Induction Proof: At the start of the for loop, the subarray A[1..j-1] is sorted. Induction Step: Assume true for j > n₀. Prove for j + 1. 2 Show that A[1..j+1] remains sorted. At the end of the internal while loop, the elements A[i+1..j] have each been shifted up one position to A[i+2..j+1]. Therefore A[i+2..j+1] is sorted. A[1..i] has not been modified; since A[1..j] was sorted at the start of the loop, the subarray A[1..i] remains sorted. Since A[j+1] was put in the position i+1 such that A[i] ≤ A[j+1] < A[i+2], the resulting array A[1..i], A[i+1], A[i+2..j+1] is sorted.

InsertionSort Induction Proof Induction Proof: At the start of the for loop, the subarray A[1..j-1] is sorted. Termination: At the termination of the for loop, j = n+1, therefore A[1..n] is sorted.

Bye Homework 2 is posted to the course website. Next time (9/7): Recursion. For next class, read Sections 2.1, 2.2, 7.1, 7.2, and 7.4.