Analysis of Algorithms


CS 141, Fall 2017. Analysis of Algorithms. September 29, 2017. Stefano Lonardi, UCR.

Analysis of Algorithms: Issues
- Correctness/optimality
- Running time ("time complexity")
- Memory requirements ("space complexity")
- Power
- I/O utilization
- Ease of implementation

Worst-Case Time-Complexity
Definition: The worst-case time-complexity of an algorithm A is the asymptotic running time of A as a function of the size of the input, when the input is the one that makes the algorithm slowest in the limit.
How do we measure the running time of an algorithm?

Python (the language)
We will use Python code to describe algorithms (sometimes mixed with English). Python is:
- High-level (easy to read/use/learn)
- Object-oriented
- Interpreted (but can be compiled)
- Portable
- Free/open-source

Python: an example
Algorithm for finding the maximum element of an array:

def imax(A):
    currentmax = A[0]
    for i in range(1, len(A)):
        if currentmax < A[i]:
            currentmax = A[i]
    return currentmax

More Python-ish version of the same algorithm:

def imax(A):
    currentmax = A[0]
    for x in A[1:]:
        if currentmax < x:
            currentmax = x
    return currentmax

Input size and basic operation examples

Problem | Input size measure | Basic operation
Searching for a key in a list of n items | Number of items in the list, i.e., n | Key comparison
Multiplication of two matrices | Matrix dimensions or total number of elements | Multiplication of two numbers
Checking primality of a given integer n | Size of n = number of digits (in binary representation) | Division
Typical graph problem | #vertices and/or #edges | Visiting a vertex or traversing an edge

Example (max, iterative)

def imax(A):
    currentmax = A[0]
    for i in range(1, len(A)):
        if currentmax < A[i]:
            currentmax = A[i]
    return currentmax

The program executes n-1 comparisons (irrespective of the type of input), where n = len(A); therefore the worst-case time-complexity is O(n).

Example (max, recursive)

def rmax(A):
    if len(A) == 1:
        return A[0]
    return max(rmax(A[1:]), A[0])

The program executes n-1 comparisons (irrespective of the type of input); therefore the worst-case time-complexity is O(n).

Asymptotic notation
Section 0.3 of the textbook

The Big-Oh Notation
Definition: Given functions f(n) and g(n), we say that f(n) is O(g(n)) if and only if there are positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.

Proof
f(n) = 2n + 6, g(n) = n. We have 2n + 6 ≤ 4n when n ≥ 3. So, if we choose c = 4, then n0 = 3 satisfies f(n) ≤ c·g(n) for n ≥ n0. Conclusion: 2n + 6 is O(n).

Asymptotic Notation
Note: Even though it is correct to say 7n - 3 is O(n^3), a more precise statement is that 7n - 3 is O(n).
Simple rule: drop lower-order terms and constant factors.
- 7n - 3 is O(n)
- 8n^2 log n + 5n^2 + n is O(n^2 log n)

Asymptotic Notation
Special classes of algorithms:
- constant: O(1)
- logarithmic: O(log n)
- linear: O(n)
- quadratic: O(n^2)
- cubic: O(n^3)
- polynomial: O(n^k), k ≥ 1
- exponential: O(a^n), a > 1

Asymptotic Notation
Relatives of the Big-Oh:
- Ω(f(n)): Big Omega, asymptotic lower bound
- Θ(f(n)): Big Theta, asymptotic tight bound

Big Omega
Definition: Given two functions f(n) and g(n), we say that f(n) is Ω(g(n)) if and only if there are positive constants c and n0 such that f(n) ≥ c·g(n) for n ≥ n0.
Property: f(n) is Ω(g(n)) iff g(n) is O(f(n)).

Big Theta
Definition: Given two functions f(n) and g(n), we say that f(n) is Θ(g(n)) if and only if there are positive constants c1, c2 and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for n ≥ n0.
Property: f(n) is Θ(g(n)) if and only if f(n) is O(g(n)) AND f(n) is Ω(g(n)).
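As a small worked illustration of the Θ definition (added here, not from the original slides): f(n) = n(n+1)/2 is Θ(n^2), since for all n ≥ 1 we have n^2/2 ≤ n(n+1)/2 ≤ n^2, so the constants c1 = 1/2, c2 = 1, n0 = 1 satisfy the definition.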

Establishing order of growth using limits
Consider lim_{n→∞} f(n)/g(n):
- if the limit is 0, the order of growth of f(n) < order of growth of g(n)
- if the limit is a constant c > 0, the order of growth of f(n) = order of growth of g(n)
- if the limit is ∞, the order of growth of f(n) > order of growth of g(n)
Examples: 10n vs. n^2; n(n+1)/2 vs. n^2

Orders of growth: some important functions
- All logarithmic functions log_a n belong to the same class Θ(log n), no matter what the logarithm's base a > 1 is.
- All polynomials of the same degree k belong to the same class: a_k n^k + a_{k-1} n^{k-1} + ... + a_0 is in Θ(n^k).
- Exponential functions a^n have different orders of growth for different a's.
order log n < order n < order n log n < order n^k (k ≥ 2 constant) < order a^n < order n! < order n^n
Caution: be aware of very large constant factors.
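Working out the two examples above (added here as a quick check): lim_{n→∞} 10n/n^2 = lim_{n→∞} 10/n = 0, so 10n grows more slowly than n^2; lim_{n→∞} (n(n+1)/2)/n^2 = 1/2 > 0, so n(n+1)/2 and n^2 have the same order of growth.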

Time analysis for iterative algorithms
Steps:
1. Decide on a parameter n indicating input size
2. Identify the algorithm's basic operation
3. Determine the worst case(s) for input of size n
4. Set up a sum for the number of times the basic operation is executed
5. Simplify the sum using standard formulas and rules
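As a small added illustration (not on the original slides), apply these steps to the earlier imax algorithm: n is the length of the array, the basic operation is the comparison currentmax < A[i], every input of size n triggers the same count, the sum is \sum_{i=1}^{n-1} 1 = n - 1, so the running time is Θ(n).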

Example of Asymptotic Analysis

def prefixAverages1(X):
    A = []
    for i in range(len(X)):        # n iterations
        a = 0
        for j in range(i+1):       # i+1 iterations
            a += X[j]              # 1 step
        A.append(a / float(i+1))
    return A

The outer loop runs n times and the inner loop runs i+1 times, so the algorithm is O(n^2).

A faster algorithm
Observe that the i-th prefix sum is the (i-1)-th prefix sum plus X[i], so the inner loop is unnecessary: the running sum can be carried from one iteration to the next.

A linear-time algorithm

def prefixAverages2(X):
    A, a = [], 0
    for i in range(len(X)):
        a = a + X[i]
        A.append(a / float(i+1))
    return A

A trickier example
Analyze the worst-case time complexity of the following algorithm, and give a tight bound using the big-theta notation.

def weirdloop(n):
    i = n
    while i >= 1:
        for j in range(i):
            print 'Hello'
        i = i / 2
    return
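A brief solution sketch for the exercise above, assuming the reconstruction of weirdloop given here is faithful to the original: the inner loop does i units of work, and i is halved on each pass of the while loop, so the total work is about n + n/2 + n/4 + ... + 1 ≤ 2n, and at least n from the first pass alone; the tight bound is therefore Θ(n).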

Math review
Appendix A of the textbook

Summations
\sum_{i=1}^{n} i = n(n+1)/2
\sum_{i=0}^{n} a^i = (1 - a^{n+1}) / (1 - a) when a > 0, a ≠ 1
e.g., \sum_{i=0}^{n} 2^i = 1 + 2 + 4 + ... + 2^n = 2^{n+1} - 1

Binomial expansion
(a + b)^n = \sum_{k=0}^{n} \binom{n}{k} a^k b^{n-k}
In particular, if we choose a = 1, b = 1 we get 2^n = \sum_{k=0}^{n} \binom{n}{k}.

Bounding sums
Upper bound: any sum is at most the number of terms times the maximum term.
Example: 1 + 4 + 9 + ... + n^2 is at most n · n^2 = n^3.
Lower bound: if the terms are non-negative, any sum is at least half the number of terms times the median term.
Example: 1 + 4 + 9 + ... + n^2 is at least (n/2) · (n/2)^2 = n^3 / 8.
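For comparison (an added note, not from the slides): the exact value is \sum_{i=1}^{n} i^2 = n(n+1)(2n+1)/6 ≈ n^3/3, which indeed lies between n^3/8 and n^3; either bound is enough to conclude the sum is Θ(n^3).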

Proving (or disproving) p ⇒ q
- Counterexample (used to prove that p ⇒ q is false by showing one particular choice of p that makes q false)
- Direct proof (p ⇒ p_1 ⇒ ... ⇒ p_n ⇒ q)
- Contrapositive (prove that ~q ⇒ ~p)
- Contradiction (assume p and ~q true, find a contradiction)
- Induction (prove base case + induction step)

Induction proof
Theorem: \sum_{i=1}^{n} i = n(n+1)/2
Proof: by induction on n.
Base case: n = 1. Trivial, since 1 = 1(1 + 1)/2.
Induction step: n ≥ 2. Assume the claim is true for any n' < n. Then
\sum_{i=1}^{n} i = n + \sum_{i=1}^{n-1} i = n + (n-1)n/2 = n(n+1)/2
using the induction hypothesis.

Recurrence Relation Analysis

Recurrence relation
A recurrence relation is an equation that recursively defines a sequence: each term of the sequence is defined as a function of the preceding term(s). For instance:
f(n) = 2 if n = 1
f(n) = f(n-1) + n if n > 1
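Unfolding this example (an added note, not from the original slides): f(n) = 2 + \sum_{i=2}^{n} i = n(n+1)/2 + 1, so f(n) grows as Θ(n^2).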

General form

def factorial(n):
    if n == 0:
        return 1
    else:
        return n * factorial(n-1)
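The recurrence for factorial is not reproduced in this transcription; following the pattern of the examples below, it would presumably read T(n) = c1 if n = 0 and T(n) = T(n-1) + c2 otherwise, which unfolds to T(n) = c1 + n·c2, i.e., Θ(n).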

def fibonacci(n):
    if n == 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fibonacci(n-1) + fibonacci(n-2)

Example

def bugs(n):
    if n <= 1:
        do_something()
    else:
        bugs(n-1)
        bugs(n-2)
        for i in range(n):
            do_something_else()

T(n) = c1 if n ≤ 1
T(n) = T(n-1) + T(n-2) + c2·n otherwise
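For the fibonacci function above, no recurrence appears in the transcription; the corresponding relation would be T(n) = c1 if n ≤ 1 and T(n) = T(n-1) + T(n-2) + c2 otherwise, whose solution grows exponentially in n (it dominates the Fibonacci numbers themselves).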

Example

def daffy(n):
    if n == 1 or n == 2:
        do_something()
    else:
        daffy(n-1)
        for i in range(n):
            do_something_else()
        daffy(n-1)

T(n) = c1 if n = 1 or n = 2
T(n) = 2·T(n-1) + c2·n otherwise

Example

def elmer(n):
    if n == 1:
        do_something()
    elif n == 2:
        do_something_else()
    else:
        for i in range(n):
            elmer(n-1)
            do_something_different()

T(n) = c1 if n = 1
T(n) = c2 if n = 2
T(n) = n·(T(n-1) + c3) otherwise

Example

def yosemite(n):
    if n == 1:
        do_something()
    else:
        for i in range(1, n):
            yosemite(i)
            do_something_different()

T(n) = c1 if n = 1
T(n) = \sum_{i=1}^{n-1} (T(i) + c2) otherwise

MergeSort
MergeSort is a divide & conquer algorithm.
- Divide: divide an n-element sequence into two subsequences of approx. n/2 elements
- Conquer: sort the subsequences recursively
- Combine: merge the two sorted subsequences to produce the final sorted sequence

MergeSort

def mergesort(A):
    if len(A) < 2:
        return A
    else:
        m = len(A) / 2
        L = mergesort(A[:m])
        R = mergesort(A[m:])
        return merge(L, R)

(diagram: the array A is split at the midpoint m into a left part L and a right part R)

Example

Merge of MergeSort

def merge(L, R):
    result, i, j = [], 0, 0
    while i < len(L) and j < len(R):
        if L[i] <= R[j]:
            result.append(L[i])
            i += 1
        else:
            result.append(R[j])
            j += 1
    result += L[i:]
    result += R[j:]
    return result

MergeSort Analysis
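A quick usage check (added here, not part of the original slides), assuming mergesort and merge above are defined together:

    print mergesort([5, 2, 4, 7, 1, 3, 2, 6])
    # output: [1, 2, 2, 3, 4, 5, 6, 7]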

Visual Analysis

Solving Recurrence Relations

Methods
Two methods for solving recurrences:
- Iterative substitution method
- Master method
(not covered: recursion tree; not covered: guess-and-test method)

Iterative substitution
- Assume n large enough
- Substitute T on the right-hand side of the recurrence relation
- Iterate the substitution until we see a pattern which can be converted into a general closed-form formula

MergeSort recurrence relation
T(n) = 1 if n = 1
T(n) = 2·T(n/2) + n otherwise
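A sketch of the unfolding by iterative substitution (added here; the slide's derivation is not reproduced in this transcription), assuming n is a power of 2:
T(n) = 2T(n/2) + n = 4T(n/4) + 2n = ... = 2^k·T(n/2^k) + k·n
Stopping when n/2^k = 1, i.e., k = log2 n, gives T(n) = n·T(1) + n·log2 n = n + n·log2 n, hence T(n) ∈ Θ(n log n).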

Verify the correctness
How do we verify that the solution is correct? Use proof by induction!
Important: make sure the constant c works for both the base case and the induction step.

Proof by induction
T(n) = 1 if n = 1
T(n) = 2T(n/2) + n otherwise
Fact: T(n) ∈ O(n log2 n)
The constant c used in the induction and the base case has to be the same! Choose c = 2.
Proof.
Base case: T(2) = 2T(1) + 2 = 4 ≤ c(2 log2 2) = 2c. Hence, c ≥ 2.
Induction hypothesis: T(n/2) ≤ c(n/2) log2 (n/2).
Induction step:
T(n) = 2T(n/2) + n
     ≤ 2c(n/2) log2 (n/2) + n
     = cn log2 (n/2) + n
     = cn log2 n - cn log2 2 + n
     = cn log2 n + n(1 - c)
     ≤ cn log2 n   when c ≥ 1.

Wrong proof by induction
T(n) = 1 if n = 1
T(n) = 2T(n/2) + n otherwise
Fact (wrong): T(n) ∈ O(n)
Proof.
Base case: T(1) = 1 ≤ c·1, hence c ≥ 1.
Induction hypothesis: T(n/2) ≤ c(n/2).
Induction step: T(n) = 2T(n/2) + n ≤ 2c(n/2) + n = cn + n "∈ O(n)".
The proof is WRONG, but where is the mistake?

Towers of Hanoi

Towers of Hanoi

def hanoi(n, a='a', b='b', c='c'):
    if n == 0:
        return
    hanoi(n-1, a, c, b)
    print a, '->', c
    hanoi(n-1, b, a, c)
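As an added usage example (not on the original slides): calling hanoi(2) with the default peg names prints the three moves

    a -> b
    a -> c
    b -> c

moving two disks from peg a to peg c via peg b; in general hanoi(n) prints 2^n - 1 moves.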

Towers of Hanoi: Recurrence Relation
Solve
T(N) = 2T(N-1) + 1 if N > 1
T(N) = 1 if N = 1

Towers of Hanoi: Unfolding the relation
T(N) ∈ Θ(2^N)
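A sketch of the unfolding (added here; the slide's step-by-step derivation is not reproduced in this transcription):
T(N) = 2T(N-1) + 1 = 4T(N-2) + 2 + 1 = ... = 2^{N-1}·T(1) + 2^{N-2} + ... + 2 + 1 = 2^{N-1} + 2^{N-2} + ... + 1 = 2^N - 1,
which is indeed Θ(2^N).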

Problem
Problem: Solve exactly (by iterative substitution)
T(n) = 4 if n = 1
T(n) = 4T(n-1) + 3 if n > 1
Solution: T(n) = 4^n + 4^{n-1} - 1
Proof?
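A proof sketch by induction (added here in answer to the "Proof?" prompt): the base case holds since 4^1 + 4^0 - 1 = 4; assuming T(n-1) = 4^{n-1} + 4^{n-2} - 1, we get T(n) = 4(4^{n-1} + 4^{n-2} - 1) + 3 = 4^n + 4^{n-1} - 4 + 3 = 4^n + 4^{n-1} - 1, as claimed.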

Another example

Master Theorem method

Master Theorem (for recurrences of the form T(n) = a·T(n/b) + f(n), with a ≥ 1 and b > 1)

Condition on f(n) | Extra condition | Conclusion on T(n)
f(n) is O(n^{log_b a - ε}) | ε > 0 | T(n) is Θ(n^{log_b a})
f(n) is Θ(n^{log_b a} log^k n) | k ≥ 0 | T(n) is Θ(n^{log_b a} log^{k+1} n)
f(n) is Ω(n^{log_b a + ε}) | ε > 0 and a·f(n/b) ≤ d·f(n) for some d < 1 | T(n) is Θ(f(n))

Master method (first case)

Master method (second case)
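Illustrative applications of the first two cases (added here; the examples on the original slides are not reproduced in this transcription):
- First case: T(n) = 4T(n/2) + n. Here a = 4, b = 2, log_b a = 2, and f(n) = n is O(n^{2-ε}) for ε = 1, so T(n) is Θ(n^2).
- Second case: T(n) = 2T(n/2) + n log n. Here log_b a = 1 and f(n) is Θ(n log^1 n), so k = 1 and T(n) is Θ(n log^2 n).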

Master method: binary search (second case)

Master method: merge-sort (second case)
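A sketch of what these two applications amount to (added here): binary search satisfies T(n) = T(n/2) + c, so a = 1, b = 2, log_b a = 0, and f(n) = c is Θ(n^0 log^0 n); by the second case with k = 0, T(n) is Θ(log n). Merge-sort satisfies T(n) = 2T(n/2) + c·n, so log_b a = 1 and f(n) is Θ(n log^0 n); again by the second case, T(n) is Θ(n log n).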

Master method (third case)

Summary (1/3)
Goal: analyze the worst-case time-complexity of iterative and recursive algorithms.
Tools:
- Pseudo-code/Python
- Big-O, Big-Omega, Big-Theta notations
- Recurrence relations
- Discrete math (summations, induction proofs, methods to solve recurrence relations)

Summary (2/3)
Pure iterative algorithm: analyze the loops.
- Determine how many times the inner core is repeated as a function of the input size
- Determine the worst case for the input
- Write the number of repetitions as a function of the input size
- Simplify the function using Big-O or Big-Theta notation (optional)

Summary (3/3)
Recursive + iterative algorithm: analyze the recursive calls and the loops.
- Determine how many recursive calls are made and the size of the arguments of the recursive calls
- Determine how much extra processing (loops) is done
- Determine the worst case for the input
- Derive a recurrence relation
- Solve the recurrence relation
- Simplify the solution using Big-O or Big-Theta notation