
15-112 Fundamentals of Programming: Efficiency of Algorithms. November 5, 2017

Complexity of sorting algorithms: Selection Sort, Bubble Sort, Insertion Sort

Efficiency of Algorithms

A computer program should be completely correct, but it should also:
- Execute as quickly as possible (time efficiency)
- Use memory wisely (space efficiency)

Often there is more than one algorithm to solve a problem, and the choice between an efficient and an inefficient algorithm can make the difference between a practical solution and an impractical one. How do we precisely measure the performance of an algorithm or program?

Measuring Performance

Measuring performance requires that we determine how an algorithm's execution time increases with respect to the input size. When comparing two algorithms to determine which is better, we can compare their growth rates.

Experimental Studies

- Write a program implementing the algorithm
- Run the program with inputs of varying size and composition
- Use a function like time.time() to get an accurate measure of the actual running time
- Plot the results

[Plot: running time in ms (y-axis, 0 to 9000) versus input size (x-axis, 0 to 100)]
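A minimal sketch of this experimental setup in Python. The name time_algorithm is invented here for illustration; any function that takes a list (e.g. one of the sorts below) can be plugged in:

    import random
    import time

    def time_algorithm(algorithm, n, trials=5):
        # Run `algorithm` on random lists of length n and return
        # the average wall-clock time in milliseconds.
        total = 0.0
        for _ in range(trials):
            data = [random.randint(0, n) for _ in range(n)]
            start = time.time()
            algorithm(data)
            total += time.time() - start
        return 1000 * total / trials

    # Example: time Python's built-in sorted at several input sizes
    for n in [1000, 2000, 4000, 8000]:
        print(n, time_algorithm(sorted, n))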

Timing Sorts

Time different sorting algorithms: Bubble Sort, Insertion Sort, Selection Sort, Merge Sort.

Limitations of Experiments

- It is necessary to implement the algorithm, which may be difficult
- Results may not be indicative of the running time on other inputs not included in the experiment
- In order to compare two algorithms, the same hardware and software environments must be used

Theoretical Analysis

- Characterizes running time as a function of the input size, n
- Takes into account all possible inputs
- Allows us to evaluate the speed of an algorithm independent of the hardware/software environment

The speed of an algorithm depends on the number of operations executed.

What are the operations?

Example: search an array for a value.

    def search(array, target):
        i = 0                        # operation 1
        while i < len(array):        # operation 2
            if array[i] == target:   # operation 3
                return i             # operation 5
            i = i + 1                # operation 4
        return -1                    # operation 6

Operations include:
- Evaluating expressions
- Assignments
- Returning from a method

    def search(array, target):
        i = 0                        # operation 1
        while i < len(array):        # operation 2
            if array[i] == target:   # operation 3
                return i             # operation 5
            i = i + 1                # operation 4
        return -1                    # operation 6

Let n = len(array).
- How many times is operation 1 executed?
- How many times are operations 5 and 6 executed in total?
- Total so far?

    def search(array, target):
        i = 0                        # operation 1
        while i < len(array):        # operation 2
            if array[i] == target:   # operation 3
                return i             # operation 5
            i = i + 1                # operation 4
        return -1                    # operation 6

Let's be pessimistic and consider the case with the maximum number of operations performed. Worst case: target is not in the array.
- How many times is operation 2 executed?
- How many times is operation 3 executed?
- How many times is operation 4 executed?
- New total?

Number of Operations = 4n + 3 in the Worst Case

If n is the size of the array, then:
- n = 10: number of operations = 43
- n = 20: number of operations = 83
- n = 30: number of operations = 123

The number of operations performed is linearly proportional to the number of data values: if we double n, the number of operations roughly doubles as well. In other words, the search algorithm has a linear growth rate. This is independent of the hardware or software environment we use.
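One way to check the 4n + 3 count empirically is to instrument the search with a counter. This sketch assumes one plausible convention consistent with the slide's totals: each test and each return counts as one operation, and i = i + 1 counts as two (an expression evaluation plus an assignment):

    def search_counting(array, target):
        ops = 1                      # i = 0 (assignment)
        i = 0
        while True:
            ops += 1                 # while i < len(array) (test)
            if not (i < len(array)):
                break
            ops += 1                 # if array[i] == target (test)
            if array[i] == target:
                ops += 1             # return i
                return i, ops
            ops += 2                 # i = i + 1 (evaluate + assign)
            i = i + 1
        ops += 1                     # return -1
        return -1, ops

    # Worst case: target is not in the array
    for n in [10, 20, 30]:
        result, ops = search_counting(list(range(n)), -1)
        print(n, ops)    # 43, 83, 123 -- i.e. 4n + 3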

Must we count each operation to determine the growth rate?

No. We could simply count the operation that is performed the most: operations 2, 3, and 4 each execute about n times.

Consider this Example

    i = 0
    while i < 100*n:
        print("This is one operation")
        i = i + 1

Which operation is executed the most? How many times is it executed? This algorithm also has a linear growth rate.

Consider this Example

    j = 0
    while j < n:
        i = 0
        while i < n:
            print("This is one operation")
            i = i + 1
        j = j + 1

Which operation is executed the most? How many times is it executed? This algorithm has a quadratic growth rate.

Consider this Example

    j = 0
    while j < 10*n:
        i = 0
        while i < 18*n:
            print("This is one operation")
            i = i + 1
        j = j + 1
    k = 0
    while k < n:
        print("This is one operation")
        k = k + 1

Which operation is executed the most? How many times is it executed? This algorithm has a quadratic growth rate.
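To answer the counting question concretely, here is a sketch that replaces the prints with a counter; the closed form 180n^2 + n follows directly from the loop bounds (10n outer iterations times 18n inner, plus n more):

    def count_inner_ops(n):
        # Count executions of the innermost statement instead of printing.
        count = 0
        j = 0
        while j < 10*n:
            i = 0
            while i < 18*n:
                count += 1      # stands in for the first print
                i = i + 1
            j = j + 1
        k = 0
        while k < n:
            count += 1          # stands in for the second print
            k = k + 1
        return count

    for n in [1, 2, 4]:
        print(n, count_inner_ops(n), 180*n*n + n)   # the two counts agree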

Constant Factors

The growth rate is not affected by constant factors or lower-order terms. Examples (number of operations):
- 10^2 n + 10^5 has a linear growth rate
- 10^5 n^2 + 10^8 n has a quadratic growth rate
- 3n^3 + 20n^2 has a cubic growth rate
- 97 log n + log log n has a logarithmic growth rate

The slower the growth rate, the more efficient the algorithm, so choose an algorithm with a slower growth rate!
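To see why the lower-order term drops out, here is the quadratic example worked through (the threshold n >= 10^3 is simply where the two terms balance):

\[
10^{8} n \le 10^{5} n^{2} \quad \text{for } n \ge 10^{3},
\]
\[
\text{so } 10^{5} n^{2} \;\le\; 10^{5} n^{2} + 10^{8} n \;\le\; 2 \cdot 10^{5} n^{2} \quad \text{for } n \ge 10^{3}.
\]

Beyond that threshold, the whole expression is sandwiched between constant multiples of n^2, so its growth rate is quadratic.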

Let T(n) be the execution time, a function of n.

T(n) = n:
- If we double n: T(2n) = 2n = 2·T(n)
- If we triple n: T(3n) = 3n = 3·T(n)

T(n) = n^2:
- If we double n: T(2n) = (2n)^2 = 4·T(n)
- If we triple n: T(3n) = (3n)^2 = 9·T(n)

T(n) = log n:
- If we double n: T(2n) = log(2n) = log 2 + log n = 1 + T(n)
- If we quadruple n: T(4n) = log(4n) = log 4 + log n = 2 + T(n)

T(n) = 2^n:
- If we add 1 to n: T(n+1) = 2^(n+1) = 2^1 · 2^n = 2·T(n)
- If we add 2 to n: T(n+2) = 2^(n+2) = 2^2 · 2^n = 4·T(n)
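A small sketch that checks these scaling rules numerically (log is base 2 here, matching the slide's log 2 = 1 step):

    import math

    growth = {
        "n":     lambda n: n,
        "n^2":   lambda n: n**2,
        "log n": lambda n: math.log2(n),
        "2^n":   lambda n: 2**n,
    }

    n = 16
    for name, T in growth.items():
        # Doubling n multiplies T by 2 for n, by 4 for n^2,
        # adds 1 for log n, and squares T for 2^n.
        print(name, " T(n) =", T(n), " T(2n) =", T(2*n))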

Order of Growth

Constants become irrelevant as n grows. Let's compare f(n) = 100n^2 and g(n) = n^4: the curves cross at n = 10 (where 100n^2 = n^4, i.e. n^2 = 100), and for every n > 10 we have n^4 > 100n^2.
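A quick table makes the crossover visible; this sketch uses nothing beyond the two functions from the slide:

    for n in [1, 10, 100, 1000]:
        f = 100 * n**2     # f(n) = 100 n^2
        g = n**4           # g(n) = n^4
        print(n, f, g, "f > g" if f > g else "f <= g")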

Big O Notation

To express the growth rate, or complexity, of an algorithm, we use a notation called Big O (or Big Oh). We can say that the complexity of the search algorithm is O(n), read "Big O of n" or "order n".

Mathematical Definition of Big-O Notation

Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.

Graphical Representation of "f(n) is O(g(n))"

The growth rate of f(n) is bounded by the growth rate of g(n).

Big-Oh Examples

Example: the function 100n is O(n).
  100n ≤ c·n  ⟺  0 ≤ n(c − 100). Pick c = 100 and n0 = 1.

Example: 2n + 10 is O(n).
  2n + 10 ≤ c·n  ⟺  (c − 2)n ≥ 10  ⟺  n ≥ 10/(c − 2). Pick c = 3 and n0 = 10.
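These constants can be sanity-checked numerically; a sketch for the second example (c = 3, n0 = 10):

    c, n0 = 3, 10
    for n in range(n0, 10_000):
        # Verify f(n) = 2n + 10 stays below c*g(n) = 3n once n >= n0
        assert 2*n + 10 <= c*n, n
    print("2n + 10 <= 3n holds for all tested n >= 10")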

Big-Oh Examples

Example: the function n^2 is not O(n).
  n^2 ≤ c·n would require n ≤ c, and this inequality cannot be satisfied for all large n, since c must be a constant.

Example: the function n is O(n^2).
  n ≤ c·n^2  ⟺  1 ≤ c·n. Pick c = 1 and n0 = 1.

Express the Complexity using Big-O Notation

    i = 0
    while i < 1000*n:
        print("one statement")
        i = i + 1

Answer: O(n)

Express the Complexity using Big-O Notation

    i = 0
    while i < n:
        j = 0
        while j < n:
            print("one operation")
            j = j + 1
        i = i + 1

Answer: O(n^2)

Express the Complexity using Big-O Notation

    i = 0
    while i < n:
        j = 0
        while j < 2:
            print("one operation")
            j = j + 1
        i = i + 1

Answer: O(n) — the inner loop runs a constant number of times.

Express the Complexity using Big-O Notation

    i = 0
    while i < 10*n:
        j = 0
        while j < 18*n:
            print("one operation")
            j = j + 1
        i = i + 1
    k = 0
    while k < n:
        print("one operation")
        k = k + 1

Answer: O(n^2)

Express the Complexity using Big-O Notation

    i = 0
    while i < n:
        j = 0
        while j < n:
            <call a method that takes time n>
            <call a method that takes time n^2>
            j = j + 1
        i = i + 1

Answer: O(n^4) — the loops contribute n^2 iterations, each costing n + n^2.
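A concrete version of this pattern as a sketch; linear_work and quadratic_work are hypothetical stand-ins for the methods the slide leaves abstract:

    def linear_work(n):
        # A stand-in method that takes time proportional to n
        s = 0
        for _ in range(n):
            s += 1
        return s

    def quadratic_work(n):
        # A stand-in method that takes time proportional to n^2
        s = 0
        for _ in range(n * n):
            s += 1
        return s

    def pattern(n):
        # n * n iterations, each doing n + n^2 work: O(n^4) overall
        total = 0
        i = 0
        while i < n:
            j = 0
            while j < n:
                total += linear_work(n) + quadratic_work(n)
                j = j + 1
            i = i + 1
        return total   # equals n^2 * (n + n^2)

    print(pattern(5), 5**2 * (5 + 5**2))   # both print 750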

Express the Complexity using Big-O Notation

    i = 0
    while i < n:
        j = 0
        while j < n:
            if <test that takes time n>:
                <this code takes time n^3>
            else:
                <this code takes time n>
            j = j + 1
        i = i + 1

Answer: O(n^5) — in the worst case, each of the n^2 iterations pays n for the test plus n^3 for the first branch.

Express the Complexity using Big-O Notation

    i = 0
    while i < n:
        j = 0
        while j < i:
            print("hi")
            j = j + 1
        i = i + 1

The print statement is executed 0 + 1 + 2 + ... + (n − 1) times. What does a sum like 1 + 2 + 3 + ... + n add up to?

What is 1 + 2 + 3 + ... + n?

Why is 1 + 2 + 3 + ... + n = n(n+1)/2?

[Figure: staircase diagram of the sum 1 + 2 + ... + n on a grid with axes 0–6 and 0–7; two copies of the staircase fit together to fill an n × (n+1) rectangle]
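The picture's algebraic counterpart, for reference (the standard pairing argument, not from the slides):

\[
S = 1 + 2 + \cdots + n
\]
\[
2S = \underbrace{(1 + n) + (2 + (n-1)) + \cdots + (n + 1)}_{n \text{ pairs, each summing to } n+1} = n(n+1)
\]
\[
\therefore\; S = \frac{n(n+1)}{2}
\]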

Express the Complexity using Big-O Notation

    i = 0
    while i < n:
        j = i
        while j >= 0:
            print("hi")
            j = j - 1
        i = i + 1

The print statement is executed 1 + 2 + ... + n = n(n+1)/2 = n^2/2 + n/2 times, which is O(n^2).

Express the Complexity using Big-O Notation

    i = 0
    while i < n*n:
        j = 0
        while j < i:
            print("hi")
            j = j + 1
        i = i + 1

The print statement is executed 0 + 1 + ... + (n^2 − 1) = n^2(n^2 − 1)/2 = n^4/2 − n^2/2 times, which is O(n^4).