Principles of Algorithm Analysis


Chapter 3: Principles of Algorithm Analysis

3.1 Computer Programs

The design of computer programs requires:
1. An algorithm that is easy to understand, code, and debug. This is the concern of software engineering.
2. An algorithm that makes efficient use of the computer's resources. This is the concern of data structures and algorithm analysis.

Consider two algorithms, A and B, for solving a given problem, with running times T_A(n) and T_B(n).
- Suppose the problem size is n₀ and T_A(n₀) < T_B(n₀). Then algorithm A is better than algorithm B for problem size n₀ only, which is not good enough.
- If T_A(n) < T_B(n) for all n ≥ n₀, then algorithm A is better than algorithm B regardless of the problem size.

Note: Our primary concern is to estimate the time of an algorithm rather than compute its exact time. A useful measurement of time can be obtained by counting the fundamental, dominating steps of the algorithm, e.g. counting the number of comparisons in a sorting algorithm.
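As a concrete sketch of counting dominating steps (the choice of selection sort here is illustrative, not taken from the notes):

```python
def selection_sort_comparisons(items):
    """Sort a copy of `items`, counting comparisons (the dominating step)."""
    a = list(items)
    comparisons = 0
    n = len(a)
    for i in range(n - 1):
        smallest = i
        for j in range(i + 1, n):
            comparisons += 1              # one basic operation per pair examined
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]
    return a, comparisons

# Selection sort always performs n(n-1)/2 comparisons; for n = 6 that is 15.
sorted_a, count = selection_sort_comparisons([5, 3, 8, 1, 9, 2])
```

The comparison count depends only on n, not on the input values, which is what makes it a convenient measure of the algorithm's work.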

3.2 Running Time

Let:
- c_op = execution time of an algorithm's basic operation on a particular computer
- C(n) = number of times the basic operation is executed for this algorithm
- T(n) = running time of this algorithm on that computer

Then T(n) ≈ c_op · C(n).

Assume that for an algorithm, the operation count is:

    C(n) = ½n(n − 1) ≈ ½n²

How much longer will the algorithm run if the input size is doubled?

    T(2n)/T(n) ≈ (c_op · ½(2n)²) / (c_op · ½n²) = 4

The run time will be about 4 times longer when the input size is doubled.

For algorithm analysis, we emphasize the order of growth of the operation count for large input sizes; small input sizes cannot distinguish efficient algorithms from inefficient ones. Three notations are defined to compare and rank the order of growth (i.e. the efficiency) of different algorithms: O (Big Oh), Ω (Big Omega), and Θ (Big Theta).

3.3 Asymptotic Notation

When we look at input sizes large enough to make only the order of growth of the running time relevant, we are studying the asymptotic efficiency of algorithms. Usually, an algorithm that is asymptotically more efficient will be the best choice for all but very small inputs.

3.3.1 The Big-Oh Notation (O)

- A notation used for characterizing the asymptotic behavior of functions.
- Gives an asymptotic upper bound for a function to within a constant factor.

Definition: Given non-negative functions f(n) and g(n), we say that f(n) = O(g(n)) if there exist an integer n₀ and a constant k > 0 such that f(n) ≤ k·g(n) for all integers n ≥ n₀.

f(n) = O(g(n)) is read "f(n) is of order at most g(n)" or "f(n) is big oh of g(n)". In the examples below, n₀ is taken to be the minimum possible value.
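The doubling argument can be checked numerically; a small sketch, assuming the same operation count C(n) = ½n(n − 1):

```python
def operation_count(n):
    """C(n) = n(n-1)/2, e.g. the comparison count of a simple quadratic sort."""
    return n * (n - 1) // 2

# Doubling the input size roughly quadruples the operation count;
# the ratio approaches 4 exactly as n grows.
ratio = operation_count(2000) / operation_count(1000)
```
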

Example 1: 2n + 10 is O(n).

    2n + 10 ≤ kn
    (k − 2)n ≥ 10
    n ≥ 10/(k − 2)

Pick k = 3; then n₀ = 10. (A plot of 2n + 10 against 3n shows 3n overtaking 2n + 10 at n = 10.)

Example 2: Given f(n) = 8n + 128 and g(n) = n², is f(n) = O(g(n))?

We need to find an integer n₀ and a constant k > 0 such that f(n) ≤ kn² for all integers n ≥ n₀:

    8n + 128 ≤ kn²
    kn² − 8n − 128 ≥ 0

If we set k = 1, this factors as (n + 8)(n − 16) ≥ 0. Hence n − 16 ≥ 0, so n₀ = 16 when k = 1.

Example 3: The function n² is NOT O(n).

    n² ≤ kn  ⟹  n ≤ k

The inequality cannot be satisfied for all n, as k must be a constant.

Example 4: 7n − 2 is O(n).

We need k > 0 and n₀ ≥ 1 such that 7n − 2 ≤ kn for all n ≥ n₀. Take k = 7, n₀ = 1.

Example 5: 3n³ + 20n² + 5 is O(n³).

We need k > 0 and n₀ ≥ 1 such that 3n³ + 20n² + 5 ≤ kn³ for all n ≥ n₀. Take k = 4:

    3n³ + 20n² + 5 ≤ 4n³
    20n² + 5 ≤ 4n³ − 3n³ = n³

This holds for n ≥ 21, so n₀ = 21.
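The (k, n₀) witnesses found in the examples above can be spot-checked over a finite range; a sketch only, since a finite check illustrates but does not prove the bound:

```python
def holds(f, g, k, n0, n_max=10_000):
    """Check f(n) <= k*g(n) for all integers n0 <= n <= n_max."""
    return all(f(n) <= k * g(n) for n in range(n0, n_max + 1))

ok1 = holds(lambda n: 2 * n + 10, lambda n: n, k=3, n0=10)                   # Example 1
ok2 = holds(lambda n: 8 * n + 128, lambda n: n * n, k=1, n0=16)              # Example 2
ok5 = holds(lambda n: 3 * n**3 + 20 * n**2 + 5, lambda n: n**3, k=4, n0=21)  # Example 5
```

All three checks succeed, consistent with the hand derivations; shrinking n₀ by one makes each fail at the boundary.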

Example 6: 3lg(n) + 5 is O(lg(n)).

We need k > 0 and n₀ ≥ 1 such that 3lg(n) + 5 ≤ k·lg(n) for all n ≥ n₀:

    3lg(n) + 5 ≤ k·lg(n)
    5 ≤ (k − 3)·lg(n)

Comparing the graphs of 3lg(n) and lg(n), we can find some similarities:
1. Both 3lg(n) and lg(n) cut the n-axis at n = 1.
2. At n = 2 (the base of the log), each graph has a y-value equal to its coefficient, i.e. 3lg(n) has the value 3 at n = 2.

Therefore, to satisfy 5 ≤ (k − 3)·lg(n) from n = 2 onward, (k − 3) should be at least 5. Take k = 8; then n₀ = 2.

3.3.2 Big-Oh Rules

If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.:
1. Drop lower-order terms.
2. Drop constant factors.

E.g. 100n + 2 is O(n).
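Example 6's witnesses (k = 8, n₀ = 2) can likewise be spot-checked over a finite range; a sketch, with `math.log2` playing the role of lg:

```python
import math

# Spot-check Example 6: 3*lg(n) + 5 <= 8*lg(n) for 2 <= n <= 100_000.
# At n = 2 both sides equal 8, so the bound is tight at n0.
log_ok = all(3 * math.log2(n) + 5 <= 8 * math.log2(n)
             for n in range(2, 100_001))
```
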

3.3.3 Big-Oh and Growth Rate

The Big-Oh notation gives an upper bound (or worst case) on the growth rate of a function. The statement "f(n) is O(g(n))" means that the growth rate of f(n) is no more than the growth rate of g(n). We can therefore use Big-Oh to rank functions according to their growth rate.

3.3.4 Functions Representing Growth Rates

The amount of time required to execute an algorithm usually depends on the input size, n. Here are seven functions that often appear in algorithm analysis, in ascending order of running time:

- Constant: 1 (fastest)
- Logarithmic: log n
- Linear: n
- N-log-N: n log n
- Quadratic: n²
- Cubic: n³
- Exponential: 2ⁿ (slowest)

3.3.5 The Big Omega Notation (Ω)

The Ω-notation provides an asymptotic lower bound. f(n) = Ω(g(n)), read "f(n) is of order at least g(n)" or "f(n) is omega of g(n)", if there exist constants k₂ > 0 and n₂ such that f(n) ≥ k₂·g(n) for all n ≥ n₂.

3.3.6 The Big Theta Notation (Θ)

The Θ-notation provides an asymptotically tight bound. f(n) = Θ(g(n)), read "f(n) is of order g(n)" or "f(n) is theta of g(n)", if f(n) = O(g(n)) and f(n) = Ω(g(n)).
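The ranking of the seven growth functions can be illustrated by evaluating each at a moderate input size; a sketch, where n = 64 is an arbitrary choice:

```python
import math

# The seven growth functions listed above, in ascending order.
growth = [
    ("constant",    lambda n: 1),
    ("logarithmic", lambda n: math.log2(n)),
    ("linear",      lambda n: n),
    ("n-log-n",     lambda n: n * math.log2(n)),
    ("quadratic",   lambda n: n ** 2),
    ("cubic",       lambda n: n ** 3),
    ("exponential", lambda n: 2 ** n),
]

n = 64
values = [f(n) for _, f in growth]   # 1, 6.0, 64, 384.0, 4096, 262144, 2**64
ordered = all(a < b for a, b in zip(values, values[1:]))
```

Already at n = 64 the values are strictly increasing, and the gaps widen rapidly as n grows.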

Example 7: Given f(n) = 60n² + 5n + 1 and g(n) = n², prove that 60n² + 5n + 1 = Θ(n²).

Proof: 60n² + 5n + 1 ≤ 60n² + 5n² + n² = 66n² for all n ≥ 1. Hence we can take k₁ = 66 and n₁ = 1, and conclude that f(n) = O(n²). Since 60n² + 5n + 1 ≥ 60n² for all n ≥ 1, we can take k₂ = 60 and n₂ = 1, and conclude that f(n) = Ω(n²). Based on the above, 60n² + 5n + 1 = Θ(n²).

Example 8: Given f(n) = 2n + 3lg(n) and g(n) = n, prove that 2n + 3lg(n) = Θ(n).

Proof: 2n + 3lg(n) ≤ 2n + 3n = 5n for all n ≥ 1. Hence we can take k₁ = 5 and n₁ = 1, and conclude that f(n) = O(n). Since 2n + 3lg(n) ≥ 2n for all n ≥ 1, we can take k₂ = 2 and n₂ = 1, and conclude that f(n) = Ω(n). Based on the above, 2n + 3lg(n) = Θ(n).
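Both Θ proofs can be spot-checked numerically with the witnesses found above; again a sketch, since a finite check illustrates but does not prove the bounds:

```python
import math

def theta_holds(f, g, k1, k2, n0, n_max=10_000):
    """Check k2*g(n) <= f(n) <= k1*g(n) for all integers n0 <= n <= n_max."""
    return all(k2 * g(n) <= f(n) <= k1 * g(n) for n in range(n0, n_max + 1))

# Example 7: 60n^2 + 5n + 1 = Theta(n^2) with k1 = 66, k2 = 60, n0 = 1
ex7 = theta_holds(lambda n: 60 * n * n + 5 * n + 1, lambda n: n * n, 66, 60, 1)
# Example 8: 2n + 3lg(n) = Theta(n) with k1 = 5, k2 = 2, n0 = 1
ex8 = theta_holds(lambda n: 2 * n + 3 * math.log2(n), lambda n: n, 5, 2, 1)
```
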