Data Structures and Algorithms: Running Time and Growth Functions (January 18, 2018)

Measuring Running Time of Algorithms

One way to measure the running time of an algorithm is to implement it and then study its running time by executing it on various inputs. This approach has the following limitations.

- It is necessary to implement and execute an algorithm in order to study its running time experimentally.
- Experiments can be done only on a limited set of inputs and may not be indicative of the running time on inputs that were not included in the experiment.
- It is difficult to compare the efficiency of two algorithms unless they are implemented and executed in the same software and hardware environments.

Analyzing algorithms analytically does not require that the algorithms be implemented, takes into account all possible inputs, and allows us to compare the efficiency of two algorithms independently of the hardware and software environment.

Model of Computation

The model of computation that we use to analyze algorithms is called the Random Access Machine (RAM). In this model we assume that high-level operations that are independent of the programming language take one time step each. Such high-level operations include performing an arithmetic operation, comparing two numbers, assigning a value to a variable, calling a function, returning from a function, and indexing into an array. Each of these high-level operations actually consists of a few primitive low-level instructions whose execution time depends on the hardware and software environment, but the number of such instructions is bounded by a constant. We also assume that each memory access takes exactly one time step and that memory is unlimited; the model does not take into account whether an item is in the cache or on the disk. While some of these assumptions do not hold on a real computer, they are made to simplify the mathematical analysis of algorithms. Despite its simplicity, the RAM model captures the essential behavior of computers and is an excellent model for understanding how an algorithm will perform on a real machine.

Average Case and Worst Case

As we saw in class with Insertion Sort, an algorithm works faster on some inputs than on others. How, then, should we express the running time of an algorithm? Let T(n) denote the running time (the number of high-level steps) of an algorithm on an input of size n. Note that there are 2^n input instances with n bits.
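To see concretely that running time depends on the input, and that wall-clock measurements depend on the machine while step counts do not, here is a minimal Python sketch (the notes themselves contain no code). It runs Insertion Sort on sorted and reverse-sorted inputs and reports both the elapsed time and the number of key comparisons, a machine-independent proxy for the RAM-model step count.

    import time

    def insertion_sort(a):
        """Sort the list a in place; return the number of key comparisons,
        a machine-independent proxy for the RAM-model step count."""
        comparisons = 0
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0:
                comparisons += 1
                if a[j] <= key:
                    break
                a[j + 1] = a[j]   # shift the larger element one slot right
                j -= 1
            a[j + 1] = key
        return comparisons

    for n in (1000, 2000, 4000):
        for label, data in (("sorted", list(range(n))),
                            ("reversed", list(range(n, 0, -1)))):
            start = time.perf_counter()
            steps = insertion_sort(data)
            elapsed = time.perf_counter() - start
            print(f"n={n:5d} {label:8s} comparisons={steps:9d} time={elapsed:.4f}s")

On sorted input the comparison count grows roughly linearly in n, while on reversed input it grows roughly quadratically; the time column, by contrast, will vary from machine to machine, which is precisely the limitation of the experimental approach.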

Below are three possibilities for the inputs that we will consider.

- A typical input: the problem with considering a typical input is that different applications have very different typical inputs.
- Average case: average-case analysis is challenging. It requires us to define a probability distribution on the set of inputs, which is a difficult task.
- Worst case: in this measure we consider the input on which the algorithm performs slowest. When we say that T(n) is the worst-case running time of an algorithm, we mean that the algorithm takes no more than T(n) steps on any input of size n. In other words, T(n) is an upper bound on the running time of the algorithm for every input. This measure is a clean mathematical definition and is the easiest to analyze.

Order of Growth

To simplify our analysis we ignore the lower-order terms in the function T(n) as well as the multiplicative constant in front. That is, it is the rate of growth, or order of growth, that is of interest to us. We also ignore the function for small values of n and focus on its asymptotic behavior as n becomes very large. Some reasons are as follows.

- The multiplicative constant depends on how fast the machine is and on the precise definition of an operation.
- Counting every operation that the algorithm performs is tedious and more work than it is worth.
- It is much more significant whether the time complexity is T(n) = n^2 or T(n) = n^3 than whether it is T(n) = 2n^2 or T(n) = 3n^2.

Definitions of the Asymptotic Notations O, Ω, Θ, o, ω

See the textbook (Chapter 3).

Prove that 7n - 2 = Θ(n).

Solution. Note that 7n - 2 ≤ 14n for all n ≥ 1. Thus 7n - 2 = O(n) follows by setting c = 14 and n_0 = 1. To show that 7n - 2 = Ω(n), we need to show that there exist positive constants c and n_0 such that 7n - 2 ≥ cn for all n ≥ n_0. This is the same as showing that (7 - c)n ≥ 2 for all n ≥ n_0. Setting c = 1 and n_0 = 1 proves the lower bound. Hence 7n - 2 = Θ(n).

Prove that 10n^3 + 55n log n + 23 = O(n^3).

Solution. The left-hand side is at most 88n^3, since each of the three terms is at most its coefficient times n^3 for n ≥ 1. Thus setting c = 88 and n_0 = 1 proves the claim.
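Constants found in such proofs are easy to spot-check numerically. Below is a small Python sketch with a hypothetical helper check_big_o (not from the notes); a finite check like this is evidence that a chosen pair (c, n_0) works, not a substitute for the proof.

    import math

    def check_big_o(f, g, c, n0, n_max=10**6, stride=997):
        """Spot-check that f(n) <= c * g(n) for sampled n in [n0, n_max]."""
        return all(f(n) <= c * g(n) for n in range(n0, n_max, stride))

    # 7n - 2 = O(n) with c = 14, n_0 = 1
    print(check_big_o(lambda n: 7*n - 2, lambda n: n, c=14, n0=1))      # True
    # 7n - 2 = Omega(n) with c = 1, n_0 = 1, i.e. n <= 1 * (7n - 2)
    print(check_big_o(lambda n: n, lambda n: 7*n - 2, c=1, n0=1))       # True
    # 10n^3 + 55 n lg n + 23 = O(n^3) with c = 88, n_0 = 1 (lg = log base 2)
    print(check_big_o(lambda n: 10*n**3 + 55*n*math.log2(n) + 23,
                      lambda n: n**3, c=88, n0=1))                      # True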

Prove that 3^60 = O(1).

Solution. This follows because 3^60 ≤ 3^60 · 1 for all n ≥ 0; that is, c = 3^60 and n_0 = 0 work.

Prove that 5/n = O(1/n).

Solution. This follows because 5/n ≤ 5 · (1/n) for all n ≥ 1.

Prove that n^2/8 - 50n = Θ(n^2).

Solution.

First solution: To prove the upper bound, we need to show that

    0 ≤ n^2/8 - 50n ≤ c_2 n^2 for all n ≥ n_0.

Setting c_2 = 1 and n_0 = 800 satisfies these inequalities. To show that n^2/8 - 50n = Ω(n^2), we need to show that there exist positive constants c_1 and n_0 such that n^2/8 - 50n ≥ c_1 n^2 for all n ≥ n_0. This is the same as showing that (1/8 - c_1)n ≥ 50 for all n ≥ n_0, or equivalently (1 - 8c_1)n ≥ 400. Setting c_1 = 1/16 and n_0 = 800 proves the lower bound. Hence setting c_1 = 1/16, c_2 = 1, and n_0 = 800 in the definition of Θ(·) proves that n^2/8 - 50n = Θ(n^2).

Second solution: Note that

    lim_{n→∞} n^2 / (n^2/8 - 50n) = lim_{n→∞} 8n / (n - 400) = lim_{n→∞} 8 / (1 - 400/n) = 8.

Since the limit is a constant strictly between 0 and ∞, the claim follows.

Prove that lg n = O(n).

Solution. We show by induction on n that lg n ≤ n for all n ≥ 1.

Base case (n = 1): the claim holds, as the left-hand side is 0 and the right-hand side is 1.

Induction hypothesis: assume that lg k ≤ k for some k ≥ 1.

Induction step: we want to show that lg(k + 1) ≤ k + 1. We have

    lg(k + 1) ≤ lg(2k) = lg 2 + lg k ≤ 1 + k    (using the induction hypothesis).
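The limit in the second solution is easy to observe numerically. The short sketch below (not from the notes) prints f(n)/n^2 for f(n) = n^2/8 - 50n at increasing n; the ratio is exactly 1/16 at n = 800, matching the lower-bound constant c_1, and climbs toward the limiting value 1/8.

    # f(n)/n^2 for f(n) = n^2/8 - 50n approaches 1/8 = 0.125 from below,
    # consistent with lim n^2/f(n) = 8 computed above
    f = lambda n: n * n / 8 - 50 * n
    for n in (800, 8_000, 80_000, 800_000):
        print(f"{n:7d}  {f(n) / (n * n):.6f}")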

Prove that 3n^100 = O(2^n).

Solution. We show by induction on n that 3n^100 ≤ 3(100^100)(2^n) for all n ≥ 200; that is, c = 3(100^100) and n_0 = 200.

Base case (n = 200): the claim holds, as the left-hand side is 3 · 200^100 = 3(100^100)(2^100) and the right-hand side is 3(100^100)(2^200).

Induction hypothesis: assume that 3k^100 ≤ 3(100^100)(2^k) for some k ≥ 200.

Induction step: we want to show that 3(k + 1)^100 ≤ 3(100^100)(2^(k+1)). We have

    3(k + 1)^100 = 3(k^100 + C(100,1) k^99 + C(100,2) k^98 + ... + 1)      (Binomial Theorem)
                 ≤ 3(k^100 + 100 k^99 + 100^2 k^98 + ... + 100^100 k^0)   (since C(100,i) ≤ 100^i)
                 = 3k^100 (1 + (100/k) + (100/k)^2 + ... + (100/k)^100)
                 ≤ 3k^100 · Σ_{i=0}^{∞} (1/2)^i                           (since 100/k ≤ 1/2 for k ≥ 200)
                 ≤ 3(100^100)(2^k) · 2                                    (induction hypothesis; the geometric series sums to 2)
                 = 3(100^100)(2^(k+1)).

Another way to prove the claim is as follows. Consider lim_{n→∞} 2^n / (3n^100). This can be written as

    lim_{n→∞} 2^(n - lg(3n^100)) = lim_{n→∞} 2^(n - lg 3 - 100 lg n) = ∞,

since n - lg 3 - 100 lg n → ∞.

Prove that 7n - 2 is not Ω(n^10).

Solution. Note that lim_{n→∞} n^10 / (7n - 2) = ∞, and hence 7n - 2 = o(n^10). Thus it cannot be Ω(n^10). We can also prove the claim directly from the definition of Ω. Assume for the sake of contradiction that 7n - 2 = Ω(n^10). This means that for some constants c and n_0, 7n - 2 ≥ c n^10 for all n ≥ n_0. This implies that 7n ≥ c n^10, that is, 7 ≥ c n^9. This is a contradiction, as the left-hand side is a constant while the right-hand side keeps growing with n. More specifically, the right-hand side exceeds the left-hand side as soon as n > (7/c)^(1/9).

Prove that n^(1.0001) is not O(n).

Solution. Assume for the sake of contradiction that n^(1.0001) = O(n). This means that for some constants c and n_0, n^(1.0001) ≤ cn for all n ≥ n_0. This is equivalent to

    n^(0.0001) ≤ c for all n ≥ n_0.

This is impossible, as the left-hand side grows without bound as n grows. More specifically, as soon as n > c^(10^4), the left-hand side becomes greater than c.
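Refutations like the two above can be illustrated by searching for a witness n that violates the claimed inequality. The sketch below uses a hypothetical helper find_counterexample (not from the notes); note that a witness refutes one specific pair (c, n_0), whereas the proofs above rule out every pair at once.

    def find_counterexample(f, g, c, n0, n_max=10**7):
        """Search n = n0, 2*n0, 4*n0, ... for a witness with f(n) > c * g(n),
        refuting the claim 'f(n) <= c * g(n) for all n >= n0'."""
        n = max(n0, 1)
        while n <= n_max:
            if f(n) > c * g(n):
                return n
            n *= 2
        return None

    # witness that 7n - 2 >= (1/1000) * n^10 fails, i.e. n^10 > 1000 * (7n - 2)
    print(find_counterexample(lambda n: n**10, lambda n: 7*n - 2, c=1000, n0=1))
    # witness that n^1.0001 <= 1.001 * n fails (for larger c the witness is huge)
    print(find_counterexample(lambda n: n**1.0001, lambda n: n, c=1.001, n0=1))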

We now state, without proof, some properties of asymptotic growth functions.

1. If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).
2. If f(n) = Ω(g(n)) and g(n) = Ω(h(n)), then f(n) = Ω(h(n)).
3. If f(n) = Θ(g(n)) and g(n) = Θ(h(n)), then f(n) = Θ(h(n)).
4. Suppose f and g are two functions such that, for some third function h, we have f(n) = O(h(n)) and g(n) = O(h(n)). Then f(n) + g(n) = O(h(n)).

We now state asymptotic bounds for some common functions.

1. Recall that a polynomial is a function that can be written in the form a_0 + a_1 n + a_2 n^2 + ... + a_d n^d for some integer constant d > 0 with a_d nonzero. Let f be a polynomial of degree d in which the coefficient a_d > 0. Then f = O(n^d).
2. For every b > 1 and all real numbers x, y > 0, we have (log_b n)^x = O(n^y). Note that (log_b n)^x is also written as log_b^x n.
3. For every r > 1 and every d > 0, we have n^d = O(r^n).

As a result of the above bounds, we can conclude that, asymptotically, 10^5 log^38 n grows more slowly than n/10^10.

Prove that n^10 = o(2^(lg^3 n)) and 2^(lg^3 n) = o(2^n).

Solution. Note that 2^(lg^3 n) = (2^(lg n))^(lg^2 n) = n^(lg^2 n). Since 10 = o(lg^2 n), we conclude that n^10 = o(2^(lg^3 n)). Since any poly-logarithmic function of n, such as lg^3 n, is asymptotically upper-bounded by any polynomial function of n, we have lg^3 n = O(n). In fact, lg^3 n = o(n), and hence 2^(lg^3 n) = o(2^n).
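Both facts invite quick numeric sanity checks. The sketch below (not from the notes) verifies the identity 2^(lg^3 n) = n^(lg^2 n) exactly at n = 16, and then compares 10^5 lg^38 n with n/10^10 in log-space; the polynomial side does eventually dominate, but only past an enormous crossover (roughly n ≈ 2^375 if the logarithm is taken base 2), which no experiment over small n would reveal.

    import math

    # exact check of the identity 2^(lg^3 n) = n^(lg^2 n) at n = 16, lg n = 4
    assert 2 ** (4 ** 3) == 16 ** (4 ** 2)   # both sides equal 2^64

    # compare lg(10^5 * (lg n)^38) with lg(n / 10^10) at n = 2^t;
    # the answer flips from True to False between t = 256 and t = 384
    for t in (64, 128, 256, 384, 512):
        log_side = 5 * math.log2(10) + 38 * math.log2(t)
        poly_side = t - 10 * math.log2(10)
        print(f"n = 2^{t}: polylog side still larger? {log_side > poly_side}")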