Big-O Notation and Complexity Analysis


Jonathan Backer (backer@cs.ubc.ca)
Department of Computer Science, University of British Columbia
May 28, 2007

Reading: CLRS, Growth of Functions (Ch. 3); GT, Algorithm Analysis (1.1-1.3)

Problems

This course is about solving problems with algorithms. A problem is a function from a set of valid inputs to a set of outputs; an instance of a problem is just one specific input.

Sorting
Input: a sequence of n values a_1, a_2, ..., a_n.
Output: a permutation b_1, b_2, ..., b_n of a_1, a_2, ..., a_n such that b_1 ≤ b_2 ≤ ... ≤ b_n.
Instance: 3, 8, 2, 5.

Compiling a Program
Input: a sequence of characters (a file).
Output: a sequence of bytes (either an executable file or error messages).

Algorithms

An algorithm is a finite set of instructions such that
- each step is precisely stated (e.g. English instructions, pseudo-code, flow charts, etc.),
- the result of each step is uniquely defined and depends only on the input and previously executed steps, and
- it stops after finitely many steps on every instance of the problem (i.e. no infinite loops).

Algorithms (cont'd)

What follows is a pseudo-code description of the insertion sort algorithm. We are more interested in clarity than syntax.

InsertionSort(A)
  for j ← 2 to A.length do
    key ← A[j]
    i ← j - 1
    while i > 0 and key < A[i] do
      A[i+1] ← A[i]
      i ← i - 1
    A[i+1] ← key

Just like Java, we pass parameters by value for simple types and by reference for arrays and objects.
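As a sketch of how the pseudo-code above translates into a real language, here is insertion sort in Python (Python lists are 0-indexed, so the loop bounds shift down by one relative to the pseudo-code):

```python
def insertion_sort(a):
    """Sort the list a in place and return it (0-indexed insertion sort)."""
    for j in range(1, len(a)):            # pseudo-code's j = 2 .. A.length
        key = a[j]
        i = j - 1
        # Shift elements greater than key one slot to the right.
        while i >= 0 and key < a[i]:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a

print(insertion_sort([3, 8, 2, 5]))  # the sorting instance from above -> [2, 3, 5, 8]
```

Note that, as with the pseudo-code, the list is modified in place: the caller's list is sorted, mirroring the pass-by-reference remark above.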

Analysis

We analyse the behaviour of algorithms. That is, we will prove
- that an algorithm is correct (i.e. it always terminates and returns the right result), and
- a bound on its best-/worst-/average-case time or space complexity.

A machine model captures the relevant properties of the machine that the algorithm is running on. We usually use the integer-RAM model, with constant-time memory access, sequential instruction execution, a single processor, and memory cells that hold integers of arbitrary size.

Time Complexity

We count the number of elementary steps, which depends on the instance of the problem. We look at
- worst-case: the usual choice,
- best-case: typically not very useful, and
- average-case: hard, and it really depends on the input distribution (i.e. what is "average"?).

Example: search an unsorted list of n numbers.
- Worst-case: n comparisons (the value is not in the list).
- Best-case: 1 comparison (won the lottery!).
- Average-case: n/2 comparisons, if all numbers are different, the number occurs in the array, and each permutation of the array is equally likely.
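The search example can be made concrete by counting comparisons directly. This is a minimal sketch; the function name and comparison counter are illustrative, not from the slides:

```python
def linear_search(values, target):
    """Search an unsorted list; return (index, number of comparisons)."""
    comparisons = 0
    for i, v in enumerate(values):
        comparisons += 1
        if v == target:
            return i, comparisons
    return -1, comparisons          # not found: n comparisons (worst case)

data = [7, 3, 9, 1]
print(linear_search(data, 7))   # best case: found immediately, 1 comparison
print(linear_search(data, 5))   # worst case: not in the list, n = 4 comparisons
```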

Time Complexity (cont'd)

Why analyse the worst-case? Because it provides an upper bound, may occur frequently, and is often related to the average case (but it is much easier to prove).

How do we count elementary steps? It is hard to do exactly (even in the worst case). So we use asymptotic notation, because it focuses on behaviour in the limit (where things break) and is independent of the underlying technology (e.g. machine, OS, compiler, etc.).

Big-O Notation

Intuition: f is upper-bounded by a multiple of g in the limit.

Definition. Let g : ℕ → ℝ. Then f : ℕ → ℝ is in O(g(n)) if and only if there exist c ∈ ℝ⁺ and n₀ ∈ ℕ such that f(n) ≤ c·g(n) for all n ≥ n₀.

[Figure: c·g(n) eventually lies above f(n), for all n ≥ n₀.]
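The definition can be spot-checked numerically for concrete functions and witnesses. The helper below is an illustrative sketch: checking a finite range of n does not prove membership in O(g(n)), but it is a useful sanity check on a chosen c and n₀:

```python
# Spot-check f(n) <= c * g(n) for all n in [n0, n_max].
# Illustrates the Big-O definition; a finite check is not a proof.
def bounded_by(f, g, c, n0, n_max=1000):
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# Witnesses c = 1, n0 = 2 for 2n in O(n^2):
print(bounded_by(lambda n: 2 * n, lambda n: n * n, c=1, n0=2))   # True
# The reverse direction fails: n^2 is not bounded by 10 * (2n).
print(bounded_by(lambda n: n * n, lambda n: 2 * n, c=10, n0=1))  # False
```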

Constructive Big-O Proofs

These are proofs following the definition, called constructive because we construct specific c and n₀.

Show 2n ∈ O(n²). Take c = 1 and n₀ = 2. Then for n ≥ 2,
  2n ≤ n·n = n².
Or take c = 2 and n₀ = 1. Then for n ≥ 1,
  2n ≤ 2·n·n = 2n².

Show 7n² + 5 ∈ O(n³/6). Take c = 72 and n₀ = 1. Then for n ≥ 1,
  7n² + 5 ≤ 7n² + 5n² = 12n² ≤ 12n³ = 72 · (n³/6).

Big-O Proofs by Contradiction

Typically used to prove that f(n) ∉ O(g(n)).

Show n² ∉ O(2n). Suppose, for contradiction, that n² ∈ O(2n), with witnesses c and n₀, so that n² ≤ c·(2n) for all n ≥ n₀. Consider any n > max{n₀, 2c}. Then
  n² = n·n > (2c)·n = c·(2n),
which is a contradiction.

Big-O Ignores Constant Factors

Theorem. If f(n) ∈ O(g(n)), then x·f(n) ∈ O(g(n)) for every (constant) x ∈ ℝ⁺.

Proof. Since f(n) ∈ O(g(n)), consider c and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀. Let b = c·x. Then, for n ≥ n₀,
  x·f(n) ≤ x·c·g(n) = b·g(n).
Hence x·f(n) ∈ O(g(n)).

Big-O Ignores Lower-Order Terms

Theorem. If f(n) ∈ O(g(n)) and h(n) ∈ O(f(n)), then f(n) + h(n) ∈ O(g(n)).

Proof. Consider a and l₀ such that f(l) ≤ a·g(l) for all l ≥ l₀. Consider b and m₀ such that h(m) ≤ b·f(m) for all m ≥ m₀. Let c = a·(1 + b) and n₀ = max{l₀, m₀}. Then for n ≥ n₀,
  f(n) + h(n) ≤ f(n) + b·f(n) = (1 + b)·f(n) ≤ (1 + b)·a·g(n) = c·g(n).
So f(n) + h(n) ∈ O(g(n)).
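Both theorems can be seen numerically: dividing a function by its dominant term, the constant factor and lower-order terms leave the ratio bounded. The polynomial below is an illustrative example, not from the slides:

```python
# f(n) = 5n^2 + 100n + 3 is in O(n^2): the ratio f(n)/n^2 stays bounded
# and settles toward the leading constant 5 as n grows.
def f(n):
    return 5 * n * n + 100 * n + 3

for n in (10, 100, 1000, 10000):
    print(n, f(n) / (n * n))    # ratio approaches 5
```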

Big-Ω Notation

Intuition: f is lower-bounded by a multiple of g in the limit.

Definition. Let g : ℕ → ℝ. Then f : ℕ → ℝ is in Ω(g(n)) if and only if there exist c ∈ ℝ⁺ and n₀ ∈ ℕ such that f(n) ≥ c·g(n) for all n ≥ n₀.

[Figure: f(n) eventually lies above c·g(n), for all n ≥ n₀.]

Other Asymptotic Notations

Definition. Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).

There is a correspondence between the asymptotic notations and comparisons of f(n) with g(n):

  f(n) ∈ o(g(n))  ~  f(n) "<" g(n)
  f(n) ∈ O(g(n))  ~  f(n) "≤" g(n)
  f(n) ∈ Θ(g(n))  ~  f(n) "=" g(n)
  f(n) ∈ Ω(g(n))  ~  f(n) "≥" g(n)
  f(n) ∈ ω(g(n))  ~  f(n) ">" g(n)

except that not every pair of functions is comparable.

Limits

Theorem. Let f, g : ℕ → ℝ⁺, and suppose L = lim_{n→∞} f(n)/g(n) exists. Then
- f(n) ∈ ω(g(n)) if L = +∞,
- f(n) ∈ Θ(g(n)) if L ∈ ℝ⁺, and
- f(n) ∈ o(g(n)) if L = 0.

Show √n ∈ ω(log n).
  lim_{n→∞} √n / log n = ∞/∞, so use L'Hôpital's rule:
  lim_{n→∞} (1/(2√n)) / (1/n) = lim_{n→∞} √n / 2 = ∞.

Time Complexity (Redux)

How do you determine the run-time of an algorithm? Pick a barometer: an operation performed a number of times proportional to the (worst-case) running time. Count how many times the barometer is performed.

Example:
  for i ← 1 to n do
    for j ← 1 to 2^i do
      print "Hi!"

Geometric series: Σ_{i=0}^{n} a^i = (a^{n+1} − 1)/(a − 1).

T(n) = Σ_{i=1}^{n} 2^i = (2^{n+1} − 1)/(2 − 1) − 1 = 2^{n+1} − 2 ∈ Θ(2^n).
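As a quick sketch, we can count the barometer operation (the print statement) for the nested loop above and compare it with the closed form 2^(n+1) − 2 from the geometric series:

```python
# Count how many times the inner print would execute in
#   for i = 1..n: for j = 1..2^i: print "Hi!"
# and compare with the closed form 2^(n+1) - 2.
def barometer_count(n):
    count = 0
    for i in range(1, n + 1):
        count += 2 ** i          # the inner loop body runs 2^i times
    return count

for n in (1, 5, 10):
    print(n, barometer_count(n), 2 ** (n + 1) - 2)  # the two counts agree
```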