CISC 235: Topic 1. Complexity of Iterative Algorithms

Outline: Complexity Basics; Big-Oh Notation; Big-Ω and Big-Θ Notation; Summations; Limitations of Big-Oh Analysis

Complexity. Complexity is the study of how the time and space needed to execute an algorithm vary with problem size. The purpose is to compare alternative algorithms for solving a problem and determine which is best. Time Complexity: let T(n) denote the time to execute an algorithm on an input of size n. How can we measure T(n)?

Experimental Study. Implement the alternative algorithms and then time them on various benchmark inputs, measuring running time with a method like Java's System.currentTimeMillis(). Drawbacks: too much coding, and the results are not general. [Figure: measured running time T(n) plotted against input size n.]
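As a rough illustration of the experimental approach, here is a minimal Java timing sketch added for this transcript (the findSmallest method, class name, and array sizes are assumptions, not part of the slides):

    import java.util.Random;

    public class TimingDemo {
        // Example algorithm to time: find the index of the smallest element.
        static int findSmallest(int[] a) {
            int small = 0;
            for (int j = 1; j < a.length; j++)
                if (a[j] < a[small]) small = j;
            return small;
        }

        public static void main(String[] args) {
            // Time the algorithm on a few benchmark sizes, doubling n each round.
            for (int n = 100_000; n <= 1_600_000; n *= 2) {
                int[] a = new Random(42).ints(n).toArray();
                long start = System.currentTimeMillis();   // start the clock
                findSmallest(a);
                long elapsed = System.currentTimeMillis() - start;
                System.out.println("n = " + n + ", time = " + elapsed + " ms");
            }
        }
    }

Even this small experiment shows the drawbacks listed above: the measurements depend on the machine, the compiler, and the benchmark inputs chosen.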

Mathematical Analysis. Analyze alternative algorithms mathematically, prior to coding. Define the amount of time taken by an algorithm to be a function of the size of the input data: T(n) = ? Count the key instructions that are executed to obtain this function in terms of the input size. Compare algorithms by comparing how fast their running time functions grow as the input size increases.

Finding an Algorithm's Running Time Function. Count the key instructions that are executed to obtain the running time in terms of the size of the input data. Important decisions: What is the measure of the size of the input? Which are the key instructions?

Example: Find the smallest value in array A of length n

    int small = 0;                        // line 1
    for ( int j = 0; j < n; j++ )         // line 2
        if ( A[j] < A[small] )            // line 3
            small = j;                    // line 4

Counting assignments: Line 1: 1. Line 2: n + 1. Line 3: 0. Line 4: best case 0, worst case n. So, in the worst case, T(n) = 2n + 2.
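To make the counting concrete, here is a small Java sketch added for this transcript (not from the slides) that instruments the loop with an explicit counter; the count never exceeds 2n + 2, matching the worst-case T(n) above:

    // Illustrative sketch: count the assignments executed by the find-smallest loop.
    static long countAssignments(int[] A) {
        int n = A.length;
        long count = 0;
        int small = 0;                         // line 1: one assignment
        count++;
        count++;                               // line 2: the initial j = 0
        for (int j = 0; j < n; j++) {
            if (A[j] < A[small]) {             // line 3: comparison only, no assignment
                small = j;                     // line 4: executed at most n times
                count++;
            }
            count++;                           // line 2: j++ on each iteration
        }
        return count;                          // at most 2n + 2, matching T(n) = 2n + 2
    }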

Example (continued): Find the smallest value in array A of length n

    int small = 0;
    for ( int j = 0; j < n; j++ )
        if ( A[j] < A[small] )
            small = j;

Substitute constants a and b to reduce analysis time: Line 1: b (constant time for everything outside the loop). Line 2: n (variable number of times through the loop). Lines 2, 3, 4: a (constant time for everything inside the loop, including the loop test and increment). So, T(n) = an + b.

Comparing Running Times. For large input sizes, constant terms are insignificant: Program A with running time T_A(n) = 100n eventually runs faster than Program B with running time T_B(n) = 2n². [Figure: plot of T_A(n) = 100n and T_B(n) = 2n² against input size n; the curves cross at n = 50, where both equal 5000.]

Big-Oh Notation. Purpose: establish a relative ordering among functions by comparing their relative rates of growth. Example: f(x) = 4x + 3. Big-Oh notation: f(x) ∈ O(x), or "f(x) is O(x)".

Definition of Big-Oh Notation. f(x) is O(g(x)) if two constants C and k can be found such that: for all x ≥ k, f(x) ≤ C·g(x). Note: Big-Oh is an upper bound.

Meaning of Big-Oh Notation. The function f(x) is one of the set of functions whose order-of-magnitude growth rate is ≤ the growth rate of g(x). Equivalently, f(x) is at most a constant times g(x), except possibly for some small values of x.

Graph of Big-Oh for a Program. For all n ≥ k, T(n) ≤ C·g(n). [Figure: curves C·g(n), T(n), and g(n) plotted against input size n; beyond the threshold k, the running time T(n) stays below C·g(n).]

Example: show that f(x) = x² + 2x + 1 is O(x²). If x > 1, then x² + 2x + 1 ≤ x² + 2x² + x² = 4x², so we can take k = 1 and C = 4. Alternatively, if x > 2, then x² + 2x + 1 ≤ x² + x² + x² = 3x², so k = 2 and C = 3 also work.

Rules for Big-Oh. We want the closest (tightest) upper bound: if f(x) = 100x, we choose "f(x) is O(x)", not "f(x) is O(x²)". Never include constants or lower-order terms inside a Big-Oh (and don't include the base of a logarithm): if f(x) = 2x² + x, we choose "f(x) is O(x²)", not "f(x) is O(2x²)" and not "f(x) is O(x² + x)".

Order of Magnitude Growth Rates

    Function    Descriptor     Big-Oh
    C           Constant       O(1)
    log n       Logarithmic    O(log n)
    n           Linear         O(n)
    n log n     n log n        O(n log n)
    n^2         Quadratic      O(n^2)
    n^3         Cubic          O(n^3)
    n^k         Polynomial     O(n^k)
    2^n         Exponential    O(2^n)
    n!          Factorial      O(n!)

[Figure: graph comparing the order-of-magnitude growth rates listed above.]

Change to Running Times When the Input Size n Is Doubled

    Function    n = 100     n = 200            Change
    C           C           C                  unchanged
    log n       log 100     (log 100) + 1      increases by 1
    n           100         200                × 2
    n^2         100^2       4 × 100^2          × 4
    n^3         100^3       8 × 100^3          × 8
    2^n         2^100       2^100 × 2^100      × 2^100

Growth of Combinations of Functions. If f₁(x) is O(g₁(x)) and f₂(x) is O(g₂(x)), then (f₁ + f₂)(x) is O(max(g₁(x), g₂(x))) and (f₁f₂)(x) is O(g₁(x)·g₂(x)).
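As a quick worked instance of these rules (added here for illustration), take f₁(x) = 3x² + 5, which is O(x²), and f₂(x) = (5 + log₂ x)(3x − 3), which is O(x log x); then

\[
(f_1 + f_2)(x) \ \text{is}\ O\!\left(\max(x^2,\; x \log x)\right) = O(x^2),
\qquad
(f_1 f_2)(x) \ \text{is}\ O\!\left(x^2 \cdot x \log x\right) = O(x^3 \log x).
\]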

Deriving the Big-Oh of Functions

    f(x)                       Big-Oh
    3x^2 + 5                   O(x^2)
    2x^3 - x^2 - 6             O(x^3)
    log_2 x + x                O(x)
    (5 + log_2 x)(3x - 3)      O(x log x)
    4(2^x - x^3)               O(2^x)
    4c - 6d + 17a              O(1)

Big-Omega (Big-Ω) Notation. Expresses a lower bound on a function: f(x) is Ω(g(x)) if f(x) is at least a constant times g(x), except possibly for some small values of x. Formally: f(x) is Ω(g(x)) if two constants C and k can be found such that for all x ≥ k, f(x) ≥ C·g(x).

Graph of Big-Ω for a Program. For all n ≥ k, T(n) ≥ C·g(n). [Figure: running time T(n) plotted against input size n, staying above the curve C·g(n) beyond the threshold k.]

Big-Theta (Big-Θ) Notation. Expresses a tight bound on a function: f(x) is Θ(g(x)) if f(x) is both O(g(x)) and Ω(g(x)). [Figure: running time T(n) sandwiched between C₂·g(n) and C₁·g(n) as the input size n grows.]
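For instance (a worked example added here, not from the slides), f(x) = 2x² + 3x is Θ(x²), since

\[
2x^2 \;\le\; 2x^2 + 3x \;\le\; 5x^2 \quad \text{for all } x \ge 1,
\]

so the definition is satisfied with C₂ = 2, C₁ = 5, and k = 1.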

Example. Input: an (n+1)-element array A of coefficients and a number x. Output: pval, the value of the polynomial A[0] + A[1]x + A[2]x² + … + A[n]xⁿ.

Algorithm 1

    pval = A[0];
    for ( int i = 1; i <= n; i++ )
        pval = pval + A[i] * Math.pow(x, i);

What is the Big-Oh analysis of this algorithm?

Algorithm 2

    pval = A[0];
    for ( int i = 1; i <= n; i++ ) {
        powx = 1;
        for ( int j = 0; j < i; j++ )
            powx = x * powx;
        pval = pval + A[i] * powx;
    }

What is the Big-Oh analysis of this algorithm?

Algorithm 3: Horner's method. To evaluate a₀ + a₁x + a₂x² + … + aₙxⁿ, use P(x) = a₀ + x(a₁ + x(a₂ + … + x(aₙ₋₁ + x·aₙ)…)).

    pval = x * A[n];
    for ( int i = n - 1; i > 0; i-- )
        pval = x * ( A[i] + pval );
    pval = pval + A[0];

What is the Big-Oh analysis of this algorithm?
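For convenience, here is a runnable Java sketch of the three polynomial-evaluation algorithms above, added for this transcript (the method names are assumptions; the loop bodies follow the slides):

    // Algorithm 1: recompute x^i with Math.pow on every iteration.
    static double evalPow(double[] A, double x) {
        int n = A.length - 1;
        double pval = A[0];
        for (int i = 1; i <= n; i++)
            pval = pval + A[i] * Math.pow(x, i);
        return pval;
    }

    // Algorithm 2: rebuild x^i from scratch with an inner loop (nested loops).
    static double evalNested(double[] A, double x) {
        int n = A.length - 1;
        double pval = A[0];
        for (int i = 1; i <= n; i++) {
            double powx = 1;
            for (int j = 0; j < i; j++)
                powx = x * powx;
            pval = pval + A[i] * powx;
        }
        return pval;
    }

    // Algorithm 3: Horner's method -- one multiplication and one addition per
    // coefficient. Assumes n >= 1 (A has at least two entries), as in the slides.
    static double evalHorner(double[] A, double x) {
        int n = A.length - 1;
        double pval = x * A[n];
        for (int i = n - 1; i > 0; i--)
            pval = x * (A[i] + pval);
        return pval + A[0];
    }

For reference when answering the questions above: Algorithm 2's inner loop alone performs 1 + 2 + … + n = n(n+1)/2 multiplications, whereas Horner's method uses only n multiplications and n additions in its loop.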

Example: Selection Sort

    for ( int i = 0; i < n-1; i++ ) {
        int small = i;
        for ( int j = i + 1; j < n; j++ )
            if ( A[j] < A[small] )
                small = j;
        int temp = A[small];
        A[small] = A[i];
        A[i] = temp;
    }
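As a worked count added here (using the summation tools introduced on the next slides): for each value of i the inner loop runs n − i − 1 times, so the comparison A[j] < A[small] executes

\[
\sum_{i=0}^{n-2} (n - i - 1) \;=\; (n-1) + (n-2) + \cdots + 1 \;=\; \frac{n(n-1)}{2}
\]

times, which is O(n²).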

Summation Notation. $\sum_{i=m}^{n} a_i$ represents $a_m + a_{m+1} + \cdots + a_n$.

Summation Manipulations. 1. Take out constant factors: $\sum_{k=1}^{n} \frac{k}{2} = \frac{1}{2}\sum_{k=1}^{n} k$. 2. Decompose large terms: $\sum_{k=1}^{n} (n - k + k^2) = \sum_{k=1}^{n} n - \sum_{k=1}^{n} k + \sum_{k=1}^{n} k^2$.

Summation Manipulations (continued). 3. Partial sums: $\sum_{k=j+1}^{n} k = \sum_{k=1}^{n} k - \sum_{k=1}^{j} k$. 4. Practice: $\sum_{j=i}^{n} (kj + a) = \;?$

Number of Terms in a Summation: (upper bound) − (lower bound) + 1. Examples: $\sum_{k=2}^{n} 1 = 1 + 1 + \cdots + 1 = n - 2 + 1 = n - 1$, and $\sum_{k=j+1}^{n-1} 1 = 1 + 1 + \cdots + 1 = (n-1) - (j+1) + 1 = n - j - 1$. Note: this is equivalent to calculating the number of iterations of a for loop, as the sketch below illustrates.
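A small Java illustration added here (not from the slides): the loop header mirrors the bounds of the second summation above, and the counter ends at n − j − 1.

    // The loop iterates once per term of the sum from k = j+1 to n-1,
    // so it executes (n-1) - (j+1) + 1 = n - j - 1 times.
    static int countIterations(int n, int j) {
        int count = 0;
        for (int k = j + 1; k <= n - 1; k++)
            count++;
        return count;   // equals n - j - 1 (or 0 if j + 1 > n - 1)
    }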

Calculating Arithmetic Summations: ((first term + last term) × number of terms) / 2. Examples: $\sum_{k=1}^{n} k = 1 + 2 + \cdots + n = \frac{(1+n)\,n}{2} = \frac{n(n+1)}{2}$, and $\sum_{i=0}^{n-2} (n - i - 1) = (n-1) + (n-2) + \cdots + 1 = \frac{((n-1) + 1)(n-1)}{2} = \frac{n(n-1)}{2}$.

Example: Max Subsequence Sum

    int maxsum = 0;
    int besti = 0;
    int bestj = 0;
    for ( int i = 0; i < n; i++ )
        for ( int j = i; j < n; j++ ) {
            int thissum = 0;
            for ( int k = i; k <= j; k++ )
                thissum = thissum + A[k];
            if ( thissum > maxsum ) {
                maxsum = thissum;
                besti = i;
                bestj = j;
            }
        }
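A worked derivation added here: the innermost statement thissum = thissum + A[k] executes once for every triple (i, j, k) with 0 ≤ i ≤ j ≤ n − 1 and i ≤ k ≤ j, so the total count is

\[
\sum_{i=0}^{n-1}\sum_{j=i}^{n-1}\sum_{k=i}^{j} 1
\;=\; \sum_{i=0}^{n-1}\sum_{j=i}^{n-1} (j - i + 1)
\;=\; \sum_{i=0}^{n-1} \frac{(n-i)(n-i+1)}{2}
\;=\; \frac{n(n+1)(n+2)}{6},
\]

so this algorithm is O(n³).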

Analyzing the Complexity of Lists

    Operation        Sorted Array    Sorted Linked List    Unsorted Array    Unsorted Linked List
    Search( L, x )
    Insert( L, x )
    Delete( L, x )

Limitations of Big-Oh Analysis Not appropriate for small amounts of input Use the simplest algorithm Large constants can affect which algorithm is more efficient e.g., 2nlogn versus 1000n Assumption that all basic operations take 1 time unit is faulty e.g., memory access versus disk access (1000s of time more ) Big-Oh can give serious over-estimate e.g., loop inside an if statement that seldom executes 37