CS 380 ALGORITHM DESIGN AND ANALYSIS


Lecture 2: Asymptotic Analysis, Insertion Sort
Text Reference: Chapters 2, 3

Space and Time Complexity
For a given problem, there may be a variety of algorithms that you could implement (and a variety of data structures you could use). For each algorithm, we want to consider the time complexity (and occasionally the space (memory) complexity).

Exact Time Complexity Analysis is Hard
The exact running-time function is difficult to work with because it is typically very complicated. It is cleaner and easier to talk about upper and lower bounds on the function. Remember that we ignore constants. This makes sense: running our algorithm on a machine that is twice as fast only changes the running time by a multiplicative constant of 2, so we are going to have to ignore constant factors anyway. Asymptotic notation (Ο, Θ, Ω) is the best that we can practically do to deal with the complexity of functions.

Asymptotic Analysis of Algorithms
Best case (Ω): the function defined by the minimum number of steps taken on any instance of size n. It is too easy to cheat with the best case, so we do not rely on it much.
Average case (Θ): the function defined by the average number of steps taken over instances of size n. The average running time is usually very hard (and time consuming) to compute.
Worst case (O): the function defined by the maximum number of steps taken on any instance of size n. Fairly easy to analyze, and often close to the average running time.

Bounding Functions (Informally)
f(n) = Ο(g(n)) means c · g(n) is an upper bound on f(n).
f(n) = Ω(g(n)) means c · g(n) is a lower bound on f(n).
f(n) = Θ(g(n)) means c1 · g(n) is an upper bound on f(n) and c2 · g(n) is a lower bound on f(n).
c, c1, and c2 are all constants independent of n.

Examples of Ο, Ω, and Θ

Formal Definitions: Big Oh
f(n) = O(g(n)) if there are positive constants c and n0 such that to the right of n0, the value of f(n) always lies on or below c · g(n), i.e., 0 ≤ f(n) ≤ c · g(n) for all n ≥ n0.
Think of the equality (=) as meaning "in the set of functions."
Examples:
f(n) = 4n + 3, f(n) = O( )
f(n) = 3n² − 100n + 6 = O( )
f(n) = 3n² − 100n + 6 = O(n)??
Using calculus: f(n) = O(g(n)) iff lim_{x→∞} f(x)/g(x) < ∞.

Formal Definitions: Big Omega
f(n) = Ω(g(n)) if there are positive constants c and n0 such that to the right of n0, the value of f(n) always lies on or above c · g(n).
Examples:
f(n) = 4n + 3, f(n) = Ω( )
f(n) = 4n + 3, f(n) = Ω(n²)?
Using calculus: f(n) = Ω(g(n)) iff lim_{x→∞} f(x)/g(x) > 0.
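As a concrete instance of the Big-Oh definition (a worked example added here, not text from the slides): for f(n) = 4n + 3 we can take c = 5 and n0 = 3, since 4n + 3 ≤ 4n + n = 5n whenever n ≥ 3; hence 4n + 3 = O(n). The same style of argument, with different constants, handles the other blanks left on the slide.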

Formal Definitions: Big Theta (Avg.)
f(n) = Θ(g(n)) if there are positive constants c1, c2, and n0 such that to the right of n0, the value of f(n) always lies on or below c1 · g(n) and on or above c2 · g(n).
Examples:
f(n) = 4n + 3, f(n) = Θ( )
f(n) = 4n + 3, f(n) = Θ(n²)?
Using calculus: f(n) = Θ(g(n)) iff 0 < lim_{x→∞} f(x)/g(x) < ∞.

Review
Given an unsorted array A of size n, suppose we want to determine if a particular element is in the array. What are the associated time complexities of a linear search?
Average: ( )   Best: ( )   Worst: O( )
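To make the review question concrete, here is a minimal linear-search sketch in Python (the function name and test list are illustrative, not from the slides); the comments note which inputs trigger the extreme cases.

def linear_search(A, target):
    """Return the index of target in A, or -1 if it is absent."""
    for i, value in enumerate(A):
        if value == target:
            return i          # best case: target is A[0], a single comparison
    return -1                 # worst case: target absent, all n elements examined

A = [7, 3, 9, 1, 4]           # illustrative unsorted array
print(linear_search(A, 9))    # found after a few comparisons
print(linear_search(A, 8))    # not found: every element is checked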

Logarithms
It is important to understand intuitively what logarithms are and where they come from. A logarithm is simply an inverse exponential function. Saying b^x = y is equivalent to saying that log_b y = x, i.e., x is the power to which b must be raised to give y.
Examples:

Logarithms
Exponential functions, like the amount owed on an n-year mortgage at an interest rate of c% per year, grow distressingly fast, as anyone who has tried to pay off a mortgage knows. Thus inverse exponential functions, i.e. logarithms, grow refreshingly slowly.
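To see just how slowly: doubling the input adds only 1 to a base-2 logarithm. For instance (illustrative values, not from the slides), log2 1,024 = 10 while log2 2,048 = 11, and log2 1,000,000 is still only about 20.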

Examples of Logarithmic Functions
Binary search is an example of an O(lg n) algorithm. After each comparison, we can throw away half of the remaining keys. Thus twenty comparisons suffice to find any name in the million-name Manhattan phone book! If you have an algorithm which runs in O(lg n) time, take it, because this is blindingly fast even on very large instances.

Properties of Logarithms
log(x · y) = log(x) + log(y)
log(x/y) = log(x) − log(y)
log(x^n) = n · log(x)
Logarithms are increasing functions, i.e. if x < y, where x, y > 0, then log(x) < log(y).
All of these are helpful when comparing the running times of algorithms (especially the third and fourth).
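Since the slides cite binary search as the canonical O(lg n) algorithm, here is a minimal sketch in Python (the function name and test data are illustrative, not from the slides); each iteration halves the remaining search range.

def binary_search(A, target):
    """Return an index of target in sorted list A, or -1 if absent."""
    lo, hi = 0, len(A) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # midpoint of the current range
        if A[mid] == target:
            return mid
        elif A[mid] < target:
            lo = mid + 1              # discard the lower half
        else:
            hi = mid - 1              # discard the upper half
    return -1                         # range is empty: target not present

# At most ~20 probes even for a million sorted keys.
keys = list(range(1_000_000))
print(binary_search(keys, 123_456))   # prints 123456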

Asymptotic Analysis: Examples
2n² versus n

Asymptotic Dominance in Action
 n             | O(lg n)  | O(n)     | O(n lg n) | O(n²)      | O(2ⁿ)       | O(n!)
 10            | 0.003 μs | 0.01 μs  | 0.033 μs  | 0.1 μs     | 1 μs        | 3.63 ms
 20            | 0.004 μs | 0.02 μs  | 0.086 μs  | 0.4 μs     | 1 ms        | 77.1 years
 30            | 0.005 μs | 0.03 μs  | 0.147 μs  | 0.9 μs     | 1 sec       | 8.4*10^15 yrs
 40            | 0.005 μs | 0.04 μs  | 0.213 μs  | 1.6 μs     | 18.3 min    |
 50            | 0.006 μs | 0.05 μs  | 0.282 μs  | 2.5 μs     | 13 days     |
 100           | 0.007 μs | 0.1 μs   | 0.644 μs  | 10 μs      | 4*10^13 yrs |
 1,000         | 0.010 μs | 1.00 μs  | 9.966 μs  | 1 ms       |             |
 10,000        | 0.013 μs | 10 μs    | 130 μs    | 100 ms     |             |
 100,000       | 0.017 μs | 0.10 ms  | 1.67 ms   | 10 sec     |             |
 1,000,000     | 0.020 μs | 1 ms     | 19.93 ms  | 16.7 min   |             |
 10,000,000    | 0.023 μs | 0.01 sec | 0.23 sec  | 1.16 days  |             |
 100,000,000   | 0.027 μs | 0.10 sec | 2.66 sec  | 115.7 days |             |
 1,000,000,000 | 0.030 μs | 1 sec    | 29.90 sec | 3.7 years  |             |

bigocheatsheet.com

Example: Sorting
Input: A sequence of n numbers <a1, a2, ..., an>
Output: A permutation (reordering) <a'1, a'2, ..., a'n> of the input sequence such that a'1 ≤ a'2 ≤ ... ≤ a'n
We seek algorithms that are correct and efficient.

Sorting Algorithms
https://www.youtube.com/watch?v=kpra0w1kecg

Insertion Sort
INSERTION-SORT(A, n)          // A[1..n]
  for j = 2 to A.length
      key = A[j]
      // Insert A[j] into the sorted sequence A[1..j-1]
      i = j - 1
      while i > 0 and A[i] > key
          A[i+1] = A[i]
          i = i - 1
      A[i+1] = key
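For comparison, here is a direct Python translation of the pseudocode (a sketch using 0-based indexing instead of the slides' A[1..n] convention; the function name is illustrative, not from the slides).

def insertion_sort(A):
    """Sort list A in place and return it."""
    for j in range(1, len(A)):        # pseudocode's j = 2..n, shifted to 0-based
        key = A[j]
        i = j - 1
        # Shift elements of the sorted prefix A[0..j-1] that are larger than key
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]
            i = i - 1
        A[i + 1] = key
    return A

print(insertion_sort([3, 1, 7, 4, 8, 2, 6]))   # [1, 2, 3, 4, 6, 7, 8]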

Example
How would insertion sort work on the following numbers?
3 1 7 4 8 2 6

Your Turn
Problem: How would insertion sort work on the following characters to sort them alphabetically (from A to Z)? Show each step.
S O R T E D

Insertion Sort
Is the algorithm correct? How efficient is the algorithm?
Time complexity? Space complexity?
How does insertion sort do on sorted permutations? How about unsorted permutations?

Analysis of Insertion Sort: Best Case
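A sketch of the best-case reasoning (my summary of the standard argument, not text from the slides): if the input is already sorted, the test A[i] > key fails immediately in every iteration, so the while loop body never executes. Each of the n − 1 iterations of the for loop then does only a constant amount of work, for a total of c·(n − 1) steps for some constant c, i.e. the best-case running time is Θ(n).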

Analysis of Insertion Sort: Worst Case

For next time:
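Returning to the worst-case slide above, a sketch of the standard counting argument (my summary, not text from the slides): the worst case is reverse-sorted input, where the while loop in iteration j shifts all j − 1 elements of the sorted prefix. The total number of shifts is 1 + 2 + ... + (n − 1) = n(n − 1)/2, so the worst-case running time is Θ(n²). Combined with the Θ(n) best case, this is why insertion sort is fast on nearly sorted data but quadratic in general.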