CS 344 Design and Analysis of Algorithms Tarek El-Gaaly tgaaly@cs.rutgers.edu Course website: www.cs.rutgers.edu/~tgaaly/cs344.html

Course Outline
Textbook: Algorithms by S. Dasgupta, C.H. Papadimitriou, and U.V. Vazirani (ebook).
Chapters: Prologue (today: growth rates of functions), Algorithms with numbers, Divide & Conquer, Decomposition of graphs, Paths in graphs, Greedy Algorithms, Dynamic Programming, NP-complete problems, Coping with NP-completeness. No LP or Quantum Computing.
Good reference: Introduction to Algorithms, Cormen et al.

What makes a good algorithm? Simplicity of implementation (debuggability), parallelizability, space complexity, time complexity. Worst-case performance? What matters is when the input size gets big: growth rates. Recursive vs. iterative? E.g. Towers of Hanoi.

Runtime of an algorithm and comparing with others. Growth rates give a simple measure of efficiency and allow comparison of algorithms in terms of n (the input size). Example, when n gets large: merge sort: Θ(n log n); insertion sort: Θ(n^2) [best case Θ(n)]. Wikipedia: Sorting Algorithms - http://en.wikipedia.org/wiki/sorting_algorithm Wikipedia: Big O Notation - http://en.wikipedia.org/wiki/big_o_notation

Asymptotic Analysis. Asymptote: in analytic geometry, a line that a curve approaches at infinity (a bound). When things get big, what are the limits? Common growth classes: logarithmic, polylogarithmic, sublinear (e.g. n^0.5 = √n), linear, quadratic, exponential.

Definitions: Θ-notation
Θ(g(n)) = {f(n): there exist positive constants c1, c2 and n0 s.t. 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0}.
Example: (1/2)n^2 - 3n = Θ(n^2). Must determine c1, c2, n0 s.t. c1·n^2 ≤ (1/2)n^2 - 3n ≤ c2·n^2 for all n ≥ n0.
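One way to pin down constants for this example (a worked sketch of mine, not from the slides): dividing c1·n^2 ≤ (1/2)n^2 - 3n ≤ c2·n^2 through by n^2 gives c1 ≤ 1/2 - 3/n ≤ c2, which holds with c1 = 1/14, c2 = 1/2, n0 = 7. A quick numeric check in Python:

```python
# Illustrative check with the assumed constants c1 = 1/14, c2 = 1/2, n0 = 7
# for f(n) = (1/2)n^2 - 3n = Theta(n^2).
c1, c2, n0 = 1 / 14, 1 / 2, 7

def f(n):
    return 0.5 * n * n - 3 * n

for n in range(n0, 10_000):
    assert c1 * n * n <= f(n) <= c2 * n * n
print("c1*n^2 <= f(n) <= c2*n^2 holds on the tested range")
```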

Definitions: O-notation
O(g(n)) = {f(n): there exist positive constants c and n0 s.t. 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0}.
Asymptotically bounded above.

Definitions: Ω-notation
Ω(g(n)) = {f(n): there exist positive constants c and n0 s.t. 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0}.
Asymptotically bounded below; bounds the best case of an algorithm (e.g. insertion sort: Ω(n)).
Insertion sort: tightly bounded below by Ω(n) and tightly bounded above by O(n^2).
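For concreteness, a standard insertion sort sketch (my illustration, not the course's code): on already-sorted input the inner loop body never runs, giving the Ω(n) best case; on reverse-sorted input it runs fully, giving the O(n^2) worst case.

```python
def insertion_sort(a):
    """Sort the list a in place: best case Theta(n), worst case Theta(n^2)."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements one slot right; on sorted input this loop never executes.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```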

Definition: o-notation
o(g(n)) = {f(n): for any positive constant c > 0, there exists a constant n0 > 0 s.t. 0 ≤ f(n) < c·g(n) for all n ≥ n0}.
Asymptotically loose upper bound.

Definition: ω-notation
ω is related to Ω as o is to O.
ω(g(n)) = {f(n): for any positive constant c > 0, there exists a constant n0 s.t. 0 ≤ c·g(n) < f(n) for all n ≥ n0}.
Asymptotically loose lower bound.

Asymptotic Notation
Exact complexity is not worth the effort: n + 2 operations is no different from n operations, and larger-order terms dominate: n^2 + n + 1 behaves like n^2.
O: bounded above. Ω: bounded below. Θ: bounded above and below. o: loosely bounded from above. ω: loosely bounded from below.

O, Θ, Ω: Asymptotic Analysis

Analogous to Inequalities
f = O(g) ≈ f ≤ g (asymptotic upper bound)
f = Ω(g) ≈ f ≥ g (asymptotic lower bound)
f = Θ(g) ≈ f = g (asymptotic tight bound)
f = o(g) ≈ f < g (asymptotic loose upper bound)
f = ω(g) ≈ f > g (asymptotic loose lower bound)

Rules of Thumb
Multiplicative constants can be omitted: 14n^2 → n^2.
n^a dominates n^b if a > b: e.g. n^2 dominates n.
Any exponential dominates any polynomial: 3^n dominates n^5 (and even 2^n).
Any polynomial dominates any logarithm: n dominates (log n)^4.
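A quick numeric illustration of these rules (my own sketch): the ratio 3^n / n^5 blows up, while (log n)^4 / n shrinks toward 0 for large n.

```python
import math

# "3^n dominates n^5": the ratio diverges quickly.
for n in (10, 50, 100, 200):
    print(n, 3 ** n / n ** 5)

# "n dominates (log n)^4": the ratio tends to 0, though slowly (its hump is near n = e^4).
for n in (10, 10**3, 10**6, 10**9):
    print(n, math.log(n) ** 4 / n)
```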

Rules of Thumb

Rules of Thumb

Rules of Thumb
f = Ω(g) ⇔ g = O(f)
f = Θ(g) ⇔ f = O(g) and f = Ω(g)
f = O(g) and g = O(f) ⇔ f = Θ(g)

Limits
lim f/g = ∞ ⇒ f = ω(g)
lim f/g = 0 ⇒ f = o(g)
Real-life bounds: O and Θ.
What happens if you get ∞/∞ or 0/0 (indeterminate forms)? L'Hôpital's rule.

L'Hôpital's Rule: lim f/g is the same as lim f'/g', which is the same as lim f''/g'', and so on (when the limit is an indeterminate form ∞/∞ or 0/0). Intuitively: compare the rate of change, then the rate of change of the rate of change, etc.
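As a sanity check, a computer algebra system can evaluate such limits directly. A small sketch using sympy (my choice of tool; not mentioned on the slides):

```python
import sympy as sp

n = sp.symbols('n', positive=True)

# lim n->oo n*log(n) / n^2 = 0, so n*log(n) = o(n^2)
print(sp.limit(n * sp.log(n) / n**2, n, sp.oo))

# lim n->oo 3^n / n^5 = oo, so 3^n = omega(n^5)
print(sp.limit(3**n / n**5, n, sp.oo))
```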

Logarithmic Order
Binary search (searching sorted data): select the middle element of the array (i.e. the median position) and compare it against the target value. If the values match, return success. If the middle element is lower than the target, take the upper half of the data; if higher, take the lower half. Continue. O(log N).
Iterative halving of the data produces a growth curve that rises steeply at the beginning and slowly flattens out as the size of the data increases: e.g. if 10 items take one second to complete, 100 items take two seconds, and 1000 items take three seconds. Doubling the size of the input data set has little effect on the running time. This makes algorithms like binary search extremely efficient when dealing with large data.
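A standard iterative binary search matching the description above (an illustrative sketch; names are mine):

```python
def binary_search(data, target):
    """Return the index of target in the sorted list data, or -1 if absent. O(log N)."""
    lo, hi = 0, len(data) - 1
    while lo <= hi:
        mid = (lo + hi) // 2           # middle element
        if data[mid] == target:        # values match: success
            return mid
        elif data[mid] < target:       # middle is lower than target: keep the upper half
            lo = mid + 1
        else:                          # middle is higher than target: keep the lower half
            hi = mid - 1
    return -1
```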

Logarithmic Order

Logarithmic Rules (ref: Intro to Algorithms). Change of base: log_b(n) = log_c(n) / log_c(b), so logarithms of different bases differ only by a constant factor.
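A quick numeric check of the change-of-base rule (illustrative):

```python
import math

n = 1024
print(math.log2(n))               # 10.0, computed directly
print(math.log(n) / math.log(2))  # 10.0, via change of base from the natural log
# Consequence: O(log2 n), O(ln n), O(log10 n) all describe the same growth class.
```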

Hierarchy of Functions

Hierarchy of Functions

Examples
Determine if f = O(g), f = Ω(g), or both (i.e. f = Θ(g)).
Tools: intuition, the rules of thumb, limits (L'Hôpital's rule), applying log.
What does n log n mean intuitively? It means the problem space is divided (by two) recursively, as in binary search and merge sort.
Example pairs (f vs. g): n^4 + n^2 vs. n^5; n^2 vs. 2^n; n log n vs. 4^(log_2 n); 3^n vs. 2^n.
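As a worked example (my addition): to compare 3^n with 2^n, the limit test settles it, since lim 3^n / 2^n = lim (3/2)^n = ∞, hence 3^n = ω(2^n). Note that applying log alone is not decisive here: log(3^n) = n log 3 and log(2^n) = n log 2 are both Θ(n), yet 3^n grows strictly faster than 2^n.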

Examples: if you can find the constants (c and n0), then you have established big-O or big-Ω. More examples: http://www.cs.odu.edu/~toida/nerzic/content/function/growth.html

Some Additional Notes
n! = o(n^n)
n! = ω(2^n)
log(n!) = Θ(n log n)
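A quick numeric look at the last fact (an illustrative sketch; math.lgamma(n + 1) gives ln(n!) without computing the huge factorial):

```python
import math

# log(n!) = Theta(n log n): the ratio ln(n!) / (n ln n) approaches 1 (slowly),
# consistent with Stirling's approximation ln(n!) ~ n ln n - n.
for n in (10, 1_000, 100_000, 10_000_000):
    print(n, math.lgamma(n + 1) / (n * math.log(n)))
```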

Summary. When comparing functions: use the rules of thumb, use direct substitution, look at limits, and apply logs.

Examples
Problem: remove duplicates from an unsorted array.
Sort first: O(n log n) (e.g. quicksort). One sweep through the sorted array then finds duplicates; copy the non-duplicate elements to a new array. Overall O(n log n + n) = O(n log n).
Or use a hashtable of the elements to check for duplicates. Space complexity? Another array (or table) is needed.
Alternative: do it in place. One solution: go through the array, build a hashtable, mark the duplicates, and move non-duplicates back into the holes (space complexity is still doubled, so not truly in place).
If the elements are limited to values between 0 and n: think about it!
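Two of the approaches above, sketched in Python (illustrative, not the course's reference code):

```python
def dedup_by_sorting(arr):
    """Sort, then sweep once: O(n log n) time, O(n) extra space."""
    out = []
    for x in sorted(arr):              # O(n log n) sort
        if not out or out[-1] != x:    # after sorting, duplicates are adjacent
            out.append(x)
    return out

def dedup_by_hashing(arr):
    """Single sweep with a hash set: O(n) expected time, O(n) extra space."""
    seen = set()
    out = []
    for x in arr:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```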