Data Structures and Algorithms CSE 465


Data Structures and Algorithms CSE 465 LECTURE 3 Asymptotic Notation O-, Ω-, Θ-, o-, ω-notation Divide and Conquer Merge Sort Binary Search Sofya Raskhodnikova and Adam Smith /5/0

Review Questions
If the input length doubles, by roughly how much does the worst-case running time of Insertion Sort increase? (Answer: by a factor of 4.)
How long does Insertion Sort run on the input 2, 1, 4, 3, 6, 5, …, n, n−1 (as a function of n)? (Answer: Θ(n). The inner insertion loop never has to look more than one position down the list to find the right spot.)

Asymptotic notation
O-notation (upper bounds): We write f(n) = O(g(n)) if there exist constants c > 0, n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀.
EXAMPLE: 2n² = O(n³) (c = 2, n₀ = 1).
Note: the notation relates functions, not values, and the "=" here is a funny, one-way equality.

Set definition of O-notation
O(g(n)) = { f(n) : there exist constants c > 0, n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }.
EXAMPLE: 2n² ∈ O(n³).
(Logicians would write λn.2n² ∈ O(λn.n³), but it's convenient to be sloppy, as long as we understand what's really going on.)
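The set definition lends itself to a quick numerical sanity check. The sketch below (the function name and test range are mine, not the slides') tests claimed witnesses c and n₀ over a finite range; it is evidence, not a proof:

```python
def witnesses_big_oh(f, g, c, n0, n_max=10_000):
    """Check 0 <= f(n) <= c*g(n) for every n0 <= n <= n_max.

    Only finitely many n are tested, so a True result is a sanity
    check of the claimed constants, not a proof that f(n) = O(g(n))."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 2n^2 = O(n^3), with witnesses c = 2 and n0 = 1
print(witnesses_big_oh(lambda n: 2 * n * n, lambda n: n ** 3, c=2, n0=1))  # True
```

Swapping f and g (is n³ ≤ c·n² eventually?) makes the check fail for any fixed c once n_max is large enough, which matches the one-way nature of the relation.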

Macro substitution
Convention: A set in a formula represents an anonymous function in that set.
EXAMPLE (right-hand side): f(n) = n³ + O(n²) means f(n) = n³ + h(n) for some h(n) ∈ O(n²).

Macro substitution
Convention: A set in a formula represents an anonymous function in that set.
EXAMPLE (left-hand side): n² + O(n) = O(n²) means that for any f(n) ∈ O(n), n² + f(n) = h(n) for some h(n) ∈ O(n²).

Ω-notation (lower bounds)
O-notation is an upper-bound notation. It makes no sense to say that f(n) is at least O(n²).
Ω(g(n)) = { f(n) : there exist constants c > 0, n₀ > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }.
EXAMPLE: √n = Ω(lg n) (c = 1, n₀ = 16).

Θ-notation (tight bounds)
Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).
EXAMPLE: ½n² − 3n = Θ(n²).
Polynomials are simple: a_d n^d + a_{d−1} n^{d−1} + ⋯ + a_1 n + a_0 = Θ(n^d), provided a_d > 0.
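The polynomial rule can be illustrated numerically: the ratio of a degree-d polynomial to n^d approaches its leading coefficient. A small sketch (the particular polynomial is my own example, not from the slides):

```python
# Example polynomial of degree d = 3 with leading coefficient a_d = 3
def p(n):
    return 3 * n**3 - 5 * n**2 + 7

# The ratio p(n) / n^3 approaches the leading coefficient 3,
# consistent with p(n) = Theta(n^3).
ratios = [p(n) / n**3 for n in (10, 100, 1000, 10_000)]
print(ratios)
```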

o-notation and ω-notation
O-notation and Ω-notation are like ≤ and ≥. o-notation and ω-notation are like < and >.
o(g(n)) = { f(n) : for any constant c > 0, there is a constant n₀ > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n₀ }.
EXAMPLE: 2n² = o(n³) (n₀ = 2/c).

o-notation and ω-notation
O-notation and Ω-notation are like ≤ and ≥. o-notation and ω-notation are like < and >.
ω(g(n)) = { f(n) : for any constant c > 0, there is a constant n₀ > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n₀ }.
EXAMPLE: √n = ω(lg n) (n₀ = 1 + 1/c).

Summary
f(n) = O(g(n)): ∃ c > 0, n₀ > 0 such that for all n > n₀, 0 ≤ f(n) < c·g(n). Think: upper bound (≤). E.g. 100n² = O(n³). If lim f(n)/g(n) exists, it is < ∞.
f(n) = Ω(g(n)): ∃ c > 0, n₀ > 0 such that for all n > n₀, 0 ≤ c·g(n) < f(n). Think: lower bound (≥). E.g. n^100 = Ω(n²). If lim f(n)/g(n) exists, it is > 0.
f(n) = Θ(g(n)): both of the above, f = Ω(g) and f = O(g). Think: tight bound (=). E.g. log(n!) = Θ(n log n). If lim f(n)/g(n) exists, it is > 0 and < ∞.
f(n) = o(g(n)): ∀ c > 0, ∃ n₀ > 0 such that for all n > n₀, 0 ≤ f(n) < c·g(n). Think: <. E.g. n = o(n²). The limit lim f(n)/g(n) exists and equals 0.
f(n) = ω(g(n)): ∀ c > 0, ∃ n₀ > 0 such that for all n > n₀, 0 ≤ c·g(n) < f(n). Think: >. E.g. n² = ω(log n). The limit lim f(n)/g(n) exists and equals ∞.

Some functions sorted by asymptotic growth
log(n), √n, n, n log(n), n², n^1,000,000, 2^n (beats n^k for any fixed k), n!

The divide-and-conquer design paradigm
1. Divide the problem (instance) into subproblems.
2. Conquer the subproblems by solving them recursively.
3. Combine the subproblem solutions.
We'll see lots of examples.

A faster sort: Merge Sort
MERGE-SORT A[1 .. n]
1. If n = 1, done.
2. Recursively sort A[1 .. ⌈n/2⌉] and A[⌈n/2⌉+1 .. n].
3. Merge the two sorted lists.
Key subroutine: MERGE.

A faster sort: Merge Sort
The input A[1 .. n] is split into A[1 .. ⌈n/2⌉] and A[⌈n/2⌉+1 .. n]; MERGE-SORT is applied recursively to each half; the two sorted halves are then combined by MERGE to produce the sorted output.
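The three steps above can be sketched in Python. This is a straightforward list-based version for illustration (my own code, not the course's; the slicing costs O(n) extra space):

```python
def merge_sort(a):
    """Sort a list by divide and conquer, following MERGE-SORT."""
    if len(a) <= 1:                      # step 1: n = 1, done
        return a[:]
    mid = (len(a) + 1) // 2              # ceil(n/2): split point
    left = merge_sort(a[:mid])           # step 2: sort A[1 .. ceil(n/2)]
    right = merge_sort(a[mid:])          #         sort A[ceil(n/2)+1 .. n]
    merged, i, j = [], 0, 0              # step 3: merge the sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])              # one half may still have elements
    merged.extend(right[j:])
    return merged
```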

Merging two sorted arrays
Repeatedly compare the smallest remaining element of each array, output the smaller of the two, and advance past it.
Time = one pass through each array = Θ(n) to merge a total of n elements (linear time).
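The linear-time claim can be checked by counting the work directly. A sketch (the instrumentation is mine; Python lists are 0-indexed here, unlike the slides' A[1 .. n]):

```python
def merge_count(left, right):
    """Merge two sorted lists; also return the number of comparisons.

    Each comparison emits at least one element, so at most
    len(left) + len(right) - 1 comparisons occur: Theta(n) total work."""
    out, i, j, comparisons = [], 0, 0, 0
    while i < len(left) and j < len(right):
        comparisons += 1
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])   # at most one of the two lists is non-empty here
    out.extend(right[j:])
    return out, comparisons
```

For example, merging [1, 3, 5, 7] with [2, 4, 6, 8] takes 7 comparisons, i.e. n − 1 for n = 8 elements.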

Analyzing Merge Sort
MERGE-SORT A[1 .. n] — cost T(n):
1. If n = 1, done. — Θ(1)
2. Recursively sort A[1 .. ⌈n/2⌉] and A[⌈n/2⌉+1 .. n]. — 2T(n/2) (an abuse of notation)
3. Merge the two sorted lists. — Θ(n)
Sloppiness: step 2 should be T(⌈n/2⌉) + T(⌊n/2⌋), but it turns out not to matter asymptotically.

Recurrence for Merge Sort
T(n) = Θ(1) if n = 1;
T(n) = 2T(n/2) + Θ(n) if n > 1.
We usually omit stating the base case because our algorithms always run in time Θ(1) when n is a small constant.
CLRS and the next lectures provide several ways to find a good upper bound on T(n).
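The recurrence (including the floor/ceiling sloppiness) can be evaluated exactly for concrete n and compared against n lg n. A sketch with the Θ(n) cost taken as exactly n and T(1) = 1 (my choice of constants for illustration):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(ceil(n/2)) + T(floor(n/2)) + n, with T(1) = 1."""
    if n <= 1:
        return 1
    return T(n // 2) + T((n + 1) // 2) + n

# The ratio T(n) / (n lg n) approaches a constant as n grows,
# consistent with T(n) = Theta(n lg n).
for k in (4, 10, 18):
    n = 2 ** k
    print(n, T(n) / (n * math.log2(n)))
```

For powers of two this gives T(2^k) = (k + 1)·2^k, so the ratio is (k + 1)/k, which tends to 1.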

Recursion tree
Solve T(n) = 2T(n/2) + cn, where c > 0 is constant.
Expand T(n) into a tree: the root costs cn and has two children T(n/2); each of those costs cn/2 and has two children T(n/4); and so on.
Every level of the tree sums to cn. The height is h = lg n, and there are n leaves, each costing Θ(1), so the bottom level contributes Θ(n).
Total = cn · lg n + Θ(n) = Θ(n lg n).

Merge Sort vs Insertion Sort
Θ(n lg n) grows more slowly than Θ(n²). Therefore, Merge Sort asymptotically beats Insertion Sort in the worst case. In practice, Merge Sort beats Insertion Sort for n > 30 or so.
But how does Merge Sort do on nearly sorted inputs? Go test it out for yourself!
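The closing question is easy to explore by counting the work Insertion Sort actually does. A sketch (the instrumentation is mine; the nearly sorted input follows the swapped-pairs pattern from the review questions):

```python
def insertion_sort_shifts(a):
    """Insertion sort on a copy of a; returns the number of element
    shifts, which dominates the running time."""
    a, shifts = a[:], 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
            shifts += 1
        a[j + 1] = key
    return shifts

n = 1000
# Nearly sorted: 2, 1, 4, 3, ..., n, n-1 -- one shift per swapped pair
nearly_sorted = [x for k in range(1, n + 1, 2) for x in (k + 1, k)]
print(insertion_sort_shifts(nearly_sorted))          # n/2 = 500 shifts: Theta(n)
print(insertion_sort_shifts(list(range(n, 0, -1))))  # n(n-1)/2 = 499500: Theta(n^2)
```

Merge Sort, by contrast, does Θ(n lg n) work no matter how sorted the input already is, so on nearly sorted data Insertion Sort can win despite its worse worst case.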