Taking Stock. IE170: Algorithms in Systems Engineering: Lecture 3. Θ Notation. Comparing Algorithms


Taking Stock
IE170: Algorithms in Systems Engineering: Lecture 3
Jeff Linderoth
Department of Industrial and Systems Engineering, Lehigh University
January 19, 2007

Last Time
- Lots of funky math
- Playing with summations: formulae and bounds
- Sets
- A brief introduction to our friend Θ

This Time
- Questions on homework?
- Θ, O, and Ω
- Recursion
- Analyzing recurrences
- Comparing algorithms

Θ Notation

Consider algorithm A with running time given by f and algorithm B with running time given by g. We are interested in knowing

    L = lim_{n→∞} f(n)/g(n).

What are the four possibilities?
- L = 0: g grows faster than f.
- L = ∞: f grows faster than g.
- L = c ≠ 0: f and g grow at the same rate.
- The limit doesn't exist.

We now define the set

    Θ(g) = { f : ∃ c1, c2, n0 > 0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }.

If f ∈ Θ(g), then we say that f and g grow at the same rate, or that they are of the same order. Note that

    f ∈ Θ(g) ⟺ g ∈ Θ(f).

We also know that if lim_{n→∞} f(n)/g(n) = c for some constant c ≠ 0, then f ∈ Θ(g).
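The limit test above can be made concrete by sampling the ratio f(n)/g(n) at increasing n. This is an illustrative sketch, not part of the lecture; the helper name ratio_trend is mine.

```python
# Illustrative sketch: hint at L = lim f(n)/g(n) by sampling the ratio.
import math

def ratio_trend(f, g, ns=(10, 100, 1000, 10_000)):
    """Return f(n)/g(n) at a few sample sizes to suggest the limit L."""
    return [f(n) / g(n) for n in ns]

# f(n) = 3n^2 + n vs g(n) = n^2: the ratios approach c = 3, a nonzero
# constant, suggesting f ∈ Θ(g).
print(ratio_trend(lambda n: 3 * n**2 + n, lambda n: n**2))

# f(n) = n lg n vs g(n) = n^2: the ratios shrink toward 0, suggesting
# that g grows strictly faster than f.
print(ratio_trend(lambda n: n * math.log2(n), lambda n: n**2))
```

A finite sample cannot prove a limit, of course, but it is a quick way to build intuition before doing the algebra.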

Big-O Notation

    O(g) = { f : ∃ constants c, n0 > 0 s.t. f(n) ≤ c·g(n) for all n ≥ n0 }.

If f ∈ O(g), then we say that f is big-O of g, or that g grows at least as fast as f.

If we write 2n² + 3n + 1 = 2n² + O(n), this means that 2n² + 3n + 1 = 2n² + f(n) for some f ∈ O(n) (e.g., f(n) = 3n + 1).

Big-Ω Notation

    Ω(g) = { f : ∃ constants c, n0 > 0 s.t. 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }.

f ∈ Ω(g) means that g is an asymptotic lower bound on f: f grows at least as fast as g. Note that

    f ∈ Θ(g) ⟺ f ∈ O(g) and f ∈ Ω(g),
    f ∈ Ω(g) ⟺ g ∈ O(f).

Strict Asymptotic Bounds: Little-o and Little-ω

    f ∈ o(g) ⟺ lim_{n→∞} f(n)/g(n) = 0,
    f ∈ ω(g) ⟺ g ∈ o(f) ⟺ lim_{n→∞} f(n)/g(n) = ∞.

Note that f ∈ o(g) ⟹ f ∈ O(g) \ Θ(g), and f ∈ ω(g) ⟹ f ∈ Ω(g) \ Θ(g).

Comparing Functions

The notation we have just defined gives us a way of ordering functions. This gives us a method for comparing algorithms based on their running times!

The Upshot
- f ∈ O(g) is like f ≤ g,
- f ∈ Ω(g) is like f ≥ g,
- f ∈ o(g) is like f < g,
- f ∈ ω(g) is like f > g, and
- f ∈ Θ(g) is like f = g.
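The definitions of O and Ω are statements about witness constants (c, n0). A small sketch, mine rather than the lecture's, can check a proposed witness empirically: a finite scan can never prove the bound, but it can refute a bad witness.

```python
# Illustrative sketch: empirically checking a big-O witness pair (c, n0)
# for f(n) = 2n^2 + 3n + 1 and g(n) = n^2.
def holds_for_witness(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c*g(n) for every sampled n in [n0, n_max].

    True means the witness survived the scan; False refutes it."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 2 * n**2 + 3 * n + 1
g = lambda n: n**2

print(holds_for_witness(f, g, c=3, n0=4))  # a valid witness pair: True
print(holds_for_witness(f, g, c=2, n0=4))  # c too small: False
```

Here c = 3, n0 = 4 works because 2n² + 3n + 1 ≤ 3n² once n² ≥ 3n + 1, which holds for all n ≥ 4.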

Examples

Some functions in O(n²):

    n²,  n² + n,  n² + 1000n,  1000n² + 1000n,  n,  n^1.9999,  n²/lg lg n.

Some functions in Ω(n²):

    n²,  n² + n,  n² + 1000n,  1000n² + 1000n,  n³,  n^2.0001,  n² lg lg n.

A Question: Which of these are o(n²)? Which are ω(n²)?

Commonly Occurring Functions

Polynomials: p(n) = Σ_{i=0}^{k} a_i n^i with a_k > 0 is a polynomial of degree k. Polynomials of degree k are in Θ(n^k).

Exponentials: A function in which n appears as an exponent on a constant is an exponential function, e.g., 2^n. For all positive constants a and b with b > 1,

    lim_{n→∞} n^a / b^n = 0.

This means that exponential functions always grow faster than polynomials.

More Functions

Logarithms: x = log_b(a) ⟺ b^x = a. We use the notation

    lg n = log_2 n,  ln n = log_e n,  lg^k n = (lg n)^k.

Changing the base of a logarithm changes its value only by a constant factor, so logarithms of different bases all grow at the same rate. A polylogarithmic function is a function in O(lg^k n); polylogarithmic functions always grow more slowly than polynomials.

Factorials: n! = n(n−1)(n−2)···(1). We have

    n! = o(n^n),  n! = ω(2^n),  lg(n!) = Θ(n lg n).

Log Rules

    a^n · a^m = a^(n+m)
    a = b^(log_b a)
    lg(∏_{k=1}^{n} a_k) = Σ_{k=1}^{n} lg a_k
    log_b a^n = n · log_b a
    log_b a = (log_c a)/(log_c b)
    log_b a = 1/(log_a b)
    a^(log_b n) = n^(log_b a)
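The claim lg(n!) = Θ(n lg n) can be sanity-checked numerically using the log rule lg(∏ a_k) = Σ lg a_k. This sketch is illustrative (the helper name lg_factorial is mine):

```python
# Illustrative sketch: compare lg(n!) against n lg n to make
# lg(n!) = Θ(n lg n) plausible.
import math

def lg_factorial(n):
    """Compute lg(n!) as a sum of logs: lg(prod a_k) = sum lg a_k."""
    return sum(math.log2(k) for k in range(1, n + 1))

for n in (10, 100, 1000):
    # The ratio stays between fixed positive constants (and drifts
    # toward 1), consistent with lg(n!) = Θ(n lg n).
    print(n, lg_factorial(n) / (n * math.log2(n)))
```

Summing logs also avoids computing the enormous integer n! directly, which is itself an application of the log rules above.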

Problem Difficulty

The difficulty of a problem can be judged by the (worst-case) running time of the best-known algorithm.

Problems for which there is an algorithm with polynomial running time (or better) are called polynomially solvable. Generally, these problems are considered to be easy. Formally, they are in the complexity class P.

There are many interesting problems for which it is not known if there is a polynomial-time algorithm. These problems are generally considered difficult. This is known as the complexity class NP.

You will get a very good grade (A+++++++++++++++++++++++) in this class if you prove P = NP. It is one of the great open questions in mathematics: are these truly difficult problems, or have we not yet discovered the right algorithm? If you answer this question, you can win a million dollars: http://www.claymath.org/millennium/p vs NP/. Most important, you can get the jokes from the Simpsons: www.mathsci.appstate.edu/sjg/simpsonsmath/

In this course, we will stick mostly to the easy problems, for which a polynomial-time algorithm is known.

Analyzing Recurrences

Deep Thoughts: To understand recursion, we must first understand recursion.

The good stuff: general methods for analyzing recurrences are
- Substitution
- Master Theorem
- Generating Functions

Note that when we analyze a recurrence, we may not get (or need) an exact answer, only an asymptotic one: we may prove the running time is in O(f) or Θ(f).

If you are only concerned about the asymptotic behavior of a recurrence, then
1. You can ignore floors and ceilings (asymptotic behavior doesn't care if you round down or up).
2. We assume that all algorithms run in Θ(1) (constant) time for a small enough fixed input size n. This makes the base case of induction easy.
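The point that exact answers are often unnecessary can be seen by evaluating a recurrence exactly and comparing against an asymptotic guess. As an illustrative sketch (not from the lecture), take T(n) = T(n−1) + n with T(1) = 1 and compare it to the guess Θ(n²):

```python
# Illustrative sketch: exact values of T(n) = T(n-1) + n, T(1) = 1,
# compared against the asymptotic guess Θ(n^2).
def T(n):
    total = 1  # base case: T(1) = 1, i.e. Θ(1) for the smallest input
    for k in range(2, n + 1):  # unroll the recurrence iteratively
        total += k
    return total

for n in (10, 100, 1000):
    # The ratio T(n)/n^2 settles near 1/2, so T ∈ Θ(n^2); the exact
    # closed form n(n+1)/2 adds nothing asymptotically.
    print(n, T(n), T(n) / n**2)
```

Unrolling iteratively rather than recursing also sidesteps stack-depth limits for large n.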

A Few Examples of Recurrences

This recurrence arises in algorithms that loop through the input to eliminate one item:

    T(n) = T(n−1) + n,  n > 1.

This recurrence arises in algorithms that halve the input in one step:

    T(n) = T(n/2) + 1,  n > 1.

Some More Recurrences

This recurrence arises in algorithms that halve the input in one step, but have to scan through the data at each step:

    T(n) = T(n/2) + n,  n > 1.

This recurrence arises in algorithms that quarter the input in one step, but have to scan through the data 4 times at each step:

    T(n) = T(n/4) + 4n,  n > 1.

Solving Recurrences by Substitution

A simple two-part plan:
1. Guess an answer.
2. Use induction to prove or disprove your guess.

Here let's show that if T(n) = T(⌈n/2⌉) + 1, then T ∈ O(lg n).

The Master Theorem

Most recurrences that we will be interested in are of the form

    T(n) = aT(n/b) + f(n),  n > 1.

The Master Theorem tells us how to analyze recurrences of this form:
1. If f ∈ O(n^(log_b a − ε)) for some constant ε > 0, then T ∈ Θ(n^(log_b a)).
2. If f ∈ Θ(n^(log_b a)), then T ∈ Θ(n^(log_b a) · lg n).
3. If f ∈ Ω(n^(log_b a + ε)) for some constant ε > 0, and if af(n/b) ≤ cf(n) for some constant c < 1 and all n > n0, then T ∈ Θ(f).

How do we interpret this?
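The three cases can be sketched in code for the common special case f(n) = n^d, where the case analysis reduces to comparing d with log_b(a). This is my simplification for illustration, not the full theorem (which handles arbitrary f and requires the regularity condition in case 3).

```python
# Sketch (assumption: f(n) = n^d, so the Master Theorem cases reduce to
# comparing the driving exponent d with the critical exponent log_b(a)).
import math

def master_case(a, b, d):
    """Classify T(n) = a*T(n/b) + n^d and return the Θ-bound as a string."""
    crit = math.log(a, b)  # log_b(a), the critical exponent
    if d < crit:
        return f"Theta(n^{crit:g})"          # case 1: recursion dominates
    if d == crit:
        return f"Theta(n^{crit:g} * lg n)"   # case 2: balanced
    # case 3 (d > crit): regularity holds here since a*(n/b)^d = (a/b^d)*n^d
    # with a/b^d < 1, so the driving function f dominates.
    return f"Theta(n^{d:g})"

print(master_case(2, 2, 1))  # merge sort, T(n) = 2T(n/2) + n -> Theta(n^1 * lg n)
print(master_case(1, 2, 0))  # binary search, T(n) = T(n/2) + 1 -> Theta(n^0 * lg n)
print(master_case(2, 2, 0))  # T(n) = 2T(n/2) + 1 -> Theta(n^1)
```

Reading the output: Θ(n^1 · lg n) is the familiar Θ(n lg n) for merge sort, and Θ(n^0 · lg n) is Θ(lg n) for binary search.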

A Few More Examples

This recurrence arises in algorithms that partition the input in one step, but then make recursive calls on both pieces:

    T(n) = 2T(n/2) + 1,  n > 1.

This recurrence arises in algorithms that scan through the data at each step, divide it in half, and then make recursive calls on each piece:

    T(n) = 2T(n/2) + n,  n > 1.

We can analyze these using the Master Theorem.

A Few More Comments on Recursion

Generally speaking, recursive algorithms should have the following two properties to guarantee well-defined termination:
- They should solve an explicit base case.
- Each recursive call should be made with a smaller input size.

All recursive algorithms have an associated tree that can be used to diagram the function calls. Execution of the program essentially requires traversal of the tree. By adding up the number of steps at each node of the tree, we can compute the running time. We will revisit trees later in the course.

The Call Stack

The call stack of a program keeps track of the current sequence of function calls. When a new function call is made, data for the current one is saved on the call stack. When a function call returns, control returns to the next function on the top of the stack.

The stack depth is the maximum number of functions on the stack at any one time. In a recursive program, the stack depth can be very large. This can create memory problems, even for simple recursive programs. There is also an overhead associated with each function call.

Next Time
- Homework is due!
- Simple sorting and its analysis
- Three hours of fun-filled lab
- Go Bears!
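The stack-depth warning is easy to demonstrate in an interpreted language. This sketch is illustrative (the function depth_of is mine); Python caps recursion depth precisely because each call consumes stack memory.

```python
# Illustrative sketch: each recursive call adds a stack frame, so deep
# recursion can exhaust the call stack.
import sys

def depth_of(n, current=1):
    """Recurse n times and report the depth of the final call."""
    if n == 0:
        return current
    return depth_of(n - 1, current + 1)

print(depth_of(100))            # 101 frames: the initial call plus 100 more
print(sys.getrecursionlimit())  # the interpreter's cap on stack depth

# Exceeding the cap raises RecursionError instead of crashing the process:
try:
    depth_of(10**6)
except RecursionError:
    print("stack limit exceeded")
```

An iterative rewrite (a loop with an explicit accumulator) uses Θ(1) stack space instead, which is one practical reason to unroll simple recurrences.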