6331 - Algorithms, CSE, OSU. Introduction, complexity of algorithms, asymptotic growth of functions. Instructor: Anastasios Sidiropoulos



Why algorithms?

Algorithms are at the core of Computer Science:
- Databases: data structures.
- Networks: routing / communication algorithms.
- Operating systems: scheduling, paging, etc.
- AI: learning algorithms, etc.
- Graphics: rendering algorithms.
- Robotics: motion planning / control algorithms, etc.
- Game development.

Algorithms beyond Computer Science

Algorithms are a transformative force in Science & Engineering:
- Algorithms for processing complicated data.
- Computational Biology.
- Medicine: drug design / delivery.
- Sociology / Economics: algorithmic game theory.
- ...

What makes a good algorithm?

How can we quantify the performance of an algorithm?

Computational resources:
- Time complexity: how much time does an algorithm need to terminate?
- Space complexity: how much memory does an algorithm require?

In other contexts, we might also be interested in different parameters:
- Communication complexity (i.e., the total number of bits exchanged in a system).
- Waiting / service time (e.g., in queuing systems).

Worst-case complexity

The worst-case time complexity, or worst-case running time, of an algorithm is a function f : ℕ → ℕ, where

    f(n) = maximum number of steps required on any input of size n.

More precisely: for any input x ∈ {0, 1}ⁿ, let

    T(x) = number of steps required on input x.

Then,

    f(n) = max { T(x) : x ∈ {0, 1}ⁿ }.
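The definition can be made concrete by brute force: enumerate every input of a given size, count the steps each one takes, and keep the maximum. A minimal Python sketch, using a hypothetical toy algorithm (scan a bit string until the first 1) as the step-counted subject:

```python
from itertools import product

def steps_find_first_one(x):
    """Toy algorithm: scan x left to right until a 1 is found.
    Returns the number of loop iterations performed (our 'steps'), i.e. T(x)."""
    steps = 0
    for bit in x:
        steps += 1
        if bit == 1:
            break
    return steps

def worst_case(n):
    """f(n) = max over all inputs x in {0,1}^n of T(x)."""
    return max(steps_find_first_one(x) for x in product((0, 1), repeat=n))
```

For this toy algorithm the all-zeros input forces a full scan, so worst_case(n) returns n. Enumeration over {0,1}ⁿ is exponential in n, of course; it only illustrates the definition, it is not how running times are analyzed in practice.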

Example of worst-case complexity

Finding an element in an array.
Input: integer array A[1 ... n], and integer x.
Find some i, if one exists, such that A[i] = x.

Algorithm:
    for i = 1 to n
        if A[i] = x
            output i, and terminate
    end
    output "not found"

What is the worst-case time complexity of this algorithm? What is the best possible time complexity?
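The pseudocode translates directly into Python; a sketch (0-indexed, returning None for "not found"):

```python
def linear_search(A, x):
    """Linear scan, as in the pseudocode above.
    Worst case (x absent, or only in the last cell): n comparisons."""
    for i, value in enumerate(A):  # i runs over 0 .. n-1
        if value == x:
            return i               # found: report the index and stop
    return None                    # "not found"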

How do we compare different functions?

We will mostly deal with non-decreasing functions. In general, we cannot compare functions the same way we compare numbers. E.g., n² vs 1000000n: which one is smaller?

O-notation

O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }.

E.g., 10n² + 5n − 100 ∈ O(n²). We write 10n² + 5n − 100 = O(n²).

Examples:
- n² vs 1000000n?
- n^100 vs 2ⁿ?
- n log n vs 2ⁿ?
- 2^(2n) vs 2ⁿ?
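The witnesses c and n₀ in the definition can be checked numerically. A sketch, taking f(n) = 10n² + 5n − 100 and g(n) = n² as in the example; checking a finite range is evidence for a choice of witnesses, not a proof:

```python
def f(n):
    return 10 * n**2 + 5 * n - 100

def g(n):
    return n**2

# One valid choice of witnesses (they are not unique).
c, n0 = 11, 3

# 0 <= f(n) <= c*g(n) for all n >= n0, checked on a finite range.
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))

# The threshold matters: f(2) = -50 < 0, so the bound fails below n0.
assert f(2) < 0
```

Here c = 11 works because f(n) ≤ 11n² is equivalent to 5n − 100 ≤ n², which holds for every n ≥ 1; n₀ = 3 is needed only to make f(n) nonnegative.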

Ω-notation

Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }.

Theorem. f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).

Question: suppose that f(n) = Ω(n). Does this imply that f(n) is increasing?

Θ-notation

f(n) = Θ(g(n)) if and only if both of the following hold:
- f(n) = O(g(n))
- f(n) = Ω(g(n))

Examples:
- n² + n + 5 vs 100n² + 5n + 3?
- n log n vs n^1.0001?

o-notation

o(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n₀ such that 0 ≤ f(n) < c·g(n) for all n ≥ n₀ }.

If f(n) = o(g(n)), then

    lim_{n→∞} f(n)/g(n) = 0.

Examples:
- 100n = o(n²)
- n² ∉ o(n²)
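The limit characterization suggests a quick numeric sanity check: tabulate the ratio f(n)/g(n) for growing n and see whether it heads toward 0. A sketch (suggestive only; a limit cannot be established by finitely many evaluations):

```python
def ratio(f, g, n):
    return f(n) / g(n)

# 100n = o(n^2): the ratio 100n / n^2 = 100/n shrinks toward 0.
for n in (10, 1_000, 100_000):
    print(n, ratio(lambda m: 100 * m, lambda m: m * m, n))

# n^2 is not o(n^2): the ratio is constantly 1, so it never drops
# below c for, say, c = 1/2.
print(ratio(lambda m: m * m, lambda m: m * m, 100_000))
```

The same ratios, read the other way, illustrate the ω-notation defined next: g(n)/f(n) blows up exactly when f(n)/g(n) vanishes.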

ω-notation

ω(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n₀ such that 0 ≤ c·g(n) < f(n) for all n ≥ n₀ }.

f(n) = o(g(n)) if and only if g(n) = ω(f(n)).

Examples:
- 2ⁿ vs n^10?
- n vs n log n?
- log(n) vs log(log(n))?