Slides by Christopher M. Bourke
Instructor: Berthe Y. Choueiry
Computer Science & Engineering 235, Spring 2006
Section 2.3 of Rosen
cse235@cse.unl.edu

Introduction

How can we say that one algorithm performs better than another? We quantify the resources required to execute it: time, memory, I/O, circuits, power, etc. Time is not merely CPU clock cycles; we want to study algorithms independently of implementations, platforms, and hardware. We need an objective point of reference. For that, we measure time by the number of operations performed as a function of the algorithm's input size.

Input Size

For a given problem, we characterize the input size, n, appropriately:

Sorting: the number of items to be sorted
Graphs: the number of vertices and/or edges
Numerical: the number of bits needed to represent a number

The choice of an input size greatly depends on the elementary operation: the most relevant or important operation of the algorithm, such as comparisons, additions, or multiplications.

Orders of Growth

Small input sizes can usually be handled instantaneously, so we are most interested in how an algorithm performs as n → ∞. Indeed, for small values of n, most such functions will be very similar in running time. Only for sufficiently large n do differences in running time become apparent; as n → ∞, the differences become more and more stark.
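The point above can be made concrete with a small sketch (not part of the original slides) that tabulates some common growth functions. For small n the columns are close together; by n = 64 the exponential column dwarfs the rest.

```python
import math

def growth_table(sizes):
    """Return rows (n, log2 n, n, n*log2 n, n^2, 2^n) for the given sizes."""
    rows = []
    for n in sizes:
        rows.append((n,
                     round(math.log2(n), 1),  # logarithmic
                     n,                       # linear
                     round(n * math.log2(n)), # linearithmic
                     n ** 2,                  # quadratic
                     2 ** n))                 # exponential
    return rows

# Differences only become stark for larger n:
for row in growth_table([2, 8, 16, 32, 64]):
    print(row)
```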

Intractability

Problems that we can solve (today) only with exponential or super-exponential time algorithms are said to be (likely) intractable. That is, though they may be solved in a reasonable amount of time for small n, for large n there is (likely) no hope of efficient execution; it may take millions or billions of years. Tractable problems are problems that have efficient (read: polynomial-time) algorithms to solve them. Polynomial order of magnitude usually means there exists a polynomial p(n) = n^k for some constant k that always bounds the order of growth. More on asymptotics in the next slide set. Intractable problems may need to be solved using approximation or randomized algorithms.

Worst, Best, and Average Case

Some algorithms perform differently on various inputs of similar size. It is sometimes helpful to consider the worst-case, best-case, and average-case efficiencies of algorithms. For example, say we want to search an array A of size n for a given value K.

Worst case: K ∉ A, so we must check every item (n comparisons).
Best case: K is the first item that we check, so only one comparison is needed.

Average-Case

Since any worthwhile algorithm will be used quite extensively, the average running time is arguably the best measure of its performance (if the worst case is not frequently encountered). For searching an array, assuming that p is the probability of a successful search and that the target is equally likely to be in any of the n positions, we have:

C_avg(n) = [1·(p/n) + 2·(p/n) + ... + i·(p/n) + ... + n·(p/n)] + n(1 − p)
         = (p/n)(1 + 2 + ... + i + ... + n) + n(1 − p)
         = (p/n) · n(n + 1)/2 + n(1 − p)
         = p(n + 1)/2 + n(1 − p)

If p = 1 (the search always succeeds), C_avg(n) = (n + 1)/2 ≈ 0.5n. If p = 0 (the search always fails), C_avg(n) = n. A more intuitive interpretation: the algorithm must examine, on average, half of all elements in A.
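The closed form can be checked by exact enumeration. The sketch below (an illustration, not from the slides) computes the expected number of comparisons for sequential search directly from the probability model: a successful search finding the item at position i costs i comparisons with probability p/n each, and a failed search costs n with probability 1 − p.

```python
from fractions import Fraction

def c_avg(n, p):
    """Exact expected number of comparisons in sequential search of n items,
    where p is the probability the search succeeds (uniform over positions)."""
    p = Fraction(p)
    success = sum(Fraction(i) * (p / n) for i in range(1, n + 1))
    failure = Fraction(n) * (1 - p)
    return success + failure

n = 10
assert c_avg(n, 1) == Fraction(n + 1, 2)   # always succeeds: (n+1)/2
assert c_avg(n, 0) == n                    # always fails: n comparisons
```

Using `Fraction` keeps the arithmetic exact, so the result matches the formula p(n + 1)/2 + n(1 − p) term for term rather than merely approximately.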

Average-Case

Average-case analysis of algorithms is important in a practical sense. Often, C_avg and C_worst have the same order of magnitude and thus, from a theoretical point of view, are no different from each other. Practical implementations, however, require a real-world examination.

Mathematical Analysis of Algorithms

After developing pseudocode for an algorithm, we wish to analyze its efficiency as a function of the size of the input, n, in terms of how many times the elementary operation is performed. Here is a general strategy:

1. Decide on a parameter (or parameters) characterizing the input size, n.
2. Identify the elementary (basic) operation.
3. Evaluate whether the count of elementary operations depends only on n; otherwise, evaluate the best, worst, and average cases separately.
4. Set up a summation corresponding to the number of elementary operations.
5. Simplify the summation to get as simple a function f(n) as possible.

Example I

Consider the following algorithm (UniqueElements).

Input: integer array A of size n
Output: true if all elements a ∈ A are distinct

for i = 1, ..., n − 1 do
    for j = i + 1, ..., n do
        if a_i = a_j then
            return false
        end
    end
end
return true

Example I (continued)

For this algorithm, what is the elementary operation? The input size? Does the count of elementary operations depend only on n?

The elementary operation is the comparison a_i = a_j, and the input size is n. The outer for-loop runs n − 1 times; more formally, it contributes Σ_{i=1}^{n−1}.

Example I (continued)

The inner for-loop depends on the outer for-loop, so it contributes Σ_{j=i+1}^{n}. We observe that the elementary operation is executed once in each iteration, thus we have

C_worst(n) = Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} 1 = n(n − 1)/2
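A direct Python rendering of the UniqueElements pseudocode (a sketch, not the original authors' code), instrumented to count the elementary operation, confirms the closed form: on a worst-case input (all elements distinct) every pair is compared exactly once.

```python
def unique_elements(a):
    """Return (all_distinct, comparisons_performed) for array a."""
    n = len(a)
    comparisons = 0
    for i in range(n - 1):            # i = 1, ..., n-1 in the pseudocode
        for j in range(i + 1, n):     # j = i+1, ..., n
            comparisons += 1          # the elementary operation: a_i = a_j
            if a[i] == a[j]:
                return False, comparisons
    return True, comparisons

# Worst case: all elements distinct, so C_worst(n) = n(n-1)/2 comparisons.
distinct, count = unique_elements(list(range(8)))
assert distinct and count == 8 * 7 // 2
```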

Example II

The parity of a bit string determines whether the number of 1s appearing in it is even or odd. It is used as a simple form of error detection over communication networks.

(Parity)
Input: an integer n in binary (bit array b[])
Output: 0 if the parity of n is even, 1 otherwise

parity ← 0
while n > 0 do
    if b[0] = 1 then
        parity ← (parity + 1) mod 2
    end
    right-shift(n)
end
return parity

Example II (continued)

For this algorithm, what is the elementary operation? The input size? Does the count depend only on n?

The elementary operation (the increment of parity) will be executed as many times as there are 1-bits in the binary representation of n. In the worst case, we'll have a bit string of all ones.

Example II (continued)

The number of bits required to represent an integer n is ⌊log₂ n⌋ + 1, so the worst-case running time is simply log n.
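The pseudocode translates almost line for line into Python (a sketch under the assumption that right-shifting implements the move to the next bit), and counting loop iterations confirms the log n bound: the loop runs once per bit of n.

```python
def parity(n):
    """Return (parity_bit, loop_iterations) for a nonnegative integer n."""
    p = 0
    iterations = 0
    while n > 0:
        iterations += 1
        if n & 1:             # b[0] = 1: the lowest bit is set
            p = (p + 1) % 2   # parity <- (parity + 1) mod 2
        n >>= 1               # right-shift(n)
    return p, iterations

# 0b1011 has three 1-bits (odd parity) and 4 bits total:
assert parity(0b1011) == (1, 4)
# Worst case for an 8-bit input: all ones, 8 iterations, even parity:
assert parity(255) == (0, 8)
```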

Example III

(MyFunction(n, m, p))
Input: integers n, m, p such that n > m > p
Output: some function f(n, m, p)

x ← 1
for i = 0, ..., 10 do
    for j = 0, ..., n do
        for k = m/2, ..., m do
            x ← x · p
        end
    end
end
return x

Example III (continued)

Outer loop: executed 11 times.
2nd loop: executed n + 1 times.
Inner loop: executed about m/2 times.

Thus we have C(n, m, p) = 11(n + 1)(m/2). But do we really need to consider p?
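Instrumenting a Python sketch of MyFunction (an illustration, not the original code) answers the question: the exact count is 11 · (n + 1) · (m − ⌈m/2⌉ + 1), roughly 11(n + 1)(m/2), and p never appears in any loop bound, so it does not affect the operation count at all.

```python
def my_function_count(n, m, p):
    """Run MyFunction and return how many times x <- x * p executes."""
    x = 1
    ops = 0
    for i in range(11):                      # i = 0, ..., 10: 11 times
        for j in range(n + 1):               # j = 0, ..., n: n + 1 times
            for k in range(m // 2, m + 1):   # k = m/2, ..., m: about m/2 times
                x = x * p                    # the elementary operation
                ops += 1
    return ops

n, m = 6, 10
assert my_function_count(n, m, 3) == 11 * (n + 1) * (m - m // 2 + 1)
# The count is independent of p:
assert my_function_count(n, m, 3) == my_function_count(n, m, 99)
```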

Summation Tools I

Section 3.2 (p. 229) has more summation rules. You can always use Maple to evaluate and simplify complex expressions (but know how to do them by hand!). To invoke Maple on cse, you can use the command-line interface by typing maple. Under Unix (GNOME or KDE) or via any X Window interface, you can use the graphical version via xmaple. This will be demonstrated during recitation.

Summation Tools II

Example:

> simplify(sum(i, i=0..n));
        (1/2)n² + (1/2)n

> Sum(Sum(j, j=i..n), i=0..n);
        Σ_{i=0}^{n} Σ_{j=i}^{n} j
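If Maple is not at hand, the identities can at least be sanity-checked by brute force in plain Python (a sketch added here, not part of the slides; no computer algebra involved).

```python
from fractions import Fraction

def sum_i(n):
    """Compute sum_{i=0}^{n} i directly."""
    return sum(range(n + 1))

def double_sum(n):
    """Compute sum_{i=0}^{n} sum_{j=i}^{n} j directly."""
    return sum(sum(range(i, n + 1)) for i in range(n + 1))

# Check the Maple closed form (1/2)n^2 + (1/2)n for several n:
for n in range(20):
    assert sum_i(n) == Fraction(1, 2) * n ** 2 + Fraction(1, 2) * n
```

Brute force only verifies finitely many cases, of course; the symbolic result from Maple (or a proof by induction) is what establishes the identity for all n.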