Electrical & Computer Engineering University of Waterloo Canada February 26, 2007
1. Motivation

We want to choose the best algorithm or data structure for the job. This requires characterizations of resource use, e.g., time and space; for circuits: area and depth. There are many, many approaches:
- Worst-Case Execution Time (WCET): for hard real-time applications.
- Exact measurements for a specific problem size, e.g., the number of gates in a 64-bit addition circuit.
- Performance models, e.g., (R_∞, n_{1/2}) for latency-throughput, HINT curves for linear algebra (characterizing performance through different cache regimes), etc.
2. Asymptotic analysis

We will focus on asymptotic analysis: a good first approximation of performance that describes behaviour on big problems. It is reasonably independent of:
- machine details (e.g., 2 cycles for add+mult vs. 1 cycle);
- clock speed, programming language, compiler, etc.

Brief history

The basic ideas originated in Paul du Bois-Reymond's Infinitärcalcül ("calculus of infinities"), developed in the 1870s. G. H. Hardy greatly expanded on du Bois-Reymond's ideas in his monograph Orders of Infinity (1910) [3]. The big-O notation was first used by Bachmann (1894) and popularized by Landau (hence it is sometimes called Landau notation). It was adopted by computer scientists [4] to characterize resource consumption independently of small machine differences, languages, compilers, etc.
3. Basic asymptotic notations

We consider asymptotic behaviour as n → ∞, where for our purposes n is the problem size. Three basic notations:
- f ∼ g ("f and g are asymptotically equivalent")
- f ≼ g ("f is asymptotically dominated by g")
- f ≍ g ("f and g are asymptotically bounded by one another")

f ∼ g means lim_{n→∞} f(n)/g(n) = 1.

Example: 3x^2 + 2x + 1 ∼ 3x^2.

∼ is an equivalence relation:
- Transitive: (x ∼ y) ∧ (y ∼ z) ⟹ (x ∼ z)
- Reflexive: x ∼ x
- Symmetric: (x ∼ y) ⟹ (y ∼ x)

Basic idea: we only care about the leading term, disregarding less quickly growing terms.
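The definition can be illustrated numerically; a small Python sketch (function names are ours) checking that the ratio of 3x^2 + 2x + 1 to its leading term tends to 1:

```python
# Illustrate f ~ g for f(x) = 3x^2 + 2x + 1 and g(x) = 3x^2:
# the ratio f(x)/g(x) approaches 1 as x grows.

def f(x):
    return 3 * x ** 2 + 2 * x + 1

def g(x):
    return 3 * x ** 2

# Ratios at x = 10, 100, ..., 10^6: they decrease toward 1.
ratios = [f(10 ** k) / g(10 ** k) for k in range(1, 7)]
```

Only the leading term matters in the limit; the 2x + 1 contributes a vanishing fraction.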
4. Basic asymptotic notations

f ≼ g means lim sup_{n→∞} f(n)/g(n) < ∞, i.e., f(n)/g(n) is eventually bounded by a finite value.

Basic idea: f grows more slowly than g, or just as quickly as g.

≼ is a preorder (or quasiorder):
- Transitive: (f ≼ g) ∧ (g ≼ h) ⟹ (f ≼ h)
- Reflexive: f ≼ f

≼ fails to be a partial order because it is not antisymmetric: there are functions f, g where f ≼ g and g ≼ f but f ≠ g.

Variant: g ≽ f means f ≼ g.

Write f ≍ g when there are positive constants c_1, c_2 such that

c_1 ≤ f(n)/g(n) ≤ c_2

for sufficiently large n. Examples: n ≍ 2n; n ≍ (2 + sin πn)n.

≍ is an equivalence relation.
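The example n ≍ (2 + sin πn)n can be probed numerically over real arguments; a sketch (the sample points are arbitrary):

```python
import math

# The ratio (2 + sin(pi*x)) * x / x stays within the fixed positive
# bounds [1, 3] but keeps oscillating, so it is bounded both ways
# (the two functions are ≍) while the ratio has no limit (not ~).

def ratio(x):
    return (2 + math.sin(math.pi * x)) * x / x

samples = [ratio(x) for x in (10.5, 100.25, 1000.75, 12345.5)]
bounded = all(1.0 <= r <= 3.0 for r in samples)
```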
5. Strict forms

Write f ≺ g when f ≼ g but not f ≍ g.

Basic idea: f grows strictly less quickly than g. Equivalently, f ≺ g exactly when lim_{n→∞} f(n)/g(n) = 0.

Examples: x^2 ≺ x^3; log x ≺ x.

Variant: f ≻ g means g ≺ f.

Orders of growth

We can use ≺ as a ruler by which to judge the growth of functions. Some common tick marks on this ruler (for constants k ≥ 1 and 0 < ε < 1) are:

log log n ≺ log n ≺ log^k n ≺ n^ε ≺ n ≺ n^2 ≺ 2^n ≺ n! ≺ n^n ≺ 2^{2^n}

These tick marks sit in a dense total order without endpoints, i.e.:
- there is no slowest-growing function;
- there is no fastest-growing function;
- if f ≺ h we can always find a g such that f ≺ g ≺ h.

(The canonical example of a dense total order without endpoints is ℚ, the rationals.) This fact allows us to sketch graphs in which points on the axes are asymptotes.
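A spot check of part of this ruler at a single large n (illustrative only, not a proof; the last three entries are compared through their logarithms to avoid overflow):

```python
import math

n = 10 ** 6

# log log n < log n < log^2 n < n^(1/2) < n < n^2 at this n.
small = [math.log(math.log(n)), math.log(n), math.log(n) ** 2,
         n ** 0.5, n, n ** 2]
increasing_small = all(a < b for a, b in zip(small, small[1:]))

# Compare 2^n, n!, n^n via their logarithms:
# log(2^n) = n log 2, log(n!) = lgamma(n + 1), log(n^n) = n log n.
big_logs = [n * math.log(2), math.lgamma(n + 1), n * math.log(n)]
increasing_big = all(a < b for a, b in zip(big_logs, big_logs[1:]))
```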
6. Big-O notation

Big-O is a convenient family of notations for asymptotics:

O(g) = { f : f ≼ g }

i.e., O(g) is the set of functions f such that f ≼ g. For example, O(n^2) contains n^2, 7n^2, n, log n, n^{3/2}, 5, ... Note that f ∈ O(g) means exactly f ≼ g.

A standard abuse of notation is to treat a big-O expression as if it were a term:

x^2 + 2x^{1/2} + 1 = x^2 + O(x^{1/2})

The above equation should be read as "there exists a function f ∈ O(x^{1/2}) such that x^2 + 2x^{1/2} + 1 = x^2 + f(x)."

Big-O for algorithm analysis

Big-O notation is an excellent tool for expressing machine/compiler/language-independent complexity properties. On one machine a sorting algorithm might take 5.73 n log n seconds; on another it might take 9.42 n log n + 3.2n seconds. We can wave these differences aside by saying the algorithm runs in O(n log n) seconds. O(f(n)) means "something that behaves asymptotically like f(n)":
- disregarding any initial transient behaviour;
- disregarding any multiplicative constants c·f(n);
- disregarding any additive terms that grow less quickly than f(n).
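The two hypothetical machine timings above differ only by constant factors, which is exactly what O(·) discards; a quick numeric check:

```python
import math

# The two machine timings for the same sorting algorithm.
def t1(n):
    return 5.73 * n * math.log(n)

def t2(n):
    return 9.42 * n * math.log(n) + 3.2 * n

# Their ratio stays within fixed bounds, so both are O(n log n).
ratios = [t2(n) / t1(n) for n in (10 ** 2, 10 ** 4, 10 ** 6)]
bounded = all(1.0 < r < 2.0 for r in ratios)
```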
7. Basic properties of big-O notation

Given a choice between a sorting algorithm that runs in O(n^2) time and one that runs in O(n log n) time, which should we choose?
1. Gut instinct: the O(n log n) one, of course!
2. But note that the class of functions O(n^2) also contains n log n. Just because we say an algorithm is O(n^2) does not mean it takes n^2 time!
3. It could be that the "O(n^2)" algorithm is faster than the "O(n log n)" one.

Additional notations

To distinguish between "at most this fast", "at least this fast", etc., there are additional big-O-like notations:

f ∈ O(g)  ⟺  f ≼ g    upper bound
f ∈ o(g)  ⟺  f ≺ g    strict upper bound
f ∈ Θ(g)  ⟺  f ≍ g    tight bound
f ∈ Ω(g)  ⟺  f ≽ g    lower bound
f ∈ ω(g)  ⟺  f ≻ g    strict lower bound
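Point 3 is easy to demonstrate with made-up cost models (the constants here are invented purely for illustration):

```python
import math

# An "O(n^2)" algorithm with a small constant vs. an "O(n log n)"
# algorithm with a large constant: the quadratic one wins on small
# inputs, while the n log n one eventually takes over.

def cost_quadratic(n):
    return 0.01 * n ** 2

def cost_nlogn(n):
    return 100.0 * n * math.log(n)

faster_small = cost_quadratic(1000) < cost_nlogn(1000)
faster_large = cost_quadratic(10 ** 7) > cost_nlogn(10 ** 7)
```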
8. Tricks for a bad remembering day

- Lower case means strict: o(n) is the strict version of O(n); ω(n) is the strict version of Ω(n).
- ω/Ω (omega) is the last letter of the Greek alphabet: if f ∈ ω(g), then g comes after f in the asymptotic ordering.
- f ∈ Θ(g): the line through the middle of the theta; the asymptotes converge.

Notation: o(·)

f ∈ o(g) means f ≺ g. o(·) expresses a strict upper bound: if f(n) is o(g(n)), then f grows strictly slower than g.

o(1) indicates the class of functions g for which lim_{n→∞} g(n)/1 = 0, i.e., lim_{n→∞} g(n) = 0.

Example: Σ_{k=0}^{n} 2^{-k} = 2 - 2^{-n} = 2 + o(1), i.e., 2 plus something that vanishes as n → ∞.

If f is o(g), it is also O(g). Example: n! = o(n^n).
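The geometric-sum example can be verified directly; a sketch:

```python
# sum_{k=0}^{n} 2^{-k} = 2 - 2^{-n}: the "+ o(1)" part is the
# term -2^{-n}, which vanishes as n grows.

def partial_sum(n):
    return sum(2.0 ** -k for k in range(n + 1))

errors = [abs(partial_sum(n) - 2.0) for n in (5, 10, 20)]
vanishing = errors[0] > errors[1] > errors[2]
```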
9. Notation: ω(·)

f ∈ ω(g) means f ≻ g. ω(·) expresses a strict lower bound: if f(n) is ω(g(n)), then f grows strictly faster than g. f ∈ ω(g) is equivalent to g ∈ o(f).

Example: the harmonic series h_n = Σ_{k=1}^{n} 1/k = ln n + γ + O(n^{-1}):
- h_n ∈ ω(1) (it is unbounded);
- h_n ∈ ω(ln ln n).

Example: n! = ω(2^n) (n! grows faster than 2^n).

Notation: Ω(·)

f ∈ Ω(g) means f ≽ g. Ω(·) expresses a lower bound, not necessarily strict: if f(n) is Ω(g(n)), then f grows at least as fast as g. f ∈ Ω(g) is equivalent to g ∈ O(f).

Example: matrix multiplication requires Ω(n^2) time (at least enough time to look at each of the n^2 entries in the matrices).
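The harmonic-series estimate is easy to probe numerically: the scaled error n·(h_n - ln n - γ) stays bounded (it approaches 1/2, though the slide only claims O(n^{-1})):

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def harmonic(n):
    return sum(1.0 / k for k in range(1, n + 1))

# n times the O(1/n) error term: bounded, tending to 1/2.
errs = {n: n * (harmonic(n) - math.log(n) - EULER_GAMMA)
        for n in (100, 1000, 10000)}
```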
10. Notation: Θ(·)

f ∈ Θ(g) means f ≍ g. Θ(·) expresses a tight asymptotic bound: if f(n) is Θ(g(n)), then f(n)/g(n) is eventually contained in a finite positive interval [c_1, c_2]. Θ(·) bounds are very precise, but often hard to obtain.

Example: QuickSort runs in time Θ(n log n) on average. (Tight! Not much faster or slower!)

Example: Stirling's approximation ln n! = n ln n - n + O(ln n) implies that ln n! is Θ(n ln n).

Don't make the mistake of thinking that f ∈ Θ(g) means lim_{n→∞} f(n)/g(n) = k for some constant k; the limit need not exist.

Algebraic manipulations of big-O

Manipulating big-O terms requires some thought: always keep in mind what the symbols mean!

An additive O(f(n)) term swallows any terms that are ≼ f(n):

n^2 + n^{1/2} + O(n) + 3 = n^2 + O(n)

The n^{1/2} and 3 on the l.h.s. are meaningless in the presence of an O(n) term.

O(f(n)) - O(f(n)) = O(f(n)), not 0!
O(f(n)) · O(g(n)) = O(f(n)g(n)).

Example: what is ln n + γ + O(n^{-1}) times n + O(n^{1/2})?

[ln n + γ + O(n^{-1})] · [n + O(n^{1/2})] = n ln n + γn + O(n^{1/2} ln n)

The terms γO(n^{1/2}), O(n^{1/2}), O(1), etc. get swallowed by O(n^{1/2} ln n).
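The Stirling example can be checked with the log-gamma function, since lgamma(n + 1) = ln n!:

```python
import math

n = 10 ** 6

# The error of ln n! vs. n ln n - n is O(ln n)...
err = math.lgamma(n + 1) - (n * math.log(n) - n)
bounded = abs(err) < 10 * math.log(n)

# ...and ln n! / (n ln n) is trapped in a positive interval,
# consistent with ln n! being Θ(n ln n).
theta_ratio = math.lgamma(n + 1) / (n * math.log(n))
```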
11. Sharpness of estimates

Example: for a constant c,

ln(n + c) = ln(n(1 + c/n))
          = ln n + ln(1 + c/n)
          = ln n + c/n - c^2/(2n^2) + ···      (Maclaurin series)
          = ln n + Θ(1/n)

It is also correct to write

ln(n + c) = ln n + O(n^{-1})
ln(n + c) = ln n + o(1)

since Θ(n^{-1}) ⊆ O(n^{-1}) ⊆ o(1). However, the Θ(1/n) error term is sharper: a better estimate of the error.

Sharpness of estimates & the Riemann Hypothesis

Example: let π(n) be the number of prime numbers ≤ n. The Prime Number Theorem states that

π(n) ∼ Li(n)                                  (1)

where Li(n) = ∫_2^n dx/ln x is the logarithmic integral, and Li(n) ∼ n/ln n. Note that (1) is equivalent to:

π(n) = Li(n) + o(Li(n))

It is known that the error term can be improved, for example to

π(n) = Li(n) + O(n e^{-a√(ln n)})

for some constant a > 0.
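The Θ(1/n) error term in ln(n + c) = ln n + Θ(1/n) can be seen by scaling: n·(ln(n + c) - ln n) converges to c (here c = 7, an arbitrary choice of ours):

```python
import math

c = 7.0

# n * ln(1 + c/n) -> c, so the error term ln(n + c) - ln n is Θ(1/n)
# with the constant pinned down: it is c/n + O(1/n^2).
scaled = [n * (math.log(n + c) - math.log(n))
          for n in (10 ** 2, 10 ** 4, 10 ** 6)]
```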
12. Sharpness of estimates & the Riemann Hypothesis

The famous Riemann Hypothesis is the conjecture that a sharper error estimate is true:

π(n) = Li(n) + O(n^{1/2} ln n)

This is one of the Clay Institute millennium problems, with a $1,000,000 reward for a positive proof. Sharp estimates matter!

Sharpness of estimates

To maintain sharpness of asymptotic estimates during analysis, some caution is required. E.g., if f(n) = 2^n + O(n), what is log f(n)?

Bad answer: log f(n) = n + O(n).

More careful answer:

log f(n) = log(2^n + O(n))
         = log(2^n (1 + O(n 2^{-n})))
         = log(2^n) + log(1 + O(n 2^{-n}))

Since log(1 + δ(n)) ∈ O(δ(n)) if δ ∈ o(1),

log f(n) = n + O(n 2^{-n})

i.e., log f(n) is equal to n plus some value converging exponentially fast to 0.
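Taking one representative f(n) = 2^n + n of the class 2^n + O(n), the error log_2 f(n) - n does shrink like n·2^{-n}; a sketch:

```python
import math

# f(n) = 2^n + n is one representative of 2^n + O(n);
# log2(f(n)) - n shrinks roughly like n * 2^{-n}.

def log2_f(n):
    return math.log2(2.0 ** n + n)

errors = [log2_f(n) - n for n in (5, 10, 20)]
shrinking = errors[0] > errors[1] > errors[2] >= 0.0
```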
13. Sharpness of estimates

log f(n) = n + O(n 2^{-n}) is a reasonably sharp estimate. (But what happens if we take 2^{log f(n)} with this estimate?) If we don't care about the rate of convergence we can write

log f(n) = n + o(1)

where o(1) represents some function converging to zero. This is less sharp, since we have lost the rate of convergence. Even less sharp is

log f(n) ∼ n

which loses the idea that log f(n) - n → 0, and doesn't rule out things like log f(n) = n + n^{3/4}.

Asymptotic expansions

An asymptotic expansion of a function describes how that function behaves for large values. Often it is used when an explicit description of the function is too messy or hard to derive.

E.g., if I choose a string of n bits uniformly at random (i.e., each of the 2^n possible strings has probability 2^{-n}), what is the probability of getting at least (3/4)n 1s? It is easy to write the answer: there are (n choose k) ways of arranging k 1s, so the probability is:

P(n) = Σ_{k = (3/4)n}^{n} 2^{-n} (n choose k)

This equation is both exact and wholly uninformative.
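The "exact but uninformative" sum can at least be evaluated directly; a sketch using Python's exact binomial coefficients:

```python
from math import comb

# Exact probability of getting at least 3n/4 ones among n fair bits.
def tail_prob(n):
    k0 = (3 * n) // 4
    return sum(comb(n, k) for k in range(k0, n + 1)) / 2 ** n

# The tail probability decays rapidly as n grows.
probs = [tail_prob(n) for n in (20, 40, 80)]
decaying = probs[0] > probs[1] > probs[2]
```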
14. Asymptotic expansions

Can we do better? Yes! The number of 1s in a random bit string follows a binomial distribution, which is well approximated by the normal distribution as n → ∞:

Σ_{k = n/2 + α√n}^{n} 2^{-n} (n choose k)  →  ∫_α^∞ (1/√(2π)) e^{-x^2/2} dx = 1 - F(α)

where F(x) = (1/2)(1 + erf(x/√2)) is the cumulative normal distribution. Maple's asympt command yields the asymptotic expansion:

1 - F(x) ∈ O((1/x) e^{-x^2/2})

Asymptotic expansions: example

We want to estimate the probability of at least (3/4)n 1s. Solving

n/2 + α√n = (3/4)n

gives α = √n/4. Therefore the probability is

P(n) ≈ 1 - F(√n/4) ∈ O((1/√n) e^{-n/32})

So the probability of having more than (3/4)n 1s converges to 0 exponentially fast.
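As a consistency check (ours, not from the slides): the ratio of the exact binomial tail to the claimed decay rate (1/√n)·e^{-n/32} stays bounded, consistent with the O(·) estimate (in fact it tends to 0, since the normal-tail bound is not tight for such a large deviation):

```python
from math import comb, exp, sqrt

# Exact tail probability of >= 3n/4 ones among n fair bits.
def tail(n):
    return sum(comb(n, k) for k in range(3 * n // 4, n + 1)) / 2 ** n

# Ratio of the exact tail to (1/sqrt(n)) * exp(-n/32): bounded,
# and in fact decreasing toward 0.
ratios = [tail(n) / (exp(-n / 32) / sqrt(n)) for n in (64, 128, 256)]
```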
15. Asymptotic expansions

When taking an asymptotic expansion, one writes

ln n! ∼ n ln n - n + O(1)

rather than

ln n! = n ln n - n + O(1)

Writing ∼ is a clue to the reader that an asymptotic expansion is being taken, rather than just carrying an error term around.

Asymptotic expansions are very important in average-case analysis, where we are interested in characterizing how an algorithm performs for most inputs. To prove an algorithm runs in O(f(n)) time on average, one technique is to obtain an asymptotic estimate of the probability of running in time exceeding f(n), and show that it converges to zero very quickly.

Asymptotic expansions for average-case analysis

The time required to add two n-bit integers by a no-carry adder is proportional to the longest carry sequence. It can be shown that the probability of having a carry sequence of length at least t(n) satisfies

Pr(carry sequence ≥ t(n)) ≤ 2^{-t(n) + log n + O(1)}

If t(n) - log n → ∞, the probability converges to 0. We can conclude that the average running time is O(log n). In fact we can make a stronger statement:

Pr(carry sequence ≥ log n + ω(1)) → 0

Translation: the probability of having a carry sequence longer than log n + δ(n), where δ(n) is any unbounded function, converges to zero.
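The carry-chain claim can be explored with a small Monte Carlo simulation (ours; the adder model is the standard ripple-carry recurrence):

```python
import random

# Longest carry-propagation run when adding two random n-bit integers.
def longest_carry_run(n, rng):
    a, b = rng.getrandbits(n), rng.getrandbits(n)
    carry = run = best = 0
    for i in range(n):
        x, y = (a >> i) & 1, (b >> i) & 1
        carry = (x & y) | ((x ^ y) & carry)   # ripple-carry recurrence
        run = run + 1 if carry else 0
        best = max(best, run)
    return best

rng = random.Random(1)
n = 4096
avg = sum(longest_carry_run(n, rng) for _ in range(50)) / 50
# log2(4096) = 12: the average longest run sits near log2 n, far below n.
```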
16. The Taylor series method of asymptotic expansion

This is a very simple method of asymptotic expansion that works in simple cases; it is one technique Maple's asympt function uses. Recall that the Taylor series of a C^∞ function about x = 0 is given by:

f(x) = f(0) + x f'(0) + (x^2/2!) f''(0) + (x^3/3!) f'''(0) + ···

To obtain an asymptotic expansion of some function F(n) as n → ∞:
1. Substitute n = x^{-1} into F(n). (Then n → ∞ as x → 0.)
2. Take a Taylor series about x = 0.
3. Substitute x = n^{-1}.
4. Use the dominating term(s) as the expansion, and the next term as the error term.

Taylor series method of asymptotic expansion: example

Example expansion: F(n) = e^{1 + 1/n}. Obviously lim_{n→∞} F(n) = e, so we expect something of the form F(n) ∼ e + o(1).
1. Substitute n = x^{-1} into F(n): obtain F(x^{-1}) = e^{1+x}.
2. Taylor series about x = 0: e^{1+x} = e + xe + (x^2/2)e + (x^3/6)e + ···
3. Substitute x = n^{-1}: e + e/n + e/(2n^2) + e/(6n^3) + ···
4. Since e/(2n^2) ≺ e/n: F(n) ∼ e + Θ(1/n).
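The expansion of F(n) = e^{1 + 1/n} is easy to verify numerically: after subtracting e + e/n, the remainder scaled by n^2 tends to e/2, the coefficient of the next term:

```python
import math

def F(n):
    return math.exp(1.0 + 1.0 / n)

# n^2 * (F(n) - e - e/n) -> e/2, the next Taylor coefficient.
remainders = [n * n * (F(n) - math.e - math.e / n)
              for n in (10, 100, 1000)]
```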
17. Asymptotic analysis of algorithms

Asymptotic analysis of algorithms is a key tool for algorithms and data structures:
- Analyze algorithms/data structures to obtain sharp estimates of asymptotic resource consumption (e.g., time, space).
- Possibly use asymptotic expansions in the analysis to estimate, e.g., probabilities.
- Use these resource estimates to:
  - decide which algorithm/data structure is best according to design criteria;
  - reason about the performance of compositions (combinations) of algorithms and data structures.

References on asymptotics
- Course text: [1].
- Asymptotic notations: Concrete Mathematics, Graham, Knuth, and Patashnik, Ch. 9 [2].
- Advanced: Shackell, Symbolic Asymptotics [6]; Hardy, Orders of Infinity [3]; Lightstone and Robinson, Nonarchimedean Fields and Asymptotic Expansions [5].
References

[1] Thomas H. Cormen, Charles E. Leiserson, and Ronald L. Rivest. Introduction to Algorithms. McGraw-Hill, 1990.

[2] Ronald L. Graham, Donald E. Knuth, and Oren Patashnik. Concrete Mathematics: A Foundation for Computer Science. Addison-Wesley, Reading, MA, second edition, 1994.

[3] G. H. Hardy. Orders of Infinity: The "Infinitärcalcül" of Paul du Bois-Reymond. Hafner Publishing Co., New York, 1971. Reprint of the 1910 edition; Cambridge Tracts in Mathematics and Mathematical Physics, No. 12.

[4] Donald E. Knuth. Big omicron and big omega and big theta. SIGACT News, 8(2):18-24, 1976.

[5] A. H. Lightstone and Abraham Robinson. Nonarchimedean Fields and Asymptotic Expansions. North-Holland, Amsterdam, 1975. North-Holland Mathematical Library, Vol. 13.

[6] John R. Shackell. Symbolic Asymptotics. Volume 12 of Algorithms and Computation in Mathematics. Springer-Verlag, Berlin, 2004.
More informationCS 4407 Algorithms Lecture 2: Growth Functions
CS 4407 Algorithms Lecture 2: Growth Functions Prof. Gregory Provan Department of Computer Science University College Cork 1 Lecture Outline Growth Functions Mathematical specification of growth functions
More informationwith the size of the input in the limit, as the size of the misused.
Chapter 3. Growth of Functions Outline Study the asymptotic efficiency of algorithms Give several standard methods for simplifying the asymptotic analysis of algorithms Present several notational conventions
More informationAlgorithms and Theory of Computation. Lecture 2: Big-O Notation Graph Algorithms
Algorithms and Theory of Computation Lecture 2: Big-O Notation Graph Algorithms Xiaohui Bei MAS 714 August 14, 2018 Nanyang Technological University MAS 714 August 14, 2018 1 / 20 O, Ω, and Θ Nanyang Technological
More informationMATH 22 FUNCTIONS: ORDER OF GROWTH. Lecture O: 10/21/2003. The old order changeth, yielding place to new. Tennyson, Idylls of the King
MATH 22 Lecture O: 10/21/2003 FUNCTIONS: ORDER OF GROWTH The old order changeth, yielding place to new. Tennyson, Idylls of the King Men are but children of a larger growth. Dryden, All for Love, Act 4,
More informationBig-oh stuff. You should know this definition by heart and be able to give it,
Big-oh stuff Definition. if asked. You should know this definition by heart and be able to give it, Let f and g both be functions from R + to R +. Then f is O(g) (pronounced big-oh ) if and only if there
More informationAnalysis of Algorithms
Presentation for use with the textbook Data Structures and Algorithms in Java, 6th edition, by M. T. Goodrich, R. Tamassia, and M. H. Goldwasser, Wiley, 2014 Analysis of Algorithms Input Algorithm Analysis
More informationBig , and Definition Definition
Big O, Ω, and Θ Big-O gives us only a one-way comparison; if f is O(g) then g eventually is bigger than f from that point on, but in fact f could be very small in comparison. Example; 3n is O(2 2n ). We
More informationPrinciples of Algorithm Analysis
C H A P T E R 3 Principles of Algorithm Analysis 3.1 Computer Programs The design of computer programs requires:- 1. An algorithm that is easy to understand, code and debug. This is the concern of software
More informationBig O (Asymptotic Upper Bound)
Big O (Asymptotic Upper Bound) Linear search takes O(n) time. Binary search takes O(lg(n)) time. (lg means log 2 ) Bubble sort takes O(n 2 ) time. n 2 + 2n + 1 O(n 2 ), n 2 + 2n + 1 O(n) Definition: f
More informationWhen we use asymptotic notation within an expression, the asymptotic notation is shorthand for an unspecified function satisfying the relation:
CS 124 Section #1 Big-Oh, the Master Theorem, and MergeSort 1/29/2018 1 Big-Oh Notation 1.1 Definition Big-Oh notation is a way to describe the rate of growth of functions. In CS, we use it to describe
More informationON THE TAYLOR COEFFICIENTS OF THE HURWITZ ZETA FUNCTION
ON THE TAYLOR COEFFICIENTS OF THE HURWITZ ZETA FUNCTION Khristo N. Boyadzhiev Department of Mathematics, Ohio Northern University, Ada, Ohio, 45810 k-boyadzhiev@onu.edu Abstract. We find a representation
More informationLecture 2: Asymptotic Notation CSCI Algorithms I
Lecture 2: Asymptotic Notation CSCI 700 - Algorithms I Andrew Rosenberg September 2, 2010 Last Time Review Insertion Sort Analysis of Runtime Proof of Correctness Today Asymptotic Notation Its use in analyzing
More informationCS 4407 Algorithms Lecture 2: Iterative and Divide and Conquer Algorithms
CS 4407 Algorithms Lecture 2: Iterative and Divide and Conquer Algorithms Prof. Gregory Provan Department of Computer Science University College Cork 1 Lecture Outline CS 4407, Algorithms Growth Functions
More informationAnnouncements. CompSci 102 Discrete Math for Computer Science. Chap. 3.1 Algorithms. Specifying Algorithms
CompSci 102 Discrete Math for Computer Science Announcements Read for next time Chap. 3.1-3.3 Homework 3 due Tuesday We ll finish Chapter 2 first today February 7, 2012 Prof. Rodger Chap. 3.1 Algorithms
More informationAlgorithms and Their Complexity
CSCE 222 Discrete Structures for Computing David Kebo Houngninou Algorithms and Their Complexity Chapter 3 Algorithm An algorithm is a finite sequence of steps that solves a problem. Computational complexity
More informationAP Calculus Chapter 9: Infinite Series
AP Calculus Chapter 9: Infinite Series 9. Sequences a, a 2, a 3, a 4, a 5,... Sequence: A function whose domain is the set of positive integers n = 2 3 4 a n = a a 2 a 3 a 4 terms of the sequence Begin
More informationThe Growth of Functions. A Practical Introduction with as Little Theory as possible
The Growth of Functions A Practical Introduction with as Little Theory as possible Complexity of Algorithms (1) Before we talk about the growth of functions and the concept of order, let s discuss why
More informationAssignment 5 Bounding Complexities KEY
Assignment 5 Bounding Complexities KEY Print this sheet and fill in your answers. Please staple the sheets together. Turn in at the beginning of class on Friday, September 16. There are links in the examples
More informationCOMPUTATIONAL COMPLEXITY
ATHEATICS: CONCEPTS, AND FOUNDATIONS Vol. III - Computational Complexity - Osamu Watanabe COPUTATIONAL COPLEXITY Osamu Watanabe Tokyo Institute of Technology, Tokyo, Japan Keywords: {deterministic, randomized,
More informationInfinite series, improper integrals, and Taylor series
Chapter 2 Infinite series, improper integrals, and Taylor series 2. Introduction to series In studying calculus, we have explored a variety of functions. Among the most basic are polynomials, i.e. functions
More informationEXAMPLES OF PROOFS BY INDUCTION
EXAMPLES OF PROOFS BY INDUCTION KEITH CONRAD 1. Introduction In this handout we illustrate proofs by induction from several areas of mathematics: linear algebra, polynomial algebra, and calculus. Becoming
More informationCpt S 223. School of EECS, WSU
Algorithm Analysis 1 Purpose Why bother analyzing code; isn t getting it to work enough? Estimate time and memory in the average case and worst case Identify bottlenecks, i.e., where to reduce time Compare
More informationWe are going to discuss what it means for a sequence to converge in three stages: First, we define what it means for a sequence to converge to zero
Chapter Limits of Sequences Calculus Student: lim s n = 0 means the s n are getting closer and closer to zero but never gets there. Instructor: ARGHHHHH! Exercise. Think of a better response for the instructor.
More informationWhen we use asymptotic notation within an expression, the asymptotic notation is shorthand for an unspecified function satisfying the relation:
CS 124 Section #1 Big-Oh, the Master Theorem, and MergeSort 1/29/2018 1 Big-Oh Notation 1.1 Definition Big-Oh notation is a way to describe the rate of growth of functions. In CS, we use it to describe
More informationSolving recurrences. Frequently showing up when analysing divide&conquer algorithms or, more generally, recursive algorithms.
Solving recurrences Frequently showing up when analysing divide&conquer algorithms or, more generally, recursive algorithms Example: Merge-Sort(A, p, r) 1: if p < r then 2: q (p + r)/2 3: Merge-Sort(A,
More informationCOMP 382: Reasoning about algorithms
Fall 2014 Unit 4: Basics of complexity analysis Correctness and efficiency So far, we have talked about correctness and termination of algorithms What about efficiency? Running time of an algorithm For
More informationCh 01. Analysis of Algorithms
Ch 01. Analysis of Algorithms Input Algorithm Output Acknowledgement: Parts of slides in this presentation come from the materials accompanying the textbook Algorithm Design and Applications, by M. T.
More information2.2 Asymptotic Order of Growth. definitions and notation (2.2) examples (2.4) properties (2.2)
2.2 Asymptotic Order of Growth definitions and notation (2.2) examples (2.4) properties (2.2) Asymptotic Order of Growth Upper bounds. T(n) is O(f(n)) if there exist constants c > 0 and n 0 0 such that
More informationTheory of Computation Chapter 1: Introduction
Theory of Computation Chapter 1: Introduction Guan-Shieng Huang Sep. 20, 2006 Feb. 9, 2009 0-0 Text Book Computational Complexity, by C. H. Papadimitriou, Addison-Wesley, 1994. 1 References Garey, M.R.
More informationMA008/MIIZ01 Design and Analysis of Algorithms Lecture Notes 3
MA008 p.1/37 MA008/MIIZ01 Design and Analysis of Algorithms Lecture Notes 3 Dr. Markus Hagenbuchner markus@uow.edu.au. MA008 p.2/37 Exercise 1 (from LN 2) Asymptotic Notation When constants appear in exponents
More information