Analysis of Algorithm Efficiency. Dr. Yingwu Zhu


Measure Algorithm Efficiency. Time efficiency: how fast the algorithm runs, i.e., the amount of time required to accomplish the task (our focus!). Space efficiency: the amount of memory required; deals with the extra space the algorithm requires.

Time Efficiency. Time efficiency depends on: the size of the input, the speed of the machine, the quality of the source code, and the quality of the compiler. The last three vary from one platform to another, so we cannot express time efficiency meaningfully in real time units such as seconds!

Empirical analysis of time efficiency. Select a specific (typical) sample of inputs; use a physical unit of time (e.g., milliseconds) or count the actual number of the basic operation's executions; then analyze the empirical data. Limitation: the results depend on the particular computer and on the sample of inputs.
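
A minimal C sketch of this kind of measurement, assuming a placeholder routine algorithm_under_test and a global operation counter (both names are illustrative, not from the slides):

#include <stdio.h>
#include <time.h>

static long basic_op_count = 0;        /* incremented at each basic operation */

static void algorithm_under_test(int n) {
    for (int i = 0; i < n; i++)
        basic_op_count++;              /* stand-in for the basic operation */
}

int main(void) {
    int n = 1000000;
    clock_t start = clock();
    algorithm_under_test(n);
    double ms = 1000.0 * (clock() - start) / CLOCKS_PER_SEC;
    printf("n = %d: %.3f ms, %ld basic operations\n", n, ms, basic_op_count);
    return 0;
}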

Theoretical analysis of time efficiency. Time efficiency is analyzed by determining the number of repetitions of the basic operation as a function of the input size n. Basic operation: the operation that contributes most towards the running time of the algorithm. The running time is then approximated as T(n) ≈ c_op · C(n), where T(n) is the running time, c_op is the execution time of the basic operation, and C(n) is the number of times the basic operation is executed.

Input size and basic operation examples:

Problem                                    Input size measure                               Basic operation
Searching for a key in a list of n items   Number of the list's items, i.e. n               Key comparison
Multiplication of two matrices             Matrix dimensions or total number of elements    Multiplication of two numbers
Checking primality of a given integer n    Size of n = number of digits (in binary)         Division
Typical graph problem                      Number of vertices and/or edges                  Visiting a vertex or traversing an edge

Time Efficiency. T(n) is approximated by the number of times the basic operation is executed. It depends not only on the input size n but also on the arrangement of the input items. Best case: not informative. Average case: difficult to determine. Worst case: the one used to measure an algorithm's performance.

Example: Sequential Search. Best case: T(n) = 1 (the key is in the first position). Worst case: T(n) = n (the key is in the last position or absent). Average case: assuming a successful search with probability p and the key equally likely to be in any of the n positions, T(n) = p(n+1)/2 + n(1-p).
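
A C sketch of sequential search, to make the three cases concrete (the function name is illustrative; the slides use 1-based indexing, this sketch uses 0-based):

/* Returns the index of key in a[0..n-1], or -1 if absent.
   Basic operation: the key comparison a[i] == key.
   Best case: key in a[0], 1 comparison.
   Worst case: key absent (or in the last position), n comparisons.
   Average case with success probability p: p(n+1)/2 + n(1-p) comparisons. */
int sequential_search(const int a[], int n, int key) {
    for (int i = 0; i < n; i++)
        if (a[i] == key)
            return i;
    return -1;
}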

Order of Growth. The established framework for analyzing T(n) is its order of growth as n → ∞. The highest-order term is what counts: remember, we are doing asymptotic analysis, and as the input size grows larger, it is the highest-order term that dominates.

Asymptotic Order of Growth. A way of comparing functions that ignores constant factors and small input sizes. O(g(n)): the class of functions f(n) that grow no faster than g(n). Θ(g(n)): the class of functions f(n) that grow at the same rate as g(n). Ω(g(n)): the class of functions f(n) that grow at least as fast as g(n).

Big-oh

Big-omega

Big-theta

Upper Bound Notation. In general, a function f(n) is O(g(n)) if there exist positive constants c and n_0 such that f(n) ≤ c·g(n) for all n ≥ n_0; the order of growth of f(n) is at most the order of growth of g(n) (within a constant multiple). Formally: O(g(n)) = { f(n) : there exist positive constants c and n_0 such that f(n) ≤ c·g(n) for all n ≥ n_0 }.

Big-Oh Examples. 10n is O(n^2): 10n ≤ 10n^2 for all n ≥ 1 (take c = 10, n_0 = 1). 5n + 20 is O(n): 5n + 20 ≤ 25n for all n ≥ 1 (take c = 25, n_0 = 1).

An Example: Insertion Sort

InsertionSort(A, n) {
    for i = 2 to n {
        key = A[i]
        j = i - 1
        while (j > 0) and (A[j] > key) {
            A[j+1] = A[j]
            j = j - 1
        }
        A[j+1] = key
    }
}

An Example: Insertion Sort, traced on A = 30 10 40 20 (positions 1 to 4):

i = 2, key = 10: A[1] = 30 > 10, shift -> 30 30 40 20; j reaches 0, insert key -> 10 30 40 20
i = 3, key = 40: A[2] = 30 > 40 is false, no shifts; insert key -> 10 30 40 20 (unchanged)
i = 4, key = 20: A[3] = 40 > 20, shift -> 10 30 40 40; A[2] = 30 > 20, shift -> 10 30 30 40;
                 A[1] = 10 > 20 is false, insert key -> 10 20 30 40. Done!

Insertion Sort. How many times will the inner while loop execute?

T(n) for Insertion Sort. Worst case (input in reverse sorted order): the while loop runs i - 1 times for each i, so T(n) = Σ_{i=2}^{n} (i-1) = n(n-1)/2. Best case (input already sorted): the while-loop test fails immediately for each i, so T(n) = n - 1.
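
A C sketch of insertion sort instrumented to count key comparisons (the counter and the 0-based indexing are choices of this sketch, not of the slides); running it on sorted and reverse-sorted inputs reproduces the two counts above:

/* Sorts a[0..n-1] and returns the number of key comparisons performed.
   Already sorted input: n-1 comparisons; reverse-sorted input: n(n-1)/2. */
long insertion_sort_count(int a[], int n) {
    long comparisons = 0;
    for (int i = 1; i < n; i++) {          /* slides: for i = 2 to n (1-based) */
        int key = a[i];
        int j = i - 1;
        while (j >= 0) {
            comparisons++;                 /* basic operation: A[j] > key */
            if (a[j] <= key)
                break;                     /* key belongs at position j+1 */
            a[j + 1] = a[j];               /* shift the larger element right */
            j--;
        }
        a[j + 1] = key;
    }
    return comparisons;
}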

Insertion Sort is O(n^2). Proof: suppose the running time is an^2 + bn + c (if any of a, b, c is less than 0, replace that constant with its absolute value). Then an^2 + bn + c ≤ (a + b + c)n^2 + (a + b + c)n + (a + b + c) ≤ 3(a + b + c)n^2 for all n ≥ 1. Let c' = 3(a + b + c) and let n_0 = 1. Question: is InsertionSort O(n^3)? Is InsertionSort O(n)?

Big-O Fact. A polynomial of degree k is O(n^k). Proof: suppose f(n) = b_k n^k + b_{k-1} n^{k-1} + … + b_1 n + b_0. Let a_i = |b_i|. Then for n ≥ 1, f(n) ≤ a_k n^k + a_{k-1} n^{k-1} + … + a_1 n + a_0 = n^k · Σ a_i n^{i-k} ≤ n^k · Σ a_i = c·n^k, where c = Σ a_i.

Lower Bound Notation. In general, a function f(n) is Ω(g(n)) if there exist positive constants c and n_0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n_0. Example: a running time of an + b with a and b positive is Ω(n), since an ≤ an + b for all n ≥ 1, so we can take c = a and n_0 = 1 (what if b is negative?). Example: n^3 = Ω(n^2).

Asymptotic Tight Bound. A function f(n) is Θ(g(n)) if there exist positive constants c_1, c_2, and n_0 such that c_1·g(n) ≤ f(n) ≤ c_2·g(n) for all n ≥ n_0. Theorem: f(n) is Θ(g(n)) iff f(n) is both O(g(n)) and Ω(g(n)). Proof engineering: drop the low-order terms and ignore the leading constant. Example: 3n^3 + 90n^2 - 5n + 9999 = Θ(n^3).

Using limits for comparing orders of growth. Compute lim_{n→∞} t(n)/g(n):
    = 0: t(n) has a smaller order of growth than g(n)
    = c > 0: t(n) has the same order of growth as g(n)
    = ∞: t(n) has a larger order of growth than g(n)
Example: compare the orders of growth of (1/2)n(n-1) and n^2. Since lim_{n→∞} (1/2)n(n-1) / n^2 = 1/2 > 0, the two functions have the same order of growth: (1/2)n(n-1) ∈ Θ(n^2).

Practical Complexity: [plots of f(n) = log n, n, n log n, n^2, n^3, and 2^n for n = 1 to 20, shown with the vertical axis capped at 250, 500, 1000, and 5000; even at the largest scale, n^3 and especially 2^n quickly run off the chart, dominating the other functions]

Properties of Big-O:
    f(n) ∈ O(f(n))
    f(n) ∈ O(g(n)) iff g(n) ∈ Ω(f(n))
    If f(n) ∈ O(g(n)) and g(n) ∈ O(h(n)), then f(n) ∈ O(h(n)) (note the similarity with a ≤ b)
    If f_1(n) ∈ O(g_1(n)) and f_2(n) ∈ O(g_2(n)), then f_1(n) + f_2(n) ∈ O(max{g_1(n), g_2(n)})

Basic asymptotic efficiency classes:
    1          constant
    log n      logarithmic
    n          linear
    n log n    n-log-n
    n^2        quadratic
    n^3        cubic
    2^n        exponential
    n!         factorial

Analysis of Algorithms. We analyze both non-recursive and recursive algorithms; often, we focus on the worst case.

T(n) for non-recursive algorithms. General plan for analysis:
    1. Decide on a parameter n indicating the input size.
    2. Identify the algorithm's basic operation.
    3. Determine the worst, average, and best cases for inputs of size n.
    4. Set up a sum for the number of times the basic operation is executed.
    5. Simplify the sum using standard formulas and rules.

Useful summation formulas and rules:
    Σ_{i=l}^{u} 1 = 1 + 1 + … + 1 = u - l + 1; in particular, Σ_{i=1}^{n} 1 = n ∈ Θ(n)
    Σ_{i=1}^{n} i = 1 + 2 + … + n = n(n+1)/2 ≈ n^2/2 ∈ Θ(n^2)
    Σ_{i=1}^{n} i^2 = 1^2 + 2^2 + … + n^2 = n(n+1)(2n+1)/6 ≈ n^3/3 ∈ Θ(n^3)
    Σ_{i=0}^{n} a^i = 1 + a + … + a^n = (a^{n+1} - 1)/(a - 1) for any a ≠ 1; in particular, Σ_{i=0}^{n} 2^i = 2^0 + 2^1 + … + 2^n = 2^{n+1} - 1 ∈ Θ(2^n)
    Σ(a_i ± b_i) = Σa_i ± Σb_i;  Σc·a_i = c·Σa_i;  Σ_{i=l}^{u} a_i = Σ_{i=l}^{m} a_i + Σ_{i=m+1}^{u} a_i
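
As a quick sanity check on the second formula, which recurs in the nested-loop examples below, here is a small C program (illustrative only, not from the slides) that counts the iterations of a triangular double loop and compares the count with the closed form n(n-1)/2:

#include <assert.h>

/* Number of passes through the body of the triangular double loop. */
static long triangular_count(int n) {
    long count = 0;
    for (int i = 0; i < n - 1; i++)
        for (int j = i + 1; j < n; j++)
            count++;
    return count;
}

int main(void) {
    for (int n = 1; n <= 100; n++)
        assert(triangular_count(n) == (long)n * (n - 1) / 2);
    return 0;
}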

Example 1: Maximum element
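
The algorithm's pseudocode did not survive transcription; the C sketch below shows the standard maximum-element algorithm the analysis presumably refers to. The basic operation is the comparison A[i] > maxval, executed exactly n - 1 times regardless of the input, so C(n) = n - 1 ∈ Θ(n).

/* Largest element of a[0..n-1], n >= 1. */
int max_element(const int a[], int n) {
    int maxval = a[0];
    for (int i = 1; i < n; i++)
        if (a[i] > maxval)    /* basic operation: the comparison */
            maxval = a[i];
    return maxval;
}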

Example 2: Element uniqueness problem
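
Again the pseudocode is missing from the transcript; a C sketch of the standard brute-force algorithm follows. The basic operation is the element comparison; in the worst case (all elements distinct, or only the last pair equal) it executes Σ_{i=1}^{n-1} (n - i) = n(n-1)/2 times, so the algorithm is Θ(n^2) in the worst case.

/* Returns 1 if all elements of a[0..n-1] are distinct, 0 otherwise. */
int unique_elements(const int a[], int n) {
    for (int i = 0; i < n - 1; i++)
        for (int j = i + 1; j < n; j++)
            if (a[i] == a[j])     /* basic operation: the comparison */
                return 0;
    return 1;
}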

Example 3: Matrix multiplication
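
A C sketch of the definition-based algorithm the slide presumably showed (row-major storage in flat arrays is a choice of this sketch). The basic operation is the multiplication in the innermost loop, executed n · n · n times, so C(n) = n^3 ∈ Θ(n^3).

/* C = A x B for n x n matrices stored row-major in flat arrays. */
void matrix_multiply(int n, const double *A, const double *B, double *C) {
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++) {
            double sum = 0.0;
            for (int k = 0; k < n; k++)
                sum += A[i*n + k] * B[k*n + j];   /* basic operation */
            C[i*n + j] = sum;
        }
}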

Plan for Analysis of Recursive Algorithms:
    1. Decide on a parameter indicating an input's size.
    2. Identify the algorithm's basic operation.
    3. Check whether the number of times the basic operation is executed may vary on different inputs of the same size. (If it may, the worst, average, and best cases must be investigated separately.)
    4. Set up a recurrence relation, with an appropriate initial condition, expressing the number of times the basic operation is executed.
    5. Solve the recurrence (or, at the very least, establish its solution's order of growth) by backward substitutions or another method.

Example 1: Recursive evaluation of n!. Definition: n! = 1 · 2 · … · (n-1) · n for n ≥ 1, and 0! = 1. Recursive definition of n!: F(n) = F(n-1) · n for n ≥ 1, and F(0) = 1. Size: n. Basic operation: multiplication. Recurrence relation: M(n) = M(n-1) + 1 for n > 0, with M(0) = 0, which gives M(n) = n.
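
A direct C rendering of the recursive definition (a sketch; the 64-bit result overflows for n > 20). Each call performs one multiplication, which is exactly what the recurrence M(n) = M(n-1) + 1 counts.

/* n! computed as F(n) = F(n-1) * n, F(0) = 1. */
unsigned long long factorial(unsigned int n) {
    if (n == 0)
        return 1;                        /* F(0) = 1, no multiplication */
    return factorial(n - 1) * n;         /* one multiplication per call */
}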

Example 2: The Tower of Hanoi Puzzle. There are n disks of different sizes and 3 pegs; the goal is to move all the disks to the third peg, using the second one as an auxiliary if necessary. Move one disk at a time, and do not place a larger disk on top of a smaller one. Recurrence for the number of moves:

The Tower of Hanoi Puzzle: T(n) = 2T(n-1) + 1, T(1) = 1. Solving by backward substitution gives T(n) = 2^n - 1, so the number of moves grows exponentially in n.
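
A recursive C sketch of the solution (the peg labels are arbitrary characters chosen for this illustration). The two recursive calls plus the single move are exactly what the recurrence T(n) = 2T(n-1) + 1 counts.

#include <stdio.h>

/* Moves n disks from peg 'from' to peg 'to' using peg 'aux' as auxiliary. */
void hanoi(int n, char from, char to, char aux) {
    if (n == 1) {
        printf("move disk 1 from %c to %c\n", from, to);   /* T(1) = 1 */
        return;
    }
    hanoi(n - 1, from, aux, to);                           /* T(n-1) moves */
    printf("move disk %d from %c to %c\n", n, from, to);   /* + 1 move */
    hanoi(n - 1, aux, to, from);                           /* + T(n-1) moves */
}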

Example 3: Counting #bits
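
The transcript omits the algorithm itself; the C sketch below shows the standard recursive routine for counting the binary digits of n ≥ 1. Counting the additions gives the recurrence A(n) = A(⌊n/2⌋) + 1 for n > 1 with A(1) = 0; assuming n = 2^k it telescopes to A(2^k) = k, i.e. A(n) = ⌊log2 n⌋ ∈ Θ(log n).

/* Number of binary digits of n, for n >= 1. */
int bit_count(unsigned int n) {
    if (n == 1)
        return 1;                    /* one digit, no addition */
    return bit_count(n / 2) + 1;     /* basic operation: the addition */
}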

Smoothness rule. In the preceding example we assumed n = 2^k and then took advantage of the theorem called the smoothness rule, which claims that, under very broad assumptions, the order of growth observed for n = 2^k gives a correct answer about the order of growth for all values of n.