Central Algorithmic Techniques. Iterative Algorithms


Code Representation of an Algorithm

class InsertionSortAlgorithm extends SortAlgorithm {
    void sort(int a[]) throws Exception {
        for (int i = 1; i < a.length; i++) {
            int j = i;
            int B = a[i];
            while ((j > 0) && (a[j-1] > B)) {
                a[j] = a[j-1];
                j--;
            }
            a[j] = B;
        }
    }
}

Pros and Cons?

Code Representation of an Algorithm
Pros: Runs on computers. Precise and succinct.
Cons: I am not a computer; I need a higher level of intuition. Prone to bugs. Language dependent.

Two Key Types of Algorithms Iterative Algorithms Recursive Algorithms 4

Iterative Algorithms
Take one step at a time towards the final destination.

loop (while not done)
    take step
end loop

Loop Invariants A good way to structure many programs: Store the key information you currently know in some data representation. In the main loop, take a step forward towards destination by making a simple change to this data. 6
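As a minimal illustration of this pattern (a hypothetical example, not from the slides): the key information stored is a running total, and each iteration makes one simple change to it.

```java
public class InvariantSum {
    // Loop-invariant pattern: "total" is the key information we maintain.
    public static int sum(int[] a) {
        int total = 0;  // establish the invariant: total == sum of the empty prefix
        int i = 0;
        while (i < a.length) {          // exit condition: i == a.length
            // loop invariant: total == a[0] + ... + a[i-1]
            total += a[i];              // one simple change to the data
            i++;                        // measure of progress: i increases
        }
        // exit condition + invariant  =>  total == sum of the whole array
        return total;
    }
}
```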

The Getting to School Problem 7

Problem Specification
Precondition: location of home and school.
Postcondition: traveled from home to school.

General Principle Do not worry about the entire computation. Take one step at a time! 9

A Measure of Progress 75 km to school 79 km to school 10

Safe Locations Algorithm specifies from which locations it knows how to step. 11

Loop Invariant The computation is presently in a safe location. May or may not be true. 12

Defining Algorithm From every safe location, define one step towards school. 13

Take a step What is required of this step? 14

Maintain Loop Invariant If the computation is in a safe location, it does not step into an unsafe one. Can we be assured that the computation will always be in a safe location? No. What if it is not initially true? 15

Establishing Loop Invariant From the Pre-Conditions on the input instance we must establish the loop invariant. 16

Maintain Loop Invariant Can we be assured that the computation will always be in a safe location? By what principle? 17

Maintain Loop Invariant
By induction, the computation will always be in a safe location:
S(0), ∀i [S(i) ⇒ S(i+1)], therefore ∀i S(i).

Ending The Algorithm
Define Exit Condition.
Termination: with sufficient progress, the exit condition will be met.
When we exit, we know the exit condition is true and the loop invariant is true; from these we must establish the post conditions.

Let's Recap

Designing an Algorithm
Define Problem
Define Loop Invariant
Define Measure of Progress ("79 km to school")
Define Step
Define Exit Condition
Maintain Loop Invariant
Make Progress
Establish Initial Conditions
Ending

Simple Example Insertion Sort Algorithm

Code Representation of an Algorithm

class InsertionSortAlgorithm extends SortAlgorithm {
    void sort(int a[]) throws Exception {
        for (int i = 1; i < a.length; i++) {
            int j = i;
            int B = a[i];
            while ((j > 0) && (a[j-1] > B)) {
                a[j] = a[j-1];
                j--;
            }
            a[j] = B;
        }
    }
}
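The SortAlgorithm framework the slide extends is not shown; stripped of it, the same code runs stand-alone (a sketch with plain int arrays):

```java
public class InsertionSort {
    // Loop invariant: a[0..i-1] is sorted; the remaining elements are "off to the side".
    public static void sort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int b = a[i];                   // next element to insert
            int j = i;
            while (j > 0 && a[j - 1] > b) { // shift larger elements right
                a[j] = a[j - 1];
                j--;
            }
            a[j] = b;                       // insert it where it belongs
        }
    }
}
```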

Higher Level Abstract View: Representation of an Algorithm

Designing an Algorithm
Define Problem
Define Loop Invariant
Define Measure of Progress
Define Step
Define Exit Condition
Maintain Loop Invariant
Make Progress
Establish Initial Conditions
Ending

Problem Specification
Precondition: The input is a list of n values, with the same value possibly repeated.
Postcondition: The output is a list consisting of the same n values in non-decreasing order.
Example input: 88 52 31 14 25 98 30 62 23 79
Example output: 14,23,25,30,31,52,62,79,88,98

Define Loop Invariant
Some subset of the elements is sorted.
The remaining elements are off to the side.
Example: sorted sublist 23,31,52,88; off to the side: 30 14 25 62 98 79

Defining Measure of Progress 23,31,52,88 30 14 25 62 98 79 6 elements to school 28

Define Step Select arbitrary element from side. Insert it where it belongs. 23,31,52,88 30 14 25 62 98 79 23,31,52,62,88 30 25 98 14 79 29

Making Progress while Maintaining the Loop Invariant
Before: sorted sublist 23,31,52,88; to the side: 30 14 25 62 98 79 (6 elements to school)
After: sorted sublist 23,31,52,62,88; to the side: 30 25 98 14 79 (5 elements to school)

Beginning & Ending
Beginning: 88 52 14 31 25 62 30 98 23 79 (n elements to school)
Ending: 14,23,25,30,31,52,62,79,88,98 (0 elements to school)

Running Time
Inserting an element into a sorted list of size i takes Θ(i) time.
Total = 1 + 2 + 3 + … + n = n(n+1)/2 = Θ(n²)

OK, I know you knew Insertion Sort. But hopefully you are beginning to appreciate loop invariants for describing algorithms.

Assertions in Algorithms

Purpose of Assertions
Useful for thinking about algorithms: developing, describing, and proving correctness.

Definition of Assertions An assertion is a statement about the current state of the data structure that is either true or false. eg. the amount in your bank account is not negative. 36

Definition of Assertions It is made at some particular point during the execution of an algorithm. If it is false, then something has gone wrong in the logic of the algorithm. 37

Definition of Assertions An assertion is not a task for the algorithm to perform. It is only a comment that is added for the benefit of the reader. 38

Specifying a Computational Problem: an Example of Assertions
Preconditions: Any assumptions that must be true about the input instance.
Postconditions: The statement of what must be true when the algorithm/program returns.

Definition of Correctness
<PreCond> & <code> ⇒ <PostCond>
If the input meets the preconditions, then the output must meet the postconditions.
If the input does not meet the preconditions, then nothing is required.

An Example: A Sequence of Assertions

<assertion 0>
if( <condition 1> ) then
    code<1,true>
else
    code<1,false>
end if
<assertion 1>
…
<assertion r-1>
if( <condition r> ) then
    code<r,true>
else
    code<r,false>
end if
<assertion r>

Definition of Correctness: <assertion 0> & any <conditions> & code ⇒ <assertion r>.
How is this proved? Must check 2^r different settings of the <conditions>, i.e. 2^r paths through the code. Is there a faster way?

Step 1: <assertion 0> & <condition 1> & code<1,true> ⇒ <assertion 1>
        <assertion 0> & ¬<condition 1> & code<1,false> ⇒ <assertion 1>

Step 2: <assertion 1> & <condition 2> & code<2,true> ⇒ <assertion 2>
        <assertion 1> & ¬<condition 2> & code<2,false> ⇒ <assertion 2>

Step r: <assertion r-1> & <condition r> & code<r,true> ⇒ <assertion r>
        <assertion r-1> & ¬<condition r> & code<r,false> ⇒ <assertion r>

Only 2r checks are needed, instead of 2^r paths.

Another Example: A Loop
Type of Algorithm: Iterative. Type of Assertion: Loop Invariants.

<precond>
codeA
loop
    <loop-invariant>
    exit when <exit-cond>
    codeB
end loop
codeC
<postcond>

Definition of Correctness: <precond> & any <conditions> & code ⇒ <postcond>.
How is this proved? The computation may go around the loop an arbitrary number of times. Is there a faster way?

Step 0 (establishing): <precond> & codeA ⇒ <loop-invariant>
Step 1: <loop-invariant> & ¬<exit-cond> & codeB ⇒ <loop-invariant>
Step 2: <loop-invariant> & ¬<exit-cond> & codeB ⇒ <loop-invariant>
Step 3: <loop-invariant> & ¬<exit-cond> & codeB ⇒ <loop-invariant>
…
Step i: <loop-invariant> & ¬<exit-cond> & codeB ⇒ <loop-invariant>
All these steps are the same and therefore only need be done once!
Last Step (clean up): <loop-invariant> & <exit-cond> & codeC ⇒ <postcond>

Partial Correctness
Establishing the Loop Invariant: <precond> & codeA ⇒ <loop-invariant>
Maintaining the Loop Invariant: <loop-invariant> & ¬<exit-cond> & codeB ⇒ <loop-invariant>
Clean up loose ends: <loop-invariant> & <exit-cond> & codeC ⇒ <postcond>
Proves that IF the program terminates then it works: <PreCond> & <code> ⇒ <PostCond>

Algorithm Termination
The measure of progress (79 km, 75 km, …, 0 km) decreases until the exit condition is met.

Algorithm Correctness
Partial Correctness + Termination = Correctness

Designing Loop Invariants Coming up with the loop invariant is the hardest part of designing an algorithm. It requires practice, perseverance, and insight. Yet from it the rest of the algorithm follows easily 59

Don't start coding. You must design a working algorithm first.

Exemplification: Try solving the problem on small input examples. 61

Start with Small Steps What basic steps might you follow to make some kind of progress towards the answer? Describe or draw a picture of what the data structure might look like after a number of these steps. 62

Picture from the Middle Leap into the middle of the algorithm. What would you like your data structure to look like when you are half done? 63

Ask for 100% Pretend that a genie has granted your wish. You are now in the middle of your computation and your dream loop invariant is true. 64

Ask for 100% Maintain the Loop Invariant: From here, are you able to take some computational steps that will make progress while maintaining the loop invariant? 65

Ask for 100% If you can maintain the loop invariant, great. If not, Too Weak: If your loop invariant is too weak, then the genie has not provided you with everything you need to move on. Too Strong: If your loop invariant is too strong, then you will not be able to establish it initially or maintain it. 66

Differentiating between Iterations
x = x + 2: meaningful as code, false as a mathematical statement.
x′ = value at the beginning of the iteration.
x″ = new value after going around the loop one more time.
x″ = x′ + 2: meaningful as a mathematical statement.

Loop Invariants for Iterative Algorithms Three Search Examples

Define Problem: Binary Search
Preconditions: key 25; sorted list: 3 5 6 13 18 21 21 25 36 43 49 51 53 60 72 74 83 88 91 95
Postconditions: find the key in the list (if it is there).

Define Loop Invariant Maintain a sublist. If the key is contained in the original list, then the key is contained in the sublist. key 25 3 5 6 13 18 21 21 25 36 43 49 51 53 60 72 74 83 88 91 95 70

Define Step Make Progress Maintain Loop Invariant key 25 3 5 6 13 18 21 21 25 36 43 49 51 53 60 72 74 83 88 91 95 71

Define Step
Cut the sublist in half. Determine which half the key would be in. Keep that half.
key 25; list: 3 5 6 13 18 21 21 25 36 43 49 51 53 60 72 74 83 88 91 95
If key ≤ A[mid], then the key is in the left half.
If key > A[mid], then the key is in the right half.

Define Step
It is faster not to check whether the middle element is the key. Simply continue.
key 43; list: 3 5 6 13 18 21 21 25 36 43 49 51 53 60 72 74 83 88 91 95
If key ≤ A[mid], then the key is in the left half.
If key > A[mid], then the key is in the right half.

Make Progress
The size of the sublist becomes smaller (79 km down to 75 km).

Initial Conditions
key 25; the sublist is the entire original list.
If the key is contained in the original list, then the key is contained in the sublist.

Ending Algorithm
Exit when the sublist contains one element.
If the key is contained in the original list, then the key is contained in the sublist; hence the key is at this location.

If key not in original list If the key is contained in the original list, then the key is contained in the sublist. Loop invariant true, even if the key is not in the list. key 24 3 5 6 13 18 21 21 25 36 43 49 51 53 60 72 74 83 88 91 95 If the key is contained in the original list, then the key is at this location. Conclusion still solves the problem. Simply check this one location for the key. 77

Running Time
The sublist is of size n, n/2, n/4, n/8, …, 1.
Each step takes Θ(1) time.
Total = Θ(log n)

BinarySearch(A[1..n], key)
<precondition>: A[1..n] is sorted in non-decreasing order
<postcondition>: If key is in A[1..n], the algorithm returns its location

p = 1, q = n
while q > p
    <loop-invariant>: If key is in A[1..n], then key is in A[p..q]
    mid = ⌊(p + q) / 2⌋
    if key ≤ A[mid]
        q = mid
    else
        p = mid + 1
    end
end
if key = A[p]
    return p
else
    return "key not in list"
end
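A direct Java rendering of the pseudocode (a sketch: 0-based indexing, mid computed as p + (q − p)/2, which is the same floor but avoids integer overflow, and −1 standing in for "key not in list"):

```java
public class BinarySearch {
    // Returns the index of key in a (sorted, non-decreasing), or -1 if absent.
    public static int search(int[] a, int key) {
        int p = 0, q = a.length - 1;
        while (q > p) {
            // loop invariant: if key is in a[0..n-1], then key is in a[p..q]
            int mid = p + (q - p) / 2;   // floor: mid < q whenever p < q
            if (key <= a[mid]) q = mid;  // key would be in the left half
            else p = mid + 1;            // key would be in the right half
        }
        return (a.length > 0 && a[p] == key) ? p : -1;
    }
}
```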

Algorithm Definition Completed
Define Problem
Define Loop Invariant
Define Measure of Progress
Define Step
Define Exit Condition
Maintain Loop Invariant
Make Progress
Establish Initial Conditions
Ending

BinarySearch(A[1..n], key)
<precondition>: A[1..n] is sorted in non-decreasing order
<postcondition>: If key is in A[1..n], the algorithm returns its location

p = 1, q = n
while q > p
    <loop-invariant>: If key is in A[1..n], then key is in A[p..q]
    mid = ⌊(p + q) / 2⌋
    if key ≤ A[mid]
        q = mid
    else
        p = mid + 1
    end
end
if key = A[p]
    return p
else
    return "key not in list"
end

Simple, right? Although the concept is simple, binary search is notoriously easy to get wrong. Why is this? 82

The Devil in the Details The basic idea behind binary search is easy to grasp. It is then easy to write pseudocode that works for a typical case. Unfortunately, it is equally easy to write pseudocode that fails on the boundary conditions. 83

The Devil in the Details
Version 1: if key ≤ A[mid] then q = mid, else p = mid + 1.
Version 2: if key < A[mid] then q = mid, else p = mid + 1.
Which condition will break the loop invariant?

The Devil in the Details
key 36; the middle element A[mid] = 36.
With "if key < A[mid]": key = A[mid], so the code selects the right half (p = mid + 1) and the key is lost. Bug!!

The Devil in the Details
if key ≤ A[mid] then q = mid, else p = mid + 1: OK
if key < A[mid] then q = mid − 1, else p = mid: OK
if key < A[mid] then q = mid, else p = mid + 1: Not OK!!

The Devil in the Details
mid = ⌈(p + q) / 2⌉ or mid = ⌊(p + q) / 2⌋?
Shouldn't matter, right? Select mid = ⌈(p + q) / 2⌉.

The Devil in the Details
key 25; mid. List: 3 5 6 13 18 21 21 25 36 43 49 51 53 60 72 74 83 88 91 95
If key ≤ A[mid], then the key is in the left half.
If key > A[mid], then the key is in the right half.


The Devil in the Details
Another bug! key 25: with mid = ⌈(p + q) / 2⌉ and a sublist of two elements, mid = q, so "key ≤ A[mid] ⇒ q = mid" makes no progress toward the goal. Loops forever!

The Devil in the Details
mid = ⌊(p + q) / 2⌋; if key ≤ A[mid] then q = mid, else p = mid + 1: OK
mid = ⌈(p + q) / 2⌉; if key < A[mid] then q = mid − 1, else p = mid: OK
mid = ⌈(p + q) / 2⌉; if key ≤ A[mid] then q = mid, else p = mid + 1: Not OK!!

How Many Possible Algorithms?
mid = ⌊(p + q) / 2⌋ or mid = ⌈(p + q) / 2⌉?
key ≤ A[mid] or key < A[mid]?
q = mid or q = mid − 1? p = mid or p = mid + 1?

Alternative Algorithm: Less Efficient but More Clear

BinarySearch(A[1..n], key)
<precondition>: A[1..n] is sorted in non-decreasing order
<postcondition>: If key is in A[1..n], the algorithm returns its location

p = 1, q = n
while q > p
    <loop-invariant>: If key is in A[1..n], then key is in A[p..q]
    mid = ⌊(p + q) / 2⌋
    if key = A[mid]
        return mid
    elseif key < A[mid]
        q = mid − 1
    else
        p = mid + 1
    end
end
if key = A[p]
    return p
else
    return "key not in list"
end

Moral Use the loop invariant method to think about algorithms. Be careful with your definitions. Be sure that the loop invariant is always maintained. Be sure progress is always made. Having checked the typical cases, pay particular attention to boundary conditions and the end game. 94

Loop Invariants for Iterative Algorithms A Second Search Example: The Binary Search Tree

Define Problem: Binary Search Tree
Preconditions: key 25; a binary search tree (level by level): 38; 25 51; 17 31 42 63; 4 21 28 35 40 49 55 71
Postconditions: find the key in the BST (if it is there).

Binary Search Tree
All nodes in the left subtree ≤ any node ≤ all nodes in the right subtree.
(Level by level): 38; 25 51; 17 31 42 63; 4 21 28 35 40 49 55 71

Maintain a sub-tree. Define Loop Invariant If the key is contained in the original tree, then the key is contained in the sub-tree. key 17 38 25 51 17 31 42 63 4 21 28 35 40 49 55 71 98

Define Step Cut sub-tree in half. Determine which half the key would be in. Keep that half. key 17 38 25 51 17 31 42 63 4 21 28 35 40 49 55 71 If key < root, then key is in left half. If key = root, then key is found If key > root, then key is in right half. 99
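A Java sketch of this step; the Node class here is a minimal assumption, since the slides give no BST code:

```java
public class BstSearch {
    static class Node {
        int key;
        Node left, right;
        Node(int key, Node left, Node right) { this.key = key; this.left = left; this.right = right; }
    }

    // Loop invariant: if key is in the original tree, it is in the subtree rooted at t.
    public static boolean contains(Node t, int key) {
        while (t != null) {                   // exit: empty subtree means key not present
            if (key < t.key) t = t.left;      // key would be in the left subtree
            else if (key > t.key) t = t.right; // key would be in the right subtree
            else return true;                 // key found at the root of this subtree
        }
        return false;
    }
}
```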

Algorithm Definition Completed
Define Problem
Define Loop Invariant
Define Measure of Progress
Define Step
Define Exit Condition
Maintain Loop Invariant
Make Progress
Establish Initial Conditions
Ending

Card Trick A volunteer, please. 101

Loop Invariants for Iterative Algorithms A Third Search Example: A Card Trick

Pick a Card Done 103

Loop Invariant: The selected card is one of these. 104

Which column? left 105

Loop Invariant: The selected card is one of these. 106

Selected column is placed in the middle 107

I will rearrange the cards 108

Relax Loop Invariant: I will remember the same about each column. 109

Which column? right 110

Loop Invariant: The selected card is one of these. 111

Selected column is placed in the middle 112

I will rearrange the cards 113

Which column? left 114

Loop Invariant: The selected card is one of these. 115

Selected column is placed in the middle 116

Here is your card. Wow! 117

Ternary Search
Loop Invariant: the selected card is in the central subset of the cards.
Size of the subset = n / 3^(i−1), where n = total number of cards and i = iteration index.
How many iterations are required to guarantee success?
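Using the invariant's formula, the subset shrinks to a single card when

```latex
\frac{n}{3^{\,i-1}} = 1
\quad\Longrightarrow\quad
i = \log_3 n + 1 .
```

For instance, if the trick used n = 27 cards (an illustrative value; the slides do not state the count), the subset sizes would be 27, 9, 3, 1, so three column questions guarantee success.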

Loop Invariants for Iterative Algorithms A Fourth Example: Partitioning (Not a search problem: can be used for sorting, e.g., Quicksort)

The Partitioning Problem
Input: pivot x = 52; list 88 31 25 52 14 98 30 62 23 79
Output: 31 14 30 25 23 52 88 62 98 79
Problem: Partition a list into a set of small values and a set of large values.

Precise Specification
Precondition: A[p..r] is an arbitrary list of values; x = A[r] is the pivot.
Postcondition: A is rearranged such that A[p..q−1] ≤ A[q] = x ≤ A[q+1..r], for some q.

Loop Invariant 3 subsets are maintained One containing values less than or equal to the pivot One containing values greater than the pivot One containing values yet to be processed 122

Maintaining Loop Invariant
Consider the element at location j.
If it is greater than the pivot, incorporate it into the "> pivot" set by incrementing j.
If it is less than or equal to the pivot, incorporate it into the "≤ pivot" set by swapping it with the element at location i+1 and incrementing both i and j.
Measure of progress: the size of the unprocessed set.
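These swap-and-increment steps amount to a Lomuto-style partition; a Java sketch under that reading, with x = A[r] as the pivot as in the specification:

```java
public class Partition {
    // Rearranges a[p..r] so that a[p..q-1] <= a[q] = pivot <= a[q+1..r]; returns q.
    public static int partition(int[] a, int p, int r) {
        int x = a[r];                  // the pivot
        int i = p - 1;                 // a[p..i] is the "<= pivot" set
        for (int j = p; j < r; j++) {  // a[j..r-1] is the unprocessed set (shrinks each step)
            // loop invariant: a[p..i] <= x  and  a[i+1..j-1] > x
            if (a[j] <= x) {           // incorporate a[j] into the "<= pivot" set
                i++;
                int t = a[i]; a[i] = a[j]; a[j] = t;
            }                          // else: a[j] already extends the "> pivot" set
        }
        int t = a[i + 1]; a[i + 1] = a[r]; a[r] = t;  // put the pivot between the two sets
        return i + 1;
    }
}
```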

Maintaining Loop Invariant 124

Establishing Loop Invariant 125

Establishing Postcondition
On exit, j = r, so the unprocessed set is empty: the "≤ pivot" and "> pivot" sets are exhaustive.

Establishing Postcondition 127

An Example 128

Running Time
Each iteration takes Θ(1) time. Total = Θ(n).

More Examples of Iterative Algorithms
Using Constraints on Input to Achieve Linear-Time Sorting

Recall: InsertionSort
Worst case (reverse order): t_j = j, so T(n) = Σ_{j=2..n} j = n(n+1)/2 − 1 = Θ(n²).

Recall: MergeSort
T(n) = Θ(n log n)

Comparison Sorts InsertionSort and MergeSort are examples of (stable) Comparison Sort algorithms. QuickSort is another example we will study shortly. Comparison Sort algorithms sort the input by successive comparison of pairs of input elements. Comparison Sort algorithms are very general: they make no assumptions about the values of the input elements. 133

Comparison Sorts
InsertionSort is Θ(n²). MergeSort is Θ(n log n). Can we do better?

Comparison Sort: Decision Trees Example: Sorting a 3-element array A[1..3] 135

Comparison Sort
Worst-case time is equal to the height of the binary decision tree.
The height of the tree is at least the log of the number of leaves.
The leaves of the tree represent all possible permutations of the input: there are n! of them.
Thus the worst case is at least log(n!) = Ω(n log n), and MergeSort is asymptotically optimal.
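The lower bound on the height h can be made explicit: a binary tree with n! leaves satisfies 2^h ≥ n!, and since n! contains at least n/2 factors each of size at least n/2,

```latex
h \;\ge\; \log_2(n!) \;\ge\; \log_2\!\left(\left(\tfrac{n}{2}\right)^{n/2}\right)
\;=\; \tfrac{n}{2}\,\log_2\tfrac{n}{2} \;=\; \Omega(n \log n).
```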

Linear Sorts?
Comparison sorts are very general, but are Ω(n log n).
Faster sorting may be possible if we can constrain the nature of the input.

Example 1. Counting Sort
Counting Sort applies when the elements to be sorted come from a finite (and preferably small) set.
For example, the elements to be sorted are integers in the range [0..k−1], for some fixed integer k.
We can then create an array V[0..k−1] and use it to count the number of elements with each value in [0..k−1].
Then each input element can be placed in exactly the right place in the output array in constant time.

Counting Sort
Input:  1 0 0 1 3 1 1 3 1 0 2 1 0 1 1 2 2 1 0
Output: 0 0 0 0 0 1 1 1 1 1 1 1 1 1 2 2 2 3 3
Input: N records with integer keys in [0..k−1]. Output: stably sorted keys.
Algorithm:
Count the frequency of each key value to determine the transition locations.
Go through the records in order, putting them where they go.

CountingSort
Input:  1 0 0 1 3 1 1 3 1 0 2 1 0 1 1 2 2 1 0
Output: 0 0 0 0 0 1 1 1 1 1 1 1 1 1 2 2 2 3 3
Index:  0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18
Stable sort: if two keys are the same, their order does not change. Thus the 4th record in the input with digit 1 must be the 4th record in the output with digit 1. It belongs at output index 8, because 8 records go before it: 5 records with a smaller digit and 3 records with the same digit. Count these!

CountingSort
Input: 1 0 0 1 3 1 1 3 1 0 2 1 0 1 1 2 2 1 0
Value v:                    0  1  2  3
# of records with digit v:  5  9  3  2
N records. Time to count? Θ(N)

CountingSort
Input: 1 0 0 1 3 1 1 3 1 0 2 1 0 1 1 2 2 1 0
Value v:                      0  1  2  3
# of records with digit v:    5  9  3  2
# of records with digit < v:  0  5  14 17
N records, k different values. Time to count? Θ(k)

CountingSort
Input:  1 0 0 1 3 1 1 3 1 0 2 1 0 1 1 2 2 1 0
Output: 0 0 0 0 0 1 1 1 1 1 1 1 1 1 2 2 2 3 3
Index:  0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18
Value v:                      0  1  2  3
# of records with digit < v:  0  5  14 17  = location of the first record with digit v.

CountingSort
Input: 1 0 0 1 3 1 1 3 1 0 2 1 0 1 1 2 2 1 0
Where does the first record (digit 1) go?
Value v:                                0  1  2  3
Location of first record with digit v:  0  5  14 17
Algorithm: Go through the records in order, putting them where they go.

Loop Invariant
The first i−1 keys have been placed in the correct locations in the output array.
The auxiliary data structure indicates, for each possible key value in [0..k−1], the location at which to place the i-th key.
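The counting and placement steps can be sketched in Java as follows (names such as loc are assumptions, not from the slides):

```java
public class CountingSort {
    // Stable sort of N records whose keys lie in [0..k-1].
    public static int[] sort(int[] a, int k) {
        int[] loc = new int[k];
        for (int v : a) loc[v]++;            // count each key value: Θ(N)
        int before = 0;                      // prefix sums: Θ(k)
        for (int v = 0; v < k; v++) {
            int count = loc[v];
            loc[v] = before;                 // loc[v] = # of records with a smaller key,
            before += count;                 // i.e. where the first record with key v goes
        }
        int[] out = new int[a.length];
        for (int x : a) out[loc[x]++] = x;   // place records in order: stable, Θ(N)
        return out;
    }
}
```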

CountingSort Input: 1 0 0 1 3 1 1 3 1 0 2 1 0 1 1 2 2 1 0. The table now holds, for each value v, the location of the next record with digit v (initially 0 → 0, 1 → 5, 2 → 14, 3 → 17). Algorithm: Go through the records in order, putting them where they go: write a record with digit v at the location stored for v, then increment that location. For example, the first record (a 1) goes to index 5, advancing the location for 1 to 6; the second record (a 0) goes to index 0, advancing the location for 0 to 1; and so on for all N records. 146–158

CountingSort Final state. Output: 0 0 0 0 0 1 1 1 1 1 1 1 1 1 2 2 2 3 3. Location of next record with digit v: 0 → 5, 1 → 14, 2 → 17, 3 → 19. Time to place the records: Θ(N). Time total: Θ(N+k) 159
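The counting, prefix-sum, and placement phases above can be sketched in Python; this is a minimal rendering of the slides' procedure, with function and variable names chosen here for illustration:

```python
def counting_sort(records, k):
    """Stable counting sort of integer keys in the range [0, k-1]."""
    count = [0] * k
    for r in records:            # Theta(N): count records with each digit value
        count[r] += 1
    loc = [0] * k                # loc[v] = location of first record with digit v
    for v in range(1, k):        # Theta(k): prefix sums of the counts
        loc[v] = loc[v - 1] + count[v - 1]
    out = [None] * len(records)
    for r in records:            # Theta(N): go through records in order,
        out[loc[r]] = r          # putting each where it goes (stable)
        loc[r] += 1
    return out

# The slides' example: 19 records, k = 4 values
print(counting_sort([1,0,0,1,3,1,1,3,1,0,2,1,0,1,1,2,2,1,0], 4))
```

Because records with equal keys are placed in input order, the sort is stable, which RadixSort below depends on.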

Example 2. RadixSort Input: a stack of N punch cards. Each card contains d digits; each digit is in [0..k-1]. Output: sorted cards. Digit Sort: select one digit and separate the cards into k piles based on that digit (e.g., using Counting Sort): 344 125 333 134 224 334 143 225 325 243 → 125 224 225 325 333 134 334 344 143 243. Stable sort: if two cards are the same for that digit, their order does not change. 160

RadixSort 344 125 333 134 224 334 143 225 325 243. Sort with respect to which digit first? The most significant: 125 134 143 224 225 243 344 333 334 325. Sort with respect to which digit second? The next most significant: 125 224 225 325 134 333 334 143 243 344. All meaning in the first sort is lost. 161

RadixSort 344 125 333 134 224 334 143 225 325 243. Sort with respect to which digit first? The least significant: 333 143 243 344 134 224 334 125 225 325. Sort with respect to which digit second? The next least significant: 224 125 225 325 333 134 334 143 243 344. 162

RadixSort After these two passes, the list 2|24 1|25 2|25 3|25 3|33 1|34 3|34 1|43 2|43 3|44 is sorted with respect to its least significant 2 digits. 163

RadixSort Suppose the list 2|24 1|25 2|25 3|25 3|33 1|34 3|34 1|43 2|43 3|44 is sorted with respect to its first i (least significant) digits. Sort it with respect to the (i+1)-st digit: 1|25 1|34 1|43 2|24 2|25 2|43 3|25 3|33 3|34 3|44. It is now sorted with respect to the first i+1 digits: records with different (i+1)-st digits are in the correct order because we just sorted on that high-order digit, and records with the same (i+1)-st digit remain in the correct order because they were already sorted and the stable sort left them sorted. 164–165

Loop Invariant The keys have been correctly stable-sorted with respect to the i-1 least-significant digits. 166

Running Time The running time is Θ(d(n + k)), where d = # of digits in each number, n = # of elements to be sorted, and k = # of possible values for each digit. 167
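The least-significant-digit-first loop can be sketched as follows; this is an illustrative Python version (names are mine) that runs one stable counting sort per digit, giving the Θ(d(n + k)) time above:

```python
def radix_sort(nums, d, k=10):
    """LSD RadixSort: stable counting sort on each of the d base-k
    digits, least significant digit first."""
    for i in range(d):                       # digit i = 0 is least significant
        digit = lambda x: (x // k**i) % k
        count = [0] * k
        for x in nums:                       # count records with each digit value
            count[digit(x)] += 1
        loc = [0] * k                        # prefix sums -> first location per value
        for v in range(1, k):
            loc[v] = loc[v - 1] + count[v - 1]
        out = [None] * len(nums)
        for x in nums:                       # stable placement preserves the order
            out[loc[digit(x)]] = x           # established by earlier passes
            loc[digit(x)] += 1
        nums = out
    return nums

# The slides' punch-card example: d = 3 digits
print(radix_sort([344,125,333,134,224,334,143,225,325,243], d=3))
```

By the loop invariant, after pass i the list is sorted with respect to its i least significant digits, so after d passes it is fully sorted.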

Example 3. Bucket Sort Applicable if the input is constrained to a finite interval, e.g., [0, 1). If the input is random and uniformly distributed, the expected run time is Θ(n). 168

Bucket Sort 169

Loop Invariants Loop 1 The first i-1 keys have been correctly placed into buckets of width 1/n. Loop 2 The keys within each of the first i-1 buckets have been correctly stable-sorted. 170

PseudoCode [pseudocode figure; per-line costs Θ(1), Θ(1), Θ(n), Θ(n), for a Θ(n) expected total] 171
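The two loops named in the invariants can be sketched as follows; a minimal Python version (names mine) assuming keys uniformly distributed in [0, 1), with insertion sort inside each bucket:

```python
def bucket_sort(keys):
    """Bucket sort for keys in [0, 1): drop each key into one of n
    buckets of width 1/n, sort each bucket, then concatenate."""
    n = len(keys)
    buckets = [[] for _ in range(n)]
    for x in keys:                       # Loop 1: key x goes into bucket floor(n*x)
        buckets[int(n * x)].append(x)
    out = []
    for b in buckets:                    # Loop 2: stable-sort each bucket
        for i in range(1, len(b)):       # insertion sort within the bucket
            key, j = b[i], i - 1
            while j >= 0 and b[j] > key:
                b[j + 1] = b[j]
                j -= 1
            b[j + 1] = key
        out.extend(b)
    return out

print(bucket_sort([0.78, 0.17, 0.39, 0.26, 0.72, 0.94, 0.21, 0.12, 0.23, 0.68]))
```

With uniformly distributed input each bucket holds O(1) keys in expectation, so the insertion sorts cost Θ(n) expected in total, matching the slide's analysis.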

Examples of Iterative Algorithms: Binary Search, Partitioning, Insertion Sort, Counting Sort, Radix Sort, Bucket Sort. Which can be made stable? Which sort in place? How about MergeSort? 172

End of Iterative Algorithms