CE 221 Data Structures and Algorithms. Chapter 7: Sorting (Insertion Sort, Shellsort)


Text: Read Weiss, Sections 7.1-7.4.

Preliminaries
We consider internal (main-memory) sorting algorithms. All algorithms are interchangeable: each is passed an array containing the N elements to be sorted. Comparison (<, >) and assignment (=) are the only operations allowed on the input data, so these are comparison-based sorting algorithms.

Insertion Sort
One of the simplest sorting algorithms. It consists of N-1 passes. In pass p (p = 1 to N-1), positions 0 through p-1 are already known to be sorted; the element in position p is moved to the left until an element no larger than it is encountered, after which the elements in positions 0 through p (p+1 elements) are sorted.
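A minimal sketch of this pass structure in C++ (the transcription does not include the slide's code; the name insertionSort and the use of std::vector are illustrative assumptions, following the array-based style of Weiss):

#include <cstddef>
#include <utility>
#include <vector>

// Insertion sort: pass p inserts a[p] into the already-sorted prefix a[0..p-1].
template <typename Comparable>
void insertionSort(std::vector<Comparable>& a)
{
    for (std::size_t p = 1; p < a.size(); ++p)
    {
        Comparable tmp = std::move(a[p]);
        std::size_t j = p;
        // Shift larger elements one position to the right
        // until an element no larger than tmp is found.
        while (j > 0 && tmp < a[j - 1])
        {
            a[j] = std::move(a[j - 1]);
            --j;
        }
        a[j] = std::move(tmp);
    }
}

On already-sorted input the while-loop test fails immediately in every pass, which is the O(N) best case noted on the next slide.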

Insertion Sort - Algorithm
The nested loops each take at most N iterations, so the running time is O(N^2) in the worst case. This bound is tight (take input in reverse order): the number of element comparisons in the inner loop during pass p is at most p, and summing over all passes gives 1 + 2 + ... + (N-1) = Θ(N^2). If the input is already sorted, the running time is O(N).

A Lower Bound for Simple Sorting Algorithms
An inversion in an array of numbers is any ordered pair (i, j) with i < j such that a[i] > a[j]. For example, the array 34, 8, 64, 51, 32, 21 contains 9 inversions. Swapping two adjacent elements that are out of order removes exactly one inversion, and a sorted array has no inversions. Hence the running time of insertion sort is O(I + N), where I is the number of inversions in the input.
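A brute-force counter makes the definition concrete (a small sketch, not part of the slides; countInversions is an illustrative name, and the quadratic pair-check is used only because it mirrors the definition directly):

#include <cstddef>
#include <vector>

// Count ordered pairs (i, j) with i < j and a[i] > a[j].
std::size_t countInversions(const std::vector<int>& a)
{
    std::size_t inversions = 0;
    for (std::size_t i = 0; i + 1 < a.size(); ++i)
        for (std::size_t j = i + 1; j < a.size(); ++j)
            if (a[i] > a[j])
                ++inversions;
    return inversions;
}

// Example: countInversions({34, 8, 64, 51, 32, 21}) returns 9.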

Average Running Time for Simple Sorting - I
Theorem: The average number of inversions in an array of N distinct elements is N(N-1)/4.
Proof: The average is the total number of inversions over all N! permutations, divided by N!. Each permutation L has a corresponding permutation L_R obtained by reversing its sequence, and every pair of elements is an inversion in exactly one of L and L_R. So if L has x inversions, L_R has N(N-1)/2 - x inversions, and each of the N!/2 pairs (L, L_R) contributes exactly N(N-1)/2 inversions in total. Therefore the average is ((N(N-1)/2) * (N!/2)) / N! = N(N-1)/4.
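As a quick check of the formula, for N = 3 the 3! = 6 permutations of 1, 2, 3 contain 0, 1, 1, 2, 2, 3 inversions respectively, for an average of 9/6 = 1.5 = 3(3-1)/4.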

Average Running Time for Simple Sorting - II
Theorem: Any algorithm that sorts by exchanging adjacent elements requires Ω(N^2) time on average.
Proof: The input contains N(N-1)/4 inversions on average, and each adjacent swap removes at most one inversion, so Ω(N^2) swaps are required. This holds for every sorting algorithm (including ones not yet discovered) that performs only adjacent exchanges.
Result: For a sorting algorithm to run in subquadratic time, o(N^2), it must exchange elements that are far apart, eliminating more than one inversion per exchange.

Shellsort - I
Shellsort, invented by Donald Shell, works by comparing elements that are far apart; the distance between compared elements decreases as the algorithm runs, until the last phase (hence the name diminishing increment sort). The sequence h_1, h_2, ..., h_t is called the increment sequence; h_1 = 1 always. After a phase using increment h_k, a[i] <= a[i + h_k] for every i, and the file is then said to be h_k-sorted.

Shellsort - II
An h_k-sorted file that is then h_{k-1}-sorted remains h_k-sorted. To h_k-sort, for each i in h_k, h_k + 1, ..., N-1, place the element in its correct spot among positions i, i - h_k, i - 2h_k, .... This is equivalent to performing an insertion sort on h_k independent subarrays.

Shellsort - III
Shell's increment sequence: h_t = floor(N/2) and h_k = floor(h_{k+1}/2). This choice turns out to be poor, as the analysis below shows.
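A sketch of Shellsort with Shell's increments (again, the slide's own code is not in the transcription; shellsort is an illustrative name, and the inner loop is simply the insertion-sort shift performed with stride gap):

#include <cstddef>
#include <utility>
#include <vector>

// Shellsort with Shell's increments: gap = N/2, N/4, ..., 1.
template <typename Comparable>
void shellsort(std::vector<Comparable>& a)
{
    for (std::size_t gap = a.size() / 2; gap > 0; gap /= 2)
    {
        // gap-sort: an insertion sort on each of the gap interleaved subarrays.
        for (std::size_t i = gap; i < a.size(); ++i)
        {
            Comparable tmp = std::move(a[i]);
            std::size_t j = i;
            while (j >= gap && tmp < a[j - gap])
            {
                a[j] = std::move(a[j - gap]);
                j -= gap;
            }
            a[j] = std::move(tmp);
        }
    }
}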

Worst-Case Analysis of Shellsort - I
Theorem: The worst-case running time of Shellsort, using Shell's increments, is Θ(N^2).
Proof, part I (the Ω(N^2) lower bound): Let N be a power of 2 and take an input with the N/2 largest elements in the even positions and the N/2 smallest elements in the odd positions (counting positions from 1). Every increment except the last is even, so no earlier pass moves an element between an odd and an even position. When the last pass (increment 1) begins, the i-th smallest element, for i <= N/2, is still in position 2i - 1 and must move to position i, which takes i - 1 steps. The last pass alone therefore costs sum over i = 1 to N/2 of (i - 1) = Ω(N^2).

Worst-Case Analysis of Shellsort - II
Proof, part II (the O(N^2) upper bound): A pass with increment h_k consists of h_k insertion sorts of about N/h_k elements each, so one pass costs O(h_k (N/h_k)^2) = O(N^2 / h_k). Summing over all passes gives O(sum over k of N^2 / h_k) = O(N^2 sum over k of 1/h_k), which is O(N^2) because the reciprocals of Shell's increments form a geometric series summing to less than 2.
The weakness of Shell's increments is that pairs of increments are not necessarily relatively prime, so the smaller increments can have little effect. Hibbard's increments, 1, 3, 7, ..., 2^k - 1, avoid this: consecutive increments have no common factors.
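Hibbard's increments are easy to generate; the helper below is an illustrative sketch (hibbardIncrements is not from the slides) that builds the sequence 1, 3, 7, ..., up to the largest 2^k - 1 below N, largest first, so that it could replace the gap sequence in the Shellsort sketch above:

#include <algorithm>
#include <cstddef>
#include <vector>

// Hibbard's increments below n: 1, 3, 7, ..., 2^k - 1, returned largest first.
std::vector<std::size_t> hibbardIncrements(std::size_t n)
{
    std::vector<std::size_t> gaps;
    for (std::size_t h = 1; h < n; h = 2 * h + 1)
        gaps.push_back(h);
    std::reverse(gaps.begin(), gaps.end());
    return gaps;
}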

Worst-Case Analysis of Shellsort - III
Theorem: The worst-case running time of Shellsort, using Hibbard's increments 1, 3, 7, ..., 2^t - 1, is Θ(N^{3/2}).
Proof sketch of the O(N^{3/2}) upper bound (it relies on a result from additive number theory):
- For the large increments, h_k > N^{1/2}, use the bound O(N^2 / h_k) per pass, as before.
- For the small increments, consider the h_k-sort: the file has already been h_{k+2}-sorted and h_{k+1}-sorted.
- Consequently a[p - i] <= a[p] whenever i can be written as i = a*h_{k+1} + b*h_{k+2} for nonnegative integers a and b.
- But h_{k+2} = 2h_{k+1} + 1, hence gcd(h_{k+2}, h_{k+1}) = 1; by the number-theoretic result, every integer i >= (h_{k+1} - 1)(h_{k+2} - 1) = 8h_k^2 + 4h_k can be expressed in that form.
- Thus, when inserting a[p] during the h_k-sort, only elements whose distance from p is a multiple of h_k smaller than 8h_k^2 + 4h_k can be out of order with a[p]; there are at most 8h_k + 4 of them. The innermost for loop therefore executes O(h_k) times for each of the N - h_k positions, giving a bound of O(N h_k) per pass.
- Assuming t is even and splitting the increments at h_{t/2} = Θ(N^{1/2}), the total is O(sum over k = 1 to t/2 of N h_k + sum over k = t/2 + 1 to t of N^2 / h_k) = O(N sum over k = 1 to t/2 of h_k) + O(N^2 sum over k = t/2 + 1 to t of 1/h_k). Both sums are dominated by geometric series, so this is O(N h_{t/2}) + O(N^2 / h_{t/2}) = O(N^{3/2}).

Homework
Assignments 7.1, 7.2, 7.3, 7.4. You are requested to study and solve these exercises. Note that they are for practice only; you do not need to hand in your solutions.