
Sorting1 Bin Sort

Sorting integers in the range [1,...,n]: add all elements to a table of n bins, then retrieve them from the bins in order 1, 2, 3, ..., n.

Stable sorting method (repeated elements end up in their original relative order).

[Figure: each input number is appended to its bin; reading bins 1 through n back in order yields the sorted list.]
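The bin-sort idea above can be sketched in C++ (a minimal illustration, not code from the slides; the function name `bin_sort` and the use of per-bin FIFO lists are my choices):

```cpp
#include <cassert>
#include <vector>
#include <list>

// Illustrative sketch of stable bin sort for integers in [1, n]:
// drop each element into its bin, then read the bins back in
// order 1, 2, ..., n. Equal elements keep their original relative
// order because each bin is a FIFO list.
std::vector<int> bin_sort(const std::vector<int>& a, int n) {
    std::vector<std::list<int>> bins(n + 1);   // bins[1..n]
    for (int x : a) bins[x].push_back(x);      // add all elements to the table
    std::vector<int> out;
    out.reserve(a.size());
    for (int v = 1; v <= n; ++v)               // retrieve in order 1,2,...,n
        for (int x : bins[v]) out.push_back(x);
    return out;
}
```

Stability comes directly from each bin being first-in, first-out.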

Sorting2 Radix Sort

Input list:
216 521 425 116 091 515 124 034 096 024

After sorting on the least significant digit:
521 091 124 034 024 425 515 216 116 096

After sorting on the 2nd least significant digit:
515 216 116 521 124 024 425 034 091 096

After sorting on the 3rd least significant digit:
024 034 091 096 116 124 216 425 515 521
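The digit-by-digit passes above can be sketched as LSD radix sort in C++ (an illustrative sketch, not the slides' code; `radix_sort` is my name). Each pass is a stable bin sort on one base-10 digit:

```cpp
#include <cassert>
#include <vector>
#include <array>

// Illustrative LSD radix sort on base-10 digits: repeatedly
// bin-sort on one digit, least significant first. Each pass must
// be stable so that earlier passes are not undone.
void radix_sort(std::vector<int>& a, int digits) {
    int div = 1;
    for (int d = 0; d < digits; ++d, div *= 10) {
        std::array<std::vector<int>, 10> bins;
        for (int x : a) bins[(x / div) % 10].push_back(x);  // stable binning
        a.clear();
        for (auto& b : bins)                                // read bins 0..9 in order
            for (int x : b) a.push_back(x);
    }
}
```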

Sorting3

Sorting n numbers whose values are in [0,1,...,n-1] can be done in O(n) time via bin sort (use n bins).

Sorting n numbers whose values are in [0,1,...,n^2 - 1] can be done in O(n) time via radix sort: each value has 2 log n bits and is viewed as two keys (key#2 key#1) of log n bits each.

Sorting n numbers whose values are in [0,1,...,n^k - 1] can be done in O(kn) time via radix sort: each value has k log n bits and is viewed as k keys (key#k key#(k-1) ... key#1) of log n bits each.

Sorting4 Heap Sort

Sort x_1, x_2, ..., x_n:

for i = 1 to n
    insert x_i into the heap
for i = 1 to n
    delete min from the heap

Time complexity is O(n log n). As discussed before, the first phase (inserting the n elements) can be done in O(n) time with bottom-up heap construction. The sorting method is not stable.
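The two phases above can be sketched with a standard-library min-heap (an illustration, not the slides' code; `std::priority_queue` stands in for the heap, and its range constructor does the O(n) bottom-up build):

```cpp
#include <cassert>
#include <vector>
#include <queue>
#include <functional>

// Illustrative heap sort as on the slide: build a min-heap from
// all n elements, then perform n delete-min operations. Each
// delete-min is O(log n), so the total time is O(n log n).
// This version is not stable.
std::vector<int> heap_sort(const std::vector<int>& a) {
    std::priority_queue<int, std::vector<int>, std::greater<int>> h(
        a.begin(), a.end());   // phase 1: O(n) bottom-up heap construction
    std::vector<int> out;
    while (!h.empty()) {       // phase 2: n delete-min operations
        out.push_back(h.top());
        h.pop();
    }
    return out;
}
```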

Sorting5 Merge Sort

[8][4][5][6][2][1][7][3]    in array A
[4,8][5,6][1,2][3,7]        in array B, pass costs O(n)
[4,5,6,8][1,2,3,7]          in array A, pass costs O(n)
[1,2,3,4,5,6,7,8]           in array B, pass costs O(n)

The number of passes is log_2 n. Viewed top-down, the time complexity satisfies T(n) = 2T(n/2) + cn, whose solution is O(n log_2 n); the method uses O(n) additional space.

Stable sorting method.
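The array-to-array passes above can be sketched as a bottom-up merge sort in C++ (an illustrative sketch, not the slides' code; `std::merge` performs each stable O(n) merge):

```cpp
#include <cassert>
#include <vector>
#include <algorithm>

// Illustrative bottom-up merge sort, as in the passes above:
// merge runs of width 1, 2, 4, ... between the input array and an
// O(n) auxiliary array. There are ceil(log2 n) passes, each
// costing O(n), for O(n log n) total. The merge is stable.
void merge_sort(std::vector<int>& a) {
    std::vector<int> b(a.size());
    for (std::size_t w = 1; w < a.size(); w *= 2) {   // run width per pass
        for (std::size_t i = 0; i < a.size(); i += 2 * w) {
            std::size_t mid = std::min(i + w, a.size());
            std::size_t hi  = std::min(i + 2 * w, a.size());
            std::merge(a.begin() + i, a.begin() + mid,   // merge two adjacent
                       a.begin() + mid, a.begin() + hi,  // runs into array b
                       b.begin() + i);
        }
        std::swap(a, b);    // next pass reads the freshly merged array
    }
}
```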

Sorting6 Merge Sort Analysis

Assume n = 2^k for some positive integer k.

T(n) = T(1)             if n = 1
T(n) = 2T(n/2) + cn     if n > 1

T(n) = 2T(n/2) + cn
     = 2(2T(n/4) + cn/2) + cn = 4T(n/4) + 2cn
     = 4(2T(n/8) + cn/4) + 2cn = 8T(n/8) + 3cn
     ...                         (stop when n/2^k = 1)
     = 2^k T(1) + kcn = n + cn log n, which is O(n log n).

For general n with 2^k < n <= 2^(k+1), T(n) <= T(2^(k+1)) <= T(2n), which is O(n log n).

Sorting7 Quick Sort

template<class T>
void QuickSort(T a[], int l, int r)
{  // Sort a[l:r]; a[r+1] has a large value.
   if (l >= r) return;
   int i = l,      // left-to-right cursor
       j = r + 1;  // right-to-left cursor
   T pivot = a[l];
   // Partition with respect to the pivot.
   while (true) {
      do {  // find >= element on left side
         i = i + 1;
      } while (a[i] < pivot);
      do {  // find <= element on right side
         j = j - 1;
      } while (a[j] > pivot);
      if (i >= j) break;  // swap pair not found
      Swap(a[i], a[j]);
   }
   a[l] = a[j];
   a[j] = pivot;
   QuickSort(a, l, j-1);  // sort left segment
   QuickSort(a, j+1, r);  // sort right segment
}

Sorting8 QuickSort

a[r+1] must hold a value larger than all other elements. Why? Because the left-to-right cursor i is advanced with no bounds check, and the large sentinel value guarantees it stops. The worst-case time complexity is O(n^2). The recursion uses O(n) additional space. Not a stable sorting method.

Sorting9 QuickSort Worst Case

On an already-sorted input 1 2 3 4 ... n, the total partitioning work is

sum_{i=1}^{n} i, which is O(n^2).

Many instances take Omega(n^2) time. Not a stable sorting method.

Space complexity (writing QS(n) for QuickSort on n elements): the recursion stack holds QS(n), QS(n-1), QS(n-2), ..., QS(1), which is Omega(n).

Sorting10 Reduce Space Complexity

Make the procedure non-recursive and solve the smaller subproblem first. For example:

sort(1,80)                    size n
sort(1,40), sort(42,80)       size n/2
sort(42,62), sort(64,80)      size n/4
sort(64,74), sort(76,80)      ...

This bounds the pending subproblems to O(log_2 n) space.

Sorting11 Quick Sort

Note: merely changing the recursive program so that it calls the smaller subproblem first still has space complexity O(n):

QS(a[], l, r)
    ...
    if j-1-l <= r-(j+1) then
        QS(a, l, j-1)
        QS(a, j+1, r)
    else
        QS(a, j+1, r)
        QS(a, l, j-1)
end QS

The recursion stack can still hold QS(1,n), QS(2,n), QS(3,n), ..., QS(n-1,n), so the space is still Omega(n).

Sorting12 Iterative Quick Sort

QS(..., 1, n)
    S <- empty stack
    PUSH(S, (1,n))
    while S is not empty do
        (l,r) <- POP(S)
        if l >= r then continue with the next iteration
        ...   (partition a[l:r]; the pivot ends at position j)
        if j-1-l <= r-(j+1) then
            PUSH(S, (j+1,r))    // large segment first
            PUSH(S, (l,j-1))    // small segment on top
        else
            PUSH(S, (l,j-1))    // large segment first
            PUSH(S, (j+1,r))    // small segment on top
    end while
end QS
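The pseudocode above can be filled in as a runnable C++ sketch (my illustration, not the slides' code; for brevity it uses a Lomuto-style partition rather than the two-cursor partition of the recursive version):

```cpp
#include <cassert>
#include <vector>
#include <stack>
#include <utility>
#include <algorithm>

// Illustrative iterative quicksort with an explicit stack. After
// partitioning, the larger segment is pushed first and the smaller
// one on top, so the smaller segment is always processed next;
// this bounds the stack depth by O(log n).
void quick_sort(std::vector<int>& a) {
    std::stack<std::pair<int,int>> s;
    s.push({0, (int)a.size() - 1});
    while (!s.empty()) {
        auto [l, r] = s.top(); s.pop();
        if (l >= r) continue;
        int pivot = a[l], j = l;
        for (int i = l + 1; i <= r; ++i)     // Lomuto partition
            if (a[i] < pivot) std::swap(a[++j], a[i]);
        std::swap(a[l], a[j]);               // pivot lands at index j
        if (j - 1 - l <= r - (j + 1)) {      // left segment is smaller
            s.push({j + 1, r});              // push large segment first
            s.push({l, j - 1});              // small segment on top
        } else {
            s.push({l, j - 1});              // push large segment first
            s.push({j + 1, r});              // small segment on top
        }
    }
}
```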

Sorting13 Iterative Quick Sort

Stack contents at different times for a problem with α values (top of stack written first):

< α
< α/2, < α
< α/4, < α/2, < α
< α/8, ...

Each segment popped next is at most half the size of the segment that produced it, so the stack has length O(log n).

Sorting14 Time Complexity

Worst case:

T(n) <= T(n-1) + cn
     = T(n-2) + c(n-1) + cn
     = ...
     = T(1) + 2c + 3c + ... + c(n-1) + cn
     = O(n^2)

There are examples showing that T(n) is Omega(n^2).

Best case:

T(n) = 2T(n/2) + cn, which is O(n log n), like merge sort but with a smaller constant.

Sorting15 Average Case

If the pivot ends at position i, then T(n) <= T(i) + T(n-i-1) + cn. Averaging over all n pivot positions:

A(n) = (2/n) sum_{j=0}^{n-1} A(j) + cn

nA(n) = 2 sum_{j=0}^{n-1} A(j) + cn^2

Subtracting from the last equation its counterpart for n-1,

(n-1)A(n-1) = 2 sum_{j=0}^{n-2} A(j) + c(n-1)^2,

we get

nA(n) - (n-1)A(n-1) = 2A(n-1) + 2cn - c
nA(n) = (n+1)A(n-1) + 2cn

A(n)/(n+1) = A(n-1)/n + 2c/(n+1)

Note that the dropped -c term is not significant!

Sorting16

Adding the following equations:

A(n)/(n+1)   = A(n-1)/n     + 2c/(n+1)
A(n-1)/n     = A(n-2)/(n-1) + 2c/n
A(n-2)/(n-1) = A(n-3)/(n-2) + 2c/(n-1)
...
A(2)/3       = A(1)/2       + 2c/3

we get

A(n)/(n+1) = A(1)/2 + 2c sum_{j=3}^{n+1} 1/j

The last summation is at most log_e(n+1) + 0.577 - 3/2. Therefore A(n)/(n+1) = O(log n), which is equivalent to A(n) = O(n log n).

Sorting17 Randomized Quick Sort

Randomized quick sort: select the pivot at random, with equal probability of selecting any of the elements being sorted. Randomized quick sort has expected time complexity equal to A(n), the average time complexity. Its worst-case time complexity is O(n^2), but it is normally very fast. Sort in Linux (Unix) uses quick sort.
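A random-pivot version can be sketched as follows (an illustration, not the slides' code; the function names, the Lomuto partition, and the fixed seed are my choices):

```cpp
#include <cassert>
#include <vector>
#include <random>
#include <algorithm>

// Illustrative randomized quicksort: pick the pivot uniformly at
// random from the segment, swap it to the front, then partition
// as usual. The expected running time is O(n log n) for every
// input; the O(n^2) worst case occurs only with tiny probability.
void rqs(std::vector<int>& a, int l, int r, std::mt19937& gen) {
    if (l >= r) return;
    std::uniform_int_distribution<int> pick(l, r);
    std::swap(a[l], a[pick(gen)]);          // random pivot to the front
    int pivot = a[l], j = l;
    for (int i = l + 1; i <= r; ++i)        // Lomuto partition
        if (a[i] < pivot) std::swap(a[++j], a[i]);
    std::swap(a[l], a[j]);                  // pivot lands at index j
    rqs(a, l, j - 1, gen);
    rqs(a, j + 1, r, gen);
}

void randomized_quick_sort(std::vector<int>& a) {
    std::mt19937 gen(12345);                // fixed seed, for reproducibility only
    rqs(a, 0, (int)a.size() - 1, gen);
}
```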

Sorting18 Lower Bounds for Sorting

Decision trees: binary trees in which each node represents a set of possible orderings, each internal node makes a comparison between two elements, and the comparison's two outcomes split the set into two sets of possible orderings.

[Figure: example node holding the orderings {a<b<c<d, a<c<b<d}; the comparison b<c sends a<b<c<d down the T branch and a<c<b<d down the F branch.]

Sorting19

Assume the elements are distinct. Every sorting method that gains information only by comparing elements may be viewed as a decision tree.

[Figure: the full decision tree for sorting x1, x2, x3. The root holds all 3! = 6 orderings and compares x1<x2; further comparisons (x2<x3, then x1<x3) split the surviving orderings until each leaf holds exactly one ordering.]

Sorting20 Lower Bound

Sorting n elements x_1, x_2, x_3, ..., x_n: any decision tree has n! leaves, so its height h satisfies

h >= log_2 n!

Since n! >= (n/2)^(n/2), we know that h is Omega(n log n). Every sorting method that gains information only by comparing keys takes Omega(n log n) time.
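The bound h >= log_2 n! can be checked numerically (an illustrative helper, not from the slides; `min_comparisons` is my name):

```cpp
#include <cassert>
#include <cmath>

// Illustrative check of the lower bound: any comparison-based sort
// of n distinct keys needs a decision tree of height at least
// ceil(log2(n!)). E.g. for n = 5 there are 5! = 120 orderings and
// 2^6 = 64 < 120 <= 128 = 2^7, so at least 7 comparisons are
// needed in the worst case.
int min_comparisons(int n) {
    double log2_fact = 0.0;
    for (int i = 2; i <= n; ++i) log2_fact += std::log2(i);  // log2(n!)
    return (int)std::ceil(log2_fact);
}
```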

Sorting21 Lower Bound for Average TC

The average number of comparisons (for decision-tree sorting) equals the average depth of a leaf (ADL).

Theorem: for any decision tree with k leaves, ADL >= log_2 k.

Proof by contradiction (with induction on k): let T be a least-height decision tree with ADL < log_2 k, where k is the number of leaves in T.

[Figure: if k = 1, T is a single leaf; if k >= 2, the root of T has two subtrees with k_1 < k and k_2 < k leaves, respectively.]

Sorting22 Lower Bound for Average TC: Cont.

Let k = k_1 + k_2. By the induction hypothesis, ADL(T_1) >= log_2 k_1 and ADL(T_2) >= log_2 k_2, so

ADL >= (k_1/(k_1+k_2)) log_2 k_1 + (k_2/(k_1+k_2)) log_2 k_2 + 1.

The right-hand side is minimized when k_1 = k_2 = k/2, giving

ADL >= 0.5 log_2(k/2) + 0.5 log_2(k/2) + 1 = log_2 k,

a contradiction. (The minimum is at the even split because x log_2 x is convex: any perturbation k_1 = k/2 + e, k_2 = k/2 - e only increases the weighted sum.)

Sorting23 Summary

Algorithm       WC           AC           Stable?      Addl Space
Binsort*        O(n)         O(n)         stable       O(n)
RadixSort**     O(kn)        O(kn)        stable       O(n)
HeapSort        O(n log n)   O(n log n)   not stable   O(1)
QuickSort       O(n^2)       O(n log n)   not stable   O(log n)
Mergesort       O(n log n)   O(n log n)   stable       O(n)
Insertsort***   O(n^2)       O(n^2)       stable       O(1)

Binsort*: sorts integers in the range [0, n-1].
RadixSort**: sorts integers in the range [0, n^k - 1].
Insertsort***: best method when sorting about 15 or fewer elements; used in hybrid algorithms.

Sorting24 Sorting of Integers

O(n log log n) time: S. Nilsson, "The Fastest Sorting Algorithm?", Dr. Dobb's Journal, April 2000. Uses lots of extra space; sorts integers only.

O(n log log n) time and O(n) space (not practical): Y. Han, "Deterministic Sorting in O(n log log n) Time and Linear Space", STOC 2002, pp. 602-608. Sorts integers only.