Divide-and-Conquer Algorithms Part Two

1 Divide-and-Conquer Algorithms Part Two

2 Recap from Last Time

3 Divide-and-Conquer Algorithms A divide-and-conquer algorithm is one that works as follows: (Divide) Split the input apart into multiple smaller pieces, then recursively invoke the algorithm on those pieces. (Conquer) Combine those solutions back together to form the overall answer. Can be analyzed using recurrence relations.

4 Two Important Recurrences
T(0) = Θ(1)
T(1) = Θ(1)
T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + Θ(n)
Solves to O(n log n).

T(0) = Θ(1)
T(1) = Θ(1)
T(n) ≤ T(⌈n/2⌉) + Θ(1)
Solves to O(log n).

5 Outline for Today More Recurrences: other divide-and-conquer recurrences. Algorithmic Lower Bounds: showing that certain problems cannot be solved within certain time bounds. Binary Heaps: a fast data structure for retrieving elements in sorted order.

6 Another Algorithm: Maximizing Unimodal Arrays

7 Unimodality An array is called unimodal iff it can be split into an increasing sequence followed by a decreasing sequence.

22 procedure unimodalmax(list A, int low, int high):
    if low = high - 1:
        return A[low]
    let mid = ⌊(low + high) / 2⌋
    if A[mid] < A[mid + 1]:
        return unimodalmax(A, mid + 1, high)
    else:
        return unimodalmax(A, low, mid + 1)

T(1) = Θ(1)
T(n) ≤ T(⌈n/2⌉) + Θ(1)
Solves to O(log n).
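Here is a hedged Python rendering of the same binary search on the peak. It is an iterative sketch with inclusive indices rather than the slide's recursive half-open ranges; the function name and the example array are my own, not from the slides.

    def unimodal_max(A):
        # Maximum of a unimodal array: strictly increasing, then strictly decreasing.
        lo, hi = 0, len(A) - 1
        while lo < hi:
            mid = (lo + hi) // 2
            if A[mid] < A[mid + 1]:
                lo = mid + 1      # still on the increasing side: the peak is to the right
            else:
                hi = mid          # on the decreasing side (or at the peak): look left
        return A[lo]              # lo == hi is the peak, so this is the maximum

    assert unimodal_max([1, 3, 7, 12, 9, 4, 2]) == 12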

26 Unimodality II A weakly unimodal array is one that can be split into a nondecreasing sequence followed by a nonincreasing sequence.

37 procedure weakunimodalmax(list A, int low, int high):
    if low = high - 1:
        return A[low]
    let mid = ⌊(low + high) / 2⌋
    if A[mid] < A[mid + 1]:
        return weakunimodalmax(A, mid + 1, high)
    else if A[mid] > A[mid + 1]:
        return weakunimodalmax(A, low, mid + 1)
    else:
        return max(weakunimodalmax(A, low, mid + 1),
                   weakunimodalmax(A, mid + 1, high))

T(1) = Θ(1)
T(n) ≤ T(⌈n/2⌉) + T(⌊n/2⌋) + Θ(1)

Replacing the Θ terms with a single constant c gives the simpler form
T(1) ≤ c
T(n) ≤ 2T(n / 2) + c
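A Python sketch of the same three-way recursion follows. The half-open [low, high) convention and the choice mid = (low + high - 1) // 2 (which keeps mid + 1 in range) are my assumptions, not taken from the slides.

    def weak_unimodal_max(A, low=0, high=None):
        # Maximum of a weakly unimodal array over the half-open range A[low:high).
        if high is None:
            high = len(A)
        if high - low == 1:
            return A[low]
        mid = (low + high - 1) // 2
        if A[mid] < A[mid + 1]:            # strictly rising: the peak lies to the right
            return weak_unimodal_max(A, mid + 1, high)
        elif A[mid] > A[mid + 1]:          # strictly falling: the peak lies at mid or to the left
            return weak_unimodal_max(A, low, mid + 1)
        else:                              # tie: the peak could be on either side, so try both
            return max(weak_unimodal_max(A, low, mid + 1),
                       weak_unimodal_max(A, mid + 1, high))

    assert weak_unimodal_max([2, 2, 5, 5, 5, 3, 3]) == 5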

43 T(1) ≤ c
T(n) ≤ 2T(n / 2) + c

Unrolling the recurrence:
T(n) ≤ 2T(n/2) + c
     ≤ 2(2T(n/4) + c) + c = 4T(n/4) + 3c
     ≤ 4(2T(n/8) + c) + 3c = 8T(n/8) + 7c
     ...
     ≤ 2^k T(n/2^k) + (2^k − 1)c

Setting k = log₂ n:
T(n) ≤ 2^(log₂ n) T(1) + (2^(log₂ n) − 1)c
     = n·T(1) + c(n − 1)
     ≤ cn + c(n − 1)
     = 2cn − c
     = O(n)

58 The recursion tree tells the same story. The root costs c, its two children cost c each (2c for that level), the next level costs 4c, and so on, doubling at each level until the bottom level of the tree costs about cn. The levels above the bottom sum to (n − 1)c = cn − c, and adding the bottom level's cn gives a total of 2cn − c.

72 Another Recurrence Relation The recurrence relation
T(1) = Θ(1)
T(n) ≤ T(⌈n/2⌉) + T(⌊n/2⌋) + Θ(1)
solves to T(n) = O(n). Intuitively, the recursion tree is bottom-heavy: the bottom of the tree accounts for almost all of the work.
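As a quick sanity check on the unrolled bound (my own addition, taking the recurrence with equality and n a power of two), the exact solution matches the closed form 2cn − c:

    def T(n, c=1):
        # Exact solution of T(1) = c, T(n) = 2*T(n/2) + c, for n a power of two.
        return c if n == 1 else 2 * T(n // 2, c) + c

    for k in range(8):
        n = 2 ** k
        assert T(n) == 2 * n - 1       # the closed form 2cn - c with c = 1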

73 Unimodal Arrays Our recurrence shows that the work done is O(n), but this might not be a tight bound. Does our algorithm ever do Ω(n) work? Yes: What happens if all array values are equal to one another? Can we do better?

74 A Lower Bound Claim: every correct algorithm for finding the maximum value in a weakly unimodal array must do Ω(n) work in the worst case. Note that this claim is over all possible algorithms, so the argument had better be watertight!

75 A Lower Bound We will prove that any algorithm for finding the maximum value of a weakly unimodal array must, on at least one input, inspect all n locations. Proof idea: suppose the algorithm did not. Run it on an array in which every value is equal; some location goes uninspected. Raising the value at that one uninspected location produces another weakly unimodal array that the algorithm cannot tell apart from the first, even though the correct answer has changed.

80 Algorithmic Lower Bounds The argument we just saw is called an adversarial argument and is often used to establish algorithmic lower bounds. Idea: Show that if an algorithm doesn't do enough work, then it cannot distinguish two different inputs that require different outputs. Therefore, the algorithm cannot always be correct.

81 o Notation Let f, g : ℕ → ℕ. We say that f(n) = o(g(n)) (f is little-o of g) iff
lim_{n → ∞} f(n) / g(n) = 0.
In other words, f grows strictly slower than g. Often used to describe impossibility results. For example: there is no o(n)-time algorithm for finding the maximum element of a weakly unimodal array.

82 What Does This Mean? In the worst case, our algorithm must do Ω(n) work. That's the same as a linear scan over the input array! Is our algorithm even worth it? Yes: in most cases, the runtime is Θ(log n) or close to it.

83 Binary Heaps

84 Data Structures Matter We have seen two instances where a better choice of data structure improved the runtime of an algorithm: Using adjacency lists instead of adjacency matrices in graph algorithms. Using a double-ended queue in 0/1 Dijkstra's algorithm. Today, we'll explore a data structure that is useful for improving algorithmic efficiency. We'll come back to this structure in a few weeks when talking about Prim's algorithm and Kruskal's algorithm.

85 Priority Queues A priority queue is a data structure for storing elements associated with priorities (often called keys). It is optimized to find the element that currently has the smallest key. Supports the following operations:
enqueue(k, v), which adds element v to the queue with key k.
is-empty, which returns whether the queue is empty.
dequeue-min, which removes (and returns) the element with the smallest key from the queue.
Many implementations are possible, with varying tradeoffs.
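As a concrete point of reference, Python's standard heapq module already provides a binary min-heap over a plain list; the small wrapper below (the class name and tie-breaking counter are my own choices, not from the slides) exposes exactly the three operations listed above.

    import heapq

    class PriorityQueue:
        # Minimal priority queue over (key, value) pairs, backed by a binary min-heap.
        def __init__(self):
            self._heap = []
            self._count = 0                     # tie-breaker so values are never compared

        def enqueue(self, key, value):
            heapq.heappush(self._heap, (key, self._count, value))
            self._count += 1

        def is_empty(self):
            return not self._heap

        def dequeue_min(self):
            key, _, value = heapq.heappop(self._heap)
            return key, value

    pq = PriorityQueue()
    pq.enqueue(3, "c"); pq.enqueue(1, "a"); pq.enqueue(2, "b")
    assert pq.dequeue_min() == (1, "a")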

86 A Naive Implementation One simple way to implement a priority queue is with an unsorted array of key/value pairs.
To enqueue v with key k, append (k, v) to the array in time O(1).
To check whether the priority queue is empty, check whether the underlying array is empty in time O(1).
To dequeue-min, scan across the array to find an element with minimum key, then remove it in time O(n).
Doing n enqueues and n dequeues takes time O(n²).
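A sketch of that unsorted-list implementation, with the costs from the slide noted inline (the class and method names are my own):

    class SlowPriorityQueue:
        def __init__(self):
            self._items = []                    # unsorted list of (key, value) pairs

        def enqueue(self, key, value):
            self._items.append((key, value))    # O(1) append

        def is_empty(self):
            return not self._items              # O(1) emptiness check

        def dequeue_min(self):
            # O(n): scan for the smallest key, then remove and return that pair.
            best = min(range(len(self._items)), key=lambda i: self._items[i][0])
            return self._items.pop(best)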

87 A Better Implementation Store the elements in a binary tree that obeys the heap property: each node's key is less than or equal to all of its descendants' keys. The tree is kept as a complete binary tree: every level except the last one is filled in completely, and the last level is filled from left to right.

92 A Better Implementation (A sequence of diagrams walks through the two heap operations. To enqueue, place the new key in the first open position in the bottom level, then repeatedly swap it with its parent while it is smaller: "bubble up." To dequeue-min, remove the root, move the last key in the bottom level up to the root, then repeatedly swap it with its smaller child until the heap property is restored: "bubble down.")

117 Binary Heap Efficiency The enqueue and dequeue operations on a binary heap run in time O(h), where h is the height of the tree. In a perfect binary tree of height h, there are 2^(h+1) − 1 nodes. If there are n nodes, the maximum height is found by setting n = 2^(h+1) − 1. Solving, we get h = log₂(n + 1) − 1. Thus h = Θ(log n), so enqueue and dequeue take time O(log n).
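A quick numeric check of that node-count identity (my own addition, not part of the slides):

    import math

    # A perfect binary tree of height h has 2**(h+1) - 1 nodes,
    # and inverting that count recovers the height exactly.
    for h in range(10):
        n = 2 ** (h + 1) - 1
        assert math.log2(n + 1) - 1 == h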

118 Implementing Binary Heaps It is extremely rare to actually implement a binary heap as an explicit tree structure. Instead, the heap can be encoded as an array. Assuming one-indexing: node n has its children at positions 2n and 2n + 1, and node n has its parent at position ⌊n / 2⌋.
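The same index arithmetic in Python; since Python lists are zero-indexed, the formulas below are shifted by one (that shift is my adjustment, not from the slide):

    def parent(i):
        return (i - 1) // 2       # zero-indexed analogue of floor(n / 2)

    def left_child(i):
        return 2 * i + 1          # zero-indexed analogue of 2n

    def right_child(i):
        return 2 * i + 2          # zero-indexed analogue of 2n + 1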

122 Application: Heapsort

123 Sorting with Binary Heaps (A sequence of diagrams works through sorting an example array using heap operations.) This uses a max-heap (where larger values are on top), as opposed to a min-heap (where smaller values are on top). We'll see why in a minute.

173 A Better Idea (A sequence of diagrams develops the in-place approach summarized on the next slide: use the array itself to represent the max-heap, and place each dequeued maximum into the slot at the end of the array that the shrinking heap no longer needs.)

217 Heapsort The heapsort algorithm is as follows: Build a max-heap from the array elements, using the array itself to represent the heap. Repeatedly dequeue from the heap until all elements are placed in sorted order. This algorithm runs in time O(n log n), since it does n enqueues and n dequeues. Only requires O(1) auxiliary storage space, compared with O(n) space required in mergesort.
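A compact Python sketch of that outline. The build step here inserts elements one at a time ("bubble up"), matching the n-enqueues build discussed on the next slide; the helper names and the zero-indexed child/parent formulas are my own.

    def bubble_up(A, i):
        # Sift A[i] up toward the root while it is larger than its parent (max-heap).
        while i > 0 and A[i] > A[(i - 1) // 2]:
            A[i], A[(i - 1) // 2] = A[(i - 1) // 2], A[i]
            i = (i - 1) // 2

    def bubble_down(A, i, size):
        # Sift A[i] down while it is smaller than one of its children (max-heap).
        while True:
            left, right, largest = 2 * i + 1, 2 * i + 2, i
            if left < size and A[left] > A[largest]:
                largest = left
            if right < size and A[right] > A[largest]:
                largest = right
            if largest == i:
                return
            A[i], A[largest] = A[largest], A[i]
            i = largest

    def heapsort(A):
        n = len(A)
        for i in range(1, n):                 # build the max-heap by n successive enqueues
            bubble_up(A, i)
        for end in range(n - 1, 0, -1):       # repeatedly dequeue the max into slot `end`
            A[0], A[end] = A[end], A[0]
            bubble_down(A, 0, end)

    data = [5, 3, 8, 1, 9, 2]
    heapsort(data)
    assert data == [1, 2, 3, 5, 8, 9]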

218 An Optimization: Heapify

219 Making a Binary Heap Suppose that you have n elements and want to build a binary heap from them. One way to do this is to enqueue all of them, one after another, into the binary heap. We can upper-bound the runtime as n calls to an O(log n) operation, giving a total runtime of O(n log n). Is that a tight bound?

220 Making a Binary Heap Yes. In the worst case, each of the last n/2 insertions goes into a heap that already holds at least n/2 elements, and so has height about log₂(n/2 + 1); each such insertion can bubble all the way up to the root. Total runtime: Θ(n log n).

224 Quickly Making a Binary Heap Here is a slightly different algorithm for building a binary heap out of a set of data: Put the nodes, in any order, into a complete binary tree of the right size. (Shape property holds, but heap property might not.) For each node, starting at the bottom layer and going upward, run a bubble-down step on that node.
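In code, reusing the bubble_down helper from the heapsort sketch above, the bottom-up build is a single loop (a sketch of the procedure the slide describes; the zero-indexed starting point n // 2 - 1 is the last node that has a child):

    def heapify(A):
        # Bottom-up construction: bubble-down every node, starting from the last
        # node with a child and working upward toward the root.
        n = len(A)
        for i in range(n // 2 - 1, -1, -1):
            bubble_down(A, i, n)

    data = [5, 3, 8, 1, 9, 2]
    heapify(data)
    assert data[0] == max(data)    # the root now holds the maximum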

225 Quickly Making a Binary Heap (A sequence of diagrams runs this bottom-up bubble-down pass on an example tree.)

254 Analyzing the Runtime At most half of the elements start in the bottom layer and never move down at all. At most a quarter of the elements start one layer above that and can move down at most once. At most an eighth of the elements start two layers above that and can move down at most twice. More generally: at most n / 2^k of the elements can move down as many as k steps. We can therefore upper-bound the runtime with the sum
T(n) ≤ Σ_{i=0}^{log₂ n} (n / 2^i) · i = n · Σ_{i=0}^{log₂ n} i / 2^i

255 Simplifying the Summation We want to simplify the sum Σ_{i=0}^{log₂ n} i / 2^i. Let's introduce a new variable x and study the sum Σ_{i=0}^{log₂ n} i·x^i, which we will evaluate at x = ½. If 0 < x < 1, the omitted tail terms are all positive, so
Σ_{i=0}^{log₂ n} i·x^i < Σ_{i=0}^{∞} i·x^i

256 Solving the Summation
Σ_{i=0}^{∞} i·x^i = x · Σ_{i=0}^{∞} i·x^(i−1)
                 = x · Σ_{i=0}^{∞} d/dx x^i
                 = x · d/dx ( Σ_{i=0}^{∞} x^i )
                 = x · d/dx ( 1 / (1 − x) )
                 = x · 1 / (1 − x)^2
                 = x / (1 − x)^2

263 The Finishing Touches We know that
T(n) ≤ n · Σ_{i=0}^{log₂ n} i·x^i < n · Σ_{i=0}^{∞} i·x^i = n · x / (1 − x)^2
Evaluating at x = ½, we get
T(n) ≤ n · (1/2) / (1 − 1/2)^2 = n · (1/2) / (1/4) = 2n
So at most 2n swaps are performed! We visit each node once and do at most O(n) swaps, so the runtime is Θ(n).
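A quick numeric check of that closed form (my own addition): the partial sums of Σ i / 2^i do approach x / (1 − x)^2 = 2 at x = ½.

    x = 0.5
    partial = sum(i * x**i for i in range(60))
    assert abs(partial - x / (1 - x) ** 2) < 1e-12    # both are (essentially) 2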
